Science.gov

Sample records for 320-row multi-detector computed tomography

  1. Image Quality of Coronary Computed Tomography Angiography with 320-Row Area Detector Computed Tomography in Children with Congenital Heart Disease.

    PubMed

    Tada, Akihiro; Sato, Shuhei; Kanie, Yuichiro; Tanaka, Takashi; Inai, Ryota; Akagi, Noriaki; Morimitsu, Yusuke; Kanazawa, Susumu

    2016-03-01

    The objective of this study was to assess factors affecting image quality of 320-row computed tomography angiography (CTA) of coronary arteries in children with congenital heart disease (CHD). We retrospectively reviewed 28 children up to 3 years of age with CHD who underwent prospective electrocardiography (ECG)-gated 320-row CTA with iterative reconstruction. We assessed image quality of proximal coronary artery segments using a five-point scale. Age, body weight, average heart rate, and heart rate variability were recorded and compared between two groups: patients with diagnostic image quality in all four coronary artery segments and patients with at least one coronary artery segment of nondiagnostic image quality. Altogether, 96 of 112 segments (85.7%) had diagnostic-quality images. Patients with nondiagnostic segments were significantly younger (10.0 ± 11.6 months) and had lower body weight (5.9 ± 2.9 kg) than patients with diagnostic image quality in all four segments (20.6 ± 13.8 months and 8.4 ± 2.5 kg, respectively; each p < 0.05). Differences in heart rate and heart rate variability between the two groups were not significant. Receiver operating characteristic analyses for predicting nondiagnostic image quality yielded an optimal body weight cutoff of ≤5.6 kg and an optimal age cutoff of ≤12.5 months. Prospective ECG-gated 320-row CTA with iterative reconstruction provided largely diagnostic image quality of the coronary arteries in children with CHD. Younger age and lower body weight were associated with poorer coronary image quality.
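
    The cutoffs above are the kind of output a receiver operating characteristic (ROC) analysis produces. The following is a minimal sketch of how such a cutoff can be derived with the Youden index; the patient-level weights are invented for illustration, since the abstract reports only group summaries.

      import numpy as np
      from sklearn.metrics import roc_curve

      # Hypothetical example: 1 = at least one nondiagnostic segment,
      # 0 = all segments diagnostic. These weights are invented; the
      # abstract gives only group means (5.9 +/- 2.9 vs 8.4 +/- 2.5 kg).
      nondiagnostic = np.array([1, 1, 1, 0, 0, 0, 0, 0, 1, 0])
      weight_kg = np.array([4.2, 5.1, 5.6, 7.9, 8.8, 10.2, 6.5, 9.0, 3.9, 7.1])

      # Low weight predicts a nondiagnostic study, so score by negative
      # weight so that higher scores mean higher risk.
      fpr, tpr, thresholds = roc_curve(nondiagnostic, -weight_kg)

      # Youden index J = sensitivity + specificity - 1, maximized over
      # all candidate thresholds.
      j = tpr - fpr
      best = j.argmax()
      print(f"optimal cutoff: weight <= {-thresholds[best]:.1f} kg "
            f"(sens = {tpr[best]:.2f}, spec = {1 - fpr[best]:.2f})")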

  2. Entrance surface dose measurements using a small OSL dosimeter with a computed tomography scanner having 320 rows of detectors.

    PubMed

    Takegami, Kazuki; Hayashi, Hiroaki; Yamada, Kenji; Mihara, Yoshiki; Kimoto, Natsumi; Kanazawa, Yuki; Higashino, Kousaku; Yamashita, Kazuta; Hayashi, Fumio; Okazaki, Tohru; Hashizume, Takuya; Kobayashi, Ikuo

    2017-03-01

    Entrance surface dose (ESD) measurement is important in X-ray computed tomography (CT) examinations, but in clinical settings it is difficult to measure ESDs because of a lack of suitable dosimeters. We focused on the capability of a small optically stimulated luminescence (OSL) dosimeter. The aim of this study was to propose a practical method for using an OSL dosimeter to measure the ESD during a CT examination. The small OSL dosimeter has an outer width of 10 mm, so it may record only part of the dose distribution because the slice thickness and helical pitch can be set to various values. To verify our method, we used a CT scanner having 320 rows of detectors and checked the consistency of the ESDs measured using OSL dosimeters by comparing them with those measured using Gafchromic™ films. The films were calibrated using an ionization chamber on the basis of half-value layer estimation, whereas the OSL dosimeter was calibrated using a practical calibration curve previously proposed by our group. The ESDs measured using the OSL dosimeters were in good agreement with the reference ESDs from the Gafchromic™ films. Using these data, we also estimated the uncertainty of ESDs measured with small OSL dosimeters. We conclude that a small OSL dosimeter is suitable for measuring the ESD, with an uncertainty of 30%, during CT examinations in which pitch factors below 1.000 are applied.
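
    The abstract refers to a previously proposed calibration curve without giving its form. The sketch below shows the general reading-to-dose conversion, assuming a simple linear calibration; the study's actual curve may differ.

      import numpy as np

      # Hypothetical calibration data: OSL reader counts at known doses
      # (e.g., doses set with an ionization chamber). Placeholder values.
      cal_dose_mGy = np.array([5.0, 10.0, 20.0, 40.0])
      cal_counts = np.array([1.1e4, 2.2e4, 4.3e4, 8.7e4])

      # Least-squares linear fit: counts ~ slope * dose + offset.
      slope, offset = np.polyfit(cal_dose_mGy, cal_counts, 1)

      def counts_to_esd(counts):
          """Convert a raw OSL reading into an ESD estimate (mGy)."""
          return (counts - offset) / slope

      reading = 3.5e4
      # The study reports ~30% uncertainty for this type of measurement.
      print(f"ESD = {counts_to_esd(reading):.1f} mGy (+/- 30%)")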

  3. Comparison of cerebral blood flow data obtained by computed tomography (CT) perfusion with that obtained by xenon CT using 320-row CT.

    PubMed

    Takahashi, Satoshi; Tanizaki, Yoshio; Kimura, Hiroaki; Akaji, Kazunori; Kano, Tadashige; Suzuki, Kentaro; Takayama, Youhei; Kanzawa, Takao; Shidoh, Satoka; Nakazawa, Masaki; Yoshida, Kazunari; Mihara, Ban

    2015-03-01

    Cerebral blood flow (CBF) data obtained by computed tomography perfusion (CTP) imaging are qualitative rather than quantitative, in contrast with data obtained by other imaging methods such as xenon CT (XeCT). Thus, interpatient comparisons of raw CBF values obtained by CTP may be inaccurate. In this study, we compared CBF ratios as well as CBF values obtained from CTP-CBF data with those obtained from XeCT-CBF data in the same patients to identify CTP-CBF parameters that can be used for interpatient comparisons. The data were acquired as volume data using 320-row CT, processed with automated region-of-interest software (3DSRT, version 3.5.2), and converted into 59 standardized image slices at 2-mm intervals. We reviewed 10 patients with occlusive cerebrovascular disease (CVD) who underwent both CTP and XeCT in the same period. Ratios of CBF measurements, such as hemodynamic stress distribution (the perforator-to-cortical flow ratio of the middle cerebral artery [MCA] region) or the left/right ratio for the MCA region, calculated from CTP data correlated well with the same ratios calculated from XeCT data. These results suggest that such CBF ratios could be useful for interpatient comparisons of CTP-CBF data obtained by 320-row CT among patients with occlusive CVD.
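
    A minimal sketch of how the two ratio metrics and their CTP-versus-XeCT agreement can be computed; the per-patient CBF values are hypothetical, since the abstract reports no raw data.

      import numpy as np
      from scipy.stats import pearsonr

      # Hypothetical regional CBF values (mL/100 g/min) for four patients.
      ctp = {"perforator": np.array([38.0, 29.5, 44.2, 33.1]),
             "cortical":   np.array([45.0, 41.0, 50.3, 39.8]),
             "left_mca":   np.array([42.0, 30.1, 47.5, 36.0]),
             "right_mca":  np.array([44.5, 38.2, 46.9, 40.2])}
      xect = {"perforator": np.array([40.1, 31.0, 45.0, 35.2]),
              "cortical":   np.array([46.2, 43.5, 51.0, 41.0]),
              "left_mca":   np.array([43.3, 31.8, 48.0, 37.1]),
              "right_mca":  np.array([45.0, 39.9, 47.2, 41.5])}

      # Hemodynamic stress distribution = perforator / cortical flow;
      # laterality = left / right MCA flow. Ratios, not raw CBF values,
      # are compared across modalities.
      for name, num, den in [("stress distribution", "perforator", "cortical"),
                             ("left/right MCA", "left_mca", "right_mca")]:
          r, p = pearsonr(ctp[num] / ctp[den], xect[num] / xect[den])
          print(f"{name}: CTP vs XeCT correlation r = {r:.2f} (p = {p:.3f})")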

  4. Diagnostic performance of combined noninvasive coronary angiography and myocardial perfusion imaging using 320 row detector computed tomography: design and implementation of the CORE320 multicenter, multinational diagnostic study.

    PubMed

    Vavere, Andrea L; Simon, Gregory G; George, Richard T; Rochitte, Carlos E; Arai, Andrew E; Miller, Julie M; Di Carli, Marcello; Arbab-Zadeh, Armin; Dewey, Marc; Niinuma, Hiroyuki; Laham, Roger; Rybicki, Frank J; Schuijf, Joanne D; Paul, Narinder; Hoe, John; Kuribyashi, Sachio; Sakuma, Hajime; Nomura, Cesar; Yaw, Tan Swee; Kofoed, Klaus F; Yoshioka, Kunihiro; Clouse, Melvin E; Brinker, Jeffrey; Cox, Christopher; Lima, Joao A C

    2011-01-01

    Multidetector coronary computed tomography angiography (CTA) is a promising modality for widespread clinical application because of its noninvasive nature and the high diagnostic accuracy found in previous studies using 64 to 320 simultaneous detector rows. It is, however, limited in its ability to detect myocardial ischemia. In this article, we describe the design of the CORE320 study ("Combined coronary atherosclerosis and myocardial perfusion evaluation using 320 detector row computed tomography"). This prospective, multicenter, multinational study is unique in that it is designed to assess the diagnostic performance of combined 320-row CTA and myocardial CT perfusion imaging (CTP) in comparison with the combination of invasive coronary angiography and single-photon emission computed tomography myocardial perfusion imaging (SPECT-MPI). The trial is being performed at 16 medical centers located in 8 countries worldwide. CT has the potential to assess both anatomy and physiology in a single imaging session. The co-primary aim of the CORE320 study is to define the per-patient diagnostic accuracy of the combination of coronary CTA and myocardial CTP to detect physiologically significant coronary artery disease (CAD) compared with (1) the combination of conventional coronary angiography and SPECT-MPI and (2) conventional coronary angiography alone. If successful, the technology could revolutionize the management of patients with symptomatic CAD.

  5. 320-row coronary computed tomography angiography (CCTA) with automatic exposure control (AEC): effect of 100 kV versus 120 kV on image quality and dose exposure.

    PubMed

    Di Cesare, Ernesto; Gennarelli, Antonio; Di Sibio, Alessandra; Felli, Valentina; Perri, Marco; Splendiani, Alessandra; Gravina, Giovanni Luca; Barile, Antonio; Masciocchi, Carlo

    2016-08-01

    To compare a 100 kV tube voltage protocol with 120 kV in terms of image quality and radiation dose in 320-row coronary computed tomography angiography (CCTA) with automatic exposure control (AEC). Using a propensity-matched analysis, we compared a group of 135 patients scanned using a 100 kV tube voltage protocol with a group of 135 subjects scanned using a 120 kV tube voltage setting. In all subjects the heart rate (HR) was <65 bpm, and all CT scans were acquired using a prospective ECG-gating and AEC strategy. Mean effective radiation dose and subjective and objective (noise [N], signal-to-noise ratio [SNR], and contrast-to-noise ratio [CNR]) image quality were evaluated. Subjective quality was assessed by two experienced radiologists using a 5-point scale (0 = nondiagnostic to 4 = excellent) and the 15-segment American Heart Association (AHA) coronary artery classification. Mean effective dose and noise did not differ significantly between the two groups: the mean effective dose was 2.89 ± 0.7 mSv in the 100 kV group and 2.80 ± 0.57 mSv in the 120 kV group (p = 0.25), while noise was 28.9 ± 3.3 in the 120 kV group and 29.05 ± 3.6 in the 100 kV group (p = 0.72). Both SNR and CNR were significantly higher in the 100 kV group than in the 120 kV group, in agreement with the finding that subjective quality was significantly higher in the 100 kV group in the middle and distal segment classes. Our study shows that, when using 320-row CCTA with an AEC strategy, a 100 kV tube voltage protocol is preferable because, compared with a 120 kV setting, it significantly improves both subjective and objective image quality at a comparable mean effective radiation dose.
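
    The objective metrics follow standard region-of-interest (ROI) definitions; the sketch below uses the common conventions (noise as the standard deviation of a background ROI), since the exact ROI placement is not specified in the abstract.

      import numpy as np

      def objective_quality(vessel_roi, background_roi):
          """Noise (N), SNR and CNR from ROI pixel values in HU:
          N = SD of background, SNR = mean(vessel) / N,
          CNR = (mean(vessel) - mean(background)) / N."""
          noise = np.std(background_roi)
          snr = np.mean(vessel_roi) / noise
          cnr = (np.mean(vessel_roi) - np.mean(background_roi)) / noise
          return noise, snr, cnr

      # Hypothetical ROIs: contrast-filled aortic root vs perivascular fat.
      rng = np.random.default_rng(0)
      vessel = rng.normal(520.0, 29.0, size=200)
      background = rng.normal(-80.0, 29.0, size=200)
      n, snr, cnr = objective_quality(vessel, background)
      print(f"N = {n:.1f} HU, SNR = {snr:.1f}, CNR = {cnr:.1f}")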

  6. Noninvasive coronary angiography by 320-row computed tomography with lower radiation exposure and maintained diagnostic accuracy: comparison of results with cardiac catheterization in a head-to-head pilot investigation.

    PubMed

    Dewey, Marc; Zimmermann, Elke; Deissenrieder, Florian; Laule, Michael; Dübel, Hans-Peter; Schlattmann, Peter; Knebel, Fabian; Rutsch, Wolfgang; Hamm, Bernd

    2009-09-08

    Noninvasive coronary angiography with the use of multislice computed tomography (CT) scanners is feasible with high sensitivity and negative predictive value; however, the radiation exposure associated with this technique is rather high. We evaluated coronary angiography using whole-heart 320-row CT, which avoids exposure-intensive overscanning and overranging. A total of 30 consecutive patients with suspected coronary artery disease referred for clinically indicated conventional coronary angiography (CCA) were included in this prospective intention-to-diagnose study. CT was performed with the use of up to 320 simultaneous detector rows before same-day CCA, which, together with quantitative analysis, served as the reference standard. The per-patient sensitivity and specificity for CT compared with CCA were 100% (95% confidence interval [CI], 72 to 100) and 94% (95% CI, 73 to 100), respectively. Per-vessel versus per-segment sensitivity and specificity were 89% (95% CI, 62 to 98) and 96% (95% CI, 90 to 99) versus 78% (95% CI, 56 to 91) and 98% (95% CI, 96 to 99), respectively. Interobserver agreement between the 2 readers was significantly better for CCA (97% of 121 coronary arteries) than for CT (90%; P=0.04). Percent diameter stenosis determined with the use of CT showed good correlation with CCA (P<0.001, R=0.81) without significant underestimation or overestimation (−3.1 ± 24.4%; P=0.08). Intraindividual comparison of CT with CCA revealed a significantly smaller effective radiation dose (median, 4.2 versus 8.5 mSv; P<0.05) and amount of contrast agent required (median, 80 versus 111 mL; P<0.001) for 320-row CT. The majority of patients (87%) indicated that they would prefer CT over CCA for future diagnostic imaging (P<0.001). CT with the use of emerging technology has the potential to significantly reduce the radiation dose and amount of contrast agent required compared with CCA while maintaining high diagnostic accuracy.
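
    The reported intervals are consistent with exact binomial (Clopper-Pearson) limits on small counts. A sketch of that calculation follows; the 2x2 counts are hypothetical values chosen to be consistent with the reported percentages, since the abstract does not give the raw table.

      from scipy.stats import beta

      def clopper_pearson(x, n, alpha=0.05):
          """Exact (Clopper-Pearson) binomial confidence interval."""
          lo = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
          hi = beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
          return lo, hi

      # Hypothetical per-patient counts (not reported in the abstract):
      tp, fn = 11, 0  # disease on CCA, all detected by CT -> sens 100%
      tn, fp = 17, 1  # no disease, one CT false positive  -> spec 94%

      for name, x, n in [("sensitivity", tp, tp + fn),
                         ("specificity", tn, tn + fp)]:
          lo, hi = clopper_pearson(x, n)
          print(f"{name}: {x}/{n} = {x/n:.0%} (95% CI {lo:.0%} to {hi:.0%})")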

  7. Australian diagnostic reference levels for multi detector computed tomography.

    PubMed

    Hayton, Anna; Wallace, Anthony; Marks, Paul; Edmonds, Keith; Tingey, David; Johnston, Peter

    2013-03-01

    The Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) is undertaking web-based surveys to obtain data to establish national diagnostic reference levels (DRLs) for diagnostic imaging. The first set of DRLs to be established is for multi detector computed tomography (MDCT). The survey samples MDCT dosimetry metrics, dose length product (DLP, mGy·cm) and volume computed tomography dose index (CTDIvol, mGy), for six common protocols/habitus: Head, Neck, Chest, AbdoPelvis, ChestAbdoPelvis and Lumbar Spine, from individual radiology clinics and platforms. A practice reference level (PRL) for a given platform and protocol is calculated from a compliant survey containing data collected from at least ten patients; the PRL is defined as the median of the DLP/CTDIvol values for a single compliant survey. Australian National DRLs are defined as the 75th percentile of the distribution of the PRLs for each protocol and age group. Australian National DRLs for adult MDCT have been determined in terms of DLP and CTDIvol. In terms of DLP, the national DRLs are 1,000, 600, 450, 700, 1,200, and 900 mGy·cm for the Head, Neck, Chest, AbdoPelvis, ChestAbdoPelvis and Lumbar Spine protocols, respectively. Average dose values obtained from the European survey Dose Datamed I reveal Australian doses to be higher by comparison for four of the six protocols. The survey is ongoing, allowing practices to optimise dose delivery as well as allowing the periodic update of DRLs to reflect changes in technology and technique.
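
    The PRL-to-DRL derivation is algorithmic: each compliant survey (at least ten patients) contributes its median as a practice reference level, and the national DRL is the 75th percentile of those PRLs. A minimal sketch with hypothetical survey data:

      import numpy as np

      def national_drl(surveys, min_patients=10):
          """DRL from per-platform surveys of patient DLP (mGy·cm).

          Each compliant survey (>= min_patients entries) contributes
          its median as a practice reference level (PRL); the DRL is
          the 75th percentile of the PRL distribution."""
          prls = [np.median(s) for s in surveys if len(s) >= min_patients]
          return np.percentile(prls, 75)

      # Hypothetical Head-protocol DLP surveys from four platforms.
      surveys = [
          [850, 910, 1020, 880, 990, 940, 870, 1010, 930, 900],
          [1100, 1150, 980, 1060, 1120, 1040, 1090, 1010, 1130, 1070],
          [760, 820, 790, 805, 830, 770, 815, 800, 825, 780],
          [980, 1005, 955, 990, 1015, 970, 1000, 960, 1020, 985],
      ]
      print(f"Head DRL: {national_drl(surveys):.0f} mGy·cm")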

  8. Myocardial perfusion 320-row multidetector computed tomography-guided treatment strategy for the clinical management of patients with recent acute-onset chest pain: Design of the CArdiac cT in the treatment of acute CHest pain (CATCH)-2 randomized controlled trial.

    PubMed

    Sørgaard, Mathias; Linde, Jesper J; Hove, Jens D; Petersen, Jan R; Jørgensen, Tem B S; Abdulla, Jawdat; Heitmann, Merete; Kragelund, Charlotte; Hansen, Thomas Fritz; Udholm, Patricia M; Pihl, Christian; Kühl, J Tobias; Engstrøm, Thomas; Jensen, Jan Skov; Høfsten, Dan E; Kelbæk, Henning; Kofoed, Klaus F

    2016-09-01

    Patients admitted with chest pain are a diagnostic challenge because the majority do not have coronary artery disease (CAD). Assessment of CAD with coronary computed tomography angiography (CCTA) is safe, cost-effective, and accurate, albeit with modest specificity. Stress myocardial computed tomography perfusion (CTP) has been shown to increase specificity when added to CCTA, without lowering sensitivity. This article describes the design of a randomized controlled trial, CATCH-2, comparing a clinical diagnostic management strategy of CCTA alone against CCTA in combination with CTP. Patients older than 50 years with acute-onset chest pain and at least one cardiovascular risk factor for CAD have been prospectively enrolled from 6 clinical sites since October 2013. A total of 600 patients will be included. Patients are randomized 1:1 to clinical management based on CCTA or on CCTA in combination with CTP, which determines the need for further testing with invasive coronary angiography, including measurement of fractional flow reserve in vessels with coronary artery lesions. Patients are scanned with a 320-row multidetector computed tomography scanner. Decisions to revascularize are taken by the invasive cardiologist independently of study allocation. The primary end point is the frequency of revascularization; secondary end points of clinical outcome are also recorded. CATCH-2 will determine whether CCTA in combination with CTP is diagnostically superior to CCTA alone in the management of patients with acute-onset chest pain.

  9. 320-row CT renal perfusion imaging in patients with aortic dissection: A preliminary study

    PubMed Central

    Liu, Dongting; Liu, Jiayi; Wen, Zhaoying; Li, Yu; Sun, Zhonghua; Xu, Qin; Fan, Zhanming

    2017-01-01

    Objective: To investigate the clinical value of renal perfusion imaging in patients with aortic dissection (AD) using 320-row computed tomography (CT), and to determine the relationship between renal CT perfusion findings and various features of aortic dissection. Methods: Forty-three patients with AD who underwent 320-row CT renal perfusion before operation were prospectively enrolled in this study. Diagnosis of AD was confirmed by transthoracic echocardiography. Blood flow (BF) of bilateral renal perfusion was measured and analyzed. CT perfusion imaging signs of AD in relation to the type of AD, the number of entry tears, and false lumen thrombus were observed and compared. Results: The BF values of patients with type A AD were significantly lower than those of patients with type B AD (P = 0.004). No significant difference was found in BF between different numbers of intimal tears (P = 0.288), but BF values were significantly higher in cases with a false lumen without thrombus and renal arteries arising from the true lumen than in those with thrombus (P = 0.036). The BF values of the true lumen, false lumen, and overriding groups differed (P = 0.02), with the true lumen group having the highest values. The difference in BF values between the true lumen and false lumen groups was statistically significant (P = 0.016), while no statistical significance was found between the other group pairs (P > 0.05). The larger the intimal entry tears, the greater the BF values (P = 0.044). Conclusions: This study shows a direct correlation between renal CT perfusion changes and AD, with the size and number of intimal tears, the type of AD, the renal artery origin, and false lumen thrombosis significantly affecting perfusion values. PMID:28182709

  10. Identification of Cardiac and Aortic Injuries in Trauma with Multi-detector Computed Tomography.

    PubMed

    Shergill, Arvind K; Maraj, Tishan; Barszczyk, Mark S; Cheung, Helen; Singh, Navneet; Zavodni, Anna E

    2015-01-01

    Blunt and penetrating cardiovascular (CV) injuries are associated with high morbidity and mortality. Rapid detection of these injuries in trauma is critical for patient survival. The advent of multi-detector computed tomography (MDCT) has led to increased detection of CV injuries during rapid comprehensive scanning of stabilized major trauma patients. MDCT can acquire images with high temporal and spatial resolution and can create multiplanar reformats. This pictorial review illustrates several common and life-threatening traumatic CV injuries from a regional trauma center.

  11. Derivation of Australian diagnostic reference levels for paediatric multi detector computed tomography.

    PubMed

    Hayton, Anna; Wallace, Anthony

    2016-09-01

    Australian national diagnostic reference levels (DRLs) for paediatric multi detector computed tomography were established for three protocols, Head, Chest and AbdoPelvis, across two age groups, Baby/Infant 0-4 years and Child 5-14 years, by the Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) in 2012. The establishment of Australian paediatric DRLs is an important step towards lowering patient CT doses on a national scale. While the adult DRLs were calculated from data collected through the web-based Australian National Diagnostic Reference Level Service, no paediatric data were submitted in the first year of service operation, so data from an independent Royal Australian and New Zealand College of Radiologists Quality Use of Diagnostic Imaging paediatric optimisation survey were used. The paediatric DRLs were defined for CTDIvol (mGy) and DLP (mGy·cm) values referenced to the 16 cm PMMA phantom for the Head protocol and the 32 cm PMMA phantom for body protocols, for both paediatric age groups. In terms of DLP, the Australian paediatric DRLs for the Head, Chest and AbdoPelvis protocols are, respectively, 470, 60 and 170 mGy·cm for the Baby/Infant age group, and 600, 110 and 390 mGy·cm for the Child age group. A comparison with published international paediatric DRLs for computed tomography reveals the Australian paediatric DRLs to be lower on average, although the comparison is complicated by misalignment of the defined age ranges. It is the intention of ARPANSA to review the paediatric DRLs in conjunction with a review of the adult DRLs, which should occur within 5 years of their publication.

  12. [Weighted-averaging multi-planar reconstruction method for multi-detector row computed tomography].

    PubMed

    Aizawa, Mitsuhiro; Nishikawa, Keiichi; Sasaki, Keita; Kobayashi, Norio; Yama, Mitsuru; Sano, Tsukasa; Murakami, Shin-ichi

    2012-01-01

    The development of multi-detector row computed tomography (MDCT) has enabled three-dimensional (3D) scanning with minute voxels. Minute voxels improve the spatial resolution of CT images but at the same time increase image noise. Multi-planar reconstruction (MPR) is an effective 3D image-processing technique. The conventional MPR technique can adjust the slice thickness of MPR images: when a thick slice is used, image noise decreases, but spatial resolution deteriorates. To deal with this trade-off, we have developed the weighted-averaging multi-planar reconstruction (W-MPR) technique to control the balance between spatial resolution and noise, with the weighted average determined by a Gaussian-type weighting function. In this study, we compared the performance of W-MPR with that of conventional simple-addition-averaging MPR. We confirmed that W-MPR can decrease image noise without significant deterioration of spatial resolution. W-MPR can freely adjust the weight for each slice by changing the shape of the weighting function and therefore allows us to select a proper balance of spatial resolution and noise while producing MPR images suitable for observation of targeted anatomical structures.
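
    The W-MPR idea reduces to replacing the simple slice average with a Gaussian-weighted one, where the width of the weighting function sets the resolution/noise balance. The sketch below follows that reading; the authors' exact parameterization is not given in the abstract.

      import numpy as np

      def weighted_mpr(volume, index, half_width=2, sigma=1.0):
          """Gaussian-weighted averaging of slices along axis 0.

          Small sigma approaches a single thin slice (sharp but noisy);
          large sigma approaches the plain thick-slice average
          (smooth but blurred)."""
          offsets = np.arange(-half_width, half_width + 1)
          weights = np.exp(-0.5 * (offsets / sigma) ** 2)
          weights /= weights.sum()                 # normalize to sum 1
          slices = volume[index + offsets]         # (2*half_width+1, H, W)
          return np.tensordot(weights, slices, axes=1)

      # Hypothetical noisy volume: 20 slices of 64 x 64 voxels.
      rng = np.random.default_rng(1)
      vol = 100.0 + rng.normal(0.0, 30.0, size=(20, 64, 64))
      plain = vol[8:13].mean(axis=0)               # simple 5-slice average
      wmpr = weighted_mpr(vol, index=10)
      print(f"noise: single slice = {vol[10].std():.1f}, "
            f"plain MPR = {plain.std():.1f}, W-MPR = {wmpr.std():.1f}")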

  13. Colonic perforation by a transmural and transvalvular migrated retained sponge: multi-detector computed tomography findings.

    PubMed

    Camera, Luigi; Sagnelli, Marco; Guadagno, Paolo; Mainenti, Pier Paolo; Marra, Teresa; Scotto di Santolo, Maria; Fei, Landino; Salvatore, Marco

    2014-04-21

    Transmural migrated retained sponges usually impact at the level of the ileo-cecal valve, leading to small bowel obstruction. Once past the ileo-cecal valve, a retained sponge can be propelled forward by peristaltic activity and eliminated with the feces. We report the case of a 52-year-old woman with a past surgical history and recurrent episodes of abdominal pain and constipation. On physical examination, generalized guarding was observed, with tenderness in the right flank. Contrast-enhanced multi-detector computed tomography findings were consistent with perforated right colonic diverticulitis, with several out-pouchings at the level of the ascending colon, free air in the right parieto-colic gutter, and an air-fluid collection within the mesentery. In addition, ring-shaped hyperdense intraluminal material was noted. At surgery, the ascending colon appeared irregularly thickened and folded, with a focal wall interruption and a peri-visceral abscess at the level of the hepatic flexure, but no diverticula were found. A right hemi-colectomy was performed, and on dissection of the surgical specimen a retained laparotomy sponge was found in the bowel lumen.

  14. Managing patient dose in multi-detector computed tomography (MDCT). ICRP Publication 102.

    PubMed

    Valentin, J

    2007-01-01

    Computed tomography (CT) technology has changed considerably in recent years with the introduction of increasing numbers of multiple detector arrays. Several parameters specific to multi-detector computed tomography (MDCT) scanners systematically increase or decrease patient dose compared with older single detector computed tomography (SDCT) scanners. This document briefly reviews MDCT technology; radiation dose in MDCT, including differences from SDCT and the factors that affect dose; radiation risks; and the responsibilities for patient dose management. The document recommends that users understand the relationship between patient dose and image quality and be aware that image quality in CT is often higher than necessary for diagnostic confidence. Automatic exposure control (AEC) does not totally free the operator from the selection of scan parameters, and awareness of individual systems is important. Scanning protocols cannot simply be transferred between scanners from different manufacturers and should be determined for each MDCT. If the image quality is appropriately specified by the user and suited to the clinical task, there will be a reduction in patient dose for most patients. Understanding of some parameters is not intuitive, and the selection of image quality parameter values in AEC systems is not straightforward. Examples of some clinical situations have been included to demonstrate dose management, e.g. CT examinations of the chest, the heart for coronary calcium quantification and non-invasive coronary angiography, colonography, the urinary tract, children, pregnant patients, trauma cases, and CT-guided interventions. CT is increasingly being used to replace conventional x-ray studies, and it is important that patient dose is given careful consideration, particularly with repeated or multiple examinations.

  15. Quadricuspid pulmonary valve in an adult patient identified by transthoracic echocardiography and multi-detector computed tomography.

    PubMed

    Jung, Soo-Yeon

    2015-01-01

    Quadricuspid pulmonary valve is a rare congenital heart disease. It is infrequently associated with significant clinical complications and tends to be clinically silent. Because of its benign nature, it has been diagnosed mainly post mortem. Its diagnosis by transthoracic echocardiography is very difficult because of the anatomical features. We describe a case of quadricuspid pulmonary valve diagnosed by transthoracic echocardiography and electrocardiography-gated multi-detector row computed tomography.

  16. Multiphase Multi-Detector Row Computed Tomography Imaging Characteristics of Large (>5 cm) Focal Hepatocellular Carcinoma.

    PubMed

    Blaschke, Eric M; Rao, Vijaya L; Xiong, Lingyun; Te, Helen S; Hart, John; Reddy, K Gautham; Oto, Aytekin

    2016-01-01

    The aim of this study was to describe the multiphase multi-detector row computed tomography (MDCT) imaging findings of large (>5 cm) focal hepatocellular carcinoma (HCC). Following review of the medical records of 321 patients with newly diagnosed HCC who underwent MDCT within the radiology database from January 2007 to November 2014, 27 patients (20 men and 7 women; mean age, 69 [SD, 10.1] years [range, 49-87 years]) with histologically confirmed HCC greater than 5 cm were included in this institutional review board-approved study. Multiphase, dedicated liver MDCT images of these cases were retrospectively reviewed by 2 radiologists in consensus to describe the enhancement characteristics of these lesions. Mean tumor diameter was 8.4 (SD, 2.4) cm (range, 5.2-13.5 cm). Cirrhosis was present in 16 (59%) of 27 patients. Seventeen (85%) of 20 patients with available laboratory data presented with elevated alpha-fetoprotein (median, 97 ng/mL). Twenty-three (85%) of 27 demonstrated either heterogeneous enhancement with gradual fill-in (14/27 [52%]) or peripheral enhancement with centripetal fill-in (9/27 [33%]). Twenty-two (81%) of 27 lacked washout on delayed phase images, and 21 (78%) of 27 demonstrated a pseudocapsule. Twenty-seven of 27 lesions were well defined, 8 (30%) of 27 were exophytic, 15 (56%) of 27 were unifocal, 5 (25%) of 20 cases demonstrated vascular invasion, and 7 (26%) of 27 cases presented with extrahepatic metastases. Large (>5 cm) focal HCC may present as a dominant mass with a pseudocapsule and initial heterogeneous or peripheral enhancement with gradual or centripetal fill-in without washout on multiphase MDCT. Awareness of this variant is important to allow distinction from other benign (eg, hemangioma) and malignant (eg, cholangiocarcinoma) focal liver lesions.

  17. Mapping epicardial fat with multi-detector computed tomography to facilitate percutaneous transepicardial arrhythmia ablation.

    PubMed

    Abbara, Suhny; Desai, Jay C; Cury, Ricardo C; Butler, Javed; Nieman, Koen; Reddy, Vivek

    2006-03-01

    A sizable portion of ventricular tachycardia circuits are epicardial, especially in patients with non-ischemic cardiomyopathy, e.g. Chagas disease. Thus there is growing interest among electrophysiologists in transepicardial mapping and myocardial ablation for the treatment of arrhythmias. However, increased epicardial fat can be a significant hindrance to procedural success, as it can mimic infarct during mapping and can also decrease the effectiveness of ablation. Quantitative knowledge of epicardial fat before the procedure can therefore significantly facilitate the conduct and outcomes of these procedures. In this study we assessed epicardial fat distribution and thickness in vivo in 59 patients who underwent multi-detector computed tomography (MDCT) for coronary artery assessment using a 16-slice scanner. Multiplanar reconstructions were obtained in the ventricular short axis at the basal, mid-ventricular, and near-apex levels, and in a four-chamber view. In the short-axis slices we measured epicardial fat thickness in nine segments, and in the four-chamber view in five segments. In grooved segments the maximum fat thickness was recorded, while in non-grooved segments thicknesses at three equally spaced points were averaged. The results, starting clockwise, were as follows (all measurements in mm, at the basal, mid-ventricular, and apical levels, respectively): superior inter-ventricular (IV) groove (11.2, 8.6, 7.3), left ventricular (LV) superior lateral wall (1.0, 1.5, 1.7), LV inferior lateral wall (1.3, 2.2, 3.5), inferior IV groove (9.2, 6.5, 6.1), right ventricular (RV) diaphragmatic wall (1.4, 0.2, 1.0), acute margin (9.2, 7.3, 7.8), RV anterior free wall inferior (6.8, 4.0, 4.7), RV anterior free wall superior (6.5, 3.2, 3.1), and RV superior wall (5.6, 2.7, 4.0). In the four-chamber view we measured the following segments: LV apex (2.8 mm), left atrio-ventricular (AV) groove (12.7), right AV groove (14.8), RV apex (4.8), and anterior IV groove (7

  18. Low-dose triple-rule-out using 320-row-detector volume MDCT: less contrast medium and lower radiation exposure.

    PubMed

    Durmus, Tahir; Rogalla, Patrik; Lembcke, Alexander; Mühler, Matthias R; Hamm, Bernd; Hein, Patrick A

    2011-07-01

    To investigate image quality of triple-rule-out (TRO) computed tomography (CT) using a 320-row-detector CT system with substantially reduced contrast medium volume at 100 kV. Forty-six consecutive patients with noncritical, acute chest pain underwent 320-row-detector CT using a two-step TRO protocol consisting of a non-spiral, non-gated chest CT acquisition (150 mA) followed by a non-spiral, electrocardiography-gated cardiac acquisition (200-500 mA, based on body mass index [BMI]). Data were acquired using a biphasic injection protocol with a total iodinated contrast medium volume of 60 ml (370 mg/ml). Vessel attenuation and effective doses were recorded. Image quality was scored independently by two readers. Mean attenuation was 584 ± 114 Hounsfield units (HU) in the ascending aorta, 335 ± 63 HU in the aortic arch, 658 ± 136 HU in the pulmonary trunk, and 521 ± 97 HU and 549 ± 102 HU in the right and left coronary artery, respectively. In all but one patient, attenuation and image quality allowed accurate visualization of the pulmonary arteries, thoracic aorta, and coronary arteries in a single examination. Ninety-six percent of all coronary artery segments were rated diagnostic. Radiation exposure ranged between 2.0 and 3.3 mSv. Using 320-row-detector CT, the investigated low-dose TRO protocol resulted in excellent opacification and image quality with a substantial reduction of contrast medium volume compared with recently published TRO protocols.

  19. Minimizing the acquisition phase in coronary CT angiography using the second-generation 320-row CT.

    PubMed

    Tomizawa, Nobuo; Kanno, Shigeaki; Maeda, Eriko; Akahane, Masaaki; Torigoe, Rumiko; Ohtomo, Kuni

    2014-07-01

    We aimed to compare the radiation dose and image quality of a minimal phase window centered at 77% with those of a wide phase window in coronary CT angiography using the second-generation 320-row CT. Eighty patients with heart rate ≤75 bpm were retrospectively included. The first 40 patients underwent scanning with a wide phase window (65-85%), while the last 40 patients underwent scanning with a minimal phase window centered at 77%. Subjective image quality was graded using a 4-point scale (4 = excellent). Image noise and contrast-to-noise ratio at the proximal segments were also analyzed. The mean effective dose was derived from the dose-length product multiplied by a chest conversion coefficient (κ = 0.014 mSv·mGy⁻¹·cm⁻¹). Minimal phase window scanning centered at 77% reduced the radiation dose by 30% compared with wide phase window scanning (1.7 vs 2.4 mSv, p = 0.0009). Subjective image quality showed no significant difference (3.75 vs 3.76, p = 0.77), and no significant difference was observed in image noise, CT number, or contrast-to-noise ratio. Radiation dose could thus be reduced while maintaining image quality by using a minimal phase window centered at 77% rather than a wide phase window in coronary CT angiography with the second-generation 320-row CT.
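
    The dose conversion is a one-line formula, E = κ · DLP. A quick back-calculation from the reported doses under the stated coefficient:

      # Effective dose from dose-length product with the chest
      # conversion coefficient used in the study.
      k = 0.014  # mSv per mGy·cm (the study's κ)

      # The reported 1.7 and 2.4 mSv arms correspond to DLPs of
      # roughly 121 and 171 mGy·cm, respectively.
      for dose_msv in (1.7, 2.4):
          print(f"{dose_msv} mSv -> DLP ≈ {dose_msv / k:.0f} mGy·cm")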

  20. Technical note: Electrocardiogram electrode repositioning for 320-row coronary CT angiography in patients with regular and recurrent premature ventricular contractions.

    PubMed

    Kondo, Takeshi; Matsutani, Hideyuki; Groarke, John; Takamura, Kazuhisa; Fujimoto, Shinichiro; Rybicki, Frank J; Kumamaru, Kanako K

    2014-01-01

    Arrhythmias can compromise image quality and increase radiation exposure during coronary CT angiography (CTA). However, premature ventricular contractions (PVCs) can occur in a predictable, recurrent, and regular pattern (ie, bigeminy, trigeminy, quadrigeminy) with post-PVC compensatory pauses. Electrocardiographic (ECG) electrode repositioning can achieve relative amplification of the R waves of PVCs compared with the R waves of sinus beats. This technical note describes how simple ECG electrode repositioning, combined with an absolute-delay strategy, facilitated selective triggering of image acquisition on the R waves of PVCs in 6 patients with PVC bigeminy or quadrigeminy at the time of 320-row coronary CTA. All 6 studies were single-heartbeat acquisitions with excellent image quality and a median effective radiation dose of 2.9 mSv (interquartile range, 2.1-3.8 mSv). By contrast, standard ECG electrode positions used for 2 patients with PVC bigeminy undergoing coronary CTA were associated with acquisition over 2 heartbeats and effective radiation doses of 6.8 and 10.3 mSv, respectively. In conclusion, ECG electrode repositioning combined with an absolute-delay strategy for regularly recurring PVCs, such as ventricular bigeminy, facilitates high image quality and a lower radiation dose during coronary CTA. This simple and straightforward technique can be considered for all patients with regular and recurrent PVCs undergoing coronary CTA.

  21. The effect of bolus viscosity on laryngeal closure in swallowing: kinematic analysis using 320-row area detector CT.

    PubMed

    Inamoto, Yoko; Saitoh, Eiichi; Okada, Sumiko; Kagaya, Hitoshi; Shibata, Seiko; Ota, Kikuo; Baba, Mikoto; Fujii, Naoko; Katada, Kazuhiro; Wattanapan, Pattra; Palmer, Jeffrey B

    2013-03-01

    The present study examined the effect of bolus viscosity on the onset of laryngeal closure (relative to hyoid elevation), the duration of laryngeal closure, and other key events of swallowing in ten healthy volunteers. All volunteers underwent 320-row area detector computed tomography swallow studies while swallowing 10 ml of honey-thick barium (5 % v/w) and thin barium (5 % v/w) in a 45° reclining position. Three-dimensional images of both consistencies were created in 29 phases at an interval of 0.10 s (100 ms) over a 2.90-s duration. The timing of the motions of the hyoid bone, soft palate, and epiglottis; the opening and closing of the laryngeal vestibule, true vocal cords (TVC), and pharyngoesophageal segment; and the bolus movement were measured and compared between the two consistencies. The result showed differing patterns of bolus movement for thin and thick liquids. With thin liquids, the bolus reached the hypopharynx earlier and stayed in the hypopharynx longer than with thick liquids. Among events of laryngeal closure, only the timing of TVC closure differed significantly between the two consistencies. With thin liquids, TVC closure started earlier and lasted longer than with thick liquids. This TVC movement could reflect a response to the faster flow of thin liquids. The results suggest that bolus viscosity alters the temporal characteristics of swallowing, especially closure of the TVC.

  22. Role of Computer Aided Diagnosis (CAD) in the detection of pulmonary nodules on 64 row multi detector computed tomography.

    PubMed

    Prakashini, K; Babu, Satish; Rajgopal, K V; Kokila, K Raja

    2016-01-01

    To determine the overall performance of an existing CAD algorithm with thin-section computed tomography (CT) in the detection of pulmonary nodules and to evaluate detection sensitivity across a range of nodule densities, sizes, and locations. A cross-sectional prospective study was conducted on 20 patients with 322 suspected nodules who underwent diagnostic chest imaging using 64-row multi-detector CT. The examinations were evaluated on reconstructed images of 1.4 mm thickness and 0.7 mm interval. Detection of pulmonary nodules, initially by a radiologist with 2 years of experience (RAD) and later by CAD lung nodule software, was assessed; CAD nodule candidates were then accepted or rejected accordingly. Detected nodules were classified by size, density, and location. The performance of the RAD and the CAD system was compared with the gold standard, namely true nodules confirmed by the consensus of a senior radiologist and CAD together. The overall sensitivity and false-positive (FP) rate of the CAD software were calculated. Of the 322 suspected nodules, 221 were classified as true nodules on the consensus of the senior radiologist and CAD together. Of the true nodules, 206 (93.2%) were detected by the RAD and 202 (91.4%) by the CAD. CAD and RAD together picked up more nodules than either alone. Overall sensitivity for nodule detection with the CAD program was 91.4%, and FP detection per patient was 5.5%. The CAD showed comparatively higher sensitivity for nodules of 4-10 mm (93.4%) and for nodules in hilar (100%) and central (96.5%) locations when compared with the RAD's performance. CAD performance was high in detecting pulmonary nodules, including small and low-density nodules. Even with a relatively high FP rate, CAD assists and improves the RAD's performance as a second reader, especially for nodules located in the central and hilar regions and for small nodules, while saving the RAD's time.

  23. Adrenal lesions: spectrum of imaging findings with emphasis on multi-detector computed tomography and magnetic resonance imaging.

    PubMed

    Guerrisi, Antonino; Marin, Daniele; Baski, Mahbubeh; Guerrisi, Pietro; Capozza, Federica; Catalano, Carlo

    2013-01-01

    The adrenal gland is a common site of a large spectrum of abnormalities, such as primary tumors, hemorrhage, metastases, and enlargement of the gland from external hormonal stimulation. Most of these lesions represent nonfunctioning adrenal adenomas and thus warrant conservative management. Multi-detector computed tomography (CT) and magnetic resonance (MR) imaging are still considered highly specific and complementary techniques for the detection and characterization of adrenal abnormalities. Radiologists can establish a definitive diagnosis for most adrenal masses (i.e., carcinoma, hemorrhage) based on imaging alone. Imaging can therefore differentiate malignant lesions from benign ones and avoid unnecessarily aggressive management of benign lesions. This article gives an overview of adrenal lesions and their imaging characteristics on CT and MR imaging.

  24. A Study of Internal Thoracic Arteriovenous Principal Perforators by Using Multi-detector Row Computed Tomography Angiography.

    PubMed

    Okumura, Ko; Hashikawa, Kazunobu; Sakakibara, Shunsuke; Onishi, Hiroyuki; Terashi, Hiroto

    2016-01-01

    There are numerous reports of perforating branches arising from the intercostal spaces of the internal thoracic vessels. These branches have varying diameters, and a main perforating branch, the principal perforator, is most often found in the second or third intercostal space. We report different results based on multi-detector row computed tomography. We evaluated 121 sides from 70 women scheduled for breast reconstruction with free lower abdominal skin flaps who underwent preoperative multi-detector row computed tomographic scanning between June 2008 and June 2015. For primary reconstruction we analyzed both sides, and for 1-sided secondary reconstruction we analyzed only the unaffected side. We evaluated early arterial phase and late venous phase 5-mm horizontal cross-sectional and volume-rendering images for perforation sites and for the thickness of the internal thoracic arteriovenous perforating branches in each intercostal space. We analyzed differences in thickness between the internal thoracic arteries and veins, and symmetry in cases involving both sides. Venous principal perforators nearly always perforated the same intercostal spaces as the accompanying veins of arterial principal perforators (99.2%), forming arteriovenous principal perforators. We found 49 principal perforators in the first intercostal space (37.4%), 52 in the second (39.7%), 23 in the third (17.6%), 6 in the fourth (4.6%), and 1 in the fifth (0.7%). Of the 51 cases in which we studied both sides, 25 (49%) had principal perforators with bilateral symmetry. In contrast to past reports, we found that internal thoracic arteriovenous principal perforators were present in almost equal numbers in the first and second intercostal spaces.

  25. Time Efficiency and Diagnostic Accuracy of New Automated Myocardial Perfusion Analysis Software in 320-Row CT Cardiac Imaging

    PubMed Central

    Rief, Matthias; Stenzel, Fabian; Kranz, Anisha; Schlattmann, Peter

    2013-01-01

    Objective: We aimed to evaluate the time efficiency and diagnostic accuracy of automated myocardial computed tomography perfusion (CTP) image analysis software. Materials and Methods: 320-row CTP was performed in 30 patients, and analyses were conducted independently by three different blinded readers using two recent software releases (version 4.6 and novel version 4.71GR001, Toshiba, Tokyo, Japan). Analysis times were compared, and automated epi- and endocardial contour detection was subjectively rated in five categories (excellent, good, fair, poor and very poor). As semi-quantitative perfusion parameters, myocardial attenuation and transmural perfusion ratio (TPR) were calculated for each myocardial segment, and agreement was tested using the intraclass correlation coefficient (ICC). Conventional coronary angiography served as the reference standard. Results: The analysis time was significantly reduced with the novel automated software version as compared with the former release (Reader 1: 43:08 ± 11:39 min vs. 09:47 ± 04:51 min; Reader 2: 42:07 ± 06:44 min vs. 09:42 ± 02:50 min; Reader 3: 21:38 ± 3:44 min vs. 07:34 ± 02:12 min; p < 0.001 for all). Epi- and endocardial contour detection with the novel software was rated significantly better (p < 0.001) than with the former software. ICCs demonstrated strong agreement (≥ 0.75) for myocardial attenuation in 93% and for TPR in 82% of cases. Diagnostic accuracy of the two software versions against conventional coronary angiography was not significantly different (p = 0.169). Conclusion: The novel automated CTP analysis software offers enhanced time efficiency, with an improvement by a factor of about four, while maintaining diagnostic accuracy. PMID:23323027

  26. Evaluation of the pelvic apophysis with multi-detector computed tomography for legal age estimation in living individuals

    PubMed Central

    Karami, Mehdi; Rabiei, Meisam; Riahinezhad, Maryam

    2015-01-01

    Background: Legal age estimation in living individuals is gaining increasing importance for radiologists involved in delivering expert opinions. The present study aimed to assess the correlation between chronological age and the distance of the apophyseal centers from the pelvic bone. Materials and Methods: This was a cross-sectional study carried out in 2013. Subjects were chosen from people aged 15-25 years who had undergone pelvic multi-detector computed tomography for any reason. The distance of the iliac crest apophysis to the iliac bone, and of the pubic apophysis to the pubic bone, was assessed. Results: There was an inverse linear correlation between chronological age and the distance of the iliac crest apophysis (P < 0.001, r = 0.899) and of the pubic apophysis (P < 0.001, r = 0.898) to the pelvic bone. The pubic apophysis had not appeared in subjects younger than 16 years and was present in all subjects aged 18 years and older. Subjects aged 21 years showed near-complete ossification of the iliac or pubic apophysis, and subjects aged 24 years showed full ossification. Conclusion: Skeletal age can be estimated by assessing the distance of the apophyseal centers from the pelvic bone in adolescents 15-25 years old. PMID:26109964
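
    An inverse linear relationship like this can be turned into an age estimator with a simple regression fit. A minimal sketch with hypothetical paired data, since the abstract reports only the correlation coefficients:

      import numpy as np

      # Hypothetical pairs of chronological age (years) and iliac-crest
      # apophysis distance (mm); the study's raw data are not given.
      age_years = np.array([15.2, 16.8, 18.1, 19.5, 21.0, 22.4, 24.0])
      distance_mm = np.array([6.1, 5.0, 3.8, 2.9, 1.6, 0.8, 0.0])

      # Fit age as a linear function of distance, then use it to estimate.
      slope, intercept = np.polyfit(distance_mm, age_years, 1)

      def estimate_age(distance):
          """Estimate skeletal age (years) from apophyseal distance (mm)."""
          return slope * distance + intercept

      print(f"distance 3.0 mm -> estimated age {estimate_age(3.0):.1f} years")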

  27. Evaluation of renal vascular anatomy in live renal donors: Role of multi detector computed tomography.

    PubMed

    Pandya, Vaidehi Kumudchandra; Patel, Alpeshkumar Shakerlal; Sutariya, Harsh Chandrakant; Gandhi, Shruti Pradipkumar

    2016-01-01

    Evaluation of renal vascular variations is important in renal donors to avoid vascular complications during surgery. Venous variations, mainly resulting from errors of embryological development, are frequently observed. This retrospective cross-sectional study aimed to investigate renal vascular variants with multidetector computed tomography (MDCT) angiography, to provide valuable information for surgery, and to correlate the findings with surgical observations. A total of 200 patients underwent MDCT angiography as a routine work-up for live renal donation. The number, course, and drainage patterns of the renal veins were retrospectively observed from the scans. Anomalies of the renal veins and inferior vena cava (IVC) were recorded and classified. Multiplanar reformations (MPRs), maximum intensity projections, and volume rendering were used for analysis, and the results were correlated with surgical findings. Of the 200 healthy donors, the standard pattern of renal vein drainage was observed in only 67% on the right side and 92% on the left side. Supernumerary renal veins, in the form of dual and triple renal veins, were seen on the right side in 32.5% of donors (dual right renal veins in 30.5% of cases and triple right renal veins in 2.5%). Variations on the left side were classified into four groups: supernumerary, retro-aortic, circumaortic, and plexiform left renal veins, in 1%, 2.5%, 4%, and 0.5% of cases, respectively. Developmental variations in the renal veins are easily detected on computed tomography but can go unnoticed and, if undiagnosed, pose a fatal threat during major surgeries such as donor nephrectomy in otherwise healthy donors.

  28. Criteria for establishing shielding of multi-detector computed tomography (MDCT) rooms.

    PubMed

    Verdun, F R; Aroua, A; Baechler, S; Schmidt, S; Trueb, P R; Bochud, F O

    2010-01-01

    The aim of this work is to compare two methods used for determining the proper shielding of computed tomography (CT) rooms while considering recent technological advances in CT scanners. The approaches of the German Institute for Standardisation and the US National Council on Radiation Protection and Measurements were compared, and a series of radiation measurements was performed in several CT rooms at the Lausanne University Hospital. The following three-step procedure is proposed for assuring sufficient shielding of rooms hosting new CT units with spiral mode acquisition and various X-ray beam collimation widths: (1) calculate the ambient equivalent dose for a representative average weekly dose length product at the position where shielding is required; (2) from the maximum permissible weekly dose at the location of interest, calculate the transmission factor F needed to ensure proper shielding; and (3) convert the transmission factor into a thickness of lead shielding. A similar approach could be adopted when designing shielding for fluoroscopy rooms, where the basic quantity would be the dose-area product instead of the tube-current load (milliampere-minutes).
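
    The three-step procedure maps directly onto a small calculation. The sketch below uses placeholder numbers throughout: the workload, scatter coefficient, weekly dose limit, and lead attenuation are illustrative assumptions, not values from the paper or the cited standards.

      import math

      weekly_dlp = 12000.0   # mGy·cm, representative weekly workload
      kappa = 3.0e-4         # scattered dose (mGy per mGy·cm) at 1 m
      distance_m = 2.5       # isocenter to the point needing shielding
      limit_msv = 0.02       # permissible weekly dose at that point

      # Step 1: unshielded ambient equivalent dose at the point
      # (inverse-square fall-off; 1 mGy taken as ~1 mSv here).
      h_unshielded = kappa * weekly_dlp / distance_m ** 2  # mSv/week

      # Step 2: required transmission factor F.
      F = limit_msv / h_unshielded

      # Step 3: convert F into lead thickness via a tenth-value layer
      # (TVL); 0.9 mm Pb is a placeholder for CT scatter.
      tvl_mm_pb = 0.9
      thickness_mm = max(0.0, -tvl_mm_pb * math.log10(F))
      print(f"H = {h_unshielded:.2f} mSv/week, F = {F:.3f}, "
            f"lead ≈ {thickness_mm:.1f} mm")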

  29. The small foramina of the orbit and periorbital region: assessment with multi detector computed tomography.

    PubMed

    Gufler, Hubert; Preiss, Markus; Koesling, Sabrina

    2015-12-01

    Familiarity with the variants of the foramina of the orbit and periorbital region is important when planning anesthesiological blocks and during orbital and maxillofacial surgery to avoid damage to nerves and vessels. The aim of this study was to assess the visibility and the incidence of variants of the small foramina of the orbit with multidetector computed tomography (MDCT). The MDCT scans of 400 orbits from 200 patients were evaluated retrospectively. Slice thickness of the reconstructed images was in the range of 0.5-1.0 mm. The visibility and variants of the foramen supraorbitale, foramen infraorbitale, foramen zygomaticofaciale, foramen ethmoidale anterius et posterius, and foramen cranio-orbitale were assessed using three-dimensional reconstruction tools. The foramen infraorbitale (100%; n = 400), foramen supraorbitale (99.5%; n = 398), foramen zygomaticofaciale (76.5%; n = 307), and foramen zygomatico-orbitale (74.5%; n = 298) were most reliably detected by MDCT, while the foramen ethmoidale anterius (58.7%; n = 235) et posterius (56.7%; n = 225) were depicted less frequently. The foramen cranio-orbitale could not be identified in any case. Doubling was found for the foramen supraorbitale in 3.25% (n = 13), the foramen infraorbitale in 1.75% (n = 7), the foramen zygomaticofaciale in 16% (n = 64), and the foramen zygomatico-orbitale in 14% (n = 56). Three foramina zygomatico-orbitale and three foramina infraorbitale were found in 1.5% (n = 6) and 0.5% (n = 2) of orbits, respectively. The foramina supraorbitale, infraorbitale, zygomatico-orbitale, and zygomaticofaciale and their variants are well visible on MDCT, and knowledge of the exact number of these small foramina is relevant for preoperative evaluation.

  13. Denver screening protocol for blunt cerebrovascular injury reduces the use of multi-detector computed tomography angiography.

    PubMed

    Beliaev, Andrei M; Barber, P Alan; Marshall, Roger J; Civil, Ian

    2014-06-01

    Blunt cerebrovascular injury (BCVI) occurs in 0.2-2.7% of blunt trauma patients and carries up to 30% mortality. Conventional screening fails to recognize up to 20% of BCVI patients. To improve the diagnosis of BCVI, both an expanded battery of screening criteria and multi-detector computed tomography angiography (CTA) have been suggested. The aim of this study was to investigate whether restricting CTA to Denver protocol screen-positive patients would reduce the unnecessary use of CTA as a pre-emptive screening tool. This is a registry-based study of blunt trauma patients admitted to Auckland City Hospital from 1998 to 2012. The diagnosis of BCVI was confirmed or excluded with CTA, magnetic resonance angiography and, if these studies were inconclusive, four-vessel digital subtraction angiography. Thirty (61%) BCVI and 19 (39%) non-BCVI patients met the eligibility criteria. The Denver protocol applied to our cohort of patients had a sensitivity of 97% (95% confidence interval (CI): 83-100%) and a specificity of 42% (95% CI: 20-67%). With a prevalence of BCVI in blunt trauma patients of 0.2% and 2.7%, the post-test odds of a screen-positive test were 0.03 (95% CI: 0.002-0.005) and 0.046 (95% CI: 0.314-0.068), respectively. Applying CTA only to Denver protocol screen-positive trauma patients can decrease the use of CTA as a pre-emptive screening tool by 95-97% and reduce its hazards. © 2013 Royal Australasian College of Surgeons.
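    The post-test odds quoted above follow from Bayes' rule: the positive likelihood ratio is sensitivity/(1 - specificity), and the post-test odds are the pre-test odds multiplied by that ratio. A minimal sketch using the sensitivity and specificity reported in the abstract (the value for 2.7% prevalence reproduces the reported 0.046):

        def post_test_odds(prevalence, sensitivity, specificity):
            """Post-test odds of disease given a positive screen (Bayes' rule)."""
            pretest_odds = prevalence / (1.0 - prevalence)
            lr_positive = sensitivity / (1.0 - specificity)  # positive likelihood ratio
            return pretest_odds * lr_positive

        # Denver protocol figures from the abstract: sensitivity 97%, specificity 42%.
        for prevalence in (0.002, 0.027):
            odds = post_test_odds(prevalence, sensitivity=0.97, specificity=0.42)
            print(f"prevalence {prevalence:.1%}: post-test odds {odds:.3f}")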

  14. Diagnostic nomogram for gallbladder wall thickening mimicking malignancy: using contrast-enhanced ultrasonography or multi-detector computed tomography?

    PubMed

    Chen, Li-Da; Huang, Yang; Xie, Xiao-Hua; Chen, Wei; Shan, Quan-Yuan; Xu, Ming; Liu, Jin-Ya; Nie, Zhi-Qiang; Xie, Xiao-Yan; Lu, Ming-De; Shen, Shun-Li; Wang, Wei

    2017-04-26

    To establish a diagnostic nomogram using contrast-enhanced ultrasonography (CEUS) for gallbladder wall thickening mimicking malignancy and to compare it with multi-detector computed tomography (MDCT). Seventy-two patients with gallbladder wall thickening on B-mode ultrasonography (BUS) were examined by CEUS to identify independent predictors of gallbladder carcinoma. Among the 72 cases, 48 patients underwent both CEUS and MDCT. The diagnostic performances of different sets of CEUS criteria and of MDCT were compared, and a prediction model of malignancy using CEUS was developed. The performance of the nomogram was assessed with respect to its calibration, discrimination, and clinical usefulness. Multivariate logistic regression indicated that inhomogeneous enhancement in the arterial phase was the strongest independent predictor of malignancy (odds ratio, OR 51.162), followed by interrupted inner layer (OR 19.788), washout time ≤40 s (OR 16.686), and wall thickness >1.6 cm (OR 3.019), all of which were selected into the nomogram. Combining these significant features, the diagnostic performance of CEUS (AUC = 0.917) was higher than that of MDCT (AUC = 0.788, P = 0.070). The predictive model using CEUS showed good discrimination, with a concordance index of 0.974 (0.950 on internal validation), and good calibration. Decision curve analysis demonstrated that the nomogram was clinically useful. CEUS could accurately differentiate between malignant and benign gallbladder wall thickening, with efficacy equivalent to MDCT. The proposed nomogram could be conveniently used to facilitate preoperative individualized prediction of malignancy in patients with gallbladder wall thickening.
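    A nomogram of this kind is a graphical front end to a logistic regression model, so the reported odds ratios translate directly into coefficients of a linear predictor. The sketch below shows that translation; the intercept is a hypothetical placeholder, since the abstract reports only the odds ratios:

        import math

        # ln(OR) for each binary predictor reported in the abstract.
        COEFS = {
            "inhomogeneous_arterial_enhancement": math.log(51.162),
            "interrupted_inner_layer": math.log(19.788),
            "washout_time_le_40s": math.log(16.686),
            "wall_thickness_gt_1_6cm": math.log(3.019),
        }
        INTERCEPT = -4.0  # hypothetical; not reported in the abstract

        def malignancy_probability(features):
            """features: dict of 0/1 indicators keyed as in COEFS."""
            logit = INTERCEPT + sum(COEFS[k] * features.get(k, 0) for k in COEFS)
            return 1.0 / (1.0 + math.exp(-logit))

        # A wall with inhomogeneous arterial enhancement and an interrupted inner layer.
        print(malignancy_probability({"inhomogeneous_arterial_enhancement": 1,
                                      "interrupted_inner_layer": 1}))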

  15. Comparison of coronary plaque subtypes in male and female patients using 320-row MDCTA.

    PubMed

    Khosa, Faisal; Khan, Atif N; Nasir, Khurram; Bedayat, Arash; Malik, Zehra; Jon, Ali F; Cheema, Ahmad R; Clouse, Melvin E; Welty, Francine K

    2013-02-01

    The aim was to determine differences in plaque subtype and volume between male and female patients with obstructive and non-obstructive CAD using 320-row MDCTA. 128 patients with suspected CAD underwent MDCTA. All studies were divided into two groups based on disease severity: 0-70% stenosis (non-obstructive CAD) and >70% stenosis (obstructive CAD). All were compared for plaque quantity and subtype by gender. The main arteries, RCA, LM, LAD, and LCX, were analyzed using Vitrea 5.2 software to quantify fatty, fibrous, and calcified plaque. Thresholds for coronary plaque quantification (volume in mm³) were preset at 35 ± 12 HU for fatty, 90 ± 24 HU for fibrous, and >130 HU for calcified/mixed plaque, and data were analyzed using STATA software. Total plaque burden in 118 patients [65 M, 53 F] was significantly higher in all arteries in males compared to females with non-obstructive disease. Total plaque volume for males vs. females was: RCA: 10.10 ± 5.02 mm³ vs. 6.89 ± 2.75 mm³, p = 0.001; LAD: 7.21 ± 3.38 mm³ vs. 5.89 ± 1.93 mm³, p = 0.04; LCX: 9.13 ± 3.27 mm³ vs. 7.16 ± 1.73 mm³, p = 0.002; LM: 15.13 ± 4.51 mm³ vs. 11.85 ± 4.03 mm³, p = 0.001. In sub-analyses, males had significantly more fibrous and fatty plaque in the LM, LAD, and LCX than females; in the RCA, only fibrous plaque was significantly greater in males. Calcified plaque volume was not significantly different between the genders. Only 8% of patients had obstructive CAD (>70% stenosis); in these, there was no significant difference in plaque volume or subtypes. In patients with non-obstructive CAD, males had significantly higher total coronary plaque volume, with a predominance of the fibrous and fatty subtypes, compared to females of the same age and BMI. There was no significant difference in plaque subtype or volume in patients with obstructive disease. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
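    The preset attenuation windows suggest a simple voxel-labelling rule; subtype volume is then the labelled voxel count times the voxel volume. A minimal sketch under that reading (how the software handles values falling between the windows is our assumption, not stated in the abstract):

        import numpy as np

        def classify_plaque(hu):
            """Label voxels with the preset windows: fatty 35 +/- 12 HU,
            fibrous 90 +/- 24 HU, calcified/mixed > 130 HU. Values between
            windows are left 'unclassified' (an assumption made here)."""
            hu = np.asarray(hu)
            labels = np.full(hu.shape, "unclassified", dtype=object)
            labels[(hu >= 23) & (hu <= 47)] = "fatty"
            labels[(hu >= 66) & (hu <= 114)] = "fibrous"
            labels[hu > 130] = "calcified/mixed"
            return labels

        # Subtype volume = voxel count x voxel volume (illustrative numbers).
        voxels_hu = np.array([30, 40, 95, 100, 150, 55])
        voxel_volume_mm3 = 0.1
        labels = classify_plaque(voxels_hu)
        for subtype in ("fatty", "fibrous", "calcified/mixed"):
            volume = (labels == subtype).sum() * voxel_volume_mm3
            print(f"{subtype}: {volume:.1f} mm^3")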

  16. Contribution of diffusion weighted MRI to diagnosis and staging in gastric tumors and comparison with multi-detector computed tomography

    PubMed Central

    Fatih Özbay, Mehmet; Çallı, İskan; Doğan, Erkan; Çelik, Sebahattin; Batur, Abdussamet; Bora, Aydın; Yavuz, Alpaslan; Bulut, Mehmet Deniz; Özgökçe, Mesut; Çetin Kotan, Mehmet

    2017-01-01

    Abstract Background The diagnostic performance of diffusion-weighted magnetic resonance imaging (DWI) and multi-detector computed tomography (MDCT) for TNM (tumor, lymph node, metastasis) staging of gastric cancer was compared. Patients and methods We used an axial T2-weighted imaging and DWI (b = 0, 400, and 800 s/mm²) protocol on 51 pre-operative patients who had been diagnosed with gastric cancer; all patients also underwent MDCT examinations. We looked for a signal increase in the series of DWI images. The depth of tumor invasion in the stomach wall (tumor (T) staging), the involvement of lymph nodes (nodal (N) staging), and the presence or absence of metastases (metastatic staging) on DWI and CT images were evaluated according to the TNM staging system. For each diagnosis, the sensitivity, specificity, and positive and negative predictive values of the DWI and MDCT examinations were determined by comparison with the results of surgical pathology, the gold standard method. Kappa statistics were used to assess the agreement of each examination with surgical pathology. Results The sensitivity and specificity of DWI and MDCT in lymph node staging were as follows: N1: DWI: 75.0%, 84.6%; MDCT: 66.7%, 82%; N2: DWI: 79.3%, 77.3%; MDCT: 69.0%, 68.2%; N3: DWI: 60.0%, 97.6%; MDCT: 50.0%, 90.2%. DWI agreed more closely with the gold standard (surgical pathology) than MDCT, especially in lymph node staging. In T staging, the agreement of both DWI and MDCT with the gold standard improved as the T stage increased, but DWI did not demonstrate superiority to MDCT. The sensitivity and specificity of both imaging techniques for detecting distant metastasis were 100%. Conclusions The diagnostic accuracy of DWI for TNM staging in gastric cancer before surgery is at a comparable level with MDCT, and adding DWI to the routine protocol for evaluating lymph node metastasis might increase diagnostic accuracy.
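    The per-stage figures above come from 2 x 2 tables against surgical pathology. For reference, a minimal sketch of the four metrics computed from such a table (the counts are illustrative; the abstract reports only the resulting percentages):

        def diagnostic_metrics(tp, fp, fn, tn):
            """Sensitivity, specificity, PPV, and NPV from a 2x2 table of an
            index test against the gold standard (surgical pathology)."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
            }

        # Illustrative counts only.
        print(diagnostic_metrics(tp=23, fp=5, fn=6, tn=17))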

  17. A reliable radiographic measurement for evaluation of normal distal tibiofibular syndesmosis: a multi-detector computed tomography study in adults.

    PubMed

    Chen, Yanxi; Qiang, Minfei; Zhang, Kun; Li, Haobo; Dai, Hao

    2015-01-01

    Syndesmotic injury may be difficult to diagnose, and radiological evaluation is very important. The purpose of this study was to offer a series of reliable and repeatable parameters of the normal distal tibiofibular syndesmosis for diagnosing syndesmotic injuries. Multi-detector computed tomography (MDCT) scans and radiographs of the distal tibiofibular syndesmosis in 484 cases were retrospectively reviewed. Relevant parameters included the tibiofibular clear space (TCS), the tibiofibular overlap (TFO), the depth of the incisura fibularis (IFD), and the height of the incisura fibularis (IFH), which were measured by novel three-dimensional (3-D) and two-dimensional (2-D) techniques. The distance between the measuring plane of the distal tibiofibular syndesmosis and the tibial plafond was also measured. Intra- and inter-rater reliability was assessed by the intraclass correlation coefficient (ICC), and the root-mean-square standard deviation (RMS-SD) was used to determine measurement precision. Sex differences in the parameters were analyzed using analysis of covariance (ANCOVA) with body height as the covariate, and paired-sample t-testing was used to compare parameters across image modalities, including radiography and 2-D and 3-D CT. The reliability of the 3-D image measurements (ICC range, 0.907 to 0.972) was greater than that of the 2-D axial image measurements (ICC range, 0.895 to 0.927) and the AP-view radiographic measurements (ICC range, 0.742 to 0.838). The intra-rater RMS-SD of the 3-D CT, 2-D CT, and radiographic measurements was less than 0.94 mm, 0.26 mm, and 2.87 mm, respectively. The measuring plane of the distal tibiofibular syndesmosis showed a sex difference: it was 12.1 mm proximal to the tibial plafond in the male group and 7.8 mm in the female group. In this plane, the parameters of the tibiofibular syndesmosis were measured in the different image modalities, and all variables were significantly different between females and males (p < 0.05). The 3-D measurement technique could be helpful in identifying injuries of the distal tibiofibular syndesmosis.
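    The RMS-SD used here as a precision index is conventionally the root mean square of the per-case standard deviations over repeated readings. A minimal sketch under that common definition (numbers illustrative):

        import numpy as np

        def rms_sd(measurements):
            """Precision as root-mean-square SD: 'measurements' is an
            (n_cases, n_repeats) array of repeated readings by one rater."""
            per_case_sd = np.std(measurements, axis=1, ddof=1)
            return float(np.sqrt(np.mean(per_case_sd ** 2)))

        # Three repeated TCS readings (mm) for four cases.
        readings = np.array([[4.1, 4.3, 4.2],
                             [3.8, 3.7, 3.9],
                             [5.0, 5.2, 5.1],
                             [4.6, 4.5, 4.7]])
        print(f"RMS-SD = {rms_sd(readings):.2f} mm")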

  18. 320-Row Detector Dynamic 4D-CTA for the Assessment of Brain and Spinal Cord Vascular Shunting Malformations. A Technical Note.

    PubMed

    D'Orazio, Federico; Splendiani, Alessandra; Gallucci, Massimo

    2014-12-01

    Shunting vascular malformations of the brain and spinal cord are traditionally studied using digital subtraction angiography (DSA), the current gold-standard imaging method, routinely used because of its favourable combination of spatial and temporal resolution. Because DSA is relatively expensive and time-consuming and carries a risk of silent embolic events and a small risk of transient or permanent neurologic deterioration, a non-invasive alternative angiographic method is of interest. New 320-row-detector CT scanners allow volumetric imaging of the whole brain with temporal resolution of up to approximately 3 Hz. These characteristics make computed tomography angiography (CTA) an affordable imaging method for studying the haemodynamics of the whole brain, and the technique can also be applied to limited portions of the spinal cord. The aim of this paper is to briefly summarize our experience in studying shunting vascular malformations of the brain and spinal cord using dynamic 4D-CTA, to explain the technical details of the studies performed at our institution, and to outline the major advantages and drawbacks of this new technique. We found that dynamic 4D-CTA is able to depict the main architectural characteristics of previously untreated vascular shunting malformations in both the brain and the spinal cord (i.e., their main arterial feeders and draining veins), allowing their correct diagnosis and exhaustive classification and limiting the use of DSA to therapeutic purposes.

  19. Surgically Cured, Relapsed Pneumococcal Meningitis Due to Bone Defects, Non-invasively Identified by Three-dimensional Multi-detector Computed Tomography

    PubMed Central

    Akimoto, Takayoshi; Morita, Akihiko; Shiobara, Keiji; Hara, Makoto; Minami, Masayuki; Shijo, Katsunori; Nomura, Yasuyuki; Shigihara, Shuntaro; Haradome, Hiroki; Abe, Osamu; Kamei, Satoshi

    2016-01-01

    A 43-year-old Japanese man presented with a history of bacterial meningitis (BM). He was admitted to our department with a one-day history of headache and was diagnosed with relapse of BM based on the cerebrospinal fluid findings. The conventional imaging studies showed serial findings suggesting left otitis media, a temporal cephalocele, and meningitis. Three-dimensional multi-detector computed tomography (3D-MDCT) showed left petrous bone defects caused by the otitis media, and curative surgical treatment was performed. Skull bone structural abnormalities should be considered a cause of relapsed BM. 3D-MDCT was useful for revealing the causal minimal bone abnormality and performing pre-surgical mapping. PMID:27980270

  20. Visualisation of passive middle ear implants by cone beam and multi-detector computed tomography: a comparative in vitro study.

    PubMed

    Nguyen, T D; Kösling, S; Mlynski, R; Plontke, S K

    2016-12-01

    Modern passive middle ear titanium prostheses are filigree structures, resulting in poorer depiction on CT compared with the prostheses used in the past. We compared the visibility of newer prostheses on cone beam CT (CBCT) with that on multi-detector CT (MDCT) at standard and lower dose in vitro, and analysed image noise and metal artefacts. Six different titanium middle ear prostheses (three partial and one total ossicular replacement prostheses, two stapes prostheses) were implanted twice in formalin-fixed head specimens: first correctly and then with displacement. Imaging was performed using standard CBCT and MDCT as well as MDCT with lower dose (36 single imaging investigations). Images were analysed with knowledge of the prosthesis types used, but blinded with respect to the positioning in the specific case. On all images, the type of prosthesis and its position could be clearly recognized. Identifiability, including prosthesis details, was rated statistically significantly higher for all CBCT investigations than for MDCT; MDCT with lower dose showed the worst results. No statistical differences were found in image noise or metal artefacts. If available, CBCT should be preferred over MDCT in the diagnostic evaluation of passive middle ear prostheses. • Middle ear prostheses have become more filigree, leading to poorer visibility on CT. • High spatial resolution and paraxial reconstructions are necessary requirements for imaging evaluation. • CBCT and MDCT can identify the type and positioning of titanium prostheses. • Metal artefacts play a minor role with filigree titanium prostheses. • Regarding visualisation of prosthesis details, cone beam CT aids the evaluation.

  1. [Extra-organic primary tumor in pelvis: correlation of multi-detector row computed tomography, anatomy and pathology].

    PubMed

    Dong, Zhihui; Yang, Zhigang; Li, Yuan; Min, Pengqiu; Zhang, Xiaochun

    2009-02-01

    The purpose of this study was to investigate the correlation between multi-detector row CT (MDCT) features, pathological findings, and the anatomic basis of extra-organic primary tumors in the pelvis, so as to improve the imaging diagnosis of these entities. We retrospectively analyzed the MDCT manifestations of 20 cases with surgically and/or pathologically proven diagnoses of extra-organic primary pelvic tumors. The tumors were confined to the pelvis in 14 cases and involved both the pelvis and the hypogastric zone in the remaining 6. Eight tumors were located in the peritoneal cavity of the pelvis, and 3 of them also involved the extraperitoneal space of the pelvis. In the peritoneal cavity, 2 tumors in male patients were located in the rectovesical pouch, while 3 tumors in female patients were located in the rectouterine pouch; the majority of the entities in these 2 pouches were germ cell tumors (3/5 cases, 60.0%). In the extraperitoneal space, 5 of 12 tumors were located in the pararectal space and 5 in the retrorectal space; the majority of these 10 tumors were germ cell tumors (7/10 cases, 70.0%). Lymphoma mainly involved the paravesical and pararectal spaces without a fixed pattern. Calcification occurred in 6 cases, including 4 cases of teratoma, 1 case of neurilemmoma, and 1 case of malignant teratoma. Fatty elements occurred in 7 masses, including 4 cases of teratoma, 1 case of malignant teratoma, 1 case of mixed germ cell tumor, and 1 case of liposarcoma. MDCT with multi-planar reconstruction (MPR) can clearly reveal the anatomic location of extra-organic primary pelvic tumors, depict their relationship with surrounding organs, and help to differentiate benign from malignant tumors.

  2. Flat-detector computed tomography in the assessment of intracranial stents: comparison with multi detector CT and conventional angiography in a new animal model.

    PubMed

    Struffert, Tobias; Ott, Sabine; Adamek, Edyta; Schwarz, Marc; Engelhorn, Tobias; Kloska, Stephan; Deuerling-Zheng, Yu; Doerfler, Arnd

    2011-08-01

    Careful follow-up is necessary after intracranial stenting because in-stent restenosis (ISR) or residual stenosis (RS) is not rare, and a minimally invasive follow-up imaging technique is desirable. The objective was to compare the visualisation of stents on flat-detector CT angiography (FD-CTA) after intravenous contrast medium injection with multi-detector computed tomography angiography (MD-CTA) and digital subtraction angiography (DSA) in an animal model. Stents were implanted in the carotid artery of 12 rabbits; in 6, a residual stenosis was surgically created. Imaging was performed using FD-CTA, MD-CTA, and DSA. Measurements of the inner and outer diameters and the cross-sectional area of the stents were performed, and the stenosis grade was calculated. In subjective evaluation, FD-CTA was superior to MD-CTA, and FD-CTA agreed more closely with DSA than did MD-CTA. The cross-sectional area of the stent lumen was significantly larger (p < 0.05) on FD-CTA than on MD-CTA. Accurate evaluation of stenosis was impossible with MD-CTA, whereas there was no statistically significant difference in the stenosis grade between DSA and FD-CTA. Our results show that visualisation of stent and stenosis using intravenous FD-CTA compares favourably with DSA and may replace DSA in the follow-up of patients treated with intracranial stents.

  3. Validation of the Australian diagnostic reference levels for paediatric multi detector computed tomography: a comparison of RANZCR QUDI data and subsequent NDRLS data from 2012 to 2015.

    PubMed

    Hayton, Anna; Wallace, Anthony; Thomas, Peter

    2017-03-01

    The National Diagnostic Reference Level Service (NDRLS) was launched in 2011; however, no paediatric data were submitted during its first calendar year of operation. As such, Australian national diagnostic reference levels (DRLs) for paediatric multi-detector computed tomography (MDCT) were established using data obtained from a Royal Australian and New Zealand College of Radiologists (RANZCR) Quality Use of Diagnostic Imaging (QUDI) study. Paediatric data were submitted to the NDRLS from 2012 through 2015. The NDRLS paediatric data have been analysed using the same method as was used to analyse the QUDI data when establishing the Australian national paediatric DRLs for MDCT, and also using the method used to calculate the Australian national adult DRLs for MDCT. A comparison between the QUDI data and the subsequent NDRLS data shows the NDRLS data to be lower on average for the Head and AbdoPelvis protocols and similar for the Chest protocol. Using an average of the NDRLS data submitted between 2012 and 2015, implications for updated paediatric DRLs are considered.
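    Diagnostic reference levels are conventionally set at the 75th percentile of the surveyed dose distribution; assuming that convention (the abstract does not spell out the percentile used), the core computation is brief:

        import numpy as np

        def drl(dose_values, percentile=75):
            """DRL as the 75th percentile of a surveyed dose distribution
            (the usual convention; assumed here, not stated in the abstract)."""
            return float(np.percentile(dose_values, percentile))

        # Illustrative DLP survey values (mGy.cm) for a paediatric head protocol.
        dlp_survey = [320, 410, 380, 290, 450, 360, 400, 340, 430, 370]
        print(f"DRL: {drl(dlp_survey):.0f} mGy.cm")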

  4. Three-dimensional demonstration of the lymphatic system in the lower extremities with multi-detector-row computed tomography: a study in a cadaver model.

    PubMed

    Yamazaki, Shun; Suami, Hiroo; Imanishi, Nobuaki; Aiso, Sadakazu; Yamada, Minoru; Jinzaki, Masahiro; Kuribayashi, Sachio; Chang, David W; Kishi, Kazuo

    2013-03-01

    Sentinel lymph node biopsy (SLNB) has had a great impact on the staging and treatment of cancer. The purpose of this study was to investigate the lymphatic anatomy of the lower extremities by constructing three-dimensional images using multi-detector-row computed tomography (MDCT). To select an appropriate contrast medium for MDCT lymphatic imaging in a cadaver, we tested four kinds of contrast media by injecting them into fresh swine kidneys. After the suitable contrast medium was selected, 10 lower extremities from 5 fresh cadavers were studied. After injection of the contrast medium, each lower extremity was scanned with high-spatial-resolution MDCT. The zinc oxide mixture was found to be the most appropriate contrast formula for MDCT imaging of cadaver lymphatics in terms of CT value and absence of extravasation. The high-resolution MDCT imaging revealed two different superficial lymphatic pathways in the legs. One lymphatic pathway, accompanying the great saphenous vein, had a constant course and was connected to the superficial inguinal lymph nodes; the other, along the small saphenous vein, was variable. Some of the deep lymphatic vessels bypassed the inguinal lymph nodes. Using this new protocol, we were able to construct three-dimensional images of the lower extremity lymphatics in a cadaver model. MDCT imaging provided novel information about two different superficial lymphatic pathways in the lower extremities. Copyright © 2013 Wiley Periodicals, Inc.

  5. Pulmonary vein morphology by free-breathing whole heart magnetic resonance imaging at 3 Tesla versus breathhold multi-detector computed tomography.

    PubMed

    Fodi, Eszter; McAreavey, Dorothea; Abd-Elmoniem, Khaled Z; Ohayon, Jacques; Saba, Magdi; Elagha, Abdalla; Pettigrew, Roderic I; Gharib, Ahmed M

    2013-04-01

    The aim was to compare pulmonary vein and left atrial anatomy using three-dimensional free-breathing whole-heart magnetic resonance imaging (MR) at 3 Tesla (T) and multi-detector computed tomography (MDCT). Thirty-three subjects (19 male; age 49 ± 12 years) underwent free-breathing 3T MR and contrast-enhanced MDCT during inspiratory breath-hold. Pulmonary vein parameters (ostial areas, diameters, angles) were measured. All pulmonary veins and anomalies were identified by both 3T MR and MDCT. The right-sided pulmonary veins were directed more posteriorly, the right superior pulmonary vein more inferiorly, and the right inferior pulmonary vein more superiorly on 3T MR compared with MDCT. The cross-sectional areas, perimeters, and minimum diameters of the right-sided pulmonary vein ostia were significantly larger by MR, as were the maximum diameters of the right and left inferior pulmonary veins. There was no significant difference between techniques in the distance to the first pulmonary vein branch. Pulmonary vein measurements thus demonstrated significant differences in angulation and dimensions when 3T MR was compared with MDCT; these differences likely represent hemodynamic and respiratory variation during free-breathing with MR versus breath-holding with MDCT. MR imaging at 3T during free-breathing offers an alternative method to define pulmonary vein and left atrial anatomy without exposure to radiation. Copyright © 2012 Wiley Periodicals, Inc.

  6. Non-invasive coronary angiography with multi-detector computed tomography: comparison to conventional X-ray angiography.

    PubMed

    Schoenhagen, Paul; Stillman, Arthur E; Halliburton, Sandy S; Kuzmiak, Stacie A; Painter, Tracy; White, Richard D

    2005-02-01

    Selective coronary angiography introduced clinical coronary imaging in the late 1950s. The angiographic identification of high-grade coronary lesions in patients with acute and chronic symptomatic coronary artery disease (CAD) led to the development of surgical and percutaneous coronary revascularization. However, the fact that CAD remains the major cause of death in North America and Europe demonstrates the need for novel, complementary diagnostic strategies. These are driven by the need to characterize both increasingly advanced disease stages and early, asymptomatic disease development. Complex revascularization techniques for patients with advanced disease stages will create a growing demand for 3-dimensional coronary imaging and for the integration of imaging modalities with new mechanical therapeutic devices. An emerging focus is atherosclerosis imaging, with the goal of identifying subclinical disease stages as the basis for pharmacological intervention aimed at disease stabilization or reversal. Non-invasive coronary imaging with multidetector computed tomographic angiography (MDCTA) allows both assessment of luminal stenosis and evaluation of subclinical disease of the arterial wall. Its complementary role in the assessment of early and advanced stages of CAD is increasingly recognized.

  7. Comparison between survey radiography, B-mode ultrasonography, contrast-enhanced ultrasonography and contrast-enhanced multi-detector computed tomography findings in dogs with acute abdominal signs.

    PubMed

    Shanaman, Miriam M; Schwarz, Tobias; Gal, Arnon; O'Brien, Robert T

    2013-01-01

    Contrast-enhanced multi-detector computed tomography (CE-MDCT) is used routinely in evaluating human patients with acute abdominal symptoms, whereas contrast-enhanced ultrasound (CEUS) remains in its infancy as it relates to evaluation of the acute abdomen. The purpose of this study was to compare survey radiography, B-mode ultrasound, CEUS, and CE-MDCT findings in canine patients presenting with acute abdominal signs, with a focus on the ability to differentiate surgical from non-surgical conditions. Nineteen dogs were prospectively enrolled. Inclusion required a clinical diagnosis of acute abdominal signs and a confirmed surgical or non-surgical cause for the clinical signs. Agreement for the majority of recorded imaging features was at least moderate; there was poor agreement in the identification of pneumoperitoneum and in the comparison of pancreatic lesion dimensions for B-mode vs. CEUS. The CT feature of fat stranding was detected in cases including, but not limited to, gastric neoplasia with perforation, pancreatitis, and small intestinal foreign body. Ultrasound underestimated the size and number of specific lesions when compared with CE-MDCT, whereas CEUS was successful in detecting bowel and pancreatic perfusion deficits that CE-MDCT failed to identify. Accuracy for differentiation of surgical vs. non-surgical conditions was high for all modalities: 100%, 94%, and 94% for CE-MDCT, ultrasonography, and survey radiography, respectively. Findings indicated that CE-MDCT is an accurate screening test for differentiating surgical from non-surgical acute abdominal conditions in dogs. Focused CEUS following CE-MDCT or B-mode ultrasonography may be beneficial for identifying potentially significant hypoperfused lesions. © 2013 Veterinary Radiology & Ultrasound.

  8. Perforated duodenal ulcer presenting with a subphrenic abscess revealed by plain abdominal X-ray films and confirmed by multi-detector computed tomography: a case report.

    PubMed

    Camera, Luigi; Calabrese, Milena; Romeo, Valeria; Scordino, Fabrizio; Mainenti, Pier Paolo; Clemente, Marco; Rapicano, Gaetano; Salvatore, Marco

    2013-11-11

    Peptic ulcer disease is still the major cause of gastrointestinal perforation, despite major improvements in both diagnostic and therapeutic strategies. While the diagnosis of a perforated ulcer is straightforward in typical cases, its clinical onset may be subtle because of comorbidities and/or concurrent therapies. We report the case of a 53-year-old Caucasian man with a history of chronic myeloid leukemia on maintenance therapy with imatinib (100 mg/day) who was found to have a subphrenic abscess resulting from a perforated duodenal ulcer that had been clinically overlooked. Our patient was febrile (38.5°C) with abdominal tenderness and hypoactive bowel sounds. On the plain abdominal X-ray films, a right subphrenic abscess could be seen. On contrast-enhanced multi-detector computed tomography, a huge air-fluid collection extending from the subphrenic to the anterior subhepatic space was observed. After oral administration of 500 cm³ of diatrizoate meglumine diluted to 3 percent, extraluminal leakage of the water-soluble iodinated contrast medium could be appreciated, the result of a perforated duodenal ulcer. During surgery, the abscess was drained, and extensive adhesiolysis had to be performed to expose the duodenal bulb, where the ulcer was first identified by methylene blue administration and then sutured. While subphrenic abscesses are well-known complications of perforated gastric or duodenal ulcers, they have nowadays become rare thanks to advances in both diagnostic and therapeutic strategies for peptic ulcer disease. However, when peptic ulcer disease is not clinically suspected, the contribution of imaging may be substantial.

  9. Perforated duodenal ulcer presenting with a subphrenic abscess revealed by plain abdominal X-ray films and confirmed by multi-detector computed tomography: a case report

    PubMed Central

    2013-01-01

    Introduction Peptic ulcer disease is still the major cause of gastrointestinal perforation, despite major improvements in both diagnostic and therapeutic strategies. While the diagnosis of a perforated ulcer is straightforward in typical cases, its clinical onset may be subtle because of comorbidities and/or concurrent therapies. Case presentation We report the case of a 53-year-old Caucasian man with a history of chronic myeloid leukemia on maintenance therapy with imatinib (100 mg/day) who was found to have a subphrenic abscess resulting from a perforated duodenal ulcer that had been clinically overlooked. Our patient was febrile (38.5°C) with abdominal tenderness and hypoactive bowel sounds. On the plain abdominal X-ray films, a right subphrenic abscess could be seen. On contrast-enhanced multi-detector computed tomography, a huge air-fluid collection extending from the subphrenic to the anterior subhepatic space was observed. After oral administration of 500 cm³ of diatrizoate meglumine diluted to 3 percent, extraluminal leakage of the water-soluble iodinated contrast medium could be appreciated, the result of a perforated duodenal ulcer. During surgery, the abscess was drained, and extensive adhesiolysis had to be performed to expose the duodenal bulb, where the ulcer was first identified by methylene blue administration and then sutured. Conclusions While subphrenic abscesses are well-known complications of perforated gastric or duodenal ulcers, they have nowadays become rare thanks to advances in both diagnostic and therapeutic strategies for peptic ulcer disease. However, when peptic ulcer disease is not clinically suspected, the contribution of imaging may be substantial. PMID:24215711

  10. Validation of Multi-Detector Computed Tomography as a Non-Invasive Method for Measuring Ovarian Volume in Macaques (Macaca fascicularis)

    PubMed Central

    Jones, Jeryl C.; Appt, Susan E.; Werre, Stephen R.; Tan, Joshua C.; Kaplan, Jay R.

    2013-01-01

    The purpose of this study was to validate low-radiation-dose, contrast-enhanced, multi-detector computed tomography (MDCT) as a non-invasive method for measuring ovarian volume in macaques. Computed tomography scans of four known-volume phantoms and nine mature female cynomolgus macaques were acquired using a previously described low-radiation-dose scanning protocol, intravenous contrast enhancement, and a 32-slice MDCT scanner. Immediately following MDCT, the ovaries were surgically removed and the ovarian weights were measured; the ovarian volumes were determined using water displacement. A veterinary radiologist who was unaware of the actual volumes measured ovarian CT volumes three times, using a laptop computer, a pen display tablet, hand-traced regions of interest, and free image analysis software. A statistician selected and performed all tests comparing the actual and CT data. Ovaries were successfully located in all MDCT scans. The iliac arteries and veins, uterus, fallopian tubes, cervix, ureters, urinary bladder, rectum, and colon were also consistently visualized, and large antral follicles were detected in six ovaries. The phantom mean CT volume was 0.702 ± 0.504 cc (mean ± SD) and the mean actual volume was 0.743 ± 0.526 cc; the ovary mean CT volume was 0.258 ± 0.159 cc and the mean water-displacement volume was 0.257 ± 0.145 cc. For phantoms, the mean coefficient of variation for CT volumes was 2.5%; for ovaries, the least-squares mean coefficient of variation was 5.4%. The ovarian CT volume was significantly associated with the actual ovarian volume (ICC coefficient 0.79, regression coefficient 0.5, P = 0.0006) and the actual ovarian weight (ICC coefficient 0.62, regression coefficient 0.6, P = 0.015). There was no association between CT volume accuracy and mean ovarian CT density (degree of intravenous contrast enhancement), and there was no proportional or fixed bias in the CT volume measurements. Findings from this study indicate that MDCT is a valid non-invasive method for measuring ovarian volume in macaques.

  11. Validation of multi-detector computed tomography as a non-invasive method for measuring ovarian volume in macaques (Macaca fascicularis).

    PubMed

    Jones, Jeryl C; Appt, Susan E; Werre, Stephen R; Tan, Joshua C; Kaplan, Jay R

    2010-06-01

    The purpose of this study was to validate low-radiation-dose, contrast-enhanced, multi-detector computed tomography (MDCT) as a non-invasive method for measuring ovarian volume in macaques. Computed tomography scans of four known-volume phantoms and nine mature female cynomolgus macaques were acquired using a previously described low-radiation-dose scanning protocol, intravenous contrast enhancement, and a 32-slice MDCT scanner. Immediately following MDCT, the ovaries were surgically removed and the ovarian weights were measured; the ovarian volumes were determined using water displacement. A veterinary radiologist who was unaware of the actual volumes measured ovarian CT volumes three times, using a laptop computer, a pen display tablet, hand-traced regions of interest, and free image analysis software. A statistician selected and performed all tests comparing the actual and CT data. Ovaries were successfully located in all MDCT scans. The iliac arteries and veins, uterus, fallopian tubes, cervix, ureters, urinary bladder, rectum, and colon were also consistently visualized, and large antral follicles were detected in six ovaries. The phantom mean CT volume was 0.702 ± 0.504 cc (mean ± SD) and the mean actual volume was 0.743 ± 0.526 cc; the ovary mean CT volume was 0.258 ± 0.159 cc and the mean water-displacement volume was 0.257 ± 0.145 cc. For phantoms, the mean coefficient of variation for CT volumes was 2.5%; for ovaries, the least-squares mean coefficient of variation was 5.4%. The ovarian CT volume was significantly associated with the actual ovarian volume (ICC coefficient 0.79, regression coefficient 0.5, P = 0.0006) and the actual ovarian weight (ICC coefficient 0.62, regression coefficient 0.6, P = 0.015). There was no association between CT volume accuracy and mean ovarian CT density (degree of intravenous contrast enhancement), and there was no proportional or fixed bias in the CT volume measurements. Findings from this study indicate that MDCT is a valid non-invasive method for measuring ovarian volume in macaques.
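    Volumes from hand-traced regions of interest are typically obtained by summing traced area times slice thickness over contiguous slices. A minimal sketch under that slice-summation assumption (the abstract does not name the software or its exact volume rule):

        import numpy as np

        def roi_volume_cc(roi_areas_mm2, slice_thickness_mm):
            """Volume from hand-traced ROI areas: sum(area x thickness)."""
            volume_mm3 = np.sum(np.asarray(roi_areas_mm2) * slice_thickness_mm)
            return float(volume_mm3 / 1000.0)  # 1 cc = 1000 mm^3

        # Illustrative traced areas (mm^2) on five contiguous 1.0-mm slices.
        print(f"{roi_volume_cc([42.0, 55.5, 61.2, 50.3, 38.9], 1.0):.3f} cc")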

  12. Magnetic resonance imaging and multi-detector computed tomography assessment of extracellular compartment in ischemic and non-ischemic myocardial pathologies

    PubMed Central

    Saeed, Maythem; Hetts, Steven W; Jablonowski, Robert; Wilson, Mark W

    2014-01-01

    Myocardial pathologies are major causes of morbidity and mortality worldwide. Early detection of loss of cellular integrity and expansion of the extracellular volume (ECV) in myocardium is critical to initiate effective treatment. The three compartments in healthy myocardium are: intravascular (approximately 10% of tissue volume), interstitium (approximately 15%), and intracellular (approximately 75%). Myocardial cells, fibroblasts, and vascular endothelial/smooth muscle cells make up the intracellular compartment, and the main proteins in the interstitium are type I/III collagens. Microscopic studies have shown that expansion of the ECV is an important feature of diffuse physiologic fibrosis (e.g., aging and obesity) and pathologic fibrosis [heart failure, aortic valve disease, hypertrophic cardiomyopathy, myocarditis, dilated cardiomyopathy, amyloidosis, congenital heart disease, aortic stenosis, restrictive cardiomyopathy (hypereosinophilic and idiopathic types), arrhythmogenic right ventricular dysplasia, and hypertension]. This review addresses recent advances in the measurement of ECV in ischemic and non-ischemic myocardial pathologies. Magnetic resonance imaging (MRI) has the ability to characterize tissue proton relaxation times (T1, T2, and T2*), which reflect the physical and chemical environments of water protons in myocardium. Delayed contrast-enhanced MRI (DE-MRI) and multi-detector computed tomography (DE-MDCT) demonstrate the hyper-enhanced infarct, the hypo-enhanced microvascular obstruction zone, and the moderately enhanced peri-infarct zone, but are limited for visualizing diffuse fibrosis and patchy microinfarction despite the increase in ECV. ECV can be measured by equilibrium contrast-enhanced MRI/MDCT and MRI longitudinal relaxation time (T1) mapping. Equilibrium contrast-enhanced MRI/MDCT and MRI T1 mapping are currently used, although at a lower scale, as an alternative to invasive sub-endomyocardial biopsies, eliminating the need for anesthesia and coronary catheterization.
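    The T1-mapping route to ECV mentioned above rests on the standard relationship ECV = (1 - hematocrit) x (deltaR1 of myocardium / deltaR1 of blood), with R1 = 1/T1 measured before and after contrast. A minimal sketch with illustrative values:

        def ecv_fraction(t1_myo_pre, t1_myo_post,
                         t1_blood_pre, t1_blood_post, hematocrit):
            """ECV = (1 - Hct) * (dR1_myocardium / dR1_blood), R1 = 1/T1 (ms)."""
            d_r1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
            d_r1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
            return (1.0 - hematocrit) * d_r1_myo / d_r1_blood

        # Illustrative native/post-contrast T1 values (ms) and hematocrit.
        ecv = ecv_fraction(t1_myo_pre=1200, t1_myo_post=550,
                           t1_blood_pre=1800, t1_blood_post=350, hematocrit=0.42)
        print(f"ECV = {ecv:.1%}")  # healthy myocardium is roughly 25%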

  13. Relationship between routine multi-detector cardiac computed tomographic angiography prior to reoperative cardiac surgery, length of stay, and hospital charges.

    PubMed

    Goldstein, Matthew A; Roy, Sion K; Hebsur, Shinivas; Maluenda, Gabriel; Weissman, Gaby; Weigold, Guy; Landsman, Marc J; Hill, Peter C; Pita, Francisco; Corso, Paul J; Boyce, Steven W; Pichard, Augusto D; Waksman, Ron; Taylor, Allen J

    2013-03-01

    While multi-detector cardiac computed tomography angiography (MDCCTA) prior to reoperative cardiac surgery (RCS) has been associated with improved clinical outcomes, its impact on hospital charges and length of stay remains unclear. We studied 364 patients undergoing RCS at Washington Hospital Center between 2004 and 2008, including 137 clinically referred for MDCCTA. Baseline demographics, procedural data, and perioperative outcomes were recorded at the time of the procedure. The primary clinical endpoint was the composite of perioperative death, myocardial infarction (MI), stroke, and hemorrhage-related reoperation. Secondary clinical endpoints included surgical procedural variables and the perioperative volume of bleeding and transfusion. Length of stay was determined using the hospital's electronic medical record, and cost data were extracted from the hospital's billing summary. Analysis was performed on individual categories of care as well as on total hospital charges, and data were compared between subjects with and without MDCCTA after adjustment for the Society of Thoracic Surgeons score. Baseline characteristics were similar between the two groups. MDCCTA was associated with shorter procedural times, shorter intensive care unit stays, fewer blood transfusions, and less frequent perioperative MI. There was additionally a trend towards a lower incidence of the primary endpoint (17.5% vs. 24.2%, p = 0.13), primarily due to a lower incidence of perioperative MI (0 vs. 5.7%, p = 0.002). MDCCTA was also associated with lower median recovery room charges [$1,325 (1,250-3,302) vs. $3,217 (1,325-5,353), p < 0.001] and nursing charges [$6,335 (3,623-10,478) vs. $6,916 (3,915-14,499), p = 0.03], although operating room charges were higher [$24,100 (22,300-29,700) vs. $23,500 (19,900-27,700), p < 0.05]. Median total charges [$127,000 (95,000-188,000) vs. $123,000 (86,800-226,000), p = 0.77] and length of stay [9 days (6-19) vs. 11 days (7-19), p = 0.21] were similar. Means analysis

  14. Optimal scan timing for artery-vein separation at whole-brain CT angiography using a 320-row MDCT volume scanner.

    PubMed

    Shirasaka, Takashi; Hiwatashi, Akio; Yamashita, Koji; Kondo, Masatoshi; Hamasaki, Hiroshi; Shimomiya, Yamato; Nakamura, Yasuhiko; Funama, Yoshinori; Honda, Hiroshi

    2017-02-01

    A 320-row multidetector CT (MDCT) scanner is expected to provide good artery-vein separation owing to its temporal resolution; however, a shortened scan duration may lead to insufficient vascular enhancement. We assessed the optimal scan timing for artery-vein separation at whole-brain CT angiography (CTA) when bolus tracking was used on 320-row MDCT. We analyzed 60 patients who underwent whole-brain four-dimensional CTA. The difference in CT attenuation between the internal carotid artery (ICA) and the superior sagittal sinus (Datt) was calculated for each phase. Using a visual evaluation score for the depiction of arteries and veins, we calculated the difference between the mean score for the intracranial arteries and the mean score for the veins (Dscore), and assessed the time at which the maximum Datt and Dscore were simultaneously observed. The maximum Datt was observed at 6.0 s and 8.0 s in the arterial-dominant phase and at 16.0 s and 18.0 s in the venous-dominant phase after the contrast medium arrival time at the ICA (Taa). The maximum Dscore was observed at 6.0 s and 8.0 s in the arterial-dominant phase and at 16.0 s in the venous-dominant phase after the Taa. There were no statistically significant differences in Datt (p = 0.375) or Dscore (p = 0.139) between these scan timings. The optimal scan timing for artery-vein separation at whole-brain CTA was therefore 6.0 s or 8.0 s for the arteries and 16.0 s for the veins after the Taa. Advances in knowledge: Optimal scan timing allowed us to visualize the intracranial arteries or veins with minimal superimposition.
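    Operationally, picking the two phases amounts to maximizing Datt (ICA minus SSS) for the arterial phase and minimizing it for the venous phase across the sampled time points. A toy sketch with invented time-attenuation curves (the curves are illustrative, not patient data):

        import numpy as np

        # Illustrative attenuation samples (HU) at 2-s phases after Taa.
        times = np.arange(0.0, 22.0, 2.0)
        ica = np.array([80, 180, 320, 400, 380, 300, 220, 170, 140, 120, 110])
        sss = np.array([60, 70, 90, 130, 200, 280, 340, 380, 400, 390, 370])

        d_att = ica - sss  # Datt: arterial phase maximizes it, venous minimizes it
        print(f"arterial-dominant phase: {times[np.argmax(d_att)]:.1f} s after Taa")
        print(f"venous-dominant phase: {times[np.argmin(d_att)]:.1f} s after Taa")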

  15. CT venography after knee replacement surgery: comparison of dual-energy CT-based monochromatic imaging and single-energy metal artifact reduction techniques on a 320-row CT scanner.

    PubMed

    Kidoh, Masafumi; Utsunomiya, Daisuke; Oda, Seitaro; Nakaura, Takeshi; Funama, Yoshinori; Yuki, Hideaki; Hirata, Kenichiro; Hatemura, Masahiro; Namimoto, Tomohiro; Yamashita, Yasuyuki

    2017-02-01

    An optimal metal artifact reduction (MAR) technique is needed for reliable and accurate image-based diagnosis. Using a 320-row scanner, we compared the dual-energy computed tomography (CT)-based monochromatic technique and the single-energy metal artifact reduction (SEMAR) technique for CT venography (CTV) to identify the better imaging method for diagnosing deep vein thrombosis (DVT) in patients who had undergone knee replacement surgery. Twenty-three consecutive patients with suspected DVT after unilateral knee replacement surgery underwent dual-energy CT (135/80 kVp). Monochromatic images of 35-135 keV were generated, and the monochromatic image with the best signal-to-noise ratio (SNR) of the popliteal vein near the metal prosthesis was selected. The projection data of 80 kVp were reconstructed using the SEMAR algorithm. The mean SNR on the SEMAR images and the best SNR on the monochromatic images were compared. Two radiologists evaluated visualization of the metal artifacts on a four-point scale where 1 = extensive artifacts, 2 = strong artifacts, 3 = mild artifacts, and 4 = minimal artifacts. The mean SNR was significantly higher on the SEMAR images than on the monochromatic images (12.8 ± 4.7 versus 7.7 ± 5.1, P < 0.01), and the visual scores were significantly higher for SEMAR than for the monochromatic images (2.6 ± 0.8 versus 1.3 ± 0.4, P < 0.01). For CTV after knee replacement surgery, the SEMAR technique is superior to the monochromatic imaging technique.

  16. CT venography after knee replacement surgery: comparison of dual-energy CT-based monochromatic imaging and single-energy metal artifact reduction techniques on a 320-row CT scanner

    PubMed Central

    Utsunomiya, Daisuke; Oda, Seitaro; Nakaura, Takeshi; Funama, Yoshinori; Yuki, Hideaki; Hirata, Kenichiro; Hatemura, Masahiro; Namimoto, Tomohiro; Yamashita, Yasuyuki

    2017-01-01

    Background An optimal metal artifact reduction (MAR) technique is needed for reliable and accurate image-based diagnosis. Purpose Using a 320-row scanner, we compared the dual-energy computed tomography (CT)-based monochromatic technique and the single-energy metal artifact reduction (SEMAR) technique for CT venography (CTV) to identify the better imaging method for diagnosing deep vein thrombosis (DVT) in patients who had undergone knee replacement surgery. Material and Methods Twenty-three consecutive patients with suspected DVT after unilateral knee replacement surgery underwent dual-energy CT (135/80 kVp). Monochromatic images of 35–135 keV were generated, and the monochromatic image with the best signal-to-noise ratio (SNR) of the popliteal vein near the metal prosthesis was selected. The projection data of 80 kVp were reconstructed using the SEMAR algorithm. The mean SNR on the SEMAR images and the best SNR on the monochromatic images were compared. Two radiologists evaluated visualization of the metal artifacts on a four-point scale where 1 = extensive artifacts, 2 = strong artifacts, 3 = mild artifacts, and 4 = minimal artifacts. Results The mean SNR was significantly higher on the SEMAR images than on the monochromatic images (12.8 ± 4.7 versus 7.7 ± 5.1, P < 0.01), and the visual scores were significantly higher for SEMAR than for the monochromatic images (2.6 ± 0.8 versus 1.3 ± 0.4, P < 0.01). Conclusion For CTV after knee replacement surgery, the SEMAR technique is superior to the monochromatic imaging technique. PMID:28321330

  17. Evaluation of stenosis severity of coronary calcified lesions using transluminal attenuation gradient: clinical application of 320-row volume CT.

    PubMed

    Yang, Fengfeng; Dong, Jie; Wang, Wei; Wang, Xiuting; Fu, Xiaojiao; Kumar, Nanda C; Zhang, Tong

    2017-08-01

    The aim of this study was to evaluate the accuracy of the transluminal attenuation gradient (TAG) in assessing the degree of luminal stenosis of difficult, calcified lesions on coronary computed tomography angiography (CCTA). A total of 130 consecutive patients underwent CCTA and coronary angiography (CAG). The average transluminal Hounsfield units (HU) of the regions of interest were measured consecutively at 5-mm intervals from the ostium to the distal level, followed by calculation of the TAG. The diagnostic performance of CCTA, TAG, and CCTA+TAG for the degree of stenosis, and their reclassification of the stenosis degree, were analyzed, especially for calcified lesions. Compared with CAG, the TAG on CCTA was consistent with the greatest degree of stenosis of each vessel. TAG improved the accuracy of CCTA in the diagnosis of calcified lesions (P < 0.0001). When the threshold was ≤ -6.9 HU/10 mm, the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of CCTA+TAG in the diagnosis of coronary calcified lesions were 90.26%, 95.45%, 98.58%, and 73.68%, respectively. TAG for calcified lesions had moderate sensitivity (86.61%; 95% CI: 81.8-90.5%) and high specificity (91.20%; 95% CI: 84.8-95.5%). In addition, TAG helped to improve the reclassification of CCTA for the degree of coronary stenosis, especially for calcified lesions (NRI = 0.127, P = 0.045). TAG can thus improve the diagnostic performance of CCTA for the degree of stenosis and may also improve the reclassification of the stenosis degree of calcified lesions.
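    TAG itself is simply the slope of a linear fit of luminal attenuation against distance along the vessel, reported per 10 mm. A minimal sketch of that calculation (the HU samples are illustrative):

        import numpy as np

        def tag_hu_per_10mm(hu_samples, interval_mm=5.0):
            """TAG: slope of the linear fit of luminal HU vs. distance from
            the ostium (sampled every interval_mm), reported per 10 mm."""
            distance = np.arange(len(hu_samples)) * interval_mm
            slope_per_mm = np.polyfit(distance, hu_samples, 1)[0]
            return 10.0 * slope_per_mm

        # Illustrative luminal HU every 5 mm from the ostium distally.
        hu = [420, 416, 413, 409, 405, 402, 398]
        print(f"TAG = {tag_hu_per_10mm(hu):.1f} HU/10 mm")  # near the -6.9 cutoff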

  18. Image quality and radiation reduction of 320-row area detector CT coronary angiography with optimal tube voltage selection and an automatic exposure control system: comparison with body mass index-adapted protocol.

    PubMed

    Lim, Jiyeon; Park, Eun-Ah; Lee, Whal; Shim, Hackjoon; Chung, Jin Wook

    2015-06-01

    The aim was to assess the image quality and radiation exposure of 320-row area detector computed tomography (320-ADCT) coronary angiography with optimal tube voltage selection under the guidance of an automatic exposure control system, in comparison with a body mass index (BMI)-adapted protocol. Twenty-two patients (study group) underwent 320-ADCT coronary angiography using an automatic exposure control system with a target standard deviation value of 33 as the image quality index and the lowest possible tube voltage. For comparison, a sex- and BMI-matched group (control group, n = 22) scanned with a BMI-adapted protocol was established. Images of both groups were reconstructed with an iterative reconstruction algorithm. For objective evaluation of image quality, image noise, vessel density, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) were measured. Two blinded readers then subjectively graded image quality using a four-point scale (1: nondiagnostic to 4: excellent). Radiation exposure was also measured. Although the study group tended to show higher image noise (14.1 ± 3.6 vs. 9.3 ± 2.2 HU, P = 0.111) and higher vessel density (665.5 ± 161 vs. 498 ± 143 HU, P = 0.430) than the control group, the differences were not significant. There was no significant difference between the two groups in SNR (52.5 ± 19.2 vs. 60.6 ± 21.8, P = 0.729), CNR (57.0 ± 19.8 vs. 67.8 ± 23.3, P = 0.531), or subjective image quality scores (3.47 ± 0.55 vs. 3.59 ± 0.56, P = 0.960). However, radiation exposure was significantly reduced, by 42%, in the study group (1.9 ± 0.8 vs. 3.6 ± 0.4 mSv, P = 0.003). Optimal tube voltage selection under the guidance of an automatic exposure control system in 320-ADCT coronary angiography thus allows substantial radiation reduction without significant impairment of image quality compared with a BMI-based protocol.
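    The objective indices quoted above follow standard definitions: noise is the standard deviation of attenuation in a region of interest, SNR is the vessel attenuation divided by noise, and CNR additionally subtracts the background attenuation. A minimal sketch (exact ROI placements vary between studies and are not detailed in the abstract):

        import numpy as np

        def snr_cnr(vessel_roi_hu, background_roi_hu):
            """SNR = vessel mean / noise; CNR = (vessel - background) / noise,
            with noise taken as the SD within the vessel ROI (one common choice)."""
            vessel = np.asarray(vessel_roi_hu, dtype=float)
            background = np.asarray(background_roi_hu, dtype=float)
            noise = vessel.std(ddof=1)
            return vessel.mean() / noise, (vessel.mean() - background.mean()) / noise

        # Illustrative ROI samples (HU) from the aortic root and adjacent tissue.
        snr, cnr = snr_cnr([650, 672, 645, 680, 658], [95, 102, 98, 100, 97])
        print(f"SNR = {snr:.1f}, CNR = {cnr:.1f}")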

  19. Optic Strut and Para-clinoid Region – Assessment by Multi-detector Computed Tomography with Multiplanar and 3 Dimensional Reconstructions

    PubMed Central

    Ravikiran, S.R.; Kumar, Ashvini; Chavadi, Channabasappa; Pulastya, Sanyal

    2015-01-01

    Purpose To evaluate the thickness, location, and orientation of the optic strut and anterior clinoid process, and variations of the paraclinoid region, based solely on multidetector computed tomography (MDCT) images with multiplanar (MPR) and 3-dimensional (3D) reconstructions, in an Indian population. Materials and Methods Ninety-five head and paranasal sinus CT scans were retrospectively evaluated with MPR and 3D reconstructions to assess optic strut thickness, angle, and location, and variations such as pneumatisation, carotico-clinoid foramen, and inter-clinoid osseous ridge. Results Mean optic strut thickness was 3.64 mm (±0.64) and the mean optic strut angle was 42.67° (±6.16). The mean width and length of the anterior clinoid process were 10.65 mm (±0.79) and 11.20 mm (±0.95), respectively. Optic strut attachment to the sphenoid body was predominantly sulcal, as in 52 cases (54.74%), and the strut was most frequently attached to the anterior two-fifths of the anterior clinoid process, seen in 93 sides (48.95%). Pneumatisation of the optic strut occurred in 23 sides. A carotico-clinoid foramen was observed in 42 cases (22.11%): a complete foramen in 10 cases (5.26%), an incomplete foramen in 24 cases (12.63%), and the contact type in 8 cases (4.21%). An inter-clinoid osseous bridge was seen unilaterally in 4 cases. Conclusion The study assesses the morphometric features and anatomical variations of the paraclinoid region using MDCT 3D and multiplanar reconstructions in an Indian population. PMID:26557589

  20. Association of atherosclerosis in the descending thoracic aorta with coronary artery disease on multi detector row computed tomography coronary angiography in patients with suspected coronary artery disease.

    PubMed

    Roos, Cornelis J; Witkowska, Agnieszka J; de Graaf, Michiel A; Veltman, Caroline E; Delgado, Victoria; de Grooth, Greetje J; Jukema, J Wouter; Bax, Jeroen J; Scholte, Arthur J

    2013-12-01

    The association between atherosclerosis in the descending thoracic aorta (DTA) visualized on computed tomography coronary angiography (CTA) and coronary artery disease (CAD) has not been extensively explored. Therefore, a comprehensive analysis of DTA atherosclerosis on CTA was performed, and the association of DTA atherosclerosis with CAD was evaluated in patients with suspected CAD. A total of 344 patients (54 ± 12 years, 54% men) with suspected CAD underwent CTA. The CTA studies were classified by CAD severity as: no signs of atherosclerosis or minor wall irregularities (<30%), non-significant CAD (30-50%), or significant CAD (≥50% stenosis). The DTA was divided into segments according to the posterior intercostal arteries. For each segment, the presence of atherosclerotic plaque (defined as wall thickness ≥2 mm) was determined and the maximal wall thickness was measured. Plaque composition was scored as non-calcified or mixed, and the percentage of DTA segments with atherosclerosis was calculated. Significant CAD was present in 152 (44%) patients, and 278 (81%) had DTA atherosclerotic plaque. The DTA maximal wall thickness and the percentage of DTA segments with atherosclerosis were 2.7 ± 1 mm and 49 ± 36%, respectively. The presence, severity, and extent of DTA atherosclerosis increased significantly with increasing CAD severity. Multivariate logistic regression analysis corrected for age and other risk factors demonstrated independent associations of DTA plaque (OR 6.56, 95% CI 1.78-24.19, p = 0.005) and maximal DTA wall thickness (OR 2.00, 95% CI 1.28-3.12, p = 0.002) with significant CAD. The presence and severity of DTA atherosclerosis were thus independently related to significant CAD on CTA in patients with suspected CAD.

  1. The value of dual-source multi detector-row computed tomography in determining pulmonary blood supply in patients with pulmonary atresia with ventricular septal defect.

    PubMed

    Chaosuwannakit, Narumol; Makarawate, Pattarapong

    2017-07-13

    Primary evaluation of patients with pulmonary atresia with ventricular septal defect (PA-VSD) traditionally relies upon echocardiography and conventional cardiac angiography (CCA), and CCA is considered the gold standard for delineation of anatomy in children with PA-VSD. Data comparing CCA and dual-source multidetector-row computed tomography angiography (MDCT) in patients with PA-VSD are limited. The objective of this study was to test the hypothesis that MDCT is equivalent to CCA for anatomic delineation in these patients. Twenty-eight patients with PA-VSD underwent CCA and MDCT in close temporal proximity without interval therapy, and a retrospective review of these 28 patients was performed. All MDCT data on pulmonary artery morphology, MAPCAs, and type of blood supply (dual versus single supply) were evaluated by blinded experts, and the results were compared with CCA. Twenty-eight patients had right and left pulmonary arteries of adequate size (21 confluent and 7 non-confluent). Seven patients had complete absence of a native pulmonary artery and three patients had stenosis of the distal branches of the pulmonary arteries; all had major aortopulmonary collateral arteries (MAPCAs) arising from the descending thoracic aorta and/or the subclavian arteries. The sensitivity, specificity, and positive and negative predictive values of MDCT for detecting confluence of the pulmonary arteries, absence of a native pulmonary artery, and stenosis of the pulmonary arteries were all 100%, and the accuracy of detecting MAPCAs was excellent. These results suggest that MDCT and CCA are equivalent in their ability to delineate pulmonary artery anatomy and MAPCAs. Dual-source MDCT provides high diagnostic accuracy in the evaluation of pulmonary blood supply in patients with PA-VSD and allows precise characterization of the condition of the pulmonary arteries and MAPCAs, which is of paramount importance in managing these patients.

  2. Health economic evaluation of Gd-EOB-DTPA MRI vs ECCM-MRI and multi-detector computed tomography in patients with suspected hepatocellular carcinoma in Thailand and South Korea.

    PubMed

    Lee, Jeong-Min; Kim, Myeong-Jin; Phongkitkarun, Sith; Sobhonslidsuk, Abhasnee; Holtorf, Anke-Peggy; Rinde, Harald; Bergmann, Karsten

    2016-08-01

    The effectiveness of treatment decisions and economic outcomes of using gadolinium ethoxybenzyl diethylenetriamine pentaacetic acid-enhanced magnetic resonance imaging (Gd-EOB-DTPA-MRI) were compared with those of extracellular contrast media-enhanced MRI (ECCM-MRI) and multi-detector computed tomography (MDCT) as initial procedures in patients with suspected hepatocellular carcinoma (HCC) in South Korea and Thailand. A decision-tree model simulated the clinical pathway for patients with suspected HCC from the first imaging procedure to a confirmed treatment decision. Input data (probabilities and resource consumption) were estimated and validated by clinical experts. Costs for the diagnostic alternatives and related treatment options were derived from published sources, taking into account both the payer's and the hospital's perspectives. All experts from Korea and Thailand agreed that Gd-EOB-DTPA-MRI yields the highest diagnostic certainty and minimizes the need for additional confirmatory diagnostic procedures in HCC. In Korea, from the payer's perspective, the total cost to reach a confirmed treatment decision was $3087 per patient using Gd-EOB-DTPA-MRI (vs $3205 for MDCT and $3403 for ECCM-MRI). From the hospital's perspective, Gd-EOB-DTPA-MRI incurred the lowest cost ($2289 per patient vs $2320 and $2528, respectively). In Thailand, Gd-EOB-DTPA-MRI was the least costly alternative for the payer ($702 per patient vs $931 for MDCT and $873 for ECCM-MRI). From the hospital's perspective, costs were $1106, $1178, and $1087 per patient for Gd-EOB-DTPA-MRI, MDCT, and ECCM-MRI, respectively. Gd-EOB-DTPA-MRI as an initial imaging procedure in patients with suspected HCC provides better diagnostic certainty and relevant statutory health insurance cost savings in Thailand and Korea, compared with ECCM-MRI and MDCT.

  3. Accuracy of multi-detector computed tomography (MDCT) in staging of renal cell carcinoma (RCC): analysis of risk factors for mis-staging and its impact on surgical intervention.

    PubMed

    El-Hefnawy, Ahmed S; Mosbah, Ahmed; El-Diasty, Tarek; Hassan, Mohammed; Shaaban, Atallah A

    2013-08-01

    To assess the accuracy of multi-detector computed tomography (MDCT) in the preoperative staging of renal cell carcinoma (RCC), to identify possible risk factors for mis-staging, and to evaluate the impact of radiological mis-staging on the surgical decision and operative procedure. Data files of 693 patients who underwent either radical or partial nephrectomy after preoperative staging by MDCT between January 2003 and December 2010 were retrospectively reviewed. Radiological data were compared with surgical and histopathological findings, with patients classified according to the 2009 TNM staging classification. Diagnostic accuracy per stage and its impact on surgical intervention were evaluated. Overall accuracy was 64.5%; over-staging was detected in 29.5% of cases and under-staging in 6%. Sensitivity and specificity were highest in stage T3b (85% and 99.5%, respectively), while T4 showed the lowest sensitivity and PPV (57% and 45%). The degree of agreement with pathological staging was substantial in T1 (κ = 0.7), fair in T2 (κ = 0.4), almost perfect in T3b (κ = 0.81), and slight for the other stages (κ < 0.1). On multivariate analysis, conventional RCC and tumor size >7 cm were significant risk factors for mis-staging (RR: 1.6, 95% CI: 1.1-2.3, P < 0.004 and RR: 2.4, 95% CI: 1.7-3.5, P < 0.001, respectively). Mis-staging had no negative impact on the surgical decision. MDCT is an accepted tool for renal tumor staging, and tumor mis-staging after MDCT is of little clinical importance; large tumor size (>7 cm) and conventional RCC are risk factors for mis-staging.

  4. Gd-EOB-DTPA-enhanced 3.0-Tesla MRI findings for the preoperative detection of focal liver lesions: Comparison with iodine-enhanced multi-detector computed tomography

    NASA Astrophysics Data System (ADS)

    Park, Hyong-Hu; Goo, Eun-Hoe; Im, In-Chul; Lee, Jae-Seung; Kim, Moon-Jib; Kwak, Byung-Joon; Chung, Woon-Kwan; Dong, Kyung-Rae

    2012-12-01

    The safety of gadolinium-ethoxybenzyl-diethylenetriamine-pentaacetic-acid (Gd-EOB-DTPA) has been confirmed, but more study is needed to assess the diagnostic accuracy of Gd-EOB-DTPA-enhanced magnetic resonance imaging (MRI) in patients with hepatocellular carcinoma (HCC) for whom surgical treatment is considered, or with metastatic hepatoma. Research is also needed to examine the rate of detection of hepatic lesions compared with multi-detector computed tomography (MDCT), which is used most frequently to localize and characterize HCC. Gd-EOB-DTPA-enhanced MRI and iodine-enhanced MDCT imaging were compared for the preoperative detection of focal liver lesions, and the clinical usefulness of each method was examined. The current study enrolled 79 patients with focal liver lesions who preoperatively underwent MRI and MDCT; in all patients, less than one month elapsed between the two diagnostic modalities. Imaging data were acquired before and after contrast enhancement with both methods. To evaluate the images, we analyzed the signal-to-noise ratio (SNR) and the contrast-to-noise ratio (CNR) in the lesions and the liver parenchyma. To compare the sensitivity of the two methods, we performed a quantitative analysis of the percentage signal intensity of the liver (PSIL) on a high-resolution picture archiving and communication system (PACS) monitor (paired-samples t-test, p < 0.05). Enhancement was evaluated based on the consensus of four observers, and the enhancement pattern and morphological features during the arterial and delayed phases were correlated between the Gd-EOB-DTPA-enhanced MRI findings and the iodine-enhanced MDCT findings by using an adjusted χ² test. The SNR, CNR, and PSIL values all indicated a greater detection rate for Gd-EOB-DTPA-enhanced MRI than for iodine-enhanced MDCT. Hepatocyte-selective uptake was observed 20 minutes after injection in focal nodular hyperplasia (FNH, 9/9), adenoma (9/10), and highly differentiated HCC (grade G1, 27/30). Rim

  5. Can Contrast-Enhanced Multi-Detector Computed Tomography Replace Transesophageal Echocardiography for the Detection of Thrombogenic Milieu and Thrombi in the Left Atrial Appendage: A Prospective Study with 124 Patients.

    PubMed

    Homsi, R; Nath, B; Luetkens, J A; Schwab, J O; Schild, H H; Naehle, C P

    2016-01-01

    To assess the diagnostic value of contrast-enhanced multi-detector computed tomography (MD-CT) for identifying patients with left atrial appendage (LAA) thrombus or circulatory stasis. 124 patients with a history of atrial fibrillation and/or cerebral ischemia (83 men, mean age 58.6 ± 12.4 years) and with a clinical indication for MD-CT of the heart and for transesophageal echocardiography (TEE) were included in the study. LAA thrombus or thrombogenic milieu was visually identified on TEE and MD-CT. In addition, MD-CT was analyzed quantitatively by measuring the Hounsfield units (HU) of the left atrium (LA), the LAA, and the ascending aorta (AA), and calculating the HU ratios LAA/AA (HU [LAA/AA]) and LAA/LA (HU [LAA/LA]). Sensitivity, specificity, negative predictive value (NPV), and positive predictive value (PPV) were calculated. The prevalence of thrombus or thrombogenic milieu as assessed by TEE was 21.8 %. The HU ratios were lower in patients with thrombus or thrombogenic milieu (HU [LAA/AA]: 0.590 ± 0.248 vs. 0.909 ± 0.141; p < 0.001, and HU [LAA/LA]: 0.689 ± 0.366 vs. 1.082 ± 0.228; p < 0.001). For the diagnosis of thrombus or a thrombogenic milieu, visual analysis yielded a sensitivity of 81.5 %, a specificity of 96.9 %, a PPV of 87.5 %, and an NPV of 95.2 %. When visual and quantitative analysis were combined, with one positive criterion sufficing, the specificity decreased to 91.8 %, the sensitivity to 77.8 %, the PPV to 72.4 %, and the NPV to 94.9 %. Visual analysis of the LAA in the evaluation of thrombus or thrombogenic milieu yields a high NPV of 95.1 % and may be especially useful to rule out LAA thrombi in patients with contraindications for TEE. Additional calculation of HU ratios did not improve the diagnostic performance of MD-CT. • MD-CT can reliably exclude atrial appendage thrombi/thrombogenic milieu. • MD-CT is an alternative method in patients with contraindications to TEE. • Calculation of

  6. Multi-detector row CT of pancreatic islet cell tumors.

    PubMed

    Horton, Karen M; Hruban, Ralph H; Yeo, Charles; Fishman, Elliot K

    2006-01-01

    Pancreatic islet cell tumors (ICTs) are neuroendocrine neoplasms that produce and secrete hormones to a variable degree. These neoplasms can present a diagnostic challenge, both clinically and radiologically. ICTs can be classified as either syndromic or nonsyndromic on the basis of their clinical manifestations. Multi-detector row computed tomography (CT) plays an important role in the diagnosis and staging of both syndromic and nonsyndromic ICTs. In general, syndromic ICTs are less than 3 cm in size. They are typically hyperenhancing and are usually best seen on CT scans obtained during the arterial phase. Nonsyndromic ICTs tend to be larger than syndromic ICTs at presentation and are more likely to be cystic or necrotic. It is important for the radiologist to be familiar with appropriate CT protocol for the evaluation of patients with suspected pancreatic ICT and to understand the variable CT appearances of these neoplasms. (c) RSNA, 2006.

  7. Early post-operative weight loss after laparoscopic sleeve gastrectomy correlates with the volume of the excised stomach and not with that of the sleeve! Preliminary data from a multi-detector computed tomography-based study.

    PubMed

    Pawanindra, Lal; Vindal, Anubhav; Midha, Manoj; Nagpal, Prashant; Manchanda, Alpana; Chander, Jagdish

    2015-10-01

    Pre- and post-operative stomach volumes may be important determinants of the effectiveness of laparoscopic sleeve gastrectomy (LSG) in causing weight loss, but there are few existing data on pre-operative stomach volume and the volume excised during LSG. This study was designed to evaluate the change in gastric volume after LSG using multi-detector CT (MDCT) and to correlate it with early post-operative weight loss. Twenty consecutive patients with BMI ≥ 40 kg/m² and medical comorbidities underwent LSG between October 2011 and October 2013 and were analysed prospectively. The pre-operative stomach volume was measured by MDCT performed 1-3 days before surgery. LSG was performed in the standard manner using a 36F bougie. The volume of the excised stomach was measured by distending the specimen with saline. MDCT of the upper abdomen was repeated 3 months postoperatively to calculate the gastric sleeve volume, and weight loss and resolution of comorbidities were documented. The mean pre-operative weight was 123.90 kg, and the mean pre-operative stomach volume on MDCT was 1,067 ml; the stomach volume on pre-operative MDCT correlated with pre-operative weight and BMI. The mean volume of the excised stomach was 859 ml when measured by distension of the specimen and 850 ml on MDCT. Three months after surgery, the mean volume of the gastric sleeve on MDCT was 217 ml, and the mean weight of the patients was 101.22 kg. The volume of the excised stomach calculated by MDCT correlated with the weight loss achieved 3 months postoperatively; however, no correlation was seen between the gastric sleeve volume 3 months postoperatively and weight loss during this period. MDCT is a good method for measuring gastric volume before and after LSG. Early post-operative weight loss (3 months) correlates well with the volume of the excised stomach but not with that of the gastric sleeve.

  8. Dedicated multi-detector CT of the esophagus: spectrum of diseases.

    PubMed

    Ba-Ssalamah, Ahmed; Zacherl, Johannes; Noebauer-Huhmann, Iris Melanie; Uffmann, Martin; Matzek, Wolfgang Karl; Pinker, Katja; Herold, Christian; Schima, Wolfgang

    2009-01-01

    Multi-detector computed tomography (CT) offers new opportunities in imaging of the gastrointestinal tract. Its ability to cover a large volume in a very short scan time, in a single breath hold, with thin collimation and isotropic voxels allows imaging of the entire esophagus with high-quality multiplanar reformation and 3D reconstruction. Proper distention of the esophagus and stomach (by oral administration of effervescent granules and water) and optimally timed administration of intravenous contrast material are required to detect and characterize disease. In contrast to endoscopy and double-contrast studies of the upper GI tract, CT provides information about both the esophageal wall and the extramural extent of disease. Preoperative staging of esophageal carcinoma appears to be the main indication for MDCT. In addition, MDCT allows detection of other esophageal malignancies, such as lymphoma, and of benign esophageal tumors, such as leiomyoma. A diagnosis of rupture or fistula of the esophagus can be firmly established using MDCT. Furthermore, miscellaneous esophageal conditions, such as achalasia, esophagitis, diverticula, and varices, are incidental findings and can also be visualized with hydro-multi-detector CT. Multi-detector CT is a valuable tool for the evaluation of esophageal wall disease and serves as an adjunct to endoscopy.

  9. Relationship between noise, dose, and pitch in cardiac multi-detector row CT.

    PubMed

    Primak, Andrew N; McCollough, Cynthia H; Bruesewitz, Michael R; Zhang, Jie; Fletcher, Joel G

    2006-01-01

    In spiral computed tomography (CT), dose is always inversely proportional to pitch. However, the relationship between noise and pitch (and hence noise and dose) depends on the scanner type (single vs multi-detector row) and reconstruction mode (cardiac vs noncardiac). In single detector row spiral CT, noise is independent of pitch. Conversely, in noncardiac multi-detector row CT, noise depends on pitch because the spiral interpolation algorithm makes use of redundant data from different detector rows to decrease noise for pitch values less than 1 (and increase noise for pitch values > 1). However, in cardiac spiral CT, redundant data cannot be used because such data averaging would degrade the temporal resolution. Therefore, the behavior of noise versus pitch returns to the single detector row paradigm, with noise being independent of pitch. Consequently, since faster rotation times require lower pitch values in cardiac multi-detector row CT, dose is increased without a commensurate decrease in noise. Thus, the use of faster rotation times will improve temporal resolution, not alter noise, and increase dose. For a particular application, the higher dose resulting from faster rotation speeds should be justified by the clinical benefits of the improved temporal resolution.
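
    The scaling relations described above are easy to sanity-check numerically. A minimal sketch, under the stated assumptions that dose scales as 1/pitch at fixed tube current-time product and that noise scales as 1/sqrt(dose) only when redundant detector-row data are averaged (the noncardiac case); all names are illustrative.

```python
import math

def relative_dose(pitch: float) -> float:
    """Dose relative to pitch = 1 at fixed mAs (dose ~ 1/pitch)."""
    return 1.0 / pitch

def relative_noise(pitch: float, cardiac: bool) -> float:
    """Noise relative to pitch = 1 under the assumptions above."""
    if cardiac:
        # Cardiac reconstruction discards redundant data to preserve
        # temporal resolution, so noise is independent of pitch.
        return 1.0
    # Noncardiac spiral interpolation averages redundant data, so
    # noise tracks 1/sqrt(dose) = sqrt(pitch).
    return math.sqrt(pitch)

# Lowering pitch (as faster rotation requires in cardiac CT) raises
# dose with no cardiac noise benefit:
for pitch in (1.0, 0.5, 0.25):
    print(f"pitch {pitch}: dose x{relative_dose(pitch):.0f}, "
          f"cardiac noise x{relative_noise(pitch, True):.2f}, "
          f"noncardiac noise x{relative_noise(pitch, False):.2f}")
```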

  10. Three-dimensional imaging in the context of minimally invasive and transcatheter cardiovascular interventions using multi-detector computed tomography: from pre-operative planning to intra-operative guidance.

    PubMed

    Schoenhagen, Paul; Numburi, Uma; Halliburton, Sandra S; Aulbach, Peter; von Roden, Martin; Desai, Milind Y; Rodriguez, Leonardo L; Kapadia, Samir R; Tuzcu, E Murat; Lytle, Bruce W

    2010-11-01

    The rapid expansion of less invasive surgical and transcatheter cardiovascular procedures for a wide range of cardiovascular conditions, including coronary, valvular, structural cardiac, and aortic disease, has been paralleled by novel three-dimensional (3-D) approaches to imaging. Three-dimensional imaging allows acquisition of volumetric data sets and subsequent off-line reconstruction along unlimited 2-D planes and 3-D volumes. Pre-procedural 3-D imaging provides a detailed understanding of the operative field for surgical/interventional planning, and integration of imaging modalities during the procedure allows real-time guidance. Because computed tomography routinely acquires 3-D data sets, it has been one of the early imaging modalities applied in the context of surgical and interventional planning. This review describes the continuum of applications from pre-operative planning to procedural integration, based on the emerging experience with computed tomography and rotational angiography, respectively. At the same time, the potential adverse effects of imaging with X-ray-based tomographic or angiographic modalities are discussed. It is emphasized that the role of imaging guidance in this context remains unclear and will need to be evaluated in clinical trials. This is particularly true because data showing improved outcomes, or even non-inferiority, for most of the emerging transcatheter procedures are still lacking.

  11. Fast and precise map-making for massively multi-detector CMB experiments

    NASA Astrophysics Data System (ADS)

    Sutton, D.; Zuntz, J. A.; Ferreira, P. G.; Brown, M. L.; Eriksen, H. K.; Johnson, B. R.; Kusaka, A.; Næss, S. K.; Wehus, I. K.

    2010-09-01

    Future cosmic microwave background (CMB) polarization experiments aim to measure an unprecedentedly small signal: the primordial gravitational-wave component of the polarization field, the B mode. To achieve this, they will analyse huge data sets, involving years of time-ordered data (TOD) from massively multi-detector focal planes. This creates the need for fast and precise methods to complement the maximum-likelihood (ML) approach in analysis pipelines. In this paper, we investigate fast map-making methods as applied to long-duration, massively multi-detector, ground-based experiments, in the context of the search for B modes. We focus on two alternative map-making approaches, destriping and TOD filtering, comparing their performance on simulated multi-detector polarization data. We have written an optimized, parallel destriping code, the DEStriping CARTographer (DESCART), that is generalized for massive focal planes, including the potential effect of cross-correlated TOD 1/f noise. We also determine the scaling of computing time for destriping as applied to a simulated full-season data set for a realistic experiment. We find that destriping can outperform filtering in estimating both the large-scale E- and B-mode angular power spectra. In particular, filtering can produce significant spurious B-mode power via E-B mixing; whilst this can be removed, it contributes to the variance of B-mode bandpower estimates at scales near the primordial B-mode peak. For the experimental configuration we simulate, this affects the possible detection significance for primordial B modes. Destriping is a viable alternative fast method to the full ML approach that does not cause the problems associated with filtering, and it is flexible enough to fit into both ML and Monte Carlo pseudo-Cℓ pipelines.
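
    As a toy illustration of the destriping idea (not the DESCART implementation), the sketch below alternately estimates one offset per data chunk and a binned sky map for a single simulated detector; the scan pattern, noise levels, and chunk length are invented.

```python
import numpy as np

def destripe(tod, pixels, npix, chunk_len, n_iter=20):
    """Alternately solve for per-chunk offsets and the binned map."""
    nchunk = len(tod) // chunk_len
    offsets = np.zeros(nchunk)
    m = np.zeros(npix)
    for _ in range(n_iter):
        # Remove the current stripe estimate, then bin into pixels.
        cleaned = tod - np.repeat(offsets, chunk_len)
        sums = np.bincount(pixels, weights=cleaned, minlength=npix)
        hits = np.bincount(pixels, minlength=npix)
        m = np.where(hits > 0, sums / np.maximum(hits, 1), 0.0)
        # Re-estimate each chunk's offset from the map residual.
        resid = (tod - m[pixels]).reshape(nchunk, chunk_len)
        offsets = resid.mean(axis=1)
    return m, offsets

rng = np.random.default_rng(0)
npix, nsamp, chunk = 64, 6400, 100
pixels = rng.integers(0, npix, nsamp)          # random scan pattern
sky = np.sin(np.linspace(0, 2 * np.pi, npix))  # smooth fake sky
tod = sky[pixels] + np.repeat(rng.normal(0, 5, nsamp // chunk), chunk)
tod += rng.normal(0, 0.1, nsamp)               # white noise
m, _ = destripe(tod, pixels, npix, chunk)
err = m - sky
err -= err.mean()  # the overall monopole is degenerate with the offsets
print(f"map rms error (monopole removed): {np.sqrt((err ** 2).mean()):.3f}")
```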

  12. Acute appendicitis: comparison of low-dose and standard-dose unenhanced multi-detector row CT.

    PubMed

    Keyzer, Caroline; Tack, Denis; de Maertelaer, Viviane; Bohy, Pascale; Gevenois, Pierre Alain; Van Gansbeke, Daniel

    2004-07-01

    To prospectively compare low-dose and standard-dose unenhanced multi-detector row computed tomography (CT) in patients suspected of having acute appendicitis. Ninety-five consecutive patients underwent two unenhanced multi-detector row CT examinations with 4 x 2.5-mm collimation, 120 kVp, and 30 and 100 effective mAs. Two radiologists independently read the images obtained at each dose during two sessions. Readers recorded visualization of the appendix and the presence of gas in its lumen, appendicolith, periappendiceal fat stranding, cecal wall thickening, and abscess or phlegmon; they also measured the diameter of the appendix and proposed a diagnosis (appendicitis or alternative). Data were compared according to dose and reader, with the definite diagnosis established on the basis of surgical findings (n = 37) or clinical follow-up. χ² tests and logistic regression were used, and measurement agreement was assessed with Cohen kappa statistics. Twenty-nine patients had a definite diagnosis of appendicitis. No difference between doses was observed in the frequency of visualization of the appendix (P = .874), in its mean diameter (P = .101-.696, according to readers and sessions), or in the readers' overall diagnosis (P = .788). Sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of each sign did not differ between doses. Fat stranding, appendicolith, and diameter were the most predictive signs, regardless of dose, yielding approximately 90% correct diagnoses. The ability to propose a correct alternative diagnosis was not influenced by dose. Low-dose unenhanced multi-detector row CT has diagnostic performance similar to that of standard-dose unenhanced multi-detector row CT for the diagnosis of acute appendicitis. Copyright RSNA, 2004
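
    Cohen kappa, the agreement statistic used in this study, is straightforward to compute from a square contingency table of two readers' ratings; the table below is invented for illustration.

```python
import numpy as np

def cohens_kappa(table) -> float:
    """Kappa = (p_observed - p_expected) / (1 - p_expected)."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_obs = np.trace(table) / n
    # Chance agreement if the two readers rated independently.
    p_exp = (table.sum(axis=1) @ table.sum(axis=0)) / n ** 2
    return (p_obs - p_exp) / (1.0 - p_exp)

# Two readers rating 95 cases as appendicitis / not appendicitis:
table = [[26, 3],
         [4, 62]]
print(f"kappa = {cohens_kappa(table):.2f}")  # ~0.83, "almost perfect"
```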

  13. Noninvasive imaging of coronary arteries: current and future role of multi-detector row CT.

    PubMed

    Schoenhagen, Paul; Halliburton, Sandra S; Stillman, Arthur E; Kuzmiak, Stacie A; Nissen, Steven E; Tuzcu, E Murat; White, Richard D

    2004-07-01

    While invasive imaging techniques, especially selective conventional coronary angiography, will remain vital to planning and guiding catheter-based and surgical treatment of significantly stenotic coronary lesions, the comprehensive and serial assessment of asymptomatic or minimally symptomatic stages of coronary artery disease (CAD) for preventive purposes will eventually need to rely on noninvasive imaging techniques. Cardiovascular imaging with tomographic modalities, including computed tomography (CT) and magnetic resonance imaging, has great potential for providing valuable information. This review article will describe the current and future role of cardiac CT, and in particular that of multi-detector row CT, for imaging of atherosclerotic and other pathologic changes of the coronary arteries. It will describe how tomographic coronary imaging may eventually supplement traditional angiographic techniques in understanding the patterns of atherosclerotic CAD development.

  14. Toroid cavity/coil NMR multi-detector

    DOEpatents

    Gerald, II, Rex E.; Meadows, Alexander D.; Gregar, Joseph S.; Rathke, Jerome W.

    2007-09-18

    An analytical device for rapid, non-invasive nuclear magnetic resonance (NMR) spectroscopy of multiple samples using a single spectrometer is provided. A modified toroid cavity/coil detector (TCD) and methods for simultaneous acquisition of NMR data from multiple samples, including a protocol for testing NMR multi-detectors, are described. One embodiment includes a plurality of LC resonant circuits with spatially separated toroid coil inductors, each toroid coil inductor enveloping its corresponding sample volume and tuned to resonate at a predefined frequency using a variable capacitor. The toroid coil is formed into a loop, with both ends of the coil brought into coincidence. Another embodiment includes multiple micro Helmholtz coils arranged on a circular perimeter concentric with a central conductor of the toroid cavity.
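
    Each channel's tuning follows the standard LC resonance relation f = 1/(2*pi*sqrt(L*C)), so a variable capacitor shifts the resonant frequency of a fixed toroid inductance; the component values below are arbitrary examples, not taken from the patent.

```python
import math

def resonant_frequency_hz(inductance_h: float, capacitance_f: float) -> float:
    """Resonant frequency of an LC circuit: f = 1 / (2*pi*sqrt(LC))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

def tuning_capacitance_f(inductance_h: float, target_hz: float) -> float:
    """Capacitance that tunes a given inductance to a target frequency."""
    return 1.0 / (inductance_h * (2.0 * math.pi * target_hz) ** 2)

# Example: tune a 100 nH toroid coil to 300 MHz (about the 1H Larmor
# frequency at 7 T):
c = tuning_capacitance_f(100e-9, 300e6)
f = resonant_frequency_hz(100e-9, c)
print(f"C = {c * 1e12:.1f} pF -> f = {f / 1e6:.0f} MHz")
```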

  15. Enhanced security for multi-detector quantum random number generators

    NASA Astrophysics Data System (ADS)

    Marangon, Davide G.; Vallone, Giuseppe; Zanforlin, Ugo; Villoresi, Paolo

    2016-11-01

    Quantum random number generators (QRNG) represent an advanced solution for randomness generation, which is essential in every cryptographic application. In this context, integrated arrays of single-photon detectors have promising applications as QRNGs based on the spatial detection of photons. For the employment of QRNGs in cryptography, it is necessary to have efficient methods to evaluate the so-called quantum min-entropy, which corresponds to the amount of true quantum randomness extractable from the QRNG. Here, we present an efficient method that allows the estimation of the quantum min-entropy for a multi-detector QRNG. In particular, we consider a scenario in which an attacker can control the efficiency of the detectors and knows the emitted number of photons. Finally, we apply the method to a QRNG with 103 detectors.
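
    As a baseline for what is being estimated (the paper additionally bounds the entropy against an attacker who controls detector efficiencies), the classical min-entropy of a detector-click distribution is H_min = -log2(max_i p_i); a quick sketch with invented distributions:

```python
import math

def min_entropy_bits(probabilities) -> float:
    """Min-entropy in bits: -log2 of the most likely outcome."""
    return -math.log2(max(probabilities))

# An ideal uniform 103-detector array gives log2(103) ~ 6.7 bits per
# event; any bias toward one detector reduces the extractable bits.
uniform = [1 / 103] * 103
biased = [0.05] + [0.95 / 102] * 102  # one detector fires 5% of the time
print(f"uniform: {min_entropy_bits(uniform):.2f} bits")
print(f"biased : {min_entropy_bits(biased):.2f} bits")
```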

  16. Recent technologic advances in multi-detector row cardiac CT.

    PubMed

    Halliburton, Sandra Simon

    2009-11-01

    Recent technical advances in multi-detector row CT have resulted in lower radiation dose, improved temporal and spatial resolution, decreased scan time, and improved tissue differentiation. Lower radiation doses have resulted from the use of pre-patient z collimators, the availability of thin-slice axial data acquisition, the increased efficiency of ECG-based tube current modulation, and the implementation of iterative reconstruction algorithms. Faster gantry rotation and the simultaneous use of two x-ray sources have led to improvements in temporal resolution, and gains in spatial resolution have been achieved through application of the flying x-ray focal-spot technique in the z-direction. Shorter scan times have resulted from the design of detector arrays with increasing numbers of detector rows and through the simultaneous use of two x-ray sources to allow higher helical pitch. Some improvement in tissue differentiation has been achieved with dual energy CT. This article discusses these recent technical advances in detail.

  17. Contrast-enhanced multi-detector CT and MR findings of a well-differentiated pancreatic vipoma.

    PubMed

    Camera, Luigi; Severino, Rosa; Faggiano, Antongiulio; Masone, Stefania; Mansueto, Gelsomina; Maurea, Simone; Fonti, Rosa; Salvatore, Marco

    2014-10-28

    Pancreatic vipoma is an extremely rare tumor, accounting for less than 2% of endocrine pancreatic neoplasms, with a reported incidence of 0.1-0.6 per million. While cross-sectional imaging findings are usually not specific, exact localization of the tumor by means of either computed tomography (CT) or magnetic resonance (MR) imaging is pivotal for surgical planning; further characterization of the tumor may only be achieved by somatostatin-receptor scintigraphy (SRS). We report the case of a 70-year-old female with a two-year history of watery diarrhoea who was found to have a solid, inhomogeneously enhancing lesion at the level of the pancreatic tail on gadolinium-enhanced MR (Somatom Trio 3T, Siemens, Germany). The tumor had been prospectively overlooked on a contrast-enhanced multi-detector CT (Aquilion 64, Toshiba, Japan) performed after i.v. bolus injection of only 100 cc of iodinated non-ionic contrast medium because of chronic renal failure (3.4 mg/dL), but it was subsequently confirmed by SRS. The patient first underwent successful symptomatic treatment with somatostatin analogues and was then submitted to distal pancreatectomy with splenectomy to remove an encapsulated whitish tumor, which proved to be a well-differentiated vipoma on histological and immunohistochemical analysis.

  18. Follow-up of multicentric HCC according to the mRECIST criteria: role of 320-Row CT with semi-automatic 3D analysis software for evaluating the response to systemic therapy

    PubMed Central

    TELEGRAFO, M.; DILORENZO, G.; DI GIOVANNI, G.; CORNACCHIA, I.; STABILE IANORA, A.A.; ANGELELLI, G.; MOSCHETTA, M.

    2016-01-01

    Aim To evaluate the role of 320-detector row computed tomography (MDCT) with 3D analysis software in the follow-up of patients affected by multicentric hepatocellular carcinoma (HCC) treated with systemic therapy, using the modified Response Evaluation Criteria in Solid Tumors (mRECIST). Patients and methods 38 patients affected by multicentric HCC underwent MDCT. All exams were performed before and after intravenous injection of iodinated contrast material using a 320-detector row CT device. CT images were analyzed by two radiologists using multi-planar reconstructions (MPR) in order to assess the response to systemic therapy according to mRECIST criteria: complete response (CR), partial response (PR), progressive disease (PD), or stable disease (SD). Thirty days later, the same two radiologists evaluated target lesion response to systemic therapy according to mRECIST criteria by using 3D analysis software. The difference between the two systems in assessing HCC response to therapy was assessed by analysis of variance (ANOVA test). Interobserver agreement between the two radiologists using MPR images and 3D analysis software was calculated with Cohen's kappa test. Results PR occurred in 10/38 cases (26%), PD in 6/38 (16%), and SD in 22/38 (58%). The ANOVA test showed no statistically significant difference between the two systems in assessing target lesion response to therapy (p > 0.05). Inter-observer agreement (κ) was 0.62 for MPR image measurements and 0.86 for 3D analysis measurements. Conclusions 3D analysis software provides a semiautomatic system for assessing target lesion response to therapy according to mRECIST criteria in patients affected by multifocal HCC treated with systemic therapy. The reliability of 3D analysis software makes it useful in clinical practice. PMID:28098056
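
    For orientation, the sketch below encodes the usual mRECIST target-lesion cutoffs (CR: disappearance of viable tumor; PR: at least a 30% decrease, and PD: at least a 20% increase, in the sum of viable-tumor diameters). It is a simplified illustration (the full criteria also cover new lesions, non-target disease, and an absolute-increase condition for PD), not code from the paper.

```python
def mrecist_target_response(baseline_sum_mm: float,
                            current_sum_mm: float) -> str:
    """Classify target-lesion response from viable-tumor diameter sums."""
    if current_sum_mm == 0:
        return "CR"  # complete disappearance of enhancing viable tumor
    change = (current_sum_mm - baseline_sum_mm) / baseline_sum_mm
    if change <= -0.30:
        return "PR"  # partial response
    if change >= 0.20:
        return "PD"  # progressive disease
    return "SD"      # stable disease

print(mrecist_target_response(80, 50))   # PR (37.5% decrease)
print(mrecist_target_response(80, 100))  # PD (25% increase)
print(mrecist_target_response(80, 75))   # SD (6.3% decrease)
```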

  19. Coronary artery calcium measurement with multi-detector row CT: in vitro assessment of effect of radiation dose.

    PubMed

    Hong, Cheng; Bae, Kyongtae T; Pilgram, Thomas K; Suh, Jongdae; Bradley, David

    2002-12-01

    The authors assessed in vitro the effect of radiation dose on coronary artery calcium quantification with multi-detector row computed tomography. A cardiac phantom with calcified cylinders was scanned at various tube current-time settings (20-160 mAs). A clear tendency was found for image noise to decrease as tube current increased (P < .001). No tendency was found for the Agatston score or for calcium volume and mass errors to vary with tube current; calcium measurements were not significantly affected by the choice of tube current. Calcium mass error was strongly correlated with calcium volume error (P < .001), and the calcium mass measurement was more accurate and less variable than the calcium volume measurement.
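
    For reference, the Agatston score referred to above weights each calcified lesion's area by a factor derived from its peak attenuation, with a 130-HU detection threshold. A minimal per-slice sketch using the standard weighting; the lesion values are invented:

```python
def agatston_weight(peak_hu: float) -> int:
    """Density weight factor from a lesion's maximum attenuation."""
    if peak_hu < 130:
        return 0  # below the calcium detection threshold
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions) -> float:
    """Sum of area x weight over (area_mm2, peak_hu) lesion tuples."""
    return sum(area * agatston_weight(peak) for area, peak in lesions)

# Three illustrative lesions on one slice: 4*1 + 2.5*3 + 1*4 = 15.5
print(agatston_score([(4.0, 150), (2.5, 310), (1.0, 520)]))
```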

  20. Multi-detector row CT colonography: effect of collimation, pitch, and orientation on polyp detection in a human colectomy specimen.

    PubMed

    Taylor, Stuart A; Halligan, Steve; Bartram, Clive I; Morgan, Paul R; Talbot, Ian C; Fry, Nicola; Saunders, Brian P; Khosraviani, Kirosh; Atkin, Wendy

    2003-10-01

    To investigate the effects of orientation, collimation, pitch, and tube current setting on polyp detection at multi-detector row computed tomographic (CT) colonography and to determine the optimal combination of scanning parameters for screening. A colectomy specimen containing 117 polyps of different sizes was insufflated and imaged with a multi-detector row CT scanner at various collimation (1.25 and 2.5 mm), pitch (3 and 6), and tube current (50, 100, and 150 mA) settings. Two-dimensional multiplanar reformatted images and three-dimensional endoluminal surface renderings from the 12 resulting data sets were examined by one observer for the presence and conspicuity of polyps. The results were analyzed with Poisson regression and logistic regression to determine the effects of scanning parameters and of specimen orientation on polyp detection. The percentage of polyps detected increased significantly when collimation (P = .008) and table feed (P = .03) were decreased. Increased tube current improved detection only of polyps with a diameter of less than 5 mm. Polyps of less than 5 mm were optimally depicted with a collimation of 1.25 mm, a pitch of 3, and a tube current setting of 150 mA; polyps with a diameter greater than 5 mm were adequately depicted with 1.25-mm collimation at either pitch setting and any of the three tube current settings. Small polyps in the transverse segment (positioned at a 90° angle to the z axis of scanning) were significantly less visible than those in parallel or oblique orientations (P < .001). The effective radiation dose, calculated with a Monte Carlo simulation, was 1.4-10.0 mSv. Detection of small polyps (<5 mm) with multi-detector row CT is highly dependent on collimation, pitch, and, to a lesser extent, tube current. Collimation of 1.25 mm, combined with a pitch of 6 and a tube current of 50 mA, provides reliable detection of polyps 5 mm or larger while limiting the effective radiation dose.

  1. Novel ultrahigh resolution data acquisition and image reconstruction for multi-detector row CT

    SciTech Connect

    Flohr, T. G.; Stierstorfer, K.; Suess, C.; Schmidt, B.; Primak, A. N.; McCollough, C. H.

    2007-05-15

    We present and evaluate a special ultrahigh resolution mode providing considerably enhanced spatial resolution both in the scan plane and in the z-axis direction for a routine medical multi-detector row computed tomography (CT) system. Data acquisition is performed by using a flying focal spot both in the scan plane and in the z-axis direction, in combination with tantalum grids inserted in front of the multi-row detector to reduce the aperture of the detector elements both in-plane and in the z-axis direction. The dose utilization of the system for standard applications is not affected, since the grids are moved into place only when needed and are removed for standard scanning. By means of this technique, image slices with a nominal section width of 0.4 mm (measured full width at half maximum = 0.45 mm) can be reconstructed in spiral mode on a CT system with a detector configuration of 32 x 0.6 mm. The measured 2% value of the in-plane modulation transfer function (MTF) is 20.4 lp/cm, and the measured 2% value of the longitudinal (z-axis) MTF is 21.5 lp/cm. In a resolution phantom with metal line pair test patterns, spatial resolution of 20 lp/cm can be demonstrated both in the scan plane and along the z axis, corresponding to a resolvable object size of 0.25 mm. The new mode is intended for ultrahigh resolution bone imaging, in particular for wrist, joint, and inner ear studies, where a higher level of image noise due to the reduced aperture is an acceptable trade-off for the clinical benefit brought about by the improved spatial resolution.
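
    The quoted figures are self-consistent: at 20 line pairs per centimeter, one line pair spans 0.5 mm, and the resolvable object is half of that. A one-line check of the arithmetic:

```python
def resolvable_object_mm(lp_per_cm: float) -> float:
    """Smallest resolvable object at a given spatial frequency."""
    line_pair_mm = 10.0 / lp_per_cm  # 10 mm divided among lp_per_cm pairs
    return line_pair_mm / 2.0        # the object occupies half a pair

print(resolvable_object_mm(20.0))  # -> 0.25 (mm), as stated above
```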

  2. Multi-detector row CT scanning in Paleoanthropology at various tube current settings and scanning mode.

    PubMed

    Badawi-Fayad, J; Yazbeck, C; Balzeau, A; Nguyen, T H; Istoc, A; Grimaud-Hervé, D; Cabanis, E- A

    2005-12-01

    The purpose of this study was to determine the optimal tube current setting and scanning mode for hominid fossil skull scanning using multi-detector row computed tomography (CT). Four fossil skulls (La Ferrassie 1, Abri Pataud 1, Cro-Magnon 2, and Cro-Magnon 3) were examined using the LightSpeed 16 CT scanner (General Electric Medical Systems) with varying dose per section (160, 250, and 300 mAs) and scanning mode (helical and conventional). Image quality of two-dimensional (2D) multiplanar reconstructions, three-dimensional (3D) reconstructions, and native images was assessed by four reviewers using a four-point grading scale. An ANOVA (analysis of variance) model was used to compare the mean score for each sequence and the overall mean score according to the levels of the scanning parameters. Compared with helical CT (mean score = 12.03), the conventional technique showed consistently poor image quality (mean score = 4.17). With the helical mode, we observed better image quality at 300 mAs than at 160 mAs in the 3D sequences (P = 0.03), whereas in native images a reduction in the effective tube current induced no degradation in image quality (P = 0.05). Our study suggests a standardized protocol for fossil scanning with a 16 x 0.625 detector configuration, a 10 mm beam collimation, a 0.562:1 acquisition mode, a 0.625/0.4 mm slice thickness/reconstruction interval, a pitch of 5.62, 120 kV, and 300 mAs, especially when a 3D study is required.

  3. Calcium score of small coronary calcifications on multidetector computed tomography: results from a static phantom study.

    PubMed

    Groen, J M; Kofoed, K F; Zacho, M; Vliegenthart, R; Willems, T P; Greuter, M J W

    2013-02-01

    Multi-detector computed tomography (MDCT) underestimates the coronary calcium score as compared with electron beam tomography (EBT); clinical risk stratification based on MDCT calcium scoring may therefore be inaccurate. The aim of this study was to assess the feasibility of a new phantom which enables establishment of a calcium scoring protocol for MDCT that yields a calcium score comparable to EBT values and to the physical mass. A phantom containing 100 small calcifications ranging from 0.5 to 2.0 mm was scanned on EBT using a standard coronary calcium protocol. In addition, the phantom was scanned on a 320-row MDCT scanner using different scanning, reconstruction, and scoring parameters (tube voltage 80-135 kV, slice thickness 0.5-3.0 mm, reconstruction kernel FC11-FC15, and threshold 110-150 HU). The Agatston and mass scores of the two modalities were compared and the influence of the parameters was assessed. On EBT the Agatston and mass scores were between 0 and 20, and 0 and 3 mg, respectively. On MDCT the Agatston and mass scores were between 0 and 20, and 0 and 4 mg, respectively. All parameters influenced the calcium score: the Agatston score on MDCT differed by 52% between 80 and 135 kV, by 65% between 0.5 and 3.0 mm, and by 48% between FC11 and FC15. More calcifications were detected with a lower tube voltage, a smaller slice thickness, a sharper kernel, and a lower threshold. Based on these observations, an acquisition protocol with a tube voltage of 100 kV and two reconstruction protocols with an FC12 kernel were defined, one with a slice thickness of 3.0 mm and one with a slice thickness of 0.5 mm, yielding an Agatston score as close as possible to the EBT value and a mass score as close as possible to the physical phantom value, respectively. With the new phantom, one acquisition protocol and two reconstruction protocols can be defined that produce Agatston scores comparable to EBT values and mass scores comparable to the physical mass.

  4. Quantitative Features of Liver Lesions, Lung Nodules, and Renal Stones at Multi-Detector Row CT Examinations: Dependency on Radiation Dose and Reconstruction Algorithm.

    PubMed

    Solomon, Justin; Mileto, Achille; Nelson, Rendon C; Roy Choudhury, Kingshuk; Samei, Ehsan

    2016-04-01

    To determine whether radiation dose and reconstruction algorithm affect the computer-based extraction and analysis of quantitative imaging features in lung nodules, liver lesions, and renal stones at multi-detector row computed tomography (CT). Retrospective analysis of data from a prospective, multicenter, HIPAA-compliant, institutional review board-approved clinical trial was performed by extracting 23 quantitative imaging features (size, shape, attenuation, edge sharpness, pixel value distribution, and texture) of lesions on multi-detector row CT images of 20 adult patients (14 men, six women; mean age, 63 years; range, 38-72 years) referred for known or suspected focal liver lesions, lung nodules, or kidney stones. Data were acquired between September 2011 and April 2012. All multi-detector row CT scans were performed at two different radiation dose levels; images were reconstructed with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction (MBIR) algorithms. A linear mixed-effects model was used to assess the effect of radiation dose and reconstruction algorithm on the extracted features. Among the 23 imaging features assessed, radiation dose had a significant effect on five, three, and four of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Adaptive statistical iterative reconstruction had a significant effect on three, one, and one of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). MBIR had a significant effect on nine, 11, and 15 of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Of note, the measured size of lung nodules and renal stones with MBIR differed significantly from that with the other two algorithms (P < .002 for all comparisons). Although lesion texture was significantly affected by the

  5. Validity of blood flow measurement using 320-detector row CT and first-pass distribution theory: a phantom study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Yu, Xuefang; Xu, Shaopeng; Zhou, Kenneth J.

    2015-03-01

    To evaluate the feasibility of measuring myocardial blood flow using 320-detector row CT and a first-pass technique, the heart was simulated with a container filled with pipeline of 3 mm diameter, and the coronary artery was simulated with a pipeline of 2 cm diameter connected to the simulated heart. The simulated coronary artery was connected to a large container holding 1500 ml of saline and 150 ml of contrast agent. A pump connected to the simulated heart withdrew fluid at speeds of 10 ml/min, 15 ml/min, 20 ml/min, 25 ml/min, and 30 ml/min. The first CT scan started 30 s after pumping began at each speed, and the second CT scan started 5 s after the first. The CT images were processed as follows: the first-scan images were subtracted from the second-scan images, the increase in CT value of the simulated heart and the CT value per unit volume of the simulated coronary artery were calculated, and the total inflow of myocardial blood flow was then computed. The CT-derived myocardial blood flows were 0.94 ml/s, 2.09 ml/s, 2.74 ml/s, 4.18 ml/s, and 4.86 ml/s. The correlation coefficient was 0.994 and r² = 0.97. The two-scan method of measuring myocardial blood flow using 320-detector row CT is feasible, and it may allow a new method for quantitative, functional assessment of myocardial perfusion blood flow with a lower radiation dose.
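
    If the computation is as described, the contrast accumulated in the simulated heart between the two scans, divided by the arterial contrast concentration and the inter-scan time, gives the inflow. A minimal sketch of that arithmetic; all symbols and numbers are invented for illustration, not taken from the phantom data.

```python
def first_pass_flow_ml_per_s(delta_hu_heart: float,
                             heart_volume_ml: float,
                             artery_hu: float,
                             delta_t_s: float) -> float:
    """Inflow = (HU increase x heart volume) / (arterial HU x time)."""
    accumulated = delta_hu_heart * heart_volume_ml  # HU * ml of contrast
    return accumulated / (artery_hu * delta_t_s)

# E.g., a 40 HU rise in a 100 ml chamber over 5 s, with an arterial
# enhancement of 400 HU per unit volume:
print(first_pass_flow_ml_per_s(40, 100, 400, 5))  # -> 2.0 ml/s
```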

  6. Evaluation of accuracy of 3D reconstruction images using multi-detector CT and cone-beam CT

    PubMed Central

    Kim, Mija; YI, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul

    2012-01-01

    Purpose This study was performed to determine the accuracy of linear measurements on three-dimensional (3D) images from multi-detector computed tomography (MDCT) and cone-beam computed tomography (CBCT). Materials and Methods MDCT and CBCT were performed on 24 dry skulls. Twenty-one measurements were taken on the dry skulls using a digital caliper. Both types of CT data were imported into OnDemand software, and identification of landmarks on the 3D surface rendering images and calculation of linear measurements were performed. Reproducibility of the measurements was assessed using repeated-measures ANOVA and the ICC, and the measurements were statistically compared using a Student t-test. Results All assessments, for both the direct measurements and the image-based measurements on the 3D CT surface rendering images from MDCT and CBCT, showed no statistically significant differences on ICC examination. The measurements showed no differences between the direct measurements of the dry skulls and the image-based measurements on the 3D CT surface rendering images (P > .05). Conclusion Three-dimensional reconstructed surface rendering images from MDCT and CBCT would be appropriate for 3D measurements. PMID:22474645

  7. Use of an automatic exposure control mechanism for dose optimization in multi-detector row CT examinations: clinical evaluation.

    PubMed

    Mulkens, Tom H; Bellinck, Patrick; Baeyaert, Michel; Ghysen, Dirk; Van Dijck, Xavier; Mussen, Elvier; Venstermans, Caroline; Termote, Jean-Luc

    2005-10-01

    To prospectively compare dose reduction and image quality achieved with an automatic exposure control system that is based on both angular (x-y axis) and z-axis tube current modulation with those achieved with an angular modulation system for multi-detector row computed tomography (CT). The study protocol was approved by the institutional review board, and oral informed consent was obtained. In two groups of 200 patients, five anatomic regions (ie, the thorax, abdomen-pelvis, abdomen-liver, lumbar spine, and cervical spine) were examined with the combined modulation system and a six-section multi-detector row CT scanner. Data from these patients were compared with data from 200 patients who were examined with an angular modulation system. Dose reduction by means of reduction of the mean effective tube current in 600 examinations, image noise in 200 examinations performed with each modulation system, and subjective image quality scores in 100 examinations performed with each modulation system were compared with Wilcoxon signed rank tests. Mean dose reduction with the angular plus z-axis tube current modulation system and with the angular modulation system, respectively, was as follows: thorax, 20% and 14%; abdomen-liver, 38% and 18%; abdomen-pelvis, 32% and 26%; lumbar spine, 37% and 10%; and cervical spine, 68% and 16%. These differences were statistically significant (P < .05). There was no significant difference in image noise or mean image quality scores between modulation systems, with the exception of cervical spine examinations (P < .001 for both), where the examinations with angular modulation received better scores. There is good correlation between the mean effective tube current level and the body mass index of patients with the new modulation system. Correlation was as follows: thorax, 0.77; abdomen-pelvis, 0.83; abdomen-liver, 0.84; lumbar spine, 0.8; and cervical spine, 0

  8. Suspected aortic dissection and other aortic disorders: multi-detector row CT in 373 cases in the emergency setting.

    PubMed

    Hayter, Robert G; Rhea, James T; Small, Andrew; Tafazoli, Faranak S; Novelline, Robert A

    2006-03-01

    To retrospectively review the authors' experience with multi-detector row computed tomography (CT) for detection of aortic dissection in the emergency setting. The investigation was institutional review board approved, did not require informed patient consent, and was HIPAA compliant. In 373 clinical evaluations in the emergency setting, 365 patients suspected of having aortic dissection and/or other aortic disorders underwent multidetector CT. Criteria for acute aortic disorder were confirmed by using surgical and pathologic diagnoses or findings at clinical follow-up and any subsequent imaging as the reference standard. Positive cases were characterized according to type of disorder interpreted. Resulting sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy were calculated by using two-way contingency tables. All cases found to be negative for acute aortic disorders were grouped according to alternative CT findings. Sixty-seven (18.0%) of the 373 cases were interpreted as positive for acute aortic disorder. One hundred twelve acute aortic disorders were identified in these 67 cases: 23 acute aortic dissections, 14 acute aortic intramural hematomas, 20 acute penetrating aortic ulcers, 44 new or enlarging aortic aneurysms, and 11 acute aortic ruptures. Three hundred five (81.8%) cases were interpreted as negative for acute aortic disorder. In 48 negative cases, multidetector CT depicted alternative findings that accounted for the clinical presentation. Of these, three included both acute aortic disorders and alternative findings, and 45 included only alternative findings. One (0.3%) case was indeterminate for acute aortic disorder. Overall, 112 findings were interpreted as positive for acute aortic disorder, an alternative finding, or both at CT. No interpretations were false-positive, one was false-negative, 67 were true-positive, and 304 were true-negative. Sensitivity, specificity, PPV, NPV, and accuracy were

  9. Multi-detector row CT of the left atrium and pulmonary veins before radio-frequency catheter ablation for atrial fibrillation.

    PubMed

    Lacomis, Joan M; Wigginton, William; Fuhrman, Carl; Schwartzman, David; Armfield, Derek R; Pealer, Karen M

    2003-10-01

    Radio-frequency catheter ablation (RFCA) of the distal pulmonary veins and posterior left atrium is increasingly being used to treat recurrent or refractory atrial fibrillation that resists pharmacologic therapy or cardioversion. Successful RFCA of atrial fibrillation requires resolution of abnormal rhythms while minimizing complications and can be achieved with precise, preprocedural, three-dimensional (3D) anatomic delineation of the target, the atriopulmonary venous junction. Three-dimensional multi-detector row computed tomography (CT) of the pulmonary veins and left atrium provides the necessary anatomic information for successful RFCA, including (a) the number, location, and angulation of pulmonary veins and their ostial branches unobscured by adjacent cardiac and vascular anatomy, and (b) left atrial volume. The 3D multi-detector row CT scanning and postprocessing techniques used for pre-RFCA planning are straightforward. Radiologists must not only understand these techniques but must also be familiar with atrial fibrillation and the technical considerations and complications associated with RFCA of this condition. In addition, radiologists must be familiar with anatomic variants of the left atrium and distal pulmonary veins and understand the importance of these variants to the referring cardiac interventional electrophysiologist. Copyright RSNA, 2003

  10. Control electronics for a multi-laser/multi-detector scanning system

    NASA Technical Reports Server (NTRS)

    Kennedy, W.

    1980-01-01

    The Mars Rover Laser Scanning system uses a precision laser pointing mechanism, a photodetector array, and the concept of triangulation to perform three dimensional scene analysis. The system is used for real time terrain sensing and vision. The Multi-Laser/Multi-Detector laser scanning system is controlled by a digital device called the ML/MD controller. A next generation laser scanning system, based on the Level 2 controller, is microprocessor based. The new controller capabilities far exceed those of the ML/MD device. The first draft circuit details and general software structure are presented.

  11. Multi-detector CT imaging in the postoperative orthopedic patient with metal hardware.

    PubMed

    Vande Berg, Bruno; Malghem, Jacques; Maldague, Baudouin; Lecouvet, Frederic

    2006-12-01

    Multi-detector CT (MDCT) has become a routine imaging modality in the assessment of postoperative orthopedic patients with metallic instrumentation, which degrades image quality at MR imaging. This article reviews the physical basis and CT appearance of such metal-related artifacts. It also addresses the clinical value of MDCT in postoperative orthopedic patients, with emphasis on fracture healing, spinal fusion or arthrodesis, and joint replacement. MDCT imaging shows limitations in the assessment of the bone marrow cavity and of the soft tissues, for which MR imaging remains the modality of choice despite metal-related anatomic distortion and signal alteration.

  12. Detection of hypervascular hepatocellular carcinoma: Comparison of multi-detector CT with digital subtraction angiography and Lipiodol CT

    PubMed Central

    Zheng, Xiao-Hua; Guan, Yong-Song; Zhou, Xiang-Ping; Huang, Juan; Sun, Long; Li, Xiao; Liu, Yuan

    2005-01-01

    AIM: The purpose of this study was to compare the diagnostic accuracy of biphasic multi-detector row helical computed tomography (MDCT), digital subtraction angiography (DSA), and Lipiodol computed tomography (CT) in the detection of hypervascular hepatocellular carcinoma (HCC). METHODS: Twenty-eight patients with nodular HCC underwent biphasic MDCT examination: hepatic arterial phase (HAP) 25 s and portal venous phase (PVP) 70 s after injection of the contrast medium (1.5 mL/kg). They also underwent hepatic angiography and intra-arterial infusion of iodized oil, with Lipiodol CT performed 3-4 wk after infusion. MDCT images were compared with DSA and Lipiodol CT images for the detection of hepatic nodules. RESULTS: The three imaging techniques had the same sensitivity in detecting nodules >20 mm in diameter. There was no significant difference in sensitivity among HAP-MDCT, Lipiodol CT, and DSA for nodules of 10-20 mm in diameter. For nodules <10 mm in diameter, HAP-MDCT identified 47, Lipiodol CT detected 27 (χ² = 11.3, P = 0.005, HAP-MDCT vs Lipiodol CT), and DSA detected 16 (χ² = 9.09, P = 0.005 vs Lipiodol CT; χ² = 29.03, P = 0.005 vs HAP-MDCT). However, six nodules <10 mm in diameter were detected only by Lipiodol CT. CONCLUSION: MDCT and Lipiodol CT are two complementary modalities. At present, MDCT does not obviate the need for DSA and subsequent Lipiodol CT as a preoperative examination for HCC. PMID:15633215

  13. Collimated prompt gamma TOF measurements with multi-slit multi-detector configurations

    NASA Astrophysics Data System (ADS)

    Krimmer, J.; Chevallier, M.; Constanzo, J.; Dauvergne, D.; De Rydt, M.; Dedes, G.; Freud, N.; Henriquet, P.; La Tessa, C.; Létang, J. M.; Pleskač, R.; Pinto, M.; Ray, C.; Reithinger, V.; Richard, M. H.; Rinaldi, I.; Roellinghoff, F.; Schuy, C.; Testa, E.; Testa, M.

    2015-01-01

    Longitudinal prompt-gamma ray profiles have been measured with a multi-slit multi-detector configuration at a 75 MeV/u ¹³C beam with a PMMA target. Selections in time-of-flight and energy were applied in order to discriminate prompt-gamma rays produced in the target from background events. The ion ranges extracted from the individual detector modules agree with one another and are consistent with theoretical expectations. In a separate dedicated experiment with 200 MeV/u ¹²C ions, the fraction of inter-detector scattering was determined to be at the 10% level via a combination of experimental results and simulations. In the same experiment, different collimator configurations were tested and the shielding properties of tungsten and lead for prompt-gamma rays were measured.

  14. Assessment of coronary bypass graft patency by first-line multi-detector computed tomography.

    PubMed

    Pesenti-Rossi, D; Baron, N; Georges, J-L; Augusto, S; Gibault-Genty, G; Livarek, B

    2014-11-01

    The purpose of the study was to assess whether a strategy based on multi-detector computed tomography (MDCT) performed routinely before conventional angiography (CA) can reduce the radiation dose during CA, without increasing global exposure, in patients who need imaging of coronary artery bypass grafts (CABG). A total of 147 consecutive patients were included. The radiation dose during CA (kerma-area product 12.1 vs 22.0 Gy·cm², P<.01) and the volume of iodinated contrast (155 vs 200 mL, P<.02) were reduced when CA was preceded by MDCT. Patients' cumulative exposures did not differ between the two strategies (5.0 vs 5.1 mSv, P=.76). MDCT performed first-line is a valuable strategy for the assessment of CABG.

  15. High Resolution Multi-Detector CT Aided Tissue Analysis and Quantification of Lung Fibrosis

    PubMed Central

    Zavaletta, Vanessa A.; Bartholmai, Brian J.; Robb, Richard A

    2009-01-01

    Rationale and Objectives Volumetric high-resolution scans of the lungs can be acquired with multi-detector CT (MDCT). Such scans have the potential to facilitate useful visualization, characterization, and quantification of the extent of diffuse lung diseases, such as usual interstitial pneumonitis/idiopathic pulmonary fibrosis (UIP/IPF). There is a need to objectify, standardize, and improve the accuracy and repeatability of pulmonary disease characterization and quantification from such scans. This paper presents a novel texture analysis approach to the classification and quantification of the various pathologies present in lungs with UIP/IPF. The approach integrates a texture matching method with histogram feature analysis. Materials and Methods Patients with moderate UIP/IPF were scanned on a LightSpeed 8-detector GE CT scanner (140 kVp, 250 mAs). Images were reconstructed with 1.25 mm slice thickness in a high-frequency sparing algorithm (BONE) with 50% overlap and a 512 × 512 axial matrix (0.625 mm³ voxels). Eighteen scans were used in this study. Each dataset was pre-processed, including segmentation of the lungs and the broncho-vascular trees. Two types of analysis were performed: first, an analysis of independent volumes of interest (VOIs), and second, an analysis of whole-lung datasets. 1.) Fourteen of the eighteen scans were used to create a database of independent 15 × 15 × 15 cubic-voxel VOIs. The VOIs were selected by experts as having greater than 70% of the defined class. The database was composed of the following: honeycombing (337 VOIs), reticular (130), ground glass (148), normal (240), and emphysema (54). This database was used to develop our algorithm. Three progressively challenging classification experiments were designed to test our algorithm, all performed using a 10-fold cross-validation method for error estimation. Experiment 1 consisted of a two-class discrimination: normal and abnormal. Experiment 2 consisted of a four
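
    As a rough illustration of this kind of VOI pipeline (histogram features plus a classifier evaluated with 10-fold cross-validation), the sketch below runs a k-NN matcher on synthetic intensity cubes; the feature set, classifier, and all numbers are stand-in assumptions, not the authors' method.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def histogram_features(voi, bins=16, lo=-1024, hi=200):
    """Normalized intensity histogram of one 15x15x15 VOI."""
    hist, _ = np.histogram(voi, bins=bins, range=(lo, hi))
    return hist / hist.sum()

# Fake VOI database: two classes with different intensity statistics
# (a stand-in for "normal" vs "abnormal" lung), 15^3-voxel cubes.
rng = np.random.default_rng(0)
normal = rng.normal(-850, 60, size=(100, 15, 15, 15))
abnormal = rng.normal(-600, 120, size=(100, 15, 15, 15))
X = np.array([histogram_features(v)
              for v in np.concatenate([normal, abnormal])])
y = np.array([0] * 100 + [1] * 100)

scores = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```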

  16. Cost effectiveness of multi detector CT angiography of the coronary arteries for the diagnosis of suspected non-ST elevation acute coronary syndrome (NSTE-ACS) in the emergency department. Mathematical analysis with a decision model.

    PubMed

    De Beule, T; Vanhoenacker, P; de Booij, M; Ardies, L; Bladt, O

    2010-01-01

    The purpose of our study was to model the cost-effectiveness of MDCTA for the diagnosis of NSTE-ACS with initially negative enzymes in the emergency department. A decision tree model was developed and a mathematical study was performed that included two hypothetical strategies: MDCTA versus admission with classic clinical follow-up and treatment. Cost-effectiveness for the Belgian situation was simulated with sensitivity analysis, using known values for diagnostic performance and known costs for the different strategies or components of strategies. In Belgium, the use of multi-detector computed tomography angiography (MDCTA) is probably cost-effective in the diagnosis of NSTE-ACS in the acute setting.

  17. Evaluation of different small bowel contrast agents by multi-detector row CT

    PubMed Central

    Wang, Yong-Ren; Yu, Xiao-Li; Peng, Zhi-Yi

    2015-01-01

    Objective: This study aimed to evaluate the effects of different oral small bowel contrast agents on intestinal dilatation and depiction of the intestinal wall structure at abdominal multi-detector row CT (MDCT). Methods: 80 patients underwent whole-abdominal CT examination and were randomly divided into four groups of 20 patients each. Forty-five minutes before the CT examination, the patients received a total of 1800 ml of pure water, pure milk, dilute lactulose solution, or isotonic mannitol solution, respectively. The images were read blindly by two experienced abdominal radiologists at the workstation; the cross-sectional diameters of the duodenum, jejunum, and proximal and terminal ileum were measured for each patient, and analysis of variance was performed to assess differences in intestinal dilatation among the groups. A scoring method was used to rate intestinal dilatation and depiction of intestinal structure. Results: The diluted lactulose solution and 2.5% mannitol produced the best intestinal dilatation. Similarly, the diluted lactulose solution and 2.5% mannitol achieved the highest scores for dilatation of the entire small bowel and depiction of intestinal structure. Conclusions: 2.5% osmotic mannitol and the diluted lactulose solution enabled full dilatation of the small bowel and could clearly depict the wall structure. PMID:26629131
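
    A minimal sketch of the variance analysis described above, assuming one array of measured luminal diameters per contrast-agent group (the values below are randomly generated stand-ins, not study data):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Hypothetical jejunal diameters (mm) for the four groups of 20 patients
        water, milk, lactulose, mannitol = (rng.normal(mu, 2.0, 20)
                                            for mu in (14, 15, 19, 19))

        # One-way ANOVA across the four contrast-agent groups
        f_stat, p_value = stats.f_oneway(water, milk, lactulose, mannitol)
        print(f"F = {f_stat:.2f}, p = {p_value:.4f}")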

  18. Bronchial anatomy of left lung: a study of multi-detector row CT.

    PubMed

    Zhao, Xinya; Ju, Yuanrong; Liu, Cheng; Li, Jianfeng; Huang, Min; Sun, Jian; Wang, Tao

    2009-02-01

    Familiarity with the prevailing pattern and variations of the bronchial tree is not only essential for the anatomist to explain bronchial variation in bronchial specimens, but also useful for guiding bronchoscopy and planning pulmonary segmental resection. This study was designed to demonstrate the various branching patterns of the left lung with 3D images, with special attention to identifying the major types on transverse thin-section CT. Two hundred and sixteen patients with routine thorax scans were enrolled. Images of the bronchial tree and virtual bronchoscopy were reconstructed using the post-processing techniques of multi-detector row CT. We classified the segmental bronchi by interpreting the post-processing images and identified them on transverse thin-section CT. Our results showed that the segmental bronchial ramifications of the left superior lobe fell mainly into three types: a common stem of the apical and posterior segmental bronchi (64%, 138/216); trifurcation (23%, 50/216); and a common stem of the apical and anterior segmental bronchi (10%, 22/216); these could be identified at two typical sections on transverse thin-section CT. There were two major types of left basal segmental bronchi, bifurcation (75%, 163/216) and trifurcation (18%, 39/216), which could also be identified at two typical sections on transverse thin-section CT. In conclusion, our study offers simplified branching patterns of the bronchi, demonstrates various unusual branching patterns with 3D images, and shows how to identify the main branching patterns on transverse thin-section CT.

  19. [Comparison of a dental cone beam CT with a multi-detector row CT on effective doses and physical image quality].

    PubMed

    Yoshida, Yutaka; Tokumori, Kenji; Okamura, Kazutoshi; Yoshiura, Kazunori

    2011-01-01

    The purpose of this study was to compare a dental cone beam computed tomography (dental CBCT) unit with a multi-detector row CT (MDCT) unit in terms of effective dose and physical image quality. A dental mode (D-mode) and an implant mode (I-mode) were employed for calculating effective doses. The field of view (FOV) of the MDCT was 150 mm. Three types of images were obtained using three different reconstruction functions: FC1 (for abdominal images), FC30 (for internal ear and bone images), and FC81 (for high-resolution images). Effective doses obtained with the D-mode and the I-mode were about 20% and 50%, respectively, of those obtained with the MDCT. Resolution properties obtained with the D-mode and I-mode were superior to those of the MDCT in the high-frequency range. Noise properties of the D-mode and the I-mode were better than those obtained with FC81. The dental CBCT thus showed better potential than the MDCT in both dental and implant modes.

  20. Radiation dose is reduced with a single-pass whole-body multi-detector row CT trauma protocol compared with a conventional segmented method: initial experience.

    PubMed

    Ptak, Thomas; Rhea, James T; Novelline, Robert A

    2003-12-01

    Radiation dose data were collected from a calibrated multi-detector row computed tomographic (CT) scanner during trauma CT. One protocol (used with 10 case subjects) involved a single-pass continuous whole-body acquisition from cranial vertex to symphysis pubis, while the other, conventional protocol (used with 10 control subjects) involved scouting and scanning body segments (head, cervical spine, chest, abdomen, and pelvis) individually. Technical factors were kept constant within each body segment for the single-pass and the segmented protocols. Statistics included univariate analysis, two-tailed t testing to evaluate the statistical significance of the summary statistic, and power and subject-population contingency tables. The mean dose-length product (DLP) with the single-pass protocol was 17% lower than the sum of the DLPs of the individual body-segment scans (P < .001). Analysis of power and subject population, using a difference in means of 500 mGy·cm and an alpha of .05, revealed a power (1 − beta) higher than 0.90 for a sample of 10 patients. Thus, a whole-body single-pass trauma protocol, compared with a typical segmented acquisition protocol matched for imaging technique, resulted in a reduced total radiation dose. The reduction in radiation dose is thought to reflect the elimination of redundant imaging at the overlap zones between body segments, which are scanned twice in the segmented protocol but not in the continuous acquisition.
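
    For illustration, the two-tailed t test on per-patient DLP totals described above might be run as follows (the DLP values are hypothetical placeholders, not the study's data):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical whole-body DLP totals (mGy*cm), 10 patients per protocol
        dlp_single_pass = rng.normal(2500.0, 300.0, 10)
        dlp_segmented = rng.normal(3000.0, 300.0, 10)

        t_stat, p_value = stats.ttest_ind(dlp_single_pass, dlp_segmented)
        reduction = 1.0 - dlp_single_pass.mean() / dlp_segmented.mean()
        print(f"t = {t_stat:.2f}, two-tailed p = {p_value:.4f}, "
              f"mean DLP reduction = {reduction:.0%}")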

  1. Multi-detector row CT as a "one-stop" examination in the preoperative evaluation of the morphology and function of living renal donors: preliminary study.

    PubMed

    Su, Chen; Yan, Chaogui; Guo, Yan; Zhou, Xuhui; Chen, Yaqing; Liu, Mingjuan; Wang, Wenjuan; Zhang, Xiaoling

    2011-02-01

    We investigated the feasibility of multi-detector row computerized tomography (CT) as a "one-stop" examination for the simultaneous preoperative evaluation of the morphology and function of living renal donors. Twenty-one living renal donors were examined by 64-slice spiral CT with a three-phase enhancement CT scan and two inserted dynamic scans. Maximum intensity projection (MIP), multi-planar reformation (MPR), and volume rendering (VR) were performed, and the renal parenchyma, renal vessels, and collecting system were compared with operative findings. The Patlak equation was used to calculate the glomerular filtration rate (GFR); reference GFR values were acquired by single photon emission computed tomography (SPECT). There were 3 cases of arterial variation and 3 cases of venous variation. CT findings all corresponded with the operative findings, and the sensitivity, positive predictive value, specificity, and negative predictive value of CT were all 100%. The correlation coefficient (r) between CT-estimated and reference GFR values was 0.894 for the left kidney (P < 0.001) and 0.881 for the right (P < 0.001). In conclusion, our findings demonstrate that 64-slice spiral CT may offer a "one-stop" examination to replace SPECT in the preoperative evaluation of living renal donors, simultaneously providing information on both anatomy and GFR.
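
    The Patlak equation mentioned above reduces to a straight-line fit of transformed enhancement curves; a minimal sketch, assuming baseline-subtracted aortic and renal time-attenuation curves sampled after contrast arrival (all names are illustrative):

        import numpy as np

        def patlak_slope(t, aorta_hu, kidney_hu):
            # x: cumulative integral of the arterial input over its instantaneous
            # value; y: tissue enhancement over the instantaneous input.  The
            # fitted slope estimates clearance per unit tissue volume; scaling by
            # parenchymal volume yields a whole-kidney GFR estimate.
            cum = np.concatenate(([0.0],
                np.cumsum(0.5 * (aorta_hu[1:] + aorta_hu[:-1]) * np.diff(t))))
            x = cum / aorta_hu
            y = kidney_hu / aorta_hu
            slope, _intercept = np.polyfit(x, y, 1)
            return slope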

  2. Radiation doses for pregnant women in the late pregnancy undergoing fetal-computed tomography: a comparison of dosimetry and Monte Carlo simulations.

    PubMed

    Matsunaga, Yuta; Kawaguchi, Ai; Kobayashi, Masanao; Suzuki, Shigetaka; Suzuki, Shoichi; Chida, Koichi

    2016-09-19

    The purposes of this study were (1) to compare the radiation doses from 320- and 80-row fetal computed tomography (CT), estimated using thermoluminescent dosimeters (TLDs) and the ImPACT Calculator (hereinafter, the "CT dosimetry software"), for a woman in late pregnancy and her fetus, and (2) to estimate the overlapped fetal radiation dose from a 320-row CT examination using two different estimation methods of the CT dosimetry software. The direct TLD data in the present study were obtained from a previous study. The exposure parameters used for the TLD measurements were entered into the CT dosimetry software, and the radiation doses for the pregnant woman and her fetus were estimated. When the whole organs (e.g., the colon, small intestine, and ovaries) and the fetus were included in the scan range, the difference in estimated doses between the TLD and CT dosimetry software measurements was <1 mGy (<23%) for both CT units. In addition, when the whole organs were within the scan range, the CT dosimetry software could be used to evaluate the fetal radiation dose and the woman's organ-specific doses. The conventional method using the CT dosimetry software cannot take into account the overlap between volumetric sections; therefore, with a 320-row CT unit in wide-volume mode it might underestimate the radiation doses to the fetus and to the colon, small intestine, and ovaries.

  3. Assessment of trabecular bone structure of the calcaneus using multi-detector CT: correlation with microCT and biomechanical testing.

    PubMed

    Diederichs, Gerd; Link, Thomas M; Kentenich, Marie; Schwieger, Karsten; Huber, Markus B; Burghardt, Andrew J; Majumdar, Sharmila; Rogalla, Patrik; Issever, Ahi S

    2009-05-01

    The prediction of bone strength can be improved when bone mineral density (BMD) is determined in combination with measures of trabecular microarchitecture. The goal of this study was to assess parameters of trabecular bone structure and texture of the calcaneus by clinical multi-detector row computed tomography (MDCT) in an experimental in situ setup and to correlate these parameters with micro-computed tomography (microCT) and biomechanical testing. Thirty calcanei in 15 intact cadavers were scanned using three different protocols on a 64-slice MDCT scanner with an in-plane pixel size of 208 μm and a slice thickness of 500 μm. Bone cores were harvested from each specimen and microCT images with a voxel size of 16 μm were obtained. After image coregistration, trabecular bone structure and texture were evaluated in identical regions on the MDCT images. After data acquisition, uniaxial compression testing was performed. Significant correlations between MDCT- and microCT-derived measures of bone volume fraction (BV/TV), trabecular thickness (Tb.Th), and trabecular separation (Tb.Sp) were found (R² = 0.19-0.65, p < 0.01 or p < 0.05). The MDCT-derived parameters of volumetric BMD, app. BV/TV, app. Tb.Th, and app. Tb.Sp predicted 60%, 63%, 53%, and 25% of the variation in bone strength, respectively (p < 0.01). When these measures were combined with one additional texture index (GLCM, TOGLCM, or MF.euler), the prediction of mechanical competence improved significantly to 86%, 85%, 71%, and 63% (p < 0.01). In conclusion, this study showed the feasibility of trabecular microarchitecture assessment using MDCT in an experimental setup simulating the clinical situation. Multivariate models of BMD or structural parameters combined with texture indices significantly improved the prediction of bone strength and might provide more reliable estimates of fracture risk in patients.

  4. Assessment of organ absorbed doses and estimation of effective doses from pediatric anthropomorphic phantom measurements for multi-detector row CT with and without automatic exposure control.

    PubMed

    Brisse, Hervé J; Robilliard, Magalie; Savignoni, Alexia; Pierrat, Noelle; Gaboriaud, Geneviève; De Rycke, Yann; Neuenschwander, Sylvia; Aubert, Bernard; Rosenwald, Jean-Claude

    2009-10-01

    This study was designed to measure organ absorbed doses from multi-detector row computed tomography (MDCT) on pediatric anthropomorphic phantoms, calculate the corresponding effective doses, and assess the influence of automatic exposure control (AEC) in terms of organ dose variations. Four anthropomorphic phantoms (representing the equivalent of a newborn and 1-, 5-, and 10-y-old children) were scanned with a four-channel MDCT coupled with a z-axis-based AEC system. Two CT torso protocols were compared: the first without AEC, at constant tube current-time product, and the second with AEC using age-adjusted noise indices. Organ absorbed doses were monitored with thermoluminescent dosimeters (LiF: Mg, Cu, P). Effective doses were calculated according to the tissue weighting factors of the International Commission on Radiological Protection. For fixed-mA acquisitions, organ doses normalized to the volume CT dose index in a 16-cm head phantom (CTDIvol16) ranged from 0.6 to 1.5, and effective doses ranged from 8.4 to 13.5 mSv. For the newborn-equivalent phantom, the AEC-modulated scan showed almost no significant dose variation compared with the fixed-mA scan. For the 1-, 5-, and 10-y-equivalent phantoms, the use of AEC produced a significant dose decrease for chest organs (ranging from 61% to 31% for thyroid, 37% to 21% for lung, 34% to 17% for esophagus, and 39% to 10% for breast). However, AEC also produced a significant dose increase (ranging from 28% to 48% for salivary glands, 22% to 51% for bladder, and 24% to 70% for ovaries) related to the high density of the skull base and pelvic bones. These dose increases should be considered before using AEC as a dose-optimization tool in children.

  5. Approach to interpret images produced by new generations of multi detector computed tomography scanners in post-operative spine.

    PubMed

    Zeitoun, Rania; Hussein, Manar

    2017-09-04

    To develop a practical approach to interpreting MDCT findings in post-operative spine cases and to correct the false belief that CT fails in the presence of instrumentation because of related artifacts. We performed an observational retrospective analysis of initial, early, and late MDCT scans in 68 post-operative spine patients, with emphasis on instrumentation-related complications and osseous fusion status. We used a grading system for the assessment of osseous fusion in 35 patients and further analyzed the findings in failure of fusion, grade (D). We observed a variety of instrumentation-related complications (mostly screws medially penetrating the pedicle) and assessed osseous fusion status on late scans. We graded 11 inter-body and 14 postero-lateral levels as osseous fusion failure; these showed additional instrumentation-related complications, erosive end-plate changes, adjacent-segment spondylosis, and malalignment. Modern MDCT scanners provide high-quality images and are strongly recommended for assessment of instrumentation and osseous fusion status. In post-operative imaging of the spine it is essential to know what to look for, in relation to the date of surgery. Advances in knowledge: 1. Modern MDCT scanners allow assessment of instrument position and integrity and of osseous fusion status in the post-operative spine. 2. We propose a helpful algorithm to simplify the interpretation of post-operative spine imaging.

  6. Multi-detector CT assessment in pulmonary hypertension: techniques, systematic approach to interpretation and key findings.

    PubMed

    Lewis, Gareth; Hoey, Edward T D; Reynolds, John H; Ganeshan, Arul; Ment, Jerome

    2015-06-01

    Pulmonary arterial hypertension (PAH) may be suspected on the basis of the clinical history, physical examination, and electrocardiogram findings, but imaging is usually central to confirming the diagnosis, establishing a cause, and guiding therapy. The diagnostic pathway of PAH involves a variety of complementary investigations, among which computed tomography pulmonary angiography (CTPA) has an established central role, both in helping to identify an underlying cause of PAH and in assessing the resulting functional compromise. In particular, CTPA is considered the gold-standard technique for the diagnosis of thromboembolic disease. This article reviews CTPA evaluation in PAH, describing CTPA techniques, a systematic approach to interpretation, and the spectrum of key imaging findings.

  7. Multi-detector spiral CT study of the relationships between pulmonary ground-glass nodules and blood vessels.

    PubMed

    Gao, Feng; Li, Ming; Ge, Xiaojun; Zheng, Xiangpeng; Ren, Qingguo; Chen, Yan; Lv, Fangzhen; Hua, Yanqing

    2013-12-01

    To investigate the relationships between pulmonary ground-glass nodules (GGNs) and blood vessels and their diagnostic value in differentiating GGNs. Multi-detector spiral CT images of 108 GGNs were retrospectively reviewed. The spatial relationships between GGNs and supplying blood vessels were categorized into four types: I, vessels passing by the GGN; II, intact vessels passing through the GGN; III, distorted, dilated, or tortuous vessels within the GGN; and IV, vasculature more complicated than described above. Relationship types were correlated with the pathologic and/or clinical findings of the GGNs. Of the 108 GGNs, 10 were benign, 24 were preinvasive nodules, and 74 were pathologically proven adenocarcinomas. Type I, II, III, and IV vascular relationships were observed in 9, 58, 21, and 20 GGNs, respectively. Type II was the dominant relationship in each GGN group, but significant differences were found among the groups. Correlation analysis showed a strong correlation between invasive adenocarcinoma and type III and IV relationships. Subgroup analysis indicated that type III was more commonly seen in invasive adenocarcinoma (IAC), whereas type IV was more likely in minimally invasive adenocarcinoma (MIA). Different GGNs have different relationships with vessels; understanding and recognising characteristic GGN-vessel relationships may help identify which GGNs are more likely to be malignant.

  8. A retrospective comparison of smart prep and test bolus multi-detector CT pulmonary angiography protocols

    SciTech Connect

    Suckling, Tara; Smith, Tony; Reed, Warren

    2013-06-15

    Optimal arterial opacification is crucial when imaging the pulmonary arteries with computed tomography (CT). This poses the challenge of precisely timing data acquisition to coincide with the transit of the contrast bolus through the pulmonary vasculature. The aim of this quality assurance exercise was to investigate whether a change in CT pulmonary angiography (CTPA) scanning protocol resulted in improved opacification of the pulmonary arteries. Comparison was made between the smart prep protocol (SPP) and the test bolus protocol (TBP) for opacification of the pulmonary trunk. A total of 160 CTPA examinations (80 using each protocol) performed between January 2010 and February 2011 were assessed retrospectively. CT attenuation coefficients were measured in Hounsfield units (HU) using regions of interest at the level of the pulmonary trunk. The average pixel value, standard deviation (SD), maximum, and minimum were recorded, and for each of these variables a mean value was calculated and compared between the two CTPA protocols. Minimum opacification of 200 HU was achieved in 98% of the TBP sample but only 90% of the SPP sample. The average CT attenuation over the pulmonary trunk was 329 ± 21 HU for the SPP and 396 ± 22 HU for the TBP (P = 0.0017). The TBP also produced higher maximum (P = 0.0024) and minimum (P = 0.0039) levels of opacification. This study found that the TBP resulted in significantly better opacification of the pulmonary trunk than the SPP.

  9. Improving Image Quality of On-Board Cone-Beam CT in Radiation Therapy Using Image Information Provided by Planning Multi-Detector CT: A Phantom Study

    PubMed Central

    Yang, Ching-Ching; Chen, Fong-Lin; Lo, Yeh-Chi

    2016-01-01

    Purpose The aim of this study was to improve the image quality of cone-beam computed tomography (CBCT) mounted on the gantry of a linear accelerator used in radiation therapy, based on the image information provided by planning multi-detector CT (MDCT). Methods MDCT-based shading correction for CBCT and virtual monochromatic CT (VMCT) synthesized using the dual-energy method were performed. In VMCT, the high-energy data were obtained from CBCT, while the low-energy data were obtained from MDCT. An electron density phantom was used to investigate the efficacy of shading correction and VMCT in improving target detectability, Hounsfield unit (HU) accuracy, and HU variation, which were quantified by calculating the contrast-to-noise ratio (CNR), the percent difference (%Diff), and the standard deviation of the CT numbers for tissue-equivalent background material, respectively. Treatment plan studies for a chest phantom were conducted to investigate the effects of image quality improvement on dose planning. Results For the electron density phantom, the mean value of CNR was 17.84, 26.78 and 34.31 in CBCT, shading-corrected CBCT and VMCT, respectively. The mean value of %Diff was 152.67%, 11.93% and 7.66% in CBCT, shading-corrected CBCT and VMCT, respectively. The standard deviation within a uniform background of CBCT, shading-corrected CBCT and VMCT was 85, 23 and 15 HU, respectively. With regard to the chest phantom, the monitor unit (MU) difference between the treatment plan calculated using MDCT and those based on CBCT, shading-corrected CBCT and VMCT was 6.32%, 1.05% and 0.94%, respectively. Conclusions Enhancement of image quality in on-board CBCT can contribute to daily patient setup and adaptive dose delivery, thus enabling higher confidence in patient treatment accuracy in radiation therapy. Based on our results, VMCT has the highest image quality, followed by the shading-corrected CBCT and the original CBCT. The research results presented in this study should be
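
    The objective metrics named above follow directly from ROI statistics; a small sketch (the %Diff definition, relative to the material's nominal HU value, is our reading of the text):

        import numpy as np

        def roi_image_quality(target_roi, background_roi, nominal_hu):
            # target_roi / background_roi: arrays of HU values inside the ROIs
            t_mean = np.mean(target_roi)
            b_mean = np.mean(background_roi)
            noise = np.std(background_roi)      # HU variation in the background
            cnr = abs(t_mean - b_mean) / noise
            pct_diff = 100.0 * abs(b_mean - nominal_hu) / abs(nominal_hu)
            return cnr, pct_diff, noise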

  10. Sample size requirements for estimating effective dose from computed tomography using solid-state metal-oxide-semiconductor field-effect transistor dosimetry

    SciTech Connect

    Trattner, Sigal; Cheng, Bin; Pieniazek, Radoslaw L.; Hoffmann, Udo; Douglas, Pamela S.; Einstein, Andrew J.

    2014-04-15

    Purpose: Effective dose (ED) is a widely used metric for comparing ionizing radiation burden between different imaging modalities, scanners, and scan protocols. In computed tomography (CT), ED can be estimated by performing scans on an anthropomorphic phantom in which metal-oxide-semiconductor field-effect transistor (MOSFET) solid-state dosimeters have been placed to enable organ dose measurements. Here a statistical framework is established to determine the sample size (number of scans) needed for estimating ED to a desired precision and confidence, for a particular scanner and scan protocol, subject to practical limitations. Methods: The statistical scheme involves solving equations which minimize the sample size required for estimating ED to desired precision and confidence. It is subject to a constrained variation of the estimated ED and solved using the Lagrange multiplier method. The scheme incorporates measurement variation introduced both by MOSFET calibration, and by variation in MOSFET readings between repeated CT scans. Sample size requirements are illustrated on cardiac, chest, and abdomen–pelvis CT scans performed on a 320-row scanner and chest CT performed on a 16-row scanner. Results: Sample sizes for estimating ED vary considerably between scanners and protocols. Sample size increases as the required precision or confidence is higher and also as the anticipated ED is lower. For example, for a helical chest protocol, for 95% confidence and 5% precision for the ED, 30 measurements are required on the 320-row scanner and 11 on the 16-row scanner when the anticipated ED is 4 mSv; these sample sizes are 5 and 2, respectively, when the anticipated ED is 10 mSv. Conclusions: Applying the suggested scheme, it was found that even at modest sample sizes, it is feasible to estimate ED with high precision and a high degree of confidence. As CT technology develops enabling ED to be lowered, more MOSFET measurements are needed to estimate ED with the same
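
    A simplified, normal-approximation version of the sample-size idea above (the full scheme uses Lagrange multipliers and models MOSFET calibration and scan-to-scan variation separately; the cv value below is hypothetical):

        import math
        from scipy import stats

        def n_scans(cv, precision=0.05, confidence=0.95):
            # Smallest n with relative CI half-width <= precision for mean ED,
            # where cv is the coefficient of variation of one ED measurement
            z = stats.norm.ppf(0.5 + confidence / 2.0)
            return math.ceil((z * cv / precision) ** 2)

        print(n_scans(cv=0.14))  # 31 scans for 95% confidence and 5% precision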

  11. Impact of filter convolution and displayed field of view on estimation of coronary Agatston scores in low-dose lung computed tomography.

    PubMed

    Wan, Yung-Liang; Tsay, Pei-Kwei; Wu, Patricia Wanping; Juan, Yu-Hsiang; Tsai, Hui-Yu; Lin, Chung-Yin; Yeh, Chih-Sheng; Wang, Chun-Hua; Chen, Chun-Chi

    2017-06-01

    Coronary artery calcification (CAC) may be quantified on low-dose computed tomography of the lung (LDCT). This study aimed to evaluate the effects of filter convolution (FC) and displayed field of view (dFOV) on CAC quantification in a Toshiba 320-row CT scanner, and to compare the CAC scores obtained by LDCT with standard cardiac CT. Fifty subjects (52 to 85 years, mean 68.5, 36 males) with visible CAC underwent both standard cardiac CT and LDCT. CAC scores were obtained from standard cardiac CT using conventional FC12(22) (FC12 with 22-cm dFOV) and from four different LDCT protocols: FC02(22), FC02(40), FC08(22), and FC08(40). CAC scores obtained by each LDCT protocol were compared with those obtained by standard cardiac CT. CAC scores from all four LDCT protocols correlated well with those from the standard protocol (Pearson's coefficient = 0.978 to 0.987, p < 0.001; kappa = 0.731 to 0.836, p < 0.001). CAC scores obtained by FC08(22) showed the best agreement with standard cardiac CT (kappa = 0.836, p < 0.001). At fixed dFOV, CAC scores with FC08 were significantly higher than with FC02 (p < 0.001); at fixed FC, CAC scores were significantly higher with 22-cm dFOV than with 40-cm dFOV (p ≤ 0.006). Both FC and dFOV thus have a significant impact on CAC scoring, and consistent parameters should be employed to obtain reliable CAC quantification with LDCT. In a Toshiba 320-row CT scanner, CAC scores obtained by FC08(22) agree well with standard cardiac CT.
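
    For reference, the standard Agatston algorithm underlying these scores is sketched below (vendor implementations differ in details such as the minimum lesion area). Note that the pixel area, and hence the score, depends directly on the dFOV, since pixel size is roughly dFOV/512, which is one reason the dFOV matters:

        import numpy as np
        from scipy import ndimage

        def agatston_score(slices_hu, pixel_area_mm2, min_area_mm2=1.0):
            # slices_hu: 2D HU arrays (~3 mm axial) restricted to the coronaries
            def weight(peak_hu):
                # Density weighting: 130-199 -> 1, 200-299 -> 2,
                # 300-399 -> 3, >= 400 -> 4
                return 1 + min(3, max(0, int(peak_hu // 100) - 1))
            score = 0.0
            for sl in slices_hu:
                labels, n = ndimage.label(sl >= 130)  # calcium threshold 130 HU
                for i in range(1, n + 1):
                    region = labels == i
                    area = region.sum() * pixel_area_mm2
                    if area >= min_area_mm2:
                        score += area * weight(sl[region].max())
            return score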

  12. Application of an asymmetric flow field flow fractionation multi-detector approach for metallic engineered nanoparticle characterization--prospects and limitations demonstrated on Au nanoparticles.

    PubMed

    Hagendorfer, Harald; Kaegi, Ralf; Traber, Jacqueline; Mertens, Stijn F L; Scherrers, Roger; Ludwig, Christian; Ulrich, Andrea

    2011-11-14

    In this work we discuss the method development, applicability, and limitations of an asymmetric flow field flow fractionation (A4F) system combined with a multi-detector setup consisting of UV/vis, light scattering, and inductively coupled plasma mass spectrometry (ICPMS). The overall aim was to obtain a size-dependent, element-specific, and quantitative method appropriate for the characterization of metallic engineered nanoparticle (ENP) dispersions. Systematic investigations of crucial method parameters were therefore performed using well-characterized Au nanoparticles (Au-NPs) as a defined model system. For good separation performance, the A4F flow, membrane, and carrier conditions were optimized. To obtain reliable size information, the use of laser-light-scattering-based detectors was evaluated; an online dynamic light scattering (DLS) detector showed good results for the investigated Au-NPs up to a hydrodynamic diameter of 80 nm. To accommodate the large sensitivity differences of the various detectors, as well as to guarantee long-term stability and minimal contamination of the mass spectrometer, a split-flow concept for coupling ICPMS was evaluated. To test for reliable quantification, the ICPMS signal response of ionic Au standards was compared with that of Au-NPs. With proper stabilization by surfactants, no difference was observed for concentrations of 1-50 μg Au L⁻¹ in the size range of 5-80 nm for citrate-stabilized dispersions. However, studies using different A4F channel membranes showed unspecific particle-membrane interactions resulting in retention-time shifts and unspecific loss of nanoparticles, depending on the Au-NP system as well as the membrane batch and type. Thus, reliable quantification and discrimination of ionic and particulate species was performed using ICPMS in combination with ultracentrifugation instead of direct quantification with the A4F multi-detector setup. Figures of merit were obtained by comparing the

  13. Determining contrast medium dose and rate on basis of lean body weight: does this strategy improve patient-to-patient uniformity of hepatic enhancement during multi-detector row CT?

    PubMed

    Ho, Lisa M; Nelson, Rendon C; Delong, David M

    2007-05-01

    To prospectively evaluate the use of lean body weight (LBW) as the main determinant of the volume and rate of contrast material administration during multi-detector row computed tomography of the liver. This HIPAA-compliant study had institutional review board approval, and all patients gave written informed consent. Four protocols were compared. The standard protocol involved 125 mL of iopamidol injected at 4 mL/sec. The total body weight (TBW) protocol involved 0.7 g iodine per kilogram of TBW. The calculated LBW and measured LBW protocols involved 0.86 g and 0.92 g of iodine per kilogram of calculated or measured LBW for men and women, respectively. The injection rate for the three experimental protocols was determined proportionally on the basis of the calculated volume of contrast material. Postcontrast attenuation measurements during the portal venous phase were obtained in the liver, portal vein, and aorta for each group and were summed for each patient. Patient-to-patient enhancement variability within each group was measured with the Levene test, and a two-tailed t test was used to compare the three experimental protocols with the standard protocol. Data analysis was performed in 101 patients (25 or 26 patients per group), including 56 men and 45 women (mean age, 53 years). Average summed attenuation values for the standard, TBW, calculated LBW, and measured LBW protocols were 419 HU ± 50 (standard deviation), 443 HU ± 51, 433 HU ± 50, and 426 HU ± 33, respectively (P = not significant for all). Levene test results for the summed attenuation data were 40 ± 29, 38 ± 33 (P = .83), 35 ± 35 (P = .56), and 26 ± 19 (P = .05), respectively. By excluding highly variable but poorly perfused adipose tissue from the calculation of contrast medium dose, the measured LBW protocol may lessen patient-to-patient enhancement variability while maintaining satisfactory hepatic and vascular enhancement.

  14. Analysis of the topological properties of the proximal femur on a regional scale: evaluation of multi-detector CT-scans for the assessment of biomechanical strength using local Minkowski functionals in 3D

    NASA Astrophysics Data System (ADS)

    Boehm, H. F.; Link, T. M.; Monetti, R. A.; Kuhn, V.; Eckstein, F.; Raeth, C. W.; Reiser, M.

    2006-03-01

    In our recent studies on the analysis of bone texture in the context of osteoporosis, we demonstrated the great potential of topological evaluation of bone architecture based on the Minkowski functionals (MF) in 2D and 3D for predicting the mechanical strength of cubic bone specimens depicted by high-resolution MRI. In contrast to that earlier work, we now assess the mechanical characteristics of whole hip bone specimens imaged by multi-detector computed tomography. The specific properties of this imaging modality and of the bone tissue in the proximal femur require the introduction of a new analysis method. The internal architecture of the hip is functionally highly specialized to withstand the complex pattern of external and internal forces associated with human gait. Since the direction, connectivity, and distribution of the trabeculae change considerably within narrow spatial limits, it seems most reasonable to evaluate the femoral bone structure on a local scale. The Minkowski functionals are a set of morphological descriptors for the topological characterization of binarized, multi-dimensional, convex objects with respect to shape, structure, and the connectivity of their components. The MF are usually used as global descriptors and may react very sensitively to minor structural variations, which presents a major limitation in a number of applications. The objective of this work is to assess the mechanical competence of whole hip bone specimens using parameters based on the MF. We introduce an algorithm that considers the local topological aspects of the bone architecture of the proximal femur, allowing identification of regions within the bone that contribute more to the overall mechanical strength than others.
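
    As a sketch of the local evaluation idea, assuming a binarized bone volume: scikit-image provides building blocks for three of the four 3D Minkowski functionals (volume, surface area, Euler characteristic); the integral mean curvature needs a dedicated estimator not shown here, and the window and step sizes are illustrative:

        import numpy as np
        from skimage import measure

        def minkowski_3d(binary, spacing=(1.0, 1.0, 1.0)):
            # Volume, surface area (marching cubes) and Euler characteristic
            volume = binary.sum() * np.prod(spacing)
            verts, faces, _, _ = measure.marching_cubes(
                binary.astype(np.uint8), level=0.5, spacing=spacing)
            surface = measure.mesh_surface_area(verts, faces)
            euler = measure.euler_number(binary, connectivity=3)
            return volume, surface, euler

        def local_minkowski(binary, window=32, step=16, spacing=(1.0, 1.0, 1.0)):
            # Evaluate the functionals in overlapping sub-volumes (local scale)
            results = []
            nz, ny, nx = binary.shape
            for z in range(0, nz - window + 1, step):
                for y in range(0, ny - window + 1, step):
                    for x in range(0, nx - window + 1, step):
                        sub = binary[z:z+window, y:y+window, x:x+window]
                        # marching cubes needs both phases present
                        if sub.any() and not sub.all():
                            results.append(((z, y, x), minkowski_3d(sub, spacing)))
            return results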

  15. Comparison of Diagnostic Accuracy of Radiation Dose-Equivalent Radiography, Multidetector Computed Tomography and Cone Beam Computed Tomography for Fractures of Adult Cadaveric Wrists

    PubMed Central

    Neubauer, Jakob; Benndorf, Matthias; Reidelbach, Carolin; Krauß, Tobias; Lampert, Florian; Zajonc, Horst; Kotter, Elmar; Langer, Mathias; Fiebich, Martin; Goerke, Sebastian M.

    2016-01-01

    Purpose To compare the diagnostic accuracy of radiography, radiography-equivalent-dose multidetector computed tomography (RED-MDCT), and radiography-equivalent-dose cone beam computed tomography (RED-CBCT) for wrist fractures. Methods As study subjects we obtained 10 cadaveric human hands from body donors. The distal radius, distal ulna, and carpal bones (n = 100) were artificially fractured in random order in a controlled experimental setting. We performed radiation-dose-equivalent radiography (settings as in standard clinical care), RED-MDCT in a 320-row MDCT with single-shot mode, and RED-CBCT in a device dedicated to musculoskeletal imaging. Three raters independently evaluated the resulting images for fractures and the level of confidence for each finding. The gold standard was established by consensus reading of a high-dose MDCT. Results Pooled sensitivity was higher in RED-MDCT (0.89) and RED-CBCT (0.81) than in radiography (0.54) (P < .004). No significant differences were detected among the modalities' specificities (P = .98). Raters' confidence was higher in RED-MDCT and RED-CBCT than in radiography (P < .001). Conclusion The diagnostic accuracy of RED-MDCT and RED-CBCT for wrist fractures proved to be similar to, and in part even higher than, that of radiography. Readers are more confident in their reporting with the cross-sectional modalities. Dose-equivalent cross-sectional computed tomography of the wrist could replace plain radiography for fracture diagnosis in the long run. PMID:27788215

  16. Design and characterization of a compact multi-detector array for studies of induced gamma emission: Spontaneous decay of 178m2Hf as a test case

    NASA Astrophysics Data System (ADS)

    Ugorowski, P.; Propri, R.; Karamian, S. A.; Gohlke, D.; Lazich, J.; Caldwell, N.; Chakrawarthy, R. S.; Helba, M.; Roberts, H.; Carroll, J. J.

    2006-09-01

    Reports that incident photons near 10 keV can induce the emission of gamma rays with concomitant energy release from the 31-year isomer of 178Hf challenge established models of nuclear and atomic physics. In order to provide a direct and independent assessment of these claims, a multi-detector system was designed as a specialized research tool. The YSU miniball is unique in its combination of performance characteristics, compact size and portability, enabling it to be easily transported to and placed within the confines of beamline hutches at synchrotron radiation sources. Monochromatic synchrotron radiation was used in the most recent studies from which evidence of prompt triggering was claimed, suggesting similar sites for independent tests of these results. The miniball array consists of six high-efficiency BGO scintillators coupled with a single 65% Ge detector and provides time-resolved gamma-ray calorimetry rather than purely spectroscopic data. The need to record high detected folds from the array (up to seven-fold gamma coincidences) makes this system different in practice from standard spectroscopic arrays for which data is typically restricted to triples or lower folds. Here the system requirements and design are discussed, as well as system performance as characterized using the well-known natural decay cascades of 178m2Hf. This serves as the foundation for subsequent high-sensitivity searches for low-energy triggering of gamma emission from this isomer.

  17. Development of adaptive noise reduction filter algorithm for pediatric body images in a multi-detector CT

    NASA Astrophysics Data System (ADS)

    Nishimaru, Eiji; Ichikawa, Katsuhiro; Okita, Izumi; Ninomiya, Yuuji; Tomoshige, Yukihiro; Kurokawa, Takehiro; Ono, Yutaka; Nakamura, Yuko; Suzuki, Masayuki

    2008-03-01

    Recently, several kinds of post-processing image filters that reduce the noise of computed tomography (CT) images have been proposed. However, these image filters are mostly designed for adults. Because they are not very effective in small (< 20 cm) display fields of view (FOV), they cannot be used for pediatric body images (e.g., premature babies and infant children). We have developed a new noise reduction filter algorithm for pediatric body CT images. This algorithm is based on 3D post-processing in which the output pixel values are calculated by nonlinear interpolation in the z-direction on the original volumetric data sets. The algorithm requires no in-plane (axial-plane) processing, so the in-plane spatial resolution does not change. In phantom studies, our algorithm reduced the SD by up to 40% without affecting the spatial resolution in the x-y plane or along the z-axis, and improved the CNR by up to 30%. This newly developed filter algorithm should be useful for diagnosis and radiation dose reduction in pediatric body CT.
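
    One plausible reading of such z-direction-only processing is sketched below with a 1-D median filter along the slice axis; the published algorithm uses nonlinear interpolation, so this is analogous rather than identical:

        from scipy import ndimage

        def z_only_denoise(volume_hu, size=3):
            # Median filtering restricted to the z-axis: the (size, 1, 1) kernel
            # never mixes in-plane neighbours, so axial resolution is preserved
            return ndimage.median_filter(volume_hu, size=(size, 1, 1))

        # volume_hu is assumed ordered (z, y, x); overlapping thin
        # reconstructions provide the correlated z-samples such a filter exploits.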

  18. The development, validation and application of a multi-detector CT (MDCT) scanner model for assessing organ doses to the pregnant patient and the fetus using Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Gu, J.; Bednarz, B.; Caracappa, P. F.; Xu, X. G.

    2009-05-01

    The latest multiple-detector technologies have further increased the popularity of x-ray CT as a diagnostic imaging modality. There is a continuing need to assess the potential radiation risk associated with such rapidly evolving multi-detector CT (MDCT) modalities and scanning protocols. This need can be met by CT source models that are integrated with patient computational phantoms for organ dose calculations. To this end, this work developed and validated an MDCT scanner model using the Monte Carlo method and integrated pregnant patient phantoms into the scanner model to assess the dose to the fetus as well as doses to the organs and tissues of the pregnant patient. The Monte Carlo code MCNPX was used to simulate the x-ray source, including the energy spectrum, filter, and scan trajectory. Detailed CT scanner components were specified using an iterative trial-and-error procedure for a GE LightSpeed CT scanner. The scanner model was validated by comparing simulated results against measured CTDI values and dose profiles reported in the literature. The source movement along the helical trajectory was simulated using pitches of 0.9375 and 1.375, respectively. The validated scanner model was then integrated with phantoms of a pregnant patient at three different gestational stages to calculate organ doses. The dose to the fetus of the 3-month pregnant patient phantom was 0.13 mGy/100 mAs and 0.57 mGy/100 mAs for the chest and kidney scans, respectively. For the chest scan of the 6-month and 9-month patient phantoms, the fetal doses were 0.21 mGy/100 mAs and 0.26 mGy/100 mAs, respectively. The paper also discusses how these fetal dose values can be used to evaluate imaging procedures and to assess risk using the recommendations of the report of AAPM Task Group 36. This work demonstrates the ability of modeling and validating an MDCT scanner by the Monte Carlo method, as well as

  20. Quantitative assessment of lipiodol deposition after Chemoembolization: Comparison between cone-beam CT and multi-detector CT

    PubMed Central

    Chen, Rongxin; Geschwind, Jean-François; Wang, Zhijun; Tacher, Vania; Lin, MingDe

    2013-01-01

    Purpose To evaluate the ability of cone-beam computed tomography (CBCT) acquired directly after TACE to assess lipiodol deposition in hepatocellular carcinoma (HCC) and compare it to unenhanced MDCT. Materials and methods Fifteen HCC patients were treated with conventional TACE, and CBCT was used to assess the lipiodol deposition directly after TACE. Unenhanced MDCT was performed 24 hours after TACE. Four patients were excluded because the margin of tumor or area of lipiodol deposition was not clear. The image enhancement density of the entire tumor (T) and liver parenchyma (L) was measured by ImageJ software, and tumor-to-liver contrast (TLC) was calculated. In addition, volumetric measurement of tumor and lipiodol was performed by semiautomatic 3D volume segmentation and compared using linear regression to evaluate consistency between two imaging modalities. Results The mean value of TLC on CBCT was not significantly different from that on MDCT (337.7±233.5 HU versus 283.0±152.1 HU, P=0.103). The average volume of the whole tumor and of only the lipiodol deposited regions, and the calculated average percentage of lipiodol retention on CBCT were not significantly different when compared to MDCT (tumor volume: 9.6±11.8 cm3 versus 10.8±14.2 cm3, P=0.142; lipiodol volume: 6.3±7.7 cm3 versus 7.0±8.1 cm3, P=0.214; percentage of lipiodol retention: 68.9%±24.0% versus 72.2%±23.1%, P=0.578). Additionally, there was a high correlation in the volume of tumor and lipiodol between CBCT and MDCT (R2=0.919 and 0.903, respectively). Conclusions The quantitative image enhancement and volume analyses demonstrate that CBCT is similar to MDCT in assessing the lipiodol deposition in HCC after TACE. PMID:24094672
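
    The two derived quantities above follow directly from the measured values; a trivial sketch (taking TLC as the tumor-minus-liver enhancement difference, our reading of the text):

        def tlc(tumor_mean_hu, liver_mean_hu):
            # Tumor-to-liver contrast from mean enhancement densities
            return tumor_mean_hu - liver_mean_hu

        def lipiodol_retention_pct(lipiodol_vol_cm3, tumor_vol_cm3):
            # Percentage of the segmented tumor volume with lipiodol deposition
            return 100.0 * lipiodol_vol_cm3 / tumor_vol_cm3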

  1. Impaired left ventricular function has a detrimental effect on image quality in multi-detector row CT coronary angiography.

    PubMed

    Manghat, N E; Morgan-Hughes, G J; Shaw, S R; Marshall, A J; Roobottom, C A

    2008-04-01

    To determine whether there is a relationship between left ventricular (LV) haemodynamic parameters, circulation times, and arterial contrast opacification that might affect the image quality of computed tomography (CT) coronary angiography. Thirty-six patients were included in the study: 18 with cardiomyopathy (CM) and LV dilatation of suspected ischaemic aetiology [age 57.9 ± 13.7 years, range 30-77 years; 14 male, four female; body mass index (BMI) 27.7 ± 4.5, range 25.5-31.8] and 18 controls (age 62.3 ± 9.4 years, range 47-89 years; 10 male, eight female; BMI 27.8 ± 6.6, range 19.2-33.6). Coronary artery image quality was assessed using a three-point visual scale; contrast medium circulation times, aortic root contrast attenuation, and LV functional parameters were studied. Visibly reduced contrast opacification impaired image quality more often in the CM group than in the control group (27.4% versus 5.1%). A total of 55.6% of CM patients had a contrast transit time of 30-75 s; the number of "unassessable" segments increased with increasing transit time, conforming to a fitted quadratic model (R² = 0.74). The relationship between LV ejection fraction and contrast attenuation may also conform to a quadratic model (R² = 0.71). LV haemodynamics influence coronary artery opacification at cardiac CT, and users imaging this subgroup must be aware of this potential pitfall. The results indicate the need for further studies examining CT protocols in this clinical subgroup.

  2. A jumping left atrial thrombus connected to a pulmonary vein thrombus using transthoracic echocardiography and 64-slice multi-detector computed tomography.

    PubMed

    Takeuchi, Hidekazu

    2015-03-01

    Few studies have reported the differences between transthoracic echocardiography (TTE) and 64-slice multidetector CT (64-MDCT) in identifying left atrial (LA) thrombi. I report the case of a 70-year-old man with coronary artery disease and angina who was diagnosed with a thrombus in the left lower pulmonary vein extending into the LA using a non-invasive 64-MDCT scan and TTE. TTE was unable to clearly identify the thrombus in the pulmonary vein, whereas the 64-MDCT scan identified it but was unable to detect a moving thrombus attached to the mitral valve. The LA thrombi appeared smaller on 64-MDCT images than on TTE, and the video created using TTE demonstrated a moving thrombus connected to the mitral valve that was underdiagnosed by the 64-MDCT scan. This case illustrates the complementary roles of TTE and 64-MDCT in the noninvasive diagnosis of left atrial-pulmonary vein thrombi.

  3. Non-invasive Detection of Aortic and Coronary Atherosclerosis in Homozygous Familial Hypercholesterolemia by 64 Slice Multi-detector Row Computed Tomography Angiography

    USDA-ARS?s Scientific Manuscript database

    Homozygous familial hypercholesterolemia (HoFH) is a rare disorder characterized by the early onset of atherosclerosis, often at the ostia of coronary arteries. In this study we document for the first time that aortic and coronary atherosclerosis can be detected using 64 slice multiple detector-row ...

  5. The vascularized groin lymph node flap (VGLN): Anatomical study and flap planning using multi-detector CT scanner. The golden triangle for flap harvesting.

    PubMed

    Zeltzer, Assaf A; Anzarut, Alexander; Braeckmans, Delphine; Seidenstuecker, Katrin; Hendrickx, Benoit; Van Hedent, Eddy; Hamdi, Moustapha

    2017-09-01

    A growing number of surgeons perform lymph node transfers for the treatment of lymphedema. When harvesting a vascularized groin lymph node flap (VGLNF), one of the major concerns is the potential risk of iatrogenic lymphedema at the donor site. This article aims to improve understanding of the lymph node distribution of the groin in order to minimize this risk. Fifty consecutive patients undergoing abdominal mapping with a multi-detector CT scanner were included, and 100 groins were analyzed. The groin was divided into three zones (of which zone II is the safe zone); lymph nodes were counted and mapped with their distances to anatomic landmarks, and lymph node units were plotted and counted. The average age was 48 years. A mean of 6.5 nodes per groin was found. In zone II, our zone of interest, a mean of 3.1 nodes was counted, with a mean size of 7.8 mm. In three patients no nodes were found in zone II, and in five patients nodes were seen in zone II but were not sufficient in size or number to be considered a lymph node unit. On average, the lymph node unit in zone II was 48.3 mm from the pubic tubercle when projected on a line from the pubic tubercle to the anterior superior iliac spine, 16.0 mm caudal to this line, 20.4 mm above the groin crease, and a mean of 41.7 mm lateral to the SCIV-SIEV confluence. This study provides increased understanding of the lymphatic anatomy in zone II of the groin flap and suggests a refined technique for designing the VGLNF. As with any flap, there is a degree of individual patient variability; however, having information on the most common anatomy and flap design is of great value.

  6. Morphometric evaluation of subaxial cervical spine using multi-detector computerized tomography (MD-CT) scan: the consideration for cervical pedicle screws fixation.

    PubMed

    Chanplakorn, Pongsthorn; Kraiwattanapong, Chaiwat; Aroonjarattham, Kitti; Leelapattana, Pittavat; Keorochana, Gun; Jaovisidha, Suphaneewan; Wajanavisit, Wiwat

    2014-04-11

    Cervical pedicle screw (CPS) insertion is a technically demanding procedure. A quantitative understanding of cervical pedicle morphology, especially of the narrowest part of the cervical pedicle (the isthmus), would minimize the risk of catastrophic damage to surrounding neurovascular structures and improve surgical outcome. The aim of this study was to investigate the morphology and quantify the cortical thickness of the cervical isthmus using multi-detector computerized tomography (MD-CT). Cervical CT scans were performed in 74 patients (37 males and 37 females) with 1-mm slice thickness and retro-reconstructed into sagittal and coronal planes to measure the following parameters: outer pedicle width (OPW), inner pedicle width (IPW), outer pedicle height (OPH), inner pedicle height (IPH), pedicle cortical thickness, pedicle sagittal angle (PSA), and pedicle transverse angle (PTA). A total of 740 pedicles were measured in the present study. The mean OPW and IPW increased significantly from C3 to C7, while the mean OPH and IPH showed no significant differences between the measured levels. The medial-lateral cortical thickness was significantly smaller than the superior-inferior thickness. The PTA in the upper cervical spine was significantly wider than in the lower cervical spine, and the PSA changed from an upward inclination in the upper cervical spine to a downward inclination in the lower cervical spine. This study demonstrates that the cervical vertebra has a relatively small and narrow inner pedicle canal with a thick outer pedicle cortex, with variable pedicle width and an inconsistent transverse angle. To enhance the safety of CPS insertion, the entry point and trajectory should be determined individually using a preoperative MD-CT scan, and the inner pedicle width should be the key parameter for determining screw dimensions.

  7. Voxel-Based Sensitivity of Flat-Panel CT for the Detection of Intracranial Hemorrhage: Comparison to Multi-Detector CT

    PubMed Central

    Frölich, Andreas M.; Buhk, Jan-Hendrik; Fiehler, Jens; Kemmling, Andre

    2016-01-01

    Objectives Flat-panel CT (FPCT) allows cross-sectional parenchymal, vascular and perfusion imaging within the angiography suite, which could greatly facilitate acute stroke management. We hypothesized that FPCT offers equal diagnostic accuracy compared to multi-detector CT (MDCT) as a primary tool to exclude intracranial hemorrhage. Methods 22 patients with intracranial hematomas who had both MDCT and FPCT performed within 24 hours were retrospectively identified. Patients with visible change in hematoma size or configuration were excluded. Two raters independently segmented hemorrhagic lesions. Data sets and corresponding binary lesion maps were co-registered to compare hematoma volume. Diagnostic accuracy of FPCT to detect hemorrhage was calculated from voxel-wise analysis of lesion overlap compared to reference MDCT. Results Mean hematoma size was similar between MDCT (16.2±8.9 ml) and FPCT (16.1±8.6 ml), with near perfect correlation of hematoma sizes between modalities (ρ = 0.95, p<0.001). Sensitivity and specificity of FPCT to detect hemorrhagic voxels was 61.6% and 99.8% for intraventricular hematomas and 67.7% and 99.5% for all other intracranial hematomas. Conclusions In this small sample containing predominantly cases with subarachnoid hemorrhage, FPCT based assessment of hemorrhagic volume in brain yields acceptable accuracy compared to reference MDCT, albeit with a limited sensitivity on a voxel level. Further assessment and improvement of FPCT is necessary before it can be applied as a primary imaging modality to exclude intracranial hemorrhage in acute stroke patients. PMID:27806106
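
    The voxel-wise analysis described above amounts to a confusion matrix over co-registered binary lesion maps; a minimal sketch:

        import numpy as np

        def voxelwise_sens_spec(test_mask, ref_mask):
            # test_mask: FPCT lesion segmentation; ref_mask: reference MDCT map
            test = np.asarray(test_mask, bool)
            ref = np.asarray(ref_mask, bool)
            tp = np.count_nonzero(test & ref)
            fn = np.count_nonzero(~test & ref)
            tn = np.count_nonzero(~test & ~ref)
            fp = np.count_nonzero(test & ~ref)
            return tp / (tp + fn), tn / (tn + fp)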

  8. [Perfusion computed tomography for diffuse liver diseases].

    PubMed

    Schmidt, S A; Juchems, M S

    2012-08-01

    Perfusion computed tomography (CT) has its main clinical application in routine neuroradiological diagnosis, whereas multiphase multi-detector spiral CT is primarily used in liver diagnostics. Perfusion CT can, however, also be used for the diagnosis and differentiation of diffuse hepatic diseases: differentiation between cirrhosis and cirrhosis-like parenchymal changes is possible, and it can help detect early stages of malignant tumors. There are also drawbacks, particularly radiation exposure. This paper summarizes the technical basics and possible applications of perfusion CT in diffuse liver disease and weighs the advantages and disadvantages of the examination.

  9. A 64-slice multi-detector CT scan could evaluate the change of the left atrial appendage thrombi of the atrial fibrillation patient, which was reduced by warfarin therapy.

    PubMed

    Takeuchi, Hidekazu

    2011-08-19

    A treatable cause of stroke is left atrial appendage (LAA) thrombus in patients with atrial fibrillation (AF). Some AF patients have LAA thrombi, and treating such patients with warfarin is very important. Transoesophageal echocardiography (TOE) is the usual clinical tool for detecting LAA thrombi. Recently, 64-slice multi-detector CT (64-MDCT) has made it possible to display LAA thrombi more easily than TOE. I report a case in which a 64-MDCT scan was used successfully to display the change in LAA thrombi produced by warfarin therapy. The size of the LAA thrombus was reduced from 25.2 mm × 19.3 mm (figure 1) to 22.1 mm × 14.8 mm (figure 2) after 3 months of warfarin therapy. 64-MDCT was useful both for assessing the LAA thrombi themselves and for following their change over time to evaluate the effectiveness of warfarin therapy.

  10. The Impact of Different Levels of Adaptive Iterative Dose Reduction 3D on Image Quality of 320-Row Coronary CT Angiography: A Clinical Trial

    PubMed Central

    Feger, Sarah; Rief, Matthias; Zimmermann, Elke; Martus, Peter; Schuijf, Joanne Désirée; Blobel, Jörg; Richter, Felicitas; Dewey, Marc

    2015-01-01

    Purpose: The aim of this study was the systematic image quality evaluation of coronary CT angiography (CTA), reconstructed with the 3 different levels of adaptive iterative dose reduction (AIDR 3D) and compared to filtered back projection (FBP) with quantum denoising software (QDS). Methods: Standard-dose CTA raw data of 30 patients with mean radiation dose of 3.2 ± 2.6 mSv were reconstructed using AIDR 3D mild, standard, strong and compared to FBP/QDS. Objective image quality comparison (signal, noise, signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), contour sharpness) was performed using 21 measurement points per patient, including measurements in each coronary artery from proximal to distal. Results: Objective image quality parameters improved with increasing levels of AIDR 3D. Noise was lowest in AIDR 3D strong (p≤0.001 at 20/21 measurement points; compared with FBP/QDS). Signal and contour sharpness analysis showed no significant difference between the reconstruction algorithms for most measurement points. Best coronary SNR and CNR were achieved with AIDR 3D strong. No loss of SNR or CNR in distal segments was seen with AIDR 3D as compared to FBP. Conclusions: On standard-dose coronary CTA images, AIDR 3D strong showed higher objective image quality than FBP/QDS without reducing contour sharpness. Trial Registration: Clinicaltrials.gov NCT00967876 PMID:25945924
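
    The ROI-based metrics used in this comparison follow standard definitions; a short sketch, assuming each measurement point supplies HU samples from a vessel ROI and an adjacent background ROI (function and variable names are assumptions):

      import numpy as np

      def snr_cnr(vessel_hu, background_hu):
          # Signal: mean HU in the vessel ROI; noise: SD of the background ROI.
          signal = np.mean(vessel_hu)
          noise = np.std(background_hu, ddof=1)
          snr = signal / noise
          cnr = (signal - np.mean(background_hu)) / noise
          return snr, cnr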

  11. Automated assessment of the aortic root dimensions with multidetector row computed tomography.

    PubMed

    Delgado, Victoria; Ng, Arnold C T; Schuijf, Joanne D; van der Kley, Frank; Shanks, Miriam; Tops, Laurens F; van de Veire, Nico R L; de Roos, Albert; Kroft, Lucia J M; Schalij, Martin J; Bax, Jeroen J

    2011-03-01

    Accurate aortic root measurements and evaluation of spatial relationships with the coronary ostia are crucial in preoperative transcatheter aortic valve implantation assessments. Standardization of measurements may increase intraobserver and interobserver reproducibility to promote procedural success rate and reduce the frequency of procedurally related complications. This study evaluated the accuracy and reproducibility of a novel automated multidetector row computed tomography (MDCT) imaging postprocessing software, 3mensio Valves (version 4.1.sp1, Medical Imaging BV, Bilthoven, The Netherlands), in the assessment of patients with severe aortic stenosis who were candidates for transcatheter aortic valve implantation. Ninety patients with aortic valve disease were evaluated with 64-row and 320-row MDCT. Aortic valve annular size, aortic root dimensions, and height of the coronary ostia relative to the aortic valve annular plane were measured with the 3mensio Valves software. The measurements were compared with those obtained manually by the Vitrea2 software (Vital Images, Minneapolis, MN). Assessment of aortic valve annulus and aortic root dimensions was feasible in all patients using the automated 3mensio Valves software. There was excellent agreement with minimal bias between automated and manual MDCT measurements, as demonstrated by Bland-Altman analysis and intraclass correlation coefficients ranging from 0.97 to 0.99. The automated 3mensio Valves software had better interobserver reproducibility and required less image postprocessing time than manual assessment. Novel automated MDCT postprocessing imaging software (3mensio Valves) permits reliable, reproducible, and automated assessment of the aortic root dimensions and spatial relations with the surrounding structures. This has important clinical implications for preoperative assessments of patients undergoing transcatheter aortic valve implantation. Copyright © 2011 The Society of Thoracic Surgeons.
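
    Bland-Altman analysis, as used above to compare automated with manual measurements, reports the mean difference (bias) and the 95% limits of agreement; a minimal sketch (array names are assumed):

      import numpy as np

      def bland_altman(auto_mm, manual_mm):
          # Bias and 95% limits of agreement between paired measurements.
          diff = np.asarray(auto_mm) - np.asarray(manual_mm)
          bias = diff.mean()
          sd = diff.std(ddof=1)
          return bias, (bias - 1.96 * sd, bias + 1.96 * sd)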

  12. [Novel method of noise power spectrum measurement for computed tomography images with adaptive iterative reconstruction method].

    PubMed

    Nishimaru, Eiji; Ichikawa, Katsuhiro; Hara, Takanori; Terakawa, Shoichi; Yokomachi, Kazushi; Fujioka, Chikako; Kiguchi, Masao; Ishifuro, Minoru

    2012-01-01

    Adaptive iterative reconstruction techniques (IRs) can decrease image noise in computed tomography (CT) and are expected to contribute to reduction of the radiation dose. To evaluate the performance of IRs, the conventional two-dimensional (2D) noise power spectrum (NPS) is widely used. However, when an IR produces an NPS drop at all spatial frequencies (similar to the NPS change produced by a dose increase), the conventional method cannot evaluate the noise properties correctly, because it does not account for the volumetric nature of CT image data. The purpose of our study was to develop a new method for NPS measurement that can be adapted to IRs. Our method utilizes thick multi-planar reconstruction (MPR) images. Thick images are generally made by averaging CT volume data in the direction perpendicular to the MPR plane (e.g. the z-direction for an axial MPR plane). By using this averaging as a cut through the three-dimensional (3D) NPS, we can obtain an adequate 2D-extracted NPS (eNPS) from the 3D NPS. We applied this method to IR images generated with adaptive iterative dose reduction 3D (AIDR 3D, Toshiba) to investigate the validity of our method. A water phantom with a 24-cm diameter was scanned at 120 kV and 200 mAs with a 320-row CT scanner (Aquilion ONE, Toshiba). The results showed that the adequate thickness of the MPR images for eNPS was 25.0 mm or more. Our new NPS measurement method utilizing thick MPR images was accurate and effective for evaluating the noise reduction effects of IRs.
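
    The core of the method is averaging the noise volume along z to form a thick MPR image and then measuring a conventional 2D NPS on it. A simplified sketch under the usual periodogram-averaging convention (the ROI handling and detrending here are assumptions, not the authors' exact procedure):

      import numpy as np

      def extracted_nps(noise_volume, pixel_mm, n_rois=64, roi=128, seed=0):
          # Thick axial MPR: average the (z, y, x) noise volume along z.
          thick = noise_volume.mean(axis=0)
          rng = np.random.default_rng(seed)
          acc = np.zeros((roi, roi))
          for _ in range(n_rois):
              y = rng.integers(0, thick.shape[0] - roi)
              x = rng.integers(0, thick.shape[1] - roi)
              patch = thick[y:y + roi, x:x + roi]
              patch = patch - patch.mean()          # remove the DC component
              acc += np.abs(np.fft.fftshift(np.fft.fft2(patch))) ** 2
          # NPS(u, v) = (dx * dy / (Nx * Ny)) * <|DFT of zero-mean ROI|^2>
          return acc / n_rois * pixel_mm ** 2 / (roi * roi)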

  13. Monte Carlo simulations in multi-detector CT (MDCT) for two PET/CT scanner models using MASH and FASH adult phantoms

    NASA Astrophysics Data System (ADS)

    Belinato, W.; Santos, W. S.; Paschoal, C. M. M.; Souza, D. N.

    2015-06-01

    The combination of positron emission tomography (PET) and computed tomography (CT) has been extensively used in oncology for the diagnosis and staging of tumors, radiotherapy planning and follow-up of patients with cancer, as well as in cardiology and neurology. This study uses the Monte Carlo method to determine the internal organ doses deposited in computational phantoms by the multidetector CT (MDCT) beams of two PET/CT devices operating with different parameters. The differences between the MDCT beams were largely related to the total filtration, which changes the energy spectrum of the beam inside the gantry. This parameter was determined experimentally with the Accu-Gold Radcal measurement system. The experimental values of the total filtration were included in the simulations of two MCNPX code scenarios. The absorbed organ doses obtained in the MASH and FASH phantoms indicate that the bowtie filter geometry and the energy of the X-ray beam have a significant influence on the results, although this influence can be compensated by adjusting other variables, such as the tube current-time product (mAs) and pitch, during PET/CT procedures.
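
    The compensation mentioned in the closing sentence is, to first order, the familiar linear scaling of CT dose metrics with tube current-time product and inverse scaling with pitch; a hedged sketch (reference values assumed to come from a measured protocol):

      def scaled_ctdivol(ctdivol_ref_mgy, mas, mas_ref, pitch, pitch_ref):
          # First-order model: dose scales with mAs and inversely with pitch.
          return ctdivol_ref_mgy * (mas / mas_ref) * (pitch_ref / pitch)

      # e.g. halving mAs while dropping pitch from 1.0 to 0.5 leaves the
      # estimated dose unchanged under this model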

  14. Anatomy of the larynx and pharynx: effects of age, gender and height revealed by multidetector computed tomography.

    PubMed

    Inamoto, Y; Saitoh, E; Okada, S; Kagaya, H; Shibata, S; Baba, M; Onogi, K; Hashimoto, S; Katada, K; Wattanapan, P; Palmer, J B

    2015-09-01

    Although oropharyngeal and laryngeal structures are essential for swallowing, their three-dimensional (3D) anatomy is not well understood, due in part to limitations of available measuring techniques. This study uses 3D images acquired by 320-row area detector computed tomography ('320-ADCT') to measure the pharynx and larynx and to investigate the effects of age, gender and height. Fifty-four healthy volunteers (30 male, 24 female, 23-77 years) underwent one single-phase volume scan (0.35 s) with 320-ADCT during resting tidal breathing. Six measurements of the pharynx and two of the larynx were performed. Bivariate statistical methods were used to analyse the effects of gender, age and height on these measurements. Length and volume were significantly larger for men than for women for every measurement (P < 0.05) and increased with height (P < 0.05). Multiple regression analysis was performed to understand the interactions of gender, height and age, each of which had significant effects on certain values. The volume of the larynx and hypopharynx was significantly affected by height and age. The length of the pharynx was associated with gender and age. The length of the vocal folds and the distance from the valleculae to the vocal folds were significantly affected by gender (P < 0.05). These results suggest that age, gender and height have independent and interacting effects on the morphology of the pharynx and larynx. Three-dimensional imaging and morphometrics using 320-ADCT are powerful tools for efficiently and reliably observing and measuring the pharynx and larynx.

  15. Deconvolution Methods for Multi-Detectors

    DTIC Science & Technology

    1989-08-30

    [OCR-damaged record; only fragments are recoverable.] The report treats deconvolution for multi-detector systems: an integral operator defined by a linear combination of Dirac masses; a Sturm-Liouville expansion for a boundary value problem singular at t = 0 with vanishing derivative at t = r1; and chapters on analytic Bezout identities (Chapter 1) and exact deconvolution for multiple convolution operators (Chapter 2).

  16. Low fingertip temperature rebound measured by digital thermal monitoring strongly correlates with the presence and extent of coronary artery disease diagnosed by 64-slice multi-detector computed tomography.

    PubMed

    Ahmadi, Naser; Nabavi, Vahid; Nuguri, Vivek; Hajsadeghi, Fereshteh; Flores, Ferdinand; Akhtar, Mohammad; Kleis, Stanley; Hecht, Harvey; Naghavi, Morteza; Budoff, Matthew

    2009-10-01

    Previous studies showed strong correlations between low fingertip temperature rebound measured by digital thermal monitoring (DTM) during a 5-min arm-cuff induced reactive hyperemia and both the Framingham Risk Score (FRS) and coronary artery calcification (CAC) in asymptomatic populations. This study evaluates the correlation between DTM and coronary artery disease (CAD) measured by CT angiography (CTA) in symptomatic patients. It also investigates the correlation between CTA and a new index of neurovascular reactivity measured by DTM. 129 patients, age 63 ± 9 years, 68% male, underwent DTM, CAC and CTA. Adjusted DTM indices in the occluded arm were calculated: temperature rebound (aTR) and area under the temperature curve (aTMP-AUC). The DTM neurovascular reactivity (NVR) index was measured based on the increase in fingertip temperature in the non-occluded arm. Obstructive CAD was defined as ≥50% luminal stenosis, and normal as no stenosis and CAC = 0. Baseline fingertip temperature was not different across the groups. However, all DTM indices of vascular and neurovascular reactivity significantly decreased from normal to non-obstructive to obstructive CAD [(aTR: 1.77 ± 1.18 to 1.24 ± 1.14 to 0.94 ± 0.92, P = 0.009), (aTMP-AUC: 355.6 ± 242.4 to 277.4 ± 182.4 to 184.4 ± 171.2, P = 0.001), (NVR: 161.5 ± 147.4 to 77.6 ± 88.2 to 48.8 ± 63.8, P = 0.015)]. After adjusting for risk factors, the odds ratios for obstructive CAD compared to normal in the lowest versus the two upper tertiles of FRS, aTR, aTMP-AUC, and NVR were 2.41 (1.02-5.93), P = 0.05; 8.67 (2.6-9.4), P = 0.001; 11.62 (5.1-28.7), P = 0.001; and 3.58 (1.09-11.69), P = 0.01, respectively. DTM indices and FRS combined resulted in a ROC curve area of 0.88 for the prediction of obstructive CAD. In patients suspected of CAD, low fingertip temperature rebound measured by DTM significantly predicted CTA-diagnosed obstructive disease.
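
    The abstract does not spell out how the adjusted indices are computed; as an illustration only, unadjusted versions of the two occluded-arm indices could be derived from a fingertip temperature time series as follows (all names and definitions here are assumptions, not the study's exact indices):

      import numpy as np

      def dtm_indices(t_s, temp_c, t_deflation_s):
          # TR: peak rebound above the temperature at cuff deflation.
          # TMP-AUC: area under the post-deflation curve above that baseline.
          post = t_s >= t_deflation_s
          baseline = temp_c[~post][-1]
          tr = temp_c[post].max() - baseline
          auc = np.trapz(temp_c[post] - baseline, t_s[post])
          return tr, auc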

  17. Computed tomography radiation dose optimization: scanning protocols and clinical applications of automatic exposure control.

    PubMed

    Kalra, Mannudeep K; Naz, Nausheen; Rizzo, Stefania M R; Blake, Michael A

    2005-01-01

    As multi-detector row computed tomography (CT) technology evolves, manifold applications of CT scanning have been adopted in clinical practice, and optimization of scanning protocols to comply with an "as low as reasonably achievable" radiation dose has become more complex. Automatic exposure control techniques, which have recently been introduced on most state-of-the-art CT equipment, aid in radiation dose optimization at a selected image quality. The present article reviews the fundamentals of automatic exposure control techniques in CT, along with the scanning protocols and associated radiation dose reduction.

  18. Endocardial–epicardial distribution of myocardial perfusion reserve assessed by multidetector computed tomography in symptomatic patients without significant coronary artery disease: insights from the CORE320 multicentre study

    PubMed Central

    Kühl, Jørgen Tobias; George, Richard T.; Mehra, Vishal C.; Linde, Jesper J.; Chen, Marcus; Arai, Andrew E.; Di Carli, Marcelo; Kitagawa, Kakuya; Dewey, Marc; Lima, Joao A.C.; Kofoed, Klaus Fuglsang

    2016-01-01

    Aim: Previous animal studies have demonstrated differences in perfusion and perfusion reserve between the subendocardium and subepicardium. 320-row computed tomography (CT) with sub-millimetre spatial resolution allows for the assessment of transmural differences in myocardial perfusion reserve (MPR) in humans. We aimed to test the hypothesis that MPR in all myocardial layers is determined by age, gender, and cardiovascular risk profile in patients with ischaemic symptoms or equivalent but without obstructive coronary artery disease (CAD). Methods and results: A total of 149 patients enrolled in the CORE320 study with symptoms or signs of myocardial ischaemia and absence of significant CAD by invasive coronary angiography were scanned with static rest and stress CT perfusion. Myocardial attenuation densities were assessed at rest and during adenosine stress, segmented into 3 myocardial layers and 13 segments. MPR was higher in the subepicardium compared with the subendocardium (124% interquartile range [45, 235] vs. 68% [22, 102], P < 0.001). Moreover, MPR in the septum was lower than in the inferolateral and anterolateral segments of the myocardium (55% [19, 104] vs. 89% [37, 168] and 124% [54, 270], P < 0.001). By multivariate analysis, high body mass index was significantly associated with reduced MPR in all myocardial layers when adjusted for cardiovascular risk factors (P = 0.02). Conclusion: In symptomatic patients without significant coronary artery stenosis, distinct differences in endocardial–epicardial distribution of perfusion reserve may be demonstrated with static CT perfusion. Low MPR in all myocardial layers was observed specifically in obese patients. PMID:26341292
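
    The abstract reports MPR as a percentage; one plausible reading, consistent with the values quoted (an assumption rather than the CORE320 definition), is the relative increase of myocardial attenuation density from rest to adenosine stress:

      def mpr_percent(ad_rest, ad_stress):
          # MPR (%) as the relative rise of attenuation density under stress.
          return 100.0 * (ad_stress - ad_rest) / ad_rest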

  19. Assessment of left ventricular volumes, ejection fraction and regional wall motion with retrospective electrocardiogram triggered 320-detector computed tomography: a comparison with 2D-echocardiography.

    PubMed

    Nasis, Arthur; Moir, Stuart; Seneviratne, Sujith K; Cameron, James D; Mottram, Philip M

    2012-04-01

    Left ventricular (LV) volumes, ejection fraction (LVEF) and regional wall motion (LVRWM) have important treatment and prognostic implications in patients with coronary artery disease. We sought to determine the accuracy of 320-row multidetector computed tomography (MDCT) for the assessment of LV volumes, LVEF and LVRWM, using 2D-echocardiography as the reference standard. We evaluated 50 consecutive patients (mean age 60 ± 14 years, 66% male) who underwent 320-detector MDCT (dose-modulated retrospective electrocardiogram-triggering) and 2D-echocardiography within 14 days for investigation of known or suspected coronary artery disease. Two blinded readers measured LV volumes on MDCT and visually assessed LVRWM with a 3-point scale using a 17-segment model. A separate experienced echocardiologist, blinded to MDCT findings, assessed LVRWM on 2D-echocardiograms and determined LV volumes and LVEF using Simpson's biplane method. 2D-echocardiography served as the reference standard. Mean LVEF was 59 ± 9% (range 26-75%) on 2D-echocardiography and 60 ± 9% (range 27-76%) on MDCT. Using linear regression analysis, MDCT agreed very well with 2D-echocardiography for assessment of LVEDV (r² = 0.88; P < 0.001), LVESV (r² = 0.95; P < 0.001) and LVEF (r² = 0.90; P < 0.001). Mean differences (±standard deviation) of 14 ± 13 ml, 5 ± 7 ml and 1 ± 3% were observed between MDCT and 2D-echocardiography for LVEDV, LVESV and LVEF, respectively. On 2D-echocardiography, 81/850 (9.5%) segments had abnormal LVRWM. Agreement for assessment of LVRWM between 2D-echocardiography and MDCT was excellent (96%, κ = 0.76). Accurate assessment of LV volumes, LVEF and LVRWM is feasible with 320-detector MDCT, with MDCT demonstrating slightly larger LV volumes than 2D-echocardiography.
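
    Both modalities reduce to the same volumetric arithmetic: LVEF from end-diastolic and end-systolic volumes, with the echocardiographic volumes obtained by Simpson's biplane method of disks. A sketch (disk count and units are assumptions):

      import numpy as np

      def simpson_biplane_volume_ml(a_mm, b_mm, length_mm):
          # Method of disks: N disks of height L/N with orthogonal
          # diameters a_i, b_i taken from two apical views.
          a = np.asarray(a_mm)
          b = np.asarray(b_mm)
          disk_h = length_mm / len(a)
          return np.sum(np.pi * (a / 2) * (b / 2) * disk_h) / 1000.0

      def lvef_percent(edv_ml, esv_ml):
          # Ejection fraction from end-diastolic and end-systolic volumes.
          return 100.0 * (edv_ml - esv_ml) / edv_ml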

  20. The role of mobile computed tomography in mass fatality incidents.

    PubMed

    Rutty, Guy N; Robinson, Claire E; BouHaidar, Ralph; Jeffery, Amanda J; Morgan, Bruno

    2007-11-01

    Mobile multi-detector computed tomography (MDCT) scanners are potentially available to temporary mortuaries and can be operational within 20 min of arrival. We describe, to our knowledge, the first use of mobile MDCT for a mass fatality incident. A mobile MDCT scanner attended the disaster mortuary after a five vehicle road traffic incident. Five out of six bodies were successfully imaged by MDCT in c. 15 min per body. Subsequent full radiological analysis took c. 1 h per case. The results were compared to the autopsy examinations. We discuss the advantages and disadvantages of imaging with mobile MDCT in relation to mass fatality work, illustrating the body pathway process, and its role in the identification of the pathology, personal effects, and health and safety hazards. We propose that the adoption of a single modality of mobile MDCT could replace the current use of multiple radiological sources within a mass fatality mortuary.

  1. Multidetector computed tomography of temporomandibular joint: A road less travelled.

    PubMed

    Pahwa, Shivani; Bhalla, Ashu Seith; Roychaudhary, Ajoy; Bhutia, Ongkila

    2015-05-16

    This article reviews the imaging anatomy of temporomandibular joint (TMJ), describes the technique of multi-detector computed tomography (MDCT) of the TMJ, and describes in detail various osseous pathologic afflictions affecting the joint. Traumatic injuries affecting the mandibular condyle are most common, followed by joint ankylosis as a sequel to arthritis. The congenital anomalies are less frequent, hemifacial microsomia being the most commonly encountered anomaly involving the TMJ. Neoplastic afflictions of TMJ are distinctly uncommon, osteochondroma being one of the most common lesions. MDCT enables comprehensive evaluation of osseous afflictions of TMJ, and is a valuable tool for surgical planning. Sagittal, coronal and 3D reformatted images well depict osseous TMJ lesions, and their relationship to adjacent structures.

  2. The influence of clinical and acquisition parameters on the interpretability of adenosine stress myocardial computed tomography perfusion.

    PubMed

    van Rosendael, Alexander R; de Graaf, Michiel A; Dimitriu-Leen, Aukelien C; van Zwet, Erik W; van den Hoogen, Inge J; Kharbanda, Rohit K; Bax, Jeroen J; Kroft, Lucia J; Scholte, Arthur J

    2017-02-01

    The interpretation of adenosine stress myocardial computed tomography perfusion (CTP) is often hampered by image artefacts caused by cardiac motion, beam hardening, and cone beam effects. The aim of the present analysis was to assess the influence of the heart-rate response during adenosine infusion, patient characteristics, and medication use on the interpretability of stress myocardial CTP examinations. Interpretability of stress myocardial CTP examinations was evaluated in 120 patients who sequentially underwent coronary CTA and adenosine stress myocardial CTP (320-row CT scanner, temporal resolution 175 ms) and scored as follows: excellent = absence of any artefact (n = 27, 22%); good = presence of artefacts that do not interfere with the study interpretability (n = 56, 47%); fair = artefacts that do interfere with interpretability (n = 35, 29%); poor = uninterpretable study due to artefacts (n = 2, 2%). 'Fair' and 'poor' were merged into 'reduced' for comparisons. Increasing heart rate during stress myocardial CTP acquisition was related to worse interpretability (excellent: 61.7 ± 13.4 bpm vs. good: 69.8 ± 13.5 bpm vs. reduced: 78.1 ± 17.0 bpm, P < 0.001). Thirteen (11%) of all examinations were considered non-diagnostic. In patients with a heart rate exceeding 85 bpm, 76% of the studies had 'reduced' interpretability. In multivariate analysis, no beta blocker use (at baseline or additionally prior to coronary CTA) (OR: 0.2, P = 0.012), increasing heart rate during coronary CTA (OR: 1.09, P = 0.032), younger age (OR: 0.92, P = 0.021), and the use of a calcium antagonist (OR: 6.48, P = 0.017) were independently associated with a heart rate ≥85 bpm during stress myocardial CTP. In conclusion, a higher heart rate during the acquisition of stress myocardial CTP was related to worse interpretability, and no beta blocker use, increasing heart rate during the previously performed coronary CTA, younger age, and calcium antagonist use were independently associated with a heart rate ≥85 bpm.

  3. The role of computed tomography in terminal ballistic analysis.

    PubMed

    Rutty, G N; Boyce, P; Robinson, C E; Jeffery, A J; Morgan, B

    2008-01-01

    Terminal ballistics concerns the science of projectile behaviour within a target and includes wound ballistics, which considers what happens when a projectile strikes a living being. A number of soft tissue ballistic simulants have been used to assess the damage to tissue caused by projectiles. Standard assessment of these materials, such as ballistic soap or ordnance gelatine, requires that the block be opened or that a mould be made to visualize the wound track. This is time consuming and may affect the accuracy of the findings, especially if the block dries and alters shape during the process. Therefore, accurate numerical analysis of the permanent or temporary cavity is limited. Computed tomography (CT) potentially offers a quicker, non-invasive analysis tool for this task. Four commercially purchased ballistic glycerine soap blocks were used. A single firearm was discharged into each from a distance of approximately 15 cm, using both gunshot and shotgun projectiles. After discharge, each block was imaged by a modern 16-slice multi-detector CT scanner and analysed using 3-D reconstruction software. Using the anterior-posterior and lateral scout views and the multi-plane reconstructed images, it was possible to visualize the temporary cavity, as well as the fragmentation and dispersal pattern of the projectiles, the distance travelled and the angle of dispersal within the block of each projectile or fragment. A virtual cast of the temporary cavity can also be made. Multi-detector CT with 3-D analysis software is shown to create a reliable permanent record of the projectile path, allowing rapid analysis of different firearms and projectiles.

  4. Diagnosis of a Cerebral Arteriovenous Malformation Using Isolated Brain Computed Tomography Angiography: Case Report.

    PubMed

    Qian, Hui; Shao, Yu; Li, Zhengdong; Huang, Ping; Zou, Donghua; Liu, Ningguo; Chen, Yijiu; Wan, Lei

    2016-09-01

    This report presents a case of a 40-year-old woman who was found dead in her house. Examination of the body revealed no external injuries. The whole body was scanned by multi-detector row computed tomography (CT) before autopsy, revealing massive hemorrhage in the right frontal lobe extending into the ventricular system. At autopsy, the brain parenchyma was removed, and CT angiography was then carried out on the isolated brain. Computed tomography angiography suggested a mass of irregular, tortuous vessels in the areas of hemorrhage in the right frontal lobe of the brain. Finally, histological examination confirmed the result of CT angiography, demonstrating an arteriovenous malformation. Hence, postmortem CT angiography played an important role in the diagnosis of the cerebral arteriovenous malformation that was responsible for the massive intracranial hemorrhage.

  5. An automated system for lung nodule detection in low-dose computed tomography

    NASA Astrophysics Data System (ADS)

    Gori, I.; Fantacci, M. E.; Preite Martinez, A.; Retico, A.

    2007-03-01

    A computer-aided detection (CAD) system for the identification of pulmonary nodules in low-dose multi-detector helical computed tomography (CT) images was developed in the framework of the MAGIC-5 Italian project. One of the main goals of this project is to build a distributed database of lung CT scans in order to enable automated image analysis through a data and CPU GRID infrastructure. The basic modules of our lung-CAD system, a dot-enhancement filter for nodule candidate selection and a neural classifier for false-positive reduction, are described. The system was designed and tested for both internal and sub-pleural nodules. The results obtained on the collected database of low-dose thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.
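
    The dot-enhancement stage is not specified beyond its name; a common variant selects bright blob-like voxels from the eigenvalues of the image Hessian. A sketch of that family of filters (an assumed variant, not the MAGIC-5 implementation):

      import numpy as np
      from scipy import ndimage

      def dot_enhancement(volume, spacing_mm, sigma_mm=4.0):
          # Smooth at the target nodule scale, then build the Hessian.
          sigma_vox = [sigma_mm / s for s in spacing_mm]
          sm = ndimage.gaussian_filter(volume.astype(np.float32), sigma_vox)
          grad = np.gradient(sm, *spacing_mm)
          hess = np.empty(volume.shape + (3, 3), dtype=np.float32)
          for i in range(3):
              gi = np.gradient(grad[i], *spacing_mm)
              for j in range(3):
                  hess[..., i, j] = gi[j]
          lam = np.linalg.eigvalsh(hess)   # ascending eigenvalues per voxel
          bright_blob = lam[..., 2] < 0    # all three eigenvalues negative
          return np.where(bright_blob,
                          lam[..., 2] ** 2 / (np.abs(lam[..., 0]) + 1e-12),
                          0.0)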

  6. Comparison of Thoracic Radiography and Computed Tomography in Calves with Naturally Occurring Respiratory Disease.

    PubMed

    Fowler, Jennifer; Stieger-Vanegas, Susanne M; Vanegas, Jorge A; Bobe, Gerd; Poulsen, Keith P

    2017-01-01

    To evaluate the severity and extent of lung disease using thoracic computed radiography (CR) compared to contrast-enhanced multi-detector computed tomography (MDCT) of the thorax in calves with naturally occurring respiratory disease, and to evaluate the feasibility and safety of performing contrast-enhanced thoracic MDCT examinations in sedated calves; furthermore, to evaluate whether combining CR or MDCT with respiratory scoring factors would improve prediction of the chronicity of pulmonary disease in calves. Thirty Jersey heifer calves ranging in age between 25 and 89 days with naturally occurring respiratory disease were studied. All calves were evaluated via thoracic CR and contrast-enhanced MDCT, euthanized immediately following thoracic MDCT, and submitted for necropsy. Imaging and histopathology results were compared with each other. Thoracic MDCT was superior for evaluation of pneumonia in calves due to the lack of summation in all areas of the lungs. Intravenously administered sedation provided an adequate plane of sedation for acquiring MDCT images of diagnostic quality, without the need for re-scanning. A diagnosis of pneumonia was made at an equal rate on both thoracic CR and MDCT. Although mild differences in classification of lung pattern and extent of lung disease were seen when comparing an experienced and a less experienced evaluator, the overall differences were not statistically significant. The best intra- and inter-observer agreement was noted when evaluating the cranioventral aspects of the lungs in either modality. Clinical respiratory scoring is inadequate for diagnosing chronicity of pneumonia in calves with naturally occurring pneumonia. Both imaging modalities allowed diagnosis of pneumonia in calves. The cranioventral aspects of the lungs were most commonly affected. Thoracic CR and MDCT provided similar diagnostic effectiveness in diagnosing pneumonia. However, MDCT provided better assessment of subtle details, which may

  7. Does Computed Tomography Change our Observation and Management of Fracture Non-Unions?

    PubMed Central

    Kleinlugtenbelt, Ydo V.; Scholtes, Vanessa A.B.; Toor, Jay; Amaechi, Christian; Maas, Mario; Bhandari, Mohit; Poolman, Rudolf W.; Kloen, Peter

    2016-01-01

    Background: The purpose of this study was to determine whether Multi-Detector Computed Tomography (MDCT) in addition to plain radiographs influences radiologists' and orthopedic surgeons' diagnosis and treatment plans for delayed unions and non-unions. Methods: A retrospective database of 32 non-unions was reviewed by 20 observers. On a scale of 1 to 5, observers rated plain radiographs and a subsequent Multi-Detector Helical Computed Tomography (MDCT) scan in the following categories: "healed", "bridging callus present", "persistent fracture line" and "surgery advised". Interobserver reliability in each category was calculated using the Intraclass Correlation Coefficient (ICC). The influence of the MDCT scan on the raters' observations was determined in each case by subtracting the scores at the two time points. Results: All four categories showed fair interobserver reliability when using plain radiographs. MDCT showed no improvement; the reliability was poor for the categories "bridging callus present" and "persistent fracture line", and fair for "healed" and "surgery advised". In none of the cases did MDCT lead to a change of management from nonoperative to operative treatment or vice versa. For 18 out of 32 cases, the treatment plans did not change. In seven cases, MDCT led to operative treatment where the treatment plan based on radiographs had been undecided. Conclusion: In this study, the interobserver reliability of MDCT was not greater than that of conventional radiographs for determining non-union. However, MDCT did lead to a more invasive approach in equivocal cases. Therefore, MDCT is recommended only for determining treatment strategy in such cases. PMID:27847846
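
    The reliability statistic above is the intraclass correlation; as an illustration, a two-way random, single-measure ICC(2,1) for an n-cases x k-observers score matrix can be computed as follows (the abstract does not state which ICC form was used, so this form is an assumption):

      import numpy as np

      def icc_2_1(scores):
          # scores: (n cases) x (k raters) matrix of ratings.
          x = np.asarray(scores, dtype=float)
          n, k = x.shape
          grand = x.mean()
          msr = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)  # rows
          msc = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)  # raters
          sse = np.sum((x - grand) ** 2) - msr * (n - 1) - msc * (k - 1)
          mse = sse / ((n - 1) * (k - 1))
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)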

  8. Multi-detector CT in the paediatric urinary tract.

    PubMed

    Damasio, M B; Darge, K; Riccabona, M

    2013-07-01

    The use of paediatric multi-slice CT (MSCT) is rapidly increasing worldwide. As technology advances, its application in paediatric care is constantly expanding, with an increasing need for radiation dose control and appropriate utilization. Recommendations on how and when to use CT for assessment of the paediatric urinary tract are therefore an important issue. The European Society of Paediatric Radiology (ESPR) uroradiology task force and the European Society of Urogenital Radiology (ESUR) paediatric working groups have accordingly created a proposal for performing renal CT in children, which has recently been published. The objective of this paper is to discuss paediatric urinary tract CT (uro-CT) in more detail and depth. The specific aim is not only to offer general recommendations on clinical indications and optimization of paediatric CT examinations, but also to address the various childhood characteristics and phenomena that underlie the different approach to and use of uro-CT in children compared to adults. According to ALARA principles, paediatric uro-CT should only be considered for selected indications, provided that high-level comprehensive US is not conclusive and alternative non-ionizing techniques such as MR are not available or appropriate. Optimization of paediatric uro-CT protocols (considering lower age-adapted kV and mAs) is mandatory, and the number of phases and acquisition series should be kept as low as possible.

  9. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  10. 64-MULTIDETECTOR COMPUTED TOMOGRAPHIC ANGIOGRAPHY OF THE CANINE CORONARY ARTERIES

    PubMed Central

    Drees, Randi; Frydrychowicz, Alex; Reeder, Scott B.; Pinkerton, Marie E.; Johnson, Rebecca

    2012-01-01

    Canine coronary computed tomographic angiography (CTA) was performed in four anesthetized healthy dogs using 64-multi-detector computed tomography. Esmolol, a β-1 adrenergic receptor antagonist, and sodium nitroprusside, an arteriolar and venous dilator, were administered to enhance visualization of the coronary arteries by reducing heart rate and creating vasodilation. The left main coronary artery with its three main branches and the right coronary artery were visualized and subdivided into 13 segments for evaluation. The optimal reconstruction interval, expressed as a percentage of the R-to-R interval, was determined to be 5% in 2.9%, 35% in 1%, 75% in 21.2%, 85% in 43.3%, and 95% in 31.7% of the segments. Overall image quality was good in 41.3% of the segments and excellent in 14.4%. There was blur in 98.1%, motion in 17.3%, and stair-step in 6.7% of the evaluated segments, but these artifacts did not interfere with anatomic depiction of the arteries. Cross-sectional anatomy of the coronary arteries as evaluated from the coronary CTA agreed well with gross anatomic evaluation and published information. The use of esmolol did not achieve the target heart rate of 60–65 beats/min. Nitroprusside had no significant effect on the visualized length or diameter of the coronary artery branches. Coronary CTA is useful for the anatomic depiction of coronary artery branches in the dog. PMID:21521398

  11. Computer-based route-definition system for peripheral bronchoscopy.

    PubMed

    Graham, Michael W; Gibbs, Jason D; Higgins, William E

    2012-04-01

    Multi-detector computed tomography (MDCT) scanners produce high-resolution images of the chest. Given a patient's MDCT scan, a physician can use an image-guided intervention system to first plan and later perform bronchoscopy to diagnostic sites situated deep in the lung periphery. An accurate definition of complete routes through the airway tree leading to the diagnostic sites, however, is vital for avoiding navigation errors during image-guided bronchoscopy. We present a system for the robust definition of complete airway routes suitable for image-guided bronchoscopy. The system incorporates both automatic and semiautomatic MDCT analysis methods for this purpose. Using an intuitive graphical user interface, the user invokes automatic analysis on a patient's MDCT scan to produce a series of preliminary routes. Next, the user visually inspects each route and quickly corrects the observed route defects using the built-in semiautomatic methods. Application of the system to a human study for the planning and guidance of peripheral bronchoscopy demonstrates the efficacy of the system.

  12. Anomalous right coronary artery arising next to the left coronary ostium: unambiguous detection of the anatomy by computed tomography and evaluation of functional significance by cardiovascular magnetic resonance.

    PubMed

    Korosoglou, Grigorios; Heye, Tobias; Giannitsis, Evangelos; Hosch, Waldemar; Kauczor, Hans U; Katus, Hugo A

    2010-11-19

    Herein we report on the diagnostic potential of multi-detector row computed tomography (MDCT) combined with cardiovascular magnetic resonance (CMR) for the diagnostic workup of an adult patient with a rare coronary anomaly. MDCT unambiguously detected the anomalous right coronary artery (RCA), which originated next to the left coronary ostium and coursed inter-arterially between the ascending aorta and the pulmonary trunk. The intramural proximal intussusception of the ectopic RCA could be clearly appreciated on MDCT images, while multiple mixed plaques were detected in the left anterior descending (LAD) artery, resulting in moderate stenosis of this vessel. CMR during adenosine infusion ruled out inducible ischemia, yielding normal perfusion patterns in both the RCA and the LAD coronary territory. Since ischemia was not demonstrated by stress CMR, revascularization was not performed.

  13. Comparison of image features calculated in different dimensions for computer-aided diagnosis of lung nodules

    NASA Astrophysics Data System (ADS)

    Xu, Ye; Lee, Michael C.; Boroczky, Lilla; Cann, Aaron D.; Borczuk, Alain C.; Kawut, Steven M.; Powell, Charles A.

    2009-02-01

    Features calculated from different dimensions of images capture quantitative information about lung nodules through one or multiple image slices. Previously published computer-aided diagnosis (CADx) systems have used either two-dimensional (2D) or three-dimensional (3D) features, though there has been little systematic analysis of the relevance of the different dimensions and of the impact of combining them. The aim of this study is to determine the importance of combining features calculated in different dimensions. We performed CADx experiments on 125 pulmonary nodules imaged using multi-detector row CT (MDCT). The CADx system computed 192 2D, 2.5D, and 3D image features of the lesions. Leave-one-out experiments were performed using five different combinations of features from different dimensions: 2D, 3D, 2.5D, 2D+3D, and 2D+3D+2.5D. The experiments were performed ten times for each group. Accuracy, sensitivity and specificity were used to evaluate the performance. Wilcoxon signed-rank tests were applied to compare the classification results from these five different combinations of features. Our results showed that 3D image features generated the best results compared with the other combinations of features. This suggests one approach to potentially reducing the dimensionality of the CADx data space and the computational complexity of the system while maintaining diagnostic accuracy.

  14. Computers and Computer Cultures.

    ERIC Educational Resources Information Center

    Papert, Seymour

    1981-01-01

    Instruction using computers is viewed as different from most other approaches to education, by allowing more than right or wrong answers, by providing models for systematic procedures, by shifting the boundary between formal and concrete processes, and by influencing the development of thinking in many new ways. (MP)

  15. Computer Engineers.

    ERIC Educational Resources Information Center

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  17. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  18. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  19. Detection and characterization of crystal suspensions using single-source dual-energy computed tomography: a phantom model of crystal arthropathies.

    PubMed

    Diekhoff, Torsten; Kiefer, Tobias; Stroux, Andrea; Pilhofer, Irid; Juran, Ralf; Mews, Jürgen; Blobel, Jörg; Tsuyuki, Masaharu; Ackermann, Beate; Hamm, Bernd; Hermann, Kay-Geert A

    2015-04-01

    The aim of this study was to perform phantom measurements to prove the feasibility of single-source dual-energy computed tomography (DECT) of the extremities using a volume scan mode. In addition, we wanted to determine, for the first time, which concentrations of monosodium urate (MSU) in gout and calcium pyrophosphate (CP) in pseudogout are needed to detect or distinguish these soft tissue depositions with DECT. We created a hand-shaped plastic phantom assembled with a descending order of concentrations of MSU (6.25%-50%) and CP (1.56%-50%) with similar attenuation in conventional computed tomographic (CT) images. Dual-energy imaging was done on a standard 320-row CT scanner with acquisition of 2 volumes: one at 80 and the other at 135 kV. Using linear regression analysis, dual-energy gradients were calculated for MSU and CP. Thereafter, we selected a specific region of interest on the dual-energy graph to color-code MSU and CP on the images. Three blinded readers scored 10 scans of the randomly equipped phantom, corresponding to 60 samples, to determine the sensitivity and specificity of this technique. Receiver operating characteristic (ROC) analysis was done to determine the diagnostic power. We found a dual-energy gradient for MSU of 1.020 ± 0.006 and for CP of 0.673 ± 0.001. Assessment of the randomized phantom scans indicates reliable detection of MSU at concentrations of 12.5% or higher and of CP at 6.25% or higher, corresponding to deposits with mean Hounsfield unit values of 59.8 for MSU and 101.1 for CP. The sensitivity for MSU ranged from 83.3% to 97.3% at 15/90 mA (135/80 kV) and from 86.7% to 97.3% at 100/570 mA. Specificity was 96.7% to 100% at 15/90 mA and 100% at 100/570 mA. However, there was inferior sensitivity for CP owing to the lower concentrations. In the ROC analysis, the area under the curve for MSU ranged from 0.867 to 0.947 at 15/90 mA and from 0.867 to 0.919 at 100/570 mA and that for CP from 0
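
    The dual-energy gradient is, in essence, the slope relating a material's attenuation at the two tube voltages across its dilution series, and the color-coding assigns each voxel to the nearest reference gradient. A sketch of both steps (the axis convention and the nearest-gradient rule are assumptions, not the scanner software's algorithm):

      import numpy as np

      def dual_energy_gradient(hu_80, hu_135):
          # Slope of high-kV vs. low-kV attenuation over a dilution series.
          slope, _intercept = np.polyfit(hu_80, hu_135, 1)
          return slope

      def classify_voxel(hu_80, hu_135, g_msu=1.020, g_cp=0.673):
          # Assign the voxel to the closer of the two reference gradients.
          g = hu_135 / hu_80
          return "MSU" if abs(g - g_msu) < abs(g - g_cp) else "CP"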

  20. A minimum data set approach to post-mortem computed tomography reporting for anthropological biological profiling.

    PubMed

    Brough, Alison L; Morgan, Bruno; Robinson, Claire; Black, Sue; Cunningham, Craig; Adams, Catherine; Rutty, Guy N

    2014-12-01

    Anthropological examination of bones is routinely undertaken in medico-legal investigations to establish an individual's biological profile, particularly their age. This often requires the removal of soft tissue from bone (de-fleshing), which, especially when dealing with the recently deceased, is a time consuming and invasive procedure. Recent advances in multi-detector computed tomography have made it practical to rapidly acquire high-resolution morphological skeletal information from images of "fleshed" remains. The aim of this study was to develop a short standard form, created from post-mortem computed tomography images, that contains the minimum image-set required to anthropologically assess an individual. The proposed standard forms were created for 31 juvenile forensic cases with known age-at-death, spanning the full age range of the developing human. Five observers independently used this form to estimate age-at-death. All observers estimated age in all cases, and all estimations were within the accepted ranges for traditional anthropological and odontological assessment. This study supports the implementation of this approach in forensic radiological practice.

  1. What can computed tomography and magnetic resonance imaging tell us about ventilation?

    PubMed Central

    Simon, Brett A.; Kaczka, David W.; Bankier, Alexander A.

    2012-01-01

    This review provides a summary of pulmonary functional imaging approaches for determining pulmonary ventilation, with a specific focus on multi-detector x-ray computed tomography and magnetic resonance imaging (MRI). We provide the important functional definitions of pulmonary ventilation typically used in medicine and physiology and discuss the fact that some of the imaging literature describes gas distribution abnormalities in pulmonary disease that may or may not be related to the physiological definition or clinical interpretation of ventilation. We also review the current state-of-the-field in terms of the key physiological questions yet unanswered related to ventilation and gas distribution in lung disease. Current and emerging imaging research methods are described, including their strengths and the challenges that remain to translate these methods to more wide-spread research and clinical use. We also examine how computed tomography and MRI might be used in the future to gain more insight into gas distribution and ventilation abnormalities in pulmonary disease. PMID:22653989

  2. The Spectrum of Presentations of Cryptogenic Organizing Pneumonia in High Resolution Computed Tomography

    PubMed Central

    Mehrian, Payam; Shahnazi, Makhtoom; Dahaj, Ali Ahmadi; Bizhanzadeh, Sorour; Karimi, Mohammad Ali

    2014-01-01

    Background: Various radiologic patterns of cryptogenic organizing pneumonia (COP) in X-rays have been reported for more than 20 years, and later, in computed tomography scans. The aim of the present study was to describe the spectrum of radiologic findings on high resolution computed tomography (HRCT) scans in patients with COP. Material/Methods: HRCT scans of 31 sequential patients (mean age: 54.3±11 years; 55% male) with biopsy-proven COP in a tertiary lung center between 2009 and 2012 were reviewed by two experienced pulmonary radiologists with almost perfect interobserver agreement (kappa=0.83). Chest HRCTs from the lung apex to the base were performed using a 16-slice multi-detector CT scanner. Results: The most common HRCT presentation of COP was ground-glass opacity (GGO) in 83.9% of cases, followed by consolidation in 71%. Both findings were mostly asymmetric bilateral and multifocal. Other common findings were the reverse halo (48.4%), parenchymal bands (54.8%) and subpleural bands (32.3%). Pulmonary nodules were found in about one-third of patients and were frequently smaller than 5 mm in diameter. Both GGOs and consolidations were revealed more often in the lower lobes. Conclusions: The main presentations of COP on HRCT include bilateral GGOs and consolidations in the lower lobes together with the reverse halo sign. PMID:25493105

  3. The spectrum of presentations of cryptogenic organizing pneumonia in high resolution computed tomography.

    PubMed

    Mehrian, Payam; Shahnazi, Makhtoom; Dahaj, Ali Ahmadi; Bizhanzadeh, Sorour; Karimi, Mohammad Ali

    2014-01-01

    Various radiologic patterns of cryptogenic organizing pneumonia (COP) in X-rays have been reported for more than 20 years, and later, in computed tomography scans. The aim of the present study was to describe the spectrum of radiologic findings on high resolution computed tomography (HRCT) scans in patients with COP. HRCT scans of 31 sequential patients (mean age: 54.3±11 years; 55% male) with biopsy-proven COP in a tertiary lung center between 2009 and 2012 were reviewed by two experienced pulmonary radiologists with almost perfect interobserver agreement (kappa=0.83). Chest HRCTs from the lung apex to the base were performed using a 16-slice multi-detector CT scanner. The most common HRCT presentation of COP was ground-glass opacity (GGO) in 83.9% of cases, followed by consolidation in 71%. Both findings were mostly asymmetric bilateral and multifocal. Other common findings were the reverse halo (48.4%), parenchymal bands (54.8%) and subpleural bands (32.3%). Pulmonary nodules were found in about one-third of patients and were frequently smaller than 5 mm in diameter. Both GGOs and consolidations were revealed more often in the lower lobes. The main presentations of COP on HRCT include bilateral GGOs and consolidations in the lower lobes together with the reverse halo sign.

  4. Evaluation of Sinonasal Diseases by Computed Tomography

    PubMed Central

    Phatak, Suresh

    2016-01-01

    Introduction: Computed Tomography (CT) plays an important diagnostic role in patients with sinonasal diseases and determines the treatment. The CT images clearly show the fine structural architecture of the bony anatomy, thereby demonstrating anatomical variations, the extent of disease, and the characterization of various inflammatory, benign and malignant sinonasal diseases. Aim: To evaluate the sensitivity and specificity of CT in the diagnosis of sinonasal diseases and to characterise benign and malignant lesions with the help of various CT parameters; also, to correlate the findings of CT with histopathological and diagnostic nasal endoscopy/Functional Endoscopic Sinus Surgery (FESS) findings. Materials and Methods: In this hospital-based prospective study, 175 patients with symptomatic sinonasal diseases were evaluated by clinical diagnosis and 16-slice Multi Detector Computed Tomography (MDCT). The findings of nasal endoscopy, FESS, histopathological examination and fungal culture were collected in all cases in which those investigations were done. All those findings were correlated with CT findings, and statistical analysis was done using test statistics (sensitivity, specificity, Positive Predictive Value (PPV), Negative Predictive Value (NPV) and accuracy), the Chi-Square test and the Z-test for single proportions. The software used in the analysis was SPSS version 17.0 and GraphPad Prism version 6.0, and p < 0.05 was considered statistically significant. Results: CT diagnosis had higher sensitivity, specificity, PPV and NPV in diagnosing various sinonasal diseases in comparison to clinical diagnosis. On correlating CT diagnosis with the final diagnosis, congenital conditions had 100% sensitivity and specificity. Chronic sinusitis had 98.3% sensitivity and 97.8% specificity. For fungal sinusitis the sensitivity was 60% and the specificity was 99.3%. Polyps had a sensitivity of 94.4% and a specificity of 98.1%. Benign neoplasms have sensitivity

  5. Cooling Computers.

    ERIC Educational Resources Information Center

    Birken, Marvin N.

    1967-01-01

    Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economics, physical, and esthetic characteristics, and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…

  6. Pygmalion's Computer.

    ERIC Educational Resources Information Center

    Peelle, Howard A.

    Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…

  8. Computer Literacy.

    ERIC Educational Resources Information Center

    South Carolina State Dept. of Education, Columbia.

    This self-instructional manual presents basic information about computers and their capabilities in a series of lessons, each of which contains a list of learner objectives and a series of true-false self-check exercises. The lessons cover: (1) what a computer is and what it can do, and the reasons why computers are used; (2) computer types and…

  9. Computational dosimetry

    SciTech Connect

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  10. Computational Toxicology

    EPA Science Inventory

    'Computational toxicology' is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...

  11. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Reviews the performance of computer programs for writing poetry and prose, including MARK V. SHANEY, MELL, POETRY GENERATOR, THUNDER THOUGHT, and ORPHEUS. Discusses the writing principles of the programs. Provides additional information on computer magnification techniques. (YP)

  13. Quantum computing

    PubMed Central

    Li, Shu-Shen; Long, Gui-Lu; Bai, Feng-Shan; Feng, Song-Lin; Zheng, Hou-Zhi

    2001-01-01

    Quantum computing is a quickly growing research field. This article introduces the basic concepts of quantum computing, recent developments in quantum searching, and decoherence in a possible quantum dot realization. PMID:11562459

  14. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  15. Library Computing.

    ERIC Educational Resources Information Center

    Library Journal, 1985

    1985-01-01

    This special supplement to "Library Journal" and "School Library Journal" includes articles on technological dependency, promise of computers for reluctant readers, copyright and database downloading, access to neighborhood of Mister Rogers, library acquisitions, circulating personal computers, "microcomputeritis,"…

  16. Parallel computation

    NASA Astrophysics Data System (ADS)

    Huberman, Bernardo A.

    1989-11-01

    This paper reviews three different aspects of parallel computation which are useful for physics. The first part deals with special architectures for parallel computing (SIMD and MIMD machines) and their differences, with examples of their uses. The second section discusses the speedup that can be achieved in parallel computation and the constraints generated by the issues of communication and synchrony. The third part describes computation by distributed networks of powerful workstations without global controls and the issues involved in understanding their behavior.
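
    The speedup constraints discussed in the second section are conventionally summarized by Amdahl's law; the short sketch below is a standard illustration, not taken from the paper.

      # Amdahl's law: speedup on p processors when a fraction f of the
      # work is parallelizable and the remaining 1 - f stays serial.
      def amdahl_speedup(f: float, p: int) -> float:
          return 1.0 / ((1.0 - f) + f / p)

      for p in (2, 8, 64, 1024):
          print(f"p={p:5d}  speedup={amdahl_speedup(0.95, p):6.2f}")
      # Even with 95% parallel work, speedup saturates near 1/(1 - f) = 20.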

  17. Computer Starters!

    ERIC Educational Resources Information Center

    Instructor, 1983

    1983-01-01

    Instructor's Computer-Using Teachers Board members give practical tips on how to get a classroom ready for a new computer, introduce students to the machine, and help them learn about programing and computer literacy. Safety, scheduling, and supervision requirements are noted. (PP)

  18. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  19. Distributed Computing.

    ERIC Educational Resources Information Center

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  2. Parallel computers

    SciTech Connect

    Treleaven, P.

    1989-01-01

    This book presents an introduction to object-oriented, functional, and logic parallel computing on which the fifth generation of computer systems will be based. Coverage includes concepts for parallel computing languages, a parallel object-oriented system (DOOM) and its language (POOL), an object-oriented multilevel VLSI simulator using POOL, and implementation of lazy functional languages on parallel architectures.

  3. Children's Computers.

    ERIC Educational Resources Information Center

    Samaras, Anastasia P.

    1996-01-01

    Suggests that teachers and social context determine what young children acquire from computer experiences. Provides anecdotes of teachers working with children who are using a computer program to complete a picture puzzle. The computer allowed teachers to present a problem, witness children's cognitive capabilities, listen to their metacognitive…

  4. Portable Computer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    SPOC, a navigation monitoring computer used by NASA in a 1983 mission, was a modification of a commercial computer called GRiD Compass, produced by GRiD Systems Corporation. SPOC was chosen because of its small size, large storage capacity, and high processing speed. The principal modification required was a fan to cool the computer. SPOC automatically computes position, orbital paths, communication locations, etc. Some of the modifications were adapted for commercial applications. The computer is presently used in offices for conferences, for on-site development, and by the army as part of a field communications system.

  5. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  6. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  7. Scanning protocol optimization and dose evaluation in coronary stenosis using multi-slices computed tomography

    NASA Astrophysics Data System (ADS)

    Huang, Yung-hui; Chen, Chia-lin; Sheu, Chin-yin; Lee, Jason J. S.

    2007-02-01

    Cardiovascular diseases are the most common cause of premature death in developed countries. A major fraction is attributable to atherosclerotic coronary artery disease, which may result in sudden cardiac failure. A reduction of mortality caused by myocardial infarction may be achieved if coronary atherosclerosis can be detected and treated at an early stage before symptoms occur. Therefore, there is a need for an effective tool that allows identification of patients at increased risk for future cardiac events. Current multi-detector CT has been widely used for detection and quantification of coronary calcifications as a sign of coronary atherosclerosis. The aim of this study is to optimize the diagnostic value and radiation exposure of coronary artery calcium-screening examinations using multi-slice CT (MSCT) with different image scan protocols. The radiation exposure for all protocols is evaluated by using computed tomography dose index (CTDI) phantom measurements. We chose an optimal scanning protocol and evaluated patient radiation dose in MSCT coronary artery screening while preserving the expected diagnostic accuracy. These changes give MSCT more operational flexibility and provide more diagnostic value in current practice.
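
    The CTDI phantom measurements mentioned above feed the standard dose descriptors; a brief sketch with hypothetical readings (the numbers are illustrative, not the study's data):

      # Standard CT dose descriptors from CTDI phantom readings (mGy).
      # The readings, pitch, and scan length below are hypothetical.
      ctdi_center = 10.0       # CTDI100 at the phantom center
      ctdi_periphery = 14.0    # mean CTDI100 at the peripheral holes
      pitch = 1.0              # table feed per rotation / beam width
      scan_length_cm = 12.0

      ctdi_w = ctdi_center / 3 + 2 * ctdi_periphery / 3   # weighted CTDI
      ctdi_vol = ctdi_w / pitch                           # volume CTDI
      dlp = ctdi_vol * scan_length_cm                     # dose-length product
      print(f"CTDIw={ctdi_w:.1f} mGy  CTDIvol={ctdi_vol:.1f} mGy  DLP={dlp:.0f} mGy*cm")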

  8. Post-mortem computed tomography and 3D imaging: anthropological applications for juvenile remains.

    PubMed

    Brough, Alison L; Rutty, Guy N; Black, Sue; Morgan, Bruno

    2012-09-01

    Anthropological examination of defleshed bones is routinely used in medico-legal investigations to establish an individual's biological profile. However, when dealing with the recently deceased, the removal of soft tissue from bone can be an extremely time consuming procedure that requires the presence of a trained anthropologist. In addition, due to its invasive nature, in some disaster victim identification scenarios the maceration of bones is discouraged by religious practices and beliefs, or even prohibited by national laws and regulations. Currently, three different radiological techniques may be used in the investigative process; plain X-ray, dental X-ray and fluoroscopy. However, recent advances in multi-detector computed tomography (MDCT) mean that it is now possible to acquire morphological skeletal information from high resolution images, reducing the necessity for invasive procedures. This review paper considers the possible applications of a virtual anthropological examination by reviewing the main juvenile age determination methods used by anthropologists at present and their possible adaptation to MDCT.

  9. Assessing vertebral fracture risk on volumetric quantitative computed tomography by geometric characterization of trabecular bone structure

    NASA Astrophysics Data System (ADS)

    Checefsky, Walter A.; Abidin, Anas Z.; Nagarajan, Mahesh B.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2016-03-01

    The current clinical standard for measuring Bone Mineral Density (BMD) is dual X-ray absorptiometry; however, more recently BMD derived from volumetric quantitative computed tomography has been shown to demonstrate a high association with spinal fracture susceptibility. In this study, we propose a method of fracture risk assessment using structural properties of trabecular bone in spinal vertebrae. Experimental data were acquired via axial multi-detector CT (MDCT) from 12 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. Common image processing methods were used to annotate the trabecular compartment in the vertebral slices, creating a circular region of interest (ROI) that excluded cortical bone for each slice. The pixels inside the ROI were converted to values indicative of BMD. High dimensional geometrical features were derived using the scaling index method (SIM) at different radii and scaling factors (SF). The mean BMD values within the ROI were then extracted and used in conjunction with a support vector machine to predict the failure load of the specimens. Prediction performance was measured using the root-mean-square error (RMSE) metric; SIM combined with mean BMD features (RMSE = 0.82 ± 0.37) outperformed MDCT-measured mean BMD alone (RMSE = 1.11 ± 0.33) (p < 10⁻⁴). These results demonstrate that biomechanical strength prediction in vertebrae can be significantly improved through the use of SIM-derived texture features from trabecular bone.
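
    A minimal sketch of the prediction pipeline described above: support vector regression scored by RMSE. The synthetic features are stand-ins for the SIM texture and BMD features, whose construction is not reproduced here.

      # Sketch: predict failure load with an SVM regressor, scored by RMSE
      # as in the study. Features/targets are synthetic stand-ins for the
      # SIM texture and BMD features; no claim about the study's setup.
      import numpy as np
      from sklearn.metrics import mean_squared_error
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 5))
      y = X @ np.array([1.5, -0.7, 0.3, 0.0, 0.8]) + rng.normal(scale=0.2, size=60)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr)
      rmse = np.sqrt(mean_squared_error(y_te, model.predict(X_te)))
      print(f"RMSE = {rmse:.2f}")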

  10. Computational psychiatry

    PubMed Central

    Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter

    2013-01-01

    Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032
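
    The reinforcement-learning framing mentioned above typically rests on a delta-rule update; the sketch shows the textbook Rescorla-Wagner rule, not a model from the cited work.

      # Textbook prediction-error learning: V <- V + alpha * (r - V).
      # Computational psychiatry often asks how parameters such as the
      # learning rate alpha differ in patient groups (illustrative only).
      import random

      def learn_value(rewards, alpha=0.1):
          v = 0.0
          for r in rewards:
              v += alpha * (r - v)     # prediction error drives the update
          return v

      rewards = [1 if random.random() < 0.7 else 0 for _ in range(500)]
      print(f"learned value ~ {learn_value(rewards):.2f} (true reward rate 0.7)")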

  11. Computational aerothermodynamics

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.; Green, Michael J.

    1987-01-01

    Computational aerothermodynamics (CAT) has in the past contributed to the understanding of real-gas flows encountered by hypervelocity reentry vehicles. With advances in computational fluid dynamics, in the modeling of high temperature phenomena, and in computer capability, CAT is an enabling technology for the design of many future space vehicles. An overview of the current capabilities of CAT is provided by describing available methods and their applications. Technical challenges that need to be met are discussed.

  12. Computer Software.

    ERIC Educational Resources Information Center

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  13. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  14. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Discussed are three examples of computer graphics including biomorphs, Truchet tilings, and fractal popcorn. The graphics are shown and the basic algorithm using multiple iteration of a particular function or mathematical operation is described. An illustration of a snail shell created by computer graphics is presented. (YP)

  15. I, Computer

    ERIC Educational Resources Information Center

    Barack, Lauren

    2005-01-01

    What child hasn't chatted with friends through a computer? But chatting with a computer? Some Danish scientists have literally put a face on their latest software program, bringing to virtual life storyteller Hans Christian Andersen, who engages users in actual conversations. The digitized Andersen resides at the Hans Christian Andersen Museum in…

  16. Computer Insecurity.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1994-01-01

    College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)

  17. Computational Pathology

    PubMed Central

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  18. Cloud Computing

    DTIC Science & Technology

    2009-11-12

    Eucalyptus Systems
    • Provides an open-source application that can be used to implement a cloud computing environment on a datacenter
    • Trying to establish an...edgeplatform.html
    • Amazon Elastic Compute Cloud (EC2): http://aws.amazon.com/ec2/
    • Amazon Simple Storage Solution (S3): http://aws.amazon.com/s3/
    • Eucalyptus

  20. Computer News

    ERIC Educational Resources Information Center

    Science Activities: Classroom Projects and Curriculum Ideas, 2007

    2007-01-01

    This article presents several news stories about computers and technology. (1) Applied Science Associates of Narragansett, Rhode Island is providing computer modeling technology to help locate the remains to the USS Bonhomme Richard, which sank in 1779 after claiming a Revolutionary War victory. (2) Whyville, the leading edu-tainment virtual world…

  2. Computational astrophysics

    NASA Technical Reports Server (NTRS)

    Miller, Richard H.

    1987-01-01

    Astronomy is an area of applied physics in which unusually beautiful objects challenge the imagination to explain observed phenomena in terms of known laws of physics. It is a field that has stimulated the development of physical laws and of mathematical and computational methods. Current computational applications are discussed in terms of stellar and galactic evolution, galactic dynamics, and particle motions.

  5. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  6. Computer Graphics.

    ERIC Educational Resources Information Center

    Halpern, Jeanne W.

    1970-01-01

    Computer graphics have been called the most exciting development in computer technology. At the University of Michigan, three kinds of graphics output equipment are now being used: symbolic printers, line plotters or drafting devices, and cathode-ray tubes (CRT). Six examples are given that demonstrate the range of graphics use at the University.…

  7. Optical computing.

    NASA Technical Reports Server (NTRS)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  9. Evolutionary Computing

    SciTech Connect

    Patton, Robert M; Cui, Xiaohui; Jiao, Yu; Potok, Thomas E

    2008-01-01

    The rate at which information overwhelms humans is significantly more than the rate at which humans have learned to process, analyze, and leverage this information. To overcome this challenge, new methods of computing must be formulated, and scientists and engineers have looked to nature for inspiration in developing these new methods. Consequently, evolutionary computing has emerged as a new paradigm for computing, and has rapidly demonstrated its ability to solve real-world problems where traditional techniques have failed. This field of work has now become quite broad and encompasses areas ranging from artificial life to neural networks. This chapter focuses specifically on two sub-areas of nature-inspired computing: Evolutionary Algorithms and Swarm Intelligence.
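
    A minimal evolutionary algorithm of the kind the chapter surveys, here with mutation and truncation selection on a toy one-dimensional objective (all parameters illustrative):

      # Minimal evolutionary algorithm: mutate, then keep the fittest.
      import random

      def fitness(x):
          return -(x - 3.14) ** 2            # toy objective, peak at 3.14

      population = [random.uniform(-10, 10) for _ in range(20)]
      for _ in range(100):
          offspring = [p + random.gauss(0, 0.5) for p in population]
          population = sorted(population + offspring, key=fitness)[-20:]

      print(f"best individual ~ {max(population, key=fitness):.3f}")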

  10. Computer Calculus.

    ERIC Educational Resources Information Center

    Steen, Lynn Arthur

    1981-01-01

    The development of symbolic computer algebra designed to manipulate abstract mathematical expressions is discussed. The ability of this software to mimic the standard patterns of human problem solving represents a major advance toward "true" artificial intelligence. (MP)
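
    Symbolic manipulation of the kind the article describes is now routine in open-source libraries; a small example using SymPy, a modern stand-in for the systems discussed:

      # Symbolic computer algebra: expressions, not numbers, are the data.
      import sympy as sp

      x = sp.symbols("x")
      expr = x**2 * sp.sin(x)
      print(sp.diff(expr, x))         # -> x**2*cos(x) + 2*x*sin(x)
      print(sp.integrate(2 * x, x))   # -> x**2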

  11. Computer Poker

    ERIC Educational Resources Information Center

    Findler, Nicholas V.

    1978-01-01

    This familiar card game has interested mathematicians, economists, and psychologists as a model of decision-making in the real world. It is now serving as a vehicle for investigations in computer science. (Author/MA)

  12. Personal Computers.

    ERIC Educational Resources Information Center

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  13. Computer Stimulation

    ERIC Educational Resources Information Center

    Moore, John W.; Moore, Elizabeth

    1977-01-01

    Discusses the computer simulation approach of Limits to Growth, in which interactions of five variables (population, pollution, resources, food per capita, and industrial output per capita) indicate the status of the world. Reviews other books that predict the future of the world. (CS)

  14. Quantum Computing

    DTIC Science & Technology

    1998-04-01

    Information representation and processing technology, although faster than the wheels and gears of the Charles Babbage computation machine, is still in the same computational complexity class as the Babbage machine, with bits of information represented by entities which obey classical (non-quantum)…nuclear double resonances. Charles M. Bowden and Jonathan P. Dowling, Weapons Sciences Directorate, AMSMI-RD-WS-ST, Missile Research, Development, and…

  15. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  16. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  17. Computational oncology.

    PubMed

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.

  18. Chromatin Computation

    PubMed Central

    Bryant, Barbara

    2012-01-01

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109
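
    A toy rendering of the nucleosome-tape idea: marks on a one-dimensional string rewritten by a local read-write rule. The marks and the spreading rule are invented for illustration and are not the paper's construction.

      # Toy "chromatin computer": nucleosomes as tape cells, a writer rule
      # that reads neighbors and writes marks. "M" spreads rightward until
      # a boundary mark "B"; marks and rule are invented for illustration.
      def step(tape):
          new = list(tape)
          for i in range(len(tape) - 1):
              if tape[i] == "M" and tape[i + 1] == ".":
                  new[i + 1] = "M"       # write a mark on the neighbor
          return new

      tape = list("M.....B...")
      for _ in range(7):
          print("".join(tape))
          tape = step(tape)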

  19. [DNA computing].

    PubMed

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers can be an alternative to traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (imposed by the Heisenberg uncertainty principle) and by the growing traffic of information between the central processing unit and the main memory (the von Neumann bottleneck). The idea of DNA computing came true for the first time in 1994, when Adleman solved the Hamiltonian path problem using short DNA oligomers and DNA ligase. In the early 2000s a series of biocomputer models was presented, with a seminal work by Shapiro and his colleagues, who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as the software as well as the input/output signals. DNA molecules also provided energy for this machine. DNA computing can be exploited in many applications, from studies of gene expression patterns to the diagnosis and therapy of cancer. Research on DNA computing continues both in vitro and in vivo, and its promising results give hope for a breakthrough in computer science.
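
    For comparison, the problem Adleman solved chemically takes only a few lines in conventional code; the digraph below is an arbitrary example, not his original seven-vertex instance.

      # Hamiltonian path by exhaustive search; Adleman's DNA experiment
      # explored all candidate paths in parallel in a test tube.
      from itertools import permutations

      edges = {(0, 1), (1, 2), (2, 3), (1, 3), (3, 4)}   # arbitrary digraph
      nodes = range(5)

      def hamiltonian_paths():
          for order in permutations(nodes):
              if all(pair in edges for pair in zip(order, order[1:])):
                  yield order

      print(next(hamiltonian_paths(), None))   # -> (0, 1, 2, 3, 4)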

  20. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  1. F18-fluorodeoxyglucose-positron emission tomography and computed tomography is not accurate in preoperative staging of gastric cancer

    PubMed Central

    Ha, Tae Kyung; Choi, Yun Young; Song, Soon Young

    2011-01-01

    Purpose To investigate the clinical benefits of F18-fluorodeoxyglucose-positron emission tomography and computed tomography (18F-FDG-PET/CT) over multi-detector row CT (MDCT) in preoperative staging of gastric cancer. Methods FDG-PET/CT and MDCT were performed on 78 patients with gastric cancer pathologically diagnosed by endoscopy. The accuracy of radiologic staging was retrospectively compared with the pathologic results after curative resection. Results Primary tumors were detected in 51 (65.4%) patients with 18F-FDG-PET/CT, and 47 (60.3%) patients with MDCT. Regarding detection of lymph node metastasis, the sensitivity of FDG-PET/CT was 51.5% with an accuracy of 71.8%, whereas those of MDCT were 69.7% and 69.2%, respectively. The sensitivity of 18F-FDG-PET/CT for a primary tumor with signet ring cell carcinoma was lower than that of 18F-FDG-PET/CT for a primary tumor with non-signet ring cell carcinoma (35.3% vs. 73.8%, P < 0.01). Conclusion Due to its low sensitivity, 18F-FDG-PET/CT alone shows no definite clinical benefit for prediction of lymph node metastasis in preoperative staging of gastric cancer. PMID:22066108
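
    The reported figures follow the usual confusion-matrix definitions; the counts below were chosen to reproduce the reported 51.5% sensitivity and 71.8% accuracy over 78 patients, and are illustrative rather than taken from the paper's tables.

      # Sensitivity and accuracy as used in the staging comparison.
      tp, fn = 17, 16     # nodal metastases detected / missed by PET/CT
      tn, fp = 39, 6      # node-negative correctly called / overcalled

      sensitivity = tp / (tp + fn)                 # -> 0.515
      accuracy = (tp + tn) / (tp + fn + tn + fp)   # -> 0.718
      print(f"sensitivity={sensitivity:.1%}  accuracy={accuracy:.1%}")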

  2. Quantum computers.

    PubMed

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  3. Qubus computation

    NASA Astrophysics Data System (ADS)

    Munro, W. J.; Nemoto, Kae; Spiller, T. P.; van Loock, P.; Braunstein, Samuel L.; Milburn, G. J.

    2006-08-01

    Processing information quantum mechanically is known to enable new communication and computational scenarios that cannot be accessed with conventional information technology (IT). We present here a new approach to scalable quantum computing---a "qubus computer"---which realizes qubit measurement and quantum gates through interacting qubits with a quantum communication bus mode. The qubits could be "static" matter qubits or "flying" optical qubits, but the scheme we focus on here is particularly suited to matter qubits. Universal two-qubit quantum gates may be effected by schemes which involve measurement of the bus mode, or by schemes where the bus disentangles automatically and no measurement is needed. This approach enables a parity gate between qubits, mediated by a bus, enabling near-deterministic Bell state measurement and entangling gates. Our approach is therefore the basis for very efficient, scalable QIP, and provides a natural method for distributing such processing, combining it with quantum communication.
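
    The parity gate at the heart of the scheme separates even- from odd-parity two-qubit states; a small numerical sketch of an ideal parity measurement, with the bus mode abstracted away entirely:

      # Ideal two-qubit parity measurement: project onto span{|00>, |11>}.
      # The bus-mediated physics of the qubus scheme is abstracted away.
      import numpy as np

      ket = {"0": np.array([1.0, 0.0]), "1": np.array([0.0, 1.0])}

      def basis(bits):                     # |b1 b2> as a 4-vector
          return np.kron(ket[bits[0]], ket[bits[1]])

      P_even = np.outer(basis("00"), basis("00")) + np.outer(basis("11"), basis("11"))

      psi = (basis("00") + basis("01")) / np.sqrt(2)        # mixed-parity state
      print(f"P(even parity) = {psi @ P_even @ psi:.2f}")   # -> 0.50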

  4. Computational Psychiatry

    PubMed Central

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  5. Computational mechanics

    SciTech Connect

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for "Springback Predictability" and with the Federal Aviation Administration (FAA) for the "Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris." In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  6. LHC Computing

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  7. Computational vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1981-01-01

    The range of fundamental computational principles underlying human vision that equally apply to artificial and natural systems is surveyed. There emerges from research a view of the structuring of vision systems as a sequence of levels of representation, with the initial levels being primarily iconic (edges, regions, gradients) and the highest symbolic (surfaces, objects, scenes). Intermediate levels are constrained by information made available by preceding levels and information required by subsequent levels. In particular, it appears that physical and three-dimensional surface characteristics provide a critical transition from iconic to symbolic representations. A plausible vision system design incorporating these principles is outlined, and its key computational processes are elaborated.
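
    The iconic first level (edges, gradients) is easy to make concrete; a minimal gradient computation with a Sobel filter, a generic example rather than anything from the paper:

      # Iconic-level vision: horizontal image gradient via a Sobel filter.
      import numpy as np

      sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)

      def filter2d(img, kernel):
          h, w = img.shape
          out = np.zeros((h - 2, w - 2))
          for i in range(h - 2):
              for j in range(w - 2):
                  out[i, j] = np.sum(img[i:i + 3, j:j + 3] * kernel)
          return out

      img = np.zeros((8, 8)); img[:, 4:] = 1.0    # vertical step edge
      gx = filter2d(img, sobel_x)
      print(np.abs(gx).max())                     # strong response at the edge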

  8. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  9. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  10. Radiological Protection in Cone Beam Computed Tomography (CBCT). ICRP Publication 129.

    PubMed

    Rehani, M M; Gupta, R; Bartling, S; Sharp, G C; Pauwels, R; Berris, T; Boone, J M

    2015-07-01

    The objective of this publication is to provide guidance on radiological protection in the new technology of cone beam computed tomography (CBCT). Publications 87 and 102 dealt with patient dose management in computed tomography (CT) and multi-detector CT. The new applications of CBCT and the associated radiological protection issues are substantially different from those of conventional CT. The perception that CBCT involves lower doses was only true in initial applications. CBCT is now used widely by specialists who have little or no training in radiological protection. This publication provides recommendations on radiation dose management directed at different stakeholders, and covers principles of radiological protection, training, and quality assurance aspects. Advice on appropriate use of CBCT needs to be made widely available. Advice on optimisation of protection when using CBCT equipment needs to be strengthened, particularly with respect to the use of newer features of the equipment. Manufacturers should standardise radiation dose displays on CBCT equipment to assist users in optimisation of protection and comparisons of performance. Additional challenges to radiological protection are introduced when CBCT-capable equipment is used for both fluoroscopy and tomography during the same procedure. Standardised methods need to be established for tracking and reporting of patient radiation doses from these procedures. The recommendations provided in this publication may evolve in the future as CBCT equipment and applications evolve. As with previous ICRP publications, the Commission hopes that imaging professionals, medical physicists, and manufacturers will use the guidelines and recommendations provided in this publication for implementation of the Commission's principle of optimisation of protection of patients and medical workers, with the objective of keeping exposures as low as reasonably achievable, taking into account economic and societal factors, and…

  11. Computational Physics.

    ERIC Educational Resources Information Center

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the…

  12. Computer Corner.

    ERIC Educational Resources Information Center

    Smith, David A.; And Others

    1986-01-01

    APL was invented specifically as a mathematical teaching tool, and is an excellent vehicle for teaching mathematical concepts using computers. This article illustrates the use of APL in teaching many different topics in mathematics, including logic, set theory, functions, statistics, linear algebra, and matrices. (MNS)

  13. Computational Sociolinguistics.

    ERIC Educational Resources Information Center

    SEDELOW, WALTER A., JR.

    The use of the computer may be one of the ways in which varied linguistic interests (sociolinguistics, psycholinguistics) come to be rendered interrelated and even intellectually coherent. (The criterion of coherence is set here at monism as to models.) One of the author's major interests is a systematic approach to scientific creativity,…

  14. Computational aeroelasticity

    NASA Technical Reports Server (NTRS)

    Edwards, John W.

    1993-01-01

    Recent large-scale studies of computational unsteady aerodynamics for aeroelastic applications are reviewed. The variety of fluid dynamic flow models available to address such computations are illustrated in two cases: (1) 2D Navier-Stokes (NS) equations illustrating the highest level of modeling usually employed and (2) the transonic small-disturbance potential equation representing the entry level for nonlinear flow modeling. Estimates of computer resources necessary to produce accurate converged results are given. Application of potential and NS equation codes for flows which are generally at lower angles and high speeds are addressed, including (1) the treatment of wings and configuration details using potential equation codes and (2) recent NS code calculations of complete vehicle configurations. Attention is given to high-angle conditions involving separated vortex-dominated flows. Accuracy requirements for vortex shedding over airfoils is discussed and the calculation of vorticity convected over significant distances is addressed. Steady CFD computations about delta wings at high angles are given, including a discussion of the required level of fluid dynamic flow modeling. Recent results on unsteady 'buffetlike flow' about complete vehicle models are reviewed.

  15. Computer Corner.

    ERIC Educational Resources Information Center

    Mason, Margie

    1985-01-01

    This article: describes how to prevent pins on game paddles from breaking; suggests using needlepoint books for ideas to design computer graphics; lists a BASIC program to create a Christmas tree, with extension activities; suggests a LOGO Christmas activity; and describes a book on the development of microcomputers. (JN)

  16. Business Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    A brief definition of some fundamentals of microcomputers and of the ways they may be used in small businesses can help potential buyers make informed purchases. Hardware (the mechanical devices from which computers are made) described here are the video display, keyboard, central processing unit, "random access" and "read only" memories, cassette…

  17. Computer Guerrillas.

    ERIC Educational Resources Information Center

    Immel, A. Richard

    1983-01-01

    Describes several cases in which microcomputers were used to prevent large organizations (e.g., utility companies, U.S. Government Forestry Commission) from carrying out actions considered not to be in the public's best interests. The use of the computers by social activitists in their efforts to halt environmental destruction is discussed. (EAO)

  18. Computational trigonometry

    SciTech Connect

    Gustafson, K.

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
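
    For a symmetric positive definite matrix, the first antieigenvalue is the minimum over nonzero x of <Ax,x>/(||Ax|| ||x||), with the closed form 2*sqrt(lmin*lmax)/(lmin + lmax); a quick numerical check using only this definition (illustrative, not the author's code):

      # First antieigenvalue of an SPD matrix: min <Ax,x>/(|Ax||x|) over x,
      # compared against the closed form 2*sqrt(lmin*lmax)/(lmin + lmax).
      import numpy as np

      rng = np.random.default_rng(1)
      M = rng.normal(size=(4, 4))
      A = M @ M.T + 4 * np.eye(4)          # symmetric positive definite

      def cosine(x):
          Ax = A @ x
          return (x @ Ax) / (np.linalg.norm(Ax) * np.linalg.norm(x))

      sampled = min(cosine(rng.normal(size=4)) for _ in range(100_000))
      lmin, lmax = np.linalg.eigvalsh(A)[[0, -1]]
      closed = 2 * np.sqrt(lmin * lmax) / (lmin + lmax)
      print(f"sampled min ~ {sampled:.4f}   closed form = {closed:.4f}")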

  19. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  20. Networking computers.

    PubMed

    McBride, D C

    1997-03-01

    This decade the role of the personal computer has shifted dramatically from a desktop device designed to increase individual productivity and efficiency to an instrument of communication linking people and machines in different places with one another. A computer in one city can communicate with another that may be thousands of miles away. Networking is how this is accomplished. Just like the voice network used by the telephone, computer networks transmit data and other information via modems over these same telephone lines. A network can be created over both short and long distances. Networks can be established within a hospital or medical building or over many hospitals or buildings covering many geographic areas. Those confined to one location are called LANs, local area networks. Those that link computers in one building to those at other locations are known as WANs, or wide area networks. The ultimate wide area network is the one we've all been hearing so much about these days--the Internet, and its World Wide Web. Setting up a network is a process that requires careful planning and commitment. To avoid potential pitfalls and to make certain the network you establish meets your needs today and several years down the road, several steps need to be followed. This article reviews the initial steps involved in getting ready to network.
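
    The data exchange the article describes reduces, in modern terms, to socket communication; a minimal loopback sketch (generic Python, not from the article):

      # Minimal TCP exchange on the loopback interface: the primitive
      # underlying both LANs and WANs. Port 5050 is an arbitrary choice.
      import socket
      import threading

      srv = socket.create_server(("127.0.0.1", 5050))   # bind and listen

      def serve():
          conn, _ = srv.accept()
          with conn:
              print("server received:", conn.recv(1024).decode())

      t = threading.Thread(target=serve)
      t.start()
      with socket.create_connection(("127.0.0.1", 5050)) as client:
          client.sendall(b"hello over the network")
      t.join()
      srv.close()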

  1. Computational Hearing

    DTIC Science & Technology

    1998-11-01

    ranging from the anatomy and physiology of the auditory pathway to the perception of speech and music under both ideal and not-so-ideal (but more...physiology of various parts of the auditory pathway, to auditory prostheses, speech and audio coding, computational models of pitch and timbre, the role of...

  2. Computational Musicology.

    ERIC Educational Resources Information Center

    Bel, Bernard; Vecchione, Bernard

    1993-01-01

    Asserts that a revolution has been occurring in musicology since the 1970s. Contends that music has changed from being only a source of emotion to appearing more open to science and techniques based on computer technology. Describes recent research and other writings about the topic and provides an extensive bibliography. (CFR)

  4. Library Computing.

    ERIC Educational Resources Information Center

    Dayall, Susan A.; And Others

    1987-01-01

    Six articles on computers in libraries discuss training librarians and staff to use new software; appropriate technology; system upgrades of the Research Libraries Group's information system; pre-IBM PC microcomputers; multiuser systems for small to medium-sized libraries; and a library user's view of the traditional card catalog. (EM)

  5. Computational Estimation

    ERIC Educational Resources Information Center

    Fung, Maria G.; Latulippe, Christine L.

    2010-01-01

    Elementary school teachers are responsible for constructing the foundation of number sense in youngsters, and so it is recommended that teacher-training programs include an emphasis on number sense to ensure the development of dynamic, productive computation and estimation skills in students. To better prepare preservice elementary school teachers…

  6. Computational Mathematics

    DTIC Science & Technology

    2012-03-06

    (Marsha Berger, NYU) Inclusion of the Adaptation/Adjoint module and Embedded Boundary Methods in the software package Cart3D; transition to NASA...ONR, DOE, AFRL, DIA. Cart3D used for computing formation flight to reduce drag and improve energy efficiency; application to Explosively Formed...

  7. Library Computing.

    ERIC Educational Resources Information Center

    Goodgion, Laurel; And Others

    1986-01-01

    Eight articles in special supplement to "Library Journal" and "School Library Journal" cover a computer program called "Byte into Books"; microcomputers and the small library; creating databases with students; online searching with a microcomputer; quality automation software; Meckler Publishing Company's…

  8. Bacteria as computers making computers

    PubMed Central

    Danchin, Antoine

    2009-01-01

    Various efforts to integrate biological knowledge into networks of interactions have produced a lively microbial systems biology. Putting molecular biology and computer sciences in perspective, we review another trend in systems biology, in which recursivity and information replace the usual concepts of differential equations, feedback and feedforward loops and the like. Noting that the processes of gene expression separate the genome from the cell machinery, we analyse the role of the separation between machine and program in computers. However, computers do not make computers. For cells to make cells requires a specific organization of the genetic program, which we investigate using available knowledge. Microbial genomes are organized into a paleome (the name emphasizes the role of the corresponding functions from the time of the origin of life), comprising a constructor and a replicator, and a cenome (emphasizing community-relevant genes), made up of genes that permit life in a particular context. The cell duplication process supposes rejuvenation of the machine and replication of the program. The paleome also possesses genes that enable information to accumulate in a ratchet-like process down the generations. Systems biology must include the dynamics of information creation in its future developments. PMID:19016882

  9. Computational Combustion

    SciTech Connect

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark-ignition, diesel, and homogeneous-charge compression-ignition engines, surface and catalytic combustion, pulse combustion, and detonations is described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  10. Rubbery computing

    NASA Astrophysics Data System (ADS)

    Wilson, Katherine E.; Henke, E.-F. Markus; Slipher, Geoffrey A.; Anderson, Iain A.

    2017-04-01

    Electromechanically coupled dielectric elastomer actuators (DEAs) and dielectric elastomer switches (DESs) may form digital logic circuitry made entirely of soft and flexible materials. The expansion in planar area of a DEA exerts force across a DES, which is a soft electrode with strain-dependent resistivity. When compressed, the DES drops steeply in resistance and changes state from non-conducting to conducting. Logic operators may be achieved with different arrangements of interacting DE actuators and switches. We demonstrate combinatorial logic elements, including the fundamental Boolean logic gates, as well as sequential logic elements, including latches and flip-flops. With both data storage and signal processing abilities, the necessary calculating components of a soft computer are available. A noteworthy advantage of a soft computer with mechanosensitive DESs is the potential for responding to environmental strains while locally processing information and generating a reaction, like a muscle reflex.
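
    As an illustrative aside (a minimal sketch, not from the paper), the logic behavior described above can be mimicked in ordinary code by abstracting the DEA/DES physics into Boolean functions. Below, a NAND gate and one update step of a NAND-based SR latch stand in for the combinatorial and sequential elements the authors realize in elastomer; all names are invented.

    def nand(a: bool, b: bool) -> bool:
        # one gate; the paper realizes such gates with interacting DEA/DES pairs
        return not (a and b)

    def sr_latch_step(s: bool, r: bool, q: bool) -> bool:
        # one update of an active-low SR latch built from two NAND gates;
        # feeding q back in gives the data-storage behavior of a soft latch
        q_bar = nand(r, q)
        return nand(s, q_bar)

    q = False
    q = sr_latch_step(True, True, q)   # hold  -> False
    q = sr_latch_step(False, True, q)  # set   -> True
    q = sr_latch_step(True, False, q)  # reset -> False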

  11. RATIO COMPUTER

    DOEpatents

    Post, R.F.

    1958-11-11

    An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals and each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of input signals depending upon the relation of input to fixed signals in the first mentioned channel.
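
    A toy numerical model (an illustration of the principle, not the patented circuit) may help: a shared gain g is driven by negative feedback until the comparison channel's output g*v1 matches the constant signal C, so g settles at C/v1; the other channel then outputs g*v2 = C*v2/v1, proportional to the quotient of the inputs. All values below are invented.

    def ratio_computer(v1, v2, C=1.0, rate=0.05, steps=2000):
        g = 0.0
        for _ in range(steps):
            error = C - g * v1   # difference signal from the comparison channel
            g += rate * error    # negative feedback adjusts the shared gain
        return g * v2            # output channel: approximately C * v2 / v1

    print(ratio_computer(2.0, 6.0))  # ~3.0, i.e. v2/v1 with C = 1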

  12. Computational Physics

    NASA Astrophysics Data System (ADS)

    Thijssen, Jos

    2013-10-01

    1. Introduction; 2. Quantum scattering with a spherically symmetric potential; 3. The variational method for the Schrödinger equation; 4. The Hartree-Fock method; 5. Density functional theory; 6. Solving the Schrödinger equation in periodic solids; 7. Classical equilibrium statistical mechanics; 8. Molecular dynamics simulations; 9. Quantum molecular dynamics; 10. The Monte Carlo method; 11. Transfer matrix and diagonalisation of spin chains; 12. Quantum Monte Carlo methods; 13. The finite element method for partial differential equations; 14. The lattice Boltzmann method for fluid dynamics; 15. Computational methods for lattice field theories; 16. High performance computing and parallelism; Appendix A. Numerical methods; Appendix B. Random number generators; References; Index.

  13. Computer Game

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Using NASA studies of advanced lunar exploration and colonization, KDT Industries, Inc. and Wesson International have developed MOONBASE, a computer game. The player, or team commander, must build and operate a lunar base using NASA technology. He has 10 years to explore the surface, select a site and assemble structures brought from Earth into an efficient base. The game was introduced in 1991 by Texas Space Grant Consortium.

  14. Computational Electromagnetics

    DTIC Science & Technology

    2011-02-20

    ...a collaboration between Caltech’s postdoctoral associate N. Albin and OB) have shown that, for a variety of reasons, the first-order...KZK approximation", Nathan Albin, Oscar P. Bruno, Theresa Y. Cheung and Robin O. Cleveland, preprint, (2011); "A Spectral FC Solver for the Compressible...Navier-Stokes Equations in General Domains I: Explicit time-stepping", Nathan Albin and Oscar P. Bruno, to appear in Journal of Computational Physics...

  15. Singularity computations

    NASA Technical Reports Server (NTRS)

    Swedlow, J. L.

    1976-01-01

    An approach is described for singularity computations based on a numerical method for elastoplastic flow to delineate radial and angular distribution of field quantities and measure the intensity of the singularity. The method is applicable to problems in solid mechanics and lends itself to certain types of heat flow and fluid motion studies. Its use is not limited to linear, elastic, small strain, or two-dimensional situations.

  16. Computer files.

    PubMed

    Malik, M

    1995-02-01

    From what has been said, several recommendations can be made for users of small personal computers regardless of which operating system they use. If your computer has a large hard disk not specially required by any single application, organize the disk into a small number of volumes. You will then be using the computer as if it had several smaller disks, which will help you to create a logical file structure. The size of individual volumes has to be selected carefully with respect to the files kept in each volume. Otherwise, you may have too much space in one volume and not enough in another. In each volume, organize the structure of directories and subdirectories logically so that they correspond to the logic of your file content. Be aware of the fact that the directories suggested by default when installing new software are often not optimal. For instance, it is better to put different graphics packages under a common subdirectory rather than to install them at the same level as all other packages including statistics, text processors, etc. Create a special directory for each task for which you use the computer. Note that it is a bad practice to keep many different and logically unsorted files in the root directory of any of your volumes. Only system and important service files should be kept there. Although any file may be written all over the disk, access to it will be faster if it is written over the minimum number of cylinders. From time to time, use special programs that reorganize your files in this way. (ABSTRACT TRUNCATED AT 250 WORDS)
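
    As a small illustration of the directory discipline recommended above (layout and names invented, not from the article), a task-oriented tree that groups the graphics packages under a common parent can be created in a few lines of Python:

    from pathlib import Path

    layout = [
        "apps/graphics/plotter",   # graphics packages share one parent...
        "apps/graphics/painter",
        "apps/statistics",         # ...rather than sitting at the same level
        "apps/textproc",           # as statistics, text processors, etc.
        "work/report-1995",        # one directory per task
        "work/ecg-analysis",
    ]
    root = Path("demo_volume")
    for sub in layout:
        (root / sub).mkdir(parents=True, exist_ok=True)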

  17. Computational enzymology.

    PubMed

    Lonsdale, Richard; Ranaghan, Kara E; Mulholland, Adrian J

    2010-04-14

    Molecular simulations and modelling are changing the science of enzymology. Calculations can provide detailed, atomic-level insight into the fundamental mechanisms of biological catalysts. Computational enzymology is a rapidly developing area, and is testing theories of catalysis, challenging 'textbook' mechanisms, and identifying novel catalytic mechanisms. Increasingly, modelling is contributing directly to experimental studies of enzyme-catalysed reactions. Potential practical applications include interpretation of experimental data, catalyst design and drug development.

  18. Quantum Computers

    DTIC Science & Technology

    2010-03-04

    ...empty valence-band states) into a localized potential with discrete energy levels, which is analogous to an electron bound to an atomic nucleus...seminal work, the ideas for implementing quantum computing have diversified, and the DiVincenzo criteria as originally stated are difficult to apply to...many emerging concepts. Here, we rephrase DiVincenzo’s original considerations into three more general criteria; these are stated with the assumption...

  19. Computer Spectrometers

    NASA Astrophysics Data System (ADS)

    Dattani, Nikesh S.

    2017-06-01

    Ideally, the cataloguing of spectroscopic linelists would not demand laborious and expensive experiments. Whatever an experiment might achieve, the same information would be attainable by running a calculation on a computer. Kolos and Wolniewicz were the first to demonstrate that calculations on a computer can outperform even the most sophisticated molecular spectroscopic experiments of the time, when their 1964 calculations of the dissociation energies of H_2 and D_2 were found to be more than 1 cm^{-1} larger than the best experiments by Gerhard Herzberg, suggesting the experiment violated a strict variational principle. As explained in his Nobel Lecture, it took 5 more years for Herzberg to perform an experiment which caught up to the accuracy of the 1964 calculations. Today, numerical solutions to the Schrödinger equation, supplemented with relativistic and higher-order quantum electrodynamics (QED) corrections, can provide ro-vibrational spectra for molecules that we strongly believe to be correct, even in the absence of experimental data. Why do we believe these calculated spectra are correct if we do not have experiments against which to test them? All evidence seen so far suggests that corrections due to gravity or other forces are not needed for a computer simulated QED spectrum of ro-vibrational energy transitions to be correct at the precision of typical spectrometers. Therefore a computer-generated spectrum can be considered to be as good as one coming from a more conventional spectrometer, and this has been shown to be true not just for the H_2 energies back in 1964, but now also for several other molecules. So are we at the stage where we can launch an array of calculations, each with just the atomic number changed in the input file, to reproduce the NIST energy level databases? Not quite. But I will show that for the 6e^- molecule Li_2, we have reproduced the vibrational spacings to within 0.001 cm^{-1} of the experimental spectrum, and I will...

  20. Computer vision

    NASA Technical Reports Server (NTRS)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  1. Computer grants

    NASA Astrophysics Data System (ADS)

    The Computer and Information Science and Engineering Directorate of the National Science Foundation will offer educational supplements to CISE grants in Fiscal Year 1990. The purpose of the supplements is to establish closer links between CISE-supported research and undergraduate education and to accelerate transfer into the classroom of research results from work done under existing research grants. Any principal investigator with an active NSF research award from a program in the CISE Directorate can apply for an educational supplement. Proposals should be for creative activities to improve education, not for research.

  2. Computational crystallization

    PubMed Central

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H.

    2016-01-01

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed. PMID:26792536

  3. Computational crystallization.

    PubMed

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  4. Computed tomography, endoscopic, laparoscopic, and intra-operative sonography for assessing resectability of pancreatic cancer.

    PubMed

    Long, Eliza E; Van Dam, Jacques; Weinstein, Stefanie; Jeffrey, Brooke; Desser, Terry; Norton, Jeffrey A

    2005-08-01

    Pancreas cancer is the fourth leading cancer killer in adults. Cure of pancreas cancer is dependent on the complete surgical removal of localized tumor. A complete surgical resection is dependent on accurate preoperative and intra-operative imaging of tumor and its relationship to vital structures. Imaging of pancreatic tumors preoperatively and intra-operatively is achieved by pancreatic protocol computed tomography (CT), endoscopic ultrasound (EUS), laparoscopic ultrasound (LUS), and intra-operative ultrasound (IOUS). Multi-detector CT with three-dimensional (3-D) reconstruction of images is the most useful preoperative modality to assess resectability. It has a sensitivity and specificity of 90 and 99%, respectively. It is not observer dependent. The images predict operative findings. EUS and LUS have sensitivities of 77 and 78%, respectively. They both have a very high specificity. Further, EUS has the ability to biopsy tumor and obtain a definitive tissue diagnosis. IOUS is a very sensitive (93%) method to assess tumor resectability during surgery. It adds little time and no morbidity to the operation. It greatly facilitates the intra-operative decision-making. In reality, each of these methods adds some information to help in determining the extent of tumor and the surgeon's ability to remove it. We rely on pancreatic protocol CT with 3-D reconstruction and either EUS or IOUS depending on the tumor location and operability of the tumor and patient. With these modern imaging modalities, it is now possible to avoid major operations that only determine an inoperable tumor. With proper preoperative selection, surgery is able to remove tumor in the majority of patients.

  5. Diagnostic performance of fusion of myocardial perfusion imaging (MPI) and computed tomography coronary angiography

    PubMed Central

    Santana, Cesar A.; Garcia, Ernest V.; Faber, Tracy L.; Sirineni, Gopi K. R.; Esteves, Fabio P.; Sanyal, Rupan; Halkar, Raghuveer; Ornelas, Mario; Verdes, Liudmila; Lerakis, Stamatios; Ramos, Julie J.; Aguadé-Bruix, Santiago; Cuéllar, Hugo; Candell-Riera, Jaume; Raggi, Paolo

    2011-01-01

    Background We evaluated the incremental diagnostic value of fusion images of coronary computed tomography angiography (CTA) and myocardial perfusion imaging (MPI) over MPI alone or MPI and CTA side-by-side to identify obstructive coronary artery disease (CAD > 50% stenosis), using invasive coronary angiography (ICA) as the gold standard. Methods 50 subjects (36 men; 56 ± 11 years old) underwent rest-stress MPI and CTA within 12-26 days of each other. CTAs were performed with multi-detector CT scanners (31 on 64-slice and 19 on 16-slice systems). 37 patients underwent ICA, while 13 subjects did not because of low (<5%) pre-test likelihood (LLK) of disease. Three blinded readers scored the images in sequential sessions using (1) MPI alone, (2) MPI and CTA side-by-side, and (3) fused CTA/MPI images. Results One or more critical stenoses during ICA were found in 28 patients and non-critical stenoses were found in 9 patients. MPI, side-by-side MPI-CTA, and fused CTA/MPI showed the same normalcy rate (NR: 13/13) in LLK subjects. The fusion technique performed better than MPI alone and MPI and CTA side-by-side for the presence of CAD in any vessel (overall area under the curve (AUC) for fused images: 0.89; P = .005 vs MPI, P = .04 vs side-by-side MPI-CTA) and for localization of CAD to the left anterior descending coronary artery (AUC: 0.82, P < .001 vs MPI; P = .007 vs side-by-side MPI-CTA). There was a non-significant trend for better detection of multi-vessel disease with fusion. Conclusions Using ICA as the gold standard, fusion imaging provided incremental diagnostic information compared to MPI alone or side-by-side MPI-CTA for the diagnosis of obstructive CAD and for localization of CAD to the left anterior descending coronary artery. PMID:19156478

  6. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    NASA Astrophysics Data System (ADS)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

    While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature 'correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10^-4). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
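
    For readers unfamiliar with the texture measure, the GLCM 'correlation' feature can be computed with scikit-image as sketched below; the distances, angles, gray-level quantization, and the random stand-in ROI are illustrative assumptions, not the study's protocol.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older scikit-image

    rng = np.random.default_rng(0)
    roi = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)  # stand-in for a BMD-calibrated trabecular ROI

    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)
    correlation = graycoprops(glcm, 'correlation').mean()  # the best-performing feature above
    mean_bmd = roi.mean()                                  # the baseline measure it outperformed
    print(correlation, mean_bmd)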

  7. Ablation margin assessment of liver tumors with intravenous contrast-enhanced C-arm computed tomography

    PubMed Central

    Kim, Mi Sung; Singh, Sarabjeet; Halpern, Elkan; Saini, Sanjay; Kalra, Mannudeep K

    2012-01-01

    AIM: To determine the influence of anthropomorphic parameters on the relationship between patient centering, mean computed tomography (CT) numbers and quantitative image noise in abdominal CT. METHODS: Our Institutional Review Board approved study included 395 patients (age range 21-108, years; male:female = 195:200) who underwent contrast-enhanced abdominal CT on a 16-section multi-detector row scanner (GE LightSpeed 16). Patient centering in the gantry isocenter was measured from the lateral localizer radiograph (off center S = patient off centered superior to isocenter; off center I = patient off centered inferior to isocenter). Mean CT numbers (Hounsfield Units: HU) and noise (standard deviation of CT numbers: SD) were measured in the anterior (aHU, aSD) and posterior (pHU, pSD) abdominal wall subcutaneous fat and liver parenchyma (LivHU, LivSD) at the level of the porta hepatis. Patients’ age, gender, weight, body mass index and maximal anteroposterior diameter were recorded. The data were analyzed using linear regression analysis. RESULTS: Most patients (81%; 320/395) were not correctly centered in the gantry isocenter for abdominal CT scanning. Mean CT numbers in the abdominal wall increased significantly with an increase in the off-centering distance, regardless of the direction of the off-center (P < 0.05). There was a substantial increase in pSD (P = 0.01) and LivSD (P = 0.017) with off-centering. Change in mean CT numbers and image noise along the off-center distance was influenced by the patient size (P < 0.01). CONCLUSION: Inappropriate patient centering for CT scanning adversely affects the reliability of mean CT numbers and image noise. PMID:22468191

  8. Ultrasonography in the diagnosis of nasal bone fractures: a comparison with conventional radiography and computed tomography.

    PubMed

    Lee, In Sook; Lee, Jung-Hoon; Woo, Chang-Ki; Kim, Hak Jin; Sol, Yu Li; Song, Jong Woon; Cho, Kyu-Sup

    2016-02-01

    The purpose of this study was to evaluate and compare the diagnostic efficacy of ultrasonography (US) with radiography and multi-detector computed tomography (CT) for the detection of nasal bone fractures. Forty-one patients with a nasal bone fracture who underwent prospective US examinations were included. Plain radiographs and CT images were obtained on the day of trauma. For US examinations, the radiologist used a linear array transducer (L17-5 MHz) in 24 patients and a hockey-stick probe (L15-7 MHz) in 17. The bony component of the nose was divided into three parts (right and left lateral nasal walls, and midline of nasal bone). Fracture detection by the three modalities was subjected to analysis. Furthermore, findings made by each modality were compared with intraoperative findings. Nasal bone fractures were located in the right lateral wall (n = 28), midline of nasal bone (n = 31), or left lateral wall (n = 31). For right and left lateral nasal walls, CT had greater sensitivity and specificity than US or radiography, and agreed better with intraoperative findings. However, for midline fractures of nasal bone, US had higher specificity, positive predictive value, and negative predictive value than CT. Although the two US evaluations showed good agreement at all three sites, US findings obtained by the hockey-stick probe showed closer agreement with intraoperative findings for both the lateral nasal wall and the midline of nasal bone. Although CT showed higher sensitivity and specificity than US or radiography, US was found to be helpful for evaluating the midline of nasal bone. Furthermore, for US examinations of the nasal bone, a smaller probe and higher frequency may be required.

  9. Computer vision

    SciTech Connect

    Not Available

    1982-01-01

    This paper discusses material from areas such as artificial intelligence, psychology, computer graphics, and image processing. The intent is to assemble a selection of this material in a form that will serve both as a senior/graduate-level academic text and as a useful reference to those building vision systems. This book has a strong artificial intelligence flavour, emphasising the belief that both the intrinsic image information and the internal model of the world are important in successful vision systems. The book is organised into four parts, based on descriptions of objects at four different levels of abstraction. These are: generalised images-images and image-like entities; segmented images-images organised into subimages that are likely to correspond to interesting objects; geometric structures-quantitative models of image and world structures; relational structures-complex symbolic descriptions of image and world structures. The book contains author and subject indexes.

  10. Computational micromechanics

    NASA Astrophysics Data System (ADS)

    Ortiz, M.

    1996-09-01

    Selected issues in computational micromechanics are reviewed, with particular emphasis on multiple-scale problems and micromechanical models of material behavior. Examples considered include: the bridging of atomistic and continuum scales, with application to nanoindentation and the brittle-to-ductile transition; the development of dislocation-based constitutive relations for pure metallic crystals and intermetallic compounds, with applications to fracture of single crystals and bicrystals; the simulation of non-planar three-dimensional crack growth at the microscale, with application to mixed-mode I/III effective behavior and crack trapping and bridging in fiber-reinforced composites; and the direct micromechanical simulation of fragmentation of brittle solids and subsequent flow of the comminuted phase.

  11. Computational introspection

    SciTech Connect

    Batali, J.

    1983-02-01

    Introspection is the process of thinking about one's own thoughts and feelings. In this paper, the author discusses recent attempts to make computational systems that exhibit introspective behavior. Each attempt presents a system capable of manipulating representations of its own program and current context. He argues that introspective ability is crucial for intelligent systems--without it an agent cannot represent certain problems that it must be able to solve. A theory of intelligent action would describe how and why certain actions intelligently achieve an agent's goals. The agent would both embody and represent this theory: it would be implemented as the program for the agent, and the importance of introspection suggests that the agent represent its theory of action to itself.

  12. Numerical Analysis of Organ Doses Delivered During Computed Tomography Examinations Using Japanese Adult Phantoms with the WAZA-ARI Dosimetry System.

    PubMed

    Takahashi, Fumiaki; Sato, Kaoru; Endo, Akira; Ono, Koji; Ban, Nobuhiko; Hasegawa, Takayuki; Katsunuma, Yasushi; Yoshitake, Takayasu; Kai, Michiaki

    2015-08-01

    A dosimetry system for computed tomography (CT) examinations, named WAZA-ARI, is being developed to accurately assess radiation doses to patients in Japan. For dose calculations in WAZA-ARI, organ doses were numerically analyzed using average adult Japanese male (JM) and female (JF) phantoms with the Particle and Heavy Ion Transport code System (PHITS). Experimental studies clarified the energy distribution of the emitted photons and the dose profiles on the table for some multi-detector row CT (MDCT) devices. Numerical analyses using a source model in PHITS could specifically take into account emissions of x rays from the tube to the table with attenuation of photons through a beam-shaping filter for each MDCT device based on the experimental results. The source model was validated by measuring the CT dose index (CTDI). Numerical analyses with PHITS revealed a concordance of organ doses with body sizes of the JM and JF phantoms. The organ doses in the JM phantoms were compared with data obtained using previously developed systems. In addition, the dose calculations in WAZA-ARI were verified against previously reported results for realistic NUBAS phantoms and against radiation dose measurements using a physical Japanese model (THRA1 phantom). The results imply that numerical analyses using the Japanese phantoms and specified source models can give reasonable dose estimates for MDCT devices for typical Japanese adults.

  13. SENSITIVITY OF ENDOSCOPIC ULTRASOUND, MULTIDETECTOR COMPUTER TOMOGRAPHY AND MAGNETIC RESONANCE CHOLANGIOPANCREATOGRAPHY IN THE DIAGNOSIS OF PANCREAS DIVISUM: A TERTIARY CENTER EXPERIENCE

    PubMed Central

    Kushnir, Vladimir M.; Wani, Sachin B.; Fowler, Kathryn; Menias, Christine; Varma, Rakesh; Narra, Vamsi; Hovis, Christine; Murad, Faris; Mullady, Daniel; Jonnalagadda, Sreenivasa S.; Early, Dayna S.; Edmundowicz, Steven A.; Azar, Riad R.

    2014-01-01

    OBJECTIVES There are limited data comparing imaging modalities in the diagnosis of pancreas divisum. We aimed to: 1. Evaluate the sensitivity of endoscopic ultrasound (EUS), magnetic resonance cholangiopancreatography (MRCP) and multi-detector computed tomography (MDCT) for pancreas divisum. 2. Assess interobserver agreement (IOA) among expert radiologists for detecting pancreas divisum on MDCT and MRCP. METHODS For this retrospective cohort study, we identified 45 consecutive patients with pancreaticobiliary symptoms and pancreas divisum established by endoscopic retrograde pancreatography (ERP) who underwent EUS and cross-sectional imaging. The control group was composed of patients without pancreas divisum who underwent ERP and cross-sectional imaging. RESULTS The sensitivity of EUS for pancreas divisum was 86.7%, significantly higher than sensitivity reported in the medical records for MDCT (15.5%) or MRCP (60%) [p<0.001 for each]. On review by expert radiologists the sensitivity of MDCT increased to 83.3% in cases where the pancreatic duct was visualized, with fair IOA (κ=0.34). Expert review of MRCPs did not identify any additional cases of pancreas divisum; IOA was moderate (κ=0.43). CONCLUSIONS EUS is a sensitive test for diagnosing pancreas divisum and is superior to MDCT and MRCP. Review of MDCT studies by expert radiologists substantially raises its sensitivity for pancreas divisum. PMID:23211370

  14. Changes in entrance surface dose in relation to the location of shielding material in chest computed tomography

    NASA Astrophysics Data System (ADS)

    Kang, Y. M.; Cho, J. H.; Kim, S. C.

    2015-07-01

    This study examined the entrance surface dose (ESD) to the abdomen and pelvis of the patient during a chest computed tomography (CT) procedure, and evaluated the ESD reduction achieved depending on the location of the radiation shield. For the CT scanner, a 64-slice multi-detector computed tomography system was used. The Alderson radiation therapy phantom and an optically stimulated luminescence dosimeter (OSLD), which enabled measurement from low to high dose, were also used. For measurement of radiation dose, the slice number from 9 to 21 of the phantom was set as the test range, which covered the region from the lung apex to both costophrenic angles. A total of 10 OSLD nanoDots were attached for measurement of the front and rear ESD. Cyclic tests were performed using the low-dose chest CT and high-resolution CT (HRCT) protocols on the following set-ups: without shielding; shielding only on the front side; shielding only on the rear side; and shielding for both front and rear sides. According to the test results, ESD for both front and rear sides was higher in HRCT than in low-dose CT when radiation shielding was not used. It was also determined that, compared to the set-up that did not use the radiation shield, locating the radiation shield on the front side was effective in reducing front ESD, while locating the radiation shield on the rear side reduced the rear ESD level. Shielding both the front and rear sides resulted in ESD reduction. In conclusion, it was confirmed that shielding the front and rear sides was the most effective method to reduce the ESD effect caused by scattered radiation during scanning.

  15. Using New Fission Data with the Multi-detector Analysis System for Spent Nuclear Fuel

    SciTech Connect

    A. V. Ramayya; A.V. Daniel; C. J. Beyer; E. L. Reber; G. M. Ter-Akopian; G.S. Popeko; J. D. Cole; J. H. Hamilton; J. K. Jewell; M. W. Drigert; R. Aryaeinejad; Ts.Yu. Oganessian

    1998-11-01

    New experiments using an array of high-purity germanium detectors and fast liquid scintillation detectors have been performed to observe the radiation emitted from the induced fission of 235U with a beam of thermal neutrons. The experiment was performed at the Argonne National Laboratory Intense Pulsed Neutron Source. Preliminary observations of the data are presented. A nondestructive analysis system for the characterization of DOE spent nuclear fuel based on these new data is presented.

  16. Using New Fission Data with the Multi-detector Analysis System for Spent Nuclear Fuel

    SciTech Connect

    Cole, Jerald Donald

    1998-11-01

    New experiments using an array of high-purity germanium detectors and fast liquid scintillation detectors have been performed to observe the radiation emitted from the induced fission of 235U with a beam of thermal neutrons. The experiment was performed at the Argonne National Laboratory Intense Pulsed Neutron Source. Preliminary observations of the data are presented. A nondestructive analysis system for the characterization of DOE spent nuclear fuel based on these new data is presented.

  17. A multi-detector, digitizer based neutron depth profiling device for characterizing thin film materials

    SciTech Connect

    Mulligan, P. L.; Cao, L. R.; Turkoglu, D.

    2012-07-15

    Neutron depth profiling (NDP) is a mature, nondestructive technique used to characterize the concentration of certain light isotopes in a material as a function of depth by measuring the residual energy of charged particles in neutron induced reactions. Historically, NDP has been performed using a single detector, resulting in low intrinsic detection efficiency, and limiting the technique largely to high flux research reactors. In this work, we describe a new NDP instrument design with higher detection efficiency by way of spectrum summing across multiple detectors. Such a design is capable of acquiring a statistically significant charged particle spectrum at facilities limited in neutron flux and operation time.
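
    The statistical gain from spectrum summing can be shown with a short sketch (synthetic Poisson counts, not NDP data): spectra from several detectors sharing one energy binning are accumulated channel by channel, so N detectors collect roughly N times the counts of a single detector in the same acquisition time.

    import numpy as np

    n_detectors, n_channels = 8, 1024
    rng = np.random.default_rng(1)
    spectra = rng.poisson(lam=2.0, size=(n_detectors, n_channels))  # per-detector charged-particle spectra

    summed = spectra.sum(axis=0)           # the combined, statistically stronger spectrum
    print(spectra[0].sum(), summed.sum())  # single-detector vs summed total counts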

  18. A multi-detector, digitizer based neutron depth profiling device for characterizing thin film materials

    NASA Astrophysics Data System (ADS)

    Mulligan, P. L.; Cao, L. R.; Turkoglu, D.

    2012-07-01

    Neutron depth profiling (NDP) is a mature, nondestructive technique used to characterize the concentration of certain light isotopes in a material as a function of depth by measuring the residual energy of charged particles in neutron induced reactions. Historically, NDP has been performed using a single detector, resulting in low intrinsic detection efficiency, and limiting the technique largely to high flux research reactors. In this work, we describe a new NDP instrument design with higher detection efficiency by way of spectrum summing across multiple detectors. Such a design is capable of acquiring a statistically significant charged particle spectrum at facilities limited in neutron flux and operation time.

  19. A multi-detector, digitizer based neutron depth profiling device for characterizing thin film materials.

    PubMed

    Mulligan, P L; Cao, L R; Turkoglu, D

    2012-07-01

    Neutron depth profiling (NDP) is a mature, nondestructive technique used to characterize the concentration of certain light isotopes in a material as a function of depth by measuring the residual energy of charged particles in neutron induced reactions. Historically, NDP has been performed using a single detector, resulting in low intrinsic detection efficiency, and limiting the technique largely to high flux research reactors. In this work, we describe a new NDP instrument design with higher detection efficiency by way of spectrum summing across multiple detectors. Such a design is capable of acquiring a statistically significant charged particle spectrum at facilities limited in neutron flux and operation time.

  20. A multi-detector neutron spectrometer with nearly isotropic response for environmental and workplace monitoring

    NASA Astrophysics Data System (ADS)

    Gómez-Ros, J. M.; Bedogni, R.; Moraleda, M.; Delgado, A.; Romero, A.; Esposito, A.

    2010-01-01

    This communication describes an improved design for a neutron spectrometer consisting of 6Li thermoluminescent dosemeters located at selected positions within a single moderating polyethylene sphere. The spatial arrangement of the dosemeters has been designed using the MCNPX Monte Carlo code to calculate the response matrix for 56 log-equidistant energies from 10^-9 to 100 MeV, looking for a configuration that yields a nearly isotropic response for neutrons in the energy range from thermal to 20 MeV. The feasibility of the proposed spectrometer and the isotropy of its response have been evaluated by simulating exposures to different reference and workplace neutron fields. The FRUIT code has been used for unfolding purposes. The results of the simulations as well as the experimental tests confirm the suitability of the prototype for environmental and workplace monitoring applications.
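
    To make the folding step concrete (an illustration only: the response matrix and spectrum below are random placeholders, not the MCNPX-computed response or a real field), the 56 log-equidistant energies are reproduced and the expected dosemeter readings follow as a matrix-vector product, which is what an unfolding code such as FRUIT inverts.

    import numpy as np

    energies = np.logspace(-9, 2, 56)   # MeV: 56 log-equidistant points from 1e-9 to 100 MeV
    n_positions = 6                     # assumed number of 6Li dosemeter positions

    rng = np.random.default_rng(2)
    R = rng.random((n_positions, energies.size))  # placeholder response matrix (reading per unit fluence)
    phi = rng.random(energies.size)               # placeholder fluence spectrum

    readings = R @ phi                  # expected readings; unfolding recovers phi from these
    print(readings.round(2))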

  1. Real-time operating system for a multi-laser/multi-detector system

    NASA Technical Reports Server (NTRS)

    Coles, G.

    1980-01-01

    The laser-one hazard detector system, used on the Rensselaer Mars rover, is reviewed briefly with respect to the hardware subsystems, the operation, and the results obtained. A multidetector scanning system was designed to improve on the original system. Interactive support software was designed and programmed to implement real time control of the rover or platform with the elevation scanning mast. The formats of both the raw data and the post-run data files were selected. In addition, the interface requirements were selected and some initial hardware-software testing was completed.

  2. Sub-10-Minute Characterization of an Ultrahigh Molar Mass Polymer by Multi-detector Hydrodynamic Chromatography

    USDA-ARS?s Scientific Manuscript database

    Molar mass averages, distributions, and architectural information of polymers are routinely obtained using size-exclusion chromatography (SEC). It has previously been shown that ultrahigh molar mass polymers may experience degradation during SEC analysis, leading to inaccurate molar mass averages a...

  3. Using air bronchograms on multi-detector CT to predict the invasiveness of small lung adenocarcinoma.

    PubMed

    Zhang, Yu; Qiang, Jin Wei; Shen, Yan; Ye, Jian Ding; Zhang, Jie; Zhu, Lei

    2016-03-01

    To investigate the prevalence of multidetector CT (MDCT) air bronchograms and their value in predicting the invasiveness of lung adenocarcinomas. MDCT scans of 606 nodules in 582 patients with a lung adenocarcinoma less than 2 cm in diameter confirmed by surgery and pathology were reviewed. Air bronchograms were classified into three patterns: type I, bronchus with intact lumen; type II, bronchus with dilated or tortuous lumen; and type III, bronchus with obstructed lumen. Air bronchograms were demonstrated on MDCT in 210 of 606 (34.7%) lung adenocarcinomas with 16.6% (35/211) preinvasive lesions (PL), 30.5% (50/164) minimally invasive adenocarcinoma (MIA), and 54.1% (125/231) invasive adenocarcinoma (IAC) (P=0.000); 18.3% (44/240) pure ground-glass nodules (GGNs), 44.2% (137/310) mixed GGNs, and 51.8% (29/56) solid nodules (P=0.000). Type I was slightly more common in MIA (36/164, 22.0%) than IAC (40/231, 17.3%) and PL (30/211, 14.2%) but without differences among them (P=0.147). Type II (PL: 5/211, 2.4%; MIA: 13/164, 7.9%; IAC: 53/231, 22.9%) and type III (PL: 0/211; MIA: 1/164, 0.6%; IAC: 32/231, 13.9%) were observed more frequently with increasing lung adenocarcinoma invasiveness (both P=0.000). The prevalence and patterns of air bronchograms on MDCT can predict the invasiveness of small lung adenocarcinomas. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. CyberKnife beam output factor measurements: A multi-site and multi-detector study.

    PubMed

    Masi, Laura; Russo, Serenella; Francescon, Paolo; Doro, Raffaela; Frassanito, Maria Cristina; Fumagalli, Maria Luisa; Reggiori, Giacomo; Marinelli, Marco; Redaelli, Irene; Pimpinella, Maria; Verona Rinati, Gianluca; Siragusa, Carmelo; Vigorito, Sabrina; Mancosu, Pietro

    2016-12-01

    New promising detectors are available for measuring small field size output factors (OFs). This study focused on a multicenter evaluation of two new generation detectors for OF measurements on CyberKnife systems. The PTW-60019 microDiamond and W1 plastic scintillation detector (PSD) were used to measure OFs on eight CyberKnife units of various generations for 5-60 mm fixed cones. MicroDiamond and PSD OFs were compared to routinely used silicon diode data corrected by applying published Monte Carlo (MC) factors. PSD data were corrected for the Čerenkov Light Ratio (CLR). The uncertainties related to CLR determination were estimated. Considering OF values averaged over all centers, the differences between the MC corrected diode and the other two detectors were within 1.5%. MicroDiamond exhibited an over-response of 1.3% at 7.5 mm and a trend inversion at 5 mm with a difference of 0.2%. This behavior was consistent among the different units. OFs measured by PSD slightly under-responded compared to the MC corrected diode for the smaller cones, and the differences were within 1%. The observed CLR variability was 2.5% and the related variation in OF values was 1.9%. This study indicates that CyberKnife microDiamond OFs require corrections below 2%. The results are enhanced by the consistency observed among different units. The scintillator shows good agreement with the MC corrected diode, but CLR determination remains critical and requires further investigation. The results emphasized the value of a multi-center validation over a single center approach. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
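
    In outline (a simplified sketch with invented numbers, not measured data or the exact small-field formalism), a raw output factor is the field-to-reference reading ratio, small-field detector values are multiplied by a detector- and field-size-specific Monte Carlo correction factor, and a shift in the assumed CLR propagates almost directly into the PSD result:

    def output_factor(m_field, m_ref, k_corr=1.0):
        # OF = (M_field / M_ref) * k, with k an MC-derived correction factor
        return (m_field / m_ref) * k_corr

    of_diode = output_factor(0.70, 1.00, k_corr=0.95)  # diode reading with an assumed MC correction
    of_psd = output_factor(0.66, 1.00)                 # PSD reading, CLR-corrected upstream
    of_psd_clr_shifted = of_psd * 1.019                # effect of the ~1.9% CLR-driven OF shift noted above
    print(of_diode, of_psd, of_psd_clr_shifted)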

  5. Specialized computer architectures for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  7. Program Facilitates Distributed Computing

    NASA Technical Reports Server (NTRS)

    Hui, Joseph

    1993-01-01

    KNET computer program facilitates distribution of computing between UNIX-compatible local host computer and remote host computer, which may or may not be UNIX-compatible. Capable of automatic remote log-in. User communicates interactively with remote host computer. Data output from remote host computer directed to local screen, to local file, and/or to local process. Conversely, data input from keyboard, local file, or local process directed to remote host computer. Written in ANSI standard C language.

  8. Computing the Profession.

    ERIC Educational Resources Information Center

    Denning, Peter J.

    1998-01-01

    Discussion of computing as a science and profession examines the chasm between computer scientists and users, barriers to the use and growth of computing, experimental computer science, computational science, software engineering, professional identity, professional practices, applications of technology, innovation, field boundaries, and…

  9. The assumptions of computing

    SciTech Connect

    Huggins, J.K.

    1994-12-31

    The use of computers, like any technological activity, is not content-neutral. Users of computers constantly interact with assumptions regarding worthwhile activity which are embedded in any computing system. Directly questioning these assumptions in the context of computing allows us to develop an understanding of responsible computing.

  10. Tying into Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    Topics in this paper include: sources of computer programs, public domain software, copyright violations, purposes of computers in classrooms (drill/practice and interactive learning), computer assisted instruction, flow charts, and computer clubs (such as App-le-kations in Charlotte, North Carolina). A complete listing of two computer programs…

  11. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  14. Overview of Computer Hardware.

    ERIC Educational Resources Information Center

    Tidball, Charles S.

    1980-01-01

    Reviews development in electronics technology of digital computers, considering the binary number representation, miniaturization of electronic components, cost and space requirements of computers, ways in which computers are used, and types of computers appropriate for teaching computer literacy and demonstrating physiological simulation. (CS)

  15. Computers for Everybody.

    ERIC Educational Resources Information Center

    Willis, Jerry; Miller, Merl

    This book explains how computers can be used in the home, office or school, and provides a consumer's guide to computer equipment for the novice user. The first sections of the book offer a brief sketch of computer history, a listing of entertaining and easily available computer programs, a step-by-step guide to buying a computer, and advice on…

  16. The Old Computers' Home.

    ERIC Educational Resources Information Center

    Angier, Natalie

    1983-01-01

    The Computer Museum in Marlborough, Massachusetts houses old and not-so-old calculators, famous old computers and parts of computers, photographs and assorted memorabilia, computer-generated murals, and even a computer made of Tinkertoys that plays tick-tack-toe. The development of the museum and selected exhibits is described. (Author/JN)

  17. CAA: Computer Assisted Athletics.

    ERIC Educational Resources Information Center

    Hall, John H.

    Computers have been used in a variety of applications for athletics since the late 1950's. These have ranged from computer-controlled electric scoreboards to computer-designed pole vaulting poles. Described in this paper are a computer-based athletic injury reporting system and a computer-assisted football scouting system. The injury reporting…

  18. Computational thinking and thinking about computing

    PubMed Central

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing. PMID:18672462

  19. The New Administrative Computing.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    1988-01-01

    The past decade has seen dramatic changes in administrative computing, including more systems, more applications, a new group of computer users, and new opportunities for computer use in campus administration. (Author/MSE)

  20. Environmentalists and the Computer.

    ERIC Educational Resources Information Center

    Baron, Robert C.

    1982-01-01

    Reviews characteristics, applications, and limitations of computers, including word processing, data/record keeping, scientific and industrial, and educational applications. Discusses misuse of computers and the role of computers in environmental management. (JN)

  1. Computers: Instruments of Change.

    ERIC Educational Resources Information Center

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  2. On Teaching Computer Programming.

    ERIC Educational Resources Information Center

    Er, M. C.

    1984-01-01

    Points out difficulties associated with teaching introductory computer programming courses, discusses the importance of computer programming, and explains activities associated with its use. Possible solutions to help teachers resolve problem areas in computer instruction are also offered. (ML)

  3. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  4. Computer Literacy Revisited.

    ERIC Educational Resources Information Center

    Klassen, Daniel

    1983-01-01

    This examination of important trends in the field of computing and education identifies and discusses four key computer literacy goals for educators as well as obstacles facing educators concerned with computer literacy. Nine references are listed. (Author/MBR)

  5. Avoiding Computer Viruses.

    ERIC Educational Resources Information Center

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  6. Computer Viruses: An Overview.

    ERIC Educational Resources Information Center

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  7. PR Educators Stress Computers.

    ERIC Educational Resources Information Center

    Fleming, Charles A.

    1988-01-01

    Surveys the varied roles computers play in public relations education. Asserts that, because computers are used extensively in the public relations field, students should become acquainted with the varied capabilities of computers and their role in public relations practice. (MM)

  8. Computers and the landscape

    Treesearch

    Gary H. Elsner

    1979-01-01

    Computers can analyze and help to plan the visual aspects of large wildland landscapes. This paper categorizes and explains current computer methods available. It also contains a futuristic dialogue between a landscape architect and a computer.

  12. Computers and Employment.

    ERIC Educational Resources Information Center

    McConnell, Sheila; And Others

    1996-01-01

    Includes "Role of Computers in Reshaping the Work Force" (McConnell); "Semiconductors" (Moris); "Computer Manufacturing" (Warnke); "Commercial Banking Transformed by Computer Technology" (Morisi); "Software, Engineering Industries: Threatened by Technological Change?" (Goodman); "Job Creation…

  14. Brain perfusion imaging using a Reconstruction-of-Difference (RoD) approach for cone-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Mow, M.; Zbijewski, W.; Sisniega, A.; Xu, J.; Dang, H.; Stayman, J. W.; Wang, X.; Foos, D. H.; Koliatsos, V.; Aygun, N.; Siewerdsen, J. H.

    2017-03-01

    Purpose: To improve the timely detection and treatment of intracranial hemorrhage or ischemic stroke, recent efforts include the development of cone-beam CT (CBCT) systems for perfusion imaging and new approaches to estimate perfusion parameters despite slow rotation speeds compared to multi-detector CT (MDCT) systems. This work describes development of a brain perfusion CBCT method using a reconstruction-of-difference (RoD) approach to enable perfusion imaging on a newly developed CBCT head scanner prototype. Methods: A new reconstruction approach using RoD with a penalized-likelihood framework was developed to image the temporal dynamics of vascular enhancement. A digital perfusion simulation was developed to give a realistic representation of brain anatomy, artifacts, noise, scanner characteristics, and hemodynamic properties. This simulation includes a digital brain phantom, time-attenuation curves and noise parameters, a novel forward projection method for improved computational efficiency, and perfusion parameter calculation. Results: Our results show the feasibility of estimating perfusion parameters from a set of images reconstructed from slow scans, sparse data sets, and arc length scans as short as 60 degrees. The RoD framework significantly reduces noise and time-varying artifacts from inconsistent projections. Proper regularization and the use of overlapping reconstructed arcs can potentially further decrease bias and increase temporal resolution, respectively. Conclusions: A digital brain perfusion simulation with the RoD imaging approach has been developed and supports the feasibility of using a CBCT head scanner for perfusion imaging. Future work will include testing with data acquired using a 3D-printed perfusion phantom, and translation to preclinical and clinical studies.
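
    To make the reconstruction-of-difference idea above concrete, the following toy sketch recovers only the change relative to a known baseline from a regularized linear system. The forward model A, the Tikhonov penalty, and all array sizes are illustrative stand-ins, not the authors' penalized-likelihood implementation.

      import numpy as np

      # Toy reconstruction-of-difference (RoD): recover the CHANGE relative to a
      # known baseline image rather than re-reconstructing the full image.
      rng = np.random.default_rng(0)
      n_pix, n_meas = 64, 48                 # unknowns vs. measurements (sparse data)
      A = rng.normal(size=(n_meas, n_pix))   # toy linear projector (CT stand-in)
      x0 = rng.uniform(size=n_pix)           # known baseline (pre-contrast) image
      dx_true = np.zeros(n_pix)
      dx_true[20:28] = 0.5                   # localized contrast enhancement
      y = A @ (x0 + dx_true)                 # measured projections of enhanced object

      # Reconstruct the difference from the projection residual y - A @ x0,
      # with simple Tikhonov regularization standing in for the penalty term.
      lam = 0.1
      dx_est = np.linalg.solve(A.T @ A + lam * np.eye(n_pix),
                               A.T @ (y - A @ x0))
      print("max error in recovered difference:", abs(dx_est - dx_true).max())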

  15. Computing technology in the 1980's. [computers

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  16. Computer hardware fault administration

    DOEpatents

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
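
    A minimal sketch of the routing idea in this claim, assuming toy adjacency maps and a made-up defective link: a path is sought in the first network while skipping the defective link, and traffic falls back to the second, independent network when no such path exists.

      from collections import deque

      # Two independent networks over the same four compute nodes.
      network_a = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
      network_b = {0: [2], 1: [3], 2: [0, 3], 3: [1, 2]}
      defective = {(2, 3), (3, 2)}   # identified defective link, both directions

      def route(net, src, dst, bad=frozenset()):
          """Breadth-first search for a path, skipping any links listed in bad."""
          prev, queue = {src: None}, deque([src])
          while queue:
              u = queue.popleft()
              if u == dst:                      # walk back to recover the path
                  path = []
                  while u is not None:
                      path.append(u)
                      u = prev[u]
                  return path[::-1]
              for v in net[u]:
                  if (u, v) not in bad and v not in prev:
                      prev[v] = u
                      queue.append(v)
          return None

      # Try the first network around the fault, then fall back to the second.
      path = route(network_a, 0, 3, bad=defective) or route(network_b, 0, 3)
      print("route:", path)   # -> route: [0, 2, 3], via the second network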

  17. Computer Lab Configuration.

    ERIC Educational Resources Information Center

    Wodarz, Nan

    2003-01-01

    Describes the layout and elements of an effective school computer lab. Includes configuration, storage spaces, cabling and electrical requirements, lighting, furniture, and computer hardware and peripherals. (PKP)

  18. Quantum computational supremacy

    NASA Astrophysics Data System (ADS)

    Harrow, Aram W.; Montanaro, Ashley

    2017-09-01

    The field of quantum algorithms aims to find ways to speed up the solution of computational problems by using a quantum computer. A key milestone in this field will be when a universal quantum computer performs a computational task that is beyond the capability of any classical computer, an event known as quantum supremacy. This would be easier to achieve experimentally than full-scale quantum computing, but involves new theoretical challenges. Here we present the leading proposals to achieve quantum supremacy, and discuss how we can reliably compare the power of a classical computer to the power of a quantum computer.

  20. Undergraduate computational physics projects on quantum computing

    NASA Astrophysics Data System (ADS)

    Candela, D.

    2015-08-01

    Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
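
    For a sense of what such a student-written simulation might look like, here is a minimal state-vector sketch of Grover's search in NumPy. The qubit count and marked item are arbitrary, and the code is illustrative rather than taken from the article.

      import numpy as np

      n = 3                               # qubits
      N = 2 ** n
      marked = 5                          # the index the oracle "marks"

      state = np.full(N, 1 / np.sqrt(N))  # uniform superposition (Hadamards)

      oracle = np.eye(N)
      oracle[marked, marked] = -1         # flip the sign of the marked state
      diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)  # inversion about mean

      for _ in range(int(np.pi / 4 * np.sqrt(N))):   # ~ (pi/4) * sqrt(N) rounds
          state = diffusion @ (oracle @ state)

      probs = state ** 2
      print("most probable outcome:", probs.argmax(), "p =", round(probs.max(), 3))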

  1. Computer Literacy for Teachers.

    ERIC Educational Resources Information Center

    Sarapin, Marvin I.; Post, Paul E.

    Basic concepts of computer literacy are discussed as they relate to industrial arts/technology education. Computer hardware development is briefly examined, and major software categories are defined, including database management, computer graphics, spreadsheet programs, telecommunications and networking, word processing, and computer assisted and…

  2. Overview 1993: Computational applications

    NASA Technical Reports Server (NTRS)

    Benek, John A.

    1993-01-01

    Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort; the second, the Chimera Technology Development, is a transfer of government-developed technology to American industry.

  3. The Glass Computer

    ERIC Educational Resources Information Center

    Paesler, M. A.

    2009-01-01

    Digital computers use different kinds of memory, each of which is either volatile or nonvolatile. On most computers only the hard drive memory is nonvolatile, i.e., it retains all information stored on it when the power is off. When a computer is turned on, an operating system stored on the hard drive is loaded into the computer's memory cache and…

  4. Computers and Conceptual Change.

    ERIC Educational Resources Information Center

    Olson, John

    A systematic study was conducted with a group of 10- to 12-year-olds using computer assisted instruction in a unit on fire which included a computer simulation of combustion. Three research questions were addressed to learn more about how the computer experience challenged the students' preconceptions: what the students thought the computer knew,…

  6. French Computer Terminology.

    ERIC Educational Resources Information Center

    Gray, Eugene F.

    1985-01-01

    Characteristics, idiosyncrasies, borrowings, and other aspects of the French terminology for computers and computer-related matters are discussed and placed in the context of French computer use. A glossary provides French equivalent terms or translations of English computer terminology. (MSE)

  7. The Story of Computers.

    ERIC Educational Resources Information Center

    Meadow, Charles T.

    The aim of this book is to interest young people from the ages of ten to fourteen in computers, particularly to show them that computers are exciting machines, controlled by people, and to dispel myths that computers can do magic. This is not a detailed exposition of computers, nor is it a textbook. It is an attempt to impart flavor and general…

  8. EPICS personal computer evaluation

    SciTech Connect

    Kramper, B.; MacKinnon, B.

    1984-02-01

    This document is an evaluation of five personal computers to be used as intelligent terminals on the beamline control system (EPICS). It is not intended to be a general comment on the computers themselves. Rather, it is an evaluation of these computers for a specific need. Nevertheless, this document should be useful for those considering the acquisition of a personal computer.

  9. Computer Viruses. Technology Update.

    ERIC Educational Resources Information Center

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, and measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  10. My Computer Romance

    ERIC Educational Resources Information Center

    Campbell, Gardner

    2007-01-01

    In this article, the author describes the large role computers have played in his life as a writer. He has been using a computer for nearly twenty years and says that computers have set his writing free; when he started writing, he used only an electric typewriter. He adds that his romance with computers is also a…

  11. Computer Innovations in Education.

    ERIC Educational Resources Information Center

    Molnar, Andrew R.

    Computers in education are put in context by a brief review of current social and technological trends, a short history of the development of computers and the vast expansion of their use, and a brief description of computers and their use. Further chapters describe instructional applications, administrative uses, uses of computers for libraries…

  12. How Computer Graphics Work.

    ERIC Educational Resources Information Center

    Prosise, Jeff

    This document presents the principles behind modern computer graphics without straying into the arcane languages of mathematics and computer science. Illustrations accompany the clear, step-by-step explanations that describe how computers draw pictures. The 22 chapters of the book are organized into 5 sections. "Part 1: Computer Graphics in…

  13. Teach Teachers Computers.

    ERIC Educational Resources Information Center

    Levin, Dan

    1983-01-01

    Suggests eight steps for training teachers to use computers, including establishing a computer committee, structuring and budgeting for inservice courses, cooperating with other schools, and sending some teachers to the computer company's maintenance and repair school. Lists 25 computer skills teachers need and describes California and Minnesota…

  15. Elementary School Computer Literacy.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents lessons for computer literacy instruction in the elementary grades. The first section of the guide includes 22 lessons on hardware, covering such topics as how computers work, keyboarding, word processing, and computer peripherals. The 13 lessons in the second section cover social topics related to the computer,…

  16. What is a Computer?

    ERIC Educational Resources Information Center

    Ball, Marion J.

    This colorfully illustrated book is designed to introduce elementary school students to the world of computers. Computers, their component parts and their capabilities are personified using full page sketches and color drawings. The differences between analog and digital computers are given as is a short history of computer use and development.…

  18. Parallel computing works

    SciTech Connect

    Not Available

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  19. Distributed computing in bioinformatics.

    PubMed

    Jain, Eric

    2002-01-01

    This paper provides an overview of methods and current applications of distributed computing in bioinformatics. Distributed computing is a strategy of dividing a large workload among multiple computers to reduce processing time, or to make use of resources such as programs and databases that are not available on all computers. Participating computers may be connected either through a local high-speed network or through the Internet.
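
    The workload-division strategy described here fits in a few lines of Python; the GC-content task and the sequences below are invented placeholders for a real bioinformatics workload spread over local processes.

      from concurrent.futures import ProcessPoolExecutor

      def gc_content(seq):
          """Fraction of G/C bases in a DNA sequence."""
          return (seq.count("G") + seq.count("C")) / len(seq)

      sequences = ["ATGCGC", "TTTTAA", "GGGCCC", "ATATGC"] * 1000

      if __name__ == "__main__":
          # Divide the sequences among worker processes and gather the results.
          with ProcessPoolExecutor() as pool:
              results = list(pool.map(gc_content, sequences, chunksize=256))
          print("mean GC content:", sum(results) / len(results))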

  20. (Computer vision and robotics)

    SciTech Connect

    Jones, J.P.

    1989-02-13

    The traveler attended the Fourth Aalborg International Symposium on Computer Vision at Aalborg University, Aalborg, Denmark. The traveler presented three invited lectures entitled "Concurrent Computer Vision on a Hypercube Multicomputer," "The Butterfly Accumulator and its Application in Concurrent Computer Vision on Hypercube Multicomputers," and "Concurrency in Mobile Robotics at ORNL," and a ten-minute editorial entitled "Is Concurrency an Issue in Computer Vision?" The traveler obtained information on current R&D efforts elsewhere in concurrent computer vision.

  1. Future Computer Requirements for Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  2. Computers and Computation. Readings from Scientific American.

    ERIC Educational Resources Information Center

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  3. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    SciTech Connect

    2007-06-27

    This CRADA was established at the start of FY02 with $200K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. "No cost" extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  4. Understanding student computational thinking with computational modeling

    NASA Astrophysics Data System (ADS)

    Aiken, John M.; Caballero, Marcos D.; Douglas, Scott S.; Burk, John B.; Scanlon, Erin M.; Thoms, Brian D.; Schatz, Michael F.

    2013-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". 9th Grade students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than observational) terms tended to have more success in the programming exercise.
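
    A bare-bones version of the computational model the students produced might look like the following: Euler integration of a ball in flight, written in plain Python rather than VPython and with made-up initial conditions.

      # Euler integration of projectile motion (gravity only, no drag).
      g = 9.8                  # gravitational acceleration, m/s^2
      dt = 0.01                # time step, s
      x, y = 0.0, 1.0          # initial position, m
      vx, vy = 30.0, 15.0      # initial velocity, m/s

      while y > 0:             # step until the ball returns to the ground
          x += vx * dt         # position updated from velocity...
          y += vy * dt
          vy -= g * dt         # ...velocity updated from the force

      print(f"range: {x:.1f} m")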

  5. Heterogeneous Distributed Computing for Computational Aerosciences

    NASA Technical Reports Server (NTRS)

    Sunderam, Vaidy S.

    1998-01-01

    The research supported under this award focuses on heterogeneous distributed computing for high-performance applications, with particular emphasis on computational aerosciences. The overall goal of this project was to investigate issues in, and develop solutions to, the efficient execution of computational aeroscience codes in heterogeneous concurrent computing environments. In particular, we worked in the context of the PVM [1] system and, subsequent to detailed conversion efforts and performance benchmarking, devised novel techniques to increase the efficacy of heterogeneous networked environments for computational aerosciences. Our work has been based upon the NAS Parallel Benchmark suite, but has also recently expanded in scope to include the NAS I/O benchmarks as specified in the NHT-1 document. In this report we summarize our research accomplishments under the auspices of the grant.

  6. Heterotic computing: exploiting hybrid computational devices.

    PubMed

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications.

  7. Cloud Computing for radiologists

    PubMed Central

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future. PMID:23599560

  9. Computers in manufacturing.

    PubMed

    Hudson, C A

    1982-02-12

    Computers are now widely used in product design and in automation of selected areas in factories. Within the next decade, the use of computers in the entire spectrum of manufacturing applications, from computer-aided design to computer-aided manufacturing and robotics, is expected to be practical and economically justified. Such widespread use of computers on the factory floor awaits further advances in computer capabilities, the emergence of systems that are adaptive to the workplace, and the development of interfaces to link islands of automation and to allow effective user communications.

  10. Computational aerodynamics and design

    NASA Technical Reports Server (NTRS)

    Ballhaus, W. F., Jr.

    1982-01-01

    The role of computational aerodynamics in design is reviewed with attention given to the design process; the proper role of computations; the importance of calibration, interpretation, and verification; the usefulness of a given computational capability; and the marketing of new codes. Examples of computational aerodynamics in design are given with particular emphasis on the Highly Maneuverable Aircraft Technology. Finally, future prospects are noted, with consideration given to the role of advanced computers, advances in numerical solution techniques, turbulence models, complex geometries, and computational design procedures. Previously announced in STAR as N82-33348

  11. Light multinary computing

    NASA Astrophysics Data System (ADS)

    Arago, Jaime

    2012-11-01

    Next-generation optical communication and optical computing imply an evolution from binary to multinary computing. Light multinary computing encodes data using pulses of light components in higher orders than binary and processes it using truth tables larger than Boolean ones. This results in less encoded data that can be processed at faster speeds. We use a general-purpose optical transistor as the building block to develop the main computing units for counting, distributing, storing, and logically operating the arithmetic addition of two bytes of base-10 data. Currently available optical switching technologies can be used to physically implement light multinary computing to achieve ultra-high-speed communication and computing.
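
    The arithmetic those units would implement, digit-wise base-10 addition with carries counted, distributed, and stored, can be illustrated in ordinary software; the function below sketches the logic only, not the optical hardware.

      def add_base10(a_digits, b_digits):
          """Add two numbers given as equal-length lists of base-10 digits,
          least significant digit first."""
          result, carry = [], 0
          for da, db in zip(a_digits, b_digits):
              total = da + db + carry
              result.append(total % 10)   # digit kept in this position
              carry = total // 10         # carry passed to the next position
          if carry:
              result.append(carry)
          return result

      # 478 + 365 = 843, digits least-significant-first
      print(add_base10([8, 7, 4], [5, 6, 3]))   # -> [3, 4, 8]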

  13. Scientific Grid computing.

    PubMed

    Coveney, Peter V

    2005-08-15

    We introduce a definition of Grid computing which is adhered to throughout this Theme Issue. We compare the evolution of the World Wide Web with current aspirations for Grid computing and indicate areas that need further research and development before a generally usable Grid infrastructure becomes available. We discuss work that has been done in order to make scientific Grid computing a viable proposition, including the building of Grids, middleware developments, computational steering and visualization. We review science that has been enabled by contemporary computational Grids, and associated progress made through the widening availability of high performance computing.

  14. Polymorphous computing fabric

    DOEpatents

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  15. Computers in Manufacturing.

    ERIC Educational Resources Information Center

    Hudson, C. A.

    1982-01-01

    Advances in factory computerization (computer-aided design and computer-aided manufacturing) are reviewed, including discussions of robotics, human factors engineering, and the sociological impact of automation. (JN)

  16. Computer Intrusions and Attacks.

    ERIC Educational Resources Information Center

    Falk, Howard

    1999-01-01

    Examines some frequently encountered unsolicited computer intrusions, including computer viruses, worms, Java applications, trojan horses or vandals, e-mail spamming, hoaxes, and cookies. Also discusses virus-protection software, both for networks and for individual users. (LRW)

  17. Computer Crime and Insurance.

    ERIC Educational Resources Information Center

    Beaudoin, Ralph H.

    1985-01-01

    The susceptibility of colleges and universities to computer crime is great. While insurance coverage is available to cover the risks, an aggressive loss-prevention program is the wisest approach to limiting the exposures presented by computer technology. (MLW)

  18. Computational Toxicology (S)

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. Th...

  19. Cognitive Computing for Security.

    SciTech Connect

    Debenedictis, Erik; Rothganger, Fredrick; Aimone, James Bradley; Marinella, Matthew; Evans, Brian Robert; Warrender, Christina E.; Mickel, Patrick

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  20. Novel Applications of Computers

    ERIC Educational Resources Information Center

    Levi, Barbara G.

    1970-01-01

    Presents some novel applications of the computer to physics research. They include (1) a computer program for calculating Compton scattering, (2) speech simulation, (3) data analysis in spectrometry, and (4) measurement of a complex alpha-particle spectrum. Bibliography. (LC)
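
    As an example of the first application, the Compton wavelength shift follows from the standard formula dlambda = (h / (m_e c)) * (1 - cos theta); the sketch below is a generic illustration, not the program described in the article.

      import math

      H = 6.626e-34    # Planck constant, J*s
      M_E = 9.109e-31  # electron rest mass, kg
      C = 2.998e8      # speed of light, m/s

      def compton_shift(theta_rad):
          """Wavelength shift in metres for a photon scattered at angle theta."""
          return H / (M_E * C) * (1 - math.cos(theta_rad))

      # 90-degree scattering shifts the photon by one Compton wavelength,
      # about 2.43e-12 m.
      print(compton_shift(math.pi / 2))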

  2. Moving Beyond Computer Literacy.

    ERIC Educational Resources Information Center

    Rhodes, Lewis A.

    1985-01-01

    This article reviews Sherry Turkle's book, "The Second Self: Computers and the Human Spirit," which explores the subjective impact of the computer on children, adults, and our consciousness and culture in general. Two references are provided. (DCS)

  3. Computer Vision Syndrome.

    PubMed

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  4. Computers in engineering, 1991

    SciTech Connect

    Gupta, G.; Shoup, T.E.

    1991-01-01

    This book covers the following topics: Robotics, Computers in Fluid Mechanics/Thermal Systems, CAD/CAM/CAE, Finite Element Techniques, Computers in Education, Engineering Database Management, and Artificial Intelligence and Expert Systems.

  6. Dedicated to computation

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2017-03-01

    Princeton University astrophysicist David Spergel, founding director of the Center for Computational Astrophysics at the Flatiron Institute, talks to Michael Banks about plans to make New York a major centre for computational science.

  7. Human Computers 1947

    NASA Technical Reports Server (NTRS)

    1947-01-01

    Langley's human computers at work in 1947: the women who performed mathematical computations for the male staff. Photograph published in Winds of Change, 75th Anniversary NASA publication (page 48), by James Schultz.

  8. The Merit Computer Network

    ERIC Educational Resources Information Center

    Aupperle, Eric M.; Davis, Donna L.

    1978-01-01

    The successful Merit Computer Network is examined in terms of both technology and operational management. The network is fully operational and has a significant and rapidly increasing usage, with three major institutions currently sharing computer resources. (Author/CMV)

  9. Computer Software Roundup.

    ERIC Educational Resources Information Center

    Bitter, Gary; And Others

    1983-01-01

    Twenty-five outstanding new computer software programs for drill and practice, simulation, and tutoring are reviewed. Programs deal with classroom management and with the teaching of language arts, computer literacy, social studies, science and logic, and mathematics. (PP)

  10. Computed Tomography (CT) -- Head

    MedlinePlus

    ... ray beam follows a spiral path. A special computer program processes this large volume of data to create ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  11. Computed Tomography (CT) - Spine

    MedlinePlus

    ... ray beam follows a spiral path. A special computer program processes this large volume of data to create ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  12. Computers in Education.

    ERIC Educational Resources Information Center

    Rao, K. Srinivasa

    1991-01-01

    The important role of the digital computer in education is outlined. The existing Indian situation and needs and the scope of the Computer Literacy And Studies in Schools (CLASS) program are briefly discussed. (Author/KR)

  13. ICASE Computer Science Program

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  14. My Computer Is Learning.

    ERIC Educational Resources Information Center

    Good, Ron

    1986-01-01

    Describes instructional uses of computer programs found in David Heiserman's book "Projects in Machine Intelligence for Your Home Computer." The programs feature "creatures" of various colors that move around within a rectangular white border. (JN)

  15. Jacobi Set Computation

    SciTech Connect

    Bhatia, Harsh

    2016-07-28

    Jacobi Set Computation is software that computes the Jacobi set of two piecewise linear scalar functions defined on a triangular mesh. This functionality is useful for analyzing multiple scalar fields simultaneously.

  16. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  17. The Computing World

    DTIC Science & Technology

    1992-04-01

    ...to modern computers. In the 1840s Augusta Ada, the Countess of Lovelace, translated and wrote several scientific papers regarding Charles Babbage's ideas for an analytical engine, a problem-solving machine. Babbage, "the father of computers," developed ways to store results via memory devices. Ada... simple arithmetic calculating machines to small, complex, and powerful integrated-circuit computers. Jacquard, Babbage, and Lovelace set the computer...

  18. Space Spurred Computer Graphics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  19. Chapter on Distributed Computing

    DTIC Science & Technology

    1989-02-01

    MIT/LCS/TM-384, "Chapter on Distributed Computing," by Leslie Lamport and Nancy... Keywords: distributed computing, distributed systems models, distributed algorithms, message-passing, shared variables.

  20. Quantum computing and probability.

    PubMed

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
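
    One simple way probability enters: a state's amplitudes fix outcome probabilities through the Born rule, |amplitude|^2, and individual measurements reproduce those probabilities only statistically. The sketch below is a generic single-qubit illustration, not an example from the paper.

      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

      state = H @ np.array([1.0, 0.0])   # |0> sent into an equal superposition
      probs = np.abs(state) ** 2         # Born rule
      print("P(0), P(1) =", probs)       # -> [0.5 0.5]

      # Repeated measurement only approximates these probabilities.
      rng = np.random.default_rng(1)
      samples = rng.choice(2, size=1000, p=probs)
      print("observed frequencies:", np.bincount(samples) / 1000)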

  1. Physics and Computation

    DTIC Science & Technology

    1988-03-01

    A computer for which the time development is generated by the Schrödinger equation must be a reversible computer. Feynman presented the first convincing... Contents include a time-evolution operator approach and a Hamiltonian operator approach.

  2. Nanoelectronics: Metrology and Computation

    SciTech Connect

    Lundstrom, Mark; Clark, Jason V.; Klimeck, Gerhard; Raman, Arvind

    2007-09-26

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (www.nanoHUB.org) as a specific example.

  3. The fifth generation computer

    SciTech Connect

    Moto-Oka, T.; Kitsuregawa, M.

    1985-01-01

    The leader of Japan's Fifth Generation computer project, known as the 'Apollo' project, and a young computer scientist elucidate in this book how the idea came about, international reactions, the basic technology, prospects for realization, and the abilities of the Fifth Generation computer. Topics considered include forecasting, research programs, planning, and technology impacts.

  4. Education for Computers

    ERIC Educational Resources Information Center

    Heslep, Robert D.

    2012-01-01

    The computer engineers who refer to the education of computers do not have a definite idea of education and do not bother to justify the fuzzy ones to which they allude. Hence, they logically cannot specify the features a computer must have in order to be educable. This paper puts forth a non-standard, but not arbitrary, concept of education that…

  5. BNL ATLAS Grid Computing

    ScienceCinema

    Michael Ernst

    2016-07-12

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  6. Computer Literacy. Focus 11.

    ERIC Educational Resources Information Center

    Benderson, Albert

    One of a series on the responses by the Educational Testing Service (ETS) and others to critical problems in education, this overview addresses a variety of issues related to computer literacy. Topics discussed include the pace of the transition to the computer age, the development of microprocessors, and the computer as fad or revolution.…

  7. Writing, Thinking and Computers.

    ERIC Educational Resources Information Center

    Hartley, James

    1993-01-01

    Reviews the potential of word processors for changing the ways in which students process written text and think about writing. Three levels of computer-aided writing are considered: simple word processors, computer-aided writing programs, and higher-level computer-aided processing, along with improvements in writing quality. (41 references) (LRW)

  8. Getting To Know Computers.

    ERIC Educational Resources Information Center

    Lundgren, Mary Beth

    Originally written for adult new readers involved in literacy programs, this book is also helpful to those individuals who want a basic book about computers. It uses the carefully controlled vocabulary with which adult new readers are familiar. Chapter 1 addresses the widespread use of computers. Chapter 2 discusses what a computer is and…

  10. Computer Aided Manufacturing.

    ERIC Educational Resources Information Center

    Insolia, Gerard

    This document contains course outlines in computer-aided manufacturing developed for a business-industry technology resource center for firms in eastern Pennsylvania by Northampton Community College. The four units of the course cover the following: (1) introduction to computer-assisted design (CAD)/computer-assisted manufacturing (CAM); (2) CAM…

  11. Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Ioannidou, Andri; Bennett, Vicki; Repenning, Alexander; Koh, Kyu Han; Basawapatna, Ashok

    2011-01-01

    The iDREAMS project aims to reinvent Computer Science education in K-12 schools, by using game design and computational science for motivating and educating students through an approach we call Scalable Game Design, starting at the middle school level. In this paper we discuss the use of Computational Thinking Patterns as the basis for our…

  12. Computers in Engineering Teaching.

    ERIC Educational Resources Information Center

    Rushby, N. J.

    This bibliography cites 26 books, papers, and reports dealing with various uses of computers in engineering education; and describes several computer programs available for use in teaching aeronautical, chemical, civil, electrical and electronic, mechanical, and nuclear engineering. Each computer program entry is presented by name, author,…

  13. The Next Computer Revolution.

    ERIC Educational Resources Information Center

    Peled, Abraham

    1987-01-01

    Discusses some of the future trends in the use of the computer in our society, suggesting that computing is now entering a new phase in which it will grow exponentially more powerful, flexible, and sophisticated in the next decade. Describes some of the latest breakthroughs in computer hardware and software technology. (TW)

  14. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1990-01-01

    Four applications of microcomputers in the chemical laboratory are presented. Included are "Mass Spectrometer Interface with an Apple II Computer," "Interfacing the Spectronic 20 to a Computer," "A pH-Monitoring and Control System for Teaching Laboratories," and "A Computer-Aided Optical Melting Point Device." Software, instrumentation, and uses are…

  15. Quantum walk computation

    SciTech Connect

    Kendon, Viv

    2014-12-04

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer.
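
    A discrete-time quantum walk of the kind discussed is straightforward to simulate: a coin operation on an internal two-state degree of freedom, followed by a coin-conditioned shift, repeated. The sketch below is a generic Hadamard walk on a line, not an implementation from the paper.

      import numpy as np

      steps, n = 50, 101                      # walk length and lattice sites
      psi = np.zeros((n, 2), dtype=complex)   # amplitude per (site, coin state)
      psi[n // 2, 0] = 1.0                    # walker starts at the centre

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard coin

      for _ in range(steps):
          psi = psi @ H.T                 # coin step applied at every site
          shifted = np.zeros_like(psi)
          shifted[1:, 0] = psi[:-1, 0]    # coin state 0 moves right
          shifted[:-1, 1] = psi[1:, 1]    # coin state 1 moves left
          psi = shifted

      prob = (np.abs(psi) ** 2).sum(axis=1)
      spread = np.sqrt(((np.arange(n) - n // 2) ** 2 * prob).sum())
      print("std dev after", steps, "steps:", round(spread, 1))  # ballistic spread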

  16. Computers for Intellectual Regeneration.

    ERIC Educational Resources Information Center

    Kendall, Diane; Budin, Howard

    1987-01-01

    This article presents an interview with Robert Taylor, a leading figure in educational computing. Among the topics covered are: (1) current developments in educational computing, (2) the changing role of teachers, (3) the resources computer technology will provide to social studies teachers, and (4) whether the probable future uses of computers…

  17. Children's Computer Drawings.

    ERIC Educational Resources Information Center

    Alexander, David

    Computer drawing programs have several characteristics that make them appropriate for use in early childhood education. Drawing at the computer is an activity that captures and holds children's attention. Children at all developmental levels of graphic ability can draw at the computer, and their products can be stored in a disc or printed for…

  19. Computer Center: CIBE Systems.

    ERIC Educational Resources Information Center

    Crovello, Theodore J.

    1982-01-01

    Differentiates between computer systems and Computers in Biological Education (CIBE) systems (computer system intended for use in biological education). Describes several CIBE stand alone systems: single-user microcomputer; single-user microcomputer/video-disc; multiuser microcomputers; multiuser maxicomputer; and local and long distance computer…

  20. Optimizing Computer Technology Integration

    ERIC Educational Resources Information Center

    Dillon-Marable, Elizabeth; Valentine, Thomas

    2006-01-01

    The purpose of this study was to better understand what optimal computer technology integration looks like in adult basic skills education (ABSE). One question guided the research: How is computer technology integration best conceptualized and measured? The study used the Delphi method to map the construct of computer technology integration and…

  1. Computer Training at Harwell

    ERIC Educational Resources Information Center

    Hull, John

    1969-01-01

    By using teletypewriters connected to the Harwell multi-access computing system, lecturers can easily demonstrate the operation of the computer in the classroom; this saves time and eliminates errors, and staff can carry out exercises using the main computer. (EB)

  2. Reading, Writing, and Computing.

    ERIC Educational Resources Information Center

    Adams, Dennis M.; And Others

    Reading, writing, and computing, which are interrelated and can thrive on each other for literacy and intellectual growth, are in the process of becoming linked in instructional practice. As reading and writing become more demanding, their task is eased with computer use. The computer seems to provide the connection between composing,…

  4. Computers in Science Fiction.

    ERIC Educational Resources Information Center

    La Faille, Eugene

    1985-01-01

    This 136-item annotated bibliography listing science fiction suitable for the 12-19 age range is divided into five sections, each organized alphabetically by author's name: in-print novels featuring computers, in-print anthologies of computer stories, in-print studies of computer science fiction, out-of-print novels, and short stories. Current publication…

  6. Understanding Computer Terms.

    ERIC Educational Resources Information Center

    Lilly, Edward R.

    Designed to assist teachers and administrators approaching the subject of computers for the first time to acquire a feel for computer terminology, this document presents a computer term glossary on three levels. (1) The terms most frequently used, called a "basic vocabulary," are presented first in three paragraphs which explain their meanings:…

  7. Computer applications in bioprocessing.

    PubMed

    Bungay, H R

    2000-01-01

    Biotechnologists have stayed at the forefront for practical applications for computing. As hardware and software for computing have evolved, the latest advances have found eager users in the area of bioprocessing. Accomplishments and their significance can be appreciated by tracing the history and the interplay between the computing tools and the problems that have been solved in bioprocessing.

  8. The Challenge of Computers.

    ERIC Educational Resources Information Center

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  9. Computers a la Cart.

    ERIC Educational Resources Information Center

    Barba, Robertta H.

    1988-01-01

    Explains uses of computer hardware when it's no longer needed to teach computer literacy. Describes a mini-lab pilot project which consists of several computers mounted on carts used similarly as audiovisual equipment. Lists advantages, disadvantages, and percentages of time used for instruction, interfacing, word processing, graphing, and…

  11. Personal Computer Communications.

    ERIC Educational Resources Information Center

    Leclerc, Gerry

    The interconnection between personal computers and other personal, mini, or mainframe computer systems is discussed. The following topics relevant to college personnel are addressed: hardware techniques for tying computers together, advantages and disadvantages of available software, the prospects for sophisticated micro/mainframe links with major…

  13. Computers + Student Activities Handbook.

    ERIC Educational Resources Information Center

    Masie, Elliott; Stein, Michele

    Designed to provide schools with the tools to start utilizing computers for student activity programs without additional expenditures, this handbook provides beginning computer users with suggestions and ideas for using computers in such activities as drama clubs, yearbooks, newspapers, activity calendars, accounting programs, room utilization,…

  14. Computing environment logbook

    DOEpatents

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
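
    The structure described, an event history with search and selective undo, can be sketched minimally as follows (a hedged illustration in Python; the class and method names are ours, not the patent's, and we assume each logged event carries its own undo callback):

      from dataclasses import dataclass, field
      from typing import Callable, List

      @dataclass
      class Event:
          description: str          # what happened, shown in the history
          undo: Callable[[], None]  # action that reverses this event

      @dataclass
      class Logbook:
          history: List[Event] = field(default_factory=list)

          def log(self, event: Event) -> None:
              self.history.append(event)  # record an event in the environment

          def search(self, term: str) -> List[Event]:
              # search the history of past events for matches
              return [e for e in self.history if term in e.description]

          def undo_events(self, selected: List[Event]) -> None:
              # undo the selected past events, most recent first
              for e in sorted(selected, key=self.history.index, reverse=True):
                  e.undo()
                  self.history.remove(e)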

  15. Chippy's Computer Numbers.

    ERIC Educational Resources Information Center

    Girard, Suzanne; Willing, Kathlene R.

    Intended for young children just becoming familiar with computers, this counting book introduces and reinforces new computer vocabulary and concepts. The numbers, from one to twelve, are presented along with words and illustrations from the world of computers, allowing for different activities in which children can count or match and name the…

  16. Quantum Computing and High Performance Computing

    DTIC Science & Technology

    2006-12-01

    Subject terms: quantum computing, FPGA, quantum computer simulator, parallelization. ...Figure 2 repeatedly references a specific unitary operator, the CNot gate. The definition of the CNot, and any other gate elements that may...standard gate definition, we can reduce the general problem to the specific problem of simulating a gate in standard position, producing the

  17. The science of computing - Parallel computation

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1985-01-01

    Although parallel computation architectures have been known for computers since the 1920s, it was only in the 1970s that microelectronic component technologies advanced to the point where it became feasible to incorporate multiple processors in one machine. Concomitantly, the development of algorithms for parallel processing also lagged due to hardware limitations. The speed of computing with solid-state chips is limited by gate switching delays. The physical limit implies that a 1 Gflop operational speed is the maximum for sequential processors. A computer recently introduced features a 'hypercube' architecture with 128 processors connected in networks at 5, 6 or 7 points per grid, depending on the design choice. Its computing speed rivals that of supercomputers, but at a fraction of the cost. The added speed with less hardware is due to parallel processing, which utilizes algorithms that break an equation into simpler statements that can be processed simultaneously. Presently, highly developed computer languages like FORTRAN, PASCAL, COBOL, etc., rely on sequential instructions. Thus, increased emphasis will now be directed at parallel processing algorithms to exploit the new architectures.
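
    The closing point, that an equation can be broken into simpler statements and processed simultaneously, can be made concrete with a short sketch (Python's multiprocessing stands in for the multiple processors; the four-way chunking is our illustrative choice):

      from multiprocessing import Pool

      def partial_sum(chunk):
          # each processor evaluates one independent piece of the expression
          return sum(x * x for x in chunk)

      if __name__ == "__main__":
          data = list(range(1_000_000))
          chunks = [data[i::4] for i in range(4)]  # split the work four ways
          with Pool(processes=4) as pool:
              # the partial results are computed simultaneously,
              # then combined with a single sequential step
              total = sum(pool.map(partial_sum, chunks))
          print(total)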

  18. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.

  19. Neural Computation and the Computational Theory of Cognition

    ERIC Educational Resources Information Center

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  1. Assessment of sub-milli-sievert abdominal computed tomography with iterative reconstruction techniques of different vendors

    PubMed Central

    Padole, Atul; Sainani, Nisha; Lira, Diego; Khawaja, Ranish Deedar Ali; Pourjabbar, Sarvenaz; Lo Gullo, Roberto; Otrakji, Alexi; Kalra, Mannudeep K

    2016-01-01

    AIM: To assess the diagnostic image quality of reduced dose (RD) abdominal computed tomography (CT) with 9 iterative reconstruction techniques (IRTs) from 4 different vendors, compared to standard of care (SD) CT. METHODS: In an Institutional Review Board approved study, 66 patients (mean age 60 ± 13 years, 44 men, and 22 women) undergoing routine abdomen CT on multi-detector CT (MDCT) scanners from vendors A, B, and C (≥ 64 row CT scanners) (22 patients each) gave written informed consent for acquisition of an additional RD CT series. Sinogram data of RD CT were reconstructed with two vendor-specific IRTs and a vendor-neutral IRT (A-1, A-2, A-3; B-1, B-2, B-3; and C-1, C-2, C-3), and the SD CT series with filtered back projection. Subjective image evaluation was performed by two radiologists for each SD and RD CT series, blinded and independently. All RD CT series (198) were assessed first, followed by the SD CT series (66). Objective image noise was measured for the SD and RD CT series. Data were analyzed by Wilcoxon signed rank, kappa, and analysis of variance tests. RESULTS: There were 13/50, 18/57 and 9/40 missed lesions (size 2-7 mm) on RD CT for vendors A, B, and C, respectively. Missed lesions included liver cysts, kidney cysts and stones, gallstones, fatty liver, and pancreatitis. There were also 5, 4, and 4 pseudo lesions (size 2-3 mm) on RD CT for vendors A, B, and C, respectively. Lesion conspicuity was sufficient for clinical diagnostic performance for 6/24 (RD-A-1), 10/24 (RD-A-2), and 7/24 (RD-A-3) lesions for vendor A; 5/26 (RD-B-1), 6/26 (RD-B-2), and 7/26 (RD-B-3) lesions for vendor B; and 4/20 (RD-C-1), 6/20 (RD-C-2), and 10/20 (RD-C-3) lesions for vendor C (P = 0.9). Mean objective image noise in the liver was significantly lower for RD A-1 compared to both RD A-2 and RD A-3 images (P < 0.001). Similarly, mean objective image noise was lower for RD B-2 (compared to RD B-1 and RD B-3) and RD C-3 (compared to RD C-1 and RD C-2) (P = 0.016). CONCLUSION: Regardless of IRTs and MDCT vendors

  2. Computer algebra and operators

    NASA Technical Reports Server (NTRS)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.

  3. Scalable optical quantum computer

    SciTech Connect

    Manykin, E A; Mel'nichenko, E V

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications.

  4. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  5. ALMA correlator computer systems

    NASA Astrophysics Data System (ADS)

    Pisano, Jim; Amestica, Rodrigo; Perez, Jesus

    2004-09-01

    We present a design for the computer systems which control, configure, and monitor the Atacama Large Millimeter Array (ALMA) correlator and process its output. Two distinct computer systems implement this functionality: a rack-mounted PC controls and monitors the correlator, and a cluster of 17 PCs process the correlator output into raw spectral results. The correlator computer systems interface to other ALMA computers via gigabit Ethernet networks utilizing CORBA and raw socket connections. ALMA Common Software provides the software infrastructure for this distributed computer environment. The control computer interfaces to the correlator via multiple CAN busses and the data processing computer cluster interfaces to the correlator via sixteen dedicated high speed data ports. An independent array-wide hardware timing bus connects to the computer systems and the correlator hardware ensuring synchronous behavior and imposing hard deadlines on the control and data processor computers. An aggregate correlator output of 1 gigabyte per second with 16 millisecond periods and computational data rates of approximately 1 billion floating point operations per second define other hard deadlines for the data processing computer cluster.
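
    The quoted rates fix the per-period budget for the data processing cluster; a quick check of that arithmetic (our own calculation, using only the figures in the abstract):

      output_rate = 1e9   # correlator output, bytes per second
      period = 16e-3      # hard real-time period, seconds
      flops = 1e9         # required floating-point operations per second

      bytes_per_period = output_rate * period  # data to absorb each period
      ops_per_period = flops * period          # operations to finish each period
      print(f"{bytes_per_period / 1e6:.0f} MB and "
            f"{ops_per_period / 1e6:.0f} Mflop per 16 ms period")
      # -> 16 MB and 16 Mflop per 16 ms period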

  6. Richard Feynman and computation

    NASA Astrophysics Data System (ADS)

    Hey, Tony

    1999-04-01

    The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.

  7. Left ventricular remodelling and systolic function measurement with 64 multi-slice computed tomography versus second harmonic echocardiography in patients with coronary artery disease: a double blind study.

    PubMed

    Palazzuoli, Alberto; Cademartiri, Filippo; Geleijnse, Marcel L; Meijboom, Bob; Pugliese, Francesca; Soliman, Osama; Calabrò, Anna; Nuti, Ranuccio; de Feyter, Pim

    2010-01-01

    The present study evaluated LV volumes, ejection fraction (LVEF) and stroke volume (SV) obtained by 64-MDCT and compared these data with those obtained by second harmonic 2D Echo in patients referred for non-invasive coronary vessel evaluation. The most common technique used in daily clinical practice for determination of LV function is two-dimensional echocardiography (2D-TTE). Multi-detector computed tomography (MDCT) is an emerging technique to detect coronary artery disease (CAD) and was recently proposed to assess LV function. 93 patients underwent 64-MDCT for LV function and volume assessment by a segmental reconstruction algorithm (Argus), and the results were compared with recent (within 2 months) 2D-TTE; all images were processed and interpreted by two observers blinded to the Echo and MDCT results. A close correlation between TTE and 64-MDCT was demonstrated for ejection fraction LVEF (r=0.84), end-diastolic volume LVEDV (r=0.80) and end-systolic volume LVESV (r=0.85); an acceptable correlation was obtained for stroke volume LVSV (r=0.58). Good results were obtained for inter-observer variability for 64-MDCT, measured in 45 patients: LVESV (r=0.82, p<0.001), LVEDV (r=0.83, p<0.001), LVEF (r=0.69, p<0.002) and SV (r=0.66, p<0.001). Our results showed that the functional and temporal information contained in a coronary 64-MDCT study can be used to assess left ventricular (LV) systolic function and LV dimensions with good reproducibility and acceptable correlation with respect to 2D-TTE. The combination of non-invasive coronary artery imaging and assessment of global LV function might become in the future a fast and conclusive cardiac work-up for patients with CAD.

  8. Diagnostic accuracy of 64-slice computed tomography coronary angiography for the detection of in-stent restenosis: a meta-analysis.

    PubMed

    Carrabba, Nazario; Schuijf, Joanne D; de Graaf, Fleur R; Parodi, Guido; Maffei, Erica; Valenti, Renato; Palumbo, Alessandro; Weustink, Annick C; Mollet, Nico R; Accetta, Gabriele; Cademartiri, Filippo; Antoniucci, David; Bax, Jeroen J

    2010-06-01

    We sought to evaluate the diagnostic accuracy of 64-slice multi-detector row computed tomography (MDCT) compared with invasive coronary angiography for in-stent restenosis (ISR) detection. MEDLINE, Cochrane library, and BioMed Central database searches were performed until April 2009 for original articles. Inclusion criteria were (1) 64-MDCT was used as a diagnostic test for ISR, with >50% diameter stenosis selected as the cut-off criterion for significant ISR, using invasive coronary angiography and quantitative coronary angiography as the standard of reference; (2) absolute numbers of true positive, false positive, true negative, and false negative results could be derived. Standard meta-analytic methods were applied. Nine studies with a total of 598 patients with 978 stents included were considered eligible. On average, 9% of stents were unassessable (range 0-42%). Accuracy tests with 95% confidence intervals (CIs) comparing 64-MDCT vs invasive coronary angiography showed that pooled sensitivity, specificity, positive and negative likelihood ratio (random effect model) values were: 86% (95% CI 80-91%), 93% (95% CI 91-95%), 12.32 (95% CI 7.26-20.92), 0.18 (95% CI 0.12-0.28) for binary ISR detection. The symmetric area under the curve value was 0.94, indicating good agreement between 64-MDCT and invasive coronary angiography. 64-MDCT has a good diagnostic accuracy for ISR detection with a particularly high negative predictive value. However, still a relatively large proportion of stents remains uninterpretable. Accordingly, only in selected patients, 64-MDCT may serve as a potential alternative noninvasive method to rule out ISR.

  9. Computer-assisted psychotherapy

    PubMed Central

    Wright, Jesse H.; Wright, Andrew S.

    1997-01-01

    The rationale for using computers in psychotherapy includes the possibility that therapeutic software could improve the efficiency of treatment and provide access for greater numbers of patients. Computers have not been able to reliably duplicate the type of dialogue typically used in clinician-administered therapy. However, computers have significant strengths that can be used to advantage in designing treatment programs. Software developed for computer-assisted therapy generally has been well accepted by patients. Outcome studies have usually demonstrated treatment effectiveness for this form of therapy. Future development of computer tools may be influenced by changes in health care financing and rapid growth of new technologies. An integrated care delivery model incorporating the unique attributes of both clinicians and computers should be adopted for computer-assisted therapy. PMID:9292446

  10. Hyperswitch Communication Network Computer

    NASA Technical Reports Server (NTRS)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

    Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among the following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  11. Optimal Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Mantri, Atul; Pérez-Delgado, Carlos A.; Fitzsimons, Joseph F.

    2013-12-01

    Blind quantum computation allows a client with limited quantum capabilities to interact with a remote quantum computer to perform an arbitrary quantum computation, while keeping the description of that computation hidden from the remote quantum computer. While a number of protocols have been proposed in recent years, little is currently understood about the resources necessary to accomplish the task. Here, we present general techniques for upper and lower bounding the quantum communication necessary to perform blind quantum computation, and use these techniques to establish concrete bounds for common choices of the client's quantum capabilities. Our results show that the universal blind quantum computation protocol of Broadbent, Fitzsimons, and Kashefi comes within a factor of 8/3 of optimal when the client is restricted to preparing single qubits. However, we describe a generalization of this protocol which requires exponentially less quantum communication when the client has a more sophisticated device.

  13. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  14. Computational Biology and High Performance Computing 2000

    SciTech Connect

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  15. Navier-Stokes Computations on Commodity Computers

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.; Faulkner, Thomas R.

    1998-01-01

    In this paper we discuss and demonstrate the feasibility of solving high-fidelity, nonlinear computational fluid dynamics (CFD) problems of practical interest on commodity machines, namely Pentium Pro PCs. Such calculations have now become possible due to the progress in computational power and memory of off-the-shelf commodity computers, along with the growth in bandwidth and communication speeds of networks. A widely used CFD code known as TLNS3D, which was developed originally on large shared-memory computers, was selected for this effort. This code has recently been ported to massively parallel processor (MPP) type machines, where natural partitioning along grid blocks is adopted, in which one or more blocks are distributed to each of the available processors. In this paper, a similar approach is adapted to port this code to a cluster of Pentium Pro computers. The message passing among the processors is accomplished through the use of standard message passing interface (MPI) libraries. Scaling studies indicate a fairly high level of parallelism on such clusters of commodity machines, thus making solutions of the Navier-Stokes equations for practical problems more affordable.
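
    The porting pattern described, one or more grid blocks per processor with standard MPI message passing, can be sketched as follows (a minimal illustration with mpi4py; the block count and per-block work function are placeholders, not TLNS3D's):

      from mpi4py import MPI

      def relax_block(block_id):
          # stand-in for the per-block flow-solver work
          return block_id * 0.5

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      n_blocks = 16
      # natural partitioning: each processor takes every size-th block
      my_blocks = range(rank, n_blocks, size)
      local = sum(relax_block(b) for b in my_blocks)

      # combine a residual-like quantity across all processors
      total = comm.allreduce(local, op=MPI.SUM)
      if rank == 0:
          print("global residual:", total)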

  16. Computer Health Score

    SciTech Connect

    2016-08-03

    The algorithm develops a single health score for office computers, today just Windows, but we plan to extend this to Apple computers. The score is derived from various parameters, including: CPU utilization, memory utilization, various error logs, disk problems, and disk write queue length. It then uses a weighting scheme to balance these parameters and provide an overall health score. By using these parameters, we are not just assessing the theoretical performance of the components of the computer; rather, we are using actual performance metrics that are selected to be a more realistic representation of the experience of the person using the computer. This includes compensating for the nature of their use. If there are two identical computers and the user of one places heavy demands on their computer compared with the user of the second computer, the former will have a lower health score. This allows us to provide a 'fit for purpose' score tailored to the assigned user. This is very helpful data to inform the managers when individual computers need to be replaced. Additionally, it provides specific information that can facilitate the fixing of the computer, to extend its useful lifetime. This presents direct financial savings, time savings for users transferring from one computer to the next, and better environmental stewardship.
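
    A minimal sketch of such a weighting scheme (the parameter names and weights below are invented for illustration; the record does not publish the real ones):

      # hypothetical weights; the actual algorithm's values are not given
      WEIGHTS = {
          "cpu_utilization": 0.25,     # fraction of time the CPU is busy, 0-1
          "memory_utilization": 0.25,  # fraction of RAM in use, 0-1
          "error_rate": 0.30,          # normalized error-log rate, 0-1
          "disk_queue": 0.20,          # normalized disk write queue length, 0-1
      }

      def health_score(metrics: dict) -> float:
          # weighted 'badness' from actual usage metrics, so a heavily
          # loaded computer scores lower than an identical idle one
          badness = sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)
          return round(100 * (1 - badness), 1)  # 0-100, higher is healthier

      print(health_score({"cpu_utilization": 0.6, "memory_utilization": 0.7,
                          "error_rate": 0.1, "disk_queue": 0.2}))  # 60.5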

  17. Architecture Adaptive Computing Environment

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    2006-01-01

    Architecture Adaptive Computing Environment (aCe) is a software system that includes a language, compiler, and run-time library for parallel computing. aCe was developed to enable programmers to write programs, more easily than was previously possible, for a variety of parallel computing architectures. Heretofore, it has been perceived to be difficult to write parallel programs for parallel computers and more difficult to port the programs to different parallel computing architectures. In contrast, aCe is supportable on all high-performance computing architectures. Currently, it is supported on LINUX clusters. aCe uses parallel programming constructs that facilitate writing of parallel programs. Such constructs were used in single-instruction/multiple-data (SIMD) programming languages of the 1980s, including Parallel Pascal, Parallel Forth, C*, *LISP, and MasPar MPL. In aCe, these constructs are extended and implemented for both SIMD and multiple- instruction/multiple-data (MIMD) architectures. Two new constructs incorporated in aCe are those of (1) scalar and virtual variables and (2) pre-computed paths. The scalar-and-virtual-variables construct increases flexibility in optimizing memory utilization in various architectures. The pre-computed-paths construct enables the compiler to pre-compute part of a communication operation once, rather than computing it every time the communication operation is performed.

  18. Quantum analogue computing.

    PubMed

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.

  19. Computational approaches to computational aero-acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

    The various techniques by which the goal of computational aeroacoustics (the calculation and noise prediction of a fluctuating fluid flow) may be achieved are reviewed. The governing equations for compressible fluid flow are presented. The direct numerical simulation approach is shown to be computationally intensive for high Reynolds number viscous flows. Therefore, other approaches, such as the acoustic analogy, vortex models and various perturbation techniques that aim to break the analysis into a viscous part and an acoustic part are presented. The choice of the approach is shown to be problem dependent.

  20. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  1. Reconfigurable Computing for High Performance Computing Computational Science

    DTIC Science & Technology

    2007-06-01

    ...the C code for validation; the key test of the Blowfish algorithm is composed of: the secret key load, pre-processing, and encryption/decryption. The time...block bound easily adds to the clock quantum to produce cipher...the algorithm supports a hash operation...interesting results. Overall, with...standard execution of encryption and decryption functions with a constant secret key for a set of...integer and bit-based computing technology.

  2. LPF Computation Revisited

    NASA Astrophysics Data System (ADS)

    Crochemore, Maxime; Ilie, Lucian; Iliopoulos, Costas S.; Kubica, Marcin; Rytter, Wojciech; Waleń, Tomasz

    We present efficient algorithms for storing past segments of a text. The algorithms use two previously computed read-only arrays (SUF and LCP), which compose the suffix array of the text, to compute the maximal length of the previous factor (subword) occurring at each position of the text, stored in a table called LPF. This notion is central both to many conservative text compression techniques and to the most efficient algorithms for detecting motifs and repetitions occurring in a text.
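
    The LPF table itself is easy to state directly; a naive definitional sketch (quadratic in the worst case, unlike the array-based algorithms the paper presents) makes the notion concrete:

      def lpf_naive(text):
          # LPF[i] = length of the longest factor starting at position i
          # that also occurs starting at some earlier position j < i
          n = len(text)
          lpf = [0] * n
          for i in range(n):
              for j in range(i):
                  k = 0
                  while i + k < n and text[j + k] == text[i + k]:
                      k += 1
                  lpf[i] = max(lpf[i], k)
          return lpf

      print(lpf_naive("abaab"))  # [0, 0, 1, 2, 1]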

  3. Sensor sentinel computing device

    DOEpatents

    Damico, Joseph P.

    2016-08-02

    Technologies pertaining to authenticating data output by sensors in an industrial environment are described herein. A sensor sentinel computing device receives time-series data from a sensor by way of a wireline connection. The sensor sentinel computing device generates a validation signal that is a function of the time-series signal. The sensor sentinel computing device then transmits the validation signal to a programmable logic controller in the industrial environment.
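
    The patent does not specify the validation function; one natural reading, a keyed hash computed over the time-series so the controller can detect tampering, can be sketched like this (the key handling is a hypothetical stand-in):

      import hashlib
      import hmac
      import struct

      SECRET_KEY = b"shared-with-the-plc"  # hypothetical provisioning

      def validation_signal(samples):
          # pack the time-series into bytes, then authenticate it so the
          # programmable logic controller can verify the sensor data
          payload = struct.pack(f"{len(samples)}d", *samples)
          return hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()

      readings = [20.1, 20.3, 20.2, 20.6]  # sensor time-series data
      tag = validation_signal(readings)    # transmitted alongside the data
      assert hmac.compare_digest(tag, validation_signal(readings))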

  4. Threats to Computer Systems

    DTIC Science & Technology

    1973-03-01

    subjects and objects of attacks contribute to the uniqueness of computer-related crime. For example, as the cashless, checkless society approaches...advancing computer technology and security methods, and proliferation of computers in bringing about the paperless society. The universal use of...organizations do to society. Jerry Schneider, one of the known perpetrators, said that he was motivated to perform his acts to make money, for the

  5. Computer aided production engineering

    SciTech Connect

    Not Available

    1986-01-01

    This book presents the following contents: CIM in avionics; computer analysis of product designs for robot assembly; a simulation decision model for manpower forecasting and its application; development of flexible manufacturing systems; advances in microcomputer applications in CAD/CAM; an automated interface between CAD and process planning; CAM and computer vision; low friction pneumatic actuators for accurate robot control; robot assembly of printed circuit boards; information systems design for computer integrated manufacture; and a CAD engineering language to aid manufacture.

  6. The Rabi Quantum Computer

    DTIC Science & Technology

    2001-04-01

    Defense Technical Information Center Compilation Part Notice ADP010869. TITLE: The Rabi Quantum Computer. Author: Rudolph A. Krutar, Advanced Information Technology, U.S. DISTRIBUTION: Approved for public release; this notice is part of the compilation report comprising ADP010865 through ADP010894.

  7. Computational structures technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1992-01-01

    Computational structures technology (CST), which has emerged from FEM developments, is a fusion of materials modeling, structural and dynamic analysis and synthesis methods, on the one hand, and numerical analysis and approximation theory, on the other. In addition to computational materials modeling, CST encompasses computational methods for predicting the response, performance, failure, and service life of structures and their components, as well as automated methods for structural synthesis and optimization.

  8. Mobile computing for radiology.

    PubMed

    Auffermann, William F; Chetlen, Alison L; Sharma, Arjun; Colucci, Andrew T; DeQuesada, Ivan M; Grajo, Joseph R; Kung, Justin W; Loehfelm, Thomas W; Sherry, Steven J

    2013-12-01

    The rapid advances in mobile computing technology have the potential to change the way radiology and medicine as a whole are practiced. Several mobile computing advances have not yet found application to the practice of radiology, while others have already been applied to radiology but are not in widespread clinical use. This review addresses several areas where radiology and medicine in general may benefit from adoption of the latest mobile computing technologies and speculates on potential future applications.

  9. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  10. Factors Affecting Computer Anxiety in High School Computer Science Students.

    ERIC Educational Resources Information Center

    Hayek, Linda M.; Stephens, Larry

    1989-01-01

    Examines factors related to computer anxiety measured by the Computer Anxiety Index (CAIN). Achievement in two programing courses was inversely related to computer anxiety. Students who had a home computer and had computer experience before high school had lower computer anxiety than those who had not. Lists 14 references. (YP)

  11. Computational fluid dynamics simulation of airflow in the trachea and main bronchi for the subjects with left pulmonary artery sling.

    PubMed

    Qi, Shouliang; Li, Zhenghua; Yue, Yong; van Triest, Han J W; Kang, Yan

    2014-06-24

    Left pulmonary artery sling (LPAS) is a rare but severe congenital anomaly, in which stenoses are formed in the trachea and/or main bronchi. Multi-detector computed tomography (MDCT) provides useful anatomical images, but does not offer functional information. The objective of the present study is to quantitatively analyze the airflow in the trachea and main bronchi of LPAS subjects through computational fluid dynamics (CFD) simulation. Five subjects (four LPAS patients, one normal control) aged 6-19 months are analyzed. The geometric model of the trachea and the two main bronchi is extracted from the MDCT images. The inlet velocity is determined based on the body weight and the inlet area. Both the geometric model and personalized inflow conditions are imported into the CFD software ANSYS. The pressure drop, mass flow ratio through the two bronchi, wall pressure, flow velocity and wall shear stress (WSS) are obtained and compared to the normal control. Due to the tracheal and/or bronchial stenosis, the pressure drop for the LPAS patients ranges from 78.9 to 914.5 Pa, much higher than for the normal control (0.7 Pa). The mass flow ratio through the two bronchi does not correlate with the sectional area ratio if the anomalous left pulmonary artery compresses the trachea or bronchi. It is suggested that the C-shaped trachea plays an important role in facilitating the air flow into the left bronchus with the inertia force. For LPAS subjects, the distributions of velocities, wall pressure and WSS are less regular than for the normal control. At the stenotic site, high velocity, low wall pressure and high WSS are observed. Using geometric models extracted from CT images and patient-specific inlet boundary conditions, CFD simulation can provide vital quantitative flow information for LPAS. Due to the stenosis, high pressure drops and inconsistent distributions of velocities, wall pressure and WSS are observed. The C-shaped trachea may facilitate a larger flow of air into the

  12. The Social Computer

    NASA Astrophysics Data System (ADS)

    Serugendo, Giovanna Di Marzo; Risoldi, Matteo; Solemayni, Mohammad

    The following sections are included: * Introduction * Problem and Research Questions * State of the Art * TSC Structure and Computational Awareness * Methodology and Research Directions * Case Study: Democracy * Conclusions

  13. Highly parallel computation

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.; Tichy, Walter F.

    1990-01-01

    Among the highly parallel computing architectures required for advanced scientific computation, those designated 'MIMD' and 'SIMD' have yielded the best results to date. The present evaluation of their development status shows that neither has attained a decisive advantage in the treatment of most near-homogeneous problems; for problems involving numerous dissimilar parts, however, currently speculative architectures such as 'neural networks' or 'data flow' machines may be required. Data flow computers are the most practical form of MIMD fine-grained parallel computers yet conceived; they automatically solve the problem of assigning virtual processors to the real processors in the machine.

  14. Computationally efficient multibody simulations

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Jayant; Kumar, Manoj

    1994-01-01

    Computationally efficient approaches to the solution of the dynamics of multibody systems are presented in this work. The computational efficiency is derived from both the algorithmic and implementational standpoint. Order(n) approaches provide a new formulation of the equations of motion eliminating the assembly and numerical inversion of a system mass matrix as required by conventional algorithms. Computational efficiency is also gained in the implementation phase by the symbolic processing and parallel implementation of these equations. Comparison of this algorithm with existing multibody simulation programs illustrates the increased computational efficiency.

  15. Cloud Computing: An Overview

    NASA Astrophysics Data System (ADS)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with the minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  16. Optimization of computations

    SciTech Connect

    Mikhalevich, V.S.; Sergienko, I.V.; Zadiraka, V.K.; Babich, M.D.

    1994-11-01

    This article examines some topics of optimization of computations, which have been discussed at 25 seminar-schools and symposia organized by the V.M. Glushkov Institute of Cybernetics of the Ukrainian Academy of Sciences since 1969. We describe the main directions in the development of computational mathematics and present some of our own results that reflect a certain design conception of speed-optimal and accuracy-optimal (or nearly optimal) algorithms for various classes of problems, as well as a certain approach to optimization of computer computations.

  17. Computational approaches to vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  18. Computing by Observing Changes

    NASA Astrophysics Data System (ADS)

    Cavaliere, Matteo; Leupold, Peter

    Computing by Observing is a paradigm for the implementation of models of Natural Computing. It was inspired by the setup of experiments in biochemistry. One central feature is an observer that translates the evolution of an underlying observed system into sequences over a finite alphabet. We take a step toward more realistic observers by allowing them to notice only an occurring change in the observed system rather than to read the system's entire configuration. Compared to previous implementations of the Computing by Observing paradigm, this decreases the computational power; but with relatively simple systems we still obtain the language class generated by matrix grammars.
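
    The restricted observer, one that emits a symbol only when the observed configuration actually changes, can be sketched generically (the observed system below is a toy of our own choosing, not one from the paper):

      def observe_changes(step, state, steps, alphabet):
          # emit one letter per observed *change*, not per configuration
          out = []
          for _ in range(steps):
              new_state = step(state)
              if new_state != state:
                  # classify the change into a finite alphabet
                  kind = "grew" if len(new_state) > len(state) else "other"
                  out.append(alphabet[kind])
              state = new_state
          return "".join(out)

      # toy observed system: a string that doubles until a length bound
      step = lambda s: s + s if len(s) < 8 else s
      print(observe_changes(step, "a", 6, {"grew": "g", "other": "c"}))  # "ggg"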

  19. Hot ice computer

    NASA Astrophysics Data System (ADS)

    Adamatzky, Andrew

    2009-12-01

    We experimentally demonstrate that supersaturated solution of sodium acetate, commonly called ‘hot ice’, is a massively-parallel unconventional computer. In the hot ice computer data are represented by a spatial configuration of crystallization induction sites and physical obstacles immersed in the experimental container. Computation is implemented by propagation and interaction of growing crystals initiated at the data-sites. We discuss experimental prototypes of hot ice processors which compute planar Voronoi diagram, shortest collision-free paths and implement AND and OR logical gates.
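
    Crystals growing at equal speed from the data sites until they collide partition the plane much as a Voronoi diagram does; on a grid this is a multi-source breadth-first growth (the discretization below is ours, and it approximates the diagram in the grid metric):

      from collections import deque

      def crystal_voronoi(width, height, sites):
          # grow all 'crystals' one cell per step; each cell keeps the
          # label of whichever crystal reaches it first
          owner = {}
          frontier = deque()
          for label, cell in enumerate(sites):
              owner[cell] = label
              frontier.append(cell)
          while frontier:
              x, y = frontier.popleft()
              for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                  if 0 <= nxt[0] < width and 0 <= nxt[1] < height and nxt not in owner:
                      owner[nxt] = owner[(x, y)]  # this crystal claims the cell
                      frontier.append(nxt)
          return owner

      regions = crystal_voronoi(40, 20, [(5, 5), (30, 10)])
      print(regions[(6, 5)], regions[(29, 10)])  # 0 1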

  20. Scalable optical quantum computer

    NASA Astrophysics Data System (ADS)

    Manykin, E. A.; Mel'nichenko, E. V.

    2014-12-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr3+, regularly located in the lattice of the orthosilicate (Y2SiO5) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications.

  1. Fifth generation computers

    NASA Astrophysics Data System (ADS)

    Treleaven, Philip C.; Lima, Isabel Gouveia

    1982-06-01

    Fifth generation computers are analogous to LEGO building blocks, with each block corresponding to a microcomputer and a group of blocks working together as a computer system. These computers will represent a unification of currently separate areas of research into parallel processing and into VLSI processors. Parallel processing based on data driven and demand driven computer organisations are under investigation in well over thirty laboratories in the United States, Japan and Europe. Basically, in data driven (e.g. data flow) computers the availability of operands triggers the execution of the operation to be performed on them; whereas in demand driven (e.g. reduction) computers the requirement for a result triggers the operation that will generate the value. VLSI processors exploit very large scale integration and the new simplified chip design methodology pioneered in US universities by Mead and Conway, allowing users to design their own chips. These novel VLSI processors are implementable by simple replicated cells and use extensive pipelining and multiprocessing to achieve a high performance. Examples range from a powerful image processing device configured from identical special-purpose chips, to a large parallel computer built from replicated general-purpose microcomputers. This paper outlines these topics contributing to fifth generation computers, and speculates on their effect on computing.
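
    The data-driven/demand-driven distinction maps onto eager versus lazy evaluation; a compact illustration (the triggering rules are the paper's concepts, the code is our own):

      # data driven: operands arriving trigger the operation immediately
      def data_driven(a, b):
          return a + b           # fires as soon as both operands exist

      # demand driven: a request for the result triggers the operation
      def demand_driven(a, b):
          return lambda: a + b   # nothing is computed until demanded

      eager = data_driven(2, 3)    # already evaluated to 5
      thunk = demand_driven(2, 3)  # not yet evaluated
      print(eager, thunk())        # demanding the result fires it: 5 5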

  3. Computer Confrontation: Suppes and Albrecht

    ERIC Educational Resources Information Center

    Suppes, Patrick; Albrecht, Bob

    1973-01-01

    Two well-known computer specialists argue about the function of computers in schools. Patrick Suppes believes mastery of basic skills is the prime function of computers. Bob Albrecht believes computers should be learning devices and not drill masters. (DS)

  4. Computer Architecture's Changing Role in Rebooting Computing

    DOE PAGES

    DeBenedictis, Erik P.

    2017-04-26

    Windows 95 started the Wintel era, in which Microsoft Windows running on Intel x86 microprocessors dominated the computer industry and changed the world. Retaining the x86 instruction set across many generations let users buy new and more capable microprocessors without having to buy new software to work with new architectures.

  5. Educational Computer Utilization and Computer Communications.

    ERIC Educational Resources Information Center

    Singh, Jai P.; Morgan, Robert P.

    As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch process were explored…

  6. Introduction: The Computer After Me

    NASA Astrophysics Data System (ADS)

    Pitt, Jeremy

    The following sections are included: * Introduction * Computer Awareness in Science Fiction * Computer Awareness and Self-Awareness * How many senses does a computer have? * Does a computer know that it is a computer? * Does metal know when it is weakening? * Why Does Computer Awareness Matter? * Chapter Overviews * Summary and Conclusions

  7. Computer Awareness for Rural Educators.

    ERIC Educational Resources Information Center

    Barker, Bruce O.

    The meteoric rise of the computer age is a challenge for public educators, many of whom are still unfamiliar with basic computer technology. Yet many educators are finding that they can correct their misconceptions about computers by becoming "computer aware." Computer awareness comes from gaining a knowledge of computer history; a basic…

  8. Nature, computation and complexity

    NASA Astrophysics Data System (ADS)

    Binder, P.-M.; Ellis, G. F. R.

    2016-06-01

    The issue of whether the unfolding of events in the world can be considered a computation is explored in this paper. We come to different conclusions for inert and for living systems (‘no’ and ‘qualified yes’, respectively). We suggest that physical computation as we know it exists only as a tool of complex biological systems: us.

  9. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.
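
    The roughness computation can be illustrated as deviation of the tracked wall outline from a smoothed version of itself (a plain moving average here; the record does not specify the actual method):

      def roughness(edge, window=5):
          # mean absolute deviation of the traced wall outline from its
          # moving average; larger values mean a more irregular wall
          n = len(edge)
          half = window // 2
          total = 0.0
          for i in range(n):
              lo, hi = max(0, i - half), min(n, i + half + 1)
              smooth = sum(edge[lo:hi]) / (hi - lo)
              total += abs(edge[i] - smooth)
          return total / n

      smooth_wall = [10.0, 10.1, 10.2, 10.3, 10.4, 10.5]
      irregular_wall = [10.0, 11.2, 9.6, 11.0, 9.8, 10.5]
      print(roughness(smooth_wall) < roughness(irregular_wall))  # True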

  10. Computers and Personal Privacy.

    ERIC Educational Resources Information Center

    Ware, Willis H.

    Privacy is an issue that arises from the intersection of a demand for improved recordkeeping processes, and computing technology as the response to the demand. Recordkeeping in the United States centers on information about people. Modern day computing technology has the ability to maintain, store, and retrieve records quickly; however, this…

  11. Preventing Computer Glitches

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2009-01-01

    It has been said that a computer lets a person make more mistakes faster than any other invention in human history, with the possible exceptions of handguns and tequila. Computers also make mistakes on their own, whether they're glitches, conflicts, bugs, crashes, or failures. Avoiding glitches is considerably less frustrating than trying to fix…

  12. Flexible Animation Computer Program

    NASA Technical Reports Server (NTRS)

    Stallcup, Scott S.

    1990-01-01

    FLEXAN (Flexible Animation), computer program animating structural dynamics on Evans and Sutherland PS300-series graphics workstation with VAX/VMS host computer. Typical application is animation of spacecraft undergoing structural stresses caused by thermal and vibrational effects. Displays distortions in shape of spacecraft. Program displays single natural mode of vibration, mode history, or any general deformation of flexible structure. Written in FORTRAN 77.

  14. Computers and School Reform.

    ERIC Educational Resources Information Center

    McDaniel, Ernest; And Others

    1993-01-01

    Discusses ways in which computers can be used to help school reform by shifting the emphasis from information transmission to information processing. Highlights include creating learning communities that extend beyond the classroom; educationally oriented computer networks; Professional Development Schools for curriculum development; and new…

  15. Learning with Ubiquitous Computing

    ERIC Educational Resources Information Center

    Rosenheck, Louisa

    2008-01-01

    If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects involving educational "ubicomp." Finally, it explores how ubicomp may and may not change education in both formal and informal settings and…

  16. Campus Computing Strategies.

    ERIC Educational Resources Information Center

    McCredie, John W., Ed.

    Ten case studies describing the planning processes and strategies employed by colleges that use computing and communication systems are presented, based on a 1981-1982 study conducted by EDUCOM. An introduction by John W. McCredie summarizes several current and future effects of the rapid spread and integration of computing and communication…

  17. Computations in Plasma Physics.

    ERIC Educational Resources Information Center

    Cohen, Bruce I.; Killeen, John

    1983-01-01

    Discusses contributions of computers to research in magnetic and inertial-confinement fusion, charged-particle-beam propagation, and space sciences. Considers their use in the design and control of laboratory and spacecraft experiments and in data acquisition, and reviews major plasma computational methods and some of the important physics problems they…

  18. The Economics of Computers.

    ERIC Educational Resources Information Center

    Sharpe, William F.

    A microeconomic theory is applied in this book to computer services and costs and for the benefit of those who are decision-makers in the selection, financing, and use of computers. Subtopics of the theory discussed include value and demand; revenue and profits; time and risk; and costs, inputs, and outputs. Application of the theory is explained…

  19. Computer Aided Art Major.

    ERIC Educational Resources Information Center

    Gibson, Jim

    The Computer Aided Art program offered at Northern State University (Aberdeen, South Dakota) is coordinated with the traditional art major. The program is designed to familiarize students with a wide range of art-related computer hardware and software and their applications, and to prepare students for problem-solving with unfamiliar…

  20. Computational Modeling of Tires

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Tanner, John A. (Compiler)

    1995-01-01

    This document contains presentations and discussions from the joint UVA/NASA Workshop on Computational Modeling of Tires. The workshop attendees represented NASA, the Army and Air Force, tire companies, commercial software developers, and academia. The workshop objectives were to assess the state of technology in the computational modeling of tires and to provide guidelines for future research.

  1. Computer Series, 78.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1986-01-01

    Presents six brief articles dealing with the use of computers in teaching various topics in chemistry. Describes hardware and software applications which relate to protein graphics, computer simulated metabolism, interfaces between microcomputers and measurement devices, courseware available for spectrophotometers, and the calculation of elemental…

  2. Computers, Networks and Education.

    ERIC Educational Resources Information Center

    Kay, Alan C.

    1991-01-01

    Discussed is how globally networked, easy-to-use computers can enhance learning only within an educational environment that encourages students to question "facts" and seek challenges. The strengths and weaknesses of computers used as amplifiers for learning are described. (KR)

  3. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  4. Programming the social computer.

    PubMed

    Robertson, David; Giunchiglia, Fausto

    2013-03-28

    The aim of 'programming the global computer' was identified by Milner and others as one of the grand challenges of computing research. At the time this phrase was coined, it was natural to assume that this objective might be achieved primarily through extending programming and specification languages. The Internet, however, has brought with it a different style of computation that (although harnessing variants of traditional programming languages) operates in a style different to those with which we are familiar. The 'computer' on which we are running these computations is a social computer in the sense that many of the elementary functions of the computations it runs are performed by humans, and successful execution of a program often depends on properties of the human society over which the program operates. These sorts of programs are not programmed in a traditional way and may have to be understood in a way that is different from the traditional view of programming. This shift in perspective raises new challenges for the science of the Web and for computing in general.

  5. Videodisc-Computer Interfaces.

    ERIC Educational Resources Information Center

    Zollman, Dean

    1984-01-01

    Lists microcomputer-videodisc interfaces currently available from 26 sources, including home use systems connected through remote control jack and industrial/educational systems utilizing computer ports and new laser reflective and stylus technology. Information provided includes computer and videodisc type, language, authoring system, educational…

  7. Computer controlled antenna system

    NASA Technical Reports Server (NTRS)

    Raumann, N. A.

    1972-01-01

    The application of small computers using digital techniques to operate the servo and control systems of large antennas is discussed, and the advantages of the approach are described. The techniques were evaluated with a forty-foot antenna and the Sigma V computer. Programs have been completed which drive the antenna directly, without the need for a servo amplifier, antenna position programmer, or scan generator.
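
    The report gives no listing; what follows is a minimal sketch of the idea of the computer closing the antenna position loop itself, where the control interval, loop gain, and commanded angle are all illustrative assumptions:

        dt = 0.05                  # control interval (s)
        gain = 0.8                 # assumed loop gain (1/s)
        target = 30.0              # commanded azimuth (degrees)
        position = 0.0

        for _ in range(200):                   # 10 s of closed-loop operation
            rate = gain * (target - position)  # computer generates the drive rate
            position += rate * dt              # antenna follows the command

        print(f"final azimuth: {position:.2f} deg")  # settles near 30.00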

  8. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell, Ed.

    1988-01-01

    Describes three situations in which computer software was used in a chemistry laboratory. Discusses interfacing voltage output instruments with Apple II computers and using spreadsheet programs to simulate gas chromatography and analysis of kinetic data. Includes information concerning procedures, hardware, and software used in each situation. (CW)
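
    As a rough stand-in for the spreadsheet chromatography simulation the article mentions, the detector trace can be modeled as a sum of Gaussian peaks; the retention times, widths, and amounts below are invented for illustration:

        import numpy as np

        t = np.linspace(0.0, 10.0, 500)               # time axis (min)
        peaks = [(2.5, 0.15, 1.0), (4.0, 0.20, 0.6)]  # (t_R, sigma, amount)
        signal = sum(a * np.exp(-((t - tr) ** 2) / (2 * s ** 2))
                     for tr, s, a in peaks)
        print(f"largest peak elutes near t = {t[np.argmax(signal)]:.2f} min")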

  10. Statistical Mapping by Computer.

    ERIC Educational Resources Information Center

    Utano, Jack J.

    The function of a statistical map is to provide readers with a visual impression of the data so that they may be able to identify any geographic characteristics of the displayed phenomena. The increasingly important role played by the computer in the production of statistical maps is manifested by the varied examples of computer maps in recent…

  11. Computational chemistry at Janssen

    NASA Astrophysics Data System (ADS)

    van Vlijmen, Herman; Desjarlais, Renee L.; Mirzadegan, Tara

    2016-12-01

    Computer-aided drug discovery activities at Janssen are carried out by scientists in the Computational Chemistry group of the Discovery Sciences organization. This perspective gives an overview of the organizational and operational structure, the science, internal and external collaborations, and the impact of the group on Drug Discovery at Janssen.

  12. Quantum Analog Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1998-01-01

    Quantum analog computing is based on the similarity between the mathematical formalism of quantum mechanics and the phenomena to be computed. It exploits the dynamical convergence of several competing phenomena to an attractor, which can represent an extremum of a function, an image, a solution to a system of ODEs, or a stochastic process.
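
    A classical toy version of this convergence-to-an-attractor idea (not the paper's quantum formalism) is gradient flow, whose attractor is an extremum of the function being computed; the quadratic below is an arbitrary example:

        import numpy as np

        def grad_f(x):
            # f(x) = (x0 - 1)^2 + (x1 + 2)^2, so the attractor is (1, -2)
            return np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)])

        x = np.array([5.0, 5.0])         # arbitrary initial state
        for _ in range(1000):
            x -= 0.01 * grad_f(x)        # Euler step along -grad f
        print(x)                         # ~[ 1. -2.]: the computed extremum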

  13. Teaching Using Computer Games

    ERIC Educational Resources Information Center

    Miller, Lee Dee; Shell, Duane; Khandaker, Nobel; Soh, Leen-Kiat

    2011-01-01

    Computer games have long been used for teaching. Current reviews lack categorization and analysis using learning models which would help instructors assess the usefulness of computer games. We divide the use of games into two classes: game playing and game development. We discuss the Input-Process-Outcome (IPO) model for the learning process when…

  14. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effects by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy, and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them, and informs potential customers of their availability.

  15. African Studies Computer Resources.

    ERIC Educational Resources Information Center

    Kuntz, Patricia S.

    African studies computer resources that are readily available in the United States with linkages to Africa are described, highlighting those most directly corresponding to African content. Africanists can use the following four fundamental computer systems: (1) Internet/Bitnet; (2) Fidonet; (3) Usenet; and (4) dial-up bulletin board services. The…

  16. Chippy's Computer Words.

    ERIC Educational Resources Information Center

    Willing, Kathlene R.; Girard, Suzanne

    Intended for young children just becoming familiar with computers, this naming book introduces and reinforces new computer vocabulary and concepts. The 20 words are presented alphabetically, along with illustrations, providing room for different activities in which children can match and name the pictures and words. The 20 vocabulary items are…

  17. Profiling Computing Coordinators.

    ERIC Educational Resources Information Center

    Edwards, Sigrid; Morton, Allan

    The people responsible for managing school computing resources in Australia have become known as Computing Coordinators. To date there has been no large systematic study of the role, responsibilities and characteristics of this position. This paper represents a first attempt to provide information on the functions and attributes of the Computing…

  18. Computer Yearbook 72.

    ERIC Educational Resources Information Center

    1972

    Recent and expected developments in the computer industry are discussed in this 628-page yearbook, successor to "The Punched Card Annual." The first section of the report is an overview of current computer hardware and software and includes articles about future applications of mainframes, an analysis of the software industry, and a summary of the…

  19. Operating efficiency of computers

    NASA Technical Reports Server (NTRS)

    Pac, J.

    1977-01-01

    A method is outlined that can be used to guarantee users of computing systems a measure of operating efficiency: the monthly utilization coefficient should equal or exceed a value agreed on in advance, and the repair time during any computer breakdown should not exceed a period agreed on in advance.
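
    A minimal sketch of the two guarantees expressed as a monthly check; the function name, argument layout, and thresholds are assumptions, not taken from the paper:

        def meets_guarantee(hours_up, hours_scheduled, repair_times,
                            min_utilization=0.95, max_repair_hours=4.0):
            # Monthly utilization coefficient: up time over scheduled time.
            utilization = hours_up / hours_scheduled
            # Every repair must finish within the agreed window.
            repairs_ok = all(r <= max_repair_hours for r in repair_times)
            return utilization >= min_utilization and repairs_ok

        print(meets_guarantee(712.0, 720.0, repair_times=[1.5, 3.0]))  # True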

  20. Computers on Wheels.

    ERIC Educational Resources Information Center

    Rosemead Elementary School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: How does a school provide the computer learning experiences for students given the paucity of available funding for hardware, software, and staffing? Here is what one school, Emma W. Shuey in Rosemead, did after exploratory research on computers by a committee of teachers and administrators. The…