Science.gov

Sample records for 320-row multi-detector computed tomography

  1. Image Quality of Coronary Computed Tomography Angiography with 320-Row Area Detector Computed Tomography in Children with Congenital Heart Disease.

    PubMed

    Tada, Akihiro; Sato, Shuhei; Kanie, Yuichiro; Tanaka, Takashi; Inai, Ryota; Akagi, Noriaki; Morimitsu, Yusuke; Kanazawa, Susumu

    2016-03-01

    The objective of this study was to assess factors affecting image quality of 320-row computed tomography angiography (CTA) of coronary arteries in children with congenital heart disease (CHD). We retrospectively reviewed 28 children up to 3 years of age with CHD who underwent prospective electrocardiography (ECG)-gated 320-row CTA with iterative reconstruction. We assessed image quality of proximal coronary artery segments using a five-point scale. Age, body weight, average heart rate, and heart rate variability were recorded and compared between two groups: patients with good diagnostic image quality in all four coronary artery segments and patients with at least one coronary artery segment with nondiagnostic image quality. Altogether, 96 of 112 segments (85.7%) had diagnostic-quality images. Patients with nondiagnostic segments were significantly younger (10.0 ± 11.6 months) and had lower body weight (5.9 ± 2.9 kg) than patients with diagnostic image quality in all four segments (20.6 ± 13.8 months and 8.4 ± 2.5 kg, respectively; each p < 0.05). Differences in heart rate and heart rate variability between the two groups were not significant. Receiver operating characteristic analyses for predicting patients with nondiagnostic image quality revealed an optimal body weight cutoff of ≤5.6 kg and an optimal age cutoff of ≤12.5 months. Prospective ECG-gated 320-row CTA with iterative reconstruction provided acceptable image quality of the coronary arteries in children with CHD. Younger age and lower body weight were associated with poorer coronary image quality.
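
    The optimal cutoffs above come from a receiver operating characteristic (ROC) analysis. As a hedged illustration only (the paper does not publish its analysis code, and the patient data below are invented), the Youden index is one common way such a cutoff is chosen:

        import numpy as np
        from sklearn.metrics import roc_curve

        # Invented example data: 1 = at least one nondiagnostic segment
        nondiagnostic = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
        weight_kg     = np.array([3.9, 4.8, 5.2, 5.6, 6.5, 7.8, 8.4, 9.1, 10.2, 11.0])

        # Lower weight predicts nondiagnostic quality, so score on the negated weight.
        fpr, tpr, thresholds = roc_curve(nondiagnostic, -weight_kg)
        best = np.argmax(tpr - fpr)   # Youden index J = sensitivity + specificity - 1
        print(f"optimal cutoff: weight <= {-thresholds[best]:.1f} kg")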

  2. Entrance surface dose measurements using a small OSL dosimeter with a computed tomography scanner having 320 rows of detectors.

    PubMed

    Takegami, Kazuki; Hayashi, Hiroaki; Yamada, Kenji; Mihara, Yoshiki; Kimoto, Natsumi; Kanazawa, Yuki; Higashino, Kousaku; Yamashita, Kazuta; Hayashi, Fumio; Okazaki, Tohru; Hashizume, Takuya; Kobayashi, Ikuo

    2017-03-01

    Entrance surface dose (ESD) measurements are important in X-ray computed tomography (CT) examinations, but in clinical settings it is difficult to measure ESDs because of a lack of suitable dosimeters. We focus on the capability of a small optically stimulated luminescence (OSL) dosimeter. The aim of this study is to propose a practical method for using an OSL dosimeter to measure the ESD during a CT examination. The small OSL dosimeter has an outer width of 10 mm; because the slice thickness and helical pitch can be set to various values, it may measure only a partial dose. To verify our method, we used a CT scanner having 320 rows of detectors and checked the consistency of the ESDs measured using OSL dosimeters by comparing them with those measured using Gafchromic™ films. The films were calibrated using an ionization chamber on the basis of half-value layer estimation. The OSL dosimeter, in turn, was calibrated using a practical calibration curve previously proposed by our group. The ESDs measured using the OSL dosimeters were in good agreement with the reference ESDs from the Gafchromic™ films. Using these data, we also estimated the uncertainty of ESDs measured with small OSL dosimeters. We concluded that a small OSL dosimeter is suitable for measuring the ESD with an uncertainty of 30% during CT examinations in which pitch factors below 1.000 are applied.
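
    The group's actual calibration curve is not reproduced in this abstract; the sketch below only illustrates the general workflow of converting an OSL readout to an ESD through a previously measured calibration function (all constants invented):

        # Hypothetical linear calibration, dose_mGy = a * counts + b; in practice the
        # coefficients come from a previously published calibration curve.
        a, b = 4.2e-4, 0.0

        def esd_from_counts(counts: float) -> float:
            """Convert one OSL readout (counts) to an entrance surface dose in mGy."""
            return a * counts + b

        readouts = [52_000, 48_500, 50_700]   # repeated readings of the same dosimeter
        esd = sum(map(esd_from_counts, readouts)) / len(readouts)
        print(f"ESD = {esd:.1f} mGy (the study quotes ~30% uncertainty)")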

  3. Comparison of cerebral blood flow data obtained by computed tomography (CT) perfusion with that obtained by xenon CT using 320-row CT.

    PubMed

    Takahashi, Satoshi; Tanizaki, Yoshio; Kimura, Hiroaki; Akaji, Kazunori; Kano, Tadashige; Suzuki, Kentaro; Takayama, Youhei; Kanzawa, Takao; Shidoh, Satoka; Nakazawa, Masaki; Yoshida, Kazunari; Mihara, Ban

    2015-03-01

    Cerebral blood flow (CBF) data obtained by computed tomography perfusion (CTP) imaging have been shown to be qualitative rather than quantitative, in contrast with data obtained by other imaging methods, such as xenon CT (XeCT). Thus, interpatient comparisons of the CBF values themselves obtained by CTP may be inaccurate. In this study, we compared CBF ratios as well as CBF values obtained from CTP-CBF data with those obtained from XeCT-CBF data for the same patients to determine CTP-CBF parameters that can be used for interpatient comparisons. The data used in the present study were obtained as volume data using 320-row CT. The volume data were applied to automated region-of-interest-determining software (3DSRT, version 3.5.2) and converted to 59 slices of 2 mm interval standardized images. We reviewed 10 patients with occlusive cerebrovascular diseases (CVDs) undergoing both CTP and XeCT in the same period. Ratios of CBF measurements, such as the hemodynamic stress distribution (the perforator-to-cortical flow ratio of the middle cerebral artery [MCA] region) or the left/right ratio for the MCA region, calculated using CTP data correlated well with the same ratios calculated using XeCT data. These results suggest that such CBF ratios could be useful for making interpatient comparisons of CTP-CBF data obtained by 320-row CT among patients with occlusive CVD.
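
    The two ratios named above reduce to simple arithmetic on per-region CBF values. A minimal sketch, assuming hypothetical regional means exported from the 3DSRT software (region names and values invented):

        # Hypothetical regional mean CBF values, mL/100 g/min
        cbf = {
            "mca_cortical_left": 42.0, "mca_cortical_right": 47.0,
            "mca_perforator_left": 38.0, "mca_perforator_right": 44.0,
        }

        # Hemodynamic stress distribution: perforator-to-cortical flow ratio (left MCA)
        hsd_left = cbf["mca_perforator_left"] / cbf["mca_cortical_left"]
        # Left/right ratio for the MCA territory
        lr_mca = cbf["mca_cortical_left"] / cbf["mca_cortical_right"]
        print(f"HSD(left) = {hsd_left:.2f}, L/R(MCA) = {lr_mca:.2f}")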

  4. Diagnostic performance of combined noninvasive coronary angiography and myocardial perfusion imaging using 320 row detector computed tomography: design and implementation of the CORE320 multicenter, multinational diagnostic study.

    PubMed

    Vavere, Andrea L; Simon, Gregory G; George, Richard T; Rochitte, Carlos E; Arai, Andrew E; Miller, Julie M; Di Carli, Marcello; Arbab-Zadeh, Armin; Zadeh, Armin A; Dewey, Marc; Niinuma, Hiroyuki; Laham, Roger; Rybicki, Frank J; Schuijf, Joanne D; Paul, Narinder; Hoe, John; Kuribyashi, Sachio; Sakuma, Hajime; Nomura, Cesar; Yaw, Tan Swee; Kofoed, Klaus F; Yoshioka, Kunihiro; Clouse, Melvin E; Brinker, Jeffrey; Cox, Christopher; Lima, Joao A C

    2011-01-01

    Multidetector coronary computed tomography angiography (CTA) is a promising modality for widespread clinical application because of its noninvasive nature and high diagnostic accuracy, as found in previous studies using 64 to 320 simultaneous detector rows. It is, however, limited in its ability to detect myocardial ischemia. In this article, we describe the design of the CORE320 study ("Combined coronary atherosclerosis and myocardial perfusion evaluation using 320 detector row computed tomography"). This prospective, multicenter, multinational study is unique in that it is designed to assess the diagnostic performance of combined 320-row CTA and myocardial CT perfusion imaging (CTP) in comparison with the combination of invasive coronary angiography and single-photon emission computed tomography myocardial perfusion imaging (SPECT-MPI). The trial is being performed at 16 medical centers located in 8 countries worldwide. CT has the potential to assess both anatomy and physiology in a single imaging session. The co-primary aim of the CORE320 study is to define the per-patient diagnostic accuracy of the combination of coronary CTA and myocardial CTP to detect physiologically significant coronary artery disease (CAD) compared with (1) the combination of conventional coronary angiography and SPECT-MPI and (2) conventional coronary angiography alone. If successful, the technology could revolutionize the management of patients with symptomatic CAD.
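
    Per-patient diagnostic accuracy against a reference standard boils down to a 2x2 table. A generic sketch, not CORE320's statistical analysis plan (the per-patient labels below are invented):

        # 1 = significant CAD by the reference (invasive angiography + SPECT-MPI)
        reference = [1, 0, 1, 1, 0, 0, 1, 0]
        cta_ctp   = [1, 0, 1, 0, 0, 1, 1, 0]   # per-patient combined CTA + CTP call

        tp = sum(1 for r, t in zip(reference, cta_ctp) if r and t)
        tn = sum(1 for r, t in zip(reference, cta_ctp) if not r and not t)
        fp = sum(1 for r, t in zip(reference, cta_ctp) if not r and t)
        fn = sum(1 for r, t in zip(reference, cta_ctp) if r and not t)
        print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")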

  5. Australian diagnostic reference levels for multi detector computed tomography.

    PubMed

    Hayton, Anna; Wallace, Anthony; Marks, Paul; Edmonds, Keith; Tingey, David; Johnston, Peter

    2013-03-01

    The Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) is undertaking web-based surveys to obtain data to establish national diagnostic reference levels (DRLs) for diagnostic imaging. The first set of DRLs to be established are for multi detector computed tomography (MDCT). The survey samples MDCT dosimetry metrics, dose length product (DLP, mGy·cm) and volume computed tomography dose index (CTDIvol, mGy), for six common protocols/habitus: Head, Neck, Chest, AbdoPelvis, ChestAbdoPelvis and Lumbar Spine, from individual radiology clinics and platforms. A practice reference level (PRL) for a given platform and protocol is calculated from a compliant survey containing data collected from at least ten patients. The PRL is defined as the median of the DLP/CTDIvol values for a single compliant survey. Australian national DRLs are defined as the 75th percentile of the distribution of the PRLs for each protocol and age group. Australian national DRLs for adult MDCT have been determined in terms of DLP and CTDIvol. In terms of DLP, the national DRLs are 1,000, 600, 450, 700, 1,200 and 900 mGy·cm for the Head, Neck, Chest, AbdoPelvis, ChestAbdoPelvis and Lumbar Spine protocols, respectively. Average dose values obtained from the European survey Dose Datamed I reveal Australian doses to be higher by comparison for four of the six protocols. The survey is ongoing, allowing practices to optimise dose delivery as well as allowing the periodic update of DRLs to reflect changes in technology and technique.
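
    The PRL/DRL definitions above map directly onto two order statistics. A minimal sketch with invented DLP numbers:

        import numpy as np

        def practice_reference_level(dlp_values):
            """Median DLP for one platform/protocol; compliant only with >= 10 patients."""
            if len(dlp_values) < 10:
                raise ValueError("survey not compliant: fewer than ten patients")
            return float(np.median(dlp_values))

        # Hypothetical compliant-survey PRLs (mGy·cm) from many practices, Head protocol
        prls = [880, 940, 1020, 760, 990, 1100, 850, 930, 1010, 970]
        national_drl = np.percentile(prls, 75)
        print(f"national DRL (Head) = {national_drl:.0f} mGy·cm")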

  6. 320-row CT renal perfusion imaging in patients with aortic dissection: A preliminary study

    PubMed Central

    Liu, Dongting; Liu, Jiayi; Wen, Zhaoying; Li, Yu; Sun, Zhonghua; Xu, Qin; Fan, Zhanming

    2017-01-01

    Objective To investigate the clinical value of renal perfusion imaging in patients with aortic dissection (AD) using 320-row computed tomography (CT), and to determine the relationship between renal CT perfusion imaging and various features of aortic dissection. Methods Forty-three patients with AD who underwent 320-row CT renal perfusion before operation were prospectively enrolled in this study. Diagnosis of AD was confirmed by transthoracic echocardiography. Blood flow (BF) of bilateral renal perfusion was measured and analyzed. CT perfusion imaging signs of AD in relation to the type of AD, the number of entry tears and false lumen thrombus were observed and compared. Results The BF values of patients with type A AD were significantly lower than those of patients with type B AD (P = 0.004). No significant difference was found in BF between different numbers of intimal tears (P = 0.288), but BF values were significantly higher in cases with a non-thrombosed false lumen and renal arteries arising from the true lumen than in those with thrombus (P = 0.036). The BF values of the true lumen, false lumen and overriding groups differed (P = 0.02), with the true lumen group having the highest values. The difference in BF values between the true lumen and false lumen groups was statistically significant (P = 0.016), while no statistical significance was found for the other two comparisons (P > 0.05). The larger the intimal entry tears, the greater the BF values (P = 0.044). Conclusions This study shows a direct relationship between renal CT perfusion changes and AD: the size and number of intimal tears, the type of AD, the renal artery origin and false lumen thrombosis significantly affected the perfusion values. PMID:28182709

  7. Identification of Cardiac and Aortic Injuries in Trauma with Multi-detector Computed Tomography

    PubMed Central

    Shergill, Arvind K; Maraj, Tishan; Barszczyk, Mark S; Cheung, Helen; Singh, Navneet; Zavodni, Anna E

    2015-01-01

    Blunt and penetrating cardiovascular (CV) injuries are associated with a high morbidity and mortality. Rapid detection of these injuries in trauma is critical for patient survival. The advent of multi-detector computed tomography (MDCT) has led to increased detection of CV injuries during rapid comprehensive scanning of stabilized major trauma patients. MDCT has the ability to acquire images with a higher temporal and spatial resolution, as well as the capability to create multiplanar reformats. This pictorial review illustrates several common and life-threatening traumatic CV injuries from a regional trauma center. PMID:26430541

  8. Derivation of Australian diagnostic reference levels for paediatric multi detector computed tomography.

    PubMed

    Hayton, Anna; Wallace, Anthony

    2016-09-01

    Australian national diagnostic reference levels (DRLs) for paediatric multi detector computed tomography (MDCT) were established for three protocols, Head, Chest and AbdoPelvis, across two age groups, Baby/Infant 0-4 years and Child 5-14 years, by the Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) in 2012. The establishment of Australian paediatric DRLs is an important step towards lowering patient CT doses on a national scale. While the adult DRLs were calculated with data collected from the web-based Australian National Diagnostic Reference Level Service (NDRLS), no paediatric data were submitted in the first year of service operation. Data from an independent Royal Australian and New Zealand College of Radiologists (RANZCR) Quality Use of Diagnostic Imaging paediatric optimisation survey were used instead. The paediatric DRLs were defined for CTDIvol (mGy) and DLP (mGy·cm) values referencing the 16 cm PMMA phantom for the Head protocol and the 32 cm PMMA phantom for body protocols, for both paediatric age groups. In terms of DLP, the Australian paediatric DRLs for MDCT are, for the Head, Chest and AbdoPelvis protocols respectively, 470, 60 and 170 mGy·cm for the Baby/Infant age group and 600, 110 and 390 mGy·cm for the Child age group. A comparison with published international paediatric DRLs for computed tomography reveals the Australian paediatric DRLs to be lower on average, although the comparison is complicated by misalignment of the defined age ranges. It is the intention of ARPANSA to review the paediatric DRLs in conjunction with a review of the adult DRLs, which should occur within 5 years of their publication.

  9. Colonic perforation by a transmural and transvalvular migrated retained sponge: multi-detector computed tomography findings.

    PubMed

    Camera, Luigi; Sagnelli, Marco; Guadagno, Paolo; Mainenti, Pier Paolo; Marra, Teresa; Scotto di Santolo, Maria; Fei, Landino; Salvatore, Marco

    2014-04-21

    Transmural migrated retained sponges usually impact at the level of the ileo-cecal valve leading to a small bowel obstruction. Once passed through the ileo-cecal valve, a retained sponge can be propelled forward by peristaltic activity and eliminated with feces. We report the case of a 52-year-old female with a past surgical history and recurrent episodes of abdominal pain and constipation. On physical examination, a generalized resistance was observed with tenderness in the right flank. Contrast-enhanced multi-detector computed tomography findings were consistent with a perforated right colonic diverticulitis with several out-pouchings at the level of the ascending colon and evidence of free air in the right parieto-colic gutter along with an air-fluid collection within the mesentery. In addition, a ring-shaped hyperdense intraluminal material was also noted. At surgery, the ascending colon appeared irregularly thickened and folded with a focal wall interruption and a peri-visceral abscess at the level of the hepatic flexure, but no diverticula were found. A right hemi-colectomy was performed and on dissection of the surgical specimen a retained laparotomy sponge was found in the bowel lumen.

  10. Managing patient dose in multi-detector computed tomography (MDCT). ICRP Publication 102.

    PubMed

    Valentin, J

    2007-01-01

    Computed tomography (CT) technology has changed considerably in recent years with the introduction of increasing numbers of multiple detector arrays. There are several parameters specific to multi-detector computed tomography (MDCT) scanners that increase or decrease patient dose systematically compared with older single detector computed tomography (SDCT) scanners. This document briefly reviews MDCT technology and radiation dose in MDCT, including differences from SDCT, factors that affect dose, radiation risks, and the responsibilities for patient dose management. The document recommends that users understand the relationship between patient dose and image quality and be aware that image quality in CT is often higher than necessary for diagnostic confidence. Automatic exposure control (AEC) does not totally free the operator from selection of scan parameters, and awareness of individual systems is important. Scanning protocols cannot simply be transferred between scanners from different manufacturers and should be determined for each MDCT. If the image quality is appropriately specified by the user, and suited to the clinical task, there will be a reduction in patient dose for most patients. Understanding of some parameters is not intuitive, and the selection of image quality parameter values in AEC systems is not straightforward. Examples of some clinical situations have been included to demonstrate dose management, e.g. CT examinations of the chest, the heart for coronary calcium quantification and non-invasive coronary angiography, colonography, the urinary tract, children, pregnant patients, trauma cases, and CT-guided interventions. CT is increasingly being used to replace conventional X-ray studies, and it is important that patient dose is given careful consideration, particularly with repeated or multiple examinations.

  11. Quadricuspid pulmonary valve in an adult patient identified by transthoracic echocardiography and multi-detector computed tomography.

    PubMed

    Jung, Soo-Yeon

    2015-01-01

    Quadricuspid pulmonary valve is a rare congenital heart disease. It is infrequently associated with significant clinical complications and tends to be clinically silent. Because of its benign nature, it has been diagnosed mainly post mortem. Its diagnosis by transthoracic echocardiography is very difficult because of the anatomical features. We describe a case of quadricuspid pulmonary valve diagnosed by transthoracic echocardiography and electrocardiography-gated multi-detector row computed tomography.

  12. A Study of Internal Thoracic Arteriovenous Principal Perforators by Using Multi-detector Row Computed Tomography Angiography

    PubMed Central

    Hashikawa, Kazunobu; Sakakibara, Shunsuke; Onishi, Hiroyuki; Terashi, Hiroto

    2016-01-01

    Objective: There are numerous reports of perforating branches from the intercostal spaces of the internal thoracic vessels. These branches have varying diameters, and a main perforating branch, the principal perforator, is most often found in the second or third intercostal space. We report different results based on multi-detector row computed tomography. Methods: We evaluated 121 sides from 70 women scheduled for breast reconstruction with free lower abdominal skin flaps who underwent preoperative multi-detector row computed tomographic scanning between June 2008 and June 2015. For primary reconstruction, we analyzed both sides, and for 1-sided secondary reconstruction, we analyzed only the unaffected side. We evaluated both early arterial phase and late venous phase 5-mm horizontal, cross-sectional, and volume-rendering images for perforation sites and for the thickness of the internal thoracic arteriovenous perforating branches in each intercostal space. We analyzed differences in thickness between the internal thoracic arteries and veins and symmetry in cases involving both sides. Results: Venous principal perforators nearly always perforated the same intercostal spaces as the accompanying veins of arterial principal perforators (99.2%), forming arteriovenous principal perforators. We found 49 principal perforators in the first intercostal space (37.4%), 52 in the second intercostal space (39.7%), 23 in the third intercostal space (17.6%), 6 in the fourth intercostal space (4.6%), and 1 in the fifth intercostal space (0.7%). Of the 51 cases in which we studied both sides, 25 cases (49%) had principal perforators with bilateral symmetry. Conclusions: In contrast to findings from past reports, we found that internal thoracic arteriovenous principal perforators were present in almost equal numbers in the first and second intercostal spaces. PMID:26958104

  13. Time Efficiency and Diagnostic Accuracy of New Automated Myocardial Perfusion Analysis Software in 320-Row CT Cardiac Imaging

    PubMed Central

    Rief, Matthias; Stenzel, Fabian; Kranz, Anisha; Schlattmann, Peter

    2013-01-01

    Objective We aimed to evaluate the time efficiency and diagnostic accuracy of automated myocardial computed tomography perfusion (CTP) image analysis software. Materials and Methods 320-row CTP was performed in 30 patients, and analyses were conducted independently by three blinded readers using two recent software releases (version 4.6 and the novel version 4.71GR001, Toshiba, Tokyo, Japan). Analysis times were compared, and automated epi- and endocardial contour detection was subjectively rated in five categories (excellent, good, fair, poor and very poor). As semi-quantitative perfusion parameters, myocardial attenuation and the transmural perfusion ratio (TPR) were calculated for each myocardial segment, and agreement was tested using the intraclass correlation coefficient (ICC). Conventional coronary angiography served as the reference standard. Results The analysis time was significantly reduced with the novel automated software version as compared with the former release (Reader 1: 43:08 ± 11:39 min vs. 09:47 ± 04:51 min; Reader 2: 42:07 ± 06:44 min vs. 09:42 ± 02:50 min; Reader 3: 21:38 ± 3:44 min vs. 07:34 ± 02:12 min; p < 0.001 for all). Epi- and endocardial contour detection with the novel software was rated significantly better (p < 0.001) than with the former software. ICCs demonstrated strong agreement (≥ 0.75) for myocardial attenuation in 93% and for TPR in 82% of segments. Diagnostic accuracy of the two software versions, as compared with conventional coronary angiography, was not significantly different (p = 0.169). Conclusion The novel automated CTP analysis software offers enhanced time efficiency, with an improvement by a factor of about four, while maintaining diagnostic accuracy. PMID:23323027
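
    The TPR named above is commonly computed as subendocardial attenuation normalized to the mean subepicardial attenuation; the vendor software's exact definition may differ. A minimal sketch with invented attenuation values:

        import numpy as np

        # Hypothetical mean attenuation (HU) per myocardial segment
        endo = np.array([95.0, 88.0, 72.0, 91.0])   # subendocardial layer
        epi  = np.array([90.0, 86.0, 89.0, 92.0])   # subepicardial layer

        tpr = endo / epi.mean()          # transmural perfusion ratio per segment
        print("TPR:", np.round(tpr, 2))  # markedly low values suggest hypoperfusion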

  14. Evaluation of the pelvic apophysis with multi-detector computed tomography for legal age estimation in living individuals

    PubMed Central

    Karami, Mehdi; Rabiei, Meisam; Riahinezhad, Maryam

    2015-01-01

    Background: Legal age estimation of living individuals is gaining increasing importance for radiologists involved in delivering expert opinions. The present study aimed to assess the correlation between chronological age and the distance of the apophyseal centers from the pelvic bone. Materials and Methods: This was a cross-sectional study carried out in 2013. Subjects were chosen from 15- to 25-year-old people who had undergone pelvic multi-detector computed tomography for any reason. The distance of the iliac crest apophysis to the iliac bone and of the pubic apophysis to the pubic bone was assessed. Results: There was an inverse linear correlation between chronological age and the distance of the iliac crest apophysis (P < 0.001, r = 0.899) and of the pubic apophysis (P < 0.001, r = 0.898) to the pelvic bone. The pubic apophysis had not appeared in any subject younger than 16 years and was present in all subjects aged 18 years and older. Subjects aged 21 years had near-complete ossification of the iliac or pubic apophysis, and subjects aged 24 years had full ossification. Conclusion: Skeletal age can be estimated by assessing the distance of the apophyseal centers from the pelvic bone in adolescents 15-25 years old. PMID:26109964
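
    An inverse linear correlation of this strength supports a simple regression-based age estimate. A sketch with invented distance/age pairs (not the study's data):

        import numpy as np

        # Invented pairs: apophysis-to-bone distance (mm) vs chronological age (years)
        distance_mm = np.array([6.0, 4.5, 3.1, 2.0, 1.2, 0.4, 0.0])
        age_years   = np.array([15.0, 16.0, 18.0, 20.0, 21.0, 23.0, 25.0])

        slope, intercept = np.polyfit(distance_mm, age_years, 1)  # inverse: slope < 0
        print(f"estimated age at 2.5 mm: {slope * 2.5 + intercept:.1f} years")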

  15. Criteria for establishing shielding of multi-detector computed tomography (MDCT) rooms.

    PubMed

    Verdun, F R; Aroua, A; Baechler, S; Schmidt, S; Trueb, P R; Bochud, F O

    2010-01-01

    The aim of this work is to compare two methods used for determining the proper shielding of computed tomography (CT) rooms, taking into account recent technological advances in CT scanners. The approaches of the German Institute for Standardisation and the US National Council on Radiation Protection and Measurements were compared, and a series of radiation measurements were performed in several CT rooms at the Lausanne University Hospital. The following three-step procedure is proposed for assuring sufficient shielding of rooms hosting new CT units with spiral mode acquisition and various X-ray beam collimation widths: (1) calculate the ambient equivalent dose for a representative average weekly dose length product at the position where shielding is required; (2) from the maximum permissible weekly dose at the location of interest, calculate the transmission factor F required to ensure proper shielding; and (3) convert the transmission factor into a thickness of lead shielding. A similar approach could be adopted when designing shielding for fluoroscopy rooms, where the basic quantity would be the dose-area product instead of the tube current load (milliampere-minutes).
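
    A hedged numeric sketch of the three-step procedure (every constant below is invented, and the final step uses a crude single-exponential attenuation model; real designs rely on published broad-beam transmission curves):

        import math

        # Step 1: weekly ambient equivalent dose at the point of interest.
        weekly_dlp = 200 * 1200.0   # hypothetical: 200 scans/week x 1200 mGy.cm each
        k_scatter  = 3e-4           # assumed scatter dose at 1 m, uSv per mGy.cm
        d = 1.5                     # distance from isocentre, m
        weekly_dose = k_scatter * weekly_dlp / d**2     # uSv/week, unshielded

        # Step 2: required transmission factor F.
        limit = 20.0                # design target at this location, uSv/week
        F = min(1.0, limit / weekly_dose)

        # Step 3: convert F to a lead thickness via an assumed half-value layer.
        hvl_pb_mm = 0.25            # assumed HVL of lead for CT scatter
        t = -hvl_pb_mm * math.log2(F) if F < 1.0 else 0.0
        print(f"F = {F:.2f} -> about {t:.2f} mm of lead")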

  16. Contribution of diffusion weighted MRI to diagnosis and staging in gastric tumors and comparison with multi-detector computed tomography

    PubMed Central

    Fatih Özbay, Mehmet; Çallı, İskan; Doğan, Erkan; Çelik, Sebahattin; Batur, Abdussamet; Bora, Aydın; Yavuz, Alpaslan; Bulut, Mehmet Deniz; Özgökçe, Mesut; Çetin Kotan, Mehmet

    2017-01-01

    Abstract Background The diagnostic performance of diffusion-weighted magnetic resonance imaging (DWI) and multi-detector computed tomography (MDCT) for TNM (tumor, lymph node, metastasis) staging of gastric cancer was compared. Patients and methods We used an axial T2-weighted imaging and DWI (b = 0, 400 and 800 s/mm2) protocol on 51 pre-operative patients who had been diagnosed with gastric cancer. We also conducted MDCT examinations on them. We looked for a signal increase in the series of DWI images. The depth of tumor invasion in the stomach wall (tumor (T) staging), the involvement of lymph nodes (nodal (N) staging), and the presence or absence of metastases (metastatic staging) in DWI and CT images were evaluated according to the TNM staging system. For each tumor diagnosis, the sensitivity, specificity, and positive and negative accuracy rates of the DWI and MDCT examinations were determined through comparison with the results of surgical pathology, the gold standard method. Kappa statistics were used to assess the agreement of each examination with surgical pathology. Results Sensitivity and specificity of DWI and MDCT in lymph node staging were as follows: N1: DWI: 75.0%, 84.6%; MDCT: 66.7%, 82%; N2: DWI: 79.3%, 77.3%; MDCT: 69.0%, 68.2%; N3: DWI: 60.0%, 97.6%; MDCT: 50.0%, 90.2%. DWI appeared more compatible with the gold standard, especially in lymph node staging, than MDCT. In T staging, the results of both DWI and MDCT agreed better with the gold standard as the T stage increased; however, DWI did not demonstrate superiority over MDCT. The sensitivity and specificity of both imaging techniques for detecting distant metastasis were 100%. Conclusions The diagnostic accuracy of DWI for TNM staging of gastric cancer before surgery is at a comparable level with MDCT, and adding DWI to the routine protocol for evaluating lymph node metastasis might increase diagnostic accuracy.

  17. 320-Row Detector Dynamic 4D-CTA for the Assessment of Brain and Spinal Cord Vascular Shunting Malformations. A Technical Note.

    PubMed

    D'Orazio, Federico; Splendiani, Alessandra; Gallucci, Massimo

    2014-12-01

    Shunting vascular malformations of the brain and spinal cord are traditionally studied using digital subtraction angiography (DSA), the current gold standard imaging method, routinely used because of its favourable combination of spatial and temporal resolution. Because DSA is relatively expensive and time-consuming and carries a risk of silent embolic events and a small risk of transient or permanent neurologic deterioration, a non-invasive alternative angiographic method is of interest. New 320-row detector CT scanners allow volumetric imaging of the whole brain with temporal resolution up to ≈ 3 Hz. These characteristics make computed tomography angiography (CTA) an affordable imaging method for studying the haemodynamics of the whole brain, and it can also be applied to limited portions of the spinal cord. The aim of this paper is to summarize our experience in studying shunting vascular malformations of the brain and spinal cord using dynamic 4D-CTA, explaining the technical details of the studies performed at our institution and the major advantages and drawbacks of this new technique. We found that dynamic 4D-CTA is able to depict the main architectural characteristics of previously untreated vascular shunting malformations in both brain and spinal cord (i.e. their main arterial feeders and draining veins), allowing their correct diagnosis and exhaustive classification and limiting the use of DSA to therapeutic purposes.

  18. Surgically Cured, Relapsed Pneumococcal Meningitis Due to Bone Defects, Non-invasively Identified by Three-dimensional Multi-detector Computed Tomography

    PubMed Central

    Akimoto, Takayoshi; Morita, Akihiko; Shiobara, Keiji; Hara, Makoto; Minami, Masayuki; Shijo, Katsunori; Nomura, Yasuyuki; Shigihara, Shuntaro; Haradome, Hiroki; Abe, Osamu; Kamei, Satoshi

    2016-01-01

    A 43-year-old Japanese man presented with a history of bacterial meningitis (BM). He was admitted to our department with a one-day history of headache and was diagnosed with relapse of BM based on the cerebrospinal fluid findings. The conventional imaging studies showed serial findings suggesting left otitis media, a temporal cephalocele, and meningitis. Three-dimensional multi-detector computed tomography (3D-MDCT) showed left petrous bone defects caused by the otitis media, and curative surgical treatment was performed. Skull bone structural abnormalities should be considered a cause of relapsed BM. 3D-MDCT was useful for revealing the causal minimal bone abnormality and performing pre-surgical mapping. PMID:27980270

  19. Validation of the Australian diagnostic reference levels for paediatric multi detector computed tomography: a comparison of RANZCR QUDI data and subsequent NDRLS data from 2012 to 2015.

    PubMed

    Anna, Hayton; Wallace, Anthony; Thomas, Peter

    2017-03-01

    The national diagnostic reference level service (NDRLS) was launched in 2011; however, no paediatric data were submitted during the first calendar year of operation. As such, Australian national diagnostic reference levels (DRLs) for paediatric multi detector computed tomography (MDCT) were established using data obtained from a Royal Australian and New Zealand College of Radiologists (RANZCR) Quality Use of Diagnostic Imaging (QUDI) study. Paediatric data were submitted to the NDRLS in 2012 through 2015. The NDRLS paediatric data have been analysed using the same method as was used to analyse the QUDI data when establishing the Australian national paediatric DRLs for MDCT. The paediatric NDRLS data have also been analysed using the method used to calculate the Australian national adult DRLs for MDCT. A comparison between the QUDI data and the subsequent NDRLS data shows the NDRLS data to be lower on average for the Head and AbdoPelvis protocols and similar for the Chest protocol. Using an average of NDRLS data submitted between 2012 and 2015, implications for updated paediatric DRLs are considered.

  20. Non-invasive coronary angiography with multi-detector computed tomography: comparison to conventional X-ray angiography.

    PubMed

    Schoenhagen, Paul; Stillman, Arthur E; Halliburton, Sandy S; Kuzmiak, Stacie A; Painter, Tracy; White, Richard D

    2005-02-01

    Selective coronary angiography introduced clinical coronary imaging in the late 1950s. The angiographic identification of high-grade coronary lesions in patients with acute and chronic symptomatic coronary artery disease (CAD) led to the development of surgical and percutaneous coronary revascularization. However, the fact that CAD remains the major cause of death in North America and Europe demonstrates the need for novel, complementary diagnostic strategies. These are driven by the need to characterize both increasingly advanced disease stages and early, asymptomatic disease development. Complex revascularization techniques for patients with advanced disease stages will create a growing demand for 3-dimensional coronary imaging and for integration of imaging modalities with new mechanical therapeutic devices. An emerging focus is atherosclerosis imaging, with the goal of identifying subclinical disease stages as the basis for pharmacological intervention aimed at disease stabilization or reversal. Non-invasive coronary imaging with coronary multidetector computed tomographic angiography (MDCTA) allows assessment of both luminal stenosis and subclinical disease of the arterial wall. Its complementary role in the assessment of early and advanced stages of CAD is increasingly recognized.

  21. Validation of multi-detector computed tomography as a non-invasive method for measuring ovarian volume in macaques (Macaca fascicularis).

    PubMed

    Jones, Jeryl C; Appt, Susan E; Werre, Stephen R; Tan, Joshua C; Kaplan, Jay R

    2010-06-01

    The purpose of this study was to validate low radiation dose, contrast-enhanced, multi-detector computed tomography (MDCT) as a non-invasive method for measuring ovarian volume in macaques. Computed tomography scans of four known-volume phantoms and nine mature female cynomolgus macaques were acquired using a previously described low radiation dose scanning protocol, intravenous contrast enhancement, and a 32-slice MDCT scanner. Immediately following MDCT, the ovaries were surgically removed and weighed. The ovarian volumes were determined using water displacement. A veterinary radiologist who was unaware of the actual volumes measured ovarian CT volumes three times, using a laptop computer, pen display tablet, hand-traced regions of interest, and free image analysis software. A statistician selected and performed all tests comparing the actual and CT data. Ovaries were successfully located in all MDCT scans. The iliac arteries and veins, uterus, fallopian tubes, cervix, ureters, urinary bladder, rectum, and colon were also consistently visualized. Large antral follicles were detected in six ovaries. Phantom mean CT volume was 0.702 ± 0.504 cc and mean actual volume was 0.743 ± 0.526 cc. Ovary mean CT volume was 0.258 ± 0.159 cc and mean water displacement volume was 0.257 ± 0.145 cc. For phantoms, the mean coefficient of variation for CT volumes was 2.5%. For ovaries, the least squares mean coefficient of variation for CT volumes was 5.4%. The ovarian CT volume was significantly associated with the actual ovarian volume (ICC coefficient 0.79, regression coefficient 0.5, P = 0.0006) and the actual ovarian weight (ICC coefficient 0.62, regression coefficient 0.6, P = 0.015). There was no association between CT volume accuracy and mean ovarian CT density (degree of intravenous contrast enhancement), and there was no proportional or fixed bias in the CT volume measurements. Findings from this study indicate that MDCT is a valid non-invasive method for measuring ovarian volume in macaques.
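
    The repeatability statistic quoted above, a coefficient of variation across the radiologist's three repeated measurements, is straightforward to reproduce. A sketch with invented readings:

        import numpy as np

        # Invented: three repeated CT volume measurements of one ovary, in cc
        repeats = np.array([0.251, 0.262, 0.259])
        cv = repeats.std(ddof=1) / repeats.mean() * 100   # sample SD / mean, as %
        print(f"CV = {cv:.1f}%  (the study reports a least squares mean CV of 5.4%)")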

  22. Magnetic resonance imaging and multi-detector computed tomography assessment of extracellular compartment in ischemic and non-ischemic myocardial pathologies

    PubMed Central

    Saeed, Maythem; Hetts, Steven W; Jablonowski, Robert; Wilson, Mark W

    2014-01-01

    Myocardial pathologies are major causes of morbidity and mortality worldwide. Early detection of loss of cellular integrity and expansion of the extracellular volume (ECV) in myocardium is critical to initiating effective treatment. The three compartments in healthy myocardium are: intravascular (approximately 10% of tissue volume), interstitium (approximately 15%) and intracellular (approximately 75%). Myocardial cells, fibroblasts and vascular endothelial/smooth muscle cells make up the intracellular compartment, and the main proteins in the interstitium are types I/III collagen. Microscopic studies have shown that expansion of ECV is an important feature of diffuse physiologic fibrosis (e.g., aging and obesity) and of pathologic fibrosis [heart failure, aortic valve disease, hypertrophic cardiomyopathy, myocarditis, dilated cardiomyopathy, amyloidosis, congenital heart disease, aortic stenosis, restrictive cardiomyopathy (hypereosinophilic and idiopathic types), arrhythmogenic right ventricular dysplasia and hypertension]. This review addresses recent advances in the measurement of ECV in ischemic and non-ischemic myocardial pathologies. Magnetic resonance imaging (MRI) has the ability to characterize tissue proton relaxation times (T1, T2, and T2*), which reflect the physical and chemical environments of water protons in myocardium. Delayed contrast-enhanced MRI (DE-MRI) and multi-detector computed tomography (DE-MDCT) demonstrate a hyper-enhanced infarct, a hypo-enhanced microvascular obstruction zone and a moderately enhanced peri-infarct zone, but are limited for visualizing diffuse fibrosis and patchy microinfarct despite the increase in ECV. ECV can be measured by equilibrium contrast-enhanced MRI/MDCT and MRI longitudinal relaxation time mapping. Equilibrium contrast-enhanced MRI/MDCT and MRI T1 mapping are currently used, but at a lower scale, as an alternative to invasive sub-endomyocardial biopsies to eliminate the need for anesthesia, coronary

  23. Relationship between routine multi-detector cardiac computed tomographic angiography prior to reoperative cardiac surgery, length of stay, and hospital charges.

    PubMed

    Goldstein, Matthew A; Roy, Sion K; Hebsur, Shinivas; Maluenda, Gabriel; Weissman, Gaby; Weigold, Guy; Landsman, Marc J; Hill, Peter C; Pita, Francisco; Corso, Paul J; Boyce, Steven W; Pichard, Augusto D; Waksman, Ron; Taylor, Allen J

    2013-03-01

    While multi-detector cardiac computed tomography angiography (MDCCTA) prior to reoperative cardiac surgery (RCS) has been associated with improved clinical outcomes, its impact on hospital charges and length of stay remains unclear. We studied 364 patients undergoing RCS at Washington Hospital Center between 2004 and 2008, including 137 clinically referred for MDCCTA. Baseline demographics, procedural data, and perioperative outcomes were recorded at the time of the procedure. The primary clinical endpoint was the composite of perioperative death, myocardial infarction (MI), stroke, and hemorrhage-related reoperation. Secondary clinical endpoints included surgical procedural variables and the perioperative volume of bleeding and transfusion. Length of stay was determined using the hospital's electronic medical record. Cost data were extracted from the hospital's billing summary. Analysis was performed on individual categories of care, as well as on total hospital charges. Data were compared between subjects with and without MDCCTA, after adjustment for the Society of Thoracic Surgeons score. Baseline characteristics were similar between the two groups. MDCCTA was associated with shorter procedural times, shorter intensive care unit stays, fewer blood transfusions, and less frequent perioperative MI. There was additionally a trend towards a lower incidence of the primary endpoint (17.5% vs. 24.2%, p = 0.13), primarily due to a lower incidence of perioperative MI (0% vs. 5.7%, p = 0.002). MDCCTA was also associated with lower median recovery room [$1,325 (1,250-3,302) vs. $3,217 (1,325-5,353), p < 0.001] and nursing charges [$6,335 (3,623-10,478) vs. $6,916 (3,915-14,499), p = 0.03], although operating room charges were higher [$24,100 (22,300-29,700) vs. $23,500 (19,900-27,700), p < 0.05]. Median total charges [$127,000 (95,000-188,000) vs. $123,000 (86,800-226,000), p = 0.77] and length of stay [9 days (6-19) vs. 11 days (7-19), p = 0.21] were similar. Means analysis

  24. CT venography after knee replacement surgery: comparison of dual-energy CT-based monochromatic imaging and single-energy metal artifact reduction techniques on a 320-row CT scanner

    PubMed Central

    Utsunomiya, Daisuke; Oda, Seitaro; Nakaura, Takeshi; Funama, Yoshinori; Yuki, Hideaki; Hirata, Kenichiro; Hatemura, Masahiro; Namimoto, Tomohiro; Yamashita, Yasuyuki

    2017-01-01

    Background An optimal metal artifact reduction (MAR) technique is needed for reliable and accurate image-based diagnosis. Purpose Using a 320-row scanner, we compared the dual-energy computed tomography (CT)-based monochromatic and the single-energy metal artifact reduction (SEMAR) techniques for CT venography (CTV) to identify the better imaging method for diagnosing deep vein thrombosis (DVT) in patients who had undergone knee replacement surgery. Material and Methods Twenty-three consecutive patients with suspected DVT after unilateral knee replacement surgery underwent dual-energy CT (135/80 kVp). Monochromatic images of 35–135 keV were generated, and the monochromatic images with the best signal-to-noise ratio (SNR) of the popliteal vein near the metal prosthesis were selected. The projection data of 80 kVp were reconstructed using the MAR algorithm. The mean SNR on MAR images and the best SNR on monochromatic images were compared. Two radiologists evaluated visualization of the metal artifacts on a four-point scale where 1 = extensive artifacts, 2 = strong artifacts, 3 = mild artifacts, and 4 = minimal artifacts. Results The mean SNR was significantly higher on the MAR images than on the monochromatic images (12.8 ± 4.7 versus 7.7 ± 5.1, P < 0.01), and the visual scores were significantly higher for MAR than for monochromatic images (2.6 ± 0.8 versus 1.3 ± 0.4, P < 0.01). Conclusion For CTV after knee replacement surgery, the MAR technique is superior to the monochromatic imaging technique. PMID:28321330

  25. Optic Strut and Para-clinoid Region – Assessment by Multi-detector Computed Tomography with Multiplanar and 3 Dimensional Reconstructions

    PubMed Central

    Ravikiran, S.R.; Kumar, Ashvini; Chavadi, Channabasappa; Pulastya, Sanyal

    2015-01-01

    Purpose To evaluate the thickness, location and orientation of the optic strut and anterior clinoid process, and variations in the paraclinoid region, solely based on multidetector computed tomography (MDCT) images with multiplanar (MPR) and 3 dimensional (3D) reconstructions, in an Indian population. Materials and Methods Ninety-five CT scans of head and paranasal sinus patients were retrospectively evaluated with MPR and 3D reconstructions to assess optic strut thickness, angle and location, and variations such as pneumatisation, carotico-clinoid foramen and inter-clinoid osseous ridge. Results Mean optic strut thickness was 3.64 mm (±0.64), and the mean optic strut angle was 42.67 (±6.16) degrees. Mean width and length of the anterior clinoid process were 10.65 mm (±0.79) and 11.20 mm (±0.95), respectively. Optic strut attachment to the sphenoid body was predominantly sulcal, as in 52 cases (54.74%), and the strut was most frequently attached to the anterior 2/5th of the anterior clinoid process, seen on 93 sides (48.95%). Pneumatisation of the optic strut occurred on 23 sides. A carotico-clinoid foramen was observed in 42 cases (22.11%): a complete foramen in 10 cases (5.26%), an incomplete foramen in 24 cases (12.63%) and a contact type in 8 cases (4.21%). An inter-clinoid osseous bridge was seen unilaterally in 4 cases. Conclusion The study assesses morphometric features and anatomical variations of the paraclinoid region using MDCT 3D and multiplanar reconstructions in an Indian population. PMID:26557589

  26. Gd-EOB-DTPA-enhanced 3.0-Tesla MRI findings for the preoperative detection of focal liver lesions: Comparison with iodine-enhanced multi-detector computed tomography

    NASA Astrophysics Data System (ADS)

    Park, Hyong-Hu; Goo, Eun-Hoe; Im, In-Chul; Lee, Jae-Seung; Kim, Moon-Jib; Kwak, Byung-Joon; Chung, Woon-Kwan; Dong, Kyung-Rae

    2012-12-01

    The safety of gadolinium-ethoxybenzyl-diethylenetriamine-pentaacetic-acid (Gd-EOB-DTPA) has been confirmed, but more study is needed to assess the diagnostic accuracy of Gd-EOB-DTPA-enhanced magnetic resonance imaging (MRI) in patients with a hepatocellular carcinoma (HCC) for whom surgical treatment is considered or with a metastatic hepatoma. Research is also needed to examine the rate of detection of hepatic lesions compared with multi-detector computed tomography (MDCT), which is used most frequently to localize and characterize a HCC. Gd-EOB-DTPA-enhanced MRI and iodine-enhanced MDCT imaging were compared for the preoperative detection of focal liver lesions, and the clinical usefulness of each method was examined. The current study enrolled 79 patients with focal liver lesions who preoperatively underwent MRI and MDCT; in these patients, less than one month elapsed between the two diagnostic modalities. Imaging data were acquired before and after contrast enhancement with both methods. To evaluate the images, we analyzed the signal-to-noise ratio (SNR) and the contrast-to-noise ratio (CNR) in the lesions and the liver parenchyma. To compare the sensitivity of the two methods, we performed a quantitative analysis of the percentage signal intensity of the liver (PSIL) on a high-resolution picture archiving and communication system (PACS) monitor (paired-samples t-test, p < 0.05). The enhancement was evaluated based on a consensus of four observers. The enhancement pattern and the morphological features during the arterial and delayed phases were correlated between the Gd-EOB-DTPA-enhanced MRI findings and the iodine-enhanced MDCT findings by using an adjusted χ2 test. The SNRs, CNRs, and PSIL were all higher, and the detection rate greater, with Gd-EOB-DTPA-enhanced MRI than with iodine-enhanced MDCT. Hepatocyte-selective uptake was observed 20 minutes after injection in focal nodular hyperplasia (FNH, 9/9), adenoma (9/10), and highly differentiated HCC (grade G1, 27/30). Rim

  27. Dedicated multi-detector CT of the esophagus: spectrum of diseases.

    PubMed

    Ba-Ssalamah, Ahmed; Zacherl, Johannes; Noebauer-Huhmann, Iris Melanie; Uffmann, Martin; Matzek, Wolfgang Karl; Pinker, Katja; Herold, Christian; Schima, Wolfgang

    2009-01-01

    Multi-detector computed tomography (CT) offers new opportunities in imaging of the gastrointestinal tract. Its ability to cover a large volume in a very short scan time, in a single breath hold, with thin collimation and isotropic voxels allows imaging of the entire esophagus with high-quality multiplanar reformation and 3D reconstruction. Proper distention of the esophagus and stomach (by oral administration of effervescent granules and water) and optimally timed administration of intravenous contrast material are required to detect and characterize disease. In contrast to endoscopy and double-contrast studies of the upper GI tract, CT provides information about both the esophageal wall and the extramural extent of disease. Preoperative staging of esophageal carcinoma appears to be the main indication for MDCT. In addition, MDCT allows detection of other esophageal malignancies, such as lymphoma, and of benign esophageal tumors, such as leiomyoma. A diagnosis of rupture or fistula of the esophagus can be firmly established using MDCT. Furthermore, miscellaneous esophageal conditions, such as achalasia, esophagitis, diverticula, and varices, are incidental findings and can also be visualized with hydro-multi-detector CT. Multi-detector CT is a valuable tool for the evaluation of esophageal wall disease and serves as an adjunct to endoscopy.

  28. Relationship between noise, dose, and pitch in cardiac multi-detector row CT.

    PubMed

    Primak, Andrew N; McCollough, Cynthia H; Bruesewitz, Michael R; Zhang, Jie; Fletcher, Joel G

    2006-01-01

    In spiral computed tomography (CT), dose is always inversely proportional to pitch. However, the relationship between noise and pitch (and hence noise and dose) depends on the scanner type (single vs multi-detector row) and reconstruction mode (cardiac vs noncardiac). In single detector row spiral CT, noise is independent of pitch. Conversely, in noncardiac multi-detector row CT, noise depends on pitch because the spiral interpolation algorithm makes use of redundant data from different detector rows to decrease noise for pitch values less than 1 (and increase noise for pitch values > 1). However, in cardiac spiral CT, redundant data cannot be used because such data averaging would degrade the temporal resolution. Therefore, the behavior of noise versus pitch returns to the single detector row paradigm, with noise being independent of pitch. Consequently, since faster rotation times require lower pitch values in cardiac multi-detector row CT, dose is increased without a commensurate decrease in noise. Thus, the use of faster rotation times will improve temporal resolution, not alter noise, and increase dose. For a particular application, the higher dose resulting from faster rotation speeds should be justified by the clinical benefits of the improved temporal resolution.
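
    The relationships described above can be restated in a few lines; the constants below are arbitrary and the noise values are in relative units only:

        # Toy restatement of the dose/noise-vs-pitch behaviour described above.
        ctdi_w, mAs = 20.0, 400.0        # arbitrary reference values

        def dose_and_noise(pitch, cardiac):
            ctdi_vol = ctdi_w / pitch    # dose is always inversely proportional to pitch
            if cardiac:
                noise = mAs ** -0.5      # cardiac recon: no data sharing, noise flat in pitch
            else:
                noise = (pitch / mAs) ** 0.5   # non-cardiac: effective mAs = mAs / pitch
            return ctdi_vol, noise

        for p in (0.2, 0.3):             # faster rotation forces lower cardiac pitch:
            print(p, dose_and_noise(p, cardiac=True))   # dose climbs, noise does not improve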

  29. Three-dimensional imaging in the context of minimally invasive and transcatheter cardiovascular interventions using multi-detector computed tomography: from pre-operative planning to intra-operative guidance.

    PubMed

    Schoenhagen, Paul; Numburi, Uma; Halliburton, Sandra S; Aulbach, Peter; von Roden, Martin; Desai, Milind Y; Rodriguez, Leonardo L; Kapadia, Samir R; Tuzcu, E Murat; Lytle, Bruce W

    2010-11-01

    The rapid expansion of less invasive surgical and transcatheter cardiovascular procedures for a wide range of cardiovascular conditions, including coronary, valvular, structural cardiac, and aortic disease, has been paralleled by novel three-dimensional (3-D) approaches to imaging. Three-dimensional imaging allows acquisition of volumetric data sets and subsequent off-line reconstruction along unlimited 2-D planes and 3-D volumes. Pre-procedural 3-D imaging provides detailed understanding of the operative field for surgical/interventional planning. Integration of imaging modalities during the procedure allows real-time guidance. Because computed tomography routinely acquires 3-D data sets, it has been one of the early imaging modalities applied in the context of surgical and interventional planning. This review describes the continuum of applications from pre-operative planning to procedural integration, based on the emerging experience with computed tomography and rotational angiography, respectively. At the same time, the potential adverse effects of imaging with X-ray-based tomographic or angiographic modalities are discussed. It is emphasized that the role of imaging guidance in this context remains unclear and will need to be evaluated in clinical trials. This is particularly true because data showing improved outcomes, or even non-inferiority, for most of the emerging transcatheter procedures are still lacking.

  30. Noninvasive imaging of coronary arteries: current and future role of multi-detector row CT.

    PubMed

    Schoenhagen, Paul; Halliburton, Sandra S; Stillman, Arthur E; Kuzmiak, Stacie A; Nissen, Steven E; Tuzcu, E Murat; White, Richard D

    2004-07-01

    While invasive imaging techniques, especially selective conventional coronary angiography, will remain vital to planning and guiding catheter-based and surgical treatment of significantly stenotic coronary lesions, the comprehensive and serial assessment of asymptomatic or minimally symptomatic stages of coronary artery disease (CAD) for preventive purposes will eventually need to rely on noninvasive imaging techniques. Cardiovascular imaging with tomographic modalities, including computed tomography (CT) and magnetic resonance imaging, has great potential for providing valuable information. This review article will describe the current and future role of cardiac CT, and in particular that of multi-detector row CT, for imaging of atherosclerotic and other pathologic changes of the coronary arteries. It will describe how tomographic coronary imaging may eventually supplement traditional angiographic techniques in understanding the patterns of atherosclerotic CAD development.

  31. Toroid cavity/coil NMR multi-detector

    DOEpatents

    Gerald, II, Rex E.; Meadows, Alexander D.; Gregar, Joseph S.; Rathke, Jerome W.

    2007-09-18

    An analytical device for rapid, non-invasive nuclear magnetic resonance (NMR) spectroscopy of multiple samples using a single spectrometer is provided. A modified toroid cavity/coil detector (TCD) and methods for simultaneous acquisition of NMR data from multiple samples, including a protocol for testing NMR multi-detectors, are provided. One embodiment includes a plurality of LC resonant circuits with spatially separated toroid coil inductors, each toroid coil inductor enveloping its corresponding sample volume and tuned to resonate at a predefined frequency using a variable capacitor. The toroid coil is formed into a loop, with both ends of the coil brought into coincidence. Another embodiment includes multiple micro Helmholtz coils arranged on a circular perimeter concentric with a central conductor of the toroid cavity.
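
    Each coil/capacitor pair is an LC circuit whose resonance satisfies f = 1/(2π√(LC)). A sketch of the tuning arithmetic, with an invented coil inductance and field strength:

        import math

        def tuning_capacitance(L_henry: float, f_hz: float) -> float:
            """Capacitance at which an LC circuit resonates: f = 1 / (2*pi*sqrt(L*C))."""
            return 1.0 / (L_henry * (2 * math.pi * f_hz) ** 2)

        L = 150e-9   # invented toroid coil inductance: 150 nH
        f = 300e6    # 1H Larmor frequency near 7 T is about 300 MHz
        print(f"tune with C = {tuning_capacitance(L, f) * 1e12:.1f} pF")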

  32. Enhanced security for multi-detector quantum random number generators

    NASA Astrophysics Data System (ADS)

    Marangon, Davide G.; Vallone, Giuseppe; Zanforlin, Ugo; Villoresi, Paolo

    2016-11-01

    Quantum random number generators (QRNGs) represent an advanced solution for randomness generation, which is essential in every cryptographic application. In this context, integrated arrays of single-photon detectors have promising applications as QRNGs based on the spatial detection of photons. For the employment of QRNGs in cryptography, it is necessary to have efficient methods to evaluate the so-called quantum min-entropy, which corresponds to the amount of true quantum randomness extractable from the QRNG. Here, we present an efficient method that allows the estimation of the quantum min-entropy for a multi-detector QRNG. In particular, we consider a scenario in which an attacker can control the efficiency of the detectors and knows the emitted number of photons. Finally, we apply the method to a QRNG with 10^3 detectors.
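
    Min-entropy is defined as H_min = -log2(max_i p_i) over the outcome distribution; in the attack scenario above, the p_i would first be re-derived for the worst case the adversary can induce. A minimal sketch with an invented click distribution:

        import math

        # Invented outcome probabilities for an N-detector QRNG under a given attack
        p = [0.30, 0.25, 0.20, 0.15, 0.10]
        assert abs(sum(p) - 1.0) < 1e-9

        h_min = -math.log2(max(p))   # extractable bits per detection event
        print(f"H_min = {h_min:.3f} bits")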

  33. Recent technologic advances in multi-detector row cardiac CT.

    PubMed

    Halliburton, Sandra Simon

    2009-11-01

    Recent technical advances in multi-detector row CT have resulted in lower radiation dose, improved temporal and spatial resolution, decreased scan time, and improved tissue differentiation. Lower radiation doses have resulted from the use of pre-patient z collimators, the availability of thin-slice axial data acquisition, the increased efficiency of ECG-based tube current modulation, and the implementation of iterative reconstruction algorithms. Faster gantry rotation and the simultaneous use of two x-ray sources have led to improvements in temporal resolution, and gains in spatial resolution have been achieved through application of the flying x-ray focal-spot technique in the z-direction. Shorter scan times have resulted from the design of detector arrays with increasing numbers of detector rows and through the simultaneous use of two x-ray sources to allow higher helical pitch. Some improvement in tissue differentiation has been achieved with dual energy CT. This article discusses these recent technical advances in detail.

  15. Contrast enhanced multi-detector CT and MR findings of a well-differentiated pancreatic vipoma.

    PubMed

    Camera, Luigi; Severino, Rosa; Faggiano, Antongiulio; Masone, Stefania; Mansueto, Gelsomina; Maurea, Simone; Fonti, Rosa; Salvatore, Marco

    2014-10-28

    Pancreatic vipoma is an extremely rare tumor, accounting for less than 2% of endocrine pancreatic neoplasms with a reported incidence of 0.1-0.6 per million. Exact localization of the tumor by computed tomography (CT) or magnetic resonance (MR) imaging is pivotal for surgical planning; however, cross-sectional imaging findings are usually not specific, and further characterization of the tumor may only be achieved by somatostatin-receptor scintigraphy (SRS). We report the case of a 70-year-old female with a two-year history of watery diarrhoea who was found to have a solid, inhomogeneously enhancing lesion at the level of the pancreatic tail on gadolinium-enhanced MR imaging (Magnetom Trio 3T, Siemens, Germany). The tumor had been prospectively overlooked on a contrast-enhanced multi-detector CT (Aquilion 64, Toshiba, Japan) performed after i.v. bolus injection of only 100 mL of iodinated non-ionic contrast media because of chronic renal failure (creatinine 3.4 mg/dL), but it was subsequently confirmed by SRS. The patient first underwent successful symptomatic treatment with somatostatin analogues and was then submitted to distal pancreatectomy with splenectomy to remove an encapsulated whitish tumor, which proved to be a well-differentiated vipoma on histological and immunohistochemical analysis.

  16. Follow-up of multicentric HCC according to the mRECIST criteria: role of 320-Row CT with semi-automatic 3D analysis software for evaluating the response to systemic therapy

    PubMed Central

    TELEGRAFO, M.; DILORENZO, G.; DI GIOVANNI, G.; CORNACCHIA, I.; STABILE IANORA, A.A.; ANGELELLI, G.; MOSCHETTA, M.

    2016-01-01

    Aim To evaluate the role of 320-detector row computed tomography (MDCT) with 3D analysis software in the follow-up of patients affected by multicentric hepatocellular carcinoma (HCC) treated with systemic therapy, using the modified Response Evaluation Criteria in Solid Tumors (mRECIST). Patients and methods 38 patients affected by multicentric HCC underwent MDCT. All examinations were performed before and after intravenous injection of iodinated contrast material using a 320-detector row CT device. CT images were analyzed by two radiologists using multi-planar reconstructions (MPR) in order to assess the response to systemic therapy according to mRECIST criteria: complete response (CR), partial response (PR), progressive disease (PD), or stable disease (SD). Thirty days later, the same two radiologists evaluated target lesion response to systemic therapy according to mRECIST criteria using 3D analysis software. The difference between the two systems in assessing HCC response to therapy was tested by analysis of variance (ANOVA); interobserver agreement between the two radiologists, for both MPR images and 3D analysis software, was calculated using Cohen's kappa test. Results PR occurred in 10/38 cases (26%), PD in 6/38 (16%), and SD in 22/38 (58%). The ANOVA showed no statistically significant difference between the two systems for assessing target lesion response to therapy (p > 0.05). Interobserver agreement (k) was 0.62 for MPR image measurements and 0.86 for 3D analysis measurements. Conclusions 3D analysis software provides a semi-automatic system for assessing target lesion response to therapy according to mRECIST criteria in patients affected by multifocal HCC treated with systemic therapy. Its reliability makes it useful in clinical practice. PMID:28098056
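
    Interobserver agreement of categorical mRECIST ratings is conventionally quantified with Cohen's kappa, as in this study. A minimal sketch using scikit-learn follows; the two readers' category lists are hypothetical.

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical mRECIST categories assigned by two readers to the
        # same ten target lesions.
        reader1 = ["PR", "SD", "SD", "PD", "PR", "SD", "PD", "SD", "PR", "SD"]
        reader2 = ["PR", "SD", "PR", "PD", "PR", "SD", "PD", "SD", "SD", "SD"]

        print(f"kappa = {cohen_kappa_score(reader1, reader2):.2f}")  # ~0.68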

  17. Coronary artery calcium measurement with multi-detector row CT: in vitro assessment of effect of radiation dose.

    PubMed

    Hong, Cheng; Bae, Kyongtae T; Pilgram, Thomas K; Suh, Jongdae; Bradley, David

    2002-12-01

    The authors assessed in vitro the effect of radiation dose on coronary artery calcium quantification with multi-detector row computed tomography. A cardiac phantom with calcified cylinders was scanned at various tube current-time settings (20-160 mAs). A clear tendency was found for image noise to decrease as tube current increased (P < .001). No tendency was found for the Agatston score or for calcium volume and mass errors to vary with tube current, and calcium measurements were not significantly affected by the choice of tube current. Calcium mass error was strongly correlated with calcium volume error (P < .001). The calcium mass measurement was more accurate and less variable than the calcium volume measurement.
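
    For reference, the Agatston method thresholds calcifications at 130 HU and weights each lesion's per-slice area by its peak attenuation (1 for 130-199 HU, 2 for 200-299 HU, 3 for 300-399 HU, 4 for 400 HU and above). The sketch below scores a single axial slice under those standard rules; the 1 mm2 minimum-lesion convention and the traditional 3 mm slice protocol are assumptions that vary between implementations.

        import numpy as np
        from scipy import ndimage

        def agatston_slice_score(hu_slice, pixel_area_mm2):
            """Agatston contribution of one axial slice: connected components
            with attenuation >= 130 HU, their areas weighted by peak HU."""
            labels, n = ndimage.label(hu_slice >= 130)
            score = 0.0
            for i in range(1, n + 1):
                lesion = labels == i
                area_mm2 = lesion.sum() * pixel_area_mm2
                if area_mm2 < 1.0:      # ignore sub-millimeter specks (noise)
                    continue
                peak_hu = hu_slice[lesion].max()
                weight = min(int(peak_hu // 100), 4)  # 130->1, 250->2, >=400->4
                score += area_mm2 * weight
            return score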

  18. Novel ultrahigh resolution data acquisition and image reconstruction for multi-detector row CT

    SciTech Connect

    Flohr, T. G.; Stierstorfer, K.; Suess, C.; Schmidt, B.; Primak, A. N.; McCollough, C. H.

    2007-05-15

    We present and evaluate a special ultrahigh resolution mode providing considerably enhanced spatial resolution both in the scan plane and in the z-axis direction for a routine medical multi-detector row computed tomography (CT) system. Data acquisition is performed by using a flying focal spot both in the scan plane and in the z-axis direction in combination with tantalum grids that are inserted in front of the multi-row detector to reduce the aperture of the detector elements both in-plane and in the z-axis direction. The dose utilization of the system for standard applications is not affected, since the grids are moved into place only when needed and are removed for standard scanning. By means of this technique, image slices with a nominal section width of 0.4 mm (measured full width at half maximum=0.45 mm) can be reconstructed in spiral mode on a CT system with a detector configuration of 32x0.6 mm. The measured 2% value of the in-plane modulation transfer function (MTF) is 20.4 lp/cm, the measured 2% value of the longitudinal (z axis) MTF is 21.5 lp/cm. In a resolution phantom with metal line pair test patterns, spatial resolution of 20 lp/cm can be demonstrated both in the scan plane and along the z axis. This corresponds to an object size of 0.25 mm that can be resolved. The new mode is intended for ultrahigh resolution bone imaging, in particular for wrists, joints, and inner ear studies, where a higher level of image noise due to the reduced aperture is an acceptable trade-off for the clinical benefit brought about by the improved spatial resolution.

  19. Multi-detector row CT scanning in Paleoanthropology at various tube current settings and scanning mode.

    PubMed

    Badawi-Fayad, J; Yazbeck, C; Balzeau, A; Nguyen, T H; Istoc, A; Grimaud-Hervé, D; Cabanis, E- A

    2005-12-01

    The purpose of this study was to determine the optimal tube current setting and scanning mode for hominid fossil skull scanning using multi-detector row computed tomography (CT). Four fossil skulls (La Ferrassie 1, Abri Pataud 1, Cro-Magnon 2 and Cro-Magnon 3) were examined using a LightSpeed 16 CT scanner (General Electric Medical Systems) with varying exposure per section (160, 250, and 300 mAs) and scanning mode (helical and conventional). Image quality of two-dimensional (2D) multiplanar reconstructions, three-dimensional (3D) reconstructions and native images was assessed by four reviewers using a four-point grading scale. An ANOVA (analysis of variance) model was used to compare the mean score for each sequence and the overall mean score according to the levels of the scanning parameters. Compared with helical CT (mean score = 12.03), the conventional technique showed consistently poor image quality (mean score = 4.17). With the helical mode, we observed better image quality at 300 mAs than at 160 mAs in the 3D sequences (P = 0.03), whereas in native images a reduction in the effective tube current induced no degradation in image quality (P = 0.05). Our study suggests a standardized protocol for fossil scanning with a 16 x 0.625 detector configuration, a 10 mm beam collimation, a 0.562:1 acquisition mode, a 0.625/0.4 mm slice thickness/reconstruction interval, a table feed of 5.62 mm per rotation, 120 kV and 300 mAs, especially when a 3D study is required.

  20. Validity of blood flow measurement using 320 multi-detectors CT and first-pass distribution theory: a phantom study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Yu, Xuefang; Xu, Shaopeng; Zhou, Kenneth J.

    2015-03-01

    The aim of this study was to evaluate the feasibility of measuring myocardial blood flow with a 320-detector row CT using the first-pass technique. The heart was simulated with a container filled with pipeline of 3 mm diameter; the coronary artery was simulated with a pipeline of 2 cm diameter connected to the simulated heart. The simulated coronary artery was fed from a large reservoir containing 1500 ml of saline and 150 ml of contrast agent. A pump connected to the simulated heart withdrew fluid at 10 ml/min, 15 ml/min, 20 ml/min, 25 ml/min and 30 ml/min. The first CT scan started 30 s after pump-back began at each speed; the second CT scan started 5 s after the first. CT images were processed as follows: the first scan images were subtracted from the second, the increase in CT value of the simulated heart and the CT value per unit volume of the simulated coronary artery were calculated, and the total myocardial inflow was then derived. The CT-derived flows were 0.94 ml/s, 2.09 ml/s, 2.74 ml/s, 4.18 ml/s, and 4.86 ml/s. The correlation coefficient was 0.994 (r2 = 0.97). Measuring myocardial blood flow with two scans of a 320-detector row CT is feasible, and may lead to a new method for quantitative, functional assessment of myocardial perfusion with a lower radiation dose.
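
    In first-pass terms, the computation described above reduces to: enhancement accumulated in the target volume between the two scans (summed HU increase times voxel volume), converted to an equivalent volume of arterial fluid by dividing by the arterial enhancement, then divided by the inter-scan interval. A sketch of that arithmetic with hypothetical arrays:

        import numpy as np

        def first_pass_flow(scan1_hu, scan2_hu, voxel_volume_ml,
                            artery_enhancement_hu, dt_s):
            """Inflow (ml/s): enhancement accumulated between two scans,
            expressed as a volume of arterial fluid, per unit time."""
            delta = np.asarray(scan2_hu, float) - np.asarray(scan1_hu, float)
            inflow_ml = delta.sum() * voxel_volume_ml / artery_enhancement_hu
            return inflow_ml / dt_s

        # Hypothetical: 0.001 ml voxels, 400 HU arterial enhancement, 5 s gap.
        rng = np.random.default_rng(0)
        scan1 = rng.normal(40, 5, (64, 64, 64))
        scan2 = scan1 + rng.normal(25, 5, scan1.shape)
        print(f"{first_pass_flow(scan1, scan2, 0.001, 400.0, 5.0):.2f} ml/s")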

  1. Evaluation of accuracy of 3D reconstruction images using multi-detector CT and cone-beam CT

    PubMed Central

    Kim, Mija; YI, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul

    2012-01-01

    Purpose This study was performed to determine the accuracy of linear measurements on three-dimensional (3D) images obtained using multi-detector computed tomography (MDCT) and cone-beam computed tomography (CBCT). Materials and Methods MDCT and CBCT were performed on 24 dry skulls. Twenty-one measurements were taken on the dry skulls using a digital caliper. Both types of CT data were imported into OnDemand software, where landmarks were identified on the 3D surface-rendering images and linear measurements were calculated. Reproducibility of the measurements was assessed using repeated-measures ANOVA and the intraclass correlation coefficient (ICC), and the measurements were statistically compared using Student's t-test. Results All direct measurements and image-based measurements on the 3D CT surface-rendering images from MDCT and CBCT proved reproducible on ICC examination, and no differences were found between the direct measurements of the dry skulls and the image-based measurements on the 3D CT surface-rendering images (P > .05). Conclusion Three-dimensional reconstructed surface-rendering images using MDCT and CBCT are appropriate for 3D measurements. PMID:22474645

  2. Multi-detector CT imaging in the postoperative orthopedic patient with metal hardware.

    PubMed

    Vande Berg, Bruno; Malghem, Jacques; Maldague, Baudouin; Lecouvet, Frederic

    2006-12-01

    Multi-detector CT (MDCT) has become a routine imaging modality in the assessment of postoperative orthopedic patients with metallic instrumentation, which degrades image quality at MR imaging. This article reviews the physical basis and CT appearance of such metal-related artifacts. It also addresses the clinical value of MDCT in postoperative orthopedic patients, with emphasis on fracture healing, spinal fusion or arthrodesis, and joint replacement. MDCT shows limitations in the assessment of the bone marrow cavity and of the soft tissues, for which MR imaging remains the modality of choice despite metal-related anatomic distortion and signal alteration.

  3. Control electronics for a multi-laser/multi-detector scanning system

    NASA Technical Reports Server (NTRS)

    Kennedy, W.

    1980-01-01

    The Mars Rover Laser Scanning system uses a precision laser pointing mechanism, a photodetector array, and the concept of triangulation to perform three-dimensional scene analysis. The system is used for real-time terrain sensing and vision. The Multi-Laser/Multi-Detector laser scanning system is controlled by a digital device called the ML/MD controller. A next-generation laser scanning system, based on the Level 2 controller, is microprocessor based; the new controller's capabilities far exceed those of the ML/MD device. First-draft circuit details and the general software structure are presented.

  4. Collimated prompt gamma TOF measurements with multi-slit multi-detector configurations

    NASA Astrophysics Data System (ADS)

    Krimmer, J.; Chevallier, M.; Constanzo, J.; Dauvergne, D.; De Rydt, M.; Dedes, G.; Freud, N.; Henriquet, P.; La Tessa, C.; Létang, J. M.; Pleskač, R.; Pinto, M.; Ray, C.; Reithinger, V.; Richard, M. H.; Rinaldi, I.; Roellinghoff, F.; Schuy, C.; Testa, E.; Testa, M.

    2015-01-01

    Longitudinal prompt-gamma ray profiles have been measured with a multi-slit multi-detector configuration using a 75 MeV/u 13C beam and a PMMA target. Selections in time-of-flight and energy were applied in order to discriminate prompt-gamma rays produced in the target from background events. The ion ranges extracted from the individual detector modules agree with one another and are consistent with theoretical expectations. In a separate dedicated experiment with 200 MeV/u 12C ions, the fraction of inter-detector scattering was determined to be at the 10% level via a combination of experimental results and simulations. In the same experiment, different collimator configurations were tested and the shielding properties of tungsten and lead for prompt-gamma rays were measured.

  5. Assessment of coronary bypass graft patency by first-line multi-detector computed tomography.

    PubMed

    Pesenti-Rossi, D; Baron, N; Georges, J-L; Augusto, S; Gibault-Genty, G; Livarek, B

    2014-11-01

    The purpose of the study was to assess whether a strategy based on multi-detector computed tomography (MDCT) performed routinely before conventional coronary angiography (CA) can reduce the radiation dose during CA, without increasing the global exposure of patients who need imaging of coronary artery bypass grafts (CABG). A total of 147 consecutive patients were included. The radiation dose during CA (kerma-area product 12.1 vs 22.0 Gy.cm2, P < .01) and the volume of iodinated contrast (155 vs 200 mL, P < .02) were reduced when CA was preceded by MDCT. Patients' cumulative exposures were not different between the 2 strategies (5.0 vs 5.1 mSv, P = .76). MDCT performed first line is a valuable strategy for the assessment of CABG.

  6. Evaluation of different small bowel contrast agents by multi-detector row CT

    PubMed Central

    Wang, Yong-Ren; Yu, Xiao-Li; Peng, Zhi-Yi

    2015-01-01

    Objective: This study aimed to evaluate the effects of different oral small bowel contrast agents on intestinal dilatation and depiction of the intestinal wall structure at abdominal multi-detector row CT (MDCT). Methods: 80 patients underwent whole-abdominal CT examination and were randomly divided into four groups of 20 patients each. Forty-five minutes before the CT examination, the patients drank a total of 1800 ml of pure water, pure milk, diluted lactulose solution, or isotonic mannitol solution, respectively. Results: The images were read blinded by two experienced abdominal radiologists at the workstation; the cross-sectional diameters of the duodenum, jejunum, and proximal and terminal ileum of each patient were measured, and analysis of variance was performed to analyze differences in intestinal dilatation among the groups. A scoring method was used to rate intestinal dilatation and depiction of intestinal wall structure. The diluted lactulose solution and 2.5% mannitol produced the best intestinal dilatation, and likewise the highest scores for whole small bowel dilatation and depiction of intestinal structure. Conclusions: 2.5% isotonic mannitol and diluted lactulose solution enabled full dilatation of the small bowel and clearly depicted the wall structure. PMID:26629131

  7. Bronchial anatomy of left lung: a study of multi-detector row CT.

    PubMed

    Zhao, Xinya; Ju, Yuanrong; Liu, Cheng; Li, Jianfeng; Huang, Min; Sun, Jian; Wang, Tao

    2009-02-01

    Familiarity with the prevailing pattern and variations of the bronchial tree is essential not only for the anatomist explaining bronchial variation in specimens, but also for guiding bronchoscopy and planning pulmonary segmental resection. This study was designed to demonstrate the various branching patterns of the left lung with 3D images, with special attention to identifying the major types on transverse thin-section CT. Two hundred and sixteen patients with routine thorax scans were enrolled. Images of the bronchial tree and virtual bronchoscopy were reconstructed using the post-processing techniques of multi-detector row CT. We classified the segmental bronchi by interpreting the post-processing images and identified them on transverse thin-section CT. The segmental bronchial ramifications of the left superior lobe fell mainly into three types: a common stem of the apical and posterior segmental bronchi (64%, 138/216); trifurcation (23%, 50/216); and a common stem of the apical and anterior segmental bronchi (10%, 22/216); these could be identified at two typical sections of transverse thin-section CT. There were two major types of left basal segmental bronchi, bifurcation (75%, 163/216) and trifurcation (18%, 39/216), which could also be identified at two typical sections of transverse thin-section CT. In conclusion, our study offers simplified branching patterns of the bronchi, demonstrates various unusual branching patterns with 3D images, and shows how to identify the main branching patterns on transverse thin-section CT.

  8. [Comparison of a dental cone beam CT with a multi-detector row CT on effective doses and physical image quality].

    PubMed

    Yoshida, Yutaka; Tokumori, Kenji; Okamura, Kazutoshi; Yoshiura, Kazunori

    2011-01-01

    The purpose of this study was to compare a dental cone beam computed tomography (dental CBCT) system with a multi-detector row CT (MDCT) system in terms of effective dose and physical image quality. A dental mode (D-mode) and an implant mode (I-mode) were used for calculating effective doses. The field of view (FOV) of the MDCT was 150 mm. Three types of MDCT images were obtained using 3 different reconstruction functions: FC1 (for abdomen images), FC30 (for internal ear and bone images) and FC81 (for high-resolution images). Effective doses obtained with the D-mode and the I-mode were about 20% and 50%, respectively, of those obtained with the MDCT. The resolution properties of the D-mode and I-mode were superior to those of the MDCT in the high-frequency range, and the noise properties of the D-mode and the I-mode were better than those obtained with FC81. Dental CBCT thus showed greater potential than MDCT in both the dental and implant modes.

  9. Multi-detector row CT as a "one-stop" examination in the preoperative evaluation of the morphology and function of living renal donors: preliminary study.

    PubMed

    Su, Chen; Yan, Chaogui; Guo, Yan; Zhou, Xuhui; Chen, Yaqing; Liu, Mingjuan; Wang, Wenjuan; Zhang, Xiaoling

    2011-02-01

    We investigated the feasibility of multi-detector row computed tomography (CT) as a "one-stop" examination for the simultaneous preoperative evaluation of the morphology and function of living renal donors. 21 living renal donors were examined by 64-slice spiral CT with a three-phase enhancement CT scan and two inserted dynamic scans. Maximum intensity projection (MIP), multi-planar reformation (MPR), and volume rendering (VR) were performed, and the renal parenchyma, renal vessels, and collecting system were compared with operative findings. The Patlak equation was used to calculate the glomerular filtration rate (GFR); reference GFR values were acquired by single photon emission computed tomography (SPECT). There were 3 cases of arterial variation and 3 cases of venous variation. CT findings all corresponded with the operative findings, and the sensitivity, positive predictive value, specificity, and negative predictive value of CT were all 100%. The correlation coefficient (r) between CT-estimated and SPECT GFR values was 0.894 for the left kidney (P < 0.001) and 0.881 for the right (P < 0.001). In conclusion, our findings demonstrate that 64-slice spiral CT may offer a "one-stop" examination to replace SPECT in the preoperative evaluation of living renal donors, simultaneously providing information on both anatomy and GFR.
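
    The Patlak analysis mentioned here linearizes an irreversible-uptake model: plotting Ct(t)/Ca(t) against the cumulative integral of Ca divided by Ca(t) gives a line whose slope K reflects clearance (GFR for renal enhancement curves). A sketch with synthetic curves follows; the curve shapes and constants are invented for illustration.

        import numpy as np

        def patlak_fit(t, c_tissue, c_artery):
            """Patlak plot fit: y = Ct/Ca versus x = integral(Ca)/Ca.
            Returns (K, V0); K is the clearance-related slope."""
            dt = t[1] - t[0]
            cum = np.cumsum(c_artery) * dt       # rectangle-rule integral of Ca
            K, V0 = np.polyfit(cum / c_artery, c_tissue / c_artery, 1)
            return K, V0

        # Synthetic 1 s curves: decaying arterial input; tissue = irreversible
        # uptake (K_true = 0.02 1/s) plus a blood-pool fraction (V0 = 0.1).
        t = np.arange(0.0, 60.0, 1.0)
        ca = 300.0 * np.exp(-t / 40.0)
        ct = 0.02 * np.cumsum(ca) + 0.1 * ca
        K, V0 = patlak_fit(t, ct, ca)
        print(f"K = {K:.4f} 1/s, V0 = {V0:.3f}")  # ~0.0200, ~0.100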

  10. Radiation doses for pregnant women in the late pregnancy undergoing fetal-computed tomography: a comparison of dosimetry and Monte Carlo simulations.

    PubMed

    Matsunaga, Yuta; Kawaguchi, Ai; Kobayashi, Masanao; Suzuki, Shigetaka; Suzuki, Shoichi; Chida, Koichi

    2016-09-19

    The purposes of this study were (1) to compare the radiation doses for 320- and 80-row fetal computed tomography (CT), estimated using thermoluminescent dosimeters (TLDs) and the ImPACT Calculator (hereinafter referred to as the "CT dosimetry software"), for a woman in late pregnancy and her fetus, and (2) to estimate the overlapped fetal radiation dose from a 320-row CT examination using two different estimation methods of the CT dosimetry software. The direct TLD data in the present study were obtained from a previous study. The exposure parameters used for the TLD measurements were entered into the CT dosimetry software, and the corresponding radiation doses for the pregnant woman and her fetus were estimated. When the whole organs (e.g., the colon, small intestine, and ovaries) and the fetus were included in the scan range, the difference in estimated doses between the TLD measurements and the CT dosimetry software was <1 mGy (<23 %) for both CT units; under this condition, the CT dosimetry software is suitable for evaluating the fetal radiation dose and organ-specific doses for a woman in late pregnancy. The conventional method using the CT dosimetry software cannot take into account the overlap between volumetric sections; therefore, the conventional method applied to a 320-row CT unit in wide-volume mode might underestimate the radiation doses for the fetus and for the colon, small intestine, and ovaries.

  11. Assessment of trabecular bone structure of the calcaneus using multi-detector CT: correlation with microCT and biomechanical testing.

    PubMed

    Diederichs, Gerd; Link, Thomas M; Kentenich, Marie; Schwieger, Karsten; Huber, Markus B; Burghardt, Andrew J; Majumdar, Sharmila; Rogalla, Patrik; Issever, Ahi S

    2009-05-01

    The prediction of bone strength can be improved when bone mineral density (BMD) is determined in combination with measures of trabecular microarchitecture. The goal of this study was to assess parameters of trabecular bone structure and texture of the calcaneus by clinical multi-detector row computed tomography (MDCT) in an experimental in situ setup and to correlate these parameters with micro-computed tomography (microCT) and biomechanical testing. Thirty calcanei in 15 intact cadavers were scanned using three different protocols on a 64-slice MDCT scanner with an in-plane pixel size of 208 microns and a 500 micron slice thickness. Bone cores were harvested from each specimen and microCT images with a voxel size of 16 microns were obtained. After image coregistration, trabecular bone structure and texture were evaluated in identical regions on the MDCT images. After data acquisition, uniaxial compression testing was performed. Significant correlations between MDCT- and microCT-derived measures of bone volume fraction (BV/TV), trabecular thickness (Tb.Th) and trabecular separation (Tb.Sp) were found (R2 = 0.19-0.65, p < 0.01 or 0.05). The MDCT-derived parameters of volumetric BMD, apparent BV/TV, apparent Tb.Th and apparent Tb.Sp predicted 60%, 63%, 53% and 25% of the variation in bone strength, respectively (p < 0.01). When these measures were combined with one additional texture index (GLCM, TOGLCM or MF.euler), prediction of mechanical competence improved significantly to 86%, 85%, 71% and 63% (p < 0.01). In conclusion, this study showed the feasibility of assessing trabecular microarchitecture with MDCT in an experimental setup simulating the clinical situation. Multivariate models of BMD or structural parameters combined with texture indices significantly improved the prediction of bone strength and might provide more reliable estimates of fracture risk in patients.

  12. Assessment of organ absorbed doses and estimation of effective doses from pediatric anthropomorphic phantom measurements for multi-detector row CT with and without automatic exposure control.

    PubMed

    Brisse, Hervé J; Robilliard, Magalie; Savignoni, Alexia; Pierrat, Noelle; Gaboriaud, Geneviève; De Rycke, Yann; Neuenschwander, Sylvia; Aubert, Bernard; Rosenwald, Jean-Claude

    2009-10-01

    This study was designed to measure organ absorbed doses from multi-detector row computed tomography (MDCT) in pediatric anthropomorphic phantoms, to calculate the corresponding effective doses, and to assess the influence of automatic exposure control (AEC) in terms of organ dose variations. Four anthropomorphic phantoms (equivalent to a newborn and to 1-, 5-, and 10-year-old children) were scanned with a four-channel MDCT equipped with a z-axis-based AEC system. Two CT torso protocols were compared: the first without AEC, at constant tube current-time product, and the second with AEC using age-adjusted noise indices. Organ absorbed doses were monitored with thermoluminescent dosimeters (LiF:Mg,Cu,P). Effective doses were calculated according to the tissue weighting factors of the International Commission on Radiological Protection. For fixed-mA acquisitions, organ doses normalized to the volume CT dose index in a 16-cm head phantom (CTDIvol16) ranged from 0.6 to 1.5, and effective doses ranged from 8.4 to 13.5 mSv. For the newborn-equivalent phantom, the AEC-modulated scan showed almost no significant dose variation compared with the fixed-mA scan. For the 1-, 5- and 10-y equivalent phantoms, the use of AEC induced a significant dose decrease for chest organs (ranging from 61 to 31% for the thyroid, 37 to 21% for the lung, 34 to 17% for the esophagus, and 39 to 10% for the breast). However, AEC also induced a significant dose increase (ranging from 28 to 48% for the salivary glands, 22 to 51% for the bladder, and 24 to 70% for the ovaries) related to the high density of the skull base and pelvic bones. These dose increases should be considered before using AEC as a dose optimization tool in children.

  13. Multi-detector CT assessment in pulmonary hypertension: techniques, systematic approach to interpretation and key findings.

    PubMed

    Lewis, Gareth; Hoey, Edward T D; Reynolds, John H; Ganeshan, Arul; Ment, Jerome

    2015-06-01

    Pulmonary arterial hypertension (PAH) may be suspected on the basis of the clinical history, physical examination and electrocardiogram findings, but imaging is usually central to confirming the diagnosis, establishing a cause and guiding therapy. The diagnostic pathway of PAH involves a variety of complementary investigations, among which computed tomography pulmonary angiography (CTPA) has an established central role in both identifying an underlying cause of PAH and assessing the resulting functional compromise. In particular, CTPA is considered the gold standard technique for the diagnosis of thromboembolic disease. This article reviews CTPA evaluation in PAH, describing CTPA techniques, a systematic approach to interpretation, and the spectrum of key imaging findings.

  14. A retrospective comparison of smart prep and test bolus multi-detector CT pulmonary angiography protocols

    SciTech Connect

    Suckling, Tara; Smith, Tony; Reed, Warren

    2013-06-15

    Optimal arterial opacification is crucial in imaging the pulmonary arteries using computed tomography (CT). This poses the challenge of precisely timing data acquisition to coincide with the transit of the contrast bolus through the pulmonary vasculature. The aim of this quality assurance exercise was to investigate if a change in CT pulmonary angiography (CTPA) scanning protocol resulted in improved opacification of the pulmonary arteries. Comparison was made between the smart prep protocol (SPP) and the test bolus protocol (TBP) for opacification in the pulmonary trunk. A total of 160 CTPA examinations (80 using each protocol) performed between January 2010 and February 2011 were assessed retrospectively. CT attenuation coefficients were measured in Hounsfield Units (HU) using regions of interest at the level of the pulmonary trunk. The average pixel value, standard deviation (SD), maximum, and minimum were recorded. For each of these variables a mean value was then calculated and compared for these two CTPA protocols. Minimum opacification of 200 HU was achieved in 98% of the TBP sample but only 90% of the SPP sample. The average CT attenuation over the pulmonary trunk for the SPP was 329 (SD = ±21) HU, whereas for the TBP it was 396 (SD = ±22) HU (P = 0.0017). The TBP also recorded higher maximum (P = 0.0024) and minimum (P = 0.0039) levels of opacification. This study has found that a TBP resulted in significantly better opacification of the pulmonary trunk than the SPP.
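
    An audit of this kind comes down to comparing ROI statistics between two groups of examinations. The sketch below mimics it with synthetic per-exam mean attenuations drawn to match the reported group means and spreads, then applies an unpaired t-test; this is an illustration, not the study data.

        import numpy as np
        from scipy import stats

        # Synthetic per-exam mean HU in a pulmonary-trunk ROI (80 exams each).
        rng = np.random.default_rng(0)
        spp = rng.normal(329, 21, 80)   # smart prep protocol
        tbp = rng.normal(396, 22, 80)   # test bolus protocol

        t_stat, p = stats.ttest_ind(tbp, spp)
        print(f"TBP {tbp.mean():.0f} HU vs SPP {spp.mean():.0f} HU, p = {p:.2g}")
        print(f"TBP >= 200 HU in {(tbp >= 200).mean():.0%} of exams")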

  15. Improving Image Quality of On-Board Cone-Beam CT in Radiation Therapy Using Image Information Provided by Planning Multi-Detector CT: A Phantom Study

    PubMed Central

    Yang, Ching-Ching; Chen, Fong-Lin; Lo, Yeh-Chi

    2016-01-01

    Purpose The aim of this study was to improve the image quality of cone-beam computed tomography (CBCT) mounted on the gantry of a linear accelerator used in radiation therapy based on the image information provided by planning multi-detector CT (MDCT). Methods MDCT-based shading correction for CBCT and virtual monochromatic CT (VMCT) synthesized using the dual-energy method were performed. In VMCT, the high-energy data were obtained from CBCT, while the low-energy data were obtained from MDCT. An electron density phantom was used to investigate the efficacy of shading correction and VMCT on improving the target detectability, Hounsfield unit (HU) accuracy and variation, which were quantified by calculating the contrast-to-noise ratio (CNR), the percent difference (%Diff) and the standard deviation of the CT numbers for tissue equivalent background material, respectively. Treatment plan studies for a chest phantom were conducted to investigate the effects of image quality improvement on dose planning. Results For the electron density phantom, the mean value of CNR was 17.84, 26.78 and 34.31 in CBCT, shading-corrected CBCT and VMCT, respectively. The mean value of %Diff was 152.67%, 11.93% and 7.66% in CBCT, shading-corrected CBCT and VMCT, respectively. The standard deviation within a uniform background of CBCT, shading-corrected CBCT and VMCT was 85, 23 and 15 HU, respectively. With regards to the chest phantom, the monitor unit (MU) difference between the treatment plan calculated using MDCT and those based on CBCT, shading corrected CBCT and VMCT was 6.32%, 1.05% and 0.94%, respectively. Conclusions Enhancement of image quality in on-board CBCT can contribute to daily patient setup and adaptive dose delivery, thus enabling higher confidence in patient treatment accuracy in radiation therapy. Based on our results, VMCT has the highest image quality, followed by the shading corrected CBCT and the original CBCT. The research results presented in this study should be
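
    The image-quality metrics used here follow directly from ROI statistics: CNR is the absolute difference of the target and background means divided by the background SD, and %Diff is the relative HU error against the reference value. A minimal sketch with hypothetical ROI samples:

        import numpy as np

        def cnr(target_roi, background_roi):
            """Contrast-to-noise ratio of a target ROI against background."""
            return abs(np.mean(target_roi) - np.mean(background_roi)) / np.std(background_roi)

        def percent_diff(measured_hu, reference_hu):
            """Percent HU error of a CBCT/VMCT value against the MDCT reference."""
            return 100.0 * abs(measured_hu - reference_hu) / abs(reference_hu)

        rng = np.random.default_rng(3)
        target = rng.normal(120, 15, 500)      # tissue-equivalent insert
        background = rng.normal(0, 15, 500)    # water-equivalent background
        print(f"CNR = {cnr(target, background):.1f}")
        print(f"%Diff = {percent_diff(108.0, 120.0):.1f}%")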

  16. Sample size requirements for estimating effective dose from computed tomography using solid-state metal-oxide-semiconductor field-effect transistor dosimetry

    SciTech Connect

    Trattner, Sigal; Cheng, Bin; Pieniazek, Radoslaw L.; Hoffmann, Udo; Douglas, Pamela S.; Einstein, Andrew J.

    2014-04-15

    Purpose: Effective dose (ED) is a widely used metric for comparing ionizing radiation burden between different imaging modalities, scanners, and scan protocols. In computed tomography (CT), ED can be estimated by performing scans on an anthropomorphic phantom in which metal-oxide-semiconductor field-effect transistor (MOSFET) solid-state dosimeters have been placed to enable organ dose measurements. Here a statistical framework is established to determine the sample size (number of scans) needed for estimating ED to a desired precision and confidence, for a particular scanner and scan protocol, subject to practical limitations. Methods: The statistical scheme involves solving equations which minimize the sample size required for estimating ED to desired precision and confidence. It is subject to a constrained variation of the estimated ED and solved using the Lagrange multiplier method. The scheme incorporates measurement variation introduced both by MOSFET calibration, and by variation in MOSFET readings between repeated CT scans. Sample size requirements are illustrated on cardiac, chest, and abdomen–pelvis CT scans performed on a 320-row scanner and chest CT performed on a 16-row scanner. Results: Sample sizes for estimating ED vary considerably between scanners and protocols. Sample size increases as the required precision or confidence is higher and also as the anticipated ED is lower. For example, for a helical chest protocol, for 95% confidence and 5% precision for the ED, 30 measurements are required on the 320-row scanner and 11 on the 16-row scanner when the anticipated ED is 4 mSv; these sample sizes are 5 and 2, respectively, when the anticipated ED is 10 mSv. Conclusions: Applying the suggested scheme, it was found that even at modest sample sizes, it is feasible to estimate ED with high precision and a high degree of confidence. As CT technology develops enabling ED to be lowered, more MOSFET measurements are needed to estimate ED with the same
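
    Setting aside the calibration-variance term and the Lagrange machinery, a simplified normal-approximation version of the question is: how many scans n put the mean ED within relative precision epsilon of the truth at confidence 1-alpha, given a per-scan coefficient of variation CV? That gives n >= (z * CV / epsilon)^2. A sketch under those assumptions, with a hypothetical CV:

        import math
        from scipy.stats import norm

        def required_scans(cv, precision, confidence=0.95):
            """Scans needed so the mean ED lies within `precision` (relative)
            of the true ED at the given confidence, assuming independent
            normal measurements with coefficient of variation cv."""
            z = norm.ppf(0.5 + confidence / 2)
            return math.ceil((z * cv / precision) ** 2)

        print(required_scans(cv=0.14, precision=0.05))  # -> 31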

  17. Application of an asymmetric flow field flow fractionation multi-detector approach for metallic engineered nanoparticle characterization--prospects and limitations demonstrated on Au nanoparticles.

    PubMed

    Hagendorfer, Harald; Kaegi, Ralf; Traber, Jacqueline; Mertens, Stijn F L; Scherrers, Roger; Ludwig, Christian; Ulrich, Andrea

    2011-11-14

    In this work we discuss the method development, applicability and limitations of an asymmetric flow field flow fractionation (A4F) system combined with a multi-detector setup consisting of UV/vis, light scattering, and inductively coupled plasma mass spectrometry (ICPMS) detectors. The overall aim was to obtain a size-dependent, element-specific, and quantitative method appropriate for the characterization of metallic engineered nanoparticle (ENP) dispersions. Systematic investigations of crucial method parameters were therefore performed using well-characterized Au nanoparticles (Au-NPs) as a defined model system. For good separation performance, the A4F flow, membrane, and carrier conditions were optimized. To obtain reliable size information, laser-light-scattering-based detectors were evaluated; an online dynamic light scattering (DLS) detector gave good results for the investigated Au-NPs up to a hydrodynamic diameter of 80 nm. To accommodate the large sensitivity differences of the various detectors, as well as to guarantee long-term stability and minimal contamination of the mass spectrometer, a split-flow concept for coupling the ICPMS was evaluated. To test for reliable quantification, the ICPMS signal response of ionic Au standards was compared with that of Au-NPs. With proper surfactant stabilization, no difference was observed for concentrations of 1-50 μg Au L(-1) in the size range from 5 to 80 nm for citrate-stabilized dispersions. However, studies using different A4F channel membranes showed unspecific particle-membrane interactions, resulting in retention time shifts and unspecific loss of nanoparticles depending on the Au-NP system as well as the membrane batch and type. Reliable quantification and discrimination of ionic and particulate species was therefore performed using ICPMS in combination with ultracentrifugation rather than direct quantification with the A4F multi-detector setup. Figures of merit were obtained by comparing the

  18. Analysis of the topological properties of the proximal femur on a regional scale: evaluation of multi-detector CT-scans for the assessment of biomechanical strength using local Minkowski functionals in 3D

    NASA Astrophysics Data System (ADS)

    Boehm, H. F.; Link, T. M.; Monetti, R. A.; Kuhn, V.; Eckstein, F.; Raeth, C. W.; Reiser, M.

    2006-03-01

    In our recent studies on the analysis of bone texture in the context of osteoporosis, we demonstrated the great potential of topological evaluation of bone architecture based on the Minkowski Functionals (MF) in 2D and 3D for predicting the mechanical strength of cubic bone specimens depicted by high-resolution MRI. Here, we instead assess the mechanical characteristics of whole hip bone specimens imaged by multi-detector computed tomography. The specific properties of this imaging modality and of the bone tissue in the proximal femur require the introduction of a new analysis method. The internal architecture of the hip is functionally highly specialized to withstand the complex pattern of external and internal forces associated with human gait. Since the direction, connectivity and distribution of the trabeculae change considerably within narrow spatial limits, it is most reasonable to evaluate the femoral bone structure on a local scale. The Minkowski Functionals are a set of morphological descriptors for the topological characterization of binarized, multi-dimensional, convex objects with respect to shape, structure, and the connectivity of their components. The MF are usually used as global descriptors and may react very sensitively to minor structural variations, which presents a major limitation in a number of applications. The objective of this work is to assess the mechanical competence of whole hip bone specimens using parameters based on the MF. We introduce an algorithm that considers the local topological aspects of the bone architecture of the proximal femur, allowing the identification of regions within the bone that contribute more to the overall mechanical strength than others.
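
    In 3D the four Minkowski functionals are volume, surface area, mean breadth and the Euler characteristic. The local analysis described can be sketched by evaluating a functional over sliding sub-volumes; the example below maps the Euler characteristic across windows of a binarized volume with scikit-image, with window and step sizes chosen arbitrarily.

        import numpy as np
        from skimage.measure import euler_number

        def local_euler_map(binary_vol, window=16, step=8):
            """Euler characteristic in sliding sub-volumes: a regional,
            rather than global, topological descriptor of the network."""
            out = {}
            zs, ys, xs = binary_vol.shape
            for z in range(0, zs - window + 1, step):
                for y in range(0, ys - window + 1, step):
                    for x in range(0, xs - window + 1, step):
                        sub = binary_vol[z:z + window, y:y + window, x:x + window]
                        out[(z, y, x)] = euler_number(sub, connectivity=3)
            return out

        # Toy volume: pseudo-trabecular structure from thresholded noise.
        rng = np.random.default_rng(7)
        vol = rng.random((48, 48, 48)) > 0.7
        print(list(local_euler_map(vol).items())[:3])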

  19. Comparison of Diagnostic Accuracy of Radiation Dose-Equivalent Radiography, Multidetector Computed Tomography and Cone Beam Computed Tomography for Fractures of Adult Cadaveric Wrists

    PubMed Central

    Neubauer, Jakob; Benndorf, Matthias; Reidelbach, Carolin; Krauß, Tobias; Lampert, Florian; Zajonc, Horst; Kotter, Elmar; Langer, Mathias; Fiebich, Martin; Goerke, Sebastian M.

    2016-01-01

    Purpose To compare the diagnostic accuracy of radiography, radiography-equivalent-dose multidetector computed tomography (RED-MDCT) and radiography-equivalent-dose cone beam computed tomography (RED-CBCT) for wrist fractures. Methods As study subjects we obtained 10 cadaveric human hands from body donors. The distal radius, distal ulna and carpal bones (n = 100) were artificially fractured in random order in a controlled experimental setting. We performed radiation-dose-equivalent radiography (settings as in standard clinical care), RED-MDCT in a 320-row MDCT with single-shot mode, and RED-CBCT in a device dedicated to musculoskeletal imaging. Three raters independently evaluated the resulting images for fractures and the level of confidence for each finding. The gold standard was established by consensus reading of a high-dose MDCT. Results Pooled sensitivity was higher in RED-MDCT (0.89) and RED-CBCT (0.81) than in radiography (0.54) (P < .004). No significant differences were detected between the modalities' specificities (P = .98). Raters' confidence was higher in RED-MDCT and RED-CBCT than in radiography (P < .001). Conclusion The diagnostic accuracy of RED-MDCT and RED-CBCT for wrist fractures proved to be similar to, and in part even higher than, radiography. Readers are more confident in their reporting with the cross-sectional modalities. Dose-equivalent cross-sectional computed tomography of the wrist could replace plain radiography for fracture diagnosis in the long run. PMID:27788215

  20. Development of adaptive noise reduction filter algorithm for pediatric body images in a multi-detector CT

    NASA Astrophysics Data System (ADS)

    Nishimaru, Eiji; Ichikawa, Katsuhiro; Okita, Izumi; Ninomiya, Yuuji; Tomoshige, Yukihiro; Kurokawa, Takehiro; Ono, Yutaka; Nakamura, Yuko; Suzuki, Masayuki

    2008-03-01

    Recently, several kinds of post-processing image filters that reduce the noise of computed tomography (CT) images have been proposed. However, these filters were mostly designed for adults and are not very effective for small (< 20 cm) display fields of view (FOV), so they cannot be used for pediatric body images (e.g., premature babies and infants). We have developed a new noise reduction filter algorithm for pediatric body CT images. The algorithm is based on 3D post-processing in which the output pixel values are calculated by nonlinear interpolation in the z-direction on the original volumetric data sets. Because no in-plane (axial) processing is needed, the in-plane spatial resolution does not change. In phantom studies, our algorithm reduced the noise SD by up to 40% without affecting the spatial resolution in the x-y plane or along the z-axis, and improved the CNR by up to 30%. This newly developed filter algorithm should be useful for diagnosis and radiation dose reduction in pediatric body CT.

  1. The development, validation and application of a multi-detector CT (MDCT) scanner model for assessing organ doses to the pregnant patient and the fetus using Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Gu, J.; Bednarz, B.; Caracappa, P. F.; Xu, X. G.

    2009-05-01

    The latest multiple-detector technologies have further increased the popularity of x-ray CT as a diagnostic imaging modality. There is a continuing need to assess the potential radiation risk associated with such rapidly evolving multi-detector CT (MDCT) modalities and scanning protocols. This need can be met by CT source models that are integrated with patient computational phantoms for organ dose calculations. To this end, this work developed and validated an MDCT scanner model using the Monte Carlo method, and integrated pregnant patient phantoms into the scanner model to assess the dose to the fetus as well as the doses to the organs and tissues of the pregnant patient. The Monte Carlo code MCNPX was used to simulate the x-ray source, including the energy spectrum, filter and scan trajectory. Detailed CT scanner components were specified using an iterative trial-and-error procedure for a GE LightSpeed CT scanner. The scanner model was validated by comparing simulated results against measured CTDI values and dose profiles reported in the literature. The source movement along the helical trajectory was simulated using pitches of 0.9375 and 1.375, respectively. The validated scanner model was then integrated with phantoms of a pregnant patient at three different gestational stages to calculate organ doses. The dose to the fetus of the 3-month pregnant patient phantom was 0.13 mGy/100 mAs from the chest scan and 0.57 mGy/100 mAs from the kidney scan. For the chest scan of the 6-month and 9-month patient phantoms, the fetal doses were 0.21 mGy/100 mAs and 0.26 mGy/100 mAs, respectively. The paper also discusses how these fetal dose values can be used to evaluate imaging procedures and to assess risk using the recommendations of the report of AAPM Task Group 36. This work demonstrates the ability to model and validate an MDCT scanner by the Monte Carlo method, as well as

  3. Non-invasive Detection of Aortic and Coronary Atherosclerosis in Homozygous Familial Hypercholesterolemia by 64 Slice Multi-detector Row Computed Tomography Angiography

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Homozygous familial hypercholesterolemia (HoFH) is a rare disorder characterized by the early onset of atherosclerosis, often at the ostia of coronary arteries. In this study we document for the first time that aortic and coronary atherosclerosis can be detected using 64 slice multiple detector-row ...

  5. Voxel-Based Sensitivity of Flat-Panel CT for the Detection of Intracranial Hemorrhage: Comparison to Multi-Detector CT

    PubMed Central

    Frölich, Andreas M.; Buhk, Jan-Hendrik; Fiehler, Jens; Kemmling, Andre

    2016-01-01

    Objectives Flat-panel CT (FPCT) allows cross-sectional parenchymal, vascular and perfusion imaging within the angiography suite, which could greatly facilitate acute stroke management. We hypothesized that FPCT offers diagnostic accuracy equal to multi-detector CT (MDCT) as a primary tool to exclude intracranial hemorrhage. Methods 22 patients with intracranial hematomas who had both MDCT and FPCT performed within 24 hours were retrospectively identified; patients with a visible change in hematoma size or configuration were excluded. Two raters independently segmented the hemorrhagic lesions. Data sets and the corresponding binary lesion maps were co-registered to compare hematoma volumes. The diagnostic accuracy of FPCT for detecting hemorrhage was calculated from a voxel-wise analysis of lesion overlap against the reference MDCT. Results Mean hematoma size was similar between MDCT (16.2±8.9 ml) and FPCT (16.1±8.6 ml), with near-perfect correlation of hematoma sizes between modalities (ρ = 0.95, p<0.001). The sensitivity and specificity of FPCT for detecting hemorrhagic voxels were 61.6% and 99.8% for intraventricular hematomas and 67.7% and 99.5% for all other intracranial hematomas. Conclusions In this small sample, containing predominantly cases with subarachnoid hemorrhage, FPCT-based assessment of hemorrhagic volume in the brain yields acceptable accuracy compared with reference MDCT, albeit with limited sensitivity at the voxel level. Further assessment and improvement of FPCT is necessary before it can be applied as a primary imaging modality to exclude intracranial hemorrhage in acute stroke patients. PMID:27806106
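
    Given co-registered binary lesion maps, the voxel-wise sensitivity and specificity reported here follow from the voxel-level confusion matrix. A minimal sketch, assuming boolean NumPy arrays of identical shape:

        import numpy as np

        def voxelwise_accuracy(test_mask, ref_mask):
            """Voxel-wise sensitivity and specificity of a test segmentation
            (e.g. an FPCT lesion map) against a reference (MDCT) map."""
            tp = np.logical_and(test_mask, ref_mask).sum()
            fn = np.logical_and(~test_mask, ref_mask).sum()
            tn = np.logical_and(~test_mask, ~ref_mask).sum()
            fp = np.logical_and(test_mask, ~ref_mask).sum()
            return tp / (tp + fn), tn / (tn + fp)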

  6. A 64-slice multi-detector CT scan could evaluate the change of the left atrial appendage thrombi of the atrial fibrillation patient, which was reduced by warfarin therapy.

    PubMed

    Takeuchi, Hidekazu

    2011-08-19

    Left atrial appendage (LAA) thrombi in patients with atrial fibrillation (AF) are a treatable cause of stroke, and warfarin therapy for affected patients is very important. Transoesophageal echocardiography (TOE) is the usual clinical tool for detecting LAA thrombi; recently, 64-slice multi-detector CT (64-MDCT) has made it possible to display LAA thrombi more easily than TOE. I report a case in which a 64-MDCT scan was used successfully to display the change in an LAA thrombus reduced by warfarin therapy: the thrombus shrank from 25.2 mm × 19.3 mm to 22.1 mm × 14.8 mm after 3 months of warfarin. The 64-MDCT scan was useful for assessing both the LAA thrombus itself and its change over time, and thus for evaluating the effectiveness of warfarin therapy.

  7. [Novel method of noise power spectrum measurement for computed tomography images with adaptive iterative reconstruction method].

    PubMed

    Nishimaru, Eiji; Ichikawa, Katsuhiro; Hara, Takanori; Terakawa, Shoichi; Yokomachi, Kazushi; Fujioka, Chikako; Kiguchi, Masao; Ishifuro, Minoru

    2012-01-01

    Adaptive iterative reconstruction techniques (IRs) can decrease image noise in computed tomography (CT) and are expected to contribute to radiation dose reduction. To evaluate the performance of IRs, the conventional two-dimensional (2D) noise power spectrum (NPS) is widely used. However, when an IR lowers the NPS at all spatial frequencies (similar to the NPS change produced by a dose increase), the conventional method cannot evaluate the noise properties correctly, because it does not account for the volumetric nature of CT image data. The purpose of our study was to develop a new method for NPS measurement that can be adapted to IRs. Our method utilizes thick multi-planar reconstruction (MPR) images. Thick images are generally made by averaging CT volume data in the direction perpendicular to the MPR plane (e.g., the z-direction for an axial MPR plane). Using this averaging as a section through the 3D NPS, we can obtain an adequate 2D extracted NPS (eNPS) from the 3D NPS. We applied this method to IR images generated with adaptive iterative dose reduction 3D (AIDR-3D, Toshiba) to investigate its validity. A water phantom with a 24 cm diameter was scanned at 120 kV and 200 mAs with a 320-row CT (Aquilion ONE, Toshiba). The results showed that the adequate MPR image thickness for eNPS was 25.0 mm or more. Our new NPS measurement method utilizing thick MPR images was accurate and effective for evaluating the noise reduction effects of IRs.
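
    The core of the proposed measurement (average slices into a thick MPR, then take a 2D spectrum) can be sketched as follows. The normalization by pixel area over ROI size is the standard 2D NPS estimator; the input is assumed to be a detrended noise-only volume, e.g. a water-phantom scan minus its ensemble-mean image, with z as the first axis.

        import numpy as np

        def enps_from_thick_mpr(noise_vol, pixel_mm, thickness_slices):
            """2D noise power spectrum (HU^2 mm^2) of a thick MPR image made
            by averaging `thickness_slices` axial slices of a noise volume."""
            thick = noise_vol[:thickness_slices].mean(axis=0)  # thick MPR image
            thick = thick - thick.mean()                       # remove DC term
            ny, nx = thick.shape
            dft = np.fft.fftshift(np.fft.fft2(thick))
            return (np.abs(dft) ** 2) * (pixel_mm * pixel_mm) / (nx * ny)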

  8. Monte Carlo simulations in multi-detector CT (MDCT) for two PET/CT scanner models using MASH and FASH adult phantoms

    NASA Astrophysics Data System (ADS)

    Belinato, W.; Santos, W. S.; Paschoal, C. M. M.; Souza, D. N.

    2015-06-01

    The combination of positron emission tomography (PET) and computed tomography (CT) has been used extensively in oncology for the diagnosis and staging of tumors, radiotherapy planning and follow-up of patients with cancer, as well as in cardiology and neurology. This study uses the Monte Carlo method to determine the internal organ doses deposited in computational phantoms by the multidetector CT (MDCT) beams of two PET/CT devices operating with different parameters. The differences between the MDCT beams were largely related to the total filtration, which changes the beam energy inside the gantry. This parameter was determined experimentally with the Accu-Gold Radcal measurement system, and the experimental total filtration values were included in the simulations of the two MCNPX code scenarios. The absorbed organ doses obtained in the MASH and FASH phantoms indicate that the bowtie filter geometry and the energy of the X-ray beam have a significant influence on the results, although this influence can be compensated by adjusting other variables, such as the tube current-time product (mAs) and pitch, during PET/CT procedures.

  9. Anatomy of the larynx and pharynx: effects of age, gender and height revealed by multidetector computed tomography.

    PubMed

    Inamoto, Y; Saitoh, E; Okada, S; Kagaya, H; Shibata, S; Baba, M; Onogi, K; Hashimoto, S; Katada, K; Wattanapan, P; Palmer, J B

    2015-09-01

    Although oropharyngeal and laryngeal structures are essential for swallowing, their three-dimensional (3D) anatomy is not well understood, due in part to limitations of available measuring techniques. This study used 3D images acquired by 320-row area detector computed tomography ('320-ADCT') to measure the pharynx and larynx and to investigate the effects of age, gender and height. Fifty-four healthy volunteers (30 male, 24 female, 23-77 years) underwent one single-phase volume scan (0.35 s) with 320-ADCT during resting tidal breathing. Six measurements of the pharynx and two of the larynx were performed, and bivariate statistical methods were used to analyse the effects of gender, age and height on these measurements. Length and volume were significantly larger for men than for women for every measurement (P < 0.05) and increased with height (P < 0.05). Multiple regression analysis was performed to examine the interactions of gender, height and age, each of which had significant effects on certain values. The volume of the larynx and hypopharynx was significantly affected by height and age; the length of the pharynx was associated with gender and age; and the length of the vocal folds and the distance from the valleculae to the vocal folds were significantly affected by gender (P < 0.05). These results suggest that age, gender and height have independent and interacting effects on the morphology of the pharynx and larynx. Three-dimensional imaging and morphometrics using 320-ADCT are powerful tools for efficiently and reliably observing and measuring the pharynx and larynx.

  10. Deconvolution Methods for Multi-Detectors

    DTIC Science & Technology

    1989-08-30

    [Only fragments of this report's abstract and table of contents survive in the record.] ...in other words, the integral (71) can be considered as an operator defined by a certain linear combination of the Dirac masses δ and their derivatives... This is the Sturm-Liouville expansion for a boundary value problem singular at t = 0 and with derivative equal to zero at t = r1. Contents include: Chapter 1, Analytic Bezout Identities; Chapter 2, Exact Deconvolution for Multiple Convolution Operators.

  11. Low fingertip temperature rebound measured by digital thermal monitoring strongly correlates with the presence and extent of coronary artery disease diagnosed by 64-slice multi-detector computed tomography.

    PubMed

    Ahmadi, Naser; Nabavi, Vahid; Nuguri, Vivek; Hajsadeghi, Fereshteh; Flores, Ferdinand; Akhtar, Mohammad; Kleis, Stanley; Hecht, Harvey; Naghavi, Morteza; Budoff, Matthew

    2009-10-01

    Previous studies showed strong correlations between low fingertip temperature rebound measured by digital thermal monitoring (DTM) during a 5 min arm-cuff induced reactive hyperemia and both the Framingham Risk Score (FRS) and coronary artery calcification (CAC) in asymptomatic populations. This study evaluates the correlation between DTM and coronary artery disease (CAD) measured by CT angiography (CTA) in symptomatic patients. It also investigates the correlation between CTA and a new index of neurovascular reactivity measured by DTM. 129 patients, age 63 +/- 9 years, 68% male, underwent DTM, CAC and CTA. Adjusted DTM indices in the occluded arm were calculated: temperature rebound (aTR) and area under the temperature curve (aTMP-AUC). The DTM neurovascular reactivity (NVR) index was measured based on the increase in fingertip temperature in the non-occluded arm. Obstructive CAD was defined as ≥50% luminal stenosis, and normal as no stenosis and CAC = 0. Baseline fingertip temperature was not different across the groups. However, all DTM indices of vascular and neurovascular reactivity significantly decreased from normal to non-obstructive to obstructive CAD [(aTR 1.77 +/- 1.18 to 1.24 +/- 1.14 to 0.94 +/- 0.92) (P = 0.009), (aTMP-AUC: 355.6 +/- 242.4 to 277.4 +/- 182.4 to 184.4 +/- 171.2) (P = 0.001), (NVR: 161.5 +/- 147.4 to 77.6 +/- 88.2 to 48.8 +/- 63.8) (P = 0.015)]. After adjusting for risk factors, the odds ratios for obstructive CAD compared to normal in the lowest versus the two upper tertiles of FRS, aTR, aTMP-AUC, and NVR were 2.41 (1.02-5.93), P = 0.05; 8.67 (2.6-9.4), P = 0.001; 11.62 (5.1-28.7), P = 0.001; and 3.58 (1.09-11.69), P = 0.01, respectively. DTM indices and FRS combined resulted in an ROC curve area of 0.88 for the prediction of obstructive CAD. In patients suspected of CAD, low fingertip temperature rebound measured by DTM significantly predicted CTA-diagnosed obstructive disease.
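
    The tertile comparisons above reduce to odds ratios from 2x2 tables. A minimal sketch of that computation, with Woolf's approximation for the 95% confidence interval, follows; the counts are invented, not the study's data.

        # Odds ratio with a 95% CI (Woolf's method) from a 2x2 table comparing the
        # lowest tertile of an index against the upper two tertiles. Invented counts.
        import math

        def odds_ratio(a, b, c, d):
            """a,b = diseased/healthy in lowest tertile; c,d = in upper tertiles."""
            or_ = (a * d) / (b * c)
            se = math.sqrt(1/a + 1/b + 1/c + 1/d)
            lo, hi = (math.exp(math.log(or_) + s * 1.96 * se) for s in (-1, 1))
            return or_, lo, hi

        print(odds_ratio(30, 13, 25, 61))  # hypothetical counts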

  12. The role of mobile computed tomography in mass fatality incidents.

    PubMed

    Rutty, Guy N; Robinson, Claire E; BouHaidar, Ralph; Jeffery, Amanda J; Morgan, Bruno

    2007-11-01

    Mobile multi-detector computed tomography (MDCT) scanners are potentially available to temporary mortuaries and can be operational within 20 min of arrival. We describe, to our knowledge, the first use of mobile MDCT for a mass fatality incident. A mobile MDCT scanner attended the disaster mortuary after a five-vehicle road traffic incident. Five of the six bodies were successfully imaged by MDCT in c. 15 min per body. Subsequent full radiological analysis took c. 1 h per case. The results were compared to the autopsy examinations. We discuss the advantages and disadvantages of imaging with mobile MDCT in relation to mass fatality work, illustrating the body pathway process and its role in the identification of pathology, personal effects, and health and safety hazards. We propose that the adoption of a single modality of mobile MDCT could replace the current use of multiple radiological sources within a mass fatality mortuary.

  13. The role of computed tomography in terminal ballistic analysis.

    PubMed

    Rutty, G N; Boyce, P; Robinson, C E; Jeffery, A J; Morgan, B

    2008-01-01

    Terminal ballistics concerns the science of projectile behaviour within a target and includes wound ballistics, which considers what happens when a projectile strikes a living being. A number of soft tissue ballistic simulants have been used to assess the damage to tissue caused by projectiles. Standard assessment of these materials, such as ballistic soap or ordnance gelatine, requires that the block be opened or that a mould be made to visualize the wound track. This is time consuming and may affect the accuracy of the findings, especially if the block dries and alters shape during the process. Therefore, accurate numerical analysis of the permanent or temporary cavity is limited. Computed tomography (CT) potentially offers a quicker, non-invasive analysis tool for this task. Four commercially purchased ballistic glycerine soap blocks were used. Each had a single firearm discharged into it from a distance of approximately 15 cm, using both gunshot and shotgun projectiles. After discharge, each block was imaged by a modern 16-slice multi-detector CT scanner and analysed using 3-D reconstruction software. Using the anterior-posterior and lateral scout views and the multi-plane reconstructed images, it was possible to visualize the temporary cavity, as well as the fragmentation and dispersal pattern of the projectiles, the distance travelled and the angle of dispersal within the block of each projectile or fragment. A virtual cast of the temporary cavity can also be made. Multi-detector CT with 3-D analysis software is shown to create a reliable permanent record of the projectile path, allowing rapid analysis of different firearms and projectiles.

  14. An automated system for lung nodule detection in low-dose computed tomography

    NASA Astrophysics Data System (ADS)

    Gori, I.; Fantacci, M. E.; Preite Martinez, A.; Retico, A.

    2007-03-01

    A computer-aided detection (CAD) system for the identification of pulmonary nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 Italian project. One of the main goals of this project is to build a distributed database of lung CT scans in order to enable automated image analysis through a data and CPU GRID infrastructure. The basic modules of our lung-CAD system, a dot-enhancement filter for nodule candidate selection and a neural classifier for false-positive finding reduction, are described. The system was designed and tested for both internal and sub-pleural nodules. The results obtained on the collected database of low-dose thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.
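
    For readers unfamiliar with FROC analysis, the sketch below assembles an FROC curve from per-scan candidate scores: at each threshold, sensitivity is plotted against the mean number of false positives per scan. It is a generic illustration (assuming at most one candidate per true nodule), not the MAGIC-5 code.

        # Generic FROC assembly from per-scan CAD candidate scores.
        # Simplification: each true nodule is hit by at most one candidate.
        import numpy as np

        def froc(candidates, n_true_nodules):
            """candidates: list per scan of (score, is_true_positive) tuples."""
            scores = sorted({s for scan in candidates for s, _ in scan}, reverse=True)
            points = []
            for t in scores:
                tp = sum(hit for scan in candidates for s, hit in scan if s >= t)
                fp = sum((not hit) for scan in candidates for s, hit in scan if s >= t)
                points.append((fp / len(candidates), tp / n_true_nodules))
            return points  # (false positives per scan, sensitivity) pairs

        scans = [[(0.9, True), (0.4, False)], [(0.8, True), (0.7, False), (0.2, False)]]
        print(froc(scans, n_true_nodules=2))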

  15. Does Computed Tomography Change our Observation and Management of Fracture Non-Unions?

    PubMed Central

    Kleinlugtenbelt, Ydo V.; Scholtes, Vanessa A.B.; Toor, Jay; Amaechi, Christian; Maas, Mario; Bhandari, Mohit; Poolman, Rudolf W.; Kloen, Peter

    2016-01-01

    Background: The purpose of this study was to determine whether Multi-Detector Computed Tomography (MDCT) in addition to plain radiographs influences radiologists’ and orthopedic surgeons’ diagnosis and treatment plans for delayed unions and non-unions. Methods: A retrospective database of 32 non-unions was reviewed by 20 observers. Observers rated plain radiographs and, subsequently, Multi-Detector Helical Computed Tomography (MDCT) scans on a scale of 1 to 5 in the following categories: “healed”, “bridging callus present”, “persistent fracture line” and “surgery advised”. Interobserver reliability in each category was calculated using the Intraclass Correlation Coefficient (ICC). The influence of the MDCT scan on the raters’ observations was determined in each case by subtracting the scores of the two time points. Results: All four categories showed fair interobserver reliability when using plain radiographs. MDCT showed no improvement: the reliability was poor for the categories “bridging callus present” and “persistent fracture line”, and fair for “healed” and “surgery advised”. In none of the cases did MDCT lead to a change of management from nonoperative to operative treatment or vice versa. For 18 out of 32 cases, the treatment plans did not alter. In seven cases MDCT led to operative treatment where on X-ray the treatment plan had been undecided. Conclusion: In this study, the interobserver reliability of the MDCT scan was not greater than that of conventional radiographs for determining non-union. However, an MDCT scan did lead to a more invasive approach in equivocal cases. Therefore an MDCT is only recommended for making treatment decisions in those cases. PMID:27847846
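
    The ICC used above can be computed directly from two-way ANOVA mean squares. Below is a minimal numpy sketch of ICC(2,1) (two-way random effects, absolute agreement, single rater); the ratings matrix is invented for illustration.

        # Minimal ICC(2,1) from ANOVA mean squares (Shrout & Fleiss convention).
        # Ratings matrix: rows = cases, columns = observers (invented values).
        import numpy as np

        def icc2_1(x):
            n, k = x.shape
            grand = x.mean()
            ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # cases
            ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
            sse = ((x - x.mean(axis=1, keepdims=True)
                      - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
            ms_e = sse / ((n - 1) * (k - 1))
            return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

        ratings = np.array([[4, 4, 5], [2, 3, 2], [5, 5, 4], [1, 2, 1]], dtype=float)
        print(round(icc2_1(ratings), 3))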

  16. Multi-detector CT in the paediatric urinary tract.

    PubMed

    Damasio, M B; Darge, K; Riccabona, M

    2013-07-01

    The use of paediatric multi-slice CT (MSCT) is rapidly increasing worldwide. As technology advances, its application in paediatric care is constantly expanding, with an increasing need for radiation dose control and appropriate utilization. Recommendations on how and when to use CT for assessment of the paediatric urinary tract are therefore an important issue. Accordingly, the European Society of Paediatric Radiology (ESPR) uroradiology task force and European Society of Urogenital Radiology (ESUR) paediatric working groups created a proposal for performing renal CT in children that has recently been published. The objective of this paper is to discuss paediatric urinary tract CT (uro-CT) in more detail and depth. The specific aim is not only to offer general recommendations on clinical indications and optimization processes for paediatric CT examination, but also to address various childhood characteristics and phenomena that help explain why uro-CT is approached and used differently in children than in adults. According to ALARA principles, paediatric uro-CT should only be considered for selected indications, provided high-level comprehensive US is not conclusive and alternative non-ionizing techniques such as MR are not available or appropriate. Optimization of paediatric uro-CT protocols (considering lower, age-adapted kV and mAs) is mandatory, and the number of phases and acquisition series should be kept as low as possible.

  17. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  18. 64-MULTIDETECTOR COMPUTED TOMOGRAPHIC ANGIOGRAPHY OF THE CANINE CORONARY ARTERIES

    PubMed Central

    Drees, Randi; Frydrychowicz, Alex; Reeder, Scott B.; Pinkerton, Marie E.; Johnson, Rebecca

    2012-01-01

    Canine coronary computed tomographic angiography (CTA) was performed in four anesthetized healthy dogs using 64-multi-detector computed tomography. Esmolol, a β-1 adrenergic receptor antagonist, and sodium nitroprusside, an arteriolar and venous dilator, were administered to enhance visualization of the coronary arteries by reducing heart rate and creating vasodilation. The left main coronary artery with its three main branches and the right coronary artery were visualized and subdivided into 13 segments for evaluation. The optimal reconstruction interval, expressed as a percentage of the R-to-R interval, was determined to be 5% in 2.9%, 35% in 1%, 75% in 21.2%, 85% in 43.3%, and 95% in 31.7% of the segments. Overall image quality was good in 41.3% of the segments and excellent in 14.4%. There was blur in 98.1%, motion in 17.3%, and stair-step in 6.7% of the evaluated segments, but these artifacts did not interfere with anatomic depiction of the arteries. Cross-sectional anatomy of the coronary arteries as evaluated from the coronary CTA agreed well with gross anatomic evaluation and published information. The use of esmolol did not achieve the target heart rate of 60–65 beats/min. Nitroprusside had no significant effect on the visualized length or diameter of the coronary artery branches. Coronary CTA is useful for the anatomic depiction of coronary artery branches in the dog. PMID:21521398

  19. Computer-based route-definition system for peripheral bronchoscopy.

    PubMed

    Graham, Michael W; Gibbs, Jason D; Higgins, William E

    2012-04-01

    Multi-detector computed tomography (MDCT) scanners produce high-resolution images of the chest. Given a patient's MDCT scan, a physician can use an image-guided intervention system to first plan and later perform bronchoscopy to diagnostic sites situated deep in the lung periphery. An accurate definition of complete routes through the airway tree leading to the diagnostic sites, however, is vital for avoiding navigation errors during image-guided bronchoscopy. We present a system for the robust definition of complete airway routes suitable for image-guided bronchoscopy. The system incorporates both automatic and semiautomatic MDCT analysis methods for this purpose. Using an intuitive graphical user interface, the user invokes automatic analysis on a patient's MDCT scan to produce a series of preliminary routes. Next, the user visually inspects each route and quickly corrects the observed route defects using the built-in semiautomatic methods. Application of the system to a human study for the planning and guidance of peripheral bronchoscopy demonstrates the efficacy of the system.
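
    The route-definition step can be pictured as a path search over the segmented airway tree's branch graph. The sketch below is a generic breadth-first search on a toy adjacency dictionary with hypothetical branch names; it is not the published system's algorithm.

        # Illustrative route definition: breadth-first search over an airway
        # branch graph (adjacency dict). Branch names are hypothetical.
        from collections import deque

        def airway_route(tree, start, target):
            """Return the branch-by-branch path from start to target, or None."""
            queue, seen = deque([[start]]), {start}
            while queue:
                path = queue.popleft()
                if path[-1] == target:
                    return path
                for nxt in tree.get(path[-1], []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(path + [nxt])
            return None

        tree = {"trachea": ["LMB", "RMB"], "RMB": ["RUL", "RB4"], "RB4": ["RB4a"]}
        print(airway_route(tree, "trachea", "RB4a"))  # ['trachea', 'RMB', 'RB4', 'RB4a']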

  20. Anomalous right coronary artery arising next to the left coronary ostium: unambiguous detection of the anatomy by computed tomography and evaluation of functional significance by cardiovascular magnetic resonance.

    PubMed

    Korosoglou, Grigorios; Heye, Tobias; Giannitsis, Evangelos; Hosch, Waldemar; Kauczor, Hans U; Katus, Hugo A

    2010-11-19

    Herein we report on the diagnostic potential of multi-detector row computed tomography (MDCT) combined with cardiovascular magnetic resonance (CMR) for the diagnostic workup of an adult patient with a rare coronary anomaly. MDCT unambiguously detected the anomalous right coronary artery (RCA), which originated next to the left coronary ostium and coursed inter-arterially between the ascending aorta and the pulmonary trunk. The intramural proximal intussusception of the ectopic RCA could be clearly appreciated on MDCT images, while multiple mixed plaques were detected in the left anterior descending (LAD) artery, resulting in moderate stenosis of this vessel. CMR during adenosine infusion ruled out inducible ischemia, yielding normal perfusion patterns in both the RCA and the LAD coronary territories. Since ischemia was not demonstrated by stress CMR, revascularization was not performed.

  1. Computers and Computer Cultures.

    ERIC Educational Resources Information Center

    Papert, Seymour

    1981-01-01

    Instruction using computers is viewed as different from most other approaches to education, by allowing more than right or wrong answers, by providing models for systematic procedures, by shifting the boundary between formal and concrete processes, and by influencing the development of thinking in many new ways. (MP)

  2. Comparison of image features calculated in different dimensions for computer-aided diagnosis of lung nodules

    NASA Astrophysics Data System (ADS)

    Xu, Ye; Lee, Michael C.; Boroczky, Lilla; Cann, Aaron D.; Borczuk, Alain C.; Kawut, Steven M.; Powell, Charles A.

    2009-02-01

    Features calculated from different dimensions of images capture quantitative information about lung nodules through one or multiple image slices. Previously published computer-aided diagnosis (CADx) systems have used either two-dimensional (2D) or three-dimensional (3D) features, though there has been little systematic analysis of the relevance of the different dimensions or of the impact of combining them. The aim of this study is to determine the importance of combining features calculated in different dimensions. We have performed CADx experiments on 125 pulmonary nodules imaged using multi-detector row CT (MDCT). The CADx system computed 192 2D, 2.5D, and 3D image features of the lesions. Leave-one-out experiments were performed using five different combinations of features from different dimensions: 2D, 3D, 2.5D, 2D+3D, and 2D+3D+2.5D. The experiments were performed ten times for each group. Accuracy, sensitivity and specificity were used to evaluate the performance. Wilcoxon signed-rank tests were applied to compare the classification results from these five different combinations of features. Our results showed that 3D image features generate the best results compared with other combinations of features. This suggests one approach to potentially reducing the dimensionality of the CADx data space and the computational complexity of the system while maintaining diagnostic accuracy.
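
    The evaluation protocol described above amounts to leave-one-out cross-validation over different feature groups. A schematic sketch with scikit-learn follows; the classifier choice, feature slices, and random data are stand-ins, not the study's 192 features.

        # Leave-one-out comparison of feature groups (schematic, random stand-in data).
        import numpy as np
        from sklearn.model_selection import LeaveOneOut
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        X = rng.normal(size=(40, 12))      # stand-in for 2D/2.5D/3D features
        y = rng.integers(0, 2, 40)         # benign vs malignant labels

        feature_groups = {"2D": slice(0, 4), "3D": slice(4, 8), "2D+3D": slice(0, 8)}
        for name, cols in feature_groups.items():
            correct = 0
            for train, test in LeaveOneOut().split(X):
                clf = SVC().fit(X[train][:, cols], y[train])
                correct += clf.predict(X[test][:, cols])[0] == y[test][0]
            # on purely random data these accuracies hover near chance (~0.5)
            print(name, "accuracy:", correct / len(y))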

  3. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  4. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  5. Cooling Computers.

    ERIC Educational Resources Information Center

    Birken, Marvin N.

    1967-01-01

    Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economic, physical, and esthetic characteristics and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…

  6. Pygmalion's Computer.

    ERIC Educational Resources Information Center

    Peelle, Howard A.

    Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…

  7. Evaluation of Sinonasal Diseases by Computed Tomography

    PubMed Central

    Phatak, Suresh

    2016-01-01

    Introduction Computed Tomography (CT) plays an important diagnostic role in patients with sinonasal diseases and determines the treatment. CT images clearly show the fine structural architecture of the bony anatomy, thereby delineating anatomical variations, the extent of disease, and the characteristics of various inflammatory, benign and malignant sinonasal diseases. Aim To evaluate the sensitivity and specificity of CT in the diagnosis of sinonasal diseases and to characterise benign and malignant lesions with the help of various CT parameters; also, to correlate CT findings with histopathological and diagnostic nasal endoscopy/Functional Endoscopic Sinus Surgery (FESS) findings. Materials and Methods In this hospital-based prospective study, 175 patients with symptomatic sinonasal diseases were evaluated by clinical diagnosis and 16-slice Multi Detector Computed Tomography (MDCT). The findings of nasal endoscopy, FESS, histopathological examination and fungal culture were collected in all cases where those investigations were done. All findings were correlated with CT findings, and statistical analysis was performed using test statistics (sensitivity, specificity, Positive Predictive Value (PPV), Negative Predictive Value (NPV) and accuracy), the Chi-Square test and the Z-test for single proportions. The software used in the analysis was SPSS version 17.0 and GraphPad Prism version 6.0, and p < 0.05 was considered statistically significant. Results CT diagnosis had higher sensitivity, specificity, PPV and NPV in diagnosing various sinonasal diseases than clinical diagnosis. On correlating CT diagnosis with the final diagnosis, congenital conditions had 100% sensitivity and specificity. Chronic sinusitis had 98.3% sensitivity and 97.8% specificity. For fungal sinusitis the sensitivity was 60% and the specificity 99.3%. Polyps had a sensitivity of 94.4% and a specificity of 98.1%. Benign neoplasms have sensitivity
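
    The test statistics quoted throughout follow directly from a 2x2 confusion matrix; a minimal sketch (with invented counts) is given below.

        # Sensitivity, specificity, PPV, NPV and accuracy from a 2x2 confusion matrix.
        def diagnostic_stats(tp, fp, fn, tn):
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "PPV": tp / (tp + fp),
                "NPV": tn / (tn + fn),
                "accuracy": (tp + tn) / (tp + fp + fn + tn),
            }

        print(diagnostic_stats(tp=59, fp=2, fn=1, tn=113))  # hypothetical counts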

  8. A minimum data set approach to post-mortem computed tomography reporting for anthropological biological profiling.

    PubMed

    Brough, Alison L; Morgan, Bruno; Robinson, Claire; Black, Sue; Cunningham, Craig; Adams, Catherine; Rutty, Guy N

    2014-12-01

    Anthropological examination of bones is routinely undertaken in medico-legal investigations to establish an individual's biological profile, particularly their age. This often requires the removal of soft tissue from bone (de-fleshing), which, especially when dealing with the recently deceased, is a time-consuming and invasive procedure. Recent advances in multi-detector computed tomography have made it practical to rapidly acquire high-resolution morphological skeletal information from images of "fleshed" remains. The aim of this study was to develop a short standard form, created from post-mortem computed tomography images, that contains the minimum image set required to assess an individual anthropologically. The proposed standard forms were created for 31 juvenile forensic cases with known age-at-death, spanning the full age range of the developing human. Five observers independently used this form to estimate age-at-death. All observers estimated age in all cases, and all estimations were within the accepted ranges for traditional anthropological and odontological assessment. This study supports the implementation of this approach in forensic radiological practice.

  9. Computational dosimetry

    SciTech Connect

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry", which is interpreted as the sub-discipline of computational physics devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  10. Computational Toxicology

    EPA Science Inventory

    ‘Computational toxicology’ is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...

  11. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  12. Computer Starters!

    ERIC Educational Resources Information Center

    Instructor, 1983

    1983-01-01

    Instructor's Computer-Using Teachers Board members give practical tips on how to get a classroom ready for a new computer, introduce students to the machine, and help them learn about programing and computer literacy. Safety, scheduling, and supervision requirements are noted. (PP)

  13. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  14. Distributed Computing.

    ERIC Educational Resources Information Center

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  15. Portable Computer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    SPOC, a navigation monitoring computer used by NASA in a 1983 mission, was a modification of a commercial computer called GRiD Compass, produced by GRiD Systems Corporation. SPOC was chosen because of its small size, large storage capacity, and high processing speed. The principal modification required was a fan to cool the computer. SPOC automatically computes position, orbital paths, communication locations, etc. Some of the modifications were adapted for commercial applications. The computer is presently used in offices for conferences, for on-site development, and by the army as part of a field communications system.

  16. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  17. Post-mortem computed tomography and 3D imaging: anthropological applications for juvenile remains.

    PubMed

    Brough, Alison L; Rutty, Guy N; Black, Sue; Morgan, Bruno

    2012-09-01

    Anthropological examination of defleshed bones is routinely used in medico-legal investigations to establish an individual's biological profile. However, when dealing with the recently deceased, the removal of soft tissue from bone can be an extremely time-consuming procedure that requires the presence of a trained anthropologist. In addition, due to its invasive nature, in some disaster victim identification scenarios the maceration of bones is discouraged by religious practices and beliefs, or even prohibited by national laws and regulations. Currently, three different radiological techniques may be used in the investigative process: plain X-ray, dental X-ray and fluoroscopy. However, recent advances in multi-detector computed tomography (MDCT) mean that it is now possible to acquire morphological skeletal information from high-resolution images, reducing the necessity for invasive procedures. This review paper considers the possible applications of a virtual anthropological examination by reviewing the main juvenile age determination methods used by anthropologists at present and their possible adaptation to MDCT.

  18. Assessing vertebral fracture risk on volumetric quantitative computed tomography by geometric characterization of trabecular bone structure

    NASA Astrophysics Data System (ADS)

    Checefsky, Walter A.; Abidin, Anas Z.; Nagarajan, Mahesh B.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2016-03-01

    The current clinical standard for measuring Bone Mineral Density (BMD) is dual X-ray absorptiometry; however, BMD derived from volumetric quantitative computed tomography has more recently been shown to demonstrate a high association with spinal fracture susceptibility. In this study, we propose a method of fracture risk assessment using structural properties of trabecular bone in spinal vertebrae. Experimental data were acquired via axial multi-detector CT (MDCT) from 12 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. Common image processing methods were used to annotate the trabecular compartment in the vertebral slices, creating a circular region of interest (ROI) that excluded cortical bone for each slice. The pixels inside the ROI were converted to values indicative of BMD. High-dimensional geometrical features were derived using the scaling index method (SIM) at different radii and scaling factors (SF). The mean BMD values within the ROI were then extracted and used in conjunction with a support vector machine to predict the failure load of the specimens. Prediction performance was measured using the root-mean-square error (RMSE) metric; SIM combined with mean BMD features (RMSE = 0.82 +/- 0.37) outperformed MDCT-measured mean BMD alone (RMSE = 1.11 +/- 0.33) (p < 10^-4). These results demonstrate that biomechanical strength prediction in vertebrae can be significantly improved through the use of SIM-derived texture features from trabecular bone.
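
    The prediction step described above, a support vector machine mapping features to failure load scored by RMSE, can be sketched as follows; the regressor settings and random data are illustrative stand-ins, not the specimens' measurements.

        # Support vector regression of failure load from features, scored by RMSE.
        # Random stand-in data; a simple holdout replaces the study's validation.
        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(2)
        X = rng.normal(size=(12, 5))        # stand-in SIM + mean-BMD features
        y = 3 + X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.3, 12)   # failure load (kN)

        model = SVR(kernel="rbf").fit(X[:8], y[:8])
        pred = model.predict(X[8:])
        rmse = np.sqrt(np.mean((pred - y[8:]) ** 2))
        print(f"RMSE = {rmse:.2f} kN")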

  19. Scanning protocol optimization and dose evaluation in coronary stenosis using multi-slices computed tomography

    NASA Astrophysics Data System (ADS)

    Huang, Yung-hui; Chen, Chia-lin; Sheu, Chin-yin; Lee, Jason J. S.

    2007-02-01

    Cardiovascular diseases are the most common cause of premature death in developed countries. A major fraction is attributable to atherosclerotic coronary artery disease, which may result in sudden cardiac failure. A reduction in mortality caused by myocardial infarction may be achieved if coronary atherosclerosis can be detected and treated at an early stage, before symptoms occur. Therefore, there is a need for an effective tool that allows identification of patients at increased risk for future cardiac events. Current multi-detector CT has been widely used for detection and quantification of coronary calcifications as a sign of coronary atherosclerosis. The aim of this study is to optimize the diagnostic value and radiation exposure of coronary artery calcium-screening examinations using multi-slice CT (MSCT) with different image scan protocols. The radiation exposure for all protocols is evaluated using computed tomography dose index (CTDI) phantom measurements. We chose an optimal scanning protocol and evaluated patient radiation dose in MSCT coronary artery screening while preserving the expected diagnostic accuracy. These changes give MSCT more operational flexibility and provide more diagnostic value in current practice.
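
    The CTDI bookkeeping behind such phantom measurements is standard: the weighted CTDI combines centre and periphery chamber readings, the volume CTDI folds in pitch, and the dose-length product scales with scan length. A minimal sketch with invented values:

        # Standard CTDI relations: CTDIw = 1/3 centre + 2/3 periphery,
        # CTDIvol = CTDIw / pitch, DLP = CTDIvol * scan length. Values invented.
        def ctdi_vol(ctdi_center_mgy, ctdi_periphery_mgy, pitch):
            ctdi_w = ctdi_center_mgy / 3 + 2 * ctdi_periphery_mgy / 3
            return ctdi_w / pitch

        vol = ctdi_vol(10.0, 12.0, pitch=1.375)
        dlp = vol * 14.0                    # scan length in cm -> DLP in mGy*cm
        print(f"CTDIvol = {vol:.2f} mGy, DLP = {dlp:.1f} mGy*cm")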

  20. Developing Computation

    ERIC Educational Resources Information Center

    McIntosh, Alistair

    2004-01-01

    In this article, the author presents the results of a state project that focused on the effect of developing informal written computation processes through Years 2-4. The "developing computation" project was conducted in Tasmania over the two years 2002-2003 and involved nine schools: five government schools, two Catholic schools, and…

  1. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  2. Computer News

    ERIC Educational Resources Information Center

    Science Activities: Classroom Projects and Curriculum Ideas, 2007

    2007-01-01

    This article presents several news stories about computers and technology. (1) Applied Science Associates of Narragansett, Rhode Island is providing computer modeling technology to help locate the remains of the USS Bonhomme Richard, which sank in 1779 after claiming a Revolutionary War victory. (2) Whyville, the leading edu-tainment virtual world…

  3. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  4. Computational Pathology

    PubMed Central

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  5. Computer Insecurity.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1994-01-01

    College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)

  6. Computational astrophysics

    NASA Technical Reports Server (NTRS)

    Miller, Richard H.

    1987-01-01

    Astronomy is an area of applied physics in which unusually beautiful objects challenge the imagination to explain observed phenomena in terms of known laws of physics. It is a field that has stimulated the development of physical laws and of mathematical and computational methods. Current computational applications are discussed in terms of stellar and galactic evolution, galactic dynamics, and particle motions.

  7. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Discussed are three examples of computer graphics including biomorphs, Truchet tilings, and fractal popcorn. The graphics are shown and the basic algorithm using multiple iteration of a particular function or mathematical operation is described. An illustration of a snail shell created by computer graphics is presented. (YP)

  8. Computer Graphics.

    ERIC Educational Resources Information Center

    Halpern, Jeanne W.

    1970-01-01

    Computer graphics have been called the most exciting development in computer technology. At the University of Michigan, three kinds of graphics output equipment are now being used: symbolic printers, line plotters or drafting devices, and cathode-ray tubes (CRT). Six examples are given that demonstrate the range of graphics use at the University.…

  9. I, Computer

    ERIC Educational Resources Information Center

    Barack, Lauren

    2005-01-01

    What child hasn't chatted with friends through a computer? But chatting with a computer? Some Danish scientists have literally put a face on their latest software program, bringing to virtual life storyteller Hans Christian Andersen, who engages users in actual conversations. The digitized Andersen resides at the Hans Christian Andersen Museum in…

  10. Cloud Computing

    DTIC Science & Technology

    2009-11-12

    Eucalyptus Systems: provides an open-source application that can be used to implement a cloud computing environment on a datacenter; trying to establish an… Summary: Cloud Computing is in essence an economic model; it is a different way to acquire and manage IT resources. There are multiple cloud providers: …edgeplatform.html; Amazon Elastic Compute Cloud (EC2): http://aws.amazon.com/ec2/; Amazon Simple Storage Solution (S3): http://aws.amazon.com/s3/; Eucalyptus

  11. Optical computing.

    NASA Technical Reports Server (NTRS)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
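
    The image-deblurring idea mentioned above can be illustrated digitally with Wiener deconvolution of a signal blurred by a known kernel; the sketch below is a generic 1-D reconstruction, not the coherent-optical processing described in the article.

        # Generic 1-D Wiener deconvolution of a signal blurred by a known kernel.
        import numpy as np

        def wiener_deconvolve(blurred, kernel_padded, noise_power=1e-3):
            H = np.fft.fft(kernel_padded)
            G = np.conj(H) / (np.abs(H) ** 2 + noise_power)   # Wiener filter
            return np.real(np.fft.ifft(np.fft.fft(blurred) * G))

        n = 64
        signal = np.zeros(n); signal[20] = 1.0; signal[40] = 0.5
        kernel = np.zeros(n); kernel[:3] = [0.25, 0.5, 0.25]
        kernel = np.roll(kernel, -1)            # centre the blur kernel at index 0
        blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel)))
        restored = wiener_deconvolve(blurred, kernel)
        print(int(np.argmax(restored)))         # peak recovered at index 20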

  12. Computer Stimulation

    ERIC Educational Resources Information Center

    Moore, John W.; Moore, Elizabeth

    1977-01-01

    Discusses the computer simulation approach of Limits to Growth, in which interactions of five variables (population, pollution, resources, food per capita, and industrial output per capita) indicate the status of the world. Reviews other books that predict the future of the world. (CS)

  13. Computer Poker

    ERIC Educational Resources Information Center

    Findler, Nicholas V.

    1978-01-01

    This familiar card game has interested mathematicians, economists, and psychologists as a model of decision-making in the real world. It is now serving as a vehicle for investigations in computer science. (Author/MA)

  14. Evolutionary Computing

    SciTech Connect

    Patton, Robert M; Cui, Xiaohui; Jiao, Yu; Potok, Thomas E

    2008-01-01

    The rate at which information overwhelms humans is significantly greater than the rate at which humans have learned to process, analyze, and leverage this information. To overcome this challenge, new methods of computing must be formulated, and scientists and engineers have looked to nature for inspiration in developing these new methods. Consequently, evolutionary computing has emerged as a new paradigm for computing, and has rapidly demonstrated its ability to solve real-world problems where traditional techniques have failed. This field of work has now become quite broad and encompasses areas ranging from artificial life to neural networks. This chapter focuses specifically on two sub-areas of nature-inspired computing: Evolutionary Algorithms and Swarm Intelligence.

  15. Computer Calculus.

    ERIC Educational Resources Information Center

    Steen, Lynn Arthur

    1981-01-01

    The development of symbolic computer algebra designed to manipulate abstract mathematical expressions is discussed. The ability of this software to mimic the standard patterns of human problem solving represents a major advance toward "true" artificial intelligence. (MP)

  16. Personal Computers.

    ERIC Educational Resources Information Center

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  17. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  18. Quantum Computing

    DTIC Science & Technology

    1998-04-01

    …information representation and processing technology, although faster than the wheels and gears of the Charles Babbage computation machine, is still in…the same computational complexity class as the Babbage machine, with bits of information represented by entities which obey classical (non-quantum…nuclear double resonances. Charles M. Bowden and Jonathan P. Dowling, Weapons Sciences Directorate, AMSMI-RD-WS-ST, Missile Research, Development, and…

  19. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  20. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  1. Quantum computers.

    PubMed

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  2. Qubus computation

    NASA Astrophysics Data System (ADS)

    Munro, W. J.; Nemoto, Kae; Spiller, T. P.; van Loock, P.; Braunstein, Samuel L.; Milburn, G. J.

    2006-08-01

    Processing information quantum mechanically is known to enable new communication and computational scenarios that cannot be accessed with conventional information technology (IT). We present here a new approach to scalable quantum computing---a "qubus computer"---which realizes qubit measurement and quantum gates through interacting qubits with a quantum communication bus mode. The qubits could be "static" matter qubits or "flying" optical qubits, but the scheme we focus on here is particularly suited to matter qubits. Universal two-qubit quantum gates may be effected by schemes which involve measurement of the bus mode, or by schemes where the bus disentangles automatically and no measurement is needed. This approach enables a parity gate between qubits, mediated by a bus, enabling near-deterministic Bell state measurement and entangling gates. Our approach is therefore the basis for very efficient, scalable QIP, and provides a natural method for distributing such processing, combining it with quantum communication.
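
    The parity measurement at the heart of these bus-mediated gates can be illustrated with plain linear algebra: projecting two qubits onto the even-parity subspace spanned by |00> and |11>. The sketch below models only the qubit state, not the bus itself.

        # Two-qubit parity measurement as a projection (generic illustration):
        # an 'even' outcome on the product state |+>|+> yields a Bell state.
        import numpy as np

        P_even = np.diag([1.0, 0.0, 0.0, 1.0])   # |00><00| + |11><11|
        P_odd = np.eye(4) - P_even

        psi = np.array([1, 1, 1, 1], dtype=float) / 2   # |+>|+> product state
        p_even = psi @ P_even @ psi                     # probability of even outcome
        post = (P_even @ psi) / np.sqrt(p_even)         # state after 'even' result
        print(p_even)    # 0.5
        print(post)      # (|00> + |11>)/sqrt(2): a Bell state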

  3. Computational Psychiatry

    PubMed Central

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  4. Computational mechanics

    SciTech Connect

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable in driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) on ''Springback Predictability'' and with the Federal Aviation Administration (FAA) on the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  5. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs,and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  6. LHC Computing

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  7. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  8. F18-fluorodeoxyglucose-positron emission tomography and computed tomography is not accurate in preoperative staging of gastric cancer

    PubMed Central

    Ha, Tae Kyung; Choi, Yun Young; Song, Soon Young

    2011-01-01

    Purpose To investigate the clinical benefits of F18-fluorodeoxyglucose-positron emission tomography and computed tomography (18F-FDG-PET/CT) over multi-detector row CT (MDCT) in preoperative staging of gastric cancer. Methods FDG-PET/CT and MDCT were performed on 78 patients with gastric cancer pathologically diagnosed by endoscopy. The accuracy of radiologic staging was retrospectively compared with the pathologic result after curative resection. Results Primary tumors were detected in 51 (65.4%) patients with 18F-FDG-PET/CT, and in 47 (60.3%) patients with MDCT. Regarding detection of lymph node metastasis, the sensitivity of FDG-PET/CT was 51.5% with an accuracy of 71.8%, whereas those of MDCT were 69.7% and 69.2%, respectively. The sensitivity of 18F-FDG-PET/CT for a primary tumor with signet ring cell carcinoma was lower than that for a primary tumor with non-signet ring cell carcinoma (35.3% vs. 73.8%, P < 0.01). Conclusion Due to its low sensitivity, 18F-FDG-PET/CT alone shows no definite clinical benefit for prediction of lymph node metastasis in preoperative staging of gastric cancer. PMID:22066108

  9. Computational Hearing

    DTIC Science & Technology

    1998-11-01

    …ranging from the anatomy and physiology of the auditory pathway to the perception of speech and music under both ideal and not-so-ideal (but more…physiology of various parts of the auditory pathway, to auditory prostheses, speech and audio coding, computational models of pitch and timbre, the role of…

  10. Library Computing.

    ERIC Educational Resources Information Center

    Dayall, Susan A.; And Others

    1987-01-01

    Six articles on computers in libraries discuss training librarians and staff to use new software; appropriate technology; system upgrades of the Research Libraries Group's information system; pre-IBM PC microcomputers; multiuser systems for small to medium-sized libraries; and a library user's view of the traditional card catalog. (EM)

  11. Computational trigonometry

    SciTech Connect

    Gustafson, K.

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to gradient descent, conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
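
    For readers unfamiliar with the antieigenvalue machinery: for a symmetric positive definite matrix, Gustafson's operator angle satisfies sin φ(A) = (λmax − λmin)/(λmax + λmin), and the classical Kantorovich bound says steepest descent contracts the A-norm error by at most sin φ(A) per step. The sketch below is a minimal NumPy check of that bound on a random test system, not code from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = M @ M.T + 5 * np.eye(5)              # symmetric positive definite test matrix
    b = rng.standard_normal(5)

    lam = np.linalg.eigvalsh(A)
    sin_phi = (lam[-1] - lam[0]) / (lam[-1] + lam[0])   # sin of the operator angle

    x_star = np.linalg.solve(A, b)
    a_norm = lambda e: np.sqrt(e @ A @ e)

    x, ratios = np.zeros(5), []
    for _ in range(20):                      # steepest descent on 0.5*x'Ax - b'x
        r = b - A @ x
        alpha = (r @ r) / (r @ A @ r)        # exact line-search step
        before = a_norm(x - x_star)
        x = x + alpha * r
        ratios.append(a_norm(x - x_star) / before)

    print(max(ratios) <= sin_phi + 1e-12)    # Kantorovich bound holds: True
    ```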

  12. Computational Estimation

    ERIC Educational Resources Information Center

    Fung, Maria G.; Latulippe, Christine L.

    2010-01-01

    Elementary school teachers are responsible for constructing the foundation of number sense in youngsters, and so it is recommended that teacher-training programs include an emphasis on number sense to ensure the development of dynamic, productive computation and estimation skills in students. To better prepare preservice elementary school teachers…

  13. Business Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    A brief definition of some fundamentals of microcomputers and of the ways they may be used in small businesses can help potential buyers make informed purchases. The hardware (the mechanical devices from which computers are made) described here includes the video display, keyboard, central processing unit, "random access" and "read only" memories, cassette…

  14. Computer Guerrillas.

    ERIC Educational Resources Information Center

    Immel, A. Richard

    1983-01-01

    Describes several cases in which microcomputers were used to prevent large organizations (e.g., utility companies, U.S. Government Forestry Commission) from carrying out actions considered not to be in the public's best interests. The use of the computers by social activists in their efforts to halt environmental destruction is discussed. (EAO)

  15. Computer Corner.

    ERIC Educational Resources Information Center

    Mason, Margie

    1985-01-01

    This article: describes how to prevent pins on game paddles from breaking; suggests using needlepoint books for ideas to design computer graphics; lists a BASIC program to create a Christmas tree, with extension activities; suggests a LOGO Christmas activity; and describes a book on the development of microcomputers. (JN)

  16. Computational Physics.

    ERIC Educational Resources Information Center

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the…

  17. Computational Musicology.

    ERIC Educational Resources Information Center

    Bel, Bernard; Vecchione, Bernard

    1993-01-01

    Asserts that a revolution has been occurring in musicology since the 1970s. Contends that music has changed from being only a source of emotion to appearing more open to science and techniques based on computer technology. Describes recent research and other writings about the topic and provides an extensive bibliography. (CFR)

  18. Computer Corner.

    ERIC Educational Resources Information Center

    Smith, David A.; And Others

    1986-01-01

    APL was invented specifically as a mathematical teaching tool, and is an excellent vehicle for teaching mathematical concepts using computers. This article illustrates the use of APL in teaching many different topics in mathematics, including logic, set theory, functions, statistics, linear algebra, and matrices. (MNS)

  19. Networking computers.

    PubMed

    McBride, D C

    1997-03-01

    This decade the role of the personal computer has shifted dramatically from a desktop device designed to increase individual productivity and efficiency to an instrument of communication linking people and machines in different places with one another. A computer in one city can communicate with another that may be thousands of miles away. Networking is how this is accomplished. Just like the voice network used by the telephone, computer networks transmit data and other information via modems over these same telephone lines. A network can be created over both short and long distances. Networks can be established within a hospital or medical building or over many hospitals or buildings covering many geographic areas. Those confined to one location are called LANs, local area networks. Those that link computers in one building to those at other locations are known as WANs, or wide area networks. The ultimate wide area network is the one we've all been hearing so much about these days--the Internet, and its World Wide Web. Setting up a network is a process that requires careful planning and commitment. To avoid potential pitfalls and to make certain the network you establish meets your needs today and several years down the road, several steps need to be followed. This article reviews the initial steps involved in getting ready to network.
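
    As a concrete (if anachronistic for a 1997 article) illustration of the point that a network simply moves bytes between programs on different machines, the sketch below runs a toy client and server over the local loopback interface; the message and the "ACK" protocol are invented for the example.

    ```python
    import socket
    import threading

    # create the listening socket up front so the client cannot race the server
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    def serve_one():
        conn, _ = server.accept()        # wait for a single connection
        with conn:
            conn.sendall(b"ACK: " + conn.recv(1024))

    t = threading.Thread(target=serve_one)
    t.start()

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
        client.connect(("127.0.0.1", port))
        client.sendall(b"lab result for patient 42")   # invented payload
        print(client.recv(1024).decode())              # ACK: lab result for patient 42
    t.join()
    server.close()
    ```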

  20. Computational Sociolinguistics.

    ERIC Educational Resources Information Center

    Sedelow, Walter A., Jr.

    The use of the computer may be one of the ways in which varied linguistic interests (sociolinguistics, psycholinguistics) come to be rendered interrelated and even intellectually coherent. (The criterion of coherence is set here at monism as to models.) One of the author's major interests is a systematic approach to scientific creativity,…

  1. Computational Mathematics

    DTIC Science & Technology

    2012-03-06

    (Marsha Berger, NYU) Inclusion of the Adaptation/Adjoint module and Embedded Boundary Methods in the software package Cart3D; transition to NASA... ONR, DOE, AFRL, DIA. Cart3D used for computing formation flight to reduce drag and improve energy efficiency; application to Explosively Formed...

  2. Radiological Protection in Cone Beam Computed Tomography (CBCT). ICRP Publication 129.

    PubMed

    Rehani, M M; Gupta, R; Bartling, S; Sharp, G C; Pauwels, R; Berris, T; Boone, J M

    2015-07-01

    The objective of this publication is to provide guidance on radiological protection in the new technology of cone beam computed tomography (CBCT). Publications 87 and 102 dealt with patient dose management in computed tomography (CT) and multi-detector CT. The new applications of CBCT and the associated radiological protection issues are substantially different from those of conventional CT. The perception that CBCT involves lower doses was only true in initial applications. CBCT is now used widely by specialists who have little or no training in radiological protection. This publication provides recommendations on radiation dose management directed at different stakeholders, and covers principles of radiological protection, training, and quality assurance aspects. Advice on appropriate use of CBCT needs to be made widely available. Advice on optimisation of protection when using CBCT equipment needs to be strengthened, particularly with respect to the use of newer features of the equipment. Manufacturers should standardise radiation dose displays on CBCT equipment to assist users in optimisation of protection and comparisons of performance. Additional challenges to radiological protection are introduced when CBCT-capable equipment is used for both fluoroscopy and tomography during the same procedure. Standardised methods need to be established for tracking and reporting of patient radiation doses from these procedures. The recommendations provided in this publication may evolve in the future as CBCT equipment and applications evolve. As with previous ICRP publications, the Commission hopes that imaging professionals, medical physicists, and manufacturers will use the guidelines and recommendations provided in this publication for implementation of the Commission's principle of optimisation of protection of patients and medical workers, with the objective of keeping exposures as low as reasonably achievable, taking into account economic and societal factors.

  3. RATIO COMPUTER

    DOEpatents

    Post, R.F.

    1958-11-11

    An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals and each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of input signals depending upon the relation of input to fixed signals in the first-mentioned channel.
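
    A rough discrete-time reading of the feedback scheme (our interpretation for illustration, not circuitry from the patent): the comparison channel keeps adjusting the shared gain g until g·V1 matches the constant reference Vc, at which point the second channel's output g·V2 equals Vc·V2/V1, i.e., a quotient. The values Vc, V1, V2, and the loop gain k below are invented.

    ```python
    # toy model of the two-channel feedback loop
    Vc = 1.0            # constant comparison signal
    V1, V2 = 2.0, 3.0   # the two input signals
    g, k = 0.0, 0.05    # shared variable gain and a small loop gain

    for _ in range(500):
        error = Vc - g * V1      # difference signal from the comparison channel
        g += k * error           # negative feedback drives the error to zero

    print(g * V2)                # -> 1.5 == Vc * V2 / V1: output is the quotient
    ```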

  4. Computational Combustion

    SciTech Connect

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel and homogeneous charge, compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  5. Computational Physics

    NASA Astrophysics Data System (ADS)

    Thijssen, Jos

    2013-10-01

    1. Introduction; 2. Quantum scattering with a spherically symmetric potential; 3. The variational method for the Schrödinger equation; 4. The Hartree-Fock method; 5. Density functional theory; 6. Solving the Schrödinger equation in periodic solids; 7. Classical equilibrium statistical mechanics; 8. Molecular dynamics simulations; 9. Quantum molecular dynamics; 10. The Monte Carlo method; 11. Transfer matrix and diagonalisation of spin chains; 12. Quantum Monte Carlo methods; 13. The finite element method for partial differential equations; 14. The lattice Boltzmann method for fluid dynamics; 15. Computational methods for lattice field theories; 16. High performance computing and parallelism; Appendix A. Numerical methods; Appendix B. Random number generators; References; Index.

  6. Singularity computations

    NASA Technical Reports Server (NTRS)

    Swedlow, J. L.

    1976-01-01

    An approach is described for singularity computations based on a numerical method for elastoplastic flow to delineate radial and angular distribution of field quantities and measure the intensity of the singularity. The method is applicable to problems in solid mechanics and lends itself to certain types of heat flow and fluid motion studies. Its use is not limited to linear, elastic, small strain, or two-dimensional situations.

  7. Spatial Computation

    DTIC Science & Technology

    2003-12-01

    particular program, synthesized under compiler control from the application source code. The translation is illustrated in Figure 1.4. From now on, when we use... very efficient method of exploring the design of complex application-specific system-on-a-chip devices using only the application source code. New... computation gates. This frees, but also complicates, the compilation process. In order to handle the great semantic gap between the source code and the...

  8. Computational enzymology.

    PubMed

    Lonsdale, Richard; Ranaghan, Kara E; Mulholland, Adrian J

    2010-04-14

    Molecular simulations and modelling are changing the science of enzymology. Calculations can provide detailed, atomic-level insight into the fundamental mechanisms of biological catalysts. Computational enzymology is a rapidly developing area, and is testing theories of catalysis, challenging 'textbook' mechanisms, and identifying novel catalytic mechanisms. Increasingly, modelling is contributing directly to experimental studies of enzyme-catalysed reactions. Potential practical applications include interpretation of experimental data, catalyst design and drug development.

  9. Computational Electromagnetics

    DTIC Science & Technology

    2011-02-20

    a collaboration between Caltech’s postdoctoral associate N. Albin and OB) have shown that, for a variety of reasons, the first-order... KZK approximation", Nathan Albin, Oscar P. Bruno, Theresa Y. Cheung and Robin O. Cleveland, preprint (2011); "A Spectral FC Solver for the Compressible Navier-Stokes Equations in General Domains I: Explicit time-stepping", Nathan Albin and Oscar P. Bruno, to appear in Journal of Computational Physics.

  10. Quantum Computers

    DTIC Science & Technology

    2010-03-04

    Quantum computers. T. D. Ladd, F. Jelezko, R. Laflamme, Y. Nakamura, C. Monroe & J. L. O’Brien. Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing...

  11. Computer grants

    NASA Astrophysics Data System (ADS)

    The Computer and Information Science and Engineering Directorate of the National Science Foundation will offer educational supplements to CISE grants in Fiscal Year 1990. The purpose of the supplements is to establish closer links between CISE-supported research and undergraduate education and to accelerate transfer into the classroom of research results from work done under existing research grants. Any principal investigator with an active NSF research award from a program in the CISE Directorate can apply for an educational supplement. Proposals should be for creative activities to improve education, not for research.

  12. Computer vision

    NASA Technical Reports Server (NTRS)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.
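
    The cross-correlation recognition method the survey mentions can be stated compactly: slide a template over the image and report the location where the normalized correlation coefficient peaks. Below is a minimal brute-force NumPy sketch with an invented toy image and template, not code from the survey.

    ```python
    import numpy as np

    def ncc(patch, template):
        """Normalized cross-correlation of two equal-size arrays (zero if flat)."""
        p = patch - patch.mean()
        t = template - template.mean()
        denom = np.sqrt((p * p).sum() * (t * t).sum())
        return float((p * t).sum() / denom) if denom > 0 else 0.0

    def best_match(image, template):
        """Brute-force search for the top-left corner maximizing the NCC."""
        h, w = template.shape
        scores = np.array([[ncc(image[i:i + h, j:j + w], template)
                            for j in range(image.shape[1] - w + 1)]
                           for i in range(image.shape[0] - h + 1)])
        return np.unravel_index(scores.argmax(), scores.shape)

    image = np.zeros((20, 20))
    target = np.arange(16, dtype=float).reshape(4, 4)   # a patterned toy object
    image[5:9, 7:11] = target
    print(best_match(image, target))                    # (5, 7)
    ```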

  13. Computational crystallization.

    PubMed

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  14. Computational introspection

    SciTech Connect

    Batali, J.

    1983-02-01

    Introspection is the process of thinking about one's own thoughts and feelings. In this paper, the author discusses recent attempts to make computational systems that exhibit introspective behavior. Each attempt presents a system capable of manipulating representations of its own program and current context. The author argues that introspective ability is crucial for intelligent systems--without it an agent cannot represent certain problems that it must be able to solve. A theory of intelligent action would describe how and why certain actions intelligently achieve an agent's goals. The agent would both embody and represent this theory: it would be implemented as the program for the agent; and the importance of introspection suggests that the agent represent its theory of action to itself.

  15. Computer vision

    SciTech Connect

    Not Available

    1982-01-01

    This paper discusses material from areas such as artificial intelligence, psychology, computer graphics, and image processing. The intent is to assemble a selection of this material in a form that will serve both as a senior/graduate-level academic text and as a useful reference to those building vision systems. This book has a strong artificial intelligence flavour, emphasising the belief that both the intrinsic image information and the internal model of the world are important in successful vision systems. The book is organised into four parts, based on descriptions of objects at four different levels of abstraction. These are: generalised images (images and image-like entities); segmented images (images organised into subimages that are likely to correspond to interesting objects); geometric structures (quantitative models of image and world structures); and relational structures (complex symbolic descriptions of image and world structures). The book contains author and subject indexes.

  16. Computational micromechanics

    NASA Astrophysics Data System (ADS)

    Ortiz, M.

    1996-09-01

    Selected issues in computational micromechanics are reviewed, with particular emphasis on multiple-scale problems and micromechanical models of material behavior. Examples considered include: the bridging of atomistic and continuum scales, with application to nanoindentation and the brittle-to-ductile transition; the development of dislocation-based constitutive relations for pure metallic crystals and intermetallic compounds, with applications to fracture of single crystals and bicrystals; the simulation of non-planar three-dimensional crack growth at the microscale, with application to mixed mode I/III effective behavior and crack trapping and bridging in fiber-reinforced composites; and the direct micromechanical simulation of fragmentation of brittle solids and subsequent flow of the comminuted phase.

  17. Computed tomography, endoscopic, laparoscopic, and intra-operative sonography for assessing resectability of pancreatic cancer.

    PubMed

    Long, Eliza E; Van Dam, Jacques; Weinstein, Stefanie; Jeffrey, Brooke; Desser, Terry; Norton, Jeffrey A

    2005-08-01

    Pancreas cancer is the fourth leading cancer killer in adults. Cure of pancreas cancer is dependent on the complete surgical removal of localized tumor. A complete surgical resection is dependent on accurate preoperative and intra-operative imaging of tumor and its relationship to vital structures. Imaging of pancreatic tumors preoperatively and intra-operatively is achieved by pancreatic protocol computed tomography (CT), endoscopic ultrasound (EUS), laparoscopic ultrasound (LUS), and intra-operative ultrasound (IOUS). Multi-detector CT with three-dimensional (3-D) reconstruction of images is the most useful preoperative modality to assess resectability. It has a sensitivity and specificity of 90 and 99%, respectively. It is not observer dependent. The images predict operative findings. EUS and LUS have sensitivities of 77 and 78%, respectively. They both have a very high specificity. Further, EUS has the ability to biopsy tumor and obtain a definitive tissue diagnosis. IOUS is a very sensitive (93%) method to assess tumor resectability during surgery. It adds little time and no morbidity to the operation. It greatly facilitates the intra-operative decision-making. In reality, each of these methods adds some information to help in determining the extent of tumor and the surgeon's ability to remove it. We rely on pancreatic protocol CT with 3-D reconstruction and either EUS or IOUS depending on the tumor location and operability of the tumor and patient. With these modern imaging modalities, it is now possible to avoid major operations that only determine an inoperable tumor. With proper preoperative selection, surgery is able to remove tumor in the majority of patients.

  18. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    NASA Astrophysics Data System (ADS)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

    While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with GLCM feature 'correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10⁻⁴). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
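
    As a hint of what the feature extraction looks like in practice, here is a short scikit-image sketch on a synthetic ROI; the array values, GLCM distance, and angle set are placeholders, not the study's actual protocol (the paper computes GLCMs on BMD-converted pixels inside the annotated trabecular ROI).

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops  # spelled 'greycomatrix' before scikit-image 0.19

    rng = np.random.default_rng(1)
    roi = rng.integers(0, 64, size=(64, 64), dtype=np.uint8)  # synthetic stand-in for a BMD-scaled ROI

    # symmetric, normalized co-occurrence matrices at distance 1, four directions
    glcm = graycomatrix(roi, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=64, symmetric=True, normed=True)

    correlation = graycoprops(glcm, "correlation").mean()  # average over the four angles
    print(f"GLCM correlation: {correlation:.3f}")
    ```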

  19. Ultrasonography in the diagnosis of nasal bone fractures: a comparison with conventional radiography and computed tomography.

    PubMed

    Lee, In Sook; Lee, Jung-Hoon; Woo, Chang-Ki; Kim, Hak Jin; Sol, Yu Li; Song, Jong Woon; Cho, Kyu-Sup

    2016-02-01

    The purpose of this study was to evaluate and compare the diagnostic efficacy of ultrasonography (US) with radiography and multi-detector computed tomography (CT) for the detection of nasal bone fractures. Forty-one patients with a nasal bone fracture who underwent prospective US examinations were included. Plain radiographs and CT images were obtained on the day of trauma. For the US examinations, the radiologist used a linear array transducer (L17-5 MHz) in 24 patients and a hockey-stick probe (L15-7 MHz) in 17. The bony component of the nose was divided into three parts (right and left lateral nasal walls, and midline of the nasal bone). Fracture detection by the three modalities was subjected to analysis. Furthermore, the findings made by each modality were compared with intraoperative findings. Nasal bone fractures were located in the right lateral wall (n = 28), midline of the nasal bone (n = 31), or left lateral wall (n = 31). For the right and left lateral nasal walls, CT had greater sensitivity and specificity than US or radiography, and agreed better with intraoperative findings. However, for midline fractures of the nasal bone, US had higher specificity, positive predictive value, and negative predictive value than CT. Although the two US evaluations showed good agreement at all three sites, US findings obtained with the hockey-stick probe showed closer agreement with intraoperative findings for both the lateral nasal walls and the midline of the nasal bone. Although CT showed higher sensitivity and specificity than US or radiography, US was found to be helpful for evaluating the midline of the nasal bone. Furthermore, for US examinations of the nasal bone, a smaller probe and higher frequency may be required.

  20. Numerical Analysis of Organ Doses Delivered During Computed Tomography Examinations Using Japanese Adult Phantoms with the WAZA-ARI Dosimetry System.

    PubMed

    Takahashi, Fumiaki; Sato, Kaoru; Endo, Akira; Ono, Koji; Ban, Nobuhiko; Hasegawa, Takayuki; Katsunuma, Yasushi; Yoshitake, Takayasu; Kai, Michiaki

    2015-08-01

    A dosimetry system for computed tomography (CT) examinations, named WAZA-ARI, is being developed to accurately assess radiation doses to patients in Japan. For dose calculations in WAZA-ARI, organ doses were numerically analyzed using average adult Japanese male (JM) and female (JF) phantoms with the Particle and Heavy Ion Transport code System (PHITS). Experimental studies clarified the photon energy distribution of emitted photons and dose profiles on the table for some multi-detector row CT (MDCT) devices. Numerical analyses using a source model in PHITS could specifically take into account emissions of x rays from the tube to the table with attenuation of photons through a beam-shaping filter for each MDCT device based on the experiment results. The source model was validated by measuring the CT dose index (CTDI). Numerical analyses with PHITS revealed a concordance of organ doses with body sizes of the JM and JF phantoms. The organ doses in the JM phantoms were compared with data obtained using previously developed systems. In addition, the dose calculations in WAZA-ARI were verified with previously reported results by realistic NUBAS phantoms and radiation dose measurement using a physical Japanese model (THRA1 phantom). The results imply that numerical analyses using the Japanese phantoms and specified source models can give reasonable estimates of dose for MDCT devices for typical Japanese adults.

  1. Changes in entrance surface dose in relation to the location of shielding material in chest computed tomography

    NASA Astrophysics Data System (ADS)

    Kang, Y. M.; Cho, J. H.; Kim, S. C.

    2015-07-01

    This study examined the entrance surface dose (ESD) to the abdomen and pelvis of the patient during a chest computed tomography (CT) procedure, and evaluated the ESD reduction achieved depending on the location of the radiation shield. A 64-slice multi-detector computed tomography scanner was used, together with the Alderson radiation therapy phantom and optically stimulated luminescence dosimeters (OSLD), which enable measurement from low to high dose. For measurement of radiation dose, slices 9 to 21 of the phantom were set as the test range, which covered the lung apex through both costophrenic angles. A total of 10 OSLD nanoDots were attached for measurement of the front and rear ESD. Cyclic tests were performed using the low-dose chest CT and high-resolution CT (HRCT) protocols on the following set-ups: without shielding; shielding only on the front side; shielding only on the rear side; and shielding on both front and rear sides. According to the test results, ESD on both front and rear sides was higher in HRCT than in low-dose CT when radiation shielding was not used. It was also determined that, compared to the set-up without a radiation shield, locating the shield on the front side was effective in reducing the front ESD, while locating it on the rear side reduced the rear ESD. Shielding both the front and rear sides reduced ESD on both. In conclusion, it was confirmed that shielding the front and rear sides was the most effective method to reduce the ESD caused by scattered radiation during scanning.

  2. Specialized computer architectures for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, in terms of both dollar expenditure and elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  3. A multi-detector neutron spectrometer with nearly isotropic response for environmental and workplace monitoring

    NASA Astrophysics Data System (ADS)

    Gómez-Ros, J. M.; Bedogni, R.; Moraleda, M.; Delgado, A.; Romero, A.; Esposito, A.

    2010-01-01

    This communication describes an improved design for a neutron spectrometer consisting of ⁶Li thermoluminescent dosemeters located at selected positions within a single moderating polyethylene sphere. The spatial arrangement of the dosemeters has been designed using the MCNPX Monte Carlo code to calculate the response matrix for 56 log-equidistant energies from 10⁻⁹ to 100 MeV, looking for a configuration that yields a nearly isotropic response for neutrons in the energy range from thermal to 20 MeV. The feasibility of the proposed spectrometer and the isotropy of its response have been evaluated by simulating exposures to different reference and workplace neutron fields. The FRUIT code has been used for unfolding purposes. The results of the simulations as well as the experimental tests confirm the suitability of the prototype for environmental and workplace monitoring applications.
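
    To make the unfolding step concrete: given the Monte Carlo response matrix R (one row per dosemeter position, one column per energy bin) and the measured readings m = RΦ, the spectrum Φ is recovered by a constrained inversion. The sketch below uses plain non-negative least squares on an invented toy matrix; dedicated codes such as FRUIT add physical prior spectra, because the real problem (a few dosemeters, 56 energy bins) is strongly underdetermined.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(2)
    n_dosemeters, n_bins = 6, 12                              # toy sizes; the paper uses 56 bins
    R = rng.uniform(0.05, 1.0, size=(n_dosemeters, n_bins))   # invented response matrix

    phi_true = np.zeros(n_bins)
    phi_true[3], phi_true[8] = 1.0, 0.5                       # a two-peak test spectrum
    readings = R @ phi_true                                   # simulated dosemeter readings

    phi_est, residual = nnls(R, readings)                     # non-negative least-squares unfolding
    print(np.round(phi_est, 3), residual)
    ```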

  4. A multi-detector, digitizer based neutron depth profiling device for characterizing thin film materials

    SciTech Connect

    Mulligan, P. L.; Cao, L. R.; Turkoglu, D.

    2012-07-15

    Neutron depth profiling (NDP) is a mature, nondestructive technique used to characterize the concentration of certain light isotopes in a material as a function of depth by measuring the residual energy of charged particles in neutron induced reactions. Historically, NDP has been performed using a single detector, resulting in low intrinsic detection efficiency, and limiting the technique largely to high flux research reactors. In this work, we describe a new NDP instrument design with higher detection efficiency by way of spectrum summing across multiple detectors. Such a design is capable of acquiring a statistically significant charged particle spectrum at facilities limited in neutron flux and operation time.

  5. A multi-detector, digitizer based neutron depth profiling device for characterizing thin film materials

    NASA Astrophysics Data System (ADS)

    Mulligan, P. L.; Cao, L. R.; Turkoglu, D.

    2012-07-01

    Neutron depth profiling (NDP) is a mature, nondestructive technique used to characterize the concentration of certain light isotopes in a material as a function of depth by measuring the residual energy of charged particles in neutron induced reactions. Historically, NDP has been performed using a single detector, resulting in low intrinsic detection efficiency, and limiting the technique largely to high flux research reactors. In this work, we describe a new NDP instrument design with higher detection efficiency by way of spectrum summing across multiple detectors. Such a design is capable of acquiring a statistically significant charged particle spectrum at facilities limited in neutron flux and operation time.

  6. A multi-detector, digitizer based neutron depth profiling device for characterizing thin film materials.

    PubMed

    Mulligan, P L; Cao, L R; Turkoglu, D

    2012-07-01

    Neutron depth profiling (NDP) is a mature, nondestructive technique used to characterize the concentration of certain light isotopes in a material as a function of depth by measuring the residual energy of charged particles in neutron induced reactions. Historically, NDP has been performed using a single detector, resulting in low intrinsic detection efficiency, and limiting the technique largely to high flux research reactors. In this work, we describe a new NDP instrument design with higher detection efficiency by way of spectrum summing across multiple detectors. Such a design is capable of acquiring a statistically significant charged particle spectrum at facilities limited in neutron flux and operation time.
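
    The efficiency argument behind the multi-detector design is just counting statistics: summing the independently acquired spectra multiplies the counts per channel by the number of detectors, shrinking the relative Poisson uncertainty by roughly the square root of that factor. A toy numerical check (invented spectra, not instrument data) follows.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_detectors, n_channels = 8, 256
    rate = 50 * np.exp(-np.linspace(0, 4, n_channels))          # shared spectral shape
    spectra = rng.poisson(rate, size=(n_detectors, n_channels)) # one spectrum per detector

    summed = spectra.sum(axis=0)                                # digitizer-style spectrum summing
    single_err = 1 / np.sqrt(spectra[0].clip(min=1))            # relative error ~ 1/sqrt(counts)
    summed_err = 1 / np.sqrt(summed.clip(min=1))
    print((single_err / summed_err).mean())                     # roughly sqrt(8) ≈ 2.8
    ```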

  7. Real-time operating system for a multi-laser/multi-detector system

    NASA Technical Reports Server (NTRS)

    Coles, G.

    1980-01-01

    The laser-one hazard detector system, used on the Rensselaer Mars rover, is reviewed briefly with respect to the hardware subsystems, the operation, and the results obtained. A multidetector scanning system was designed to improve on the original system. Interactive support software was designed and programmed to implement real time control of the rover or platform with the elevation scanning mast. The formats of both the raw data and the post-run data files were selected. In addition, the interface requirements were selected and some initial hardware-software testing was completed.

  8. Sub-10-Minute Characterization of an Ultrahigh Molar Mass Polymer by Multi-detector Hydrodynamic Chromatography

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Molar mass averages, distributions, and architectural information of polymers are routinely obtained using size-exclusion chromatography (SEC). It has previously been shown that ultrahigh molar mass polymers may experience degradation during SEC analysis, leading to inaccurate molar mass averages a...

  9. Program Facilitates Distributed Computing

    NASA Technical Reports Server (NTRS)

    Hui, Joseph

    1993-01-01

    KNET computer program facilitates distribution of computing between UNIX-compatible local host computer and remote host computer, which may or may not be UNIX-compatible. Capable of automatic remote log-in. User communicates interactively with remote host computer. Data output from remote host computer directed to local screen, to local file, and/or to local process. Conversely, data input from keyboard, local file, or local process directed to remote host computer. Written in ANSI standard C language.

  10. CAA: Computer Assisted Athletics.

    ERIC Educational Resources Information Center

    Hall, John H.

    Computers have been used in a variety of applications for athletics since the late 1950's. These have ranged from computer-controlled electric scoreboards to computer-designed pole vaulting poles. Described in this paper are a computer-based athletic injury reporting system and a computer-assisted football scouting system. The injury reporting…

  11. The assumptions of computing

    SciTech Connect

    Huggins, J.K.

    1994-12-31

    The use of computers, like any technological activity, is not content-neutral. Users of computers constantly interact with assumptions regarding worthwhile activity which are embedded in any computing system. Directly questioning these assumptions in the context of computing allows us to develop an understanding of responsible computing.

  12. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  13. The Old Computers' Home.

    ERIC Educational Resources Information Center

    Angier, Natalie

    1983-01-01

    The Computer Museum in Marlborough, Massachusetts houses old and not-so-old calculators, famous old computers and parts of computers, photographs and assorted memorabilia, computer-generated murals, and even a computer made of Tinkertoys that plays tick-tack-toe. The development of the museum and selected exhibits is described. (Author/JN)

  14. Computing the Profession.

    ERIC Educational Resources Information Center

    Denning, Peter J.

    1998-01-01

    Discussion of computing as a science and profession examines the chasm between computer scientists and users, barriers to the use and growth of computing, experimental computer science, computational science, software engineering, professional identity, professional practices, applications of technology, innovation, field boundaries, and…

  15. Tying into Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    Topics in this paper include: sources of computer programs, public domain software, copyright violations, purposes of computers in classrooms (drill/practice and interactive learning), computer assisted instruction, flow charts, and computer clubs (such as App-le-kations in Charlotte, North Carolina). A complete listing of two computer programs…

  16. Computational thinking and thinking about computing

    PubMed Central

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing. PMID:18672462

  17. Avoiding Computer Viruses.

    ERIC Educational Resources Information Center

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  18. Computer Viruses: An Overview.

    ERIC Educational Resources Information Center

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  19. Computer Literacy Revisited.

    ERIC Educational Resources Information Center

    Klassen, Daniel

    1983-01-01

    This examination of important trends in the field of computing and education identifies and discusses four key computer literacy goals for educators as well as obstacles facing educators concerned with computer literacy. Nine references are listed. (Author/MBR)

  20. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  1. Computers and Employment.

    ERIC Educational Resources Information Center

    McConnell, Sheila; And Others

    1996-01-01

    Includes "Role of Computers in Reshaping the Work Force" (McConnell); "Semiconductors" (Moris); "Computer Manufacturing" (Warnke); "Commercial Banking Transformed by Computer Technology" (Morisi); "Software, Engineering Industries: Threatened by Technological Change?" (Goodman); "Job Creation…

  2. Computers: Instruments of Change.

    ERIC Educational Resources Information Center

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  3. Environmentalists and the Computer.

    ERIC Educational Resources Information Center

    Baron, Robert C.

    1982-01-01

    Reviews characteristics, applications, and limitations of computers, including word processing, data/record keeping, scientific and industrial, and educational applications. Discusses misuse of computers and the role of computers in environmental management. (JN)

  4. Computing technology in the 1980's. [computers

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  5. Computer Lab Configuration.

    ERIC Educational Resources Information Center

    Wodarz, Nan

    2003-01-01

    Describes the layout and elements of an effective school computer lab. Includes configuration, storage spaces, cabling and electrical requirements, lighting, furniture, and computer hardware and peripherals. (PKP)

  6. Computer hardware fault administration

    DOEpatents

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
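
    The routing idea in the patent can be paraphrased in a few lines: attempt a path through the first network, and if the defective link makes the destination unreachable, fall back to the second, independent network. The sketch below is an illustrative graph-search paraphrase with an invented topology and fault, not the patented implementation.

    ```python
    from collections import deque

    def route(adj, src, dst):
        """Breadth-first path through one data communications network, or None."""
        prev = {src: None}
        queue = deque([src])
        while queue:
            node = queue.popleft()
            if node == dst:
                path = []
                while node is not None:
                    path.append(node)
                    node = prev[node]
                return path[::-1]
            for nxt in adj.get(node, ()):
                if nxt not in prev:
                    prev[nxt] = node
                    queue.append(nxt)
        return None

    # four compute nodes joined by two independent networks (invented topology)
    net1 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    net2 = {0: [2], 2: [0, 3], 3: [2, 1], 1: [3]}

    net1[1].remove(2)                  # fault administration: link 1-2 identified as defective
    net1[2].remove(1)

    print(route(net1, 0, 3) or route(net2, 0, 3))   # [0, 2, 3] via the second network
    ```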

  7. Undergraduate computational physics projects on quantum computing

    NASA Astrophysics Data System (ADS)

    Candela, D.

    2015-08-01

    Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
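
    In the spirit of the easier projects described, a state-vector simulation of Grover's search on three qubits fits in a dozen lines of NumPy; this is a generic textbook construction, not code from the article, and the marked index is invented.

    ```python
    import numpy as np

    n = 3                                   # qubits; the search space has N = 8 items
    N = 2 ** n
    marked = 5                              # invented index for the oracle to flag

    state = np.full(N, 1 / np.sqrt(N))      # uniform superposition after Hadamards

    oracle = np.eye(N)
    oracle[marked, marked] = -1             # phase-flip the marked basis state
    diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about the mean

    for _ in range(int(np.pi / 4 * np.sqrt(N))):         # optimal iteration count, here 2
        state = diffusion @ (oracle @ state)

    probs = state ** 2                      # amplitudes stay real in this construction
    print(probs.argmax(), round(probs[marked], 3))       # 5 0.945
    ```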

  8. French Computer Terminology.

    ERIC Educational Resources Information Center

    Gray, Eugene F.

    1985-01-01

    Characteristics, idiosyncrasies, borrowings, and other aspects of the French terminology for computers and computer-related matters are discussed and placed in the context of French computer use. A glossary provides French equivalent terms or translations of English computer terminology. (MSE)

  9. Computer Viruses. Technology Update.

    ERIC Educational Resources Information Center

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  10. The Glass Computer

    ERIC Educational Resources Information Center

    Paesler, M. A.

    2009-01-01

    Digital computers use different kinds of memory, each of which is either volatile or nonvolatile. On most computers only the hard drive memory is nonvolatile, i.e., it retains all information stored on it when the power is off. When a computer is turned on, an operating system stored on the hard drive is loaded into the computer's memory cache and…

  11. Teach Teachers Computers.

    ERIC Educational Resources Information Center

    Levin, Dan

    1983-01-01

    Suggests eight steps for training teachers to use computers, including establishing a computer committee, structuring and budgeting for inservice courses, cooperating with other schools, and sending some teachers to the computer company's maintenance and repair school. Lists 25 computer skills teachers need and describes California and Minnesota…

  12. My Computer Romance

    ERIC Educational Resources Information Center

    Campbell, Gardner

    2007-01-01

    In this article, the author relates the big role computers have played in his life as a writer. He has been using a computer for nearly twenty years and says that computers have set his writing free. When he started writing, he used an electric typewriter. He also relates that his romance with computers is also a…

  13. Computer Innovations in Education.

    ERIC Educational Resources Information Center

    Molnar, Andrew R.

    Computers in education are put in context by a brief review of current social and technological trends, a short history of the development of computers and the vast expansion of their use, and a brief description of computers and their use. Further chapters describe instructional applications, administrative uses, uses of computers for libraries…

  14. Computer Literacy for Teachers.

    ERIC Educational Resources Information Center

    Sarapin, Marvin I.; Post, Paul E.

    Basic concepts of computer literacy are discussed as they relate to industrial arts/technology education. Computer hardware development is briefly examined, and major software categories are defined, including database management, computer graphics, spreadsheet programs, telecommunications and networking, word processing, and computer assisted and…

  15. Parallel computing works

    SciTech Connect

    Not Available

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  16. Ion Trap Quantum Computing

    DTIC Science & Technology

    2011-12-01

    an inspiring speech at the MIT Physics of Computation 1st Conference in 1981, Feynman proposed the development of a computer that would obey the... on ion trap based quantum computing for physics and computer science students would include lecture notes, slides, lesson plans, a syllabus... reading lists, videos, demonstrations, and laboratories. [1] R. P. Feynman, "Simulating physics with computers," Int. J...

  17. Distributed computing in bioinformatics.

    PubMed

    Jain, Eric

    2002-01-01

    This paper provides an overview of methods and current applications of distributed computing in bioinformatics. Distributed computing is a strategy of dividing a large workload among multiple computers to reduce processing time, or to make use of resources such as programs and databases that are not available on all computers. Participating computers may be connected either through a local high-speed network or through the Internet.
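
    The workload-division strategy the paper describes maps directly onto any worker-pool abstraction. Here is a minimal single-machine sketch using Python's multiprocessing, with local processes standing in for the networked computers and a trivial GC-content function standing in for a costlier analysis; all names and inputs are invented.

    ```python
    from multiprocessing import Pool

    def gc_content(sequence: str) -> float:
        """Toy per-sequence task; imagine an expensive alignment or search here."""
        return (sequence.count("G") + sequence.count("C")) / len(sequence)

    if __name__ == "__main__":
        sequences = ["ATGCGC", "TTTTAA", "GGGCCC", "ATATGC"]  # invented inputs
        with Pool(processes=4) as pool:      # 4 workers stand in for 4 machines
            print(pool.map(gc_content, sequences))
        # -> [0.666..., 0.0, 1.0, 0.333...]
    ```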

  18. Future Computer Requirements for Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  19. Computers and Computation. Readings from Scientific American.

    ERIC Educational Resources Information Center

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  20. Understanding student computational thinking with computational modeling

    NASA Astrophysics Data System (ADS)

    Aiken, John M.; Caballero, Marcos D.; Douglas, Scott S.; Burk, John B.; Scanlon, Erin M.; Thoms, Brian D.; Schatz, Michael F.

    2013-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Ninth-grade students taking a physics course that employed Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than observational) terms tended to have more success in the programming exercise.
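
    The flavor of the assessment task (a ball in motion, modeled step by step) can be conveyed without the VPython 3D scene. The numbers below are invented, and the loop mirrors the force → velocity → position update structure the curriculum teaches.

    ```python
    dt = 0.01                      # time step, s
    g = -9.8                       # gravitational acceleration, m/s^2
    x, y = 0.0, 1.0                # initial position, m
    vx, vy = 30.0, 10.0            # initial velocity, m/s

    while y > 0:                   # step until the ball reaches the ground
        vy += g * dt               # update velocity from the net force
        x += vx * dt               # then position from the velocity
        y += vy * dt

    print(f"range = {x:.1f} m")    # ~64 m for these invented values
    ```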

  1. Heterogeneous Distributed Computing for Computational Aerosciences

    NASA Technical Reports Server (NTRS)

    Sunderam, Vaidy S.

    1998-01-01

    The research supported under this award focuses on heterogeneous distributed computing for high-performance applications, with particular emphasis on computational aerosciences. The overall goal of this project was to investigate issues in, and develop solutions to, the efficient execution of computational aeroscience codes in heterogeneous concurrent computing environments. In particular, we worked in the context of the PVM[1] system and, subsequent to detailed conversion efforts and performance benchmarking, devised novel techniques to increase the efficacy of heterogeneous networked environments for computational aerosciences. Our work has been based upon the NAS Parallel Benchmark suite, but has also recently expanded in scope to include the NAS I/O benchmarks as specified in the NHT-1 document. In this report we summarize our research accomplishments under the auspices of the grant.

  2. Heterotic computing: exploiting hybrid computational devices.

    PubMed

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications.

  3. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  4. Scientific Grid computing.

    PubMed

    Coveney, Peter V

    2005-08-15

    We introduce a definition of Grid computing which is adhered to throughout this Theme Issue. We compare the evolution of the World Wide Web with current aspirations for Grid computing and indicate areas that need further research and development before a generally usable Grid infrastructure becomes available. We discuss work that has been done in order to make scientific Grid computing a viable proposition, including the building of Grids, middleware developments, computational steering and visualization. We review science that has been enabled by contemporary computational Grids, and associated progress made through the widening availability of high performance computing.

  5. Computational aerodynamics and design

    NASA Technical Reports Server (NTRS)

    Ballhaus, W. F., Jr.

    1982-01-01

    The role of computational aerodynamics in design is reviewed with attention given to the design process; the proper role of computations; the importance of calibration, interpretation, and verification; the usefulness of a given computational capability; and the marketing of new codes. Examples of computational aerodynamics in design are given with particular emphasis on the Highly Maneuverable Aircraft Technology. Finally, future prospects are noted, with consideration given to the role of advanced computers, advances in numerical solution techniques, turbulence models, complex geometries, and computational design procedures. Previously announced in STAR as N82-33348

  6. Light multinary computing

    NASA Astrophysics Data System (ADS)

    Arago, Jaime

    2012-11-01

    Next-generation optical communication and optical computing imply an evolution from binary to multinary computing. Light multinary computing encodes data using pulses of light components in higher orders than binary and processes it using truth tables larger than Boolean ones. This results in lesser encoded data that can be processed at faster speeds. We use a general-purpose optical transistor as the building block to develop the main computing units for counting, distributing, storing, and logically operating the arithmetic addition of two bytes of base-10 data. Currently available optical switching technologies can be used to physically implement light multinary computing to achieve ultra-high speed communication and computing.

  7. Computers in manufacturing.

    PubMed

    Hudson, C A

    1982-02-12

    Computers are now widely used in product design and in automation of selected areas in factories. Within the next decade, the use of computers in the entire spectrum of manufacturing applications, from computer-aided design to computer-aided manufacturing and robotics, is expected to be practical and economically justified. Such widespread use of computers on the factory floor awaits further advances in computer capabilities, the emergence of systems that are adaptive to the workplace, and the development of interfaces to link islands of automation and to allow effective user communications.

  8. Polymorphous computing fabric

    DOEpatents

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  9. Moving Beyond Computer Literacy.

    ERIC Educational Resources Information Center

    Rhodes, Lewis A.

    1985-01-01

    This article reviews Sherry Turkle's book, "The Second Self: Computers and the Human Spirit," which explores the subjective impact of the computer on children, adults, and our consciousness and culture in general. Two references are provided. (DCS)

  10. Computer Intrusions and Attacks.

    ERIC Educational Resources Information Center

    Falk, Howard

    1999-01-01

    Examines some frequently encountered unsolicited computer intrusions, including computer viruses, worms, Java applications, trojan horses or vandals, e-mail spamming, hoaxes, and cookies. Also discusses virus-protection software, both for networks and for individual users. (LRW)

  11. Human Computers 1947

    NASA Technical Reports Server (NTRS)

    1947-01-01

    Langley's human computers at work in 1947: the women at Langley who performed mathematical computations for male staff. Photograph published in Winds of Change, 75th Anniversary NASA publication (page 48), by James Schultz.

  12. Computed Tomography (CT) -- Head

    MedlinePlus

    ... ray beam follows a spiral path. A special computer program processes this large volume of data to create ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  13. Computed Tomography (CT) - Spine

    MedlinePlus

    ... ray beam follows a spiral path. A special computer program processes this large volume of data to create ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  14. Dedicated to computation

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2017-03-01

    Princeton University astrophysicist David Spergel, founding director of the Center for Computational Astrophysics at the Flatiron Institute, talks to Michael Banks about plans to make New York a major centre for computational science.

  15. Novel Applications of Computers

    ERIC Educational Resources Information Center

    Levi, Barbara G.

    1970-01-01

    Presents some novel applications of the computer to physics research. They include (1) a computer program for calculating Compton scattering, (2) speech simulation, (3) data analysis in spectrometry, and (4) measurement of complex alpha-particle spectrum. Bibliography. (LC)

  16. The Merit Computer Network

    ERIC Educational Resources Information Center

    Aupperle, Eric M.; Davis, Donna L.

    1978-01-01

    The successful Merit Computer Network is examined in terms of both technology and operational management. The network is fully operational and has a significant and rapidly increasing usage, with three major institutions currently sharing computer resources. (Author/CMV)

  17. Computer Crime and Insurance.

    ERIC Educational Resources Information Center

    Beaudoin, Ralph H.

    1985-01-01

    The susceptibility of colleges and universities to computer crime is great. While insurance coverage is available to cover the risks, an aggressive loss-prevention program is the wisest approach to limiting the exposures presented by computer technology. (MLW)

  18. Computational Toxicology (S)

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. Th...

  19. Cognitive Computing for Security.

    SciTech Connect

    Debenedictis, Erik; Rothganger, Fredrick; Aimone, James Bradley; Marinella, Matthew; Evans, Brian Robert; Warrender, Christina E.; Mickel, Patrick

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  20. Computers in Manufacturing.

    ERIC Educational Resources Information Center

    Hudson, C. A.

    1982-01-01

    Advances in factory computerization (computer-aided design and computer-aided manufacturing) are reviewed, including discussions of robotics, human factors engineering, and the sociological impact of automation. (JN)

  1. Jacobi Set Computation

    SciTech Connect

    Bhatia, Harsh

    2016-07-28

    Jacobi Set Computation is software that computes the Jacobi set of two piecewise linear scalar functions defined on a triangular mesh. This functionality is useful for analyzing multiple scalar fields simultaneously.

  2. My Computer Is Learning.

    ERIC Educational Resources Information Center

    Good, Ron

    1986-01-01

    Describes instructional uses of computer programs found in David Heiserman's book "Projects in Machine Intelligence for Your Home Computer." The programs feature "creatures" of various colors that move around within a rectangular white border. (JN)

  3. Computers in Education.

    ERIC Educational Resources Information Center

    Rao, K. Srinivasa

    1991-01-01

    The important role of the digital computer in education is outlined. The existing Indian situation and needs and the scope of the Computer Literacy And Studies in Schools (CLASS) program are briefly discussed. (Author/KR)

  4. ICASE Computer Science Program

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  5. Quantum computing and probability.

    PubMed

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
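
    A minimal numerical sketch (using NumPy for illustration) of where probability enters: amplitudes evolve deterministically under a gate, and only the Born rule turns them into outcome probabilities.

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
        state = np.array([1.0, 0.0])                  # qubit prepared in |0>
        state = H @ state                             # deterministic evolution to (|0> + |1>)/sqrt(2)
        probs = np.abs(state) ** 2                    # Born rule: probabilities from amplitudes
        print(probs)                                  # [0.5 0.5]; the measured outcome is irreducibly random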

  6. Space Spurred Computer Graphics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  7. The Computing World

    DTIC Science & Technology

    1992-04-01

    ...to modern computers. In the 1840s Augusta Ada, the Countess of Lovelace, translated and wrote several scientific papers regarding Charles Babbage's ideas on an analytical engine problem-solving machine. Babbage, "the father of computers," developed ways to store results via memory devices. Ada... simple arithmetic calculating machines to small, complex and powerful integrated-circuit computers. Jacquard, Babbage and Lovelace set the computer...

  8. Chapter on Distributed Computing

    DTIC Science & Technology

    1989-02-01

    Massachusetts Institute of Technology, Laboratory for Computer Science, MIT/LCS/TM-384: Chapter on Distributed Computing, by Leslie Lamport and Nancy... Keywords: distributed computing, distributed systems models, distributed algorithms, message-passing, shared variables.

  9. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  10. Nanoelectronics: Metrology and Computation

    SciTech Connect

    Lundstrom, Mark; Clark, Jason V.; Klimeck, Gerhard; Raman, Arvind

    2007-09-26

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (www.nanoHUB.org) as a specific example.

  11. Physics and Computation

    DTIC Science & Technology

    1988-03-01

    ...a computer for which the time development is generated by the Schrödinger equation must be a reversible computer. Feynman presented the first convincing... Contents fragments include: time-evolution operator approach; Hamiltonian operator approach.

  12. BNL ATLAS Grid Computing

    ScienceCinema

    Michael Ernst

    2016-07-12

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  13. Computer Training at Harwell

    ERIC Educational Resources Information Center

    Hull, John

    1969-01-01

    By using teletypewriters connected to the Harwell multi-access computing system, lecturers can easily demonstrate the operation of the computer in the classroom; this saves time and eliminates errors, and staff can carry out exercises using the main computer. (EB)

  14. Children's Computer Drawings.

    ERIC Educational Resources Information Center

    Alexander, David

    Computer drawing programs have several characteristics that make them appropriate for use in early childhood education. Drawing at the computer is an activity that captures and holds children's attention. Children at all developmental levels of graphic ability can draw at the computer, and their products can be stored in a disc or printed for…

  15. Optimizing Computer Technology Integration

    ERIC Educational Resources Information Center

    Dillon-Marable, Elizabeth; Valentine, Thomas

    2006-01-01

    The purpose of this study was to better understand what optimal computer technology integration looks like in adult basic skills education (ABSE). One question guided the research: How is computer technology integration best conceptualized and measured? The study used the Delphi method to map the construct of computer technology integration and…

  16. The Challenge of Computers.

    ERIC Educational Resources Information Center

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  17. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1990-01-01

    Four applications of microcomputers in the chemical laboratory are presented. Included are "Mass Spectrometer Interface with an Apple II Computer,""Interfacing the Spectronic 20 to a Computer,""A pH-Monitoring and Control System for Teaching Laboratories," and "A Computer-Aided Optical Melting Point Device." Software, instrumentation, and uses are…

  18. Reading, Writing, and Computing.

    ERIC Educational Resources Information Center

    Adams, Dennis M.; And Others

    Reading, writing, and computing, which are interrelated and can thrive on each other for literacy and intellectual growth, are in the process of becoming linked in instructional practice. As reading and writing become more demanding, their task is eased with computer use. The computer seems to provide the connection between composing,…

  19. Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Ioannidou, Andri; Bennett, Vicki; Repenning, Alexander; Koh, Kyu Han; Basawapatna, Ashok

    2011-01-01

    The iDREAMS project aims to reinvent Computer Science education in K-12 schools, by using game design and computational science for motivating and educating students through an approach we call Scalable Game Design, starting at the middle school level. In this paper we discuss the use of Computational Thinking Patterns as the basis for our…

  20. Computer Aided Manufacturing.

    ERIC Educational Resources Information Center

    Insolia, Gerard

    This document contains course outlines in computer-aided manufacturing developed for a business-industry technology resource center for firms in eastern Pennsylvania by Northampton Community College. The four units of the course cover the following: (1) introduction to computer-assisted design (CAD)/computer-assisted manufacturing (CAM); (2) CAM…

  1. Computers + Student Activities Handbook.

    ERIC Educational Resources Information Center

    Masie, Elliott; Stein, Michele

    Designed to provide schools with the tools to start utilizing computers for student activity programs without additional expenditures, this handbook provides beginning computer users with suggestions and ideas for using computers in such activities as drama clubs, yearbooks, newspapers, activity calendars, accounting programs, room utilization,…

  2. Computers in Engineering Teaching.

    ERIC Educational Resources Information Center

    Rushby, N. J.

    This bibliography cites 26 books, papers, and reports dealing with various uses of computers in engineering education; and describes several computer programs available for use in teaching aeronautical, chemical, civil, electrical and electronic, mechanical, and nuclear engineering. Each computer program entry is presented by name, author,…

  3. Getting To Know Computers.

    ERIC Educational Resources Information Center

    Lundgren, Mary Beth

    Originally written for adult new readers involved in literacy programs, this book is also helpful to those individuals who want a basic book about computers. It uses the carefully controlled vocabulary with which adult new readers are familiar. Chapter 1 addresses the widespread use of computers. Chapter 2 discusses what a computer is and…

  4. Education for Computers

    ERIC Educational Resources Information Center

    Heslep, Robert D.

    2012-01-01

    The computer engineers who refer to the education of computers do not have a definite idea of education and do not bother to justify the fuzzy ones to which they allude. Hence, they logically cannot specify the features a computer must have in order to be educable. This paper puts forth a non-standard, but not arbitrary, concept of education that…

  5. The fifth generation computer

    SciTech Connect

    Moto-Oka, T.; Kitsuregawa, M.

    1985-01-01

    The leader of Japan's Fifth Generation computer project, known as the 'Apollo' project, and a young computer scientist elucidate in this book the process of how the idea came about, international reactions, the basic technology, prospects for realization, and the abilities of the Fifth Generation computer. Topics considered included forecasting, research programs, planning, and technology impacts.

  6. Computers in Science Fiction.

    ERIC Educational Resources Information Center

    La Faille, Eugene

    1985-01-01

    This 136-item annotated bibliography listing science fiction suitable for 12-19 age range is divided into five sections, each organized alphabetically by author's name: in-print novels featuring computers, in-print anthologies of computer stories, in-print studies of computer science fiction, out-of-print novels, short stories. Current publication…

  7. Computer applications in bioprocessing.

    PubMed

    Bungay, H R

    2000-01-01

    Biotechnologists have stayed at the forefront for practical applications for computing. As hardware and software for computing have evolved, the latest advances have found eager users in the area of bioprocessing. Accomplishments and their significance can be appreciated by tracing the history and the interplay between the computing tools and the problems that have been solved in bioprocessing.

  8. Quantum walk computation

    SciTech Connect

    Kendon, Viv

    2014-12-04

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer.

  9. Computer Center: CIBE Systems.

    ERIC Educational Resources Information Center

    Crovello, Theodore J.

    1982-01-01

    Differentiates between computer systems and Computers in Biological Education (CIBE) systems (computer systems intended for use in biological education). Describes several CIBE stand-alone systems: single-user microcomputer; single-user microcomputer/video-disc; multiuser microcomputers; multiuser maxicomputer; and local and long distance computer…

  10. BNL ATLAS Grid Computing

    SciTech Connect

    Michael Ernst

    2008-10-02

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  11. Writing, Computers, and Gender.

    ERIC Educational Resources Information Center

    Beer, Ann

    1994-01-01

    Uses brief accounts by undergraduate students of their experiences with computers and word processing to investigate gender-related differences in attitudes toward computers and to explore why computers seem to reflect back to many women a learned sense of technical incompetence. Focuses on themes of power, caring, and self-esteem. (SR)

  12. Chippy's Computer Numbers.

    ERIC Educational Resources Information Center

    Girard, Suzanne; Willing, Kathlene R.

    Intended for young children just becoming familiar with computers, this counting book introduces and reinforces new computer vocabulary and concepts. The numbers, from one to twelve, are presented along with words and illustrations from the world of computers, allowing for different activities in which children can count or match and name the…

  13. Computing environment logbook

    DOEpatents

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.

  14. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
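
    As an illustrative sketch of the plausibility-over-accuracy trade-off, the following hypothetical 1-D semi-Lagrangian advection step (in Python/NumPy) stays stable at large time steps, which is why variants of it underpin many grid-based animation solvers.

        import numpy as np

        def advect(q, u, dt, dx):
            # Semi-Lagrangian step: trace each sample point backwards along the
            # velocity and interpolate; unconditionally stable but diffusive.
            n = q.size
            x_back = np.arange(n) * dx - u * dt
            i = np.clip(np.floor(x_back / dx).astype(int), 0, n - 2)
            t = np.clip(x_back / dx - i, 0.0, 1.0)    # linear interpolation weight
            return (1 - t) * q[i] + t * q[i + 1]

        q = np.exp(-((np.arange(100) - 20.0) ** 2) / 20.0)  # a smooth blob of smoke density
        for _ in range(50):
            q = advect(q, u=1.0, dt=0.5, dx=1.0)            # visually plausible transport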

  15. The science of computing - Parallel computation

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1985-01-01

    Although parallel computation architectures have been known for computers since the 1920s, it was only in the 1970s that microelectronic component technologies advanced to the point where it became feasible to incorporate multiple processors in one machine. Concomitantly, the development of algorithms for parallel processing also lagged due to hardware limitations. The speed of computing with solid-state chips is limited by gate switching delays. The physical limit implies that a 1 Gflop operational speed is the maximum for sequential processors. A computer recently introduced features a 'hypercube' architecture with 128 processors connected in networks at 5, 6 or 7 points per grid, depending on the design choice. Its computing speed rivals that of supercomputers, but at a fraction of the cost. The added speed with less hardware is due to parallel processing, which utilizes algorithms representing different parts of an equation that can be broken into simpler statements and processed simultaneously. Present, highly developed computer languages like FORTRAN, PASCAL, COBOL, etc., rely on sequential instructions. Thus, increased emphasis will now be directed at parallel processing algorithms to exploit the new architectures.
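
    The hypercube interconnect mentioned above has a simple addressing rule: in a d-dimensional hypercube (d = 7 for a 128-processor machine), node i is wired to the d nodes whose binary labels differ from i in exactly one bit. A small sketch:

        def hypercube_neighbors(node, d):
            # Flip each of the d address bits in turn to enumerate the neighbors.
            return [node ^ (1 << k) for k in range(d)]

        print(hypercube_neighbors(0, 7))   # node 0 links to nodes 1, 2, 4, 8, 16, 32, 64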

  16. Neural Computation and the Computational Theory of Cognition

    ERIC Educational Resources Information Center

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  17. Scalable optical quantum computer

    SciTech Connect

    Manykin, E A; Mel'nichenko, E V

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr{sup 3+}, regularly located in the lattice of the orthosilicate (Y{sub 2}SiO{sub 5}) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  18. Computer algebra and operators

    NASA Technical Reports Server (NTRS)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
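
    A brief illustration using SymPy: declaring symbols noncommutative makes products keep their order, which is the essential requirement for manipulating operator expansions symbolically.

        import sympy as sp

        # Operators do not commute, so (A + B)**2 must keep A*B and B*A distinct.
        A, B = sp.symbols("A B", commutative=False)
        print(sp.expand((A + B) ** 2))   # A**2 + A*B + B*A + B**2, not A**2 + 2*A*B + B**2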

  19. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  20. ALMA correlator computer systems

    NASA Astrophysics Data System (ADS)

    Pisano, Jim; Amestica, Rodrigo; Perez, Jesus

    2004-09-01

    We present a design for the computer systems which control, configure, and monitor the Atacama Large Millimeter Array (ALMA) correlator and process its output. Two distinct computer systems implement this functionality: a rack-mounted PC controls and monitors the correlator, and a cluster of 17 PCs process the correlator output into raw spectral results. The correlator computer systems interface to other ALMA computers via gigabit Ethernet networks utilizing CORBA and raw socket connections. ALMA Common Software provides the software infrastructure for this distributed computer environment. The control computer interfaces to the correlator via multiple CAN busses and the data processing computer cluster interfaces to the correlator via sixteen dedicated high speed data ports. An independent array-wide hardware timing bus connects to the computer systems and the correlator hardware ensuring synchronous behavior and imposing hard deadlines on the control and data processor computers. An aggregate correlator output of 1 gigabyte per second with 16 millisecond periods and computational data rates of approximately 1 billion floating point operations per second define other hard deadlines for the data processing computer cluster.
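
    A back-of-envelope check of the rates quoted above, using only figures from the abstract: 1 gigabyte per second arriving in 16 millisecond periods, spread over sixteen dedicated data ports.

        rate = 1e9                    # bytes per second, aggregate correlator output
        period = 16e-3                # seconds per correlator output period
        ports = 16                    # dedicated high-speed data ports
        per_period = rate * period    # bytes that must be absorbed each period
        print(per_period / 1e6, "MB per period;",
              per_period / ports / 1e6, "MB per port per period")
        # 16.0 MB per period; 1.0 MB per port per period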

  1. Richard Feynman and computation

    NASA Astrophysics Data System (ADS)

    Hey, Tony

    1999-04-01

    The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.

  2. Optimal Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Mantri, Atul; Pérez-Delgado, Carlos A.; Fitzsimons, Joseph F.

    2013-12-01

    Blind quantum computation allows a client with limited quantum capabilities to interact with a remote quantum computer to perform an arbitrary quantum computation, while keeping the description of that computation hidden from the remote quantum computer. While a number of protocols have been proposed in recent years, little is currently understood about the resources necessary to accomplish the task. Here, we present general techniques for upper and lower bounding the quantum communication necessary to perform blind quantum computation, and use these techniques to establish concrete bounds for common choices of the client's quantum capabilities. Our results show that the universal blind quantum computation protocol of Broadbent, Fitzsimons, and Kashefi comes within a factor of 8/3 of optimal when the client is restricted to preparing single qubits. However, we describe a generalization of this protocol which requires exponentially less quantum communication when the client has a more sophisticated device.

  3. Computer-assisted psychotherapy

    PubMed Central

    Wright, Jesse H.; Wright, Andrew S.

    1997-01-01

    The rationale for using computers in psychotherapy includes the possibility that therapeutic software could improve the efficiency of treatment and provide access for greater numbers of patients. Computers have not been able to reliably duplicate the type of dialogue typically used in clinician-administered therapy. However, computers have significant strengths that can be used to advantage in designing treatment programs. Software developed for computer-assisted therapy generally has been well accepted by patients. Outcome studies have usually demonstrated treatment effectiveness for this form of therapy. Future development of computer tools may be influenced by changes in health care financing and rapid growth of new technologies. An integrated care delivery model incorporating the unique attributes of both clinicians and computers should be adopted for computer-assisted therapy. PMID:9292446

  4. Hyperswitch Communication Network Computer

    NASA Technical Reports Server (NTRS)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

    Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among the following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  5. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  6. Assessment of sub-milli-sievert abdominal computed tomography with iterative reconstruction techniques of different vendors

    PubMed Central

    Padole, Atul; Sainani, Nisha; Lira, Diego; Khawaja, Ranish Deedar Ali; Pourjabbar, Sarvenaz; Lo Gullo, Roberto; Otrakji, Alexi; Kalra, Mannudeep K

    2016-01-01

    AIM: To assess the diagnostic image quality of reduced dose (RD) abdominal computed tomography (CT) with 9 iterative reconstruction techniques (IRTs) from 4 different vendors relative to standard of care (SD) CT. METHODS: In an Institutional Review Board approved study, 66 patients (mean age 60 ± 13 years, 44 men, and 22 women) undergoing routine abdomen CT on multi-detector CT (MDCT) scanners from vendors A, B, and C (≥ 64 row CT scanners) (22 patients each) gave written informed consent for acquisition of an additional RD CT series. Sinogram data of RD CT were reconstructed with two vendor-specific IRTs and a vendor-neutral IRT (A-1, A-2, A-3; B-1, B-2, B-3; and C-1, C-2, C-3), and the SD CT series with filtered back projection. Subjective image evaluation was performed for each SD and RD CT series by two radiologists, blinded and independently. All RD CT series (198) were assessed first, followed by the SD CT series (66). Objective image noise was measured for the SD and RD CT series. Data were analyzed by Wilcoxon signed rank, kappa, and analysis of variance tests. RESULTS: There were 13/50, 18/57 and 9/40 missed lesions (size 2-7 mm) on RD CT for vendors A, B, and C, respectively. Missed lesions included liver cysts, kidney cysts and stones, gallstones, fatty liver, and pancreatitis. There were also 5, 4, and 4 pseudo lesions (size 2-3 mm) on RD CT for vendors A, B, and C, respectively. Lesion conspicuity was sufficient for clinical diagnostic performance for 6/24 (RD-A-1), 10/24 (RD-A-2), and 7/24 (RD-A-3) lesions for vendor A; 5/26 (RD-B-1), 6/26 (RD-B-2), and 7/26 (RD-B-3) lesions for vendor B; and 4/20 (RD-C-1), 6/20 (RD-C-2), and 10/20 (RD-C-3) lesions for vendor C (P = 0.9). Mean objective image noise in the liver was significantly lower for RD A-1 compared to both RD A-2 and RD A-3 images (P < 0.001). Similarly, mean objective image noise was lower for RD B-2 (compared to RD B-1 and RD B-3) and RD C-3 (compared to RD C-1 and C-2) (P = 0.016). CONCLUSION: Regardless of IRTs and MDCT vendors

  7. Computational Biology and High Performance Computing 2000

    SciTech Connect

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  8. Computers and neurosurgery.

    PubMed

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century.

  9. Quantum analogue computing.

    PubMed

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
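
    The precision cost described above can be made concrete in one line of arithmetic: under analogue (direct) encoding each extra bit of precision doubles the machine, whereas a digital register grows by one bit.

        # Analogue encoding: resources scale as 2**bits; digital encoding: as bits.
        for bits in (8, 16, 24):
            print(bits, "bits ->", 2 ** bits, "x base resources (analogue) vs",
                  bits, "units (digital)")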

  10. Computer Health Score

    SciTech Connect

    2016-08-03

    The algorithm develops a single health score for office computers, today just Windows, but we plan to extend this to Apple computers. The score is derived from various parameters, including CPU utilization, memory utilization, various error logs, disk problems, and disk write queue length. It then uses a weighting scheme to balance these parameters and provide an overall health score. By using these parameters, we are not just assessing the theoretical performance of the components of the computer; rather, we are using actual performance metrics selected to be a more realistic representation of the experience of the person using the computer. This includes compensating for the nature of their use. If there are two identical computers and the user of one places heavy demands on their computer compared with the user of the second computer, the former will have a lower health score. This allows us to provide a 'fit for purpose' score tailored to the assigned user. This is very helpful data for informing managers when individual computers need to be replaced. Additionally, it provides specific information that can facilitate the fixing of the computer, to extend its useful lifetime. This presents direct financial savings, time savings for users transferring from one computer to the next, and better environmental stewardship.
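
    A minimal sketch of such a weighted scheme; the metric names, normalization, and weights below are hypothetical illustrations, not the actual parameters of this algorithm.

        # Hypothetical weights over normalized "badness" metrics (0 = healthy, 1 = worst).
        WEIGHTS = {"cpu_util": 0.25, "mem_util": 0.25, "error_rate": 0.20,
                   "disk_errors": 0.15, "disk_queue": 0.15}

        def health_score(metrics):
            # Weighted badness folded into a single 0-100 score; higher is healthier.
            badness = sum(w * min(max(metrics[k], 0.0), 1.0)
                          for k, w in WEIGHTS.items())
            return round(100 * (1 - badness))

        print(health_score({"cpu_util": 0.9, "mem_util": 0.7, "error_rate": 0.1,
                            "disk_errors": 0.0, "disk_queue": 0.2}))   # -> 55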

  11. Architecture Adaptive Computing Environment

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    2006-01-01

    Architecture Adaptive Computing Environment (aCe) is a software system that includes a language, compiler, and run-time library for parallel computing. aCe was developed to enable programmers to write programs, more easily than was previously possible, for a variety of parallel computing architectures. Heretofore, it has been perceived to be difficult to write parallel programs for parallel computers and more difficult to port the programs to different parallel computing architectures. In contrast, aCe is supportable on all high-performance computing architectures. Currently, it is supported on LINUX clusters. aCe uses parallel programming constructs that facilitate writing of parallel programs. Such constructs were used in single-instruction/multiple-data (SIMD) programming languages of the 1980s, including Parallel Pascal, Parallel Forth, C*, *LISP, and MasPar MPL. In aCe, these constructs are extended and implemented for both SIMD and multiple-instruction/multiple-data (MIMD) architectures. Two new constructs incorporated in aCe are those of (1) scalar and virtual variables and (2) pre-computed paths. The scalar-and-virtual-variables construct increases flexibility in optimizing memory utilization in various architectures. The pre-computed-paths construct enables the compiler to pre-compute part of a communication operation once, rather than computing it every time the communication operation is performed.

  12. Left ventricular remodelling and systolic function measurement with 64 multi-slice computed tomography versus second harmonic echocardiography in patients with coronary artery disease: a double blind study.

    PubMed

    Palazzuoli, Alberto; Cademartiri, Filippo; Geleijnse, Marcel L; Meijboom, Bob; Pugliese, Francesca; Soliman, Osama; Calabrò, Anna; Nuti, Ranuccio; de Feyter, Pim

    2010-01-01

    The present study evaluated LV volumes, ejection fraction (LVEF) and stroke volume (SV) obtained by 64-MDCT and compared these data with those obtained by second harmonic 2D Echo in patients referred for non-invasive coronary vessel evaluation. The most common technique used in daily clinical practice for determination of LV function is two-dimensional echocardiography (2D-TTE). Multi-detector computed tomography (MDCT) is an emerging technique to detect coronary artery disease (CAD) and was recently proposed to assess LV function. 93 patients underwent 64-MDCT for LV function and volume assessment by a segmental reconstruction algorithm (Argus), compared with recent (within 2 months) 2D-TTE; all images were processed and interpreted by two observers blinded to the Echo and MDCT results. A close correlation between TTE and 64-MDCT was demonstrated for the ejection fraction LVEF (r=0.84), end-diastolic volume LVEDV (r=0.80) and end-systolic volume LVESV (r=0.85); acceptable correlation was obtained for stroke volume LVSV (r=0.58). Optimal results were obtained for inter-observer variability for 64-MDCT, measured in 45 patients: LVESV (r=0.82, p<0.001), LVEDV (r=0.83, p<0.001), LVEF (r=0.69, p<0.002) and SV (r=0.66, p<0.001). Our results showed that the functional and temporal information contained in a coronary 64-MDCT study can be used to assess left ventricular (LV) systolic function and LV dimensions with good reproducibility and acceptable correlation with respect to 2D-TTE. The combination of non-invasive coronary artery imaging and assessment of global LV function might become in the future a fast and conclusive cardiac work-up in patients with CAD.
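
    The agreement figures above are correlation coefficients; as a sketch, a Pearson r between paired measurements from the two modalities can be computed as follows (the values are hypothetical, not study data).

        import numpy as np

        echo = np.array([55.0, 48.0, 62.0, 35.0, 50.0])   # 2D-TTE LVEF, % (hypothetical)
        mdct = np.array([57.0, 46.0, 60.0, 38.0, 53.0])   # paired 64-MDCT LVEF, % (hypothetical)
        r = np.corrcoef(echo, mdct)[0, 1]                 # Pearson correlation coefficient
        print(round(r, 2))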

  13. Computational approaches to computational aero-acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

    The various techniques by which the goal of computational aeroacoustics (the calculation and prediction of the noise produced by a fluctuating fluid flow) may be achieved are reviewed. The governing equations for compressible fluid flow are presented. The direct numerical simulation approach is shown to be computationally intensive for high Reynolds number viscous flows. Therefore, other approaches, such as the acoustic analogy, vortex models and various perturbation techniques that aim to break the analysis into a viscous part and an acoustic part, are presented. The choice of approach is shown to be problem dependent.
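
    The acoustic analogy mentioned above rearranges the exact compressible-flow equations into a wave equation driven by a quadrupole source; in Lighthill's classical form,

        \frac{\partial^2 \rho}{\partial t^2} - c_0^2 \nabla^2 \rho
            = \frac{\partial^2 T_{ij}}{\partial x_i \, \partial x_j},
        \qquad
        T_{ij} = \rho u_i u_j + \left(p - c_0^2 \rho\right)\delta_{ij} - \tau_{ij},

    where rho is the density, c_0 the ambient sound speed, tau_ij the viscous stress tensor, and T_ij the Lighthill stress tensor computed from the flow solution.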

  14. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host to the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  15. Reconfigurable Computing for High Performance Computing Computational Science

    DTIC Science & Technology

    2007-06-01

    ...the C code for validation; a key test of the Blowfish algorithm is composed of the secret key load, pre-processing, and encryption/decryption... the algorithm supports a hash operation... standard execution of encryption and decryption functions with a constant secret key for a set of...

  16. LPF Computation Revisited

    NASA Astrophysics Data System (ADS)

    Crochemore, Maxime; Ilie, Lucian; Iliopoulos, Costas S.; Kubica, Marcin; Rytter, Wojciech; Waleń, Tomasz

    We present efficient algorithms for storing past segments of a text. They are computed using two previously computed read-only arrays (SUF and LCP) composing the Suffix Array of the text. They compute the maximal length of the previous factor (subword) occurring at each position of the text in a table called LPF. This notion is central both in many conservative text compression techniques and in the most efficient algorithms for detecting motifs and repetitions occurring in a text.
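
    For reference, a naive quadratic version that follows the definition directly (the paper's point is computing this far more efficiently from the SUF and LCP arrays): LPF[i] is the maximal length of a factor starting at position i that also occurs starting at some earlier position.

        def lpf_naive(text):
            # Quadratic reference implementation of the LPF table, for checking only.
            n = len(text)
            lpf = [0] * n
            for i in range(n):
                for j in range(i):
                    l = 0
                    while i + l < n and text[j + l] == text[i + l]:
                        l += 1          # occurrences may overlap (j + l can exceed i)
                    lpf[i] = max(lpf[i], l)
            return lpf

        print(lpf_naive("abaabaa"))   # [0, 0, 1, 4, 3, 2, 1]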

  17. Cluster State Quantum Computing

    DTIC Science & Technology

    2012-12-01

    Reference fragments: "...implementation of quantum computation," Fortschr. Phys. 48, 771 (2000); [Dragoman01] D. Dragoman, "Proposal for a three-qubit teleportation experiment," Phys... Interim technical report covering November 2010 to October 2012; approved for public release, distribution unlimited.

  18. Threats to Computer Systems

    DTIC Science & Technology

    1973-03-01

    ...subjects and objects of attacks contribute to the uniqueness of computer-related crime. For example, as the cashless, checkless society approaches... advancing computer technology and security methods, and the proliferation of computers in bringing about the paperless society. The universal use of... organizations do to society. Jerry Schneider, one of the known perpetrators, said that he was motivated to perform his acts to make money, for the...

  19. Mobile computing for radiology.

    PubMed

    Auffermann, William F; Chetlen, Alison L; Sharma, Arjun; Colucci, Andrew T; DeQuesada, Ivan M; Grajo, Joseph R; Kung, Justin W; Loehfelm, Thomas W; Sherry, Steven J

    2013-12-01

    The rapid advances in mobile computing technology have the potential to change the way radiology and medicine as a whole are practiced. Several mobile computing advances have not yet found application to the practice of radiology, while others have already been applied to radiology but are not in widespread clinical use. This review addresses several areas where radiology and medicine in general may benefit from adoption of the latest mobile computing technologies and speculates on potential future applications.

  20. The Rabi Quantum Computer

    DTIC Science & Technology

    2001-04-01

    DTIC Compilation Part Notice ADP010869: The Rabi Quantum Computer, by Rudolph A. Krutar, Advanced Information Technology, U.S... Part of the compilation report ADP010865 through ADP010894; approved for public release.

  1. Cluster State Quantum Computation

    DTIC Science & Technology

    2014-02-01

    ...the nearest neighbor cluster state has been shown to be a universal resource for measurement-based quantum computation (MBQC); thus we can say our quantum computer is universal. We note that... Final technical report, February 2014; approved for public release, distribution unlimited.

  2. Sensor sentinel computing device

    DOEpatents

    Damico, Joseph P.

    2016-08-02

    Technologies pertaining to authenticating data output by sensors in an industrial environment are described herein. A sensor sentinel computing device receives time-series data from a sensor by way of a wireline connection. The sensor sentinel computing device generates a validation signal that is a function of the time-series signal. The sensor sentinel computing device then transmits the validation signal to a programmable logic controller in the industrial environment.
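
    As an illustrative sketch only (the key handling and framing here are hypothetical, not the patented design), a validation signal that is a function of the time-series can be computed as a keyed digest over a window of samples:

        import hmac, hashlib, struct

        KEY = b"secret-provisioned-to-sentinel-and-plc"   # hypothetical shared key

        def validation_signal(samples):
            # Pack the window of readings and digest it; the PLC recomputes and compares.
            payload = b"".join(struct.pack("<d", s) for s in samples)
            return hmac.new(KEY, payload, hashlib.sha256).hexdigest()

        window = [20.1, 20.3, 20.2, 19.9]     # one window of sensor readings
        print(validation_signal(window))      # transmitted alongside the data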

  3. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian elimination on parallel computers; (3) three-dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete Cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.
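
    Touching items (3) and (5) above, a red-black (two-color) Gauss-Seidel sweep for the 2-D Poisson equation shows why multi-color orderings suit parallel and vector machines: every point of one color can be updated independently of the others. A minimal sketch:

        import numpy as np

        def red_black_sweep(u, f, h):
            # One Gauss-Seidel sweep in red-black order for the 5-point Poisson stencil.
            for color in (0, 1):
                for i in range(1, u.shape[0] - 1):
                    for j in range(1, u.shape[1] - 1):
                        if (i + j) % 2 == color:   # same-color updates are independent
                            u[i, j] = 0.25 * (u[i-1, j] + u[i+1, j] +
                                              u[i, j-1] + u[i, j+1] - h * h * f[i, j])
            return u

        u, f = np.zeros((32, 32)), np.ones((32, 32))
        for _ in range(100):
            u = red_black_sweep(u, f, h=1.0 / 31)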

  4. Systemization of Secure Computation

    DTIC Science & Technology

    2015-11-01

    studied MPC paradigm. 15. SUBJECT TERMS Garbled Circuits, Secure Multiparty Computation, SMC, Multiparty Computation, MPC, Server- aided computation 16...that may well happen for non-trivial input sizes and algorithms. One way to allow mobile devices to perform 2P-SFE is to use a server- aided ...Previous cryptographic work in a 3-party model (also referred as commodity-based, server-assisted, server- aided model) seems to have originated in [1], with

  5. Factors Affecting Computer Anxiety in High School Computer Science Students.

    ERIC Educational Resources Information Center

    Hayek, Linda M.; Stephens, Larry

    1989-01-01

    Examines factors related to computer anxiety measured by the Computer Anxiety Index (CAIN). Achievement in two programing courses was inversely related to computer anxiety. Students who had a home computer and had computer experience before high school had lower computer anxiety than those who had not. Lists 14 references. (YP)

  6. Fifth generation computers

    NASA Astrophysics Data System (ADS)

    Treleaven, Philip C.; Lima, Isabel Gouveia

    1982-06-01

    Fifth generation computers are analogous to LEGO building blocks, with each block corresponding to a microcomputer and a group of blocks working together as a computer system. These computers will represent a unification of currently separate areas of research into parallel processing and into VLSI processors. Parallel processing based on data driven and demand driven computer organisations are under investigation in well over thirty laboratories in the United States, Japan and Europe. Basically, in data driven (e.g. data flow) computers the availability of operands triggers the execution of the operation to be performed on them; whereas in demand driven (e.g. reduction) computers the requirement for a result triggers the operation that will generate the value. VLSI processors exploit very large scale integration and the new simplified chip design methodology pioneered in US universities by Mead and Conway, allowing users to design their own chips. These novel VLSI processors are implementable by simple replicated cells and use extensive pipelining and multiprocessing to achieve a high performance. Examples range from a powerful image processing device configured from identical special-purpose chips, to a large parallel computer built from replicated general-purpose microcomputers. This paper outlines these topics contributing to fifth generation computers, and speculates on their effect on computing.

  7. Hot ice computer

    NASA Astrophysics Data System (ADS)

    Adamatzky, Andrew

    2009-12-01

    We experimentally demonstrate that supersaturated solution of sodium acetate, commonly called ‘hot ice’, is a massively-parallel unconventional computer. In the hot ice computer data are represented by a spatial configuration of crystallization induction sites and physical obstacles immersed in the experimental container. Computation is implemented by propagation and interaction of growing crystals initiated at the data-sites. We discuss experimental prototypes of hot ice processors which compute planar Voronoi diagram, shortest collision-free paths and implement AND and OR logical gates.

  8. Optimization of computations

    SciTech Connect

    Mikhalevich, V.S.; Sergienko, I.V.; Zadiraka, V.K.; Babich, M.D.

    1994-11-01

    This article examines some topics of optimization of computations, which have been discussed at 25 seminar-schools and symposia organized by the V.M. Glushkov Institute of Cybernetics of the Ukrainian Academy of Sciences since 1969. We describe the main directions in the development of computational mathematics and present some of our own results that reflect a certain design conception of speed-optimal and accuracy-optimal (or nearly optimal) algorithms for various classes of problems, as well as a certain approach to optimization of computer computations.

  9. Computationally efficient multibody simulations

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Jayant; Kumar, Manoj

    1994-01-01

    Computationally efficient approaches to the solution of the dynamics of multibody systems are presented in this work. The computational efficiency is derived from both the algorithmic and implementational standpoint. Order(n) approaches provide a new formulation of the equations of motion eliminating the assembly and numerical inversion of a system mass matrix as required by conventional algorithms. Computational efficiency is also gained in the implementation phase by the symbolic processing and parallel implementation of these equations. Comparison of this algorithm with existing multibody simulation programs illustrates the increased computational efficiency.

  10. Computing by Observing Changes

    NASA Astrophysics Data System (ADS)

    Cavaliere, Matteo; Leupold, Peter

    Computing by Observing is a paradigm for the implementation of models of Natural Computing. It was inspired by the setup of experiments in biochemistry. One central feature is an observer that translates the evolution of an underlying observed system into sequences over a finite alphabet. We take a step toward more realistic observers by allowing them to notice only an occurring change in the observed system rather than to read the system's entire configuration. Compared to previous implementations of the Computing by Observing paradigm, this decreases the computational power; but with relatively simple systems we still obtain the language class generated by matrix grammars.

  11. Cloud Computing: An Overview

    NASA Astrophysics Data System (ADS)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of user and elastic service with the minimum resource, the Internet service provider invented the cloud computing. within a few years, emerging cloud computing has became the hottest technology. From the publication of core papers by Google since 2003 to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, the cloud computing has been evolved from internal IT system to public service, from cost-saving tools to revenue generator, and from ISP to telecom. This paper introduces the concept, history, pros and cons of cloud computing as well as the value chain and standardization effort.

  12. The Social Computer

    NASA Astrophysics Data System (ADS)

    Serugendo, Giovanna Di Marzo; Risoldi, Matteo; Solemayni, Mohammad

    The following sections are included: * Introduction * Problem and Research Questions * State of the Art * TSC Structure and Computational Awareness * Methodology and Research Directions * Case Study: Democracy * Conclusions

  13. Highly parallel computation

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.; Tichy, Walter F.

    1990-01-01

    Among the highly parallel computing architectures required for advanced scientific computation, those designated 'MIMD' and 'SIMD' have yielded the best results to date. The present development status evaluation of such architectures shown neither to have attained a decisive advantage in most near-homogeneous problems' treatment; in the cases of problems involving numerous dissimilar parts, however, such currently speculative architectures as 'neural networks' or 'data flow' machines may be entailed. Data flow computers are the most practical form of MIMD fine-grained parallel computers yet conceived; they automatically solve the problem of assigning virtual processors to the real processors in the machine.

  14. Computational approaches to vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  15. Computer Confrontation: Suppes and Albrecht

    ERIC Educational Resources Information Center

    Suppes, Patrick; Albrecht, Bob

    1973-01-01

    Two well-known computer specialists argue about the function of computers in schools. Patrick Suppes believes mastery of basic skills is the prime function of computers. Bob Albrecht believes computers should be learning devices and not drill masters. (DS)

  16. Introduction: The Computer After Me

    NASA Astrophysics Data System (ADS)

    Pitt, Jeremy

    The following sections are included: * Introduction * Computer Awareness in Science Fiction * Computer Awareness and Self-Awareness * How many senses does a computer have? * Does a computer know that it is a computer? * Does metal know when it is weakening? * Why Does Computer Awareness Matter? * Chapter Overviews * Summary and Conclusions

  17. Advances in Computational Astrophysics

    SciTech Connect

    Calder, Alan C.; Kouzes, Richard T.

    2009-03-01

    I was invited to be the guest editor for a special issue of Computing in Science and Engineering along with a colleague from Stony Brook. This is the guest editors' introduction to a special issue of Computing in Science and Engineering. Alan and I have written this introduction and have been the editors for the 4 papers to be published in this special edition.

  18. Flexible Animation Computer Program

    NASA Technical Reports Server (NTRS)

    Stallcup, Scott S.

    1990-01-01

    FLEXAN (Flexible Animation), computer program animating structural dynamics on Evans and Sutherland PS300-series graphics workstation with VAX/VMS host computer. Typical application is animation of spacecraft undergoing structural stresses caused by thermal and vibrational effects. Displays distortions in shape of spacecraft. Program displays single natural mode of vibration, mode history, or any general deformation of flexible structure. Written in FORTRAN 77.

  19. Computer Series, 97.

    ERIC Educational Resources Information Center

    Kay, Jack G.; And Others

    1988-01-01

    Describes two applications of the microcomputer for laboratory exercises. Explores radioactive decay using the Batemen equations on a Macintosh computer. Provides examples and screen dumps of data. Investigates polymer configurations using a Monte Carlo simulation on an IBM personal computer. (MVL)

  20. A Home Computer Primer.

    ERIC Educational Resources Information Center

    Stone, Antonia

    1982-01-01

    Provides general information on currently available microcomputers, computer programs (software), hardware requirements, software sources, costs, computer games, and programing. Includes a list of popular microcomputers, providing price category, model, list price, software (cassette, tape, disk), monitor specifications, amount of random access…

  1. Computer Augmented Video Education.

    ERIC Educational Resources Information Center

    Sousa, M. B.

    1979-01-01

    Describes project CAVE (Computer Augmented Video Education), an ongoing effort at the U.S. Naval Academy to present lecture material on videocassette tape, reinforced by drill and practice through an interactive computer system supported by a 12 channel closed circuit television distribution and production facility. (RAO)

  2. Computers, Networks and Education.

    ERIC Educational Resources Information Center

    Kay, Alan C.

    1991-01-01

    Discussed is how globally networked, easy-to-use computers can enhance learning only within an educational environment that encourages students to question "facts" and seek challenges. The strengths and weaknesses of computers used as amplifiers for learning are described. (KR)

  3. ELECTRONIC DIGITAL COMPUTER

    DOEpatents

    Stone, J.J. Jr.; Bettis, E.S.; Mann, E.R.

    1957-10-01

    The electronic digital computer is designed to solve systems involving a plurality of simultaneous linear equations. The computer can solve a system which converges rather rapidly when using Von Seidel's method of approximation and performs the summations required for solving for the unknown terms by a method of successive approximations.

  4. Who Owns Computer Software?

    ERIC Educational Resources Information Center

    Branscomb, Anne Wells

    1995-01-01

    Discusses the protection of intellectual property as it applies to computer software and its impact on private enterprise and the public good. Highlights include the role of patents, copyrights, and trade secrets; some court cases; and recommendations for alternatives to the existing legal framework for protecting computer software. (KRN)

  5. Quantum Analog Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1998-01-01

    Quantum analog computing is based upon similarity between mathematical formalism of quantum mechanics and phenomena to be computed. It exploits a dynamical convergence of several competing phenomena to an attractor which can represent an externum of a function, an image, a solution to a system of ODE, or a stochastic process.

  6. Computer Produced Media Guides.

    ERIC Educational Resources Information Center

    Jeffcott, Janet B.

    To increase access to the media collection at the Madison Area Technical College (Wisconsin) a computer-produced key work index was created using an International Business Machine (IBM) 360 model 40 computer and a duplicating facility with offset capability. A standard 80 column IBM card was used reserving columns 1-9 for the media item number,…

  7. COMPUTER MODELS/EPANET

    EPA Science Inventory

    Pipe network flow analysis was among the first civil engineering applications programmed for solution on the early commercial mainframe computers in the 1960s. Since that time, advancements in analytical techniques and computing power have enabled us to solve systems with tens o...

  8. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades.Computer systems Co-op, Tim Weatherford, performing computer graphics verification. Part of Co-op brochure.

  9. Chippy's Computer Words.

    ERIC Educational Resources Information Center

    Willing, Kathlene R.; Girard, Suzanne

    Intended for young children just becoming familiar with computers, this naming book introduces and reinforces new computer vocabulary and concepts. The 20 words are presented alphabetically, along with illustrations, providing room for different activities in which children can match and name the pictures and words. The 20 vocabulary items are…

  10. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell, Ed.

    1988-01-01

    Describes three situations in which computer software was used in a chemistry laboratory. Discusses interfacing voltage output instruments with Apple II computers and using spreadsheet programs to simulate gas chromatography and analysis of kinetic data. Includes information concerning procedures, hardware, and software used in each situation. (CW)

  11. Connecting Kids and Computers

    ERIC Educational Resources Information Center

    Giles, Rebecca McMahon

    2006-01-01

    Exposure to cell phones, DVD players, video games, computers, digital cameras, and iPods has made today's young people more technologically advanced than those of any previous generation. As a result, parents are now concerned that their children are spending too much time in front of the computer. In this article, the author focuses her…

  12. Computed Tomography (CT) -- Sinuses

    MedlinePlus

    ... More Info Images/Videos About Us News Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Sinuses Computed tomography (CT) of the sinuses uses special x-ray equipment to evaluate the paranasal sinus cavities – hollow, air-filled spaces within the bones of the face surrounding the ...

  13. Computers in the Classroom.

    ERIC Educational Resources Information Center

    Merton, Andrew

    1983-01-01

    Major computer manufacturers are donating microcomputers to school systems. Discusses whether or not these computers in the classroom will help spur a "back to basics" movement or prove to be an expensive and largely tangential sideshow. Considers software, software evaluation, skill transfer, and other topics. (JN)

  14. Computer Programmer/Analyst.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 25 subjects appropriate for use in a competency list for the occupation of computer programmer/analyst, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 25 units are as…

  15. Computer Software Reviews.

    ERIC Educational Resources Information Center

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  16. Computational physics: a perspective.

    PubMed

    Stoneham, A M

    2002-06-15

    Computing comprises three distinct strands: hardware, software and the ways they are used in real or imagined worlds. Its use in research is more than writing or running code. Having something significant to compute and deploying judgement in what is attempted and achieved are especially challenging. In science or engineering, one must define a central problem in computable form, run such software as is appropriate and, last but by no means least, convince others that the results are both valid and useful. These several strands are highly interdependent. A major scientific development can transform disparate aspects of information and computer technologies. Computers affect the way we do science, as well as changing our personal worlds. Access to information is being transformed, with consequences beyond research or even science. Creativity in research is usually considered uniquely human, with inspiration a central factor. Scientific and technological needs are major forces in innovation, and these include hardware and software opportunities. One can try to define the scientific needs for established technologies (atomic energy, the early semiconductor industry), for rapidly developing technologies (advanced materials, microelectronics) and for emerging technologies (nanotechnology, novel information technologies). Did these needs define new computing, or was science diverted into applications of then-available codes? Regarding credibility, why is it that engineers accept computer realizations when designing engineered structures, whereas predictive modelling of materials has yet to achieve industrial confidence outside very special cases? The tensions between computing and traditional science are complex, unpredictable and potentially powerful.

  17. Programming the social computer.

    PubMed

    Robertson, David; Giunchiglia, Fausto

    2013-03-28

    The aim of 'programming the global computer' was identified by Milner and others as one of the grand challenges of computing research. At the time this phrase was coined, it was natural to assume that this objective might be achieved primarily through extending programming and specification languages. The Internet, however, has brought with it a different style of computation that (although harnessing variants of traditional programming languages) operates in a style different to those with which we are familiar. The 'computer' on which we are running these computations is a social computer in the sense that many of the elementary functions of the computations it runs are performed by humans, and successful execution of a program often depends on properties of the human society over which the program operates. These sorts of programs are not programmed in a traditional way and may have to be understood in a way that is different from the traditional view of programming. This shift in perspective raises new challenges for the science of the Web and for computing in general.

  18. Chemistry by Computer.

    ERIC Educational Resources Information Center

    Garmon, Linda

    1981-01-01

    Describes the features of various computer chemistry programs. Utilization of computer graphics, color, digital imaging, and other innovations are discussed in programs including those which aid in the identification of unknowns, predict whether chemical reactions are feasible, and predict the biological activity of xenobiotic compounds. (CS)

  19. Computers and Personal Privacy.

    ERIC Educational Resources Information Center

    Ware, Willis H.

    Privacy is an issue that arises from the intersection of a demand for improved recordkeeping processes, and computing technology as the response to the demand. Recordkeeping in the United States centers on information about people. Modern day computing technology has the ability to maintain, store, and retrieve records quickly; however, this…

  20. Computer controlled antenna system

    NASA Technical Reports Server (NTRS)

    Raumann, N. A.

    1972-01-01

    The application of small computers using digital techniques for operating the servo and control system of large antennas is discussed. The advantages of the system are described. The techniques were evaluated with a forty foot antenna and the Sigma V computer. Programs have been completed which drive the antenna directly without the need for a servo amplifier, antenna position programmer or a scan generator.

  1. Computational chemistry at Janssen

    NASA Astrophysics Data System (ADS)

    van Vlijmen, Herman; Desjarlais, Renee L.; Mirzadegan, Tara

    2016-12-01

    Computer-aided drug discovery activities at Janssen are carried out by scientists in the Computational Chemistry group of the Discovery Sciences organization. This perspective gives an overview of the organizational and operational structure, the science, internal and external collaborations, and the impact of the group on Drug Discovery at Janssen.

  2. Computer Literacy Project 1983.

    ERIC Educational Resources Information Center

    El Dorado County Office of Education, Placerville, CA.

    A K-12 computer literacy course of study is presented. Four basic parts are included: (1) a reference index which organizes 37 computer literacy topics into seven major categories; (2) a master index which presents goals for each topic by grade level and lists a reference number for each goal; (3) a scope and sequence organized by grade level…

  3. Computer Series, 78.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1986-01-01

    Presents six brief articles dealing with the use of computers in teaching various topics in chemistry. Describes hardware and software applications which relate to protein graphics, computer simulated metabolism, interfaces between microcomputers and measurement devices, courseware available for spectrophotometers, and the calculation of elemental…

  4. Extreme Scale Computing Studies

    DTIC Science & Technology

    2010-12-01

    SUPPLEMENTARY NOTES DISTAR case 17642; Clearance Date: 18 July 2011. This report contains color. 14 . ABSTRACT Four studies were conducted to...Embedded Computing Applications................................................. 14 4.2.3 Low Power Computational Algorithms... 14 4.2.4 High Performance Libraries for Advanced Graphics Processing Units ................ 15 4.2.5 Metrics

  5. Computational chemistry at Janssen.

    PubMed

    van Vlijmen, Herman; Desjarlais, Renee L; Mirzadegan, Tara

    2016-12-19

    Computer-aided drug discovery activities at Janssen are carried out by scientists in the Computational Chemistry group of the Discovery Sciences organization. This perspective gives an overview of the organizational and operational structure, the science, internal and external collaborations, and the impact of the group on Drug Discovery at Janssen.

  6. UltraScale Computing

    NASA Astrophysics Data System (ADS)

    Maynard, , Jr.

    1997-08-01

    The Defense Advanced Research Projects Agency Information Technology Office (DARPA/ITO) supports research in technology for defense-critical applications. Defense Applications are always insatiable consumers of computing. Futuristic applications such as automated image interpretation/whole vehicle radar-cross-section/real-time prototyping/faster-than-real-time simulation will require computing capabilities orders-of-magnitude beyond the best performance that can be projected from contemporary scalable parallel processors. To reach beyond the silicon digital paradigm, DARPA has initiated a program in UltraScale Computing to explore the domain of innovative computational models, methods, and mechanisms. The objective is to encourage a complete re-thinking of computing. Novel architectures, program synthesis, and execution environments are needed as well as alternative underlying physical mechanisms including molecular, biological, optical and quantum mechanical processes. Development of these advanced computing technologies will offer spectacular performance and cost improvements beyond the threshold of traditional materials and processes. The talk will focus on novel approaches for employing vastly more computational units than shrinking transistors will enable and exploration of the biological options for solving computationally difficult problems.

  7. Parallel and Distributed Computing.

    DTIC Science & Technology

    1986-12-12

    program was devoted to parallel and distributed computing . Support for this part of the program was obtained from the present Army contract and a...Umesh Vazirani. A workshop on parallel and distributed computing was held from May 19 to May 23, 1986 and drew 141 participants. Keywords: Mathematical programming; Protocols; Randomized algorithms. (Author)

  8. Computers and School Reform.

    ERIC Educational Resources Information Center

    McDaniel, Ernest; And Others

    1993-01-01

    Discusses ways in which computers can be used to help school reform by shifting the emphasis from information transmission to information processing. Highlights include creating learning communities that extend beyond the classroom; educationally oriented computer networks; Professional Development Schools for curriculum development; and new…

  9. Learning with Ubiquitous Computing

    ERIC Educational Resources Information Center

    Rosenheck, Louisa

    2008-01-01

    If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects done involving educational "ubicomp." Finally it explores how ubicomp may and may not change education in both formal and informal settings and…

  10. Computer Virus Protection

    ERIC Educational Resources Information Center

    Rajala, Judith B.

    2004-01-01

    A computer virus is a program--a piece of executable code--that has the unique ability to replicate. Like biological viruses, computer viruses can spread quickly and are often difficult to eradicate. They can attach themselves to just about any type of file, and are spread by replicating and being sent from one individual to another. Simply having…

  11. Computational Modeling of Tires

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Tanner, John A. (Compiler)

    1995-01-01

    This document contains presentations and discussions from the joint UVA/NASA Workshop on Computational Modeling of Tires. The workshop attendees represented NASA, the Army and Air force, tire companies, commercial software developers, and academia. The workshop objectives were to assess the state of technology in the computational modeling of tires and to provide guidelines for future research.

  12. Computer Room Water Protection.

    ERIC Educational Resources Information Center

    Price, Bennett J.

    1990-01-01

    Addresses the protection of computer rooms from water. Sources of water and potentially vulnerable areas in computer rooms are described. Water detection is then discussed, and several detection systems are detailed. Prices and manufacturers' telephone numbers for some of the systems are included. Water cleanup is also briefly considered. (MES)

  13. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC)(registered TradeMark),located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  14. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.

  15. Computers and Classroom Culture.

    ERIC Educational Resources Information Center

    Schofield, Janet Ward

    This book explores the meaning of computer technology in schools. The book is based on data gathered from a two-year observation of more than 30 different classrooms in an urban high school: geometry classes in which students used artificially intelligent tutors; business classes in which students learned word processing; and computer science…

  16. Preventing Computer Glitches

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2009-01-01

    It has been said that a computer lets a person make more mistakes faster than any other invention in human history, with the possible exceptions of handguns and tequila. Computers also make mistakes on their own, whether they're glitches, conflicts, bugs, crashes, or failures. Avoiding glitches is considerably less frustrating than trying to fix…

  17. Computer Aided Art Major.

    ERIC Educational Resources Information Center

    Gibson, Jim

    The Computer Aided Art program offered at Northern State State University (Aberdeen, South Dakota), is coordinated with the traditional art major. The program is designed to familiarize students with a wide range of art-related computer hardware and software and their applications and to prepare students for problem-solving with unfamiliar…

  18. Metacomputing on Commodity Computers

    DTIC Science & Technology

    1999-05-01

    Journal on Future Generation Computer Systems , 1999. [17] L. Beca, G. Cheng, G. Fox, T. Jurga, K. Olszewski, M. Podgorny, P. Sokol- wski, and K...Katramatos, J. Karpovich, and A. Grimshaw. Resource man- agement in Legion. International Journal on Future Generation Computer Systems (to appear

  19. Classroom Computer Network.

    ERIC Educational Resources Information Center

    Lent, John

    1984-01-01

    This article describes a computer network system that connects several microcomputers to a single disk drive and one copy of software. Many schools are switching to networks as a cheaper and more efficient means of computer instruction. Teachers may be faced with copywriting problems when reproducing programs. (DF)

  20. Computers as Cognitive Tools.

    ERIC Educational Resources Information Center

    Lajoie, Susanne P., Ed.; Derry, Sharon J., Ed.

    This book provides exemplars of the types of computer-based learning environments represented by the theoretical camps within the field and the practical applications of the theories. The contributors discuss a variety of computer applications to learning, ranging from school-related topics such as geometry, algebra, biology, history, physics, and…

  1. Computations in Plasma Physics.

    ERIC Educational Resources Information Center

    Cohen, Bruce I.; Killeen, John

    1983-01-01

    Discusses contributions of computers to research in magnetic and inertial-confinement fusion, charged-particle-beam propogation, and space sciences. Considers use in design/control of laboratory and spacecraft experiments and in data acquisition; and reviews major plasma computational methods and some of the important physics problems they…

  2. Campus Computing Strategies.

    ERIC Educational Resources Information Center

    McCredie, John W., Ed.

    Ten case studies that describe the planning process and strategies employed by colleges who use computing and communication systems are presented, based on a 1981-1982 study conducted by EDUCOM. An introduction by John W. McCredie summarizes several current and future effects of the rapid spread and integration of computing and communication…

  3. Logic via Computer Programming.

    ERIC Educational Resources Information Center

    Wieschenberg, Agnes A.

    This paper proposed the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise and it may attract students who would otherwise stop…

  4. Profiling Computing Coordinators.

    ERIC Educational Resources Information Center

    Edwards, Sigrid; Morton, Allan

    The people responsible for managing school computing resources in Australia have become known as Computing Coordinators. To date there has been no large systematic study of the role, responsibilities and characteristics of this position. This paper represents a first attempt to provide information on the functions and attributes of the Computing…

  5. Computer Yearbook 72.

    ERIC Educational Resources Information Center

    1972

    Recent and expected developments in the computer industry are discussed in this 628-page yearbook, successor to "The Punched Card Annual." The first section of the report is an overview of current computer hardware and software and includes articles about future applications of mainframes, an analysis of the software industry, and a summary of the…

  6. The Economics of Computers.

    ERIC Educational Resources Information Center

    Sharpe, William F.

    A microeconomic theory is applied in this book to computer services and costs and for the benefit of those who are decision-makers in the selection, financing, and use of computers. Subtopics of the theory discussed include value and demand; revenue and profits; time and risk; and costs, inputs, and outputs. Application of the theory is explained…

  7. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  8. Computer Anxiety and Instruction.

    ERIC Educational Resources Information Center

    Baumgarte, Roger

    While the computer is commonly viewed as a tool for simplifying and enriching lives, many individuals react to this technology with feelings of anxiety, paranoia, and alienation. These reactions may have potentially serious career and educational consequences. Fear of computers reflects a generalized fear of current technology and is most…

  9. Computer Networking for Educators.

    ERIC Educational Resources Information Center

    McCain, Ted D. E.; Ekelund, Mark

    This book is intended to introduce the basic concepts of connecting computers together and to equip individuals with the technical background necessary to begin constructing small networks. For those already experienced with creating and maintaining computer networks, the book can help in considering the creation of a schoolwide network. The book…

  10. Computer Series, 112.

    ERIC Educational Resources Information Center

    Birk, James P., Ed.

    1990-01-01

    Four microcomputer applications are presented including: "Computer Simulated Process of 'Lead Optimization': A Student-Interactive Program,""A PROLOG Program for the Generation of Molecular Formulas,""Determination of Inflection Points from Experimental Data," and "LAOCOON PC: NMR Simulation on a Personal Computer." Software, availability,…

  11. Computers on Wheels.

    ERIC Educational Resources Information Center

    Rosemead Elementary School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: How does a school provide the computer learning experiences for students given the paucity of available funding for hardware, software, and staffing? Here is what one school, Emma W. Shuey in Rosemead, did after exploratory research on computers by a committee of teachers and administrators. The…

  12. Nasal computed tomography.

    PubMed

    Kuehn, Ned F

    2006-05-01

    Chronic nasal disease is often a challenge to diagnose. Computed tomography greatly enhances the ability to diagnose chronic nasal disease in dogs and cats. Nasal computed tomography provides detailed information regarding the extent of disease, accurate discrimination of neoplastic versus nonneoplastic diseases, and identification of areas of the nose to examine rhinoscopically and suspicious regions to target for biopsy.

  13. Scheduling THE Computer.

    ERIC Educational Resources Information Center

    McKay, Martin D.

    1998-01-01

    Focuses on how to schedule the use of a single computer so that all students are represented and given equal access. Suggests that a computer management team be selected from within the class; discusses the teacher's role and student role definition and responsibility assignments. (AEF)

  14. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1989-01-01

    Discussed are some uses of computers in chemistry classrooms. Described are: (1) interactive chromatographic analysis software; (2) computer interface for a digital frequency-period-counter-ratio meter and analog interface based on a voltage-to-frequency converter; and (3) use of spectrometer/microcomputer arrangement for teaching atomic theory.…

  15. Onward to Petaflops Computing

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    With programs such as the US High Performance Computing and Communications Program (HPCCP), the attention of scientists and engineers worldwide has been focused on the potential of very high performance scientific computing, namely systems that are hundreds or thousands of times more powerful than those typically available in desktop systems at any given point in time. Extending the frontiers of computing in this manner has resulted in remarkable advances, both in computing technology itself and also in the various scientific and engineering disciplines that utilize these systems. Within the month or two, a sustained rate of 1 Tflop/s (also written 1 teraflops, or 10(exp 12) floating-point operations per second) is likely to be achieved by the 'ASCI Red' system at Sandia National Laboratory in New Mexico. With this objective in sight, it is reasonable to ask what lies ahead for high-end computing.

  16. Indirection and computer security.

    SciTech Connect

    Berg, Michael J.

    2011-09-01

    The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.

  17. (Computer) Vision without Sight

    PubMed Central

    Manduchi, Roberto; Coughlan, James

    2012-01-01

    Computer vision holds great promise for helping persons with blindness or visual impairments (VI) to interpret and explore the visual world. To this end, it is worthwhile to assess the situation critically by understanding the actual needs of the VI population and which of these needs might be addressed by computer vision. This article reviews the types of assistive technology application areas that have already been developed for VI, and the possible roles that computer vision can play in facilitating these applications. We discuss how appropriate user interfaces are designed to translate the output of computer vision algorithms into information that the user can quickly and safely act upon, and how system-level characteristics affect the overall usability of an assistive technology. Finally, we conclude by highlighting a few novel and intriguing areas of application of computer vision to assistive technology. PMID:22815563

  18. Pen-based computers: Computers without keys

    NASA Technical Reports Server (NTRS)

    Conklin, Cheryl L.

    1994-01-01

    The National Space Transportation System (NSTS) is comprised of many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data is required. This paper describes how new technology, referred to as Pen-Based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-Based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end results is the elimination of Shuttle processing delays associated with data deficiencies.

  19. Computer Security for the Computer Systems Manager.

    DTIC Science & Technology

    1982-12-01

    concern of computer security is the auditing of the system in both the normal and standby nodes of operation (Ref. 2: p. 21. Risk manaqement Is the...planning and auditing will be treated in Chapter six. B. COST EFFECTIVENESS DETERMIN&TION As d’cussed before, the third part of risk analysis is the...to physical security and depend upon some of the following considerations: * physical location * availability of fire and law enforcement services

  20. Computational electromagnetics and parallel dense matrix computations

    SciTech Connect

    Forsman, K.; Kettunen, L.; Gropp, W.; Levine, D.

    1995-06-01

    We present computational results using CORAL, a parallel, three-dimensional, nonlinear magnetostatic code based on a volume integral equation formulation. A key feature of CORAL is the ability to solve, in parallel, the large, dense systems of linear equations that are inherent in the use of integral equation methods. Using the Chameleon and PSLES libraries ensures portability and access to the latest linear algebra solution technology.

  1. Computing Algebraic Immunity by Reconfigurable Computer

    DTIC Science & Technology

    2012-09-01

    the linear system, then the amount of computation required is O (( n d )ω) , where ω is the well–known “exponent of Gaussian reduction” (ω = 3 ( Gauss ...x2, x3) = x1x2 ⊕ x1x3 ⊕ x2x3. The top half of Table 2 shows the minterm canonical form of f̄ . Here, the first (leftmost) column represents all

  2. Computational electromagnetics and parallel dense matrix computations

    SciTech Connect

    Forsman, K.; Kettunen, L.; Gropp, W.

    1995-12-01

    We present computational results using CORAL, a parallel, three-dimensional, nonlinear magnetostatic code based on a volume integral equation formulation. A key feature of CORAL is the ability to solve, in parallel, the large, dense systems of linear equations that are inherent in the use of integral equation methods. Using the Chameleon and PSLES libraries ensures portability and access to the latest linear algebra solution technology.

  3. Community Cloud Computing

    NASA Astrophysics Data System (ADS)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combing the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  4. Coping with distributed computing

    SciTech Connect

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given.

  5. Computing with synthetic protocells.

    PubMed

    Courbet, Alexis; Molina, Franck; Amar, Patrick

    2015-09-01

    In this article we present a new kind of computing device that uses biochemical reactions networks as building blocks to implement logic gates. The architecture of a computing machine relies on these generic and composable building blocks, computation units, that can be used in multiple instances to perform complex boolean functions. Standard logical operations are implemented by biochemical networks, encapsulated and insulated within synthetic vesicles called protocells. These protocells are capable of exchanging energy and information with each other through transmembrane electron transfer. In the paradigm of computation we propose, protoputing, a machine can solve only one problem and therefore has to be built specifically. Thus, the programming phase in the standard computing paradigm is represented in our approach by the set of assembly instructions (specific attachments) that directs the wiring of the protocells that constitute the machine itself. To demonstrate the computing power of protocellular machines, we apply it to solve a NP-complete problem, known to be very demanding in computing power, the 3-SAT problem. We show how to program the assembly of a machine that can verify the satisfiability of a given boolean formula. Then we show how to use the massive parallelism of these machines to verify in less than 20 min all the valuations of the input variables and output a fluorescent signal when the formula is satisfiable or no signal at all otherwise.

  6. Computers as tools

    SciTech Connect

    Eriksson, I.V.

    1994-12-31

    The following message was recently posted on a bulletin board and clearly shows the relevance of the conference theme: {open_quotes}The computer and digital networks seem poised to change whole regions of human activity -- how we record knowledge, communicate, learn, work, understand ourselves and the world. What`s the best framework for understanding this digitalization, or virtualization, of seemingly everything? ... Clearly, symbolic tools like the alphabet, book, and mechanical clock have changed some of our most fundamental notions -- self, identity, mind, nature, time, space. Can we say what the computer, a purely symbolic {open_quotes}machine,{close_quotes} is doing to our thinking in these areas? Or is it too early to say, given how much more powerful and less expensive the technology seems destinated to become in the next few decades?{close_quotes} (Verity, 1994) Computers certainly affect our lives and way of thinking but what have computers to do with ethics? A narrow approach would be that on the one hand people can and do abuse computer systems and on the other hand people can be abused by them. Weli known examples of the former are computer comes such as the theft of money, services and information. The latter can be exemplified by violation of privacy, health hazards and computer monitoring. Broadening the concept from computers to information systems (ISs) and information technology (IT) gives a wider perspective. Computers are just the hardware part of information systems which also include software, people and data. Information technology is the concept preferred today. It extends to communication, which is an essential part of information processing. Now let us repeat the question: What has IT to do with ethics? Verity mentioned changes in {open_quotes}how we record knowledge, communicate, learn, work, understand ourselves and the world{close_quotes}.

  7. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nuclei acid complexes, incorporation of structural information into genomic sequence analysis methods and calculations of shock-induced formation of polycylic aromatic hydrocarbon compounds.

  8. Computational Tractability - Beyond Turing?

    NASA Astrophysics Data System (ADS)

    Marcer, Peter; Rowlands, Peter

    A fundamental problem in the theory of computing concerns whether descriptions of systems at all times remain tractable, that is whether the complexity that inevitably results can be reduced to a polynomial form (P) or whether some problems lead to a non-polynomial (NP) exponential growth in complexity. Here, we propose that the universal computational rewrite system that can be shown to be responsible ultimately for the development of mathematics, physics, chemistry, biology and even human consciousness, is so structured that Nature will always be structured as P at any scale and so will be computationally tractable.

  9. World's Most Powerful Computer

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The use of the Cray 2 supercomputer, the fastest computer in the world, at ARC is detailed. The Cray 2 can perform 250 million calculations per second and has 10 times the memory of any other computer. Ames researchers are shown creating computer simulations of aircraft airflow, waterflow around a submarine, and fuel flow inside of the Space Shuttle's engines. The video also details the Cray 2's use in calculating airflow around the Shuttle and its external rockets during liftoff for the first time and in the development of the National Aero Space Plane.

  10. Human Computer Interaction

    NASA Astrophysics Data System (ADS)

    Bhagwani, Akhilesh; Sengar, Chitransh; Talwaniper, Jyotsna; Sharma, Shaan

    2012-08-01

    The paper basically deals with the study of HCI (Human computer interaction) or BCI(Brain-Computer-Interfaces) Technology that can be used for capturing brain signals and translating them into commands that allow humans to control (just by thinking) devices such as computers, robots, rehabilitation technology and virtual reality environments. The HCI is based as a direct communication pathway between the brain and an external device. BCIs are often aimed at assisting, augmenting, or repairing human cognitive or sensory-motor functions.The paper also deals with many advantages of BCI Technology along with some of its applications and some major drawbacks.

  11. Teaching Physics with Computers

    NASA Astrophysics Data System (ADS)

    Botet, R.; Trizac, E.

    2005-09-01

    Computers are now so common in our everyday life that it is difficult to imagine the computer-free scientific life of the years before the 1980s. And yet, in spite of an unquestionable rise, the use of computers in the realm of education is still in its infancy. This is not a problem with students: for the new generation, the pre-computer age seems as far in the past as the the age of the dinosaurs. It may instead be more a question of teacher attitude. Traditional education is based on centuries of polished concepts and equations, while computers require us to think differently about our method of teaching, and to revise the content accordingly. Our brains do not work in terms of numbers, but use abstract and visual concepts; hence, communication between computer and man boomed when computers escaped the world of numbers to reach a visual interface. From this time on, computers have generated new knowledge and, more importantly for teaching, new ways to grasp concepts. Therefore, just as real experiments were the starting point for theory, virtual experiments can be used to understand theoretical concepts. But there are important differences. Some of them are fundamental: a virtual experiment may allow for the exploration of length and time scales together with a level of microscopic complexity not directly accessible to conventional experiments. Others are practical: numerical experiments are completely safe, unlike some dangerous but essential laboratory experiments, and are often less expensive. Finally, some numerical approaches are suited only to teaching, as the concept necessary for the physical problem, or its solution, lies beyond the scope of traditional methods. For all these reasons, computers open physics courses to novel concepts, bringing education and research closer. In addition, and this is not a minor point, they respond naturally to the basic pedagogical needs of interactivity, feedback, and individualization of instruction. This is why one can

  12. Computer aided surface representation

    SciTech Connect

    Barnhill, R.E.

    1990-02-19

    The central research problem of this project is the effective representation, computation, and display of surfaces interpolating to information in three or more dimensions. If the given information is located on another surface, then the problem is to construct a "surface defined on a surface". Sometimes properties of an already defined surface are desired, which is "geometry processing". Visualization of multivariate surfaces is possible by means of contouring higher dimensional surfaces. These problems and more are discussed below. The broad sweep from constructive mathematics through computational algorithms to computer graphics illustrations is utilized in this research. The breadth and depth of this research activity makes this research project unique.

  13. NEMAR plotting computer program

    NASA Technical Reports Server (NTRS)

    Myler, T. R.

    1981-01-01

    A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.
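
    The record layout described above is concrete enough to sketch. Below is a minimal illustration of writing and then randomly accessing fixed-length records of 102 numbers; the little-endian 32-bit float encoding and the file name are assumptions for illustration (the original FORTRAN file layout is not specified here), and Python stands in for the program's FORTRAN.

        import struct

        RECORD_LEN = 102  # numbers per record, matching the plot program's format

        def write_records(path, records):
            # Write fixed-length records of 102 floats each (assumed encoding).
            with open(path, "wb") as f:
                for rec in records:
                    f.write(struct.pack(f"<{RECORD_LEN}f", *rec))

        def read_record(path, index):
            # Random access: seek straight to record `index` and unpack it.
            size = struct.calcsize(f"<{RECORD_LEN}f")
            with open(path, "rb") as f:
                f.seek(index * size)
                return struct.unpack(f"<{RECORD_LEN}f", f.read(size))

        write_records("traj.dat", [[float(i)] * RECORD_LEN for i in range(3)])
        print(read_record("traj.dat", 2)[0])  # -> 2.0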

  14. Computational Science and Innovation

    SciTech Connect

    Dean, David Jarvis

    2011-01-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  15. Computational biology for ageing.

    PubMed

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M

    2011-01-12

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions.

  16. Advanced Computer Typography.

    DTIC Science & Technology

    1981-12-01

    [Scanned report cover; only the bibliographic fields are recoverable:] ADVANCED COMPUTER TYPOGRAPHY, by A. V. Hershey, Naval Postgraduate School, Monterey, California, December 1981. Report NPS012-81-005; final report covering December 1979 - December 1981; approved for public release.

  17. Convergence: Computing and communications

    SciTech Connect

    Catlett, C.

    1996-12-31

    This paper highlights the operations of the National Center for Supercomputing Applications (NCSA). NCSA is developing and implementing a national strategy to create, use, and transfer advanced computing and communication tools and information technologies for science, engineering, education, and business. The primary focus of the presentation is historical and expected growth in computing capacity, personal computer performance, and Internet and World Wide Web sites. Data are presented to show changes over the past 10 to 20 years in these areas. 5 figs., 4 tabs.

  18. Neural computation and the computational theory of cognition.

    PubMed

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism: neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation.

  19. Computer surety: computer system inspection guidance. [Contains glossary

    SciTech Connect

    Not Available

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  20. Impact of Classroom Computer Use on Computer Anxiety.

    ERIC Educational Resources Information Center

    Lambert, Matthew E.; And Others

    Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…

  1. Resurrecting the computer graveyard

    SciTech Connect

    McAdams, C.L.

    1995-02-01

    What eventually happens to all the old monitors, circuit boards, and other peripheral computer equipment when they die or simply become obsolete? Disposal has been made difficult by a US EPA landfill ban on some circuit boards with high lead content, and on other computer parts that contain what some call dangerous amounts of toxic substances. With few other options, most companies simply store the obsolete equipment. Often referred to as computer or "dinosaur graveyards", these increasingly numerous office purgatories can seem like permanent fixtures in the modern workplace. Not to worry, according to a new and expanding branch of the recycling industry. While there are some companies cashing in on the refurbishment and resale of these old computers, a growing number are successfully recycling the component parts--selling the plastics, metal, and circuit boards--and doing it safely.

  2. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Larrabee, C. E., Jr.; And Others

    1988-01-01

    Describes the use of computer spreadsheet programs in physical chemistry classrooms. Stresses the application of bar graphs to fundamental quantum and statistical mechanics. Lists the advantages of the use of spreadsheets and gives examples of possible uses. (ML)

  3. Computers boost structural technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Venneri, Samuel L.

    1989-01-01

    Derived from matrix methods of structural analysis and finite element methods developed over the last three decades, computational structures technology (CST) blends computer science, numerical analysis, and approximation theory into structural analysis and synthesis. Recent significant advances in CST include stochastic-based modeling, strategies for performing large-scale structural calculations on new computing systems, and the integration of CST with other disciplinary modules for multidisciplinary analysis and design. New methodologies have been developed at NASA for integrated fluid-thermal structural analysis and integrated aerodynamic-structure-control design. The need for multiple views of data for different modules also led to the development of a number of sophisticated data-base management systems. For CST to play a role in the future development of structures technology and in the multidisciplinary design of future flight vehicles, major advances and computational tools are needed in a number of key areas.

  4. Quantum computing: towards reality

    NASA Astrophysics Data System (ADS)

    Trabesinger, Andreas

    2017-03-01

    The concept of computers that harness the laws of quantum mechanics has transformed our thinking about how information can be processed. Now the environment exists to make prototype devices a reality.

  5. Computer Series, 107.

    ERIC Educational Resources Information Center

    Birk, James P., Ed.

    1989-01-01

    Presented are five examples of the application of computer technology to undergraduate chemistry instruction. Topics include: an interface for kinetic experimentation; theoretical calculations; automation of chemical techniques; searching the chemical literature; and Huckel molecular orbitals. (CW)

  6. Cars, Computers, and Curriculum.

    ERIC Educational Resources Information Center

    Suhor, Charles

    1983-01-01

    After comparing the effect of the automobile on society and education in the early 1900's, the author states that present day arguments for drastic educational change because of the power of the computer are premature. (MLF)

  7. Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Broadbent, Brooke

    1990-01-01

    Provides insight into how computers can be used in union education. States that they will never replace an effective classroom environment where participants' questions are answered by instructors, but can support existing systems. (JOW)

  8. Computational drug discovery

    PubMed Central

    Ou-Yang, Si-sheng; Lu, Jun-yan; Kong, Xiang-qian; Liang, Zhong-jie; Luo, Cheng; Jiang, Hualiang

    2012-01-01

    Computational drug discovery is an effective strategy for accelerating and economizing the drug discovery and development process. Because of the dramatic increase in the availability of biological macromolecule and small-molecule information, the applicability of computational drug discovery has been extended and broadly applied to nearly every stage in the drug discovery and development workflow, including target identification and validation, lead discovery and optimization, and preclinical tests. Over the past decades, computational drug discovery methods such as molecular docking, pharmacophore modeling and mapping, de novo design, molecular similarity calculation and sequence-based virtual screening have been greatly improved. In this review, we present an overview of these important computational methods, platforms and successful applications in this field. PMID:22922346

  9. Downloadable Computational Toxicology Data

    EPA Pesticide Factsheets

    EPA’s computational toxicology research generates data on the potential harm, or hazard, of a chemical, the degree of exposure to chemicals, and their unique chemical characteristics. These data are publicly available here.

  10. Computer Center: Software Review.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  11. Cloud computing security.

    SciTech Connect

    Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.

    2010-10-01

    Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to address the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics, and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.

  12. Brain-Computer Symbiosis

    PubMed Central

    Schalk, Gerwin

    2009-01-01

    The theoretical groundwork of the 1930’s and 1940’s and the technical advance of computers in the following decades provided the basis for dramatic increases in human efficiency. While computers continue to evolve, and we can still expect increasing benefits from their use, the interface between humans and computers has begun to present a serious impediment to full realization of the potential payoff. This article is about the theoretical and practical possibility that direct communication between the brain and the computer can be used to overcome this impediment by improving or augmenting conventional forms of human communication. It is about the opportunity that the limitations of our body’s input and output capacities can be overcome using direct interaction with the brain, and it discusses the assumptions, possible limitations, and implications of a technology that I anticipate will be a major source of pervasive changes in the coming decades. PMID:18310804

  13. Computers in the Classroom.

    ERIC Educational Resources Information Center

    Sigg, S. F.

    1986-01-01

    Some ways computers are used in classrooms today and their potential as a tool to enhance instruction are explored. Considered are graphics, lesson planning, test preparation, drill and practice, remedial work, demonstrations, simulations, and introduction to programming. (MNS)

  14. Computer Aided Creativity.

    ERIC Educational Resources Information Center

    Proctor, Tony

    1988-01-01

    Explores the conceptual components of a computer program designed to enhance creative thinking and reviews software that aims to stimulate creative thinking. Discusses BRAIN and ORACLE, programs intended to aid in creative problem solving. (JOW)

  15. Computational Aeroacoustics: An Overview

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    2003-01-01

    An overview of recent advances in computational aeroacoustics (CAA) is presented. CAA algorithms must be neither dispersive nor dissipative: they should propagate waves supported by the Euler equations with the correct group velocities. Computation domains are inevitably finite in size. To avoid the reflection of acoustic and other outgoing waves at the boundaries of the computation domain, special boundary conditions must be imposed in the boundary region. These boundary conditions either absorb all the outgoing waves without reflection or allow the waves to exit smoothly. High-order schemes invariably support spurious short waves. These spurious waves tend to pollute the numerical solution, so they must be selectively damped or filtered out. All these issues and the relevant computation methods are briefly reviewed. Jet screech tones are known to have caused structural fatigue in military combat aircraft. Numerical simulation of the jet screech phenomenon is presented as an example of a successful application of CAA.
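
    A minimal sketch of the kind of selective damping described above: a standard three-point low-pass filter (a textbook form, not the specific scheme of the review) that leaves well-resolved waves nearly untouched while removing the grid-scale sawtooth mode.

        import numpy as np

        def selective_filter(u, sigma=0.2):
            # u_i <- u_i - (sigma/4) * (-u_{i-1} + 2 u_i - u_{i+1});
            # the correction is tiny for smooth fields but strongly damps
            # the sawtooth u_i = (-1)^i, the worst spurious short wave.
            d = -np.roll(u, 1) + 2.0 * u - np.roll(u, -1)
            return u - sigma * d / 4.0

        x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
        u = np.sin(x) + 0.3 * (-1.0) ** np.arange(64)  # smooth wave + sawtooth noise
        for _ in range(20):
            u = selective_filter(u)
        print(np.max(np.abs(u - np.sin(x))))  # sawtooth is largely gone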

  16. Computers Transform an Industry.

    ERIC Educational Resources Information Center

    Simich, Jack

    1982-01-01

    Describes the use of computer technology in the graphics communication industry. Areas that are examined include typesetting, color scanners, communications satellites, page make-up systems, and the business office. (CT)

  17. Computers and Early Learning.

    ERIC Educational Resources Information Center

    Banet, Bernard

    1978-01-01

    This paper discusses the effect that microelectronic technology will have on elementary education in the decades ahead, and some of the uses of computers as learning aids for young children, including interactive games, tutorial systems, creative activity, and simulation. (MP)

  18. Goldstone (GDSCC) administrative computing

    NASA Technical Reports Server (NTRS)

    Martin, H.

    1981-01-01

    The GDSCC Data Processing Unit provides various administrative computing services for Goldstone. Those activities, including finance, manpower and station utilization, deep-space station scheduling, and engineering change order (ECO) control, are discussed.

  19. Biomarkers in Computational Toxicology

    EPA Science Inventory

    Biomarkers are a means to evaluate chemical exposure and/or the subsequent impacts on toxicity pathways that lead to adverse health outcomes. Computational toxicology can integrate biomarker data with knowledge of exposure, chemistry, biology, pharmacokinetics, toxicology, and e...

  20. Knowledge and Distributed computation

    DTIC Science & Technology

    1990-05-01

    ... convincing evidence that reasoning in terms of knowledge can lead to ... results about distributed computation, and we extend the standard ... can be made precise in the context of computer science. In this thesis, we provide convincing evidence that reasoning in terms of knowledge can lead ... against different adversaries. We show how different adversaries lead to different definitions of probabilistic knowledge, and given a particular adversary ...

  1. Computer Literacy Education

    DTIC Science & Technology

    1989-01-01

    Becker, project director, "Instructional Uses of School Computers" (Baltimore, Maryland: Center for Social Organization of Schools, The Johns Hopkins ... relevant viewpoints in one study. 1.2 Guide to this Report: The remainder of this thesis is organized as follows ... See John Van Maanen, Tales of the ... Jack Turner. 2.4 Programming ... people to control computers - John Kemeny talks about learning to "make them [computers] do what you want them to do"

  2. Computer Models of Proteins

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Dr. Marc Pusey (seated) and Dr. Craig Kundrot use computers to analyze x-ray maps and generate three-dimensional models of protein structures. With this information, scientists at Marshall Space Flight Center can learn how proteins are made and how they work. The computer screen depicts a protein structure as a ball-and-stick model. Other models depict the actual volume occupied by the atoms, or the ribbon-like structures that are crucial to a protein's function.

  3. Quantum Computing since Democritus

    NASA Astrophysics Data System (ADS)

    Aaronson, Scott

    2013-03-01

    1. Atoms and the void; 2. Sets; 3. Gödel, Turing, and friends; 4. Minds and machines; 5. Paleocomplexity; 6. P, NP, and friends; 7. Randomness; 8. Crypto; 9. Quantum; 10. Quantum computing; 11. Penrose; 12. Decoherence and hidden variables; 13. Proofs; 14. How big are quantum states?; 15. Skepticism of quantum computing; 16. Learning; 17. Interactive proofs and more; 18. Fun with the Anthropic Principle; 19. Free will; 20. Time travel; 21. Cosmology and complexity; 22. Ask me anything.

  4. Computer Games and Instruction

    ERIC Educational Resources Information Center

    Tobias, Sigmund, Ed.; Fletcher, J. D., Ed.

    2011-01-01

    There is intense interest in computer games. A total of 65 percent of all American households play computer games, and sales of such games increased 22.9 percent last year. The average amount of game playing time was found to be 13.2 hours per week. The popularity and market success of games is evident from both the increased earnings from games,…

  5. Theoretical and computational chemistry.

    PubMed

    Meuwly, Markus

    2010-01-01

    Computer-based and theoretical approaches to chemical problems can provide atomistic understanding of complex processes at the molecular level. Examples ranging from rates of ligand-binding reactions in proteins to structural and energetic investigations of diastereomers relevant to organo-catalysis are discussed in the following. They highlight the range of application of theoretical and computational methods to current questions in chemical research.

  6. Professionalism in Computer Forensics

    NASA Astrophysics Data System (ADS)

    Irons, Alastair D.; Konstadopoulou, Anastasia

    The paper seeks to address the need to consider issues regarding professionalism in computer forensics, in order to allow the discipline to develop and to ensure its credibility from the differing perspectives of practitioners, the criminal justice system, and the public. Professionalism in computer forensics needs to be examined and developed in order to promote the discipline and maintain its credibility.

  7. Reconfigurable environmentally adaptive computing

    NASA Technical Reports Server (NTRS)

    Coxe, Robin L. (Inventor); Galica, Gary E. (Inventor)

    2008-01-01

    Described are methods and apparatus, including computer program products, for reconfigurable environmentally adaptive computing technology. An environmental signal representative of an external environmental condition is received. A processing configuration is automatically selected, based on the environmental signal, from a plurality of processing configurations. A reconfigurable processing element is reconfigured to operate according to the selected processing configuration. In some examples, the environmental condition is detected and the environmental signal is generated based on the detected condition.
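
    A toy sketch of the select-and-reconfigure flow the abstract describes; the signal fields, thresholds, and configuration names below are all invented for illustration, and a dictionary stands in for the reconfigurable processing element.

        # All names and thresholds here are hypothetical.
        CONFIGS = {
            "hardened":  {"clock_mhz": 100, "active_units": 2},  # e.g. high radiation
            "low_power": {"clock_mhz": 50,  "active_units": 1},
            "nominal":   {"clock_mhz": 200, "active_units": 4},
        }

        def select_configuration(env_signal):
            # Map an environmental signal to one of several processing configurations.
            if env_signal["radiation"] > 0.8:
                return "hardened"
            if env_signal["battery"] < 0.2:
                return "low_power"
            return "nominal"

        def reconfigure(processor, env_signal):
            # Reconfigure the processing element according to the selection.
            name = select_configuration(env_signal)
            processor.update(CONFIGS[name], mode=name)
            return processor

        print(reconfigure({}, {"radiation": 0.9, "battery": 0.7}))
        # -> {'clock_mhz': 100, 'active_units': 2, 'mode': 'hardened'}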

  8. Computer-assisted instruction

    NASA Technical Reports Server (NTRS)

    Atkinson, R. C.

    1974-01-01

    Results are presented from a project of research and development on strategies for optimizing the instructional process, and on dissemination of information about the applications of such research to the instructional medium of computer-assisted instruction. Accomplishments reported include construction of the author language INSTRUCT, construction of a practical CAI course in the area of computer science, and a number of investigations into the individualization of instruction, using the course as a vehicle.

  9. Partnership in Computational Science

    SciTech Connect

    Huray, Paul G.

    1999-02-24

    This is the final report for the "Partnership in Computational Science" (PICS) award, in the amount of $500,000, for the period January 1, 1993 through December 31, 1993. A copy of the proposal with its budget is attached as Appendix A. This report first describes the significance of the DOE award in building high-performance computing infrastructure in the Southeast and then describes the work accomplished under the grant and lists the publications resulting from it.

  10. SLAC B Factory computing

    SciTech Connect

    Kunz, P.F.

    1992-02-01

    As part of the research and development program in preparation for a possible B Factory at SLAC, a group has been studying various aspects of HEP computing. In particular, the group is investigating the use of UNIX for all computing, from data acquisition, through analysis, and word processing. A summary of some of the results of this study will be given, along with some personal opinions on these topics.

  11. Research in Computer Forensics

    DTIC Science & Technology

    2002-06-01

    ... mails. Forged Email: Trusted MTA, Mails with Spoofed Sender Identity, Forged Emails ... Email Forgery: forging an email on SMTP (Simple Mail Transport ... As such, the discipline of computer forensic analysis has emerged to meet such needs. Computers can contain evidence in many ways: in electronic mail ... shortcut files, registry entries, printer spool and operating system logs for system events, Internet Information Server, Exchange mail server

  12. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1985-01-01

    Synopses are given for NASA-supported work in computer science at the University of Virginia. Some areas of research include: error seeding as a testing method; knowledge representation for engineering design; analysis of faults in a multi-version software experiment; implementation of a parallel programming environment; two computer graphics systems for visualization of pressure distribution and convective density particles; task decomposition for multiple robot arms; vectorized incomplete conjugate gradient; and iterative methods for solving linear equations on the Flex/32.

  13. Recent computational chemistry

    SciTech Connect

    Onishi, Taku

    2015-12-31

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules and design functional materials by computation. Because limits and problems still remain in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  14. Words To Compute By: Keeping Up with Personal Computing Terminology.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1997-01-01

    Reviews current terminology related to personal computing, referring to a previous article written five years ago. Highlights include Macintosh processors; Pentium processors; computer memory; computer buses; video displays; storage devices; newer media; and miscellaneous terms and concepts. (LRW)

  15. Survey of Fault Tolerant Computer Security and Computer Safety.

    DTIC Science & Technology

    introduction to the report as a whole. Contents: Fundamental Concepts of Fault-Tolerant Computing; Survey of Device and System Testing; Computer Security in Defense Systems; and State of the Art of Safety for Computer Controlled Systems.

  16. Computer Technology in Adult Education.

    ERIC Educational Resources Information Center

    Slider, Patty; Hodges, Kathy; Carter, Cea; White, Barbara

    This publication provides materials to help adult educators use computer technology in their teaching. Section 1, Computer Basics, contains activities and materials on these topics: increasing computer literacy, computer glossary, parts of a computer, keyboard, disk care, highlighting text, scrolling and wrap-around text, setting up text,…

  17. Quantum computing with trapped ions

    SciTech Connect

    Hughes, R.J.

    1998-01-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  18. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    ERIC Educational Resources Information Center

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  19. Computing with neural synchrony.

    PubMed

    Brette, Romain

    2012-01-01

    Neurons communicate primarily with spikes, but most theories of neural computation are based on firing rates. Yet, many experimental observations suggest that the temporal coordination of spikes plays a role in sensory processing. Among potential spike-based codes, synchrony appears as a good candidate because neural firing and plasticity are sensitive to fine input correlations. However, it is unclear what role synchrony may play in neural computation, and what functional advantage it may provide. With a theoretical approach, I show that the computational interest of neural synchrony appears when neurons have heterogeneous properties. In this context, the relationship between stimuli and neural synchrony is captured by the concept of synchrony receptive field, the set of stimuli which induce synchronous responses in a group of neurons. In a heterogeneous neural population, it appears that synchrony patterns represent structure or sensory invariants in stimuli, which can then be detected by postsynaptic neurons. The required neural circuitry can spontaneously emerge with spike-timing-dependent plasticity. Using examples in different sensory modalities, I show that this allows simple neural circuits to extract relevant information from realistic sensory stimuli, for example to identify a fluctuating odor in the presence of distractors. This theory of synchrony-based computation shows that relative spike timing may indeed have computational relevance, and suggests new types of neural network models for sensory processing with appealing computational properties.
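
    A toy sketch of the coincidence detection at the heart of this idea (invented spike data, not the paper's model): find moments where enough neurons in a group fire within a short window, the signature a postsynaptic detector neuron would respond to. Chance coincidences may also appear at this tolerance.

        import numpy as np

        def synchrony_events(spike_trains, window=0.005, min_neurons=3):
            # spike_trains: one sorted array of spike times per neuron.
            # Return times where >= min_neurons distinct neurons fire within `window`.
            tagged = sorted((t, i) for i, ts in enumerate(spike_trains) for t in ts)
            events = []
            for j, (t, _) in enumerate(tagged):
                neurons = {i for s, i in tagged[j:] if s <= t + window}
                if len(neurons) >= min_neurons:
                    events.append(round(t, 3))
            return sorted(set(events))

        rng = np.random.default_rng(0)
        trains = [rng.uniform(0, 1, 20) for _ in range(5)]
        for tr in trains[:3]:
            tr[0] = 0.5                      # embed one synchronous volley at t = 0.5
        print(synchrony_events([np.sort(t) for t in trains]))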

  20. Is thinking computable?

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    Strong artificial intelligence claims that conscious thought can arise in computers containing the right algorithms even though none of the programs or components of those computers understand what is going on. As proof, it asserts that brains are finite webs of neurons, each with a definite function governed by the laws of physics; this web has a set of equations that can be solved (or simulated) by a sufficiently powerful computer. Strong AI claims the Turing test as a criterion of success. A recent debate in Scientific American concludes that the Turing test is not sufficient, but leaves intact the underlying premise that thought is a computable process. The recent book by Roger Penrose, however, offers a sharp challenge, arguing that the laws of quantum physics may govern mental processes and that these laws may not be computable. In every area of mathematics and physics, Penrose finds evidence of nonalgorithmic human activity and concludes that mental processes are inherently more powerful than computational processes.

  1. Computing with Neural Synchrony

    PubMed Central

    Brette, Romain

    2012-01-01

    Neurons communicate primarily with spikes, but most theories of neural computation are based on firing rates. Yet, many experimental observations suggest that the temporal coordination of spikes plays a role in sensory processing. Among potential spike-based codes, synchrony appears as a good candidate because neural firing and plasticity are sensitive to fine input correlations. However, it is unclear what role synchrony may play in neural computation, and what functional advantage it may provide. With a theoretical approach, I show that the computational interest of neural synchrony appears when neurons have heterogeneous properties. In this context, the relationship between stimuli and neural synchrony is captured by the concept of synchrony receptive field, the set of stimuli which induce synchronous responses in a group of neurons. In a heterogeneous neural population, it appears that synchrony patterns represent structure or sensory invariants in stimuli, which can then be detected by postsynaptic neurons. The required neural circuitry can spontaneously emerge with spike-timing-dependent plasticity. Using examples in different sensory modalities, I show that this allows simple neural circuits to extract relevant information from realistic sensory stimuli, for example to identify a fluctuating odor in the presence of distractors. This theory of synchrony-based computation shows that relative spike timing may indeed have computational relevance, and suggests new types of neural network models for sensory processing with appealing computational properties. PMID:22719243

  2. EVOLUTIONARY COMPUTING PROJECT

    SciTech Connect

    C. BARRETT; C. REIDYS

    2000-09-01

    This report summarizes LDRD-funded mathematical research related to computer simulation, inspired in part by combinatorial analysis of sequence-to-structure relationships of biomolecules. Computer simulations calculate the interactions among many individual, local entities, thereby generating global dynamics. The objective of this project was to establish a mathematical basis for a comprehensive theory of computer simulations. This mathematical theory is intended to rigorously underwrite very large complex simulations, including simulation of bio- and socio-technical systems. We believe excellent progress has been made. Abstraction of three main ingredients of simulation forms the mathematical setting, called Sequential Dynamical Systems (SDS): (1) functions realized as data-local procedures represent entity state transformations, (2) a graph that expresses locality of the functions and which represents the dependencies among entities, and (3) an ordering, or schedule, according to which the entities are evaluated, e.g., updated. The research spans algebraic foundations, formal dynamical systems, computer simulation, and theoretical computer science. The theoretical approach is also deeply related to theoretical issues in parallel compilation. Numerous publications were produced, follow-on projects have been identified and are being developed programmatically, and a new area in computational algebra, SDS, was established.
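
    The three ingredients enumerated above translate directly into a toy example; the graph, local rule, and schedules below are invented for illustration.

        # (1) dependency graph, (2) data-local function, (3) update schedule.
        graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

        def local_nor(state, node):
            # Local rule: a node becomes 1 iff it and all its neighbors are 0.
            return int(not (state[node] or any(state[m] for m in graph[node])))

        def sds_step(state, schedule):
            # Sequential update: later nodes in the schedule see earlier results.
            state = list(state)
            for node in schedule:
                state[node] = local_nor(state, node)
            return tuple(state)

        state = (1, 0, 0, 1)
        print(sds_step(state, [0, 1, 2, 3]))  # -> (0, 1, 0, 0)
        print(sds_step(state, [3, 2, 1, 0]))  # -> (0, 0, 0, 0): the order matters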

  3. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  4. Parallel Computing in SCALE

    SciTech Connect

    DeHart, Mark D; Williams, Mark L; Bowman, Stephen M

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  5. Parallel Computational Protein Design

    PubMed Central

    Zhou, Yichao; Donald, Bruce R.; Zeng, Jianyang

    2016-01-01

    Computational structure-based protein design (CSPD) is an important problem in computational biology, which aims to design or improve a prescribed protein function based on a protein structure template. It provides a practical tool for real-world protein engineering applications. A popular CSPD method that guarantees to find the global minimum energy solution (GMEC) is to combine both dead-end elimination (DEE) and A* tree search algorithms. However, in this framework, the A* search algorithm can run in exponential time in the worst case, which may become the computational bottleneck of the large-scale computational protein design process. To address this issue, we extend and add a new module to the OSPREY program that was previously developed in the Donald lab [1] to implement a GPU-based massively parallel A* algorithm for improving the protein design pipeline. By exploiting the modern GPU computational framework and optimizing the computation of the heuristic function for A* search, our new program, called gOSPREY, can provide up to four orders of magnitude speedups in large protein design cases with a small memory overhead compared with the traditional A* search algorithm implementation, while still guaranteeing the optimality. In addition, gOSPREY can be configured to run in a bounded-memory mode to tackle the problems in which the conformation space is too large and the global optimal solution could not be computed previously. Furthermore, the GPU-based A* algorithm implemented in the gOSPREY program can be combined with the state-of-the-art rotamer pruning algorithms such as iMinDEE [2] and DEEPer [3] to also consider continuous backbone and side-chain flexibility. PMID:27914056
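
    A toy version of the A*-to-GMEC search that this line of work builds on (the energies and rotamer sets below are invented; real designs have vastly larger conformation spaces and pairwise energy matrices):

        import heapq

        E_self = {(0, 'a'): 1.0, (0, 'b'): 0.5, (1, 'a'): 0.2, (1, 'b'): 0.8}
        E_pair = {((0, 'a'), (1, 'a')): 0.6, ((0, 'a'), (1, 'b')): 0.1,
                  ((0, 'b'), (1, 'a')): 0.9, ((0, 'b'), (1, 'b')): 0.4}
        ROTAMERS, N = ['a', 'b'], 2

        def heuristic(partial):
            # Admissible bound: best self-energy for each unassigned position
            # (pair energies here are all >= 0, so they can be dropped safely).
            return sum(min(E_self[(i, r)] for r in ROTAMERS)
                       for i in range(len(partial), N))

        def astar_gmec():
            frontier = [(heuristic(()), 0.0, ())]   # (f, g, partial assignment)
            while frontier:
                f, g, partial = heapq.heappop(frontier)
                if len(partial) == N:
                    return g, partial                # provably minimal energy
                i = len(partial)
                for r in ROTAMERS:
                    g2 = g + E_self[(i, r)] + sum(
                        E_pair[((j, partial[j]), (i, r))] for j in range(i))
                    heapq.heappush(
                        frontier, (g2 + heuristic(partial + (r,)), g2, partial + (r,)))

        print(astar_gmec())  # -> (1.6, ('b', 'a'))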

  6. Trends in Higher Education Computing.

    ERIC Educational Resources Information Center

    Thomas, Charles R.

    1984-01-01

    Discusses the effects which changes in computer technology are having on the organization, staffing, and budgets at institutions of higher education. Trends in computer hardware, computer software, and in office automation are also discussed. (JN)

  7. Hot, Hot, Hot Computer Careers.

    ERIC Educational Resources Information Center

    Basta, Nicholas

    1988-01-01

    Discusses the increasing need for electrical, electronic, and computer engineers; and scientists. Provides current status of the computer industry and average salaries. Considers computer chip manufacture and the current chip shortage. (MVL)

  8. Organ dose measurements from multiple-detector computed tomography using a commercial dosimetry system and tomographic, physical phantoms

    NASA Astrophysics Data System (ADS)

    Lavoie, Lindsey K.

    The technology of computed tomography (CT) imaging has soared over the last decade with the use of multi-detector CT (MDCT) scanners that are capable of performing studies in a matter of seconds. While the diagnostic information obtained from MDCT imaging is extremely valuable, it is important to ensure that the radiation doses resulting from these studies are at acceptably safe levels. This research project focused on the measurement of organ doses resulting from modern MDCT scanners. A commercially-available dosimetry system was used to measure organ doses. Small dosimeters made of optically-stimulated luminescent (OSL) material were analyzed with a portable OSL reader. Detailed verification of this system was performed. Characteristics studied include energy, scatter, and angular responses; dose linearity; the ability to erase the exposed dose; and the ability to reuse dosimeters multiple times. The results of this verification process were positive. While correction factors needed to be applied to the dose reported by the OSL reader, these factors were small and expected. Physical, tomographic pediatric and adult phantoms were used to measure organ doses. These phantoms were developed from CT images and are composed of tissue-equivalent materials. Because the adult phantom comprises numerous segments, dosimeters were placed in the phantom at several organ locations, and doses to select organs were measured using three clinical protocols: pediatric craniosynostosis, adult brain perfusion and adult cardiac CT angiography (CTA). A wide-beam, 320-slice, volumetric CT scanner and a 64-slice MDCT scanner were used for organ dose measurements. Doses ranged from 1 to 26 mGy for the pediatric protocol, 1 to 1241 mGy for the brain perfusion protocol, and 2 to 100 mGy for the cardiac protocol. In most cases, the doses measured on the 64-slice scanner were higher than those on the 320-slice scanner. A methodology to measure organ doses with OSL dosimeters received from CT

  9. Exercises in molecular computing.

    PubMed

    Stojanovic, Milan N; Stefanovic, Darko; Rudchenko, Sergei

    2014-06-17

    CONSPECTUS: The successes of electronic digital logic have transformed every aspect of human life over the last half-century. The word "computer" now signifies a ubiquitous electronic device, rather than a human occupation. Yet evidently humans, large assemblies of molecules, can compute, and it has been a thrilling challenge to develop smaller, simpler, synthetic assemblies of molecules that can do useful computation. When we say that molecules compute, what we usually mean is that such molecules respond to certain inputs, for example, the presence or absence of other molecules, in a precisely defined but potentially complex fashion. The simplest way for a chemist to think about computing molecules is as sensors that can integrate the presence or absence of multiple analytes into a change in a single reporting property. Here we review several forms of molecular computing developed in our laboratories. When we began our work, combinatorial approaches to using DNA for computing were used to search for solutions to constraint satisfaction problems. We chose to work instead on logic circuits, building bottom-up from units based on catalytic nucleic acids, focusing on DNA secondary structures in the design of individual circuit elements, and reserving the combinatorial opportunities of DNA for the representation of multiple signals propagating in a large circuit. Such circuit design directly corresponds to the intuition about sensors transforming the detection of analytes into reporting properties. While this approach was unusual at the time, it has been adopted since by other groups working on biomolecular computing with different nucleic acid chemistries. We created logic gates by modularly combining deoxyribozymes (DNA-based enzymes cleaving or combining other oligonucleotides), in the role of reporting elements, with stem-loops as input detection elements. For instance, a deoxyribozyme that normally exhibits an oligonucleotide substrate recognition region is
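
    At the truth-table level, the sensor picture sketched above reduces to ordinary Boolean logic. A hypothetical two-input gate (all names invented, not a model of any specific construct) in which both input oligonucleotides must open their stem-loops before the deoxyribozyme can cleave its fluorogenic substrate:

        def and_gate(input1_present: bool, input2_present: bool) -> bool:
            # Cleavage (and hence fluorescence) only when both inputs are present.
            return input1_present and input2_present

        for i1 in (False, True):
            for i2 in (False, True):
                print(i1, i2, "->", "fluorescent" if and_gate(i1, i2) else "dark")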

  10. Quantum Computational Cryptography

    NASA Astrophysics Data System (ADS)

    Kawachi, Akinori; Koshiba, Takeshi

    As computational approaches to classical cryptography have succeeded in the establishment of the foundation of the network security, computational approaches even to quantum cryptography are promising, since quantum computational cryptography could offer richer applications than the quantum key distribution. Our project focused especially on the quantum one-wayness and quantum public-key cryptosystems. The one-wayness of functions (or permutations) is one of the most important notions in computational cryptography. First, we give an algorithmic characterization of quantum one-way permutations. In other words, we show a necessary and sufficient condition for quantum one-way permutations in terms of reflection operators. Second, we introduce a problem of distinguishing between two quantum states as a new underlying problem that is harder to solve than the graph automorphism problem. The new problem is a natural generalization of the distinguishability problem between two probability distributions, which are commonly used in computational cryptography. We show that the problem has several cryptographic properties and they enable us to construct a quantum publickey cryptosystem, which is likely to withstand any attack of a quantum adversary.

  11. Computer Assisted Surgery

    NASA Astrophysics Data System (ADS)

    Arámbula Cosío, F.; Padilla Castañeda, M. A.

    2003-09-01

    Computer assisted surgery (CAS) systems can provide different levels of assistance to a surgeon during training and execution of a surgical procedure. This is done through the integration of measurements taken on medical images, computer graphics techniques, and positioning or tracking mechanisms which accurately locate the surgical instruments inside the operating site. According to the type of assistance that is provided to the surgeon, CAS systems can be classified as image-guided surgery systems, assistant robots for surgery, and training simulators for surgery. In this work the main characteristics of CAS systems are presented. We also describe the development of a computer simulator for training on Transurethral Resection of the Prostate (TURP), based on a computer model of the prostate gland which is able to simulate, in real time, deformations and resections of tissue. The model is constructed as a 3D mesh with physical properties such as elasticity. We describe the main characteristics of the prostate model and its performance. The prostate model will also be used in the development of a CAS system designed to assist the surgeon during a real TURP procedure. The system will provide 3D views of the shape of the patient's prostate and the position of the surgical instrument during the operation. The development of new computer graphics models which are able to simulate, in real time, the mechanical behavior of an organ during a surgical procedure can significantly improve the training and execution of other minimally invasive surgical procedures, such as laparoscopic gall bladder surgery.
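
    A minimal sketch of real-time elastic deformation in the spirit of such a model, using an explicit mass-spring relaxation; the geometry, stiffness, and time step are all invented, and the actual TURP simulator's formulation is not specified here.

        import numpy as np

        nodes = np.array([[0., 0.], [1., 0.], [2., 0.]])  # tiny 2-D "mesh"
        springs = [(0, 1), (1, 2)]                        # rest length 1.0
        fixed = {0}                                       # anchored node
        k, dt, damping = 40.0, 0.01, 0.9
        vel = np.zeros_like(nodes)
        nodes[2] += [0.5, 0.3]                            # instrument pushes a node

        for _ in range(200):                              # explicit relaxation steps
            force = np.zeros_like(nodes)
            for i, j in springs:
                d = nodes[j] - nodes[i]
                length = np.linalg.norm(d)
                f = k * (length - 1.0) * d / length       # Hooke's law along the spring
                force[i] += f
                force[j] -= f
            vel = damping * (vel + dt * force)
            for idx in range(len(nodes)):
                if idx not in fixed:
                    nodes[idx] += dt * vel[idx]
        print(nodes.round(2))  # the mesh settles back toward its rest shape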

  12. Computational Physics' Greatest Hits

    NASA Astrophysics Data System (ADS)

    Bug, Amy

    2011-03-01

    The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session "What is computational physics?" speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session "Great Advances...Past, Present and Future" in which five dramatic areas of discovery (five of our "greatest hits") are chronicled: the physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify "greatest hits" in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.

  13. Optical computer motherboards

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz P.; Xu, Guoda; Bartha, John M.; Gruntman, Michael A.

    1997-09-01

    In this paper, we investigate the application of precision plastic optics to a communication/computer subsystem, such as a hybrid computer motherboard. We believe that using optical waveguides for next-generation computer motherboards can provide a high-performance alternative to present multi-layer printed circuit motherboards. In response to this demand, we suggest a novel concept for a hybrid motherboard based on an internal-fiber-coupling (IFC) wavelength-division-multiplexing (WDM) optical backplane. The IFC/WDM backplane provides dedicated Tx/Rx connections and applies low-cost, high-performance components, including CD LDs, GRIN plastic fibers, molded housings, and nonimaging optics connectors. Preliminary motherboard parameters are: speed 100 MHz/100 m, or 1 GHz/10 m; fiber loss approximately 0.01 dB/m; almost zero fan-out/fan-in optical power loss; and eight standard wavelength channels. The proposed hybrid computer motherboard, based on innovative optical backplane technology, should solve the low-speed, low-parallelism bottlenecks of present electronic computer motherboards.
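
    The quoted fiber loss makes for a quick link-budget check; the launch power and receiver sensitivity below are assumed values for illustration, not figures from the paper.

        loss_db_per_m = 0.01        # fiber loss quoted above
        length_m = 100.0            # the 100 MHz / 100 m operating point
        launch_dbm = -3.0           # assumed ~0.5 mW laser-diode launch power
        sensitivity_dbm = -17.0     # assumed receiver sensitivity

        fiber_loss_db = loss_db_per_m * length_m        # 1.0 dB over 100 m
        margin_db = launch_dbm - fiber_loss_db - sensitivity_dbm
        print(f"fiber loss {fiber_loss_db:.1f} dB, link margin {margin_db:.1f} dB")
        # -> fiber loss 1.0 dB, link margin 13.0 dB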

  14. Computational metrology for nanomanufacturing

    NASA Astrophysics Data System (ADS)

    Liu, Shiyuan

    2013-10-01

    One critical challenge of high-volume nanomanufacturing is the on-line monitoring and measurement of the nanostructures manufactured. Computational metrology is expected to provide a novel means for fast, low-cost, nondestructive, and accurate measurement in high-volume manufacturing. Computational metrology refers to a measurement method in which a complicated measurement process is modeled as a forward problem, measured data are obtained by a specific instrument under a certain measurement configuration, and the measurands are then precisely and accurately reconstructed by solving the corresponding inverse problem. Computational metrology is thus essentially a model-based metrology and a typical process of solving an inverse problem. The key issues in computational metrology, such as measurability, measurement error analysis and precision estimation, measurement configuration optimization, fast and accurate forward modeling, and fast and robust measurand reconstruction, and their generalized solution methods are explored in this paper, with an emphasis on the significance and necessity of applying modern mathematical theories and tools to the related problems. Some case studies carried out in my research group are presented to demonstrate the capability of computational metrology.
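
    A toy forward-model/inverse-problem pair in the sense used above: the "instrument" produces y = A x + noise, and the measurand x is reconstructed by regularized least squares. The matrix A, the noise level, and the regularization weight are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.normal(size=(50, 3))                  # forward model (instrument response)
        x_true = np.array([1.0, -0.5, 2.0])           # measurand to reconstruct
        y = A @ x_true + 0.05 * rng.normal(size=50)   # measured data

        lam = 1e-3                                    # Tikhonov regularization weight
        x_hat = np.linalg.solve(A.T @ A + lam * np.eye(3), A.T @ y)
        print(x_true, x_hat.round(3))                 # reconstruction close to the truth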

  15. Highly parallel computer architecture for robotic computation

    NASA Technical Reports Server (NTRS)

    Fijany, Amir (Inventor); Bejczy, Anta K. (Inventor)

    1991-01-01

    In a computer having a large number of single instruction multiple data (SIMD) processors, each of the SIMD processors has two sets of three individual processor elements controlled by a master control unit and interconnected among a plurality of register file units where data is stored. The register files input and output data in synchronism with a minor cycle clock under control of two slave control units controlling the register file units connected to respective ones of the two sets of processor elements. Depending upon which ones of the register file units are enabled to store or transmit data during a particular minor clock cycle, the processor elements within an SIMD processor are connected in rings or in pipeline arrays, and may exchange data with the internal bus or with neighboring SIMD processors through interface units controlled by respective ones of the two slave control units.

  16. Visual Computing Environment Workshop

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles (Compiler)

    1998-01-01

    The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.

  17. Hierarchical Approximate Bayesian Computation

    PubMed Central

    Turner, Brandon M.; Van Zandt, Trisha

    2013-01-01

    Approximate Bayesian computation (ABC) is a powerful technique for estimating the posterior distribution of a model’s parameters. It is especially important when the model to be fit has no explicit likelihood function, which happens for computational (or simulation-based) models such as those that are popular in cognitive neuroscience and other areas in psychology. However, ABC is usually applied only to models with few parameters. Extending ABC to hierarchical models has been difficult because high-dimensional hierarchical models add computational complexity that conventional ABC cannot accommodate. In this paper we summarize some current approaches for performing hierarchical ABC and introduce a new algorithm called Gibbs ABC. This new algorithm incorporates well-known Bayesian techniques to improve the accuracy and efficiency of the ABC approach for estimation of hierarchical models. We then use the Gibbs ABC algorithm to estimate the parameters of two models of signal detection, one with and one without a tractable likelihood function. PMID:24297436
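
    For contrast with the hierarchical algorithms discussed above, here is a sketch of the plain ABC rejection sampler they build on (simulator, summary statistic, prior, and tolerance all invented): draw parameters from the prior, simulate, and keep draws whose simulated summary lands near the observed one.

        import numpy as np

        rng = np.random.default_rng(0)
        observed = rng.normal(loc=0.7, scale=1.0, size=100)   # data from theta = 0.7

        def simulate(theta, n=100):
            # The simulator stands in for an intractable likelihood.
            return rng.normal(loc=theta, scale=1.0, size=n)

        accepted = []
        for _ in range(20000):
            theta = rng.uniform(-2, 2)                        # draw from the prior
            if abs(simulate(theta).mean() - observed.mean()) < 0.05:
                accepted.append(theta)                        # within tolerance: keep

        print(len(accepted), float(np.mean(accepted)))  # concentrates near 0.7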

  18. The future of computing

    NASA Astrophysics Data System (ADS)

    Simmons, Michelle

    2016-05-01

    Down-scaling has been the leading paradigm of the semiconductor industry since the invention of the first transistor in 1947. However miniaturization will soon reach the ultimate limit, set by the discreteness of matter, leading to intensified research in alternative approaches for creating logic devices. This talk will discuss the development of a radical new technology for creating atomic-scale devices which is opening a new frontier of research in electronics globally. We will introduce single atom transistors where we can measure both the charge and spin of individual dopants with unique capabilities in controlling the quantum world. To this end, we will discuss how we are now demonstrating atom by atom, the best way to build a quantum computer - a new type of computer that exploits the laws of physics at very small dimensions in order to provide an exponential speed up in computational processing power.

  19. Constructing computer virus phylogenies

    SciTech Connect

    Goldberg, L.A.; Goldberg, P.W.; Phillips, C.A.; Sorkin, G.B.

    1996-03-01

    There has been much recent algorithmic work on the problem of reconstructing the evolutionary history of biological species. Computer virus specialists are interested in finding the evolutionary history of computer viruses--a virus is often written using code fragments from one or more other viruses, which are its immediate ancestors. A phylogeny for a collection of computer viruses is a directed acyclic graph whose nodes are the viruses and whose edges map ancestors to descendants and satisfy the property that each code fragment is "invented" only once. To provide a simple explanation for the data, we consider the problem of constructing such a phylogeny with a minimal number of edges. In general, this optimization problem cannot be solved in quasi-polynomial time unless NQP=QP; we present positive and negative results for associated approximation problems. When tree solutions exist, they can be constructed and randomly sampled in polynomial time.
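
    As an illustration of the "each fragment invented once" constraint, here is a naive greedy sketch (an assumption-laden toy, not one of the paper's algorithms) that credits each code fragment to the first virus seen with it and draws ancestor-to-descendant edges to later reusers.

    ```python
    # Illustrative toy only: build a candidate phylogeny in which each code
    # fragment is "invented" once, by crediting it to the first virus observed
    # with it and adding edges to every later virus that reuses it.
    viruses = {                        # hypothetical viruses -> code fragments
        "A": {"f1", "f2"},
        "B": {"f1", "f2", "f3"},
        "C": {"f2", "f4"},
    }
    order = ["A", "B", "C"]            # assume discovery order approximates ancestry

    inventor = {}
    edges = set()
    for v in order:
        for frag in sorted(viruses[v]):
            if frag not in inventor:
                inventor[frag] = v     # first virus seen with the fragment invents it
            elif inventor[frag] != v:
                edges.add((inventor[frag], v))   # fragment flowed ancestor -> descendant

    print(sorted(edges))               # [('A', 'B'), ('A', 'C')]
    ```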

  20. Computational Systems Biology

    SciTech Connect

    McDermott, Jason E.; Samudrala, Ram; Bumgarner, Roger E.; Montogomery, Kristina; Ireton, Renee

    2009-05-01

    Computational systems biology is the term that we use to describe computational methods to identify, infer, model, and store relationships between the molecules, pathways, and cells (“systems”) involved in a living organism. Based on this definition, the field of computational systems biology has been in existence for some time. However, the recent confluence of high-throughput methodology for biological data gathering, genome-scale sequencing, and computational processing power has driven a reinvention and expansion of this field. The expansions include not only modeling of small metabolic and signaling systems but also modeling of the relationships between biological components in very large systems, including whole cells and organisms. Generally these models provide a general overview of one or more aspects of these systems and leave the determination of details to experimentalists focused on smaller subsystems. The promise of such approaches is that they will elucidate patterns, relationships, and general features that are not evident from examining specific components or subsystems. These predictions are either interesting in and of themselves (for example, the identification of an evolutionary pattern) or are interesting and valuable to researchers working on a particular problem (for example, highlighting a previously unknown functional pathway). Two events have occurred to bring the field of computational systems biology to the forefront. One is the advent of high-throughput methods that have generated large amounts of information about particular systems in the form of genetic studies, gene expression analyses (both protein and

  1. Human-computer interface

    DOEpatents

    Anderson, Thomas G.

    2004-12-21

    The present invention provides a method of human-computer interfacing. Force feedback allows intuitive navigation and control near a boundary between regions in a computer-represented space. For example, the method allows a user to interact with a virtual craft, then push through the windshield of the craft to interact with the virtual world surrounding the craft. As another example, the method allows a user to feel transitions between different control domains of a computer representation of a space. The method can provide for force feedback that increases as a user's locus of interaction moves near a boundary, then perceptibly changes (e.g., abruptly drops or changes direction) when the boundary is traversed.
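
    A minimal sketch of the boundary-feedback behavior described above, assuming a simple inverse-distance force law; the function name, gain, and cap are hypothetical, not taken from the patent.

    ```python
    # Minimal sketch of the boundary-feedback idea (illustrative, not the patented
    # method): resistance grows as the locus of interaction nears a boundary, then
    # drops abruptly once the boundary is traversed.
    def feedback_force(distance_to_boundary, crossed, k=2.0, max_force=10.0):
        """Force opposing motion near a boundary; 'crossed' flags traversal."""
        if crossed:
            return 0.0                               # perceptible drop after crossing
        return min(max_force, k / max(distance_to_boundary, 1e-6))

    for d in (5.0, 1.0, 0.25):
        print(d, feedback_force(d, crossed=False))   # force rises as distance shrinks
    print("after crossing:", feedback_force(0.1, crossed=True))
    ```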

  2. Optical quantum computing.

    PubMed

    O'Brien, Jeremy L

    2007-12-07

    In 2001, all-optical quantum computing became feasible with the discovery that scalable quantum computing is possible using only single-photon sources, linear optical elements, and single-photon detectors. Although it was in principle scalable, the massive resource overhead made the scheme practically daunting. However, several simplifications were followed by proof-of-principle demonstrations, and recent approaches based on cluster states or error encoding have dramatically reduced this worrying resource overhead, making an all-optical architecture a serious contender for the ultimate goal of a large-scale quantum computer. Key challenges will be the realization of high-efficiency sources of indistinguishable single photons, low-loss, scalable optical circuits, high-efficiency single-photon detectors, and low-loss interfacing of these components.

  3. Future Trends in Computing

    NASA Astrophysics Data System (ADS)

    Elmegreen, Bruce G.

    2011-04-01

    According to a Top500.org compilation, large computer systems have been doubling in sustained speed every 1.14 years for the last 17 years. If this rapid growth continues, we will have computers by 2020 that can execute an exaflop (10^18 floating-point operations) per second. Storage is also improving in cost and density at an exponential rate. Several innovations that will accompany this growth are reviewed here, including shrinkage of basic circuit components on silicon, three-dimensional integration, and Phase Change Memory. Further growth will require new technologies, most notably those surrounding the basic building block of computers, the Field Effect Transistor. Implications of these changes for the types of problems that can be solved are briefly discussed.
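
    As a rough check of that projection (assuming the 2011 publication date and a petaflop-scale baseline; the numbers below are illustrative, not from the talk):

    ```python
    # Rough check of the doubling claim (illustrative assumptions: baseline year
    # 2011, a ~4-petaflop machine comparable to the fastest systems of that time).
    baseline_flops = 4e15                 # assumed 2011 sustained speed
    years = 2020 - 2011
    growth = 2 ** (years / 1.14)          # doubling every 1.14 years
    print(growth)                         # ~238x
    print(baseline_flops * growth)        # ~9.5e17, i.e. approaching an exaflop
    ```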

  4. Optoelectronic Reservoir Computing

    PubMed Central

    Paquot, Y.; Duport, F.; Smerieri, A.; Dambre, J.; Schrauwen, B.; Haelterman, M.; Massar, S.

    2012-01-01

    Reservoir computing is a recently introduced, highly efficient bio-inspired approach for processing time-dependent data. The basic scheme of reservoir computing consists of a nonlinear recurrent dynamical system coupled to a single input layer and a single output layer. Within these constraints many implementations are possible. Here we report an optoelectronic implementation of reservoir computing based on a recently proposed architecture consisting of a single nonlinear node and a delay line. Our implementation is sufficiently fast for real-time information processing. We illustrate its performance on tasks of practical importance such as nonlinear channel equalization and speech recognition, and obtain results comparable to state-of-the-art digital implementations. PMID:22371825
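
    The paper's reservoir is optoelectronic (a single nonlinear node plus a delay line), but the underlying scheme can be sketched with a conventional digital echo-state reservoir: a fixed random recurrent network in which only the linear readout is trained. The toy task and all parameters below are illustrative assumptions.

    ```python
    # Digital echo-state sketch of the reservoir idea (illustrative; the paper's
    # reservoir is optoelectronic: one nonlinear node plus a delay line).
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100                                         # reservoir size
    W_in = rng.uniform(-0.5, 0.5, N)                # fixed random input weights
    W = rng.uniform(-0.5, 0.5, (N, N))              # fixed random recurrent weights
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W))) # spectral radius < 1 (echo state)

    u = np.sin(np.arange(300) * 0.2)                # toy input signal
    target = np.roll(u, 1)                          # toy task: recall the delayed input
    x = np.zeros(N)
    states = np.zeros((len(u), N))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + W_in * ut)              # nonlinear recurrent update
        states[t] = x

    # Only the linear readout is trained; the reservoir itself stays fixed.
    W_out, *_ = np.linalg.lstsq(states[50:], target[50:], rcond=None)
    pred = states[50:] @ W_out
    print("train MSE:", np.mean((pred - target[50:]) ** 2))
    ```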

  5. Stopping computer crimes

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    Two new books about intrusions and computer viruses remind us that attacks against our computers on networks are the actions of human beings. Cliff Stoll's book about the hacker who spent a year, beginning in Aug. 1986, attempting to use the Lawrence Berkeley Computer as a stepping-stone for access to military secrets is a spy thriller that illustrates the weaknesses of our password systems and the difficulties in compiling evidence against a hacker engaged in espionage. Pamela Kane's book about viruses that attack IBM PC's shows that viruses are the modern version of the old problem of a Trojan horse attack. It discusses the most famous viruses and their countermeasures, and it comes with a floppy disk of utility programs that will disinfect your PC and thwart future attack.

  6. Cell types, circuits, computation.

    PubMed

    Azeredo da Silveira, Rava; Roska, Botond

    2011-10-01

    How does the connectivity of a neuronal circuit, together with the individual properties of the cell types that take part in it, result in a given computation? We examine this question in the context of retinal circuits. We suggest that the retina can be viewed as a parallel assemblage of many small computational devices, highly stereotypical and task-specific circuits afferent to a given ganglion cell type, and we discuss some rules that govern computation in these devices. Multi-device processing in retina poses conceptual problems when it is contrasted with cortical processing. We lay out open questions both on processing in retinal circuits and on implications for cortical processing of retinal inputs.

  7. Personal computers on Ethernet

    NASA Technical Reports Server (NTRS)

    Kao, R.

    1988-01-01

    Many researchers in the Division have projects which require transferring large files between their personal computers (PCs) and VAX computers in the Laboratory for Oceans Computing Facility (LOCF). Since the Ethernet local area network provides high-speed communication channels which make file transfers (among other capabilities) practical, a network plan was assembled to connect IBM and IBM-compatible PCs to Ethernet for participating personnel. The design employs ThinWire Ethernet technology. A simplified configuration diagram is shown. A DEC multiport repeater (DEMPR) is used for connection of ThinWire Ethernet segments. One port of the DEMPR is connected to an H4000 transceiver, and the transceiver is clamped onto the Goddard Ethernet backbone coaxial cable so that the PCs can optionally be on the SPAN network. All these common elements were successfully installed and tested.

  8. Computers, Nanotechnology and Mind

    NASA Astrophysics Data System (ADS)

    Ekdahl, Bertil

    2008-10-01

    In 1958, two years after the Dartmouth conference, where the term artificial intelligence was coined, Herbert Simon and Allen Newell asserted the existence of "machines that think, that learn and create." They further prophesied that the machines' capacity would increase and be on par with the human mind. Now, 50 years later, computers perform many more tasks than one could imagine in the 1950s but, virtually, no computer can do more than could the first digital computer, developed by John von Neumann in the 1940s. Computers still follow algorithms; they do not create them. However, the development of nanotechnology seems to have given rise to new hopes. With nanotechnology, two things are supposed to happen. First, due to the small scale, it will be possible to construct huge computer memories, which are supposed to be the precondition for building an artificial brain; second, nanotechnology will make it possible to scan the brain, which in turn will make reverse engineering possible: the mind will be decoded by studying the brain. The consequence of such a belief is that the brain is no more than a calculator, i.e., all that the mind can do is in principle the result of arithmetical operations. Computers are equivalent to formal systems, which in turn were an answer to Hilbert's idea that proofs may contain ideal statements to which operations cannot be applied in a contentual way. The advocates of artificial intelligence would place content in a machine that is not only developed to be free of content but also cannot contain content. In this paper I argue that the hope for artificial intelligence is in vain.

  9. Demonstration of blind quantum computing.

    PubMed

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

  10. The Relationship between Computer Anxiety and Computer Self-Efficacy

    ERIC Educational Resources Information Center

    Simsek, Ali

    2011-01-01

    This study examined the relationship between computer anxiety and computer self-efficacy of students and teachers in elementary and secondary schools. The sample included a total of 845 subjects from two private school systems in Turkey. Oetting's Computer Anxiety Scale was used to measure computer anxiety, whereas Murphy's Computer…

  11. Effect of Gender and Computer Experience on Attitudes toward Computers.

    ERIC Educational Resources Information Center

    Levin, Tamar; Gordon, Claire

    1989-01-01

    Describes study of Israeli secondary school students that was conducted to determine the extent to which gender and prior computer exposure affect students' attitudes toward computers prior to computer instruction in school. An attitude questionnaire designed to measure affective and cognitive attitudes toward computers is described, and results…

  12. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    ERIC Educational Resources Information Center

    Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratories facilities into a distributed high performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…

  13. Differentiating Computer Experience and Attitudes toward Computers: An Empirical Investigation.

    ERIC Educational Resources Information Center

    Smith, B.; Caputi, P.; Rawstorne, P.

    2000-01-01

    Describes a study that defined and provided initial empirical support for differentiating the concepts of computer attitude, subjective computer experience, and objective computer experience. Discusses results of a principal component factor analysis and presents a conceptual analysis of the relation between subjective computer experience and…

  14. Notes on teaching computing

    SciTech Connect

    Gabriel, J.R.

    1982-01-01

    There are two kinds of qualifications for using computers: those used incidentally to another marketable skill, and those needed to solve more-difficult problems. Although the second kind can be taught using personal computers, and can be taught to children in high school or even grade school, they are not usually taught even in two-year colleges. This is the first of two talks discussing what these skills are, what is needed to teach them, and how a program might be developed at minimal cost by integrating it with an industrial arts course in a two-year college.

  15. Computers and audit

    PubMed Central

    Fitter, M.J.; Evans, A.R.; Garber, J.R.

    1985-01-01

    A computerized information system was installed in a large group practice. This paper describes how the computer system was used for the systematic auditing of clinical activities, and also demonstrates how it acted as a catalyst for the review and changes of administrative and management procedures. An analysis of the issues that arose in an audit group is used to identify how the objectives and activities of the group evolved with experience. It is demonstrated that a computer system and audit can complement and enhance each other to the benefit of clinical and managerial decision making. PMID:4078807

  16. Exercises in Molecular Computing

    PubMed Central

    2014-01-01

    Conspectus The successes of electronic digital logic have transformed every aspect of human life over the last half-century. The word “computer” now signifies a ubiquitous electronic device, rather than a human occupation. Yet evidently humans, large assemblies of molecules, can compute, and it has been a thrilling challenge to develop smaller, simpler, synthetic assemblies of molecules that can do useful computation. When we say that molecules compute, what we usually mean is that such molecules respond to certain inputs, for example, the presence or absence of other molecules, in a precisely defined but potentially complex fashion. The simplest way for a chemist to think about computing molecules is as sensors that can integrate the presence or absence of multiple analytes into a change in a single reporting property. Here we review several forms of molecular computing developed in our laboratories. When we began our work, combinatorial approaches to using DNA for computing were used to search for solutions to constraint satisfaction problems. We chose to work instead on logic circuits, building bottom-up from units based on catalytic nucleic acids, focusing on DNA secondary structures in the design of individual circuit elements, and reserving the combinatorial opportunities of DNA for the representation of multiple signals propagating in a large circuit. Such circuit design directly corresponds to the intuition about sensors transforming the detection of analytes into reporting properties. While this approach was unusual at the time, it has been adopted since by other groups working on biomolecular computing with different nucleic acid chemistries. We created logic gates by modularly combining deoxyribozymes (DNA-based enzymes cleaving or combining other oligonucleotides), in the role of reporting elements, with stem–loops as input detection elements. For instance, a deoxyribozyme that normally exhibits an oligonucleotide substrate recognition region is

  17. Computational quantum chemistry website

    SciTech Connect

    1997-08-22

    This report contains the contents of a web page related to research on the development of quantum chemistry methods for computational thermochemistry and the application of quantum chemistry methods to problems in material chemistry and chemical sciences. Research programs highlighted include: Gaussian-2 theory; Density functional theory; Molecular sieve materials; Diamond thin-film growth from buckyball precursors; Electronic structure calculations on lithium polymer electrolytes; Long-distance electronic coupling in donor/acceptor molecules; and Computational studies of NOx reactions in radioactive waste storage.

  18. Computer Programs for Construction

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A NASA computer program aids Hudson Engineering Corporation, Houston, Texas, in the design and construction of huge petrochemical processing plants like the one shown, which is located at Ju'aymah, Saudi Arabia. The pipes handling the flow of chemicals are subject to a variety of stresses, such as weight and variations in temperature. Hudson Engineering uses a COSMIC piping flexibility analysis computer program to analyze and ensure the necessary strength and flexibility of the pipes. This program helps the company realize substantial savings in reduced engineering time.

  19. Adventures in Computational Grids

    NASA Technical Reports Server (NTRS)

    Walatka, Pamela P.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Sometimes one supercomputer is not enough. Or your local supercomputers are busy, or not configured for your job. Or you don't have any supercomputers. You might be trying to simulate worldwide weather changes in real time, requiring more compute power than you could get from any one machine. Or you might be collecting microbiological samples on an island, and need to examine them with a special microscope located on the other side of the continent. These are the times when you need a computational grid.

  20. Computer Mediated Communication

    NASA Astrophysics Data System (ADS)

    Fano, Robert M.

    1984-08-01

    The use of computers in organizations is discussed in terms of its present and potential role in facilitating and mediating communication between people. This approach clarifies the impact that computers may have on the operation of organizations and on the individuals comprising them. Communication, which is essential to collaborative activities, must be properly controlled to protect individual and group privacy, which is equally essential. Our understanding of the human and organizational aspects of controlling communication and access to information presently lags behind our technical ability to implement the controls that may be needed.

  1. Optical signal computing

    NASA Astrophysics Data System (ADS)

    Cathey, Wade Thomas; Schmidt, Rodney A.; Moddel, Garret

    1989-12-01

    Architectures for optical symbolic computing were designed, devices specifically for those architectures were designed and built, and test circuits for some of the logic elements were designed, constructed, and operated. The research led to novel architectures for optical symbolic computing. Devices were developed that are suitable for optical 2-D memory and logic. These devices are pixelated photo-addressed spatial light modulators (SLMs) with a three-terminal arrangement so that the threshold can be adjusted. Spinoff non-pixelated devices are useful as high-frame-rate, high-resolution SLMs that can be used for many optical signal processing applications.

  2. The design of a multi-detector spectrometer for the infrared. [satellite-borne atmospheric temperature sounder

    NASA Technical Reports Server (NTRS)

    Koch, D. G.; Aubrecht, J. A.

    1978-01-01

    A modified Ebert-Fastie spectrometer has been developed for atmospheric temperature sounding applications. The device is described with reference to its resolution, grating, focal-length mirror, equivalent f-number, and projected area of the grating. The images of the entrance slit appear tilted backwards, away from the concave mirror. Astigmatism and spherical aberration are reduced by aspherizing the mirror. The resolution and f-number of the instrument are limited by sagittal coma. The orientation and size of the exit slit are functions of wavelength.

  3. A Small Radioactive Source in a Large Media - Localization by Multi-detector Measurement. The Case of a Lung Counter

    SciTech Connect

    Alfassi, Z. B.; Pelled, O.; German, U.

    2008-08-14

    Considerable errors in the determination of radioactive contamination in lungs can be induced if there is no homogeneous distribution, as assumed for the calibration. Modern lung counter systems use several detectors, and the count rate ratios of the detectors can be used for localization of the radioactive contamination, enabling the use of correction algorithms. This greatly reduces the errors in the determination of the activity. Further reduction of the errors can be obtained by simultaneous analysis of several gamma lines (if several energies are emitted by the radioisotope), and by optimizing the number and location of the detectors. This presentation deals with the case of a point source of natural uranium in human lungs.
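
    The localization idea (count-rate ratios between detectors cancel the unknown source activity) can be sketched with a simple inverse-square model; this is an illustrative two-dimensional toy, not the calibration procedure used with the actual lung counter.

    ```python
    # Illustrative 2-D sketch (not the paper's method): locate a point source from
    # multi-detector count rates with an inverse-square model; ratios between
    # detectors cancel the unknown source activity.
    import numpy as np

    detectors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # positions, cm
    true_src = np.array([3.0, 4.0])                               # hidden source
    rates = 1.0 / (np.sum((detectors - true_src) ** 2, axis=1) + 1e-9)

    best, best_err = None, float("inf")
    for x in np.linspace(0.0, 10.0, 101):          # brute-force grid search
        for y in np.linspace(0.0, 10.0, 101):
            model = 1.0 / (np.sum((detectors - np.array([x, y])) ** 2, axis=1) + 1e-9)
            # compare rate *ratios* so the unknown source activity cancels out
            err = np.sum((model / model[0] - rates / rates[0]) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    print(best)                                    # ~(3.0, 4.0)
    ```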

  4. Diagnostic capability of gadoxetate disodium-enhanced liver MRI for diagnosis of hepatocellular carcinoma: comparison with multi-detector CT.

    PubMed

    Toyota, Naoyuki; Nakamura, Yuko; Hieda, Masashi; Akiyama, Naoko; Terada, Hiroaki; Matsuura, Noriaki; Nishiki, Masayo; Kono, Hirotaka; Kohno, Hiroshi; Irei, Toshimitsu; Yoshikawa, Yukinobu; Kuraoka, Kazuya; Taniyama, Kiyomi; Awai, Kazuo

    2013-09-01

    The purpose of this study was to evaluate the diagnostic capability of gadoxetate disodium (Gd-EOB)-enhanced MRI for the detection of hepatocellular carcinoma (HCC) compared with multidetector CT (MDCT). Fifty patients with 57 surgically proven HCCs who underwent Gd-EOB-MRI and MDCT from March 2008 to June 2011 were evaluated. Two observers evaluated MR and CT images on a lesion-by-lesion basis. We analyzed sensitivity by grading on a 5-point scale, the degree of arterial enhancement, and the differences among histological grades on diffusion-weighted images (DWI). The results showed that the sensitivity of Gd-EOB-MRI was higher than that of MDCT, especially for HCCs 1 cm in diameter or smaller. The hepatobiliary phase was useful for detecting small HCCs. There were few cases in which it was difficult to judge the arterial enhancement of HCC on either MRI or MDCT. On diffusion-weighted images, well differentiated HCCs tended to show low signal intensity, and poorly differentiated HCCs tended to show high signal intensity. In moderately differentiated HCCs, the mean diameter of the high-signal-intensity group was larger than that of the low-signal-intensity group (24.5 mm vs. 15.8 mm). In conclusion, Gd-EOB-MRI tended to show higher sensitivity than MDCT in the detection of HCC.

  5. An introduction to computer viruses

    SciTech Connect

    Brown, D.R.

    1992-03-01

    This report on computer viruses is based upon a thesis written for the Master of Science degree in Computer Science from the University of Tennessee in December 1989 by David R. Brown. This thesis is entitled An Analysis of Computer Virus Construction, Proliferation, and Control and is available through the University of Tennessee Library. This paper contains an overview of the computer virus arena that can help the reader to evaluate the threat that computer viruses pose. The extent of this threat can only be determined by evaluating many different factors. These factors include the relative ease with which a computer virus can be written, the motivation involved in writing a computer virus, the damage and overhead incurred by infected systems, and the legal implications of computer viruses, among others. Based upon the research, the development of a computer virus seems to require more persistence than technical expertise. This is a frightening proclamation to the computing community. The education of computer professionals to the dangers that viruses pose to the welfare of the computing industry as a whole is stressed as a means of inhibiting the current proliferation of computer virus programs. Recommendations are made to assist computer users in preventing infection by computer viruses. These recommendations support solid general computer security practices as a means of combating computer viruses.

  6. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2014-12-30

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.

  7. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2015-01-27

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.
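
    The arithmetic behind the claimed synchronization can be sketched as follows (illustrative pseudologic only, not the patented hardware or firmware): once all nodes sit in the global barrier and the root's pulse fans out, each node applies its known latency from the root, so every local clock agrees on the root's zero.

    ```python
    # Illustrative pseudologic of the claimed steps (all values hypothetical):
    # each node knows its transmission latency from the root; after the global
    # barrier, the root's pulse arrives that many cycles later, so applying the
    # latency as the time base puts every node on the root's zero.
    latencies = {"node0": 0, "node1": 3, "node2": 5, "node3": 8}  # cycles from root

    def on_pulse(node, pulse_send_time=0):
        """Called by the wakeup unit when the pulse reaches this node."""
        arrival = pulse_send_time + latencies[node]   # when the pulse physically lands
        time_base = latencies[node]                   # claimed rule: base = latency
        return arrival - time_base                    # every node recovers send time 0

    print({n: on_pulse(n) for n in latencies})        # all zeros: common time base
    ```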

  8. EDITORIAL: Computational materials science Computational materials science

    NASA Astrophysics Data System (ADS)

    Kahl, Gerhard; Kresse, Georg

    2011-10-01

    Special issue in honour of Jürgen Hafner On 30 September 2010, Jürgen Hafner, one of the most prominent and influential members within the solid state community, retired. His remarkably broad scientific oeuvre has made him one of the founding fathers of modern computational materials science: more than 600 scientific publications, numerous contributions to books, and a highly cited monograph, which has become a standard reference in the theory of metals, witness not only the remarkable productivity of Jürgen Hafner but also his impact in theoretical solid state physics. In an effort to duly acknowledge Jürgen Hafner's lasting impact in this field, a Festsymposium was held on 27-29 September 2010 at the Universität Wien. The organizers of this symposium (and authors of this editorial) are proud to say that a large number of highly renowned scientists in theoretical condensed matter theory—co-workers, friends and students—accepted the invitation to this celebration of Hafner's jubilee. Some of these speakers also followed our invitation to submit their contribution to this Festschrift, published in Journal of Physics: Condensed Matter, a journal which Jürgen Hafner served in 2000-2003 and 2003-2006 as a member of the Advisory Editorial Board and member of the Executive Board, respectively. In the subsequent article, Volker Heine, friend and co-worker of Jürgen Hafner over many decades, gives an account of Hafner's impact in the field of theoretical condensed matter physics. Computational materials science contents: 'Theoretical study of structural, mechanical and spectroscopic properties of boehmite (γ-AlOOH)' by D Tunega, H Pašalić, M H Gerzabek and H Lischka; 'Ethylene epoxidation catalyzed by chlorine-promoted silver oxide' by M O Ozbek, I Onal and R A Van Santen; 'First-principles study of Cu2ZnSnS4 and the related band offsets for photovoltaic applications' by A Nagoya, R Asahi and G Kresse; 'Renormalization group study of random quantum magnets' by István A Kovács and

  9. Cloud Computing Explained

    ERIC Educational Resources Information Center

    Metz, Rosalyn

    2010-01-01

    While many talk about the cloud, few actually understand it. Three organizations' definitions come to the forefront when defining the cloud: Gartner, Forrester, and the National Institutes of Standards and Technology (NIST). Although both Gartner and Forrester provide definitions of cloud computing, the NIST definition is concise and uses…

  10. Computer-Assisted Organizing

    ERIC Educational Resources Information Center

    Brunner, David James

    2009-01-01

    Organizing refers to methods of distributing physical and symbolic tasks among multiple agents in order to achieve goals. My dissertation investigates the dynamics of organizing in hybrid information processing systems that incorporate both humans and computers. To explain the behavior of these hybrid systems, I develop and partially test a theory…

  11. Computationally efficient control allocation

    NASA Technical Reports Server (NTRS)

    Durham, Wayne (Inventor)

    2001-01-01

    A computationally efficient method for calculating near-optimal solutions to the three-objective, linear control allocation problem is disclosed. The control allocation problem is that of distributing the effort of redundant control effectors to achieve some desired set of objectives. The problem is deemed linear if control effectiveness is affine with respect to the individual control effectors. The optimal solution is that which exploits the collective maximum capability of the effectors within their individual physical limits. Computational efficiency is measured by the number of floating-point operations required for solution. The method presented returned optimal solutions in more than 90% of the cases examined; non-optimal solutions returned by the method were typically much less than 1% different from optimal, and the errors tended to become smaller than 0.01% as the number of controls was increased. The magnitude of the errors returned by the present method was much smaller than those that resulted from either pseudoinverse or cascaded generalized inverse solutions. The computational complexity of the method presented varied linearly with increasing numbers of controls; the number of required floating-point operations increased from 5.5 to seven times faster than did that of the minimum-norm solution (the pseudoinverse), and at about the same rate as did that of the cascaded generalized inverse solution. The computational requirements of the method presented were much better than those of previously described facet-searching methods, which increase in proportion to the square of the number of controls.
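
    For contrast with the patented method, the minimum-norm baseline it is compared against is easy to sketch: a pseudoinverse allocation that ignores effector limits until a final clip. The effectiveness matrix and limits below are made-up illustrations, not values from the patent.

    ```python
    # Baseline pseudoinverse allocation for comparison (a minimal sketch; the
    # patent's method is different and near-optimal at lower floating-point cost).
    import numpy as np

    B = np.array([[1.0, 0.5, 0.0, -0.5],           # control effectiveness (3 objectives
                  [0.0, 1.0, 0.3,  0.2],           #  x 4 redundant effectors); values
                  [0.2, 0.0, 1.0,  0.4]])          #  are illustrative only
    m_desired = np.array([0.8, -0.3, 0.5])         # desired moments

    u = np.linalg.pinv(B) @ m_desired              # minimum-norm solution
    u_clipped = np.clip(u, -1.0, 1.0)              # enforce effector position limits
    print(u, B @ u_clipped)                        # clipping may cause allocation error
    ```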

  12. Office Computers: Ergonomic Considerations.

    ERIC Educational Resources Information Center

    Ganus, Susannah

    1984-01-01

    Each new report of the office automation market indicates technology is overrunning the office. The impacts of this technology are described and some ways to manage and physically "soften" the change to a computer-based office environment are suggested. (Author/MLW)

  13. Computing beyond Moore's Law

    SciTech Connect

    Shalf, John M.; Leland, Robert

    2015-12-01

    Here, photolithography systems are on pace to reach atomic scale by the mid-2020s, necessitating alternatives to continue realizing faster, more predictable, and cheaper computing performance. If the end of Moore's law is real, a research agenda is needed to assess the viability of novel semiconductor technologies and navigate the ensuing challenges.

  14. Thomas Jefferson's Computer.

    ERIC Educational Resources Information Center

    Smith, Catherine F.

    1996-01-01

    Notes that taken together, Thomas Jefferson's contributions to the history of writing technology demonstrate a virtual "computer." Links Jefferson's development of writing technology to his democratic political philosophy. Argues that this link should interest writing teachers. Suggests that Jeffersonian optimism effectively counters…

  15. Computing and data processing

    NASA Technical Reports Server (NTRS)

    Smarr, Larry; Press, William; Arnett, David W.; Cameron, Alastair G. W.; Crutcher, Richard M.; Helfand, David J.; Horowitz, Paul; Kleinmann, Susan G.; Linsky, Jeffrey L.; Madore, Barry F.

    1991-01-01

    The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth dimension supercomputers.

  16. Vocational Agriculture Computer Handbook.

    ERIC Educational Resources Information Center

    Kentucky State Dept. of Education, Frankfort.

    This document is a catalog of reviews of computer software suitable for use in vocational agriculture programs. The reviews were made by vocational agriculture teachers in Kentucky. The reviews cover software on the following topics: farm management, crop production, livestock production, horticulture, agricultural mechanics, general agriculture,…

  17. Syllabus Computer in Astronomy

    NASA Astrophysics Data System (ADS)

    Hojaev, Alisher S.

    2015-08-01

    One of the most important and topical subjects and training courses in the curriculum for undergraduate students at the National University of Uzbekistan is ‘Computer Methods in Astronomy’. It covers two semesters and includes both lecture and practice classes. Based on long-term experience, we prepared a tutorial for students which contains descriptions of modern computer applications in astronomy. Briefly, the main directions of computer application in astronomy are as follows: (1) automating the process of observation, data acquisition, and processing; (2) creating and storing databases (the results of observations, experiments, and theoretical calculations), and their generalization, classification, and cataloging, including work with large databases; (3) solving theoretical problems (physical and mathematical modeling of astronomical objects and phenomena, derivation of model parameters to obtain solutions of the corresponding equations, numerical simulations) and creating the appropriate software; (4) use in the educational process (e-textbooks, presentations, virtual labs, remote education, testing), amateur astronomy, and popularization of the science; (5) use as a means of communication and data transfer, for presenting and disseminating research results (web journals), and for creating virtual information systems (local and global computer networks). During the classes, special attention is given to practical training and to individual and independent work by students.

  18. Software Computer Series, 104.

    ERIC Educational Resources Information Center

    Birk, James P., Ed.

    1989-01-01

    Discusses computer simulations for chemistry courses. Reviews a simulation program, "Evaluation of Bonding Theory: The Werner Jorgensen Controversy," and provides information regarding hardware requirements for the program. Presents the idea of an expert system including simulation modules and counselor modules. Describes some chemical examples, a…

  19. Computer Labs: New Perceptions.

    ERIC Educational Resources Information Center

    Prochaska, Nancy

    1989-01-01

    Describes the use of computer labs in a Texas high school. Highlights include word processing; a study skills course; software programs currently in use by English and English-as-a-Second-Language classes; physical layout, including security considerations and furniture; hardware, including printers; and guidelines for the selection of software.…

  20. Computing Blood Flows

    NASA Technical Reports Server (NTRS)

    Kwak, D.; Chang, J. L. C.; Rogers, S. E.; Rosenfeld, M.

    1990-01-01

    Methods developed for aerospace applied to mechanics of biofluids. Report argues use of advanced computational fluid dynamics to analyze flows of biofluids - especially blood. Ability to simulate numerically and visualize complicated, time-varying three-dimensional flows contributes to understanding of phenomena in heart and blood vessels, offering potential for development of treatments for abnormal flow conditions.

  1. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Shell Oil Company used a COSMIC program called VISCEL to ensure the accuracy of the company's new computer code for analyzing polymers and chemical compounds. Shell reported that there were no other programs available that could provide the necessary calculations. Shell produces chemicals for plastic products used in the manufacture of automobiles, housewares, appliances, film, textiles, electronic equipment and furniture.

  2. Computer Equity @ School.

    ERIC Educational Resources Information Center

    Equity Coalition for Race, Gender, and National Origin, 1999

    1999-01-01

    This edition of "Equity Coalition" is designed to be a resource to assist those who have responsibility for technology in the schools. The authors of these articles discuss a variety of issues related to computer uses in education and equal access to educational technology. The issue contains the following articles: (1) "Technology--A New Kind of…

  3. Awaiting the intelligent computer

    SciTech Connect

    Mckibbin, W.L.

    1983-08-01

    Experimental and operational expert artificial intelligence systems exist in at least a dozen domains: engineering, computer systems configuration and diagnosis, finance, education, medicine, bioengineering, manufacturing, law and management science. This article examines the state-of-the-art in expert systems.

  4. Computers in Abstract Algebra

    ERIC Educational Resources Information Center

    Nwabueze, Kenneth K.

    2004-01-01

    The current emphasis on flexible modes of mathematics delivery involving new information and communication technology (ICT) at the university level is perhaps a reaction to the recent change in the objectives of education. Abstract algebra seems to be one area of mathematics virtually crying out for computer instructional support because of the…

  5. Spies in the Computer.

    ERIC Educational Resources Information Center

    Sweetland, Robert D.

    1984-01-01

    Discusses a computer program requiring students to find an object on a grid by using location coordinates. The program, which provides a wide range of problem-solving activities, is a highly motivating approach to hands-on metric measurement, rounding off numbers, and plotting points on Cartesian coordinate systems. Includes complete program…

  6. Computer Series, 115.

    ERIC Educational Resources Information Center

    Birk, James P., Ed.

    1990-01-01

    Reviewed are six computer programs which may be useful in teaching college level chemistry. Topics include dynamic data storage in FORTRAN, "KC?DISCOVERER," pH of acids and bases, calculating percent boundary surfaces for orbitals, and laboratory interfacing with PT Nomograph for the Macintosh. (CW)

  7. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1989-01-01

    Described are the uses of a spreadsheet program to compute the free energy change associated with a reaction; to model acid/base equilibria; and to solve equilibrium constant expressions. Stressed are the advantages derived from having the spreadsheet perform the multiple, trivial calculations. (CW)
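
    One of those "trivial calculations" is easy to reproduce outside a spreadsheet; the sketch below computes a free energy change from an equilibrium constant via ΔG° = -RT ln K (the constant chosen is an illustrative acid ionization constant, not a value from the column).

    ```python
    # The kind of calculation the spreadsheet performs, as a short sketch:
    # standard free energy change from an equilibrium constant.
    import math

    R = 8.314            # gas constant, J/(mol*K)
    T = 298.15           # temperature, K
    K = 1.8e-5           # illustrative equilibrium constant (acetic acid Ka)

    dG = -R * T * math.log(K)            # delta G = -RT ln K
    print(f"dG = {dG / 1000:.1f} kJ/mol")  # ~ +27.1 kJ/mol
    ```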

  8. MHD computations for stellarators

    SciTech Connect

    Johnson, J.L.

    1985-12-01

    Considerable progress has been made in the development of computational techniques for studying the magnetohydrodynamic equilibrium and stability properties of three-dimensional configurations. Several different approaches have evolved to the point where comparison of results determined with different techniques shows good agreement. 55 refs., 7 figs.

  9. Computers in Science Fiction.

    ERIC Educational Resources Information Center

    Kurland, Michael

    1984-01-01

    Science fiction writers' perceptions of the "thinking machine" are examined through a review of Baum's Oz books, Heinlein's "Beyond This Horizon," science fiction magazine articles, and works about robots including Asimov's "I, Robot." The future of computers in science fiction is discussed and suggested readings are listed. (MBR)

  10. Computer Aided Tests.

    ERIC Educational Resources Information Center

    Steinke, Elisabeth

    An approach to using the computer to assemble German tests is described. The purposes of the system would be: (1) an expansion of the bilingual lexical memory bank to list and store idioms of all degrees of difficulty, with frequency data and with complete and sophisticated retrieval possibility for assembly; (2) the creation of an…

  11. Computer Model Documentation Guide.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    These guidelines for communicating effectively the details of computer model design and operation to persons with varying interests in a model recommend the development of four different types of manuals to meet the needs of managers, users, analysts and programmers. The guidelines for preparing a management summary manual suggest a broad spectrum…

  12. Computing Cosmic Cataclysms

    NASA Technical Reports Server (NTRS)

    Centrella, Joan M.

    2010-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes have been plagued by problems that caused them to crash. This situation has changed dramatically in the past few years, with a series of amazing breakthroughs. This talk will take you on this quest for these gravitational wave patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed.

  13. Low-Carbon Computing

    ERIC Educational Resources Information Center

    Hignite, Karla

    2009-01-01

    Green information technology (IT) is grabbing more mainstream headlines--and for good reason. Computing, data processing, and electronic file storage collectively account for a significant and growing share of energy consumption in the business world and on higher education campuses. With greater scrutiny of all activities that contribute to an…

  14. Computers for the Disabled.

    ERIC Educational Resources Information Center

    Lazzaro, Joseph J.

    1993-01-01

    Describes adaptive technology for personal computers that accommodate disabled users and may require special equipment including hardware, memory, expansion slots, and ports. Highlights include vision aids, including speech synthesizers, magnification, braille, and optical character recognition (OCR); hearing adaptations; motor-impaired…

  15. Computer Programming: BASIC.

    ERIC Educational Resources Information Center

    Fisher, Patience; And Others

    This guide was prepared to help teachers of the Lincoln Public School's introductory computer programming course in BASIC to make the necessary adjustments for changes made in the course since the purchase of microcomputers and such peripheral devices as television monitors and disk drives, and the addition of graphics. Intended to teach a…

  16. Computing and Handicapped Education.

    ERIC Educational Resources Information Center

    Johnson, Everett L.

    1984-01-01

    Brief description of an evaluation of PLATO terminal and software in a handicapped educational environment notes several problems handicapped students would have in using the system and proposed solutions. Information is provided on the Committee on Computing and the Handicapped of the Institute of Electrical and Electronics Engineers Computer…

  17. Women in Computer Sciences.

    ERIC Educational Resources Information Center

    Rose, Clare; Menninger, Sally Ann

    The keynote address of a conference that focused on the future of women in science and engineering fields and the opportunities available to them in the computer sciences is presented. Women's education in the sciences and education and entry into the job market in these fields has steadily been increasing. Excellent employment opportunities are…

  18. Computation models of discourse

    SciTech Connect

    Brady, M.; Berwick, R.C.

    1983-01-01

    This book presents papers on artificial intelligence and natural language. Topics considered include recognizing intentions from natural language utterances, cooperative responses from a portable natural language database query system, natural language generation as a computational problem, focusing in the comprehension of definite anaphora, and factors in forming discourse-dependent descriptions.

  19. Computer/Information Science

    ERIC Educational Resources Information Center

    Birman, Ken; Roughgarden, Tim; Seltzer, Margo; Spohrer, Jim; Stolterman, Erik; Kearsley, Greg; Koszalka, Tiffany; de Jong, Ton

    2013-01-01

    Scholars representing the field of computer/information science were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Ken Birman, Jennifer Rexford, Tim Roughgarden, Margo Seltzer, Jim Spohrer, and…

  20. Electronic calorimetric computer

    NASA Technical Reports Server (NTRS)

    Heckelman, J. D.

    1968-01-01

    Electronic calorimetric computer calculates nuclear reactor thermal power output to a nominal accuracy of 1 percent. Heat balance is determined by an electronic approach. The thermal power is calculated using the inlet and outlet temperatures and the volume of cooling water and is displayed by a digital readout system.
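
    The heat balance described amounts to P = ṁ·c_p·ΔT; below is a sketch with illustrative numbers (water coolant assumed; none of these values are from the instrument).

    ```python
    # The heat-balance arithmetic the instrument performs (illustrative values,
    # not the flight hardware's): P = m_dot * c_p * (T_out - T_in).
    def thermal_power_watts(flow_kg_per_s, t_in_c, t_out_c, c_p=4186.0):
        """Reactor thermal power from coolant flow and temperature rise (water c_p)."""
        return flow_kg_per_s * c_p * (t_out_c - t_in_c)

    print(thermal_power_watts(2.5, 40.0, 60.0))   # 209300.0 W, about 209 kW
    ```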