Science.gov

Sample records for 320-row multi-detector computed tomography

  1. Feasibility of 320-row multi-detector computed tomography angiography to assess bioabsorbable everolimus-eluting vascular scaffolds.

    PubMed

    Asami, Masahiko; Aoki, Jiro; Serruys, Patrick W; Abizaid, Alexandre; Saito, Shigeru; Onuma, Yoshinobu; Kimura, Takeshi; Simonton, Charles A; Tanabe, Kengo

    2016-04-01

    Coronary computed tomographic angiography (CCTA) for screening intra-arterial vessel disease has gained rapid clinical acceptance in recent years, but its use for such assessments in metal-stented vessel segments is very limited due to blooming artifacts introduced by the metal. However, vessel segments treated with the polymeric everolimus-eluting bioresorbable vascular scaffold (Absorb) are readily monitored for intravascular disease over time with CCTA. Data on the accuracy of multi-detector computed tomography (MDCT) in patients treated with Absorb are still sparse. Results from 5 Japanese case studies in ABSORB EXTEND are presented here. Five patients were treated with Absorb, and follow-up angiography was conducted at 8 to 14 months as per routine site standard of practice. A 320-row MDCT scan was performed within 1 month before the angiography. By MDCT, all Absorb-treated lesions were clearly evaluated and no restenosis was observed. Minimal diameter and % diameter stenosis were similar between MDCT and quantitative angiography (2.07 ± 0.13 vs. 2.03 ± 0.06 mm, P = 0.86, and 22.5 ± 5.0 vs. 21.5 ± 4.5 %, P = 0.88, respectively). MDCT appears to be feasible and useful for evaluating lumen patency and vessel disease in segments implanted with Absorb at follow-up. PMID:26445951

  2. Absorbed Radiation Dose in Radiosensitive Organs Using 64- and 320-Row Multidetector Computed Tomography: A Comparative Study

    PubMed Central

    Khan, Atif N.; Nikolic, Boris; Khan, Mohammad K.; Kang, Jian; Khosa, Faisal

    2014-01-01

    Aim. To determine absorbed radiation dose (ARD) in radiosensitive organs during prospective gating and full-phase dose-modulated acquisition using an ECG-gated MDCTA scanner in 64- and 320-row detector modes. Methods. A female phantom was used to measure organ radiation dose. Five DP-3 radiation detectors were used to measure ARD to the lungs, breast, and thyroid using the Aquilion ONE scanner in 64- and 320-row modes with both prospective and dose-modulated full-phase acquisition. Five measurements were made using three tube voltages (100, 120, and 135 kVp) at 400 mA and heart rates (HR) of 60 and 75 bpm for each protocol. Mean ARD was recorded in milligrays (mGy). Results. Mean ARD was less for 320-row versus 64-row mode for each imaging protocol. The prospective ECG-gated imaging protocol resulted in a statistically lower ARD using 320-row versus 64-row mode for midbreast (6.728 versus 19.687 mGy, P < 0.001), lung (6.102 versus 21.841 mGy, P < 0.001), and thyroid gland (0.208 versus 0.913 mGy; P < 0.001). Retrospective imaging using 320- versus 64-row modes showed lower ARD for midbreast (10.839 versus 43.169 mGy, P < 0.001), lung (8.848 versus 47.877 mGy, P < 0.001), and thyroid gland (0.057 versus 2.091 mGy; P < 0.001). ARD reduction was observed at lower kVp and heart rate. Conclusions. Dose reduction to radiosensitive organs is achieved using the 320-row compared to the 64-row mode for both prospective and retrospective gating, whereas the 64-row mode is equivalent to a 64-row MDCT scanner of the same model. PMID:25170427

  3. Diagnostic Phase of Calcium Scoring Scan Applied as the Center of Acquisition Window of Coronary Computed Tomography Angiography Improves Image Quality in Minimal Acquisition Window Scan (Target CTA Mode) Using the Second Generation 320-Row CT

    PubMed Central

    Maeda, Eriko; Kanno, Shigeaki; Ino, Kenji; Tomizawa, Nobuo; Akahane, Masaaki; Torigoe, Rumiko; Ohtomo, Kuni

    2016-01-01

    Objective. To compare the image quality of coronary computed tomography angiography (CCTA) acquired under two conditions: with 75% fixed as the center of the acquisition window (Group 75%) and with the diagnostic phase of the calcium scoring scan (CS) as the center (Group CS). Methods. 320-row cardiac CT with a minimal acquisition window (scanned using “Target CTA” mode) was performed on 81 patients. In Group 75% (n = 40), CS was obtained and reconstructed at 75%, and the center of the CCTA acquisition window was set at 75%. In Group CS (n = 41), CS was obtained at 75% and the diagnostic phase showing minimal artifacts was applied as the center of the CCTA acquisition window. Image quality was evaluated using a four-point scale (4 = excellent) and the mean scores were compared between groups. Results. The CCTA diagnostic phase occurred significantly earlier in Group CS (75.7 ± 3.2% vs. 73.6 ± 4.5% for Groups 75% and CS, respectively, p = 0.013). The mean Group CS image quality score (3.58 ± 0.63) was also higher than that for Group 75% (3.19 ± 0.66, p < 0.0001). Conclusions. The image quality of CCTA in Target CTA mode was significantly better when the center of the acquisition window was adjusted using CS. PMID:26977449

  4. Multi-Detector Computed Tomography Angiography for Coronary Artery Disease

    PubMed Central

    2005-01-01

    Executive Summary Purpose Computed tomography (CT) scanning continues to be an important modality for the diagnosis of injury and disease, most notably for indications of the head and abdomen. (1) According to a recent report published by the Canadian Institute for Health Information, (1) there were about 10.3 scanners per million people in Canada as of January 2004. Ontario had the fewest CT scanners per million among the provinces (8 CT scanners per million). The wait time for CT in Ontario of 5 weeks approaches the Canadian median of 6 weeks. This health technology and policy appraisal systematically reviews the published literature on multidetector CT (MDCT) angiography as a diagnostic tool for the newest indication for CT, coronary artery disease (CAD), and applies the results of the review to current health care practices in Ontario. This review does not evaluate MDCT to detect coronary calcification without contrast medium for CAD screening purposes. The Technology Compared with conventional CT scanning, MDCT can provide smaller pieces of information and can cover a larger area faster. (2) Advancing MDCT technology (8-, 16-, 32-, and 64-slice systems) is capable of producing more images in less time. For general CT scanning, this faster capability can reduce the time that patients must stay still during the procedure, thereby reducing potential movement artefact. However, the additional clinical utility of images obtained from faster scanners compared to the images obtained from conventional CT scanners for current CT indications (i.e., non-moving body parts) is not known. There are suggestions that the new fast scanners can reduce wait times for general CT. MDCT angiography, which utilizes a contrast medium, has been proposed as a minimally invasive replacement for coronary angiography to detect coronary artery disease. MDCT may take between 15 and 45 minutes; coronary angiography may take up to 1 hour. Although 16-slice and 32-slice CT

  5. Multi-detector row computed tomography angiography of peripheral arterial disease

    PubMed Central

    Dijkshoorn, Marcel L.; Pattynama, Peter M. T.; Myriam Hunink, M. G.

    2007-01-01

    With the introduction of multi-detector row computed tomography (MDCT), scan speed and image quality have improved considerably. Since longitudinal coverage is no longer a limitation, multi-detector row computed tomography angiography (MDCTA) is increasingly used to depict the peripheral arterial runoff. Hence, it is important to know the advantages and limitations of this new non-invasive alternative to the reference test, digital subtraction angiography. Optimization of the acquisition parameters and the contrast delivery is important to achieve reliable enhancement of the entire arterial runoff in patients with peripheral arterial disease (PAD) using fast CT scanners. The purpose of this review is to discuss the different scanning and injection protocols using 4-, 16-, and 64-detector row CT scanners, to propose effective methods for evaluating and presenting large data sets, to discuss its clinical value and major limitations, and to review the literature on the validity, reliability, and cost-effectiveness of multi-detector row CT in the evaluation of PAD. PMID:17882427

  6. Multi-detector computed tomography imaging of large airway pathology: A pictorial review

    PubMed Central

    Jugpal, Tejeshwar Singh; Garg, Anju; Sethi, Gulshan Rai; Daga, Mradul Kumar; Kumar, Jyoti

    2015-01-01

    The tracheobronchial tree is a musculocartilaginous framework that acts as a conduit to aerate the lungs and, consequently, the entire body. A large spectrum of pathological conditions can involve the trachea and bronchial airways. These may be congenital anomalies, infections, post-intubation airway injuries, foreign body aspiration or neoplasms involving the airway. Appropriate management of airway disease requires an early and accurate diagnosis. In this pictorial review, we comprehensively describe the various airway pathologies and their imaging findings on multi-detector computed tomography. PMID:26753061

  7. Multi-detector computed tomography demonstrates smoke inhalation injury at early stage.

    PubMed

    Koljonen, Virve; Maisniemi, Kreu; Virtanen, Kaisa; Koivikko, Mika

    2007-06-01

    A multitrauma victim was transported to our trauma centre. Smoke inhalation injury was suspected based on the trauma history and clinical examination. The first trauma computed tomography (CT) scan, obtained 2.8 h after the injury, revealed subtle ground-glass opacifications with a mainly peribronchial distribution and patchy peribronchial consolidations centrally in the left lung. A repeated scan showed a more distinctive demarcation of the peribronchial opacities, further substantiating the clinically verified smoke inhalation injury. The gold standard for diagnosing smoke inhalation injury is still fiberoptic bronchoscopy. This paper shows that lesions typical of smoke inhalation injury appear much earlier than previously reported. Whether assessment of smoke inhalation injury severity using CT could clinically benefit patients is controversial and still requires further research. Multi-detector computed tomography is readily available in trauma centres, and simply neglecting its potential as a diagnostic tool in smoke inhalation injury would be unwise. PMID:17285330

  8. Complete Preoperative Evaluation of Pulmonary Atresia with Ventricular Septal Defect with Multi-Detector Computed Tomography

    PubMed Central

    Liu, Jingzhe; Li, Hongyin; Liu, Zhibo; Wu, Qingyu; Xu, Yufeng

    2016-01-01

    Objective To compare multi-detector computed tomography (MDCT) with cardiac catheterization and transthoracic echocardiography (TTE) in comprehensive evaluation of the global cardiovascular anatomy in patients with pulmonary atresia with ventricular septal defect (PA-VSD). Methods The clinical and imaging data of 116 patients with PA-VSD confirmed by surgery were reviewed. Using findings at surgery as the reference standard, data from MDCT, TTE and catheterization were reviewed for assessment of the native pulmonary vasculature and intracardiac defects. Results MDCT was more accurate than catheterization and TTE in identification of native pulmonary arteries. MDCT was also the most accurate test for delineation of the major aortopulmonary collateral arteries. Inter-modality agreement for evaluation of the overriding aorta and VSD was excellent for both. In the subgroup with surgical correlation, excellent agreement was found between TTE and surgery, and substantial agreement was found at MDCT. Conclusion MDCT can correctly delineate the native pulmonary vasculature and intracardiac defects and may be a reliable method for noninvasive assessment of global cardiovascular abnormalities in patients with PA-VSD. PMID:26741649

  9. Multi-Detector Row Computed Tomography Findings of Pelvic Congestion Syndrome Caused by Dilated Ovarian Veins

    PubMed Central

    Eren, Suat

    2010-01-01

    Objective: To evaluate the efficacy of multi-detector row CT (MDCT) on pelvic congestion syndrome (PCS), which is often overlooked or poorly visualized with routine imaging examination. Materials and Methods: We evaluated the MDCT features of 40 patients with PCS (mean age, 45 years; range, 29–60 years) using axial, coronal, sagittal, 3D volume-rendered, and maximum intensity projection (MIP) images. Results: MDCT revealed pelvic varices and ovarian vein dilatations in all patients. Bilateral ovarian vein dilatation was present in 25 patients, and 15 patients had unilateral dilatation. While 12 cases of secondary pelvic varices occurred simultaneously with a retroaortic left renal vein, 10 cases were due solely to a mass obstruction or stenosis of venous structures. Conclusion: MDCT is an effective tool in the evaluation of PCS, and it has more advantages than other imaging modalities. PMID:25610142

  10. Multi-detector computed tomography in the diagnosis and management of acute aortic syndromes

    PubMed Central

    Hallinan, James Thomas Patrick Decourcy; Anil, Gopinathan

    2014-01-01

    Acute aortic syndrome (AAS) is a spectrum of conditions that may ultimately progress to potentially life-threatening aortic rupture. This syndrome encompasses aortic dissection (AD), intramural haematoma, penetrating atherosclerotic ulcer and unstable thoracic aortic aneurysms. Multi-detector CT (MDCT) is crucial for the diagnosis of AAS, especially in the emergency setting, owing to its speed, accuracy and ready availability. This review addresses the value of appropriate imaging protocols in obtaining good-quality images that permit a confident diagnosis of AAS. AD is the most commonly encountered AAS and also the one with the greatest potential to cause a catastrophic outcome if not diagnosed and managed promptly. Hence, this review briefly addresses certain relevant clinical perspectives on this condition. Differentiating the false from the true lumen in AD is often essential; a spectrum of CT findings that allow such differentiation, e.g., the “beak sign” and aortic “cobwebs”, has been described with explicit illustrations. The value of non-enhanced CT scans, especially useful in the diagnosis of an intramural haematoma, has also been illustrated. Overlap in the clinical and imaging features of the various conditions presenting as AAS is not unusual. However, in most instances MDCT enables the right diagnosis. On select occasions MRI or trans-esophageal echocardiography may be required as a problem-solving tool. PMID:24976936

  11. A Study of Internal Thoracic Arteriovenous Principal Perforators by Using Multi-detector Row Computed Tomography Angiography

    PubMed Central

    Hashikawa, Kazunobu; Sakakibara, Shunsuke; Onishi, Hiroyuki; Terashi, Hiroto

    2016-01-01

    Objective: There are numerous reports of perforating branches from the intercostal spaces of the internal thoracic vessels. These branches have varying diameters, and a main perforating branch, the principal perforator, is most often found in the second or third intercostal space. We report different results based on multi-detector row computed tomography. Methods: We evaluated 121 sides from 70 women scheduled for breast reconstruction with free lower abdominal skin flaps who underwent preoperative multi-detector row computed tomographic scanning between June 2008 and June 2015. For primary reconstruction, we analyzed both sides, and for one-sided secondary reconstruction, we analyzed only the unaffected side. We evaluated both early arterial phase and late venous phase 5-mm horizontal, cross-sectional, and volume-rendering images for perforation sites and for the thickness of internal thoracic arteriovenous perforating branches in each intercostal space. We analyzed differences in thickness between the internal thoracic arteries and veins and symmetry in cases involving both sides. Results: Venous principal perforators nearly always perforated the same intercostal spaces as the accompanying veins of arterial principal perforators (99.2%), forming arteriovenous principal perforators. We found 49 principal perforators in the first intercostal space (37.4%), 52 in the second intercostal space (39.7%), 23 in the third intercostal space (17.6%), 6 in the fourth intercostal space (4.6%), and 1 in the fifth intercostal space (0.7%). Of the 51 cases in which we studied both sides, 25 cases (49%) had principal perforators with bilateral symmetry. Conclusions: In contrast to findings from past reports, we found that internal thoracic arteriovenous principal perforators were often present in almost equal numbers in the first and second intercostal spaces. PMID:26958104

  12. Role of Computer Aided Diagnosis (CAD) in the detection of pulmonary nodules on 64 row multi detector computed tomography

    PubMed Central

    Prakashini, K; Babu, Satish; Rajgopal, KV; Kokila, K Raja

    2016-01-01

    Aims and Objectives: To determine the overall performance of an existing CAD algorithm with thin-section computed tomography (CT) in the detection of pulmonary nodules and to evaluate detection sensitivity across a range of nodule density, size, and location. Materials and Methods: A cross-sectional prospective study was conducted on 20 patients with 322 suspected nodules who underwent diagnostic chest imaging using 64-row multi-detector CT. The examinations were evaluated on reconstructed images of 1.4 mm thickness and 0.7 mm interval. Detection of pulmonary nodules, initially by a radiologist with 2 years' experience (RAD) and later by CAD lung nodule software, was assessed. CAD nodule candidates were then accepted or rejected accordingly. Detected nodules were classified based on their size, density, and location. The performance of the RAD and the CAD system was compared with the gold standard, that is, true nodules confirmed by consensus of the senior RAD and CAD together. The overall sensitivity and false-positive (FP) rate of the CAD software were calculated. Observations and Results: Of the 322 suspected nodules, 221 were classified as true nodules on the consensus of the senior RAD and CAD together. Of the true nodules, 206 (93.2%) were detected by the RAD and 202 (91.4%) by the CAD. CAD and RAD together picked up more nodules than either CAD or RAD alone. Overall sensitivity for nodule detection with the CAD program was 91.4%, and FP detection per patient was 5.5%. The CAD showed comparatively higher sensitivity for nodules of size 4–10 mm (93.4%) and for nodules in hilar (100%) and central (96.5%) locations when compared to the RAD's performance. Conclusion: CAD performance was high in detecting pulmonary nodules, including small and low-density nodules. CAD, even with a relatively high FP rate, assists and improves the RAD's performance as a second reader, especially for nodules located in the central and hilar regions and for small nodules, while saving the RAD's time.
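
    The detection figures quoted above follow directly from the counts in the abstract. The short sketch below (Python) recomputes them; the false-positive count is a hypothetical illustration value, since the abstract reports only the per-patient rate, and all variable names are illustrative rather than part of any CAD software interface.

        # Recomputing the reported detection metrics from the abstract's counts.
        true_nodules = 221        # nodules confirmed by consensus of senior RAD and CAD
        cad_detected = 202        # true nodules flagged by the CAD software
        rad_detected = 206        # true nodules found by the radiologist
        patients = 20
        false_positives = 110     # hypothetical count, used only to show the arithmetic

        cad_sensitivity = cad_detected / true_nodules   # ~0.914, the reported 91.4%
        rad_sensitivity = rad_detected / true_nodules   # ~0.932, the reported 93.2%
        fp_per_patient = false_positives / patients     # false positives per patient

        print(f"CAD sensitivity {cad_sensitivity:.1%}, RAD sensitivity {rad_sensitivity:.1%}, "
              f"FP per patient {fp_per_patient:.1f}")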

  13. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography

    PubMed Central

    Dong, Zhi-hui; Yang, Zhi-gang; Chen, Tian-wu; Chu, Zhi-gang; Deng, Wen; Shao, Heng

    2011-01-01

    PURPOSE: Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the difference between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using multi-detector computed tomography (CT). METHODS: We retrospectively compared an earthquake-exposed cohort of 215 thoracic trauma crush victims of the Sichuan earthquake to a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries of the thoracic cage, pulmonary parenchyma and pleura using multi-detector CT. RESULTS: The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort, Risk Ratio (RR) = 2.2; p<0.001). Among these patients, those with more than 3 fractured ribs (106/143 vs. 41/66 patients, RR = 1.2; p<0.05) or flail chest (45/143 vs. 11/66 patients, RR = 1.9; p<0.05) were more frequently seen in the earthquake cohort. Earthquake-related crush injuries more frequently resulted in bilateral rib fractures (66/143 vs. 18/66 patients, RR = 1.7; p<0.01). Additionally, the incidence of non-rib fracture was higher in the earthquake cohort (85 vs. 60 patients, RR = 1.4; p<0.01). Pulmonary parenchymal and pleural injuries were more frequently seen in earthquake-related crush injuries (117 vs. 80 patients, RR = 1.5 for parenchymal and 146 vs. 74 patients, RR = 2.0 for pleural injuries; p<0.001). Non-rib fractures and pulmonary parenchymal and pleural injuries had a significant positive correlation with rib fractures in these two cohorts. CONCLUSIONS: Thoracic crush traumas resulting from the earthquake were life-threatening, with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures and pulmonary parenchymal and pleural injuries. PMID:21789386

  14. Evaluation of renal vascular anatomy in live renal donors: Role of multi detector computed tomography

    PubMed Central

    Pandya, Vaidehi Kumudchandra; Patel, Alpeshkumar Shakerlal; Sutariya, Harsh Chandrakant; Gandhi, Shruti Pradipkumar

    2016-01-01

    Background: Evaluation of renal vascular variations is important in renal donors to avoid vascular complications during surgery. Venous variations, mainly resulting from errors of embryological development, are frequently observed. Aim: This retrospective cross-sectional study aimed to investigate renal vascular variants with multidetector computed tomography (MDCT) angiography, to provide valuable information for surgery, and to correlate the findings with surgical findings. Materials and Methods: A total of 200 patients underwent MDCT angiography as part of the routine workup for live renal donors. The number, course, and drainage patterns of the renal veins were retrospectively assessed from the scans. Anomalies of the renal veins and inferior vena cava (IVC) were recorded and classified. Multiplanar reformations (MPRs), maximum intensity projections, and volume rendering were used for analysis. The results obtained were correlated surgically. Results: In the present study, out of 200 healthy donors, the standard pattern of drainage of the renal veins was observed in only 67% of donors on the right side and 92% of donors on the left side. Supernumerary renal veins in the form of dual and triple renal veins were seen on the right side in about 32.5% of donors (dual right renal veins in 30.5% of cases and triple right renal veins in 2.5% of cases). Variations on the left side were classified into four groups: supernumerary, retro-aortic, circumaortic, and plexiform left renal veins in 1%, 2.5%, 4%, and 0.5% of cases, respectively. Conclusions: Developmental variations in renal veins can be easily detected on computed tomography; if undiagnosed, they can go unnoticed and pose a fatal threat during major surgeries such as donor nephrectomy in otherwise healthy donors. PMID:27453646

  15. Criteria for establishing shielding of multi-detector computed tomography (MDCT) rooms.

    PubMed

    Verdun, F R; Aroua, A; Baechler, S; Schmidt, S; Trueb, P R; Bochud, F O

    2010-01-01

    The aim of this work is to compare two methods used for determining the proper shielding of computed tomography (CT) rooms while considering recent technological advances in CT scanners. The approaches of the German Institute for Standardisation and the US National Council on Radiation Protection and Measurements were compared, and a series of radiation measurements were performed in several CT rooms at the Lausanne University Hospital. The following three-step procedure is proposed for assuring sufficient shielding of rooms hosting new CT units with spiral mode acquisition and various X-ray beam collimation widths: (1) calculate the ambient equivalent dose for a representative average weekly dose-length product at the position where shielding is required; (2) from the maximum permissible weekly dose at the location of interest, calculate the transmission factor F required to ensure proper shielding; and (3) convert the transmission factor into a thickness of lead shielding. A similar approach could be adopted when designing shielding for fluoroscopy rooms, where the basic quantity would be the dose-area product instead of the tube current load (milliampere-minutes). PMID:20215444
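
    A minimal sketch of the three-step procedure, assuming a simple inverse-square scatter model and tenth-value-layer attenuation; every numerical constant below (weekly dose-length product, scatter-to-dose conversion factor, permissible weekly dose, tenth-value layer) is a hypothetical placeholder, not a value from the paper or from the DIN/NCRP documents it compares.

        import math

        def required_lead_thickness_mm(
            weekly_dlp_mgy_cm=50000.0,        # representative average weekly DLP
            scatter_factor=3.0e-4,            # hypothetical scatter-to-dose conversion factor
            distance_m=2.0,                   # isocentre-to-wall distance
            permissible_weekly_dose_mgy=0.02, # maximum permissible weekly dose at that point
            tvl_lead_mm=0.25,                 # hypothetical tenth-value layer of lead
        ):
            # Step 1: unshielded ambient equivalent dose at the point of interest.
            unshielded_dose = scatter_factor * weekly_dlp_mgy_cm / distance_m**2
            # Step 2: transmission factor F needed to stay below the permissible dose.
            transmission = permissible_weekly_dose_mgy / unshielded_dose
            if transmission >= 1.0:
                return 0.0  # no shielding required
            # Step 3: convert F into a lead thickness, assuming tenth-value-layer attenuation.
            return tvl_lead_mm * math.log10(1.0 / transmission)

        print(f"Required lead: {required_lead_thickness_mm():.2f} mm")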

  16. Relevant incidental findings at abdominal multi-detector contrast-enhanced computed tomography: A collateral screening?

    PubMed Central

    Sconfienza, Luca Maria; Mauri, Giovanni; Muzzupappa, Claudia; Poloni, Alessandro; Bandirali, Michele; Esseridou, Anastassia; Tritella, Stefania; Secchi, Francesco; Di Leo, Giovanni; Sardanelli, Francesco

    2015-01-01

    AIM: To investigate the prevalence of relevant incidental findings (RIFs) detected during routine abdominal contrast-enhanced computed tomography (CeCT). METHODS: We retrospectively evaluated the reports of a consecutive series of abdominal CeCT studies performed between January and May 2013. For each report, the patient's age and sex, admission as inpatient or outpatient, clinical suspicion as indicated by the requesting physician, availability of a previous abdominal examination, and name of the reporting radiologist were recorded. Based on the clinical suspicion, the presence and features of any RIFs (if needing additional workup) were noted. RESULTS: One thousand forty abdominal CeCT examinations were performed in 949 patients (528 males, mean age 66 ± 14 years). No significant difference in age or sex distribution was found between inpatients and outpatients (P > 0.472). RIFs were found in 195/1040 (18.8%) CeCT examinations [inpatients = 108/470 (23.0%); outpatients = 87/570 (15.2%); P = 0.002]. RIFs were found in 30/440 (6.8%) CeCT examinations with a previous exam and in 165/600 (27.5%) without a previous exam (P < 0.001). The distribution of radiologists between inpatients and outpatients was significantly different (P < 0.001). RIF prevalence increased with age, except for a peak in the 40-49 year group. The most commonly involved organs were the kidneys, gallbladder, and lungs. CONCLUSION: A RIF is detected in one of five patients undergoing abdominal CeCT. The risk of overdiagnosis should be taken into account. PMID:26516432

  17. The role of 64-slice multi-detector computed tomography in the detection of subclinical atherosclerosis of the coronary artery.

    PubMed

    Jeong, Hae Chang; Ahn, Youngkeun; Ko, Jum Suk; Lee, Min Goo; Sim, Doo Sun; Park, Keun Ho; Yoon, Nam Sik; Youn, Hyun Ju; Hong, Young Joon; Kim, Kye Hun; Park, Hyung Wook; Kim, Ju Han; Kim, Yun-Hyeon; Jeong, Myung Ho; Cho, Jeong Gwan; Park, Jong Chun; Kang, Jung Chaee

    2010-12-01

    Multi-detector computed tomography (MDCT) has been used for detecting or excluding coronary atherosclerotic stenosis in symptomatic patients. However, the role of MDCT in routine medical examination of asymptomatic, high-risk patients has not been established. We therefore conducted the present study to test the hypothesis that MDCT could be a valuable method for detecting subclinical coronary artery stenosis in asymptomatic patients. An observational, retrospective, single-centre study was conducted in a cohort of 1,529 patients (mean age, 56.4 ± 8.3 years; 1,353 males) who had undergone MDCT as part of their general medical checkups from November 2005 to April 2008. Patients with a past history of coronary artery disease, typical chest pain, or evidence of myocardial ischemia were excluded. During clinical follow-up of these patients, the incidence of subclinical coronary stenosis and the usefulness of MDCT for routine medical examination in asymptomatic patients were investigated. Of the 1,529 enrolled patients, 42.3% had hypertension, 13.5% had diabetes mellitus, 7.7% had hyperlipidemia, and 40.4% were current smokers. Abnormal MDCT findings were noted in 560 (36.6%) patients, who were classified into two groups. One group had coronary calcium with a luminal diameter stenosis of the coronary artery of <50% (n = 508, 33.2%). These patients were treated with medication or clinical follow-up. The other group had a luminal diameter stenosis of the coronary artery of ≥50% with or without coronary calcium (n = 52, 3.4%). These patients underwent conventional coronary angiography and intravascular ultrasound. A total of 29 of the 1,529 patients (1.9%) presented with insignificant stenosis or myocardial bridge, and 23 patients (1.5%) presented with significant stenosis. The patients with significant stenosis underwent percutaneous coronary intervention (PCI) with stent implantation. Major adverse cardiac events occurred

  18. Incidental finding of a papillary fibroelastoma on the aortic valve in 16 slice multi-detector row computed tomography.

    PubMed

    Bootsveld, A; Puetz, J; Grube, E

    2004-06-01

    Papillary fibroelastoma (PFE) is a benign, rare, gelatinous tumour derived from the endocardium, primarily the cardiac valves, which is usually diagnosed by high-resolution echocardiography. Although rarely clinically symptomatic, PFEs have a potential for coronary ischaemia, systemic embolisation with neurologic symptoms, and sometimes valvar dysfunction. There are reports of coronary occlusion and even sudden cardiac death due to a ball-valve phenomenon at the coronary ostia. This report describes the characteristics of a PFE on multidetector 16-slice computed tomography and 1.5 Tesla cardiac magnetic resonance imaging. PMID:15145899

  19. Perforated duodenal ulcer presenting with a subphrenic abscess revealed by plain abdominal X-ray films and confirmed by multi-detector computed tomography: a case report

    PubMed Central

    2013-01-01

    Introduction Peptic ulcer disease is still the major cause of gastrointestinal perforation despite major improvements in both diagnostic and therapeutic strategies. While the diagnosis of a perforated ulcer is straightforward in typical cases, its clinical onset may be subtle because of comorbidities and/or concurrent therapies. Case presentation We report the case of a 53-year-old Caucasian man with a history of chronic myeloid leukemia on maintenance therapy with imatinib (100 mg/day) who was found to have a subphrenic abscess resulting from a perforated duodenal ulcer that had been clinically overlooked. Our patient was febrile (38.5°C) with abdominal tenderness and hypoactive bowel sounds. On plain abdominal X-ray films, a right subphrenic abscess could be seen. On contrast-enhanced multi-detector computed tomography, a huge air-fluid collection extending from the subphrenic to the subhepatic anterior space was observed. After oral administration of 500 cm3 of 3% diluted diatrizoate meglumine, extraluminal leakage of the water-soluble iodinated contrast medium could be appreciated as a result of the perforated duodenal ulcer. During surgery, the abscess was drained and extensive adhesiolysis had to be performed to expose the duodenal bulb, where the ulcer was first identified by methylene blue administration and then sutured. Conclusions While subphrenic abscesses are well-known complications of perforated gastric or duodenal ulcers, they have nowadays become rare thanks to advances in both diagnostic and therapeutic strategies for peptic ulcer disease. However, when peptic ulcer disease is not clinically suspected, the contribution of imaging may be substantial. PMID:24215711

  20. Improved vessel morphology measurements in contrast-enhanced multi-detector computed tomography coronary angiography with non-linear post-processing.

    PubMed

    Ferencik, Maros; Lisauskas, Jennifer B; Cury, Ricardo C; Hoffmann, Udo; Abbara, Suhny; Achenbach, Stephan; Karl, W Clem; Brady, Thomas J; Chan, Raymond C

    2006-03-01

    Multi-detector computed tomography (MDCT) permits detection of coronary plaque. However, noise and blurring impair the accuracy and precision of plaque measurements. The aim of the study was to evaluate MDCT post-processing based on non-linear image deblurring and edge-preserving noise suppression for measurements of plaque size. Contrast-enhanced MDCT coronary angiography was performed in four subjects (mean age 55 ± 5 years, mean heart rate 54 ± 5 bpm) using a 16-slice scanner (Siemens Sensation 16, collimation 16 × 0.75 mm, gantry rotation 420 ms, tube voltage 120 kV, tube current 550 mAs, 80 mL of contrast). Intravascular ultrasound (IVUS; 40 MHz probe) was performed in one vessel in each patient and served as a reference standard. MDCT vessel cross-sectional images (1 mm thickness) were created perpendicular to the centerline and aligned with corresponding IVUS images. MDCT images were processed using a deblurring and edge-preserving noise suppression algorithm. Three independent blinded observers then segmented lumen and outer vessel boundaries in each modality to obtain vessel cross-sectional area and wall area in the unprocessed MDCT cross-sections, post-processed MDCT cross-sections and corresponding IVUS. The wall area measurement difference for unprocessed and post-processed MDCT images relative to IVUS was 0.4 ± 3.8 mm² and -0.2 ± 2.2 mm² (p < 0.05), respectively. Similarly, Bland-Altman analysis of vessel cross-sectional area from unprocessed and post-processed MDCT images relative to IVUS showed a measurement difference of 1.0 ± 4.4 and 0.6 ± 4.8 mm², respectively. In conclusion, MDCT permitted accurate in vivo measurement of wall area and vessel cross-sectional area as compared to IVUS. Post-processing to reduce blurring and noise reduced the variability of wall area measurements and reduced measurement bias for both wall area and vessel cross-sectional area. PMID:16442768
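
    The abstract does not describe the post-processing algorithm itself, so the sketch below is a generic stand-in for edge-preserving noise suppression (Perona-Malik anisotropic diffusion), included only to illustrate the idea of smoothing homogeneous regions while leaving vessel-wall edges intact; it is not the authors' method, and the parameters are assumed.

        import numpy as np

        def anisotropic_diffusion(image, n_iter=20, kappa=30.0, gamma=0.15):
            """Basic Perona-Malik diffusion: smooth flat regions, preserve edges."""
            img = image.astype(float).copy()
            for _ in range(n_iter):
                # finite differences towards the four neighbours (wrap-around borders)
                d_n = np.roll(img, -1, axis=0) - img
                d_s = np.roll(img, 1, axis=0) - img
                d_e = np.roll(img, -1, axis=1) - img
                d_w = np.roll(img, 1, axis=1) - img
                # conductance is small across strong gradients, so edges are kept
                c = lambda d: np.exp(-(d / kappa) ** 2)
                img += gamma * (c(d_n) * d_n + c(d_s) * d_s + c(d_e) * d_e + c(d_w) * d_w)
            return img

        # usage on a synthetic noisy 2-D cross-section (values in HU)
        noisy = np.random.normal(100.0, 15.0, size=(128, 128))
        smoothed = anisotropic_diffusion(noisy)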

  1. Magnetic resonance imaging and multi-detector computed tomography assessment of extracellular compartment in ischemic and non-ischemic myocardial pathologies

    PubMed Central

    Saeed, Maythem; Hetts, Steven W; Jablonowski, Robert; Wilson, Mark W

    2014-01-01

    Myocardial pathologies are major causes of morbidity and mortality worldwide. Early detection of loss of cellular integrity and expansion in extracellular volume (ECV) in myocardium is critical to initiate effective treatment. The three compartments in healthy myocardium are: intravascular (approximately 10% of tissue volume), interstitial (approximately 15%) and intracellular (approximately 75%). Myocardial cells, fibroblasts and vascular endothelial/smooth muscle cells make up the intracellular compartment, and the main proteins in the interstitium are types I/III collagens. Microscopic studies have shown that expansion of ECV is an important feature of diffuse physiologic fibrosis (e.g., aging and obesity) and pathologic fibrosis [heart failure, aortic valve disease, hypertrophic cardiomyopathy, myocarditis, dilated cardiomyopathy, amyloidosis, congenital heart disease, aortic stenosis, restrictive cardiomyopathy (hypereosinophilic and idiopathic types), arrhythmogenic right ventricular dysplasia and hypertension]. This review addresses recent advances in the measurement of ECV in ischemic and non-ischemic myocardial pathologies. Magnetic resonance imaging (MRI) has the ability to characterize tissue proton relaxation times (T1, T2, and T2*). Proton relaxation times reflect the physical and chemical environments of water protons in myocardium. Delayed contrast-enhanced MRI (DE-MRI) and multi-detector computed tomography (DE-MDCT) demonstrate hyper-enhanced infarct, hypo-enhanced microvascular obstruction zone and moderately enhanced peri-infarct zone, but are limited for visualizing diffuse fibrosis and patchy microinfarct despite the increase in ECV. ECV can be measured on equilibrium contrast-enhanced MRI/MDCT and MRI longitudinal relaxation time mapping. Equilibrium contrast-enhanced MRI/MDCT and MRI T1 mapping is currently used, but at a lower scale, as an alternative to invasive sub-endomyocardial biopsies to eliminate the need for anesthesia, coronary

  2. Image quality and radiation reduction of 320-row area detector CT coronary angiography with optimal tube voltage selection and an automatic exposure control system: comparison with body mass index-adapted protocol.

    PubMed

    Lim, Jiyeon; Park, Eun-Ah; Lee, Whal; Shim, Hackjoon; Chung, Jin Wook

    2015-06-01

    To assess the image quality and radiation exposure of 320-row area detector computed tomography (320-ADCT) coronary angiography with optimal tube voltage selection under the guidance of an automatic exposure control system, in comparison with a body mass index (BMI)-adapted protocol. Twenty-two patients (study group) underwent 320-ADCT coronary angiography using an automatic exposure control system with a target standard deviation value of 33 as the image quality index and the lowest possible tube voltage. For comparison, a sex- and BMI-matched group (control group, n = 22) using a BMI-adapted protocol was established. Images of both groups were reconstructed with an iterative reconstruction algorithm. For objective evaluation of image quality, image noise, vessel density, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) were measured. Two blinded readers then subjectively graded the image quality using a four-point scale (1: nondiagnostic to 4: excellent). Radiation exposure was also measured. Although the study group tended to show higher image noise (14.1 ± 3.6 vs. 9.3 ± 2.2 HU, P = 0.111) and higher vessel density (665.5 ± 161 vs. 498 ± 143 HU, P = 0.430) than the control group, the differences were not significant. There was no significant difference between the two groups in SNR (52.5 ± 19.2 vs. 60.6 ± 21.8, P = 0.729), CNR (57.0 ± 19.8 vs. 67.8 ± 23.3, P = 0.531), or subjective image quality scores (3.47 ± 0.55 vs. 3.59 ± 0.56, P = 0.960). However, radiation exposure was significantly reduced by 42% in the study group (1.9 ± 0.8 vs. 3.6 ± 0.4 mSv, P = 0.003). Optimal tube voltage selection with the guidance of an automatic exposure control system in 320-ADCT coronary angiography allows substantial radiation reduction without significant impairment of image quality, compared to the results obtained using a BMI-based protocol. PMID:25604967
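
    The objective metrics above are simple region-of-interest statistics. The sketch below shows how SNR and CNR are typically formed from ROI measurements; only the vessel density and noise figures echo the study-group means, the background attenuation is an assumed value, and the study's exact ROI definitions may differ, so it does not reproduce the reported SNR and CNR.

        vessel_hu = 665.5        # mean attenuation in a contrast-filled vessel ROI (HU)
        background_hu = 80.0     # mean attenuation in adjacent tissue (HU), assumed
        noise_sd = 14.1          # standard deviation in a homogeneous ROI (HU)

        snr = vessel_hu / noise_sd                    # signal-to-noise ratio
        cnr = (vessel_hu - background_hu) / noise_sd  # contrast-to-noise ratio

        print(f"SNR = {snr:.1f}, CNR = {cnr:.1f}")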

  3. Multiplanar and three-dimensional multi-detector row CT of thoracic vessels and airways in the pediatric population.

    PubMed

    Siegel, Marilyn J

    2003-12-01

    Multi-detector row computed tomography (CT) has changed the approach to imaging of thoracic anatomy and disease in the pediatric population. At the author's institution, multi-detector row CT with multiplanar and three-dimensional reconstruction has become an important examination in the evaluation of systemic and pulmonary vasculature and the tracheobronchial tree. In some clinical situations, multi-detector row CT with reformatted images is obviating conventional angiography, which is associated with higher radiation doses and longer sedation times. Although multi-detector row CT with multiplanar and three-dimensional reconstruction is expanding the applications of CT of the thorax, its role as a diagnostic tool still needs to be better defined. The purposes of this article are to describe how to perform multi-detector row CT with multiplanar and three-dimensional reconstruction in young patients, to discuss various reconstruction techniques available, and to discuss applications in the evaluation of vascular and airways diseases. PMID:14563904

  4. Gd-EOB-DTPA-enhanced 3.0-Tesla MRI findings for the preoperative detection of focal liver lesions: Comparison with iodine-enhanced multi-detector computed tomography

    NASA Astrophysics Data System (ADS)

    Park, Hyong-Hu; Goo, Eun-Hoe; Im, In-Chul; Lee, Jae-Seung; Kim, Moon-Jib; Kwak, Byung-Joon; Chung, Woon-Kwan; Dong, Kyung-Rae

    2012-12-01

    The safety of gadolinium-ethoxybenzyl-diethylenetriamine-pentaacetic-acid (Gd-EOB-DTPA) has been confirmed, but more study is needed to assess the diagnostic accuracy of Gd-EOB-DTPA-enhanced magnetic resonance imaging (MRI) in patients with a hepatocellular carcinoma (HCC) for whom surgical treatment is considered or with a metastatic hepatoma. Research is also needed to examine the rate of detection of hepatic lesions compared with multi-detector computed tomography (MDCT), which is used most frequently to localize and characterize a HCC. Gd-EOB-DTPA-enhanced MRI and iodine-enhanced MDCT imaging were compared for the preoperative detection of focal liver lesions, and the clinical usefulness of each method was examined. The current study enrolled 79 patients with focal liver lesions who preoperatively underwent MRI and MDCT; in these patients, less than one month elapsed between the two examinations. Imaging data were acquired before and after contrast enhancement in both methods. To evaluate the images, we analyzed the signal-to-noise ratio (SNR) and the contrast-to-noise ratio (CNR) in the lesions and the liver parenchyma. To compare the sensitivity of the two methods, we performed a quantitative analysis of the percentage signal intensity of the liver (PSIL) on a high-resolution picture archiving and communication system (PACS) monitor (paired-samples t-test, p < 0.05). The enhancement was evaluated based on a consensus of four observers. The enhancement pattern and the morphological features during the arterial and the delayed phases were correlated between the Gd-EOB-DTPA-enhanced MRI findings and the iodine-enhanced MDCT findings by using an adjusted χ² test. The SNRs, CNRs, and PSIL all indicated a greater detection rate for Gd-EOB-DTPA-enhanced MRI than for iodine-enhanced MDCT. Hepatocyte-selective uptake was observed 20 minutes after injection in focal nodular hyperplasia (FNH, 9/9), adenoma (9/10), and highly differentiated HCC (grade G1, 27/30). Rim

  5. Multi-detector row CT of pancreatic islet cell tumors.

    PubMed

    Horton, Karen M; Hruban, Ralph H; Yeo, Charles; Fishman, Elliot K

    2006-01-01

    Pancreatic islet cell tumors (ICTs) are neuroendocrine neoplasms that produce and secrete hormones to a variable degree. These neoplasms can present a diagnostic challenge, both clinically and radiologically. ICTs can be classified as either syndromic or nonsyndromic on the basis of their clinical manifestations. Multi-detector row computed tomography (CT) plays an important role in the diagnosis and staging of both syndromic and nonsyndromic ICTs. In general, syndromic ICTs are less than 3 cm in size. They are typically hyperenhancing and are usually best seen on CT scans obtained during the arterial phase. Nonsyndromic ICTs tend to be larger than syndromic ICTs at presentation and are more likely to be cystic or necrotic. It is important for the radiologist to be familiar with appropriate CT protocol for the evaluation of patients with suspected pancreatic ICT and to understand the variable CT appearances of these neoplasms. PMID:16549609

  6. Relationship between epicardial fat and quantitative coronary artery plaque progression: insights from computer tomography coronary angiography.

    PubMed

    Psaltis, Peter J; Talman, Andrew H; Munnur, Kiran; Cameron, James D; Ko, Brian S H; Meredith, Ian T; Seneviratne, Sujith K; Wong, Dennis T L

    2016-02-01

    Epicardial fat volume (EFV) has been suggested to promote atherosclerotic plaque development in coronary arteries, and has been correlated with both coronary stenosis and acute coronary events. Although associated with progression of coronary calcification burden, a relationship with progression of coronary atheroma volume has not previously been tested. We studied patients who had clinically indicated serial 320-row multi-detector computed tomography coronary angiography with a median 25-month interval. EFV was measured at baseline and follow-up. In vessels with coronary stenosis, quantitative analysis was performed to measure atherosclerotic plaque burden, volume and aggregate plaque volume at baseline and follow-up. The study comprised 64 patients (58.4 ± 12.2 years, 27 males, 192 vessels, 193 coronary segments). 79 (41%) coronary segments had stenosis at baseline. Stenotic segments were associated with greater baseline EFV than those without coronary stenosis (117.4 ± 45.1 vs. 102.3 ± 51.6 cm³, P = 0.046). 46 (24%) coronary segments displayed either new plaque formation or progression of adjusted plaque burden at follow-up. These were associated with higher baseline EFV than segments without stenosis or segments whose stenoses did not progress (128.7 vs. 101.0 vs. 106.7 cm³, respectively, P = 0.006). On multivariate analysis, baseline EFV was the only independent predictor of coronary atherosclerotic plaque progression or new plaque development (P = 0.014). High baseline EFV is associated with the presence of coronary artery stenosis and with plaque volume progression. Accumulation of EFV may be implicated in the evolution and progression of coronary atheroma. PMID:26335371

  7. 64 Slice multi-detector row cardiac CT.

    PubMed

    Pannu, Harpreet K; Johnson, Pamela T; Fishman, Elliot K

    2009-01-01

    Cardiac imaging is feasible with multi-detector row CT (MDCT) scanners. Coronary arterial anatomy and both non-calcified and calcified plaques are depicted at CT coronary angiography. Vessel wall pathology and luminal diameter are depicted, and secondary myocardial changes may also be seen. Diagnostic capacity has increased with technological advancement, and preliminary investigations confirm the utility of 64-MDCT in low- and intermediate-risk patients who present to the emergency department with acute chest pain. The clinical indications, 64-MDCT technique, and MDCT findings in coronary artery disease are reviewed. PMID:18941811

  8. Toroid cavity/coil NMR multi-detector

    DOEpatents

    Gerald, II, Rex E.; Meadows, Alexander D.; Gregar, Joseph S.; Rathke, Jerome W.

    2007-09-18

    An analytical device for rapid, non-invasive nuclear magnetic resonance (NMR) spectroscopy of multiple samples using a single spectrometer is provided. A modified toroid cavity/coil detector (TCD) and methods for the simultaneous acquisition of NMR data from multiple samples, including a protocol for testing NMR multi-detectors, are provided. One embodiment includes a plurality of LC resonant circuits comprising spatially separated toroid coil inductors, each toroid coil inductor enveloping its corresponding sample volume and tuned to resonate at a predefined frequency using a variable capacitor. The toroid coil is formed into a loop, where both ends of the toroid coil are brought into coincidence. Another embodiment includes multiple micro Helmholtz coils arranged on a circular perimeter concentric with a central conductor of the toroid cavity.
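
    As a rough illustration of the tuning step described above, each LC circuit resonates at f = 1/(2π√(LC)), so the variable capacitor is set to match the coil inductance to the observation frequency. Both numbers below are hypothetical and are not taken from the patent.

        import math

        def tuning_capacitance_f(target_freq_hz, inductance_h):
            """Capacitance (farads) that makes an LC circuit resonate at target_freq_hz."""
            return 1.0 / ((2.0 * math.pi * target_freq_hz) ** 2 * inductance_h)

        f_obs = 300.0e6       # hypothetical observation frequency (e.g. 1H near 7 T)
        l_coil = 100.0e-9     # hypothetical toroid coil inductance, 100 nH

        print(f"Tuning capacitance: {tuning_capacitance_f(f_obs, l_coil) * 1e12:.1f} pF")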

  9. Contrast enhanced multi-detector CT and MR findings of a well-differentiated pancreatic vipoma

    PubMed Central

    Camera, Luigi; Severino, Rosa; Faggiano, Antongiulio; Masone, Stefania; Mansueto, Gelsomina; Maurea, Simone; Fonti, Rosa; Salvatore, Marco

    2014-01-01

    Pancreatic vipoma is an extremely rare tumor, accounting for less than 2% of endocrine pancreatic neoplasms with a reported incidence of 0.1-0.6 per million. While cross-sectional imaging findings are usually not specific, exact localization of the tumor by means of either computed tomography (CT) or magnetic resonance (MR) imaging is pivotal for surgical planning; further characterization of the tumor may only be achieved by somatostatin-receptor scintigraphy (SRS). We report the case of a 70-year-old female with a two-year history of watery diarrhoea who was found to have a solid, inhomogeneously enhancing lesion at the level of the pancreatic tail on gadolinium-enhanced MR (Somatom Trio 3T, Siemens, Germany). The tumor had been prospectively overlooked on a contrast-enhanced multi-detector CT (Aquilion 64, Toshiba, Japan) performed after i.v. bolus injection of only 100 cc of iodinated non-ionic contrast medium because of chronic renal failure (3.4 mg/mL), but it was subsequently confirmed by SRS. The patient first underwent successful symptomatic treatment with somatostatin analogues and was then submitted to a distal pancreatectomy with splenectomy to remove a capsulated whitish tumor, which turned out to be a well-differentiated vipoma on histological and immunohistochemical analysis. PMID:25349667

  10. Multi-detector CT features of acute intestinal ischemia and their prognostic correlations

    PubMed Central

    Moschetta, Marco; Telegrafo, Michele; Rella, Leonarda; Stabile Ianora, Amato Antonio; Angelelli, Giuseppe

    2014-01-01

    Acute intestinal ischemia is an abdominal emergency occurring in nearly 1% of patients presenting with acute abdomen. The causes can be occlusive or non-occlusive. Early diagnosis is important to improve survival rates. In most cases of late or missed diagnosis, the mortality rate from intestinal infarction is very high, with reported values ranging from 60% to 90%. Multi-detector computed tomography (MDCT) is a fundamental imaging technique that must be promptly performed in all patients with suspected bowel ischemia. Thanks to new dedicated reconstruction programs, its diagnostic potential is much improved compared to the past and is currently superior to that of any other noninvasive technique. The increased spatial and temporal resolution, high-quality multi-planar reconstructions, maximum intensity projections, vessel probe, surface-shaded volume rendering and tissue transition projections make MDCT the gold standard for the diagnosis of intestinal ischemia, with reported sensitivity, specificity, and positive and negative predictive values of 64%-93%, 92%-100%, 90%-100% and 94%-98%, respectively. MDCT contributes to appropriate treatment planning and provides important prognostic information thanks to its ability to define the nature and extent of the disease. The purpose of this review is to examine the diagnostic and prognostic role of MDCT in bowel ischemia, with special regard to state-of-the-art reconstruction software. PMID:24876917

  11. Novel ultrahigh resolution data acquisition and image reconstruction for multi-detector row CT

    SciTech Connect

    Flohr, T. G.; Stierstorfer, K.; Suess, C.; Schmidt, B.; Primak, A. N.; McCollough, C. H.

    2007-05-15

    We present and evaluate a special ultrahigh-resolution mode providing considerably enhanced spatial resolution, both in the scan plane and in the z-axis direction, for a routine medical multi-detector row computed tomography (CT) system. Data acquisition is performed using a flying focal spot both in the scan plane and in the z-axis direction, in combination with tantalum grids inserted in front of the multi-row detector to reduce the aperture of the detector elements both in-plane and in the z-axis direction. The dose utilization of the system for standard applications is not affected, since the grids are moved into place only when needed and are removed for standard scanning. By means of this technique, image slices with a nominal section width of 0.4 mm (measured full width at half maximum = 0.45 mm) can be reconstructed in spiral mode on a CT system with a detector configuration of 32 × 0.6 mm. The measured 2% value of the in-plane modulation transfer function (MTF) is 20.4 lp/cm; the measured 2% value of the longitudinal (z-axis) MTF is 21.5 lp/cm. In a resolution phantom with metal line-pair test patterns, spatial resolution of 20 lp/cm can be demonstrated both in the scan plane and along the z-axis. This corresponds to an object size of 0.25 mm that can be resolved. The new mode is intended for ultrahigh-resolution bone imaging, in particular for wrist, joint, and inner ear studies, where a higher level of image noise due to the reduced aperture is an acceptable trade-off for the clinical benefit brought about by the improved spatial resolution.
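
    The stated correspondence between 20 lp/cm and a resolvable object of 0.25 mm follows from the bar-pattern geometry: one line pair spans 1/20 cm and the resolvable object is one bar, i.e. half a line pair. A short sketch of the arithmetic:

        def resolvable_object_mm(lp_per_cm):
            line_pair_width_mm = 10.0 / lp_per_cm   # width of one line pair, in mm
            return line_pair_width_mm / 2.0          # one bar of the pair

        print(resolvable_object_mm(20.0))   # 0.25 mm, as quoted for the 20 lp/cm pattern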

  12. Multi-detector row CT scanning in Paleoanthropology at various tube current settings and scanning mode.

    PubMed

    Badawi-Fayad, J; Yazbeck, C; Balzeau, A; Nguyen, T H; Istoc, A; Grimaud-Hervé, D; Cabanis, E- A

    2005-12-01

    The purpose of this study was to determine the optimal tube current setting and scanning mode for hominid fossil skull scanning using multi-detector row computed tomography (CT). Four fossil skulls (La Ferrassie 1, Abri Pataud 1, Cro-Magnon 2 and Cro-Magnon 3) were examined using the CT scanner LightSpeed 16 (General Electric Medical Systems) with varying dose per section (160, 250, and 300 mAs) and scanning mode (helical and conventional). Image quality of two-dimensional (2D) multiplanar reconstructions, three-dimensional (3D) reconstructions and native images was assessed by four reviewers using a four-point grading scale. An ANOVA (analysis of variance) model was used to compare the mean score for each sequence and the overall mean score according to the levels of the scanning parameters. Compared with helical CT (mean score = 12.03), the conventional technique showed consistently poor image quality (mean score = 4.17). With the helical mode, we observed better image quality at 300 mAs than at 160 mAs in the 3D sequences (P = 0.03), whereas in native images a reduction in the effective tube current induced no degradation in image quality (P = 0.05). Our study suggests a standardized protocol for fossil scanning with a 16 × 0.625 detector configuration, a 10 mm beam collimation, a 0.562:1 acquisition mode, a 0.625/0.4 mm slice thickness/reconstruction interval, a pitch of 5.62, 120 kV and 300 mAs, especially when a 3D study is required. PMID:16211320

  13. Validity of blood flow measurement using 320 multi-detectors CT and first-pass distribution theory: a phantom study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Yu, Xuefang; Xu, Shaopeng; Zhou, Kenneth J.

    2015-03-01

    To evaluate the feasibility of measuring myocardial blood flow using 320-row detector CT and the first-pass technique, a phantom study was performed. The heart was simulated with a container filled with tubing 3 mm in diameter; the coronary artery was simulated with a tube 2 cm in diameter connected to the simulated heart. The simulated coronary artery was connected to a large container holding 1500 ml of saline and 150 ml of contrast agent. A pump connected to the simulated heart withdrew fluid at 10 ml/min, 15 ml/min, 20 ml/min, 25 ml/min and 30 ml/min. The first CT scan started 30 s after withdrawal began at each speed, and the second CT scan started 5 s after the first. CT images were processed as follows: the first-scan images were subtracted from the second-scan images, the increase in CT value of the simulated heart and the CT value per unit volume of the simulated coronary artery were calculated, and the total inflow of myocardial blood flow was then computed. The CT-derived myocardial blood flows were 0.94 ml/s, 2.09 ml/s, 2.74 ml/s, 4.18 ml/s and 4.86 ml/s. The correlation coefficient is 0.994 and r² = 0.97. Measuring myocardial blood flow with 320-row detector CT using two scans is feasible, and it may allow the development of a new method for quantitative, functional assessment of myocardial perfusion blood flow with a lower radiation dose.
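
    A minimal sketch of the first-pass calculation as described above: the enhancement accumulated in the simulated heart between the two scans, divided by the arterial CT value per unit volume and by the 5 s scan interval, gives the inflow rate. The arrays and numbers below are synthetic placeholders, not the phantom data.

        import numpy as np

        def first_pass_flow_ml_per_s(heart_scan1, heart_scan2, voxel_volume_ml, artery_hu, dt_s=5.0):
            # total enhancement accumulated in the heart between the two scans (HU * ml)
            delta_hu = heart_scan2.astype(float) - heart_scan1.astype(float)
            accumulated = delta_hu.sum() * voxel_volume_ml
            # divide by the arterial CT value per unit volume and the interval -> ml/s
            return accumulated / (artery_hu * dt_s)

        # synthetic example: uniform 10 HU rise over a 150 ml simulated heart,
        # arterial enhancement of 300 HU, 5 s between the two scans
        heart1 = np.zeros((50, 50, 60))
        heart2 = heart1 + 10.0
        voxel_ml = 150.0 / heart1.size
        print(first_pass_flow_ml_per_s(heart1, heart2, voxel_ml, artery_hu=300.0))  # 1.0 ml/s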

  14. Quantitative Features of Liver Lesions, Lung Nodules, and Renal Stones at Multi-Detector Row CT Examinations: Dependency on Radiation Dose and Reconstruction Algorithm.

    PubMed

    Solomon, Justin; Mileto, Achille; Nelson, Rendon C; Roy Choudhury, Kingshuk; Samei, Ehsan

    2016-04-01

    Purpose To determine whether radiation dose and reconstruction algorithm affect the computer-based extraction and analysis of quantitative imaging features in lung nodules, liver lesions, and renal stones at multi-detector row computed tomography (CT). Materials and Methods Retrospective analysis of data from a prospective, multicenter, HIPAA-compliant, institutional review board-approved clinical trial was performed by extracting 23 quantitative imaging features (size, shape, attenuation, edge sharpness, pixel value distribution, and texture) of lesions on multi-detector row CT images of 20 adult patients (14 men, 6 women; mean age, 63 years; range, 38-72 years) referred for known or suspected focal liver lesions, lung nodules, or kidney stones. Data were acquired between September 2011 and April 2012. All multi-detector row CT scans were performed at two different radiation dose levels; images were reconstructed with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction (MBIR) algorithms. A linear mixed-effects model was used to assess the effect of radiation dose and reconstruction algorithm on the extracted features. Results Among the 23 imaging features assessed, radiation dose had a significant effect on five, three, and four of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Adaptive statistical iterative reconstruction had a significant effect on three, one, and one of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). MBIR had a significant effect on nine, 11, and 15 of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Of note, the measured sizes of lung nodules and renal stones with MBIR were significantly different from those obtained with the other two algorithms (P < .002 for all comparisons). Although lesion texture was

  15. Preoperative Gross Classification of Gastric Adenocarcinoma: Comparison of Double Contrast-Enhanced Ultrasound and Multi-Detector Row CT.

    PubMed

    Yan, Caoxin; Bao, Xiaofeng; Shentu, Weihui; Chen, Jian; Liu, Chunmei; Ye, Qin; Wang, Liuhong; Tan, Yangbin; Huang, Pintong

    2016-07-01

    The aim of this study was to compare the accuracy of multi-detector computed tomography (MDCT) with that of double contrast-enhanced ultrasound (DCEUS), in which intravenous microbubbles are used alongside oral contrast-enhanced ultrasound, in determining the gross classification of gastric carcinoma (GC). Altogether, 239 patients with GC proved by histology after endoscopic biopsy were included in this study. DCEUS and MDCT were performed pre-operatively. The diagnostic accuracies of DCEUS and MDCT in determining the gross classification were calculated and compared. The overall accuracy of DCEUS in determining the gross appearance of GC was higher than that of MDCT (84.9% vs. 79.9%, p < 0.001). There was no significant difference in accuracy between DCEUS and MDCT for Borrmann I and IV classifications of advanced gastric cancer (χ², p = 0.323 for Borrmann type I, p = 0.141 for Borrmann type IV). The accuracy of DCEUS for early GC and Borrmann II and III classifications of GC was higher than that of MDCT (χ², p = 0.000 for all). DCEUS may be regarded as a valuable complementary tool to MDCT in determining the gross appearance of gastric adenocarcinoma pre-operatively. PMID:27072076

  16. Cardiac Multi-detector CT Segmentation Based on Multiscale Directional Edge Detector and 3D Level Set.

    PubMed

    Antunes, Sofia; Esposito, Antonio; Palmisano, Anna; Colantoni, Caterina; Cerutti, Sergio; Rizzo, Giovanna

    2016-05-01

    Extraction of the cardiac surfaces of interest from multi-detector computed tomographic (MDCT) data is a pre-requisite step for cardiac analysis, as well as for image guidance procedures. Most of the existing methods need manual corrections, which is time-consuming. We present a fully automatic segmentation technique for the extraction of the right ventricle, left ventricular endocardium and epicardium from MDCT images. The method consists of a 3D level set surface evolution approach coupled with a new stopping function based on a multiscale directional second derivative Gaussian filter, which is able to stop propagation precisely on the real boundary of the structures of interest. We validated the segmentation method on 18 MDCT volumes from healthy and pathologic subjects, using manual segmentation performed by a team of expert radiologists as the gold standard. Segmentation errors were assessed for each structure, resulting in a surface-to-surface mean error below 0.5 mm and a percentage of surface distances with errors of less than 1 mm above 80%. Moreover, in comparison to other segmentation approaches already proposed in previous work, our method showed improved accuracy (the percentage of surface distances with errors of less than 1 mm increased by 8-20% for all structures). The obtained results suggest that our approach is accurate and effective for the segmentation of ventricular cavities and myocardium from MDCT images. PMID:26319010

  17. Control electronics for a multi-laser/multi-detector scanning system

    NASA Technical Reports Server (NTRS)

    Kennedy, W.

    1980-01-01

    The Mars Rover Laser Scanning system uses a precision laser pointing mechanism, a photodetector array, and the concept of triangulation to perform three dimensional scene analysis. The system is used for real time terrain sensing and vision. The Multi-Laser/Multi-Detector laser scanning system is controlled by a digital device called the ML/MD controller. A next generation laser scanning system, based on the Level 2 controller, is microprocessor based. The new controller capabilities far exceed those of the ML/MD device. The first draft circuit details and general software structure are presented.

  18. Novel Wearable and Wireless Ring-Type Pulse Oximeter with Multi-Detectors

    PubMed Central

    Huang, Cheng-Yang; Chan, Ming-Che; Chen, Chien-Yue; Lin, Bor-Shyh

    2014-01-01

    The pulse oximeter is a popular instrument to monitor the arterial oxygen saturation (SPO2). Although a fingertip-type pulse oximeter is the mainstream one on the market at present, it is still inconvenient for long-term monitoring, in particular with respect to motion. Therefore, the development of a wearable pulse oximeter, such as a finger base-type pulse oximeter, can effectively solve the above issue. However, the tissue structure of the finger base is complex, and there is a lack of detailed information on the effect of light source and detector placement on measuring SPO2. In this study, the practicability of a ring-type pulse oximeter with a multi-detector was investigated by optical human tissue simulation. The optimal design of a ring-type pulse oximeter that can provide the best efficiency of measuring SPO2 was discussed. The efficiency of ring-type pulse oximeters with a single detector and a multi-detector was also discussed. Finally, a wearable and wireless ring-type pulse oximeter was implemented to validate the simulation results and was compared with a commercial fingertip-type pulse oximeter. PMID:25244586

  19. Multi-Detector Coronary CT Imaging for the Identification of Coronary Artery Stenoses in a “Real-World” Population

    PubMed Central

    Makaryus, Amgad N; Henry, Sonia; Loewinger, Lee; Makaryus, John N; Boxt, Lawrence

    2014-01-01

    BACKGROUND Multi-detector computed tomography (CT) has emerged as a modality for the non-invasive assessment of coronary artery disease (CAD). Prior studies have selected patients for evaluation and have excluded many of the “real-world” patients commonly encountered in daily practice. We compared 64-detector-CT (64-CT) to conventional coronary angiography (CA) to investigate the accuracy of 64-CT in determining significant coronary stenoses in a “real-world” clinical population. METHODS A total of 1,818 consecutive patients referred for 64-CT were evaluated. CT angiography was performed using the GE LightSpeed VCT (GE® Healthcare). Forty-one patients in whom 64-CT results prompted CA investigation were further evaluated, and results of the two diagnostic modalities were compared. RESULTS A total of 164 coronary arteries and 410 coronary segments were evaluated in 41 patients (30 men, 11 women, age 39–85 years) who were identified by 64-CT to have significant coronary stenoses and who thereafter underwent CA. The overall per-vessel sensitivity, specificity, positive predictive value, negative predictive value, and accuracy at the 50% stenosis level were 86%, 84%, 65%, 95%, and 85%, respectively, and 77%, 93%, 61%, 97%, and 91%, respectively, in the per-segment analysis at the 50% stenosis level. CONCLUSION 64-CT is an accurate imaging tool that allows a non-invasive assessment of significant CAD with a high diagnostic accuracy in a “real-world” population of patients. The sensitivity and specificity that we noted are not as high as those in prior reports, but we evaluated a population of patients typically encountered in clinical practice, and our results therefore better reflect “real-world” performance. PMID:25628513

  20. Evaluation of different small bowel contrast agents by multi-detector row CT

    PubMed Central

    Wang, Yong-Ren; Yu, Xiao-Li; Peng, Zhi-Yi

    2015-01-01

    Objective: This study aims to evaluate the effects of different oral small bowel contrast agents on intestinal dilatation and the depiction of intestinal wall structure on abdominal multi-detector row CT (MDCT) examination. Methods: 80 patients underwent whole abdominal CT examination and were randomly divided into four groups of 20 patients each. 45 minutes before the CT examination, the patients were given a total of 1800 ml of pure water, pure milk, dilute lactulose solution or isotonic mannitol solution, respectively. Results: The images were read blindly by two experienced abdominal radiologists at the workstation; the cross-sectional diameters of the duodenum, jejunum, and proximal and terminal ileum were measured in each patient, and analysis of variance was performed to analyze differences in intestinal dilatation among the groups. A scoring method was used to rate intestinal dilatation and the depiction of intestinal wall structure. The diluted lactulose solution and 2.5% mannitol produced the best intestinal dilatation, and likewise achieved the highest scores for overall small bowel dilatation and depiction of intestinal structure. Conclusions: 2.5% osmotic mannitol and the diluted lactulose solution enabled full dilatation of the small bowel and could clearly depict the wall structure. PMID:26629131

  1. The influence of novel CT reconstruction technique and ECG-gated technique on image quality and patient dose of cardiac computed tomography.

    PubMed

    Dyakov, I; Stoinova, V; Groudeva, V; Vassileva, J

    2015-07-01

    The aim of the present study was to compare image quality and patient dose in cardiac computed tomography angiography (CTA) in terms of volume computed tomography dose index (CTDIvol), dose length product (DLP) and effective dose, when changing from filtered back projection (FBP) to adaptive iterative dose reduction (AIDR) reconstruction techniques. A further aim was to implement prospective electrocardiogram (ECG) gating for patient dose reduction. The study was performed with the Aquilion ONE 320-row CT of Toshiba Medical Systems. Analysis of cardiac CT protocols was performed before and after integration of the new software. The AIDR technique showed more than 50 % reduction in CTDIvol values and 57 % in effective dose. The subjective evaluation of clinical images confirmed the adequate image quality acquired with the AIDR technique. The preliminary results indicated significant dose reduction when using prospective ECG gating while maintaining adequate diagnostic quality of clinical images. PMID:25836680

  2. Multi-detector CT assessment in pulmonary hypertension: techniques, systematic approach to interpretation and key findings

    PubMed Central

    Lewis, Gareth; Reynolds, John H.; Ganeshan, Arul; Ment, Jerome

    2015-01-01

    Pulmonary arterial hypertension (PAH) may be suspected based on the clinical history, physical examination and electrocardiogram findings, but imaging is usually central to confirming the diagnosis, establishing a cause and guiding therapy. The diagnostic pathway of PAH involves a variety of complementary investigations, among which computed tomography pulmonary angiography (CTPA) has established a central role, both in helping to identify an underlying cause of PAH and in assessing the resulting functional compromise. In particular, CTPA is considered the gold standard technique for the diagnosis of thromboembolic disease. This article reviews the CTPA evaluation in PAH, describing CTPA techniques, a systematic approach to interpretation and the spectrum of key imaging findings. PMID:26029645

  3. A retrospective comparison of smart prep and test bolus multi-detector CT pulmonary angiography protocols

    SciTech Connect

    Suckling, Tara; Smith, Tony; Reed, Warren

    2013-06-15

    Optimal arterial opacification is crucial in imaging the pulmonary arteries using computed tomography (CT). This poses the challenge of precisely timing data acquisition to coincide with the transit of the contrast bolus through the pulmonary vasculature. The aim of this quality assurance exercise was to investigate if a change in CT pulmonary angiography (CTPA) scanning protocol resulted in improved opacification of the pulmonary arteries. Comparison was made between the smart prep protocol (SPP) and the test bolus protocol (TBP) for opacification in the pulmonary trunk. A total of 160 CTPA examinations (80 using each protocol) performed between January 2010 and February 2011 were assessed retrospectively. CT attenuation coefficients were measured in Hounsfield Units (HU) using regions of interest at the level of the pulmonary trunk. The average pixel value, standard deviation (SD), maximum, and minimum were recorded. For each of these variables a mean value was then calculated and compared for these two CTPA protocols. Minimum opacification of 200 HU was achieved in 98% of the TBP sample but only 90% of the SPP sample. The average CT attenuation over the pulmonary trunk for the SPP was 329 (SD = ±21) HU, whereas for the TBP it was 396 (SD = ±22) HU (P = 0.0017). The TBP also recorded higher maximum (P = 0.0024) and minimum (P = 0.0039) levels of opacification. This study has found that a TBP resulted in significantly better opacification of the pulmonary trunk than the SPP.
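
    The protocol comparison above reduces to per-examination ROI statistics plus a two-sample comparison. A minimal sketch is given below; the file names are hypothetical, and the choice of Welch's t-test is an assumption, since the abstract does not state which test produced its P values.

```python
import numpy as np
from scipy import stats

# Assumed inputs: one mean pulmonary-trunk ROI attenuation value (HU) per
# examination, for the smart prep protocol (SPP) and the test bolus protocol (TBP).
spp_hu = np.loadtxt("spp_trunk_hu.txt")  # hypothetical file
tbp_hu = np.loadtxt("tbp_trunk_hu.txt")  # hypothetical file

# Proportion of examinations reaching the 200 HU minimum opacification target.
for name, hu in (("SPP", spp_hu), ("TBP", tbp_hu)):
    print(f"{name}: mean {hu.mean():.0f} HU, >=200 HU in {(hu >= 200).mean():.0%}")

# Two-sample comparison of mean opacification (two-sided Welch's t-test used here
# purely for illustration).
t, p = stats.ttest_ind(spp_hu, tbp_hu, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```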

  4. Improving Image Quality of On-Board Cone-Beam CT in Radiation Therapy Using Image Information Provided by Planning Multi-Detector CT: A Phantom Study

    PubMed Central

    Yang, Ching-Ching; Chen, Fong-Lin; Lo, Yeh-Chi

    2016-01-01

    Purpose The aim of this study was to improve the image quality of cone-beam computed tomography (CBCT) mounted on the gantry of a linear accelerator used in radiation therapy based on the image information provided by planning multi-detector CT (MDCT). Methods MDCT-based shading correction for CBCT and virtual monochromatic CT (VMCT) synthesized using the dual-energy method were performed. In VMCT, the high-energy data were obtained from CBCT, while the low-energy data were obtained from MDCT. An electron density phantom was used to investigate the efficacy of shading correction and VMCT on improving the target detectability, Hounsfield unit (HU) accuracy and variation, which were quantified by calculating the contrast-to-noise ratio (CNR), the percent difference (%Diff) and the standard deviation of the CT numbers for tissue equivalent background material, respectively. Treatment plan studies for a chest phantom were conducted to investigate the effects of image quality improvement on dose planning. Results For the electron density phantom, the mean value of CNR was 17.84, 26.78 and 34.31 in CBCT, shading-corrected CBCT and VMCT, respectively. The mean value of %Diff was 152.67%, 11.93% and 7.66% in CBCT, shading-corrected CBCT and VMCT, respectively. The standard deviation within a uniform background of CBCT, shading-corrected CBCT and VMCT was 85, 23 and 15 HU, respectively. With regards to the chest phantom, the monitor unit (MU) difference between the treatment plan calculated using MDCT and those based on CBCT, shading corrected CBCT and VMCT was 6.32%, 1.05% and 0.94%, respectively. Conclusions Enhancement of image quality in on-board CBCT can contribute to daily patient setup and adaptive dose delivery, thus enabling higher confidence in patient treatment accuracy in radiation therapy. Based on our results, VMCT has the highest image quality, followed by the shading corrected CBCT and the original CBCT. The research results presented in this study should be
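
    The three objective metrics used above (CNR, %Diff and HU variation) can be computed from ROI statistics roughly as sketched below. The ROI masks and the reference CT number are assumptions standing in for the electron density phantom's known insert values.

```python
import numpy as np

def image_quality_metrics(img, target_roi, background_roi, reference_hu):
    """Hedged sketch of the three metrics: CNR of a target insert against the
    tissue-equivalent background, percent difference of the background CT number
    from its reference value (assumed non-zero), and the HU standard deviation
    within the uniform background region."""
    target = img[target_roi].astype(float)
    background = img[background_roi].astype(float)

    # Contrast-to-noise ratio between target insert and background material.
    cnr = abs(target.mean() - background.mean()) / background.std()

    # Percent difference of the measured background CT number from its reference.
    pct_diff = abs(background.mean() - reference_hu) / abs(reference_hu) * 100.0

    # HU variation within the uniform background.
    hu_sd = background.std()
    return cnr, pct_diff, hu_sd
```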

  5. Sample size requirements for estimating effective dose from computed tomography using solid-state metal-oxide-semiconductor field-effect transistor dosimetry

    SciTech Connect

    Trattner, Sigal; Cheng, Bin; Pieniazek, Radoslaw L.; Hoffmann, Udo; Douglas, Pamela S.; Einstein, Andrew J.

    2014-04-15

    Purpose: Effective dose (ED) is a widely used metric for comparing ionizing radiation burden between different imaging modalities, scanners, and scan protocols. In computed tomography (CT), ED can be estimated by performing scans on an anthropomorphic phantom in which metal-oxide-semiconductor field-effect transistor (MOSFET) solid-state dosimeters have been placed to enable organ dose measurements. Here a statistical framework is established to determine the sample size (number of scans) needed for estimating ED to a desired precision and confidence, for a particular scanner and scan protocol, subject to practical limitations. Methods: The statistical scheme involves solving equations which minimize the sample size required for estimating ED to desired precision and confidence. It is subject to a constrained variation of the estimated ED and solved using the Lagrange multiplier method. The scheme incorporates measurement variation introduced both by MOSFET calibration, and by variation in MOSFET readings between repeated CT scans. Sample size requirements are illustrated on cardiac, chest, and abdomen–pelvis CT scans performed on a 320-row scanner and chest CT performed on a 16-row scanner. Results: Sample sizes for estimating ED vary considerably between scanners and protocols. Sample size increases as the required precision or confidence is higher and also as the anticipated ED is lower. For example, for a helical chest protocol, for 95% confidence and 5% precision for the ED, 30 measurements are required on the 320-row scanner and 11 on the 16-row scanner when the anticipated ED is 4 mSv; these sample sizes are 5 and 2, respectively, when the anticipated ED is 10 mSv. Conclusions: Applying the suggested scheme, it was found that even at modest sample sizes, it is feasible to estimate ED with high precision and a high degree of confidence. As CT technology develops enabling ED to be lowered, more MOSFET measurements are needed to estimate ED with the same
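
    The abstract's qualitative message, that the required number of scans grows as the precision or confidence requirement tightens and as the anticipated ED (and hence the relative measurement spread) shrinks, can be illustrated with a much simpler normal-approximation sample-size formula. This is only a rough stand-in for the paper's constrained Lagrange-multiplier scheme, and the coefficient-of-variation values are hypothetical.

```python
import math
from scipy.stats import norm

def sample_size(cv, rel_precision, confidence=0.95):
    """Rough normal-approximation estimate of the number of repeated scans needed
    to estimate a mean dose to a given relative precision and confidence; cv is the
    anticipated coefficient of variation of a single measurement."""
    z = norm.ppf(0.5 + confidence / 2.0)
    return math.ceil((z * cv / rel_precision) ** 2)

# Illustrative only: a larger relative spread (as at lower anticipated ED) needs more scans.
print(sample_size(cv=0.14, rel_precision=0.05))  # hypothetical CV, not from the paper
print(sample_size(cv=0.05, rel_precision=0.05))
```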

  6. Sample size requirements for estimating effective dose from computed tomography using solid-state metal-oxide-semiconductor field-effect transistor dosimetry

    PubMed Central

    Trattner, Sigal; Cheng, Bin; Pieniazek, Radoslaw L.; Hoffmann, Udo; Douglas, Pamela S.; Einstein, Andrew J.

    2014-01-01

    Purpose: Effective dose (ED) is a widely used metric for comparing ionizing radiation burden between different imaging modalities, scanners, and scan protocols. In computed tomography (CT), ED can be estimated by performing scans on an anthropomorphic phantom in which metal-oxide-semiconductor field-effect transistor (MOSFET) solid-state dosimeters have been placed to enable organ dose measurements. Here a statistical framework is established to determine the sample size (number of scans) needed for estimating ED to a desired precision and confidence, for a particular scanner and scan protocol, subject to practical limitations. Methods: The statistical scheme involves solving equations which minimize the sample size required for estimating ED to desired precision and confidence. It is subject to a constrained variation of the estimated ED and solved using the Lagrange multiplier method. The scheme incorporates measurement variation introduced both by MOSFET calibration, and by variation in MOSFET readings between repeated CT scans. Sample size requirements are illustrated on cardiac, chest, and abdomen–pelvis CT scans performed on a 320-row scanner and chest CT performed on a 16-row scanner. Results: Sample sizes for estimating ED vary considerably between scanners and protocols. Sample size increases as the required precision or confidence is higher and also as the anticipated ED is lower. For example, for a helical chest protocol, for 95% confidence and 5% precision for the ED, 30 measurements are required on the 320-row scanner and 11 on the 16-row scanner when the anticipated ED is 4 mSv; these sample sizes are 5 and 2, respectively, when the anticipated ED is 10 mSv. Conclusions: Applying the suggested scheme, it was found that even at modest sample sizes, it is feasible to estimate ED with high precision and a high degree of confidence. As CT technology develops enabling ED to be lowered, more MOSFET measurements are needed to estimate ED with the same

  7. 320-Row wide volume CT significantly reduces density heterogeneity observed in the descending aorta: comparisons with 64-row helical CT.

    PubMed

    Yamashiro, Tsuneo; Miyara, Tetsuhiro; Honda, Osamu; Kamiya, Ayano; Tanaka, Yuko; Murayama, Sadayuki

    2014-01-01

    The aim of this study was to compare density heterogeneity on wide volume (WV) scans with that on helical CT scans. 22 subjects underwent chest CT using 320-WV and 64-helical modes. Density heterogeneity of the descending aorta was evaluated quantitatively and qualitatively. At qualitative assessment, the heterogeneity was judged to be smaller on WV scans than on helical scans (p<0.0001). Mean changes in aortic density between two contiguous slices were 1.64 HU (3.40%) on WV scans and 2.29 HU (5.19%) on helical scans (p<0.0001). CT density of thoracic organs is more homogeneous and reliable on WV scans than on helical scans. PMID:24210879

  8. A cardiac phantom study on quantitative correction of coronary calcium score on multi-detector, dual source, and electron beam tomography for velocity, calcification density, and acquisition time

    NASA Astrophysics Data System (ADS)

    Greuter, Marcel J. W.; Groen, Jaap M.; Nicolai, Lieuwe J.; Dijkstra, Hildebrand; Oudkerk, Matthijs

    2009-02-01

    Objective: To quantify the influence of velocity, calcification density and acquisition time on coronary calcium determination using multi-detector CT, dual-source CT and EBT. Materials and Methods: Artificial arteries with four calcifications of increasing density were attached to a robotic arm, to which linear movement at velocities between 0 and 120 mm/s (in steps of 10 mm/s) was applied. The phantom was scanned five times on 64-slice MDCT, DSCT and EBT using a standard acquisition protocol, and the average Agatston score was determined. Results: Increasing motion artifacts were observed at increasing velocities on all scanners, with increasing severity from EBT to DSCT to 64-slice MDCT. The Agatston score showed a linear dependency on velocity, from which a correction factor was derived. This correction factor showed a linear dependency on calcification density (0.92 ≤ R² ≤ 0.95). The slope and offset of this correction factor also showed a linear dependency on acquisition time (0.84 ≤ R² ≤ 0.86). Conclusion: The Agatston score is highly dependent on the average density of individual calcifications, and its dependency on velocity varies linearly with calcification density. A quantitative method could be derived which corrects the measured calcium score for the influence of velocity, calcification density and acquisition time.
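
    The correction structure described above, a correction factor that is linear in velocity, with slope and offset that in turn depend linearly on calcification density and acquisition time, might be organized as in the sketch below. The functional form and the placeholder coefficients a-d are assumptions about how the published factors combine; the paper derives scanner-specific values.

```python
def corrected_agatston(measured_score, velocity_mm_s, density, acq_time_s,
                       a, b, c, d):
    """Hedged sketch of a velocity/density/acquisition-time correction of the
    Agatston score. The coefficients a, b, c, d are hypothetical placeholders
    for scanner-specific values fitted from phantom data."""
    slope = a * density + b * acq_time_s    # assumed linear dependencies
    offset = c * density + d * acq_time_s
    correction = slope * velocity_mm_s + offset
    return measured_score / correction
```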

  9. Development of adaptive noise reduction filter algorithm for pediatric body images in a multi-detector CT

    NASA Astrophysics Data System (ADS)

    Nishimaru, Eiji; Ichikawa, Katsuhiro; Okita, Izumi; Ninomiya, Yuuji; Tomoshige, Yukihiro; Kurokawa, Takehiro; Ono, Yutaka; Nakamura, Yuko; Suzuki, Masayuki

    2008-03-01

    Recently, several kinds of post-processing image filters that reduce the noise of computed tomography (CT) images have been proposed. However, these image filters are mostly designed for adults. Because they are not very effective in small (< 20 cm) display fields of view (FOV), they cannot be used for pediatric body images (e.g., premature babies and infant children). We have developed a new noise reduction filter algorithm for pediatric body CT images. This algorithm is based on 3D post-processing in which the output pixel values are calculated by nonlinear interpolation in the z-direction on the original volumetric data sets. The algorithm requires no in-plane (axial plane) processing, so the in-plane spatial resolution does not change. In phantom studies, our algorithm reduced SD by up to 40% without affecting the spatial resolution in the x-y plane or along the z-axis, and improved the CNR by up to 30%. This newly developed filter algorithm will be useful for diagnosis and for radiation dose reduction in pediatric body CT imaging.
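
    A z-direction-only adaptive filter in the spirit described above could look like the sketch below: each output voxel is a weighted combination of the same (x, y) position in neighbouring slices, with weights that shrink for large HU differences so edges along z are preserved and in-plane pixels are never mixed. The abstract does not specify the exact nonlinear interpolation, so this is an illustrative stand-in, not the published algorithm.

```python
import numpy as np

def z_adaptive_smooth(volume, hu_scale=30.0):
    """Illustrative z-direction-only adaptive smoothing of a CT volume
    with shape (z, y, x); hu_scale controls how quickly the weight of a
    neighbouring slice falls off with its HU difference."""
    vol = volume.astype(float)
    out = vol.copy()
    for z in range(1, vol.shape[0] - 1):
        neighbours = np.stack([vol[z - 1], vol[z], vol[z + 1]])
        diffs = neighbours - vol[z]
        weights = np.exp(-(diffs / hu_scale) ** 2)  # small weight for large HU jumps
        out[z] = (weights * neighbours).sum(axis=0) / weights.sum(axis=0)
    return out  # in-plane (x-y) pixels are never mixed, so axial resolution is kept
```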

  10. Non-invasive Detection of Aortic and Coronary Atherosclerosis in Homozygous Familial Hypercholesterolemia by 64 Slice Multi-detector Row Computed Tomography Angiography

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Homozygous familial hypercholesterolemia (HoFH) is a rare disorder characterized by the early onset of atherosclerosis, often at the ostia of coronary arteries. In this study we document for the first time that aortic and coronary atherosclerosis can be detected using 64 slice multiple detector-row ...

  11. Non-invasive detection of aortic and coronary atherosclerosis in homozygous familial hypercholesterolemia by 64 slice multi-detector row computed tomography angiography

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Homozygous familial hypercholesterolemia (HoFH) is a rare disorder characterized by the early onset of atherosclerosis, often at the ostia of coronary arteries. In this study we document for the first time that aortic and coronary atherosclerosis can be detected using 64 slice multiple detector row ...

  12. Quantitative assessment of lipiodol deposition after Chemoembolization: Comparison between cone-beam CT and multi-detector CT

    PubMed Central

    Chen, Rongxin; Geschwind, Jean-François; Wang, Zhijun; Tacher, Vania; Lin, MingDe

    2013-01-01

    Purpose To evaluate the ability of cone-beam computed tomography (CBCT) acquired directly after TACE to assess lipiodol deposition in hepatocellular carcinoma (HCC) and compare it to unenhanced MDCT. Materials and methods Fifteen HCC patients were treated with conventional TACE, and CBCT was used to assess the lipiodol deposition directly after TACE. Unenhanced MDCT was performed 24 hours after TACE. Four patients were excluded because the margin of tumor or area of lipiodol deposition was not clear. The image enhancement density of the entire tumor (T) and liver parenchyma (L) was measured by ImageJ software, and tumor-to-liver contrast (TLC) was calculated. In addition, volumetric measurement of tumor and lipiodol was performed by semiautomatic 3D volume segmentation and compared using linear regression to evaluate consistency between two imaging modalities. Results The mean value of TLC on CBCT was not significantly different from that on MDCT (337.7±233.5 HU versus 283.0±152.1 HU, P=0.103). The average volume of the whole tumor and of only the lipiodol deposited regions, and the calculated average percentage of lipiodol retention on CBCT were not significantly different when compared to MDCT (tumor volume: 9.6±11.8 cm3 versus 10.8±14.2 cm3, P=0.142; lipiodol volume: 6.3±7.7 cm3 versus 7.0±8.1 cm3, P=0.214; percentage of lipiodol retention: 68.9%±24.0% versus 72.2%±23.1%, P=0.578). Additionally, there was a high correlation in the volume of tumor and lipiodol between CBCT and MDCT (R2=0.919 and 0.903, respectively). Conclusions The quantitative image enhancement and volume analyses demonstrate that CBCT is similar to MDCT in assessing the lipiodol deposition in HCC after TACE. PMID:24094672
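
    The two per-lesion quantities reported above, tumor-to-liver contrast and percentage lipiodol retention, reduce to simple ROI and volume arithmetic; a minimal sketch is below. Computing TLC as the tumor mean minus the liver mean, and the ROI/volume inputs themselves, are assumptions about the measurement details.

```python
import numpy as np

def tumor_to_liver_contrast(img, tumor_roi, liver_roi):
    """Assumed definition of TLC: mean enhancement of the whole tumor minus
    mean enhancement of liver parenchyma (ROIs provided by the reader)."""
    return img[tumor_roi].astype(float).mean() - img[liver_roi].astype(float).mean()

def lipiodol_retention(lipiodol_volume_cm3, tumor_volume_cm3):
    """Percentage of the segmented tumor volume occupied by lipiodol."""
    return 100.0 * lipiodol_volume_cm3 / tumor_volume_cm3
```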

  13. The development, validation and application of a multi-detector CT (MDCT) scanner model for assessing organ doses to the pregnant patient and the fetus using Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Gu, J.; Bednarz, B.; Caracappa, P. F.; Xu, X. G.

    2009-05-01

    The latest multiple-detector technologies have further increased the popularity of x-ray CT as a diagnostic imaging modality. There is a continuing need to assess the potential radiation risk associated with such rapidly evolving multi-detector CT (MDCT) modalities and scanning protocols. This need can be met by the use of CT source models that are integrated with patient computational phantoms for organ dose calculations. To this end, this work developed and validated an MDCT scanner model using the Monte Carlo method, and integrated pregnant patient phantoms into the scanner model to assess the dose to the fetus as well as the doses to the organs and tissues of the pregnant patient phantom. A Monte Carlo code, MCNPX, was used to simulate the x-ray source including the energy spectrum, filter and scan trajectory. Detailed CT scanner components were specified using an iterative trial-and-error procedure for a GE LightSpeed CT scanner. The scanner model was validated by comparing simulated results against measured CTDI values and dose profiles reported in the literature. The source movement along the helical trajectory was simulated using pitches of 0.9375 and 1.375, respectively. The validated scanner model was then integrated with phantoms of a pregnant patient in three different gestational periods to calculate organ doses. It was found that the dose to the fetus of the 3 month pregnant patient phantom was 0.13 mGy/100 mAs and 0.57 mGy/100 mAs from the chest and kidney scan, respectively. For the chest scan of the 6 month patient phantom and the 9 month patient phantom, the fetal doses were 0.21 mGy/100 mAs and 0.26 mGy/100 mAs, respectively. The paper also discusses how these fetal dose values can be used to evaluate imaging procedures and to assess risk using recommendations of the report from AAPM Task Group 36. This work demonstrates the ability of modeling and validating an MDCT scanner by the Monte Carlo method, as well as

  14. Diagnosis of Small-Bowel Diseases: Prospective Comparison of Multi-Detector Row CT Enterography with MR Enterography.

    PubMed

    Masselli, Gabriele; Di Tola, Marco; Casciani, Emanuele; Polettini, Elisabetta; Laghi, Francesca; Monti, Riccardo; Bernieri, Maria Giulia; Gualdi, Gianfranco

    2016-05-01

    Purpose To prospectively compare the accuracies of computed tomographic (CT) enterography and magnetic resonance (MR) enterography for the detection and characterization of small-bowel diseases. Materials and Methods The institutional review board approved the study protocol, and informed consent was obtained from all participants. From June 2009 to July 2013, 150 consecutive patients (81 men and 69 women; mean age, 38.8 years; range, 18-74 years), who were suspected of having a small-bowel disease on the basis of clinical findings and whose previous upper and lower gastrointestinal endoscopy findings were normal, underwent CT and MR enterography. Two independent readers reviewed CT and MR enterographic images for the presence of small-bowel diseases, for differentiating between inflammatory and noninflammatory diseases, and for extraenteric complications. The histopathologic findings of surgical (n = 23) and endoscopic (n = 32) biopsy specimens were used as the reference standard; the results of video-capsule endoscopy (n = 36) and clinical follow-up (n = 59) were used only to confirm the absence of small-bowel disease. Results MR and CT enterography were successfully performed in all 150 patients. Overall sensitivity, specificity, and accuracy, respectively, in identifying patients with small-bowel lesions were 75.9% (41 of 54), 94.8% (91 of 96), and 88.0% (132 of 150) for CT enterography and 92.6% (50 of 54), 99.0% (95 of 96), and 96.7% (145 of 150) for MR enterography. The sensitivity of MR enterography was significantly higher than that of CT enterography for the detection of both overall small-bowel diseases (P = .0159) and neoplastic diseases (P = .0412) but not for the detection of inflammatory diseases (P > .99) or noninflammatory and nonneoplastic diseases (P = .6171). Conclusion MR enterography is more accurate than CT enterography in the detection of small-bowel diseases; MR enterography was more accurate in detecting neoplastic diseases in particular

  15. Diagnostic and prognostic role of computed tomography in extracorporeal shock wave lithotripsy complications

    PubMed Central

    Telegrafo, Michele; Carluccio, Davide Antonio; Rella, Leonarda; Ianora, Amato Antonio Stabile; Angelelli, Giuseppe; Moschetta, Marco

    2016-01-01

    Purpose: To evaluate the role of multidetector computed tomography (MDCT) in recognizing the complications of extracorporeal shock wave lithotripsy (ESWL) and providing a prognostic grading system for the therapeutic approach. Materials and Methods: A total of 43 patients who underwent ESWL because of urinary stone disease were assessed by 320-row MDCT examination before and after ESWL. Pre-ESWL CT unenhanced scans were performed for diagnosing stone disease. Post-ESWL CT scans were acquired before and after intravenous injection of contrast medium searching for peri-renal fluid collection or hyper-density, pyelic or ureteral wall thickening, blood clots in the urinary tract, peri- or intra-renal hematoma or abscess, active bleeding. A severity grading system of ESWL complications was established. Results: Patients were affected by renal (n = 36) or ureteral (n = 7) lithiasis. Post-ESWL CT examination detected small fluid collections and hyper-density of peri-renal fat tissue in 35/43 patients (81%), pyelic or ureteral wall thickening in 2/43 (4%), blood clots in the urinary tract in 9/43 (21%), renal abscesses or hematomas with a diameter of <2 cm in 10/43 (23%), large retroperitoneal collections in 3/43 (7%), active bleeding from renal vessels in 1/43 (2%). Mild complications were found in 30 cases; moderate in 9; severe in 4. The therapeutic choice was represented by clinical follow-up (n = 20), clinical and CT follow-up (n = 10), ureteral stenting (n = 9), drainage of large retroperitoneal collections (n = 3), and arterial embolization (n = 1). Conclusion: MDCT plays a crucial role in the diagnosis of urolithiasis and follow-up of patients treated with ESWL recognizing its complications and providing therapeutic and prognostic indications. PMID:27141186

  16. Cardiac sarcoidosis evaluated with gadolinium-enhanced magnetic resonance and contrast-enhanced 64-slice computed tomography.

    PubMed

    Smedema, Jan-Peter; Truter, Rene; de Klerk, Petra A; Zaaiman, Leonie; White, Leonie; Doubell, Anton F

    2006-09-20

    Sarcoidosis is a multi-system granulomatous disorder of unknown etiology with symptomatic cardiac involvement in up to 7% of patients. The clinical features of sarcoid heart disease include congestive heart failure, arrhythmias, conduction disturbances, and sudden death. We evaluated the value of contrast-enhanced multi-detector computed tomography in delineating myocardial scar and granulomatous inflammation by comparing our findings with gadolinium magnetic resonance in a patient diagnosed with cardiac sarcoidosis. PMID:16257460

  17. The Impact of Different Levels of Adaptive Iterative Dose Reduction 3D on Image Quality of 320-Row Coronary CT Angiography: A Clinical Trial

    PubMed Central

    Feger, Sarah; Rief, Matthias; Zimmermann, Elke; Martus, Peter; Schuijf, Joanne Désirée; Blobel, Jörg; Richter, Felicitas; Dewey, Marc

    2015-01-01

    Purpose The aim of this study was the systematic evaluation of the image quality of coronary CT angiography (CTA) reconstructed with the 3 different levels of adaptive iterative dose reduction (AIDR 3D), compared with filtered back projection (FBP) with quantum denoising software (QDS). Methods Standard-dose CTA raw data of 30 patients with a mean radiation dose of 3.2 ± 2.6 mSv were reconstructed using AIDR 3D mild, standard and strong, and compared with FBP/QDS. Objective image quality comparison (signal, noise, signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), contour sharpness) was performed using 21 measurement points per patient, including measurements in each coronary artery from proximal to distal. Results Objective image quality parameters improved with increasing levels of AIDR 3D. Noise was lowest with AIDR 3D strong (p≤0.001 at 20/21 measurement points, compared with FBP/QDS). Signal and contour sharpness analysis showed no significant difference between the reconstruction algorithms for most measurement points. The best coronary SNR and CNR were achieved with AIDR 3D strong. No loss of SNR or CNR in distal segments was seen with AIDR 3D as compared to FBP. Conclusions On standard-dose coronary CTA images, AIDR 3D strong showed higher objective image quality than FBP/QDS without reducing contour sharpness. Trial Registration Clinicaltrials.gov NCT00967876 PMID:25945924
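
    The per-measurement-point metrics named above (signal, noise, SNR, CNR and contour sharpness) can be computed roughly as in the sketch below. The ROI choices, the use of perivascular fat as the noise/contrast reference, and the definition of contour sharpness as the steepest HU gradient across the vessel edge are assumptions; the trial's exact definitions may differ.

```python
import numpy as np

def vessel_metrics(img, lumen_roi, fat_roi, edge_profile_hu, pixel_spacing_mm):
    """Hedged sketch of per-point objective image quality metrics for coronary CTA."""
    signal = img[lumen_roi].astype(float).mean()      # mean lumen attenuation (HU)
    noise = img[fat_roi].astype(float).std()          # SD in perivascular fat ROI
    snr = signal / noise
    cnr = (signal - img[fat_roi].astype(float).mean()) / noise

    # Contour sharpness as the steepest HU gradient across the vessel edge (HU/mm),
    # taken from a 1D attenuation profile sampled perpendicular to the vessel wall.
    sharpness = np.abs(np.gradient(edge_profile_hu, pixel_spacing_mm)).max()
    return snr, cnr, sharpness
```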

  18. Monte Carlo simulations in multi-detector CT (MDCT) for two PET/CT scanner models using MASH and FASH adult phantoms

    NASA Astrophysics Data System (ADS)

    Belinato, W.; Santos, W. S.; Paschoal, C. M. M.; Souza, D. N.

    2015-06-01

    The combination of positron emission tomography (PET) and computed tomography (CT) has been extensively used in oncology for diagnosis and staging of tumors, radiotherapy planning and follow-up of patients with cancer, as well as in cardiology and neurology. This study uses the Monte Carlo method to determine the internal organ dose deposition in computational phantoms from the multidetector CT (MDCT) beams of two PET/CT devices operating with different parameters. The different MDCT beam parameters were largely related to the total filtration, which changes the beam energy inside the gantry. This parameter was determined experimentally with the Accu-Gold Radcal measurement system. The experimental values of the total filtration were included in the simulations of two MCNPX code scenarios. The absorbed organ doses obtained in the MASH and FASH phantoms indicate that bowtie filter geometry and the energy of the X-ray beam have a significant influence on the results, although this influence can be compensated by adjusting other variables such as the tube current-time product (mAs) and pitch during PET/CT procedures.

  19. Detection of plaque rupture using 64-slice multidetector row computed tomography

    PubMed Central

    Reimann, Anja J; Beck, Torsten; Heuschmid, Martin; Brodoefel, Harald; Burgstahler, Christof; Schröder, Stephen; Kopp, Andreas F

    2008-01-01

    The present case report describes a 37-year-old man who presented to the emergency room with symptoms of a myocardial infarction but no high-grade stenosis on conventional catheter angiography. Consecutive multi-detector row computed tomography of the coronary arteries showed an intimal flap along a fibrous plaque formation in the left anterior descending artery. This finding was found to represent a plaque rupture, and the lesion was treated with an 18 mm stent. Multidetector row computed tomography helped to correctly position the stent by identifying the exact location of the rupture along the long plaque formation. PMID:18340394

  20. Optimization of Free-Breathing Whole-Heart 3D Cardiac MRI at 3Tesla to Identify Coronary Vein Anatomy and to Compare with Multi-Detector Computed Tomography

    PubMed Central

    Ibrahim, Wael G.; El Khouli, Riham H.; Abd-Elmoniem, Khaled Z.; Matta, Jatin Raj; McAreavey, Dorothea; Gharib, Ahmed M

    2014-01-01

    Objective This study optimizes the use of 3T MRI to delineate coronary venous anatomy, and compares 3T MRI with MDCT measurements. Methods The study population included 37 consecutive subjects (22 men, 19-71 years). Whole-heart contrast-enhanced MRI images at 3T were acquired using a segmented k-space gradient echo sequence with an inversion recovery prepared technique. MDCT images were obtained using nonionic iodinated contrast. Results The coronary sinus, and the great cardiac, posterior interventricular, and anterior interventricular veins were visualized in 100% of cases by both MRI and MDCT. Detection of the posterior vein of the left ventricle and the left marginal vein by MRI was 97% and 81%, respectively. Bland-Altman plots showed agreement in ostial diameter measured by both modalities, with correlation coefficients ranging from 0.5 to 0.76. Vein length and distances also agreed closely. Conclusion Free-breathing whole-heart 3D MRI at 3T provides high spatial resolution images and could offer an alternative imaging technique to MDCT scans. PMID:24983436

  1. Myocardial bridging of the right coronary artery inside the right atrial myocardium identified by ECG-gated 64-slice multidetector computed tomography angiography.

    PubMed

    Chen, Chien-Cheng; Chen, Huan-Wu; Fu, Chen-Ju; Lin, Fen-Chiung; Wen, Ming-Shien; Liu, Yuan-Chang

    2010-01-01

    A myocardial bridge (MB) is defined as an intramyocardial course of a major epicardial coronary artery, and it is mainly confined to the left ventricle and the left anterior descending coronary artery. There are rare reports of right coronary MB seen during angiographic examination. Herein, we present a 49 year-old man with right coronary artery MB without luminal narrowing in the diastolic and systolic phases of electrocardiography-gated computed tomography images. The value of multi-detector computed tomography for the detection of anatomical variants in the cardiovascular system is further discussed. PMID:20438676

  2. Radiological protection in computed tomography and cone beam computed tomography.

    PubMed

    Rehani, M M

    2015-06-01

    The International Commission on Radiological Protection (ICRP) has sustained interest in radiological protection in computed tomography (CT), and ICRP Publications 87 and 102 focused on the management of patient doses in CT and multi-detector CT (MDCT) respectively. ICRP forecasted and 'sounded the alarm' on increasing patient doses in CT, and recommended actions for manufacturers and users. One of the approaches was that safety is best achieved when it is built into the machine, rather than left as a matter of choice for users. In view of upcoming challenges posed by newer systems that use cone beam geometry for CT (CBCT), and their widened usage, often by untrained users, a new ICRP task group has been working on radiological protection issues in CBCT. Some of the issues identified by the task group are: lack of standardisation of dosimetry in CBCT; the false belief within the medical and dental community that CBCT is a 'light', low-dose CT whereas mobile CBCT units and newer applications, particularly C-arm CT in interventional procedures, involve higher doses; lack of training in radiological protection among clinical users; and lack of dose information and tracking in many applications. This paper provides a summary of approaches used in CT and MDCT, and preliminary information regarding work just published for radiological protection in CBCT. PMID:25816279

  3. Multidetector computed tomography of temporomandibular joint: A road less travelled

    PubMed Central

    Pahwa, Shivani; Bhalla, Ashu Seith; Roychaudhary, Ajoy; Bhutia, Ongkila

    2015-01-01

    This article reviews the imaging anatomy of temporomandibular joint (TMJ), describes the technique of multi-detector computed tomography (MDCT) of the TMJ, and describes in detail various osseous pathologic afflictions affecting the joint. Traumatic injuries affecting the mandibular condyle are most common, followed by joint ankylosis as a sequel to arthritis. The congenital anomalies are less frequent, hemifacial microsomia being the most commonly encountered anomaly involving the TMJ. Neoplastic afflictions of TMJ are distinctly uncommon, osteochondroma being one of the most common lesions. MDCT enables comprehensive evaluation of osseous afflictions of TMJ, and is a valuable tool for surgical planning. Sagittal, coronal and 3D reformatted images well depict osseous TMJ lesions, and their relationship to adjacent structures. PMID:25984518

  4. Novel clinical applications of dual energy computed tomography.

    PubMed

    Kraśnicki, Tomasz; Podgórski, Przemysław; Guziński, Maciej; Czarnecka, Anna; Tupikowski, Krzysztof; Garcarek, Jerzy; Sąsiadek, Marek

    2012-01-01

    Dual energy CT (DECT) was conceived at the very beginning of the computed tomography era. However, the first DECT scanner was developed in 2006. Nowadays there are three different types of DECT available: dual-source CT with 80(100) kVp and 140 kVp tubes (Siemens Medical Solution); a dual-layer multi-detector scanner with acquisition at 120 or 140 kVp (Philips Healthcare); and a CT unit with one rapid kVp-switching source and a new detector based on gemstone scintillator materials (GE Healthcare). This article describes the physical background and principles of DECT imaging as well as applications of this innovative method in routine clinical practice (renal stone differentiation, pulmonary perfusion, neuroradiology and metallic implant imaging). The particular applications are illustrated by cases from the authors' material. PMID:23457140

  5. A Case of Coronary Cameral Fistula with Associated Aneurysm: Role of ECG Gated 256- Slice Dual Source Multidetector Computed Tomography in Diagnosis

    PubMed Central

    Garg, Lalit; Rissam, Harmeet Kaur; Puri, Sunil Kumar

    2016-01-01

    We report an interesting case of coronary cameral fistula with associated aneurysmal dilatation of the coronary artery. Complete evaluation, including anatomical relationships with surrounding vascular and non-vascular structures, can be achieved with ECG-gated multi-detector computed tomography (MDCT). MDCT has many advantages over echocardiography and digital subtraction catheter angiography because of its ability to demonstrate the fistula separately from surrounding cardiovascular structures, along with any aneurysm or obstruction in its course. Thus, MDCT is emerging as the initial non-invasive imaging technique for comprehensive preoperative evaluation of these rare congenital anomalies, helping cardiovascular surgeons achieve better operative accessibility and outcome. PMID:27437325

  6. Improvement of distension and mural visualization of bowel loops using neutral oral contrasts in abdominal computed tomography

    PubMed Central

    Hashemi, Jahanbakhsh; Davoudi, Yasmin; Taghavi, Mina; Pezeshki Rad, Masoud; Moghadam, Amien Mahajeri

    2014-01-01

    AIM: To assess and compare the image quality of 4% sorbitol and diluted iodine 2% (positive oral contrast agent) in abdomino-pelvic multi-detector computed tomography. METHODS: Two-hundred patients, referred to the Radiology Department of a central educational hospital for multi-detector row abdominal-pelvic computed tomography, were randomly divided into two groups: the first group received 1500 mL of 4% sorbitol solution as a neutral contrast agent, while in the second group 1500 mL of meglumin solution as a positive contrast agent was administered in a one-way randomized prospective study. The results were independently reviewed by two radiologists. Luminal distension and mural thickness and mucosal enhancement were compared between the two groups. Statistical analysis of the results was performed by Statistical Package for the Social Sciences software version 16 and the Mann-Whitney test at a confidence level of 95%. RESULTS: Use of neutral oral contrast agent significantly improved visualization of the small bowel wall thickness and mural appearance in comparison with administration of positive contrast agent (P < 0.01). In patients who received sorbitol, the small bowel showed better distention compared with those who received iodine solution as a positive contrast agent (P < 0.05). CONCLUSION: The results of the study demonstrated that oral administration of sorbitol solution allows better luminal distention and visualization of mural features than iodine solution as a positive contrast agent. PMID:25550995

  7. The role of computed tomography in terminal ballistic analysis.

    PubMed

    Rutty, G N; Boyce, P; Robinson, C E; Jeffery, A J; Morgan, B

    2008-01-01

    Terminal ballistics concerns the science of projectile behaviour within a target and includes wound ballistics, which considers what happens when a projectile strikes a living being. A number of soft tissue ballistic simulants have been used to assess the damage to tissue caused by projectiles. Standard assessment of these materials, such as ballistic soap or ordnance gelatine, requires the block to be opened or a mould to be made to visualize the wound track. This is time-consuming and may affect the accuracy of the findings, especially if the block dries and alters shape during the process. Therefore, accurate numerical analysis of the permanent or temporary cavity is limited. Computed tomography (CT) potentially offers a quicker, non-invasive analysis tool for this task. Four commercially purchased ballistic glycerine soap blocks were used. Each had a single firearm discharged into it from a distance of approximately 15 cm using both gunshot and shotgun projectiles. After discharge, each block was imaged by a modern 16 slice multi-detector CT scanner and analysed using 3-D reconstruction software. Using the anterior-posterior and lateral scout views and the multi-plane reconstructed images, it was possible to visualize the temporary cavity, as well as the fragmentation and dispersal pattern of the projectiles, the distance travelled and the angle of dispersal of each projectile or fragment within the block. A virtual cast of the temporary cavity can also be made. Multi-detector CT with 3-D analysis software is shown to create a reliable permanent record of the projectile path, allowing rapid analysis of different firearms and projectiles. PMID:17205351

  8. An automated system for lung nodule detection in low-dose computed tomography

    NASA Astrophysics Data System (ADS)

    Gori, I.; Fantacci, M. E.; Preite Martinez, A.; Retico, A.

    2007-03-01

    A computer-aided detection (CAD) system for the identification of pulmonary nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 Italian project. One of the main goals of this project is to build a distributed database of lung CT scans in order to enable automated image analysis through a data and cpu GRID infrastructure. The basic modules of our lung-CAD system, a dot-enhancement filter for nodule candidate selection and a neural classifier for false-positive finding reduction, are described. The system was designed and tested for both internal and sub-pleural nodules. The results obtained on the collected database of low-dose thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.

  9. Diagnosis of a Cerebral Arteriovenous Malformation Using Isolated Brain Computed Tomography Angiography: Case Report.

    PubMed

    Qian, Hui; Shao, Yu; Li, Zhengdong; Huang, Ping; Zou, Donghua; Liu, Ningguo; Chen, Yijiu; Wan, Lei

    2016-09-01

    This report presents a case of a 40-year-old woman who was found dead in her house. The examination of the body revealed no external injuries. The whole body was scanned by multi-detector-row computed tomography (CT) before autopsy, revealing massive hemorrhage in the right frontal lobe extending into the ventricular system. At autopsy, the brain parenchyma was removed, and CT angiography was then carried out on the isolated brain. Computed tomography angiography suggested a mass of irregular, tortuous vessels in the areas of hemorrhage in the right frontal lobe of the brain. Finally, histological examination confirmed the result of CT angiography as an arteriovenous malformation. Hence, postmortem CT angiography played an important role in diagnosing the cerebral arteriovenous malformation responsible for the massive intracranial hemorrhage. PMID:27367577

  10. Demonstrating the origin of cardiac air embolism using post-mortem computed tomography; an illustrated case.

    PubMed

    Saunders, Sarah; Kotecha, Deepjay; Morgan, Bruno; Raj, Vimal; Rutty, Guy

    2011-03-01

    An 83 year old female was found dead in her home. The deceased had been struck repeatedly to the head with at least one weapon, one of which was a hammer. The deceased had suffered both penetrating and non-penetrating blunt trauma to the head as a result of the assault. A multi-detector computed tomography (MDCT) scan was undertaken approximately 12h after death prior to the autopsy examination. This demonstrated the presence of a cardiac air embolus and continuity between the air embolus and the penetrating head injury. Air within the heart is a recognised post-mortem artefact frequently seen on MDCT scans and a common pitfall for inexperienced cadaveric MDCT reporters. This case builds upon a previous report by Kauczor, illustrating how MDCT can be used to demonstrate the origin and route of ingress of a genuine air embolism to the heart. PMID:21131225

  11. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  12. Multi-Detector Analysis System for Spent Nuclear Fuel Characterization

    SciTech Connect

    Reber, Edward Lawrence; Aryaeinejad, Rahmat; Cole, Jerald Donald; Drigert, Mark William; Jewell, James Keith; Egger, Ann Elizabeth; Cordes, Gail Adele

    1999-09-01

    The Spent Nuclear Fuel (SNF) Non-Destructive Analysis (NDA) program at INEEL is developing a system to characterize SNF for fissile mass, radiation source term, and fissile isotopic content. The system is based on the integration of the Fission Assay Tomography System (FATS) and the Gamma-Neutron Analysis Technique (GNAT) developed under programs supported by the DOE Office of Non-proliferation and National Security. Both FATS and GNAT were developed as separate systems to provide information on the location of special nuclear material in weapons configuration (FATS role), and to measure isotopic ratios of fissile material to determine if the material was from a weapon (GNAT role). FATS is capable of not only determining the presence and location of fissile material but also the quantity of fissile material present to within 50%. GNAT determines the ratios of the fissile and fissionable material by coincidence methods that allow the two prompt (immediately) produced fission fragments to be identified. Therefore, from the combination of FATS and GNAT, MDAS is able to measure the fissile material, radiation source term, and fissile isotopics content.

  13. Estimation of computed tomography dose index in cone beam computed tomography: MOSFET measurements and Monte Carlo simulations.

    PubMed

    Kim, Sangroh; Yoshizumi, Terry; Toncheva, Greta; Yoo, Sua; Yin, Fang-Fang; Frush, Donald

    2010-05-01

    To address the lack of an accurate dose estimation method in cone beam computed tomography (CBCT), we performed point dose metal oxide semiconductor field-effect transistor (MOSFET) measurements and Monte Carlo (MC) simulations. A Varian On-Board Imager (OBI) was employed to measure point doses in the polymethyl methacrylate (PMMA) CT phantoms with MOSFETs for standard and low dose modes. An MC model of the OBI x-ray tube was developed using the BEAMnrc/EGSnrc MC system and validated by the half value layer, x-ray spectrum and lateral and depth dose profiles. We compared the weighted computed tomography dose index (CTDIw) between MOSFET measurements and MC simulations. The CTDIw was found to be 8.39 cGy for the head scan and 4.58 cGy for the body scan from the MOSFET measurements in standard dose mode, and 1.89 cGy for the head and 1.11 cGy for the body in low dose mode, respectively. The CTDIw from MC simulations agreed with the MOSFET measurements to within 5%. In conclusion, an MC model for Varian CBCT has been established, and this approach may be easily extended from the CBCT geometry to multi-detector CT geometry. PMID:20386198
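
    The weighted CTDI referred to above is conventionally combined from central and peripheral phantom measurements as one third of the central value plus two thirds of the mean peripheral value; whether the study applied exactly this weighting to its MOSFET readings is an assumption based on common practice. The numbers in the example are illustrative, not taken from the paper.

```python
def ctdi_w(center_dose_cGy, peripheral_doses_cGy):
    """Conventional weighted CTDI: 1/3 of the central PMMA-phantom dose plus
    2/3 of the mean of the peripheral doses (all in the same units)."""
    peripheral_mean = sum(peripheral_doses_cGy) / len(peripheral_doses_cGy)
    return center_dose_cGy / 3.0 + 2.0 * peripheral_mean / 3.0

# Illustrative head-phantom numbers only (not from the paper):
print(ctdi_w(7.1, [8.6, 8.9, 8.5, 8.8]))
```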

  14. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  15. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  16. Pygmalion's Computer.

    ERIC Educational Resources Information Center

    Peelle, Howard A.

    Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…

  17. Computational dosimetry

    SciTech Connect

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  18. COMPUTATIONAL TOXICOLOGY

    EPA Science Inventory

    Over the last several years, there has been increased pressure to utilize novel technologies derived from computational chemistry, molecular biology and systems biology in toxicological risk assessment. This new area has been referred to as "Computational Toxicology". Our resear...

  19. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  20. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Reviews the performance of computer programs for writing poetry and prose, including MARK V. SHANEY, MELL, POETRY GENERATOR, THUNDER THOUGHT, and ORPHEUS. Discusses the writing principles of the programs. Provides additional information on computer magnification techniques. (YP)

  1. Computational Toxicology

    EPA Science Inventory

    ‘Computational toxicology’ is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...

  2. DNA computing.

    PubMed

    Gibbons, A; Amos, M; Hodgson, D

    1997-02-01

    DNA computation is a novel and exciting recent development at the interface of computer science and molecular biology. We describe the current activity in this field following the seminal work of Adleman, who recently showed how techniques of molecular biology may be applied to the solution of a computationally intractable problem. PMID:9013647

  3. Computer Starters!

    ERIC Educational Resources Information Center

    Instructor, 1983

    1983-01-01

    Instructor's Computer-Using Teachers Board members give practical tips on how to get a classroom ready for a new computer, introduce students to the machine, and help them learn about programing and computer literacy. Safety, scheduling, and supervision requirements are noted. (PP)

  4. Computer Corner.

    ERIC Educational Resources Information Center

    Hampel, Paul J.

    1984-01-01

    Presents: (1) a computer program which will change a fraction into a decimal; (2) a program in which students supply missing lines to create the output given; and (3) suggestions for adding computer awareness to classrooms, including use of used punch cards and old computer-generated printouts. (JN)

  5. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  6. Computer Literacy.

    ERIC Educational Resources Information Center

    Hunter, Beverly

    The concept of computer literacy is examined as it applies to two-year colleges. The paper begins with definitions of the term, emphasizing the skills, knowledge, and attitudes toward computers that are considered criteria for computer literacy. The paper continues by describing a conference at which educators attempted to visualize the technology…

  7. Computer Manual.

    ERIC Educational Resources Information Center

    Illinois State Office of Education, Springfield.

    This manual, designed to provide the teacher with methods of understanding the computer and its potential in the classroom, includes four units with exercises and an answer sheet. Unit 1 covers computer fundamentals, the minicomputer, programming languages, an introduction to BASIC, and control instructions. Variable names and constants described…

  8. The Spectrum of Presentations of Cryptogenic Organizing Pneumonia in High Resolution Computed Tomography

    PubMed Central

    Mehrian, Payam; Shahnazi, Makhtoom; Dahaj, Ali Ahmadi; Bizhanzadeh, Sorour; Karimi, Mohammad Ali

    2014-01-01

    Summary Background Various radiologic patterns of cryptogenic organizing pneumonia (COP) have been reported on chest X-rays for more than 20 years and, more recently, on computed tomography scans. The aim of the present study was to describe the spectrum of radiologic findings on high resolution computed tomography (HRCT) scans in patients with COP. Material/Methods HRCT scans of 31 sequential patients (mean age: 54.3±11 years; 55% male) with biopsy-proven COP in a tertiary lung center between 2009 and 2012 were reviewed by two experienced pulmonary radiologists with almost perfect interobserver agreement (kappa=0.83). Chest HRCTs from the lung apex to the base were performed using a 16-slice multi-detector CT scanner. Results The most common HRCT presentation of COP was ground-glass opacity (GGO), seen in 83.9% of cases, followed by consolidation in 71%. Both findings were mostly asymmetric, bilateral, and multifocal. Other common findings were the reverse halo sign (48.4%), parenchymal bands (54.8%), and subpleural bands (32.3%). Pulmonary nodules were found in about one-third of patients and were frequently smaller than 5 mm in diameter. Both GGOs and consolidations were seen more often in the lower lobes. Conclusions The main presentations of COP on HRCT include bilateral GGOs and consolidations in the lower lobes, together with the reverse halo sign. PMID:25493105

  9. What can computed tomography and magnetic resonance imaging tell us about ventilation?

    PubMed Central

    Simon, Brett A.; Kaczka, David W.; Bankier, Alexander A.

    2012-01-01

    This review provides a summary of pulmonary functional imaging approaches for determining pulmonary ventilation, with a specific focus on multi-detector x-ray computed tomography and magnetic resonance imaging (MRI). We provide the important functional definitions of pulmonary ventilation typically used in medicine and physiology and discuss the fact that some of the imaging literature describes gas distribution abnormalities in pulmonary disease that may or may not be related to the physiological definition or clinical interpretation of ventilation. We also review the current state-of-the-field in terms of the key physiological questions yet unanswered related to ventilation and gas distribution in lung disease. Current and emerging imaging research methods are described, including their strengths and the challenges that remain to translate these methods to more wide-spread research and clinical use. We also examine how computed tomography and MRI might be used in the future to gain more insight into gas distribution and ventilation abnormalities in pulmonary disease. PMID:22653989

  10. Quantum Computing

    NASA Astrophysics Data System (ADS)

    Steffen, Matthias

    2013-03-01

    Quantum mechanics plays a crucial role in many day-to-day products, and has been successfully used to explain a wide variety of observations in Physics. While some quantum effects such as tunneling limit the degree to which modern CMOS devices can be scaled to ever reducing dimensions, others may potentially be exploited to build an entirely new computing architecture: The quantum computer. In this talk I will review several basic concepts of a quantum computer. Why quantum computing and how do we do it? What is the status of several (but not all) approaches towards building a quantum computer, including IBM's approach using superconducting qubits? And what will it take to build a functional machine? The promise is that a quantum computer could solve certain interesting computational problems such as factoring using exponentially fewer computational steps than classical systems. Although the most sophisticated modern quantum computing experiments to date do not outperform simple classical computations, it is increasingly becoming clear that small scale demonstrations with as many as 100 qubits are beginning to be within reach over the next several years. Such a demonstration would undoubtedly be a thrilling feat, and usher in a new era of controllably testing quantum mechanics or quantum computing aspects. At the minimum, future demonstrations will shed much light on what lies ahead.

  11. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  12. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips. PMID:3536223

  13. Computer Literacy: Teaching Computer Ethics.

    ERIC Educational Resources Information Center

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  14. Computational psychiatry

    PubMed Central

    Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter

    2013-01-01

    Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032

  15. Computational aerothermodynamics

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.; Green, Michael J.

    1987-01-01

    Computational aerothermodynamics (CAT) has in the past contributed to the understanding of real-gas flows encountered by hypervelocity reentry vehicles. With advances in computational fluid dynamics, in the modeling of high temperature phenomena, and in computer capability, CAT is an enabling technology for the design of many future space vehicles. An overview of the current capabilities of CAT is provided by describing available methods and their applications. Technical challenges that need to be met are discussed.

  16. Computer Software.

    ERIC Educational Resources Information Center

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  17. Computed Tomography

    NASA Astrophysics Data System (ADS)

    Castellano, Isabel; Geleijns, Jacob

    After its clinical introduction in 1973, computed tomography developed from an x-ray modality for axial imaging in neuroradiology into a versatile three-dimensional imaging modality for a wide range of applications in, for example, oncology, vascular radiology, cardiology, traumatology, and even interventional radiology. Computed tomography is applied for diagnosis, follow-up studies, and screening of healthy subpopulations with specific risk factors. This chapter provides a general introduction to computed tomography, covering a short history of computed tomography, technology, image quality, dosimetry, room shielding, quality control, and quality criteria.

  18. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  19. Computational Pathology

    PubMed Central

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  20. Computer Code

    NASA Technical Reports Server (NTRS)

    1985-01-01

    COSMIC MINIVER, a computer code developed by NASA for analyzing aerodynamic heating and heat transfer on the Space Shuttle, has been used by Marquardt Company to analyze heat transfer on Navy/Air Force missile bodies. The code analyzes heat transfer by four different methods which can be compared for accuracy. MINIVER saved Marquardt three months in computer time and $15,000.

  1. Computational astrophysics

    NASA Technical Reports Server (NTRS)

    Miller, Richard H.

    1987-01-01

    Astronomy is an area of applied physics in which unusually beautiful objects challenge the imagination to explain observed phenomena in terms of known laws of physics. It is a field that has stimulated the development of physical laws and of mathematical and computational methods. Current computational applications are discussed in terms of stellar and galactic evolution, galactic dynamics, and particle motions.

  2. Recreational Computing.

    ERIC Educational Resources Information Center

    Strot, Melody

    1999-01-01

    Urges teachers of gifted students to allow students unstructured recreational computer time in the classroom to encourage student exploration and discovery, to promote creativity, to develop problem-solving skills, and to allow time to revisit programs and complete their own tasks. Different types of educational computer programs are referenced.…

  3. Computer Insecurity.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1994-01-01

    College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)

  4. I, Computer

    ERIC Educational Resources Information Center

    Barack, Lauren

    2005-01-01

    What child hasn't chatted with friends through a computer? But chatting with a computer? Some Danish scientists have literally put a face on their latest software program, bringing to virtual life storyteller Hans Christian Andersen, who engages users in actual conversations. The digitized Andersen resides at the Hans Christian Andersen Museum in…

  5. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Discussed are three examples of computer graphics including biomorphs, Truchet tilings, and fractal popcorn. The graphics are shown and the basic algorithm using multiple iteration of a particular function or mathematical operation is described. An illustration of a snail shell created by computer graphics is presented. (YP)

  6. Computer Graphics.

    ERIC Educational Resources Information Center

    Halpern, Jeanne W.

    1970-01-01

    Computer graphics have been called the most exciting development in computer technology. At the University of Michigan, three kinds of graphics output equipment are now being used: symbolic printers, line plotters or drafting devices, and cathode-ray tubes (CRT). Six examples are given that demonstrate the range of graphics use at the University.…

  7. Computer News

    ERIC Educational Resources Information Center

    Science Activities: Classroom Projects and Curriculum Ideas, 2007

    2007-01-01

    This article presents several news stories about computers and technology. (1) Applied Science Associates of Narragansett, Rhode Island is providing computer modeling technology to help locate the remains of the USS Bonhomme Richard, which sank in 1779 after claiming a Revolutionary War victory. (2) Whyville, the leading edu-tainment virtual world…

  8. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  9. Optical computing.

    NASA Technical Reports Server (NTRS)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  10. Scanning protocol optimization and dose evaluation in coronary stenosis using multi-slices computed tomography

    NASA Astrophysics Data System (ADS)

    Huang, Yung-hui; Chen, Chia-lin; Sheu, Chin-yin; Lee, Jason J. S.

    2007-02-01

    Cardiovascular diseases are the most common cause of premature death in developed countries. A major fraction is attributable to atherosclerotic coronary artery disease, which may result in sudden cardiac failure. A reduction of mortality caused by myocardial infarction may be achieved if coronary atherosclerosis can be detected and treated at an early stage before symptoms occur. Therefore, there is a need for an effective tool that allows identification of patients at increased risk for future cardiac events. Multi-detector CT is now widely used for detection and quantification of coronary calcifications as a sign of coronary atherosclerosis. The aim of this study is to optimize the diagnostic value and radiation exposure of coronary artery calcium-screening examinations using multi-slice CT (MSCT) with different image scan protocols. The radiation exposure for all protocols is evaluated using computed tomography dose index (CTDI) phantom measurements. We chose an optimal scanning protocol and evaluated patient radiation dose in MSCT coronary artery screening while preserving the expected diagnostic accuracy. These changes give MSCT greater operational flexibility and diagnostic value in current practice.
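
    For context on how such protocol comparisons are usually reported, the volume CTDI and dose-length product follow from the phantom-measured weighted CTDI in a standard way. The pitch, scan length, and input value in the sketch below are illustrative assumptions, not results from this study.

        def ctdi_vol(ctdi_w_mgy, pitch):
            # Volume CTDI: weighted CTDI divided by the helical pitch
            return ctdi_w_mgy / pitch

        def dose_length_product(ctdi_vol_mgy, scan_length_cm):
            # DLP in mGy*cm: volume CTDI times the irradiated scan length
            return ctdi_vol_mgy * scan_length_cm

        # Illustrative numbers only (not from this study)
        vol = ctdi_vol(ctdi_w_mgy=12.0, pitch=1.0)
        print(vol, dose_length_product(vol, scan_length_cm=12.0))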

  11. Computer-aided detection of small bowel strictures in CT enterography

    NASA Astrophysics Data System (ADS)

    Sainani, Nisha I.; Näppi, Janne J.; Sahani, Dushyant V.; Yoshida, Hiroyuki

    2011-03-01

    The workflow of CT enterography in an emergency setting could be improved significantly by computer-aided detection (CAD) of small bowel strictures, enabling even non-expert radiologists to detect sites of obstruction rapidly. We developed a CAD scheme to detect strictures automatically from abdominal multi-detector CT enterography image data by use of multi-scale template matching and a blob detector method. A pilot study was performed on 15 patients with 22 surgically confirmed strictures to study the effect of the CAD scheme on observer performance. The 77% sensitivity of an inexperienced radiologist assisted by CAD was comparable with the 81% sensitivity of an unaided expert radiologist (p=0.07). The use of CAD significantly reduced the reading time needed to identify strictures (p<0.0001). Most of the false-positive CAD detections were caused by collapsed bowel loops, approximated bowel wall, muscles, or vessels, and they were easy to dismiss. The results indicate that CAD could provide radiologists with a rapid and accurate interpretation of strictures to improve workflow in an emergency setting.
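
    As a rough illustration of the blob-detector component mentioned above (not the authors' implementation; the smoothing scales, threshold, and synthetic slice are arbitrary assumptions), a Laplacian-of-Gaussian blob search on a single 2D slice could look like this:

        import numpy as np
        from skimage.feature import blob_log

        def find_candidate_blobs(slice_hu, min_sigma=2, max_sigma=8, threshold=0.1):
            # Normalize the slice to [0, 1] because blob_log expects a float image,
            # then run a Laplacian-of-Gaussian search for bright blob-like spots.
            img = slice_hu - slice_hu.min()
            img = img / (img.max() + 1e-6)
            # Each returned row is (row, col, sigma); sigma*sqrt(2) ~ blob radius.
            return blob_log(img, min_sigma=min_sigma, max_sigma=max_sigma,
                            threshold=threshold)

        # Toy example: a synthetic 2D slice with one bright spot
        demo = np.zeros((64, 64))
        demo[30:34, 40:44] = 200.0
        print(find_candidate_blobs(demo))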

  12. Evolutionary Computing

    SciTech Connect

    Patton, Robert M; Cui, Xiaohui; Jiao, Yu; Potok, Thomas E

    2008-01-01

    The rate at which information overwhelms humans is significantly greater than the rate at which humans have learned to process, analyze, and leverage this information. To overcome this challenge, new methods of computing must be formulated, and scientists and engineers have looked to nature for inspiration in developing these new methods. Consequently, evolutionary computing has emerged as a new paradigm for computing, and has rapidly demonstrated its ability to solve real-world problems where traditional techniques have failed. This field of work has now become quite broad and encompasses areas ranging from artificial life to neural networks. This chapter focuses specifically on two sub-areas of nature-inspired computing: Evolutionary Algorithms and Swarm Intelligence.

  13. Computer Stimulation

    ERIC Educational Resources Information Center

    Moore, John W.; Moore, Elizabeth

    1977-01-01

    Discusses computer simulation approach of Limits to Growth, in which interactions of five variables (population, pollution, resources, food per capita, and industrial output per capita) indicate status of the world. Reviews other books that predict future of the world. (CS)

  14. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  15. Advanced computing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Advanced concepts in hardware, software and algorithms are being pursued for application in next generation space computers and for ground based analysis of space data. The research program focuses on massively parallel computation and neural networks, as well as optical processing and optical networking which are discussed under photonics. Also included are theoretical programs in neural and nonlinear science, and device development for magnetic and ferroelectric memories.

  16. Chromatin Computation

    PubMed Central

    Bryant, Barbara

    2012-01-01

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109
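
    To make the nucleosome-tape analogy concrete, the toy sketch below treats a chromatin fiber as a list of marks and applies a single read-write rule over adjacent positions. The marks, the rule, and the spreading behavior are invented for illustration and are not taken from the paper.

        # Toy "chromatin computer": nucleosomes carry marks, and a rule rewrites
        # the marks on a small window of adjacent nucleosomes (cf. a Turing tape).
        def apply_rule(tape, rule, width=2):
            # One synchronous pass: match windows on the old tape, write to a copy.
            old = list(tape)
            new = list(tape)
            for i in range(len(old) - width + 1):
                window = tuple(old[i:i + width])
                if window in rule:
                    new[i:i + width] = rule[window]
            return new

        # Hypothetical rule: an "M" (methyl-like mark) spreads onto the bare
        # neighbor to its right, one nucleosome per pass.
        spread = {("M", "-"): ["M", "M"]}

        tape = ["M", "-", "-", "-", "-"]
        for _ in range(3):
            tape = apply_rule(tape, spread)
            print(tape)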

  17. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, both of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  18. Computational structures for robotic computations

    NASA Technical Reports Server (NTRS)

    Lee, C. S. G.; Chang, P. R.

    1987-01-01

    The computational problems of inverse kinematics and inverse dynamics of robot manipulators are discussed, taking advantage of parallelism and pipelining architectures. For the computation of the inverse kinematic position solution, a maximally pipelined CORDIC architecture has been designed based on a functional decomposition of the closed-form joint equations. For the inverse dynamics computation, an efficient p-fold parallel algorithm that overcomes the recurrence problem of the Newton-Euler equations of motion and achieves the time lower bound of O(log_2 n) has also been developed.
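
    For readers unfamiliar with the CORDIC primitive mentioned above, the following is a minimal software sketch of the standard shift-and-add rotation iteration; it is a generic illustration of the technique, not the pipelined architecture described in the report.

        import math

        def cordic_rotate(angle, iterations=24):
            # Compute (cos(angle), sin(angle)) with shift-and-add CORDIC rotations.
            # Each iteration rotates the vector by +/- atan(2**-i); the accumulated
            # scaling is removed at the end with the precomputed gain.
            atans = [math.atan(2.0 ** -i) for i in range(iterations)]
            gain = 1.0
            for a in atans:
                gain *= math.cos(a)

            x, y, z = 1.0, 0.0, angle
            for i in range(iterations):
                d = 1.0 if z >= 0 else -1.0
                x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
                z -= d * atans[i]
            return x * gain, y * gain

        print(cordic_rotate(math.pi / 6))   # approximately (0.8660, 0.5000)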

  19. [DNA computing].

    PubMed

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers can be an alternative for traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (imposed by the Heisenberg Uncertainty Principle) and by the growing amount of information transferred between the central processing unit and the main memory (von Neumann bottleneck). The idea of DNA computing came true for the first time in 1994, when Adleman solved the Hamiltonian Path Problem using short DNA oligomers and DNA ligase. In the early 2000s a series of biocomputer models was presented, beginning with the seminal work of Shapiro and his colleagues, who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as the software as well as the input/output signals. DNA molecules also provided the energy for this machine. DNA computing can be exploited in many applications, from study of gene expression patterns to diagnosis and therapy of cancer. Research on DNA computing is still in progress, both in vitro and in vivo, and its promising results give hope for a breakthrough in computer science. PMID:21735816
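
    As a loose software analogue of Adleman's generate-and-filter strategy described above, the sketch below enumerates candidate vertex orderings and keeps only valid Hamiltonian paths; the example graph is invented, and real DNA computing performs the generation and filtering with molecular operations rather than loops.

        import itertools

        # Hypothetical directed graph: does a Hamiltonian path exist from 0 to 4?
        edges = {(0, 1), (1, 2), (2, 3), (3, 4), (0, 2), (2, 4)}
        n = 5

        def hamiltonian_path(start, end):
            # "Generate" all vertex orderings, then "filter" as in Adleman's
            # protocol: keep paths with the right endpoints whose steps are edges.
            for perm in itertools.permutations(range(n)):
                if perm[0] != start or perm[-1] != end:
                    continue
                if all((a, b) in edges for a, b in zip(perm, perm[1:])):
                    return perm
            return None

        print(hamiltonian_path(0, 4))   # e.g. (0, 1, 2, 3, 4)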

  20. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  1. From macro-scale to micro-scale computational anatomy: a perspective on the next 20 years.

    PubMed

    Mori, Kensaku

    2016-10-01

    This paper gives our perspective on the next two decades of computational anatomy, which has made great strides in the recognition and understanding of human anatomy from conventional clinical images. The results from this field are now used in a variety of medical applications, including quantitative analysis of organ shapes, interventional assistance, surgical navigation, and population analysis. Several anatomical models have also been used in computational anatomy, and these mainly target millimeter-scale shapes. For example, liver-shape models are almost completely modeled at the millimeter scale, and shape variations are described at such scales. Most clinical 3D scanning devices have had just under 1 or 0.5 mm per voxel resolution for over 25 years, and this resolution has not changed drastically in that time. Although Z-axis (head-to-tail direction) resolution has been drastically improved by the introduction of multi-detector CT scanning devices, in-plane resolutions have not changed very much either. When we look at human anatomy, we can see different anatomical structures at different scales. For example, pulmonary blood vessels and lung lobes can be observed in millimeter-scale images. If we take 10-µm-scale images of a lung specimen, the alveoli and bronchiole regions can be located in them. Most work in millimeter-scale computational anatomy has been done by the medical-image analysis community. In the next two decades, we encourage our community to focus on micro-scale computational anatomy. In this perspective paper, we briefly review the achievements of computational anatomy and its impacts on clinical applications; furthermore, we show several possibilities from the viewpoint of microscopic computational anatomy by discussing experimental results from our recent research activities. PMID:27423408

  2. Computational psychiatry.

    PubMed

    Wang, Xiao-Jing; Krystal, John H

    2014-11-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional, and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  3. Computational Psychiatry

    PubMed Central

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  4. Computational mechanics

    SciTech Connect

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for "Springback Predictability" and with the Federal Aviation Administration (FAA) for the "Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris." In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  5. Computational vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1981-01-01

    The range of fundamental computational principles underlying human vision that equally apply to artificial and natural systems is surveyed. There emerges from research a view of the structuring of vision systems as a sequence of levels of representation, with the initial levels being primarily iconic (edges, regions, gradients) and the highest symbolic (surfaces, objects, scenes). Intermediate levels are constrained by information made available by preceding levels and information required by subsequent levels. In particular, it appears that physical and three-dimensional surface characteristics provide a critical transition from iconic to symbolic representations. A plausible vision system design incorporating these principles is outlined, and its key computational processes are elaborated.

  6. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  7. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  8. Quantum Computation

    NASA Astrophysics Data System (ADS)

    Ekert, Artur

    1994-08-01

    As computers become faster they must become smaller because of the finiteness of the speed of light. The history of computer technology has involved a sequence of changes from one type of physical realisation to another - from gears to relays to valves to transistors to integrated circuits and so on. Quantum mechanics is already important in the design of microelectronic components. Soon it will be necessary to harness quantum mechanics rather than simply take it into account, and at that point it will be possible to give data processing devices new functionality.

  9. Computer Guerrillas.

    ERIC Educational Resources Information Center

    Immel, A. Richard

    1983-01-01

    Describes several cases in which microcomputers were used to prevent large organizations (e.g., utility companies, U.S. Government Forestry Commission) from carrying out actions considered not to be in the public's best interests. The use of the computers by social activists in their efforts to halt environmental destruction is discussed. (EAO)

  10. Computer Corner.

    ERIC Educational Resources Information Center

    Smith, David A.; And Others

    1986-01-01

    APL was invented specifically as a mathematical teaching tool, and is an excellent vehicle for teaching mathematical concepts using computers. This article illustrates the use of APL in teaching many different topics in mathematics, including logic, set theory, functions, statistics, linear algebra, and matrices. (MNS)

  11. Computer Corner.

    ERIC Educational Resources Information Center

    Mason, Margie

    1985-01-01

    Provides tips to help primary-aged students with computer keyboarding skills (suggesting the use of color codes and listing currently available software). Also describes (and lists) a program which helps test students' understanding of IF-THEN statements and illustrates some hazards of "spaghetti programming" (debugging). (JN)

  12. Computational Musicology.

    ERIC Educational Resources Information Center

    Bel, Bernard; Vecchione, Bernard

    1993-01-01

    Asserts that a revolution has been occurring in musicology since the 1970s. Contends that music has changed from being only a source of emotion to appearing more open to science and techniques based on computer technology. Describes recent research and other writings about the topic and provides an extensive bibliography. (CFR)

  13. Computational trigonometry

    SciTech Connect

    Gustafson, K.

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
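
    The trigonometric quantities referred to here can be illustrated numerically. The sketch below uses the closed-form expression for the first antieigenvalue of a symmetric positive-definite matrix (stated from general knowledge of Gustafson's antieigenvalue theory, not taken from this abstract) and checks it against a crude random search; the test matrix is an arbitrary example.

        import numpy as np

        # Symmetric positive-definite test matrix with extreme eigenvalues 1 and 9
        A = np.diag([1.0, 4.0, 9.0])
        lam_min, lam_max = 1.0, 9.0

        # Closed-form first antieigenvalue: cos of the largest turning angle of A
        mu1 = 2.0 * np.sqrt(lam_min * lam_max) / (lam_min + lam_max)

        # Numerical check: smallest cosine between x and A x over random directions
        rng = np.random.default_rng(0)
        xs = rng.normal(size=(100000, 3))
        cos = np.einsum('ij,ij->i', xs, xs @ A) / (
            np.linalg.norm(xs, axis=1) * np.linalg.norm(xs @ A, axis=1))
        print(mu1, cos.min())   # both approximately 0.6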

  14. Computer proposals

    NASA Astrophysics Data System (ADS)

    Richman, Barbara T.

    To expand the research community's access to supercomputers, the National Science Foundation (NSF) has begun a program to match researchers who require the capabilities of a supercomputer with those facilities that have such computer resources available. Recent studies on computer needs in scientific and engineering research underscore the need for greater access to supercomputers (Eos, July 6, 1982, p. 562), especially those categorized as “Class VI” machines. Complex computer models for research on astronomy, the oceans, and the atmosphere often require such capabilities. In addition, similar needs are emerging in the earth sciences: A Union session at the AGU Fall Meeting in San Francisco this week will focus on the research computing needs of the geosciences. A Class VI supercomputer has a memory capacity of at least 1 megaword, a speed of upwards of 100 MFLOPS (million floating point operations per second), and both scalar and vector registers in the CPU (central processing unit). Examples of Class VI machines are the CRAY-1 and the CYBER 205. The high costs of these machines, the most powerful ones available, preclude most research facilities from owning one.

  15. Amorphous Computing

    NASA Astrophysics Data System (ADS)

    Sussman, Gerald

    2002-03-01

    Digital computers have always been constructed to behave as precise arrangements of reliable parts, and our techniques for organizing computations depend upon this precision and reliability. Two emerging technologies, however, are beginning to undercut these assumptions about constructing and programming computers. These technologies -- microfabrication and bioengineering -- will make it possible to assemble systems composed of myriad information-processing units at almost no cost, provided: 1) that not all the units need to work correctly; and 2) that there is no need to manufacture precise geometrical arrangements or interconnection patterns among them. Microelectronic mechanical components are becoming so inexpensive to manufacture that we can anticipate combining logic circuits, microsensors, actuators, and communications devices integrated on the same chip to produce particles that could be mixed with bulk materials, such as paints, gels, and concrete. Imagine coating bridges or buildings with smart paint that can sense and report on traffic and wind loads and monitor structural integrity of the bridge. A smart paint coating on a wall could sense vibrations, monitor the premises for intruders, or cancel noise. Even more striking, there has been such astounding progress in understanding the biochemical mechanisms in individual cells that it appears we'll be able to harness these mechanisms to construct digital-logic circuits. Imagine a discipline of cellular engineering that could tailor-make biological cells that function as sensors and actuators, as programmable delivery vehicles for pharmaceuticals, as chemical factories for the assembly of nanoscale structures. Fabricating such systems seems to be within our reach, even if it is not yet within our grasp. Fabrication, however, is only part of the story. We can envision producing vast quantities of individual computing elements, whether microfabricated particles, engineered cells, or macromolecular computing

  16. Bacteria as computers making computers

    PubMed Central

    Danchin, Antoine

    2009-01-01

    Various efforts to integrate biological knowledge into networks of interactions have produced a lively microbial systems biology. Putting molecular biology and computer sciences in perspective, we review another trend in systems biology, in which recursivity and information replace the usual concepts of differential equations, feedback and feedforward loops and the like. Noting that the processes of gene expression separate the genome from the cell machinery, we analyse the role of the separation between machine and program in computers. However, computers do not make computers. For cells to make cells requires a specific organization of the genetic program, which we investigate using available knowledge. Microbial genomes are organized into a paleome (the name emphasizes the role of the corresponding functions from the time of the origin of life), comprising a constructor and a replicator, and a cenome (emphasizing community-relevant genes), made up of genes that permit life in a particular context. The cell duplication process supposes rejuvenation of the machine and replication of the program. The paleome also possesses genes that enable information to accumulate in a ratchet-like process down the generations. Systems biology must include the dynamics of information creation in its future developments. PMID:19016882

  17. RATIO COMPUTER

    DOEpatents

    Post, R.F.

    1958-11-11

    An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals and each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of the input signals, depending upon the relation of input to fixed signals in the first-mentioned channel.

  18. Computational Combustion

    SciTech Connect

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark-ignition, diesel, and homogeneous-charge compression-ignition engines, surface and catalytic combustion, pulse combustion, and detonations is described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  19. Singularity computations

    NASA Technical Reports Server (NTRS)

    Swedlow, J. L.

    1976-01-01

    An approach is described for singularity computations based on a numerical method for elastoplastic flow to delineate radial and angular distribution of field quantities and measure the intensity of the singularity. The method is applicable to problems in solid mechanics and lends itself to certain types of heat flow and fluid motion studies. Its use is not limited to linear, elastic, small strain, or two-dimensional situations.

  20. Computer vision

    NASA Technical Reports Server (NTRS)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  1. New computer architectures

    SciTech Connect

    Tiberghien, J.

    1984-01-01

    This book presents papers on supercomputers. Topics considered include decentralized computer architecture, new programming languages, data flow computers, reduction computers, parallel prefix calculations, structural and behavioral descriptions of digital systems, instruction sets, software generation, personal computing, and computer architecture education.

  2. Radiological Protection in Cone Beam Computed Tomography (CBCT). ICRP Publication 129.

    PubMed

    Rehani, M M; Gupta, R; Bartling, S; Sharp, G C; Pauwels, R; Berris, T; Boone, J M

    2015-07-01

    The objective of this publication is to provide guidance on radiological protection in the new technology of cone beam computed tomography (CBCT). Publications 87 and 102 dealt with patient dose management in computed tomography (CT) and multi-detector CT. The new applications of CBCT and the associated radiological protection issues are substantially different from those of conventional CT. The perception that CBCT involves lower doses was only true in initial applications. CBCT is now used widely by specialists who have little or no training in radiological protection. This publication provides recommendations on radiation dose management directed at different stakeholders, and covers principles of radiological protection, training, and quality assurance aspects. Advice on appropriate use of CBCT needs to be made widely available. Advice on optimisation of protection when using CBCT equipment needs to be strengthened, particularly with respect to the use of newer features of the equipment. Manufacturers should standardise radiation dose displays on CBCT equipment to assist users in optimisation of protection and comparisons of performance. Additional challenges to radiological protection are introduced when CBCT-capable equipment is used for both fluoroscopy and tomography during the same procedure. Standardised methods need to be established for tracking and reporting of patient radiation doses from these procedures. The recommendations provided in this publication may evolve in the future as CBCT equipment and applications evolve. As with previous ICRP publications, the Commission hopes that imaging professionals, medical physicists, and manufacturers will use the guidelines and recommendations provided in this publication for implementation of the Commission's principle of optimisation of protection of patients and medical workers, with the objective of keeping exposures as low as reasonably achievable, taking into account economic and societal factors, and

  3. A hybrid lung and vessel segmentation algorithm for computer aided detection of pulmonary embolism

    NASA Astrophysics Data System (ADS)

    Raghupathi, Laks; Lakare, Sarang

    2009-02-01

    Advances in multi-detector technology have made CT pulmonary angiography (CTPA) a popular radiological tool for pulmonary emboli (PE) detection. CTPA provides rich detail of lung anatomy and is a useful diagnostic aid in highlighting even very small PE. However, analyzing hundreds of slices is laborious and time-consuming for the practicing radiologist and may also lead to misdiagnosis because of the presence of various PE look-alikes. Computer-aided diagnosis (CAD) can be a potential second reader in providing key diagnostic information. Since PE occurs only in vessel arteries, it is important to mark this region of interest (ROI) during CAD preprocessing. In this paper, we present a new lung and vessel segmentation algorithm for extracting the contrast-enhanced vessel ROI in CTPA. Existing approaches to segmentation either provide only the larger lung area without highlighting the vessels or are computationally prohibitive. We propose a hybrid lung and vessel segmentation that uses an initial lung ROI and determines the vessels through a series of refinement steps. We first identify a coarse vessel ROI by finding the "holes" in the lung ROI. We then use this ROI as seed points for a region-growing process while carefully excluding regions that are not relevant. The vessel segmentation mask covers 99% of the 259 PE in a real-world set of 107 CTPA studies. Further, our algorithm increases the net sensitivity of a prototype CAD system by 5-9% across all PE categories in the training and validation data sets. The average run time of the algorithm was only 100 seconds on a standard workstation.
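
    As a minimal illustration of the hole-finding and region-growing idea described above (not the authors' implementation), the following Python sketch assumes a 3D CT volume `ct` in Hounsfield units and a binary lung mask `lung_mask` are already available; the intensity window and connectivity are illustrative assumptions.

        import numpy as np
        from scipy import ndimage

        def coarse_vessel_roi(lung_mask):
            """Vessel candidates are the 'holes' inside the lung mask, i.e. bright
            structures excluded from the air-filled lung segmentation."""
            filled = ndimage.binary_fill_holes(lung_mask)
            return filled & ~lung_mask

        def grow_vessels(ct, seeds, low=100, high=600, max_iter=20):
            """Grow the seed region into neighbouring voxels whose attenuation lies
            in an assumed contrast-enhanced vessel window [low, high] HU."""
            candidate = (ct >= low) & (ct <= high)
            structure = ndimage.generate_binary_structure(3, 1)  # 6-connectivity
            grown = seeds & candidate
            for _ in range(max_iter):
                dilated = ndimage.binary_dilation(grown, structure) & candidate
                if np.array_equal(dilated, grown):
                    break
                grown = dilated
            return grown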

  4. Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Astsatryan, H. V.

    2015-07-01

    Present astronomical archives contain billions of objects, both Galactic and extragalactic, and the vast amount of data on them allows new studies and discoveries. Astrophysical Virtual Observatories (VO) use available databases and current observing material as a collection of interoperating data archives and software tools to form a research environment in which complex research programs can be conducted. Most modern databases now provide VO access to the stored information, which also makes fast analysis and management of these data possible. Cross-correlations reveal new objects and new samples. Very often tens of thousands of sources hide a few very interesting ones that can only be discovered by comparing various physical characteristics. VO is a prototype of Grid technologies that allows distributed data computation, analysis and imaging. Particularly important are data reduction and analysis systems: spectral analysis, SED building and fitting, modelling, variability studies, cross-correlations, etc. Computational astrophysics has become an indissoluble part of astronomy, and most modern research is done by means of it.

  5. Computational crystallization.

    PubMed

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed. PMID:26792536

  6. Computer vision

    SciTech Connect

    Not Available

    1982-01-01

    This paper discusses material from areas such as artificial intelligence, psychology, computer graphics, and image processing. The intent is to assemble a selection of this material in a form that will serve both as a senior/graduate-level academic text and as a useful reference to those building vision systems. This book has a strong artificial intelligence flavour, emphasising the belief that both the intrinsic image information and the internal model of the world are important in successful vision systems. The book is organised into four parts, based on descriptions of objects at four different levels of abstraction. These are: generalised images-images and image-like entities; segmented images-images organised into subimages that are likely to correspond to interesting objects; geometric structures-quantitative models of image and world structures; relational structures-complex symbolic descriptions of image and world structures. The book contains author and subject indexes.

  7. Computational Cosmology

    NASA Astrophysics Data System (ADS)

    Abel, Tom

    2013-01-01

    Gravitational instability of small density fluctuations, possibly created during an early inflationary period, is the key process leading to the formation of all structure in the Universe. New numerical algorithms have recently enabled much progress in understanding the relevant physical processes dominating the first billion years of structure formation. Computational cosmologists are attempting to simulate on their supercomputers how galaxies come about. In recent years, first attempts to follow the formation and eventual death of every single star in these model galaxies have come within reach. The models now include gravity for both dark matter and baryonic matter, hydrodynamics, the radiation from massive stars and its impact in shaping the surrounding material, gas chemistry, and all the key radiative atomic and molecular physics determining the thermal state of the model gas. In a small number of cases, even the role of magnetic fields on galactic scales is being studied. At the same time we are learning more about the limitations of certain numerical techniques and developing new schemes to more accurately follow the interplay of these many different physical processes. This talk is in two parts. First, we take a bird's-eye view of the physical processes relevant for structure formation and potential approaches to solving the relevant equations efficiently and accurately on modern supercomputers. Second, we focus on one of those processes: the intricate and fascinating dynamics of the likely collisionless fluid of dark matter. A novel way of following the intricate evolution of such collisionless fluids in phase space is allowing us to construct new numerical methods to help understand the nature of dark matter halos as well as problems in astrophysical and terrestrial plasmas.

  8. Computer Technology and Education.

    ERIC Educational Resources Information Center

    Senter, Joy

    1981-01-01

    Examines educational tasks in general computing, including computer-assisted instruction, computer-managed instruction, word processing, secretarial and business applications, time sharing, and networking to larger computers. (CT)

  9. Computer security in DOE distributed computing systems

    SciTech Connect

    Hunteman, W.J.

    1990-01-01

    The modernization of DOE facilities amid limited funding is creating pressure on DOE facilities to find innovative approaches to their daily activities. Distributed computing systems are becoming cost-effective solutions to improved productivity. This paper defines and describes typical distributed computing systems in the DOE. The special computer security problems present in distributed computing systems are identified and compared with traditional computer systems. The existing DOE computer security policy supports only basic networks and traditional computer systems and does not address distributed computing systems. A review of the existing policy requirements is followed by an analysis of the policy as it applies to distributed computing systems. Suggested changes in the DOE computer security policy are identified and discussed. The long lead time in updating DOE policy will require guidelines for applying the existing policy to distributed systems. Some possible interim approaches are identified and discussed. 2 refs.

  10. Making the Computer Neuter.

    ERIC Educational Resources Information Center

    Sanders, Jo Shuchat

    1985-01-01

    Summarizes findings of Computer Equity Training Project studies concerning female presence in computer magazines; home computer use variability by sex; student software evaluation; and influence on computer use of teacher gender, gender of other computer users, and work environment. Successful classroom computer equity approaches based on these…

  11. Analysis of in-plane signal-to-noise ratio in computed tomography

    NASA Astrophysics Data System (ADS)

    Hara, Takanori; Ichikawa, Katsuhiro; Sanada, Shigeru; Ida, Yoshihiro

    2008-03-01

    The purposes of this study are to analyze signal-to-noise ratio (SNR) changes with in-plane (axial) position and direction in X-ray computed tomography (CT) systems and to verify the corresponding visual effects using simulated small low-contrast disc objects. Three models of multi-detector row CT were employed. The modulation transfer function (MTF) was obtained using a thin metal wire. Noise power spectra (NPS) were obtained using a cylindrical water phantom. The measurement positions were set at the center and at off-center positions of 64 mm, 128 mm and 192 mm. One-dimensional MTFs and NPS for the x- and y-directions were calculated by means of a numerical slit-scanning method. SNRs were then calculated from the MTFs and NPS. Simulated low-contrast disc objects with diameters of 2 to 10 mm and contrasts to background of 3.0%, 4.5% and 6.0% were superimposed on the water phantom images. The simulated objects in the images were then visually evaluated for degree of recognition, and the validity of the resultant SNRs was examined. The resultant in-plane SNRs differed between the center and the peripheries, showing a trend of SNR values increasing with distance from the center. The degree of increase differed between the x- and y-directions and also varied with the CT system. These results suggest that the peripheral region has higher low-contrast detectability than the center. The properties derived in this study indicate that depiction ability at different in-plane positions is not uniform in clinical CT images and that the detectability of low-contrast lesions may be affected.
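
    The abstract does not give the exact formula used, but one common way to combine a one-dimensional MTF and NPS into a frequency-dependent SNR is sketched below; the array names, the ratio definition and the integration are assumptions made for illustration only.

        import numpy as np

        def snr_curve(mtf, nps, eps=1e-12):
            """One plausible definition: signal transfer divided by noise amplitude
            at each spatial frequency (mtf and nps sampled at the same frequencies)."""
            return mtf / np.sqrt(nps + eps)

        def integrated_snr(freq, mtf, nps):
            """Collapse the curve to a single figure by integrating over frequency."""
            return np.trapz(snr_curve(mtf, nps), freq)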

  12. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    NASA Astrophysics Data System (ADS)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

    While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature 'correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10^-4). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
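
    For readers unfamiliar with the texture measure, the sketch below shows one way to compute a GLCM 'correlation' feature from a BMD-calibrated ROI using scikit-image; the quantization, distances and angles are assumptions for illustration, not the settings used in the study.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        def glcm_correlation(roi, levels=32):
            """roi: 2D array of BMD-equivalent values inside the trabecular ROI."""
            lo, hi = roi.min(), roi.max()
            # Quantize to a small number of gray levels for a stable co-occurrence matrix.
            q = np.floor((roi - lo) / (hi - lo + 1e-9) * (levels - 1)).astype(np.uint8)
            glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                                levels=levels, symmetric=True, normed=True)
            return graycoprops(glcm, 'correlation').mean()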

  13. Ultrasonography in the diagnosis of nasal bone fractures: a comparison with conventional radiography and computed tomography.

    PubMed

    Lee, In Sook; Lee, Jung-Hoon; Woo, Chang-Ki; Kim, Hak Jin; Sol, Yu Li; Song, Jong Woon; Cho, Kyu-Sup

    2016-02-01

    The purpose of this study was to evaluate and compare the diagnostic efficacy of ultrasonography (US) with radiography and multi-detector computed tomography (CT) for the detection of nasal bone fractures. Forty-one patients with a nasal bone fracture who underwent prospective US examinations were included. Plain radiographs and CT images were obtained on the day of trauma. For the US examinations, the radiologist used a linear array transducer (L17-5 MHz) in 24 patients and a hockey-stick probe (L15-7 MHz) in 17. The bony component of the nose was divided into three parts (right and left lateral nasal walls, and midline of the nasal bone). Fracture detection by the three modalities was subjected to analysis. Furthermore, the findings of each modality were compared with intraoperative findings. Nasal bone fractures were located in the right lateral wall (n = 28), midline of the nasal bone (n = 31), or left lateral wall (n = 31). For the right and left lateral nasal walls, CT had greater sensitivity and specificity than US or radiography, and agreed better with intraoperative findings. However, for midline fractures of the nasal bone, US had higher specificity, positive predictive value, and negative predictive value than CT. Although the two US evaluations showed good agreement at all three sites, US findings obtained with the hockey-stick probe showed closer agreement with intraoperative findings for both the lateral nasal walls and the midline of the nasal bone. Although CT showed higher sensitivity and specificity than US or radiography, US was found to be helpful for evaluating the midline of the nasal bone. Furthermore, for US examinations of the nasal bone, a smaller probe and higher frequency may be required. PMID:25749616

  14. Numerical Analysis of Organ Doses Delivered During Computed Tomography Examinations Using Japanese Adult Phantoms with the WAZA-ARI Dosimetry System.

    PubMed

    Takahashi, Fumiaki; Sato, Kaoru; Endo, Akira; Ono, Koji; Ban, Nobuhiko; Hasegawa, Takayuki; Katsunuma, Yasushi; Yoshitake, Takayasu; Kai, Michiaki

    2015-08-01

    A dosimetry system for computed tomography (CT) examinations, named WAZA-ARI, is being developed to accurately assess radiation doses to patients in Japan. For dose calculations in WAZA-ARI, organ doses were numerically analyzed using average adult Japanese male (JM) and female (JF) phantoms with the Particle and Heavy Ion Transport code System (PHITS). Experimental studies clarified the photon energy distribution of emitted photons and dose profiles on the table for some multi-detector row CT (MDCT) devices. Numerical analyses using a source model in PHITS could specifically take into account emissions of x rays from the tube to the table with attenuation of photons through a beam-shaping filter for each MDCT device based on the experiment results. The source model was validated by measuring the CT dose index (CTDI). Numerical analyses with PHITS revealed a concordance of organ doses with body sizes of the JM and JF phantoms. The organ doses in the JM phantoms were compared with data obtained using previously developed systems. In addition, the dose calculations in WAZA-ARI were verified with previously reported results by realistic NUBAS phantoms and radiation dose measurement using a physical Japanese model (THRA1 phantom). The results imply that numerical analyses using the Japanese phantoms and specified source models can give reasonable estimates of dose for MDCT devices for typical Japanese adults. PMID:26107430

  15. Changes in entrance surface dose in relation to the location of shielding material in chest computed tomography

    NASA Astrophysics Data System (ADS)

    Kang, Y. M.; Cho, J. H.; Kim, S. C.

    2015-07-01

    This study examined the entrance surface dose (ESD) to the abdomen and pelvis of the patient during a chest computed tomography (CT) procedure and evaluated the ESD reduction achieved depending on the location of the radiation shield. A 64-slice multi-detector computed tomography scanner was used, together with the Alderson radiation therapy phantom and optically stimulated luminescence dosimeters (OSLD), which enable measurement from low to high doses. For the dose measurements, slices 9 to 21 of the phantom, covering the lung apex to both costophrenic angles, were set as the test range. A total of 10 OSLD nanoDots were attached for measurement of the front and rear ESD. Repeated tests were performed using the low-dose chest CT and high-resolution CT (HRCT) protocols with the following set-ups: without shielding; shielding only on the front side; shielding only on the rear side; and shielding on both the front and rear sides. According to the results, ESD on both the front and rear sides was higher with HRCT than with low-dose CT when no radiation shield was used. Compared with the set-up without a shield, placing the shield on the front side was effective in reducing the front ESD, while placing it on the rear side reduced the rear ESD; shielding both the front and rear sides reduced ESD on both. In conclusion, shielding the front and rear sides was the most effective method of reducing the ESD caused by scattered radiation during the scan.

  16. CAA: Computer Assisted Athletics.

    ERIC Educational Resources Information Center

    Hall, John H.

    Computers have been used in a variety of applications for athletics since the late 1950's. These have ranged from computer-controlled electric scoreboards to computer-designed pole vaulting poles. Described in this paper are a computer-based athletic injury reporting system and a computer-assisted football scouting system. The injury reporting…

  17. The Old Computers' Home.

    ERIC Educational Resources Information Center

    Angier, Natalie

    1983-01-01

    The Computer Museum in Marlborough, Massachusetts houses old and not-so-old calculators, famous old computers and parts of computers, photographs and assorted memorabilia, computer-generated murals, and even a computer made of Tinkertoys that plays tick-tack-toe. The development of the museum and selected exhibits is described. (Author/JN)

  18. The assumptions of computing

    SciTech Connect

    Huggins, J.K.

    1994-12-31

    The use of computers, like any technological activity, is not content-neutral. Users of computers constantly interact with assumptions regarding worthwhile activity which are embedded in any computing system. Directly questioning these assumptions in the context of computing allows us to develop an understanding of responsible computing.

  19. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  20. Overview of Computer Hardware.

    ERIC Educational Resources Information Center

    Tidball, Charles S.

    1980-01-01

    Reviews development in electronics technology of digital computers, considering the binary number representation, miniaturization of electronic components, cost and space requirements of computers, ways in which computers are used, and types of computers appropriate for teaching computer literacy and demonstrating physiological simulation. (CS)

  1. Tying into Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    Topics in this paper include: sources of computer programs, public domain software, copyright violations, purposes of computers in classrooms (drill/practice and interactive learning), computer assisted instruction, flow charts, and computer clubs (such as App-le-kations in Charlotte, North Carolina). A complete listing of two computer programs…

  2. Computer Inequities in Opportunities for Computer Literacy.

    ERIC Educational Resources Information Center

    Anderson, Ronald E.; And Others

    The Science Assessment and Research Project conducted an assessment of the opportunities for computer learning in the nation's schools. As part of this study, 15,847 junior and senior high school students (13 and 17 years old) responded to a questionnaire regarding computers and computer usage. This is a summary of the findings: Opportunities for…

  3. computePk: Power spectrum computation

    NASA Astrophysics Data System (ADS)

    L'Huillier, Benjamin

    2014-03-01

    ComputePk computes the power spectrum in cosmological simulations. It is MPI parallel and has been tested up to a 4096^3 mesh. It uses the FFTW library. It can read Gadget-3 and GOTPM outputs, and computes the dark matter component. The user may choose between NGP, CIC, and TSC for the mass assignment scheme.
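
    The sketch below illustrates, in plain numpy rather than the package's own MPI/FFTW code, the two steps the description implies: cloud-in-cell (CIC) mass assignment onto a mesh (NGP and TSC differ only in the kernel) and spherical averaging of |delta_k|^2; the normalization and binning conventions are assumptions.

        import numpy as np

        def cic_density_contrast(pos, ngrid, boxsize):
            """Cloud-in-cell assignment of particle positions (N, 3) to a cubic mesh,
            returning the density contrast delta = rho / <rho> - 1."""
            rho = np.zeros((ngrid,) * 3)
            x = pos / boxsize * ngrid
            i0 = np.floor(x).astype(int)
            f = x - i0
            for dx in (0, 1):
                for dy in (0, 1):
                    for dz in (0, 1):
                        w = (np.abs(1 - dx - f[:, 0]) * np.abs(1 - dy - f[:, 1])
                             * np.abs(1 - dz - f[:, 2]))
                        idx = (i0 + [dx, dy, dz]) % ngrid
                        np.add.at(rho, (idx[:, 0], idx[:, 1], idx[:, 2]), w)
            return rho / rho.mean() - 1.0

        def power_spectrum(delta, boxsize, nbins=32):
            """Spherically averaged P(k) estimate from a density-contrast mesh."""
            ngrid = delta.shape[0]
            dk = np.fft.rfftn(delta) / ngrid ** 3
            kx = np.fft.fftfreq(ngrid, d=boxsize / ngrid) * 2 * np.pi
            kz = np.fft.rfftfreq(ngrid, d=boxsize / ngrid) * 2 * np.pi
            kmag = np.sqrt(kx[:, None, None] ** 2 + kx[None, :, None] ** 2
                           + kz[None, None, :] ** 2)
            pk = np.abs(dk) ** 2 * boxsize ** 3
            edges = np.linspace(0, kmag.max(), nbins + 1)
            which = np.digitize(kmag.ravel(), edges)
            counts = np.bincount(which, minlength=nbins + 2)[1:nbins + 1]
            sums = np.bincount(which, weights=pk.ravel(), minlength=nbins + 2)[1:nbins + 1]
            centres = 0.5 * (edges[:-1] + edges[1:])
            return centres, np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)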

  4. Computational thinking and thinking about computing

    PubMed Central

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing. PMID:18672462

  5. [The features of adults' coronary artery anomalies shown by 64-multi-detector rows CT].

    PubMed

    Yuan, Zhentuan; Yu, Jianqun; Zhang, Youyi; Yuan, Hongmei

    2009-06-01

    To analyze adults' coronary artery anomalies revealed by 64-MDCT, we retrospectively analyzed 34 cases of coronary artery anomalies (26 males and 8 females; mean age 53.4 years, range 30 to 72 years). Multi-planar reconstruction (MPR), maximum intensity projection (MIP), surface shadow display (SSD) and volume rendering (VR) were used to demonstrate the anomalous coronary arteries. We found 4 cases of RCA arising from the left coronary sinus, 8 cases with secondary RCA, 1 case with a high origin of the left main (LM) segment from the left sinus of Valsalva, 1 case with the LAD originating from the main pulmonary artery, and 3 cases with separate origins of the LAD and LCX. Myocardial bridging was shown in 10 cases (9 in the LAD, 1 in the LCX); coronary fistulae were seen in 2 cases (one RCA-RA fistula, the other an LAD-RV fistula), and coronary aneurysms were found in 2 cases. In 3 cases the RCA was short and small. In conclusion, 64-MDCT is a good choice for diagnosing anomalous coronary arteries. PMID:19634659

  6. Real-time operating system for a multi-laser/multi-detector system

    NASA Technical Reports Server (NTRS)

    Coles, G.

    1980-01-01

    The laser-one hazard detector system, used on the Rensselaer Mars rover, is reviewed briefly with respect to the hardware subsystems, the operation, and the results obtained. A multidetector scanning system was designed to improve on the original system. Interactive support software was designed and programmed to implement real time control of the rover or platform with the elevation scanning mast. The formats of both the raw data and the post-run data files were selected. In addition, the interface requirements were selected and some initial hardware-software testing was completed.

  7. A multi-detector neutron spectrometer with nearly isotropic response for environmental and workplace monitoring

    NASA Astrophysics Data System (ADS)

    Gómez-Ros, J. M.; Bedogni, R.; Moraleda, M.; Delgado, A.; Romero, A.; Esposito, A.

    2010-01-01

    This communication describes an improved design for a neutron spectrometer consisting of 6Li thermoluminescent dosemeters located at selected positions within a single moderating polyethylene sphere. The spatial arrangement of the dosemeters has been designed using the MCNPX Monte Carlo code to calculate the response matrix for 56 log-equidistant energies from 10^-9 to 100 MeV, looking for a configuration that yields a nearly isotropic response for neutrons in the energy range from thermal to 20 MeV. The feasibility of the proposed spectrometer and the isotropy of its response have been evaluated by simulating exposures to different reference and workplace neutron fields. The FRUIT code has been used for unfolding purposes. The results of the simulations as well as the experimental tests confirm the suitability of the prototype for environmental and workplace monitoring applications.

  8. Sub-10-Minute Characterization of an Ultrahigh Molar Mass Polymer by Multi-detector Hydrodynamic Chromatography

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Molar mass averages, distributions, and architectural information of polymers are routinely obtained using size-exclusion chromatography (SEC). It has previously been shown that ultrahigh molar mass polymers may experience degradation during SEC analysis, leading to inaccurate molar mass averages a...

  9. On Teaching Computer Programming.

    ERIC Educational Resources Information Center

    Er, M. C.

    1984-01-01

    Points out difficulties associated with teaching introductory computer programing courses, discussing the importance of computer programing and explains activities associated with its use. Possible solutions to help teachers resolve problem areas in computer instruction are also offered. (ML)

  10. Computers: Instruments of Change.

    ERIC Educational Resources Information Center

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  11. The New Administrative Computing.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    1988-01-01

    The past decade has seen dramatic changes in administrative computing, including more systems, more applications, a new group of computer users, and new opportunities for computer use in campus administration. (Author/MSE)

  12. Computer Viruses: An Overview.

    ERIC Educational Resources Information Center

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  13. Avoiding Computer Viruses.

    ERIC Educational Resources Information Center

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  14. Selecting Appropriate Computing Tools.

    ERIC Educational Resources Information Center

    Tetlow, William L.

    1990-01-01

    Selecting computer tools requires analyzing information requirements and audiences, assessing existing institutional research and computing capacities, creating or improving a planning database, using computer experts, determining software needs, obtaining sufficient resources for independent operations, acquiring quality, and insisting on…

  15. Environmentalists and the Computer.

    ERIC Educational Resources Information Center

    Baron, Robert C.

    1982-01-01

    Review characteristics, applications, and limitations of computers, including word processing, data/record keeping, scientific and industrial, and educational applications. Discusses misuse of computers and role of computers in environmental management. (JN)

  16. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  17. Computing technology in the 1980's. [computers

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  18. Computer Lab Configuration.

    ERIC Educational Resources Information Center

    Wodarz, Nan

    2003-01-01

    Describes the layout and elements of an effective school computer lab. Includes configuration, storage spaces, cabling and electrical requirements, lighting, furniture, and computer hardware and peripherals. (PKP)

  19. Computer hardware fault administration

    DOEpatents

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.

  20. Undergraduate computational physics projects on quantum computing

    NASA Astrophysics Data System (ADS)

    Candela, D.

    2015-08-01

    Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
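
    As a flavor of the initial projects mentioned (my own minimal sketch, not the article's code), the following numpy snippet applies Hadamard gates and a single Grover iteration to find a marked basis state in a two-qubit register.

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # Hadamard gate

        def kron(*ops):
            out = np.array([[1.0]])
            for op in ops:
                out = np.kron(out, op)
            return out

        marked = 3                                          # search target |11>
        oracle = np.eye(4)
        oracle[marked, marked] = -1                         # phase-flip the target
        diffuser = 2 * np.full((4, 4), 0.25) - np.eye(4)    # inversion about the mean

        state = np.zeros(4)
        state[0] = 1                                        # start in |00>
        state = kron(H, H) @ state                          # uniform superposition
        state = diffuser @ (oracle @ state)                 # one Grover iteration
        print(np.abs(state) ** 2)                           # ~[0, 0, 0, 1]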

  1. Heart CT scan

    MedlinePlus

    CAT scan - heart; Computed axial tomography scan - heart; Computed tomography scan - heart; Calcium scoring; Multi-detector CT scan - heart; Electron beam computed tomography - heart; Agatston score; Coronary calcium scan

  2. My Computer Romance

    ERIC Educational Resources Information Center

    Campbell, Gardner

    2007-01-01

    In this article, the author relates the big role of computers in his life as a writer. The author narrates that he has been using a computer for nearly twenty years now. He relates that computers has set his writing free. When he started writing, he was just using an electric typewriter. He also relates that his romance with computers is also a…

  3. Elementary School Computer Literacy.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents lessons for computer literacy instruction in the elementary grades. The first section of the guide includes 22 lessons on hardware, covering such topics as how computers work, keyboarding, word processing, and computer peripherals. The 13 lessons in the second section cover social topics related to the computer,…

  4. Overview 1993: Computational applications

    NASA Technical Reports Server (NTRS)

    Benek, John A.

    1993-01-01

    Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.

  5. Computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An overview of computational fluid dynamics (CFD) activities at the Langley Research Center is given. The role of supercomputers in CFD research, algorithm development, multigrid approaches to computational fluid flows, aerodynamics computer programs, computational grid generation, turbulence research, and studies of rarefied gas flows are among the topics that are briefly surveyed.

  6. Computer Viruses. Technology Update.

    ERIC Educational Resources Information Center

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  7. The Glass Computer

    ERIC Educational Resources Information Center

    Paesler, M. A.

    2009-01-01

    Digital computers use different kinds of memory, each of which is either volatile or nonvolatile. On most computers only the hard drive memory is nonvolatile, i.e., it retains all information stored on it when the power is off. When a computer is turned on, an operating system stored on the hard drive is loaded into the computer's memory cache and…

  8. Computers and the Schools.

    ERIC Educational Resources Information Center

    Brown, Edmund G., Jr.

    1982-01-01

    The Governor of California discusses the role/importance of computer science education and proposed steps to support the cause of computer-aided education in California. Proposals include establishing computer demonstration centers, providing stipends for teachers studying computer-aided instruction, and funding of summer institutes and exemplary…

  9. How Computer Graphics Work.

    ERIC Educational Resources Information Center

    Prosise, Jeff

    This document presents the principles behind modern computer graphics without straying into the arcane languages of mathematics and computer science. Illustrations accompany the clear, step-by-step explanations that describe how computers draw pictures. The 22 chapters of the book are organized into 5 sections. "Part 1: Computer Graphics in…

  10. Computers and Employment.

    ERIC Educational Resources Information Center

    McConnell, Sheila; And Others

    1996-01-01

    Includes "Role of Computers in Reshaping the Work Force" (McConnell); "Semiconductors" (Moris); "Computer Manufacturing" (Warnke); "Commercial Banking Transformed by Computer Technology" (Morisi); "Software, Engineering Industries: Threatened by Technological Change?" (Goodman); "Job Creation and the Emerging Home Computer Market" (Freeman); and…

  11. Computer Literacy for Teachers.

    ERIC Educational Resources Information Center

    Sarapin, Marvin I.; Post, Paul E.

    Basic concepts of computer literacy are discussed as they relate to industrial arts/technology education. Computer hardware development is briefly examined, and major software categories are defined, including database management, computer graphics, spreadsheet programs, telecommunications and networking, word processing, and computer assisted and…

  12. Computer Innovations in Education.

    ERIC Educational Resources Information Center

    Molnar, Andrew R.

    Computers in education are put in context by a brief review of current social and technological trends, a short history of the development of computers and the vast expansion of their use, and a brief description of computers and their use. Further chapters describe instructional applications, administrative uses, uses of computers for libraries…

  13. Parallel computing works

    SciTech Connect

    Not Available

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  14. Matrix computations in MACSYMA

    NASA Technical Reports Server (NTRS)

    Wang, P. S.

    1977-01-01

    Facilities built into MACSYMA for manipulating matrices with numeric or symbolic entries are described. Computations will be done exactly, keeping symbols as symbols. Topics discussed include how to form a matrix and create other matrices by transforming existing matrices within MACSYMA; arithmetic and other computation with matrices; and user control of computational processes through the use of optional variables. Two algorithms designed for sparse matrices are given. The computing times of several different ways to compute the determinant of a matrix are compared.
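
    MACSYMA syntax is not reproduced in the abstract, but the same kind of exact symbolic matrix work, including timing several determinant algorithms on one matrix while keeping symbols as symbols, can be sketched in SymPy (an illustration under that assumption, not the article's examples):

        import time
        import sympy as sp

        a, b, c, d = sp.symbols('a b c d')
        M = sp.Matrix([[a, b, 1, 0],
                       [b, c, 0, 1],
                       [1, 0, d, a],
                       [0, 1, a, c]])

        for method in ("bareiss", "berkowitz", "lu"):
            t0 = time.perf_counter()
            M.det(method=method)                  # exact symbolic determinant
            print(method, "%.4f s" % (time.perf_counter() - t0))
        print(sp.factor(M.det()))                 # symbols are kept as symbols throughout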

  15. (Computer vision and robotics)

    SciTech Connect

    Jones, J.P.

    1989-02-13

    The traveler attended the Fourth Aalborg International Symposium on Computer Vision at Aalborg University, Aalborg, Denmark. The traveler presented three invited lectures entitled, Concurrent Computer Vision on a Hypercube Multicomputer'', The Butterfly Accumulator and its Application in Concurrent Computer Vision on Hypercube Multicomputers'', and Concurrency in Mobile Robotics at ORNL'', and a ten-minute editorial entitled, It Concurrency an Issue in Computer Vision.'' The traveler obtained information on current R D efforts elsewhere in concurrent computer vision.

  16. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    SciTech Connect

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  17. Future Computer Requirements for Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  18. Computers and Computation. Readings from Scientific American.

    ERIC Educational Resources Information Center

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is much easier…

  19. Multidetector computed tomography angiography for assessment of in-stent restenosis: meta-analysis of diagnostic performance

    PubMed Central

    Vanhoenacker, Piet K; Decramer, Isabel; Bladt, Olivier; Sarno, Giovanna; Van Hul, Erik; Wijns, William; Dwamena, Ben A

    2008-01-01

    Background Multi-detector computed tomography angiography (MDCTA) of the coronary arteries after stenting has been evaluated in multiple studies. The purpose of this study was to perform a structured review and meta-analysis of the diagnostic performance of MDCTA for the detection of in-stent restenosis in the coronary arteries. Methods A Pubmed and manual search of the literature on in-stent restenosis (ISR) detected on MDCTA compared with conventional coronary angiography (CA) was performed. Bivariate summary receiver operating curve (SROC) analysis, with calculation of summary estimates, was done on a stent and patient basis. In addition, the influence of study characteristics on diagnostic performance and the number of non-assessable segments (NAP) was investigated with logistic meta-regression. Results Fourteen studies were included. On a stent basis, pooled sensitivity and specificity were 0.82 (0.72–0.89) and 0.91 (0.83–0.96). Pooled negative and positive likelihood ratios were 0.20 (0.13–0.32) and 9.34 (4.68–18.62), respectively. The exclusion of non-assessable stents and the strut thickness of the stents had an influence on the diagnostic performance. The proportion of non-assessable stents was influenced by the number of detectors, stent diameter, strut thickness and the use of an edge-enhancing kernel. Conclusion The sensitivity of MDCTA for the detection of in-stent stenosis is insufficient to use this test to select patients for further invasive testing, as with this strategy around 20% of the patients with in-stent stenosis would be missed. Further improvement of scanner technology is needed before it can be recommended as a triage instrument in practice. In addition, the number of non-assessable stents is also high. PMID:18671850
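
    As a quick sanity check on the pooled figures quoted above, the positive and negative likelihood ratios follow approximately from the pooled sensitivity and specificity (the bivariate SROC model pools them directly, hence the small difference for LR+); the arithmetic below is illustrative only.

        sensitivity, specificity = 0.82, 0.91
        lr_positive = sensitivity / (1 - specificity)   # ~9.1 (reported pooled value: 9.34)
        lr_negative = (1 - sensitivity) / specificity   # ~0.20 (reported pooled value: 0.20)
        print(round(lr_positive, 2), round(lr_negative, 2))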

  20. Feasibility of tissue characterization of coronary plaques using 320-detector row computed tomography: comparison with integrated backscatter intravascular ultrasound.

    PubMed

    Takahashi, Shigekiyo; Kawasaki, Masanori; Miyata, Shusaku; Suzuki, Keita; Yamaura, Makoto; Ido, Takahisa; Aoyama, Takuma; Fujiwara, Hisayoshi; Minatoguchi, Shinya

    2016-01-01

    Recently, a new generation of multi-detector row computed tomography (CT) with 320 detector rows (DR) has become available in clinical settings. The purpose of the present study was to determine the cutoff values of Hounsfield units (HU) for discrimination of plaque components by comparing the HU of coronary plaques with integrated backscatter intravascular ultrasound (IB-IVUS) serving as a gold standard. Seventy-seven coronary atherosclerotic lesions in 77 patients with angina were visualized by both 320-DR CT (Aquilion One, Toshiba, Japan) and IB-IVUS at the same site. To determine the thresholds for discrimination of plaque components, we compared HU with IB values as the gold standard. Optimal thresholds were determined from receiver operating characteristic (ROC) curve analysis. The HU values of lipid pool (n = 115), fibrosis (n = 93), vessel lumen and calcification (n = 73) were 28 ± 19 HU (range -18 to 69 HU), 98 ± 31 HU (44 to 195 HU), 357 ± 65 HU (227 to 534 HU) and 998 ± 236 HU (366 to 1,489 HU), respectively. Thresholds of 56 HU, 210 HU and 490 HU were the most reliable cutoffs for discriminating among lipid pool, fibrosis, vessel lumen and calcification. Lipid volume measured by 320-DR CT was correlated with that measured by IB-IVUS (r = 0.63, p < 0.05), whereas fibrous volume measured by 320-DR CT was not, because manual exclusion of tissue outside the vessel hindered rigorous discrimination between fibrosis and extravascular components. PMID:25217036
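
    Hypothetically, the reported cutoffs could be applied to label voxels inside a segmented plaque as follows; the threshold values come from the abstract, while the function itself is only an assumed illustration of their use.

        import numpy as np

        def classify_plaque(hu):
            """Map Hounsfield-unit values to tissue classes using the 56/210/490 HU cutoffs."""
            labels = np.empty(hu.shape, dtype=object)
            labels[hu < 56] = "lipid pool"
            labels[(hu >= 56) & (hu < 210)] = "fibrosis"
            labels[(hu >= 210) & (hu < 490)] = "lumen"
            labels[hu >= 490] = "calcification"
            return labels

        print(classify_plaque(np.array([20, 150, 350, 900])))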

  1. Heterotic computing: exploiting hybrid computational devices.

    PubMed

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. PMID:26078351

  2. Understanding student computational thinking with computational modeling

    NASA Astrophysics Data System (ADS)

    Aiken, John M.; Caballero, Marcos D.; Douglas, Scott S.; Burk, John B.; Scanlon, Erin M.; Thoms, Brian D.; Schatz, Michael F.

    2013-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Ninth-grade students taking a physics course that employed Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, a written essay, and a series of think-aloud interviews in which the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than observational) terms tended to have more success in the programming exercise.
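
    A minimal sketch of the kind of model the students built, written here in plain Python with numpy rather than VPython and with air drag neglected, steps a baseball forward using the causal force-motion relationship; the initial conditions and time step are arbitrary assumptions.

        import numpy as np

        g = np.array([0.0, -9.8])           # gravitational field, m/s^2
        pos = np.array([0.0, 1.0])          # initial position, m
        vel = np.array([30.0, 10.0])        # initial velocity, m/s
        dt = 0.01                           # time step, s

        while pos[1] > 0:                   # until the ball returns to the ground
            vel = vel + g * dt              # update velocity from the net force
            pos = pos + vel * dt            # update position from the velocity
        print("range ~ %.1f m" % pos[0])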

  3. Heterogeneous Distributed Computing for Computational Aerosciences

    NASA Technical Reports Server (NTRS)

    Sunderam, Vaidy S.

    1998-01-01

    The research supported under this award focuses on heterogeneous distributed computing for high-performance applications, with particular emphasis on computational aerosciences. The overall goal of this project was to investigate issues in, and develop solutions to, the efficient execution of computational aeroscience codes in heterogeneous concurrent computing environments. In particular, we worked in the context of the PVM [1] system and, subsequent to detailed conversion efforts and performance benchmarking, devised novel techniques to increase the efficacy of heterogeneous networked environments for computational aerosciences. Our work has been based upon the NAS Parallel Benchmark suite, but has also recently expanded in scope to include the NAS I/O benchmarks as specified in the NHT-1 document. In this report we summarize our research accomplishments under the auspices of the grant.

  4. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on the maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of the hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future. PMID:23599560

  5. Computer ray tracing speeds.

    PubMed

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark, which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast as or faster than mainframe computers in compute-bound situations. PMID:20563112

  6. Polymorphous computing fabric

    DOEpatents

    Wolinski, Christophe Czeslaw; Gokhale, Maya B.; McCabe, Kevin Peter

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  7. Computer Buyer's Guide.

    ERIC Educational Resources Information Center

    Castaldi, John, Comp.

    1988-01-01

    A directory of several hundred computer companies, divided into sections on hardware, software, and computer-related services and materials. Entries include company name, address, phone number, principals, contact person, product descriptions, and services. (DMM)

  8. ICASE Computer Science Program

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  9. Computer Crime and Insurance.

    ERIC Educational Resources Information Center

    Beaudoin, Ralph H.

    1985-01-01

    The susceptibility of colleges and universities to computer crime is great. While insurance coverage is available to cover the risks, an aggressive loss-prevention program is the wisest approach to limiting the exposures presented by computer technology. (MLW)

  10. Computers in Manufacturing.

    ERIC Educational Resources Information Center

    Hudson, C. A.

    1982-01-01

    Advances in factory computerization (computer-aided design and computer-aided manufacturing) are reviewed, including discussions of robotics, human factors engineering, and the sociological impact of automation. (JN)

  11. Computers and Technological Forecasting

    ERIC Educational Resources Information Center

    Martino, Joseph P.

    1971-01-01

    Forecasting is becoming increasingly automated, thanks in large measure to the computer. It is now possible for a forecaster to submit his data to a computation center and call for the appropriate program. (No knowledge of statistics is required.) (Author)

  12. Computer Technology Directory.

    ERIC Educational Resources Information Center

    Exceptional Parent, 1990

    1990-01-01

    This directory lists approximately 300 commercial vendors that offer computer hardware, software, and communication aids for children with disabilities. The company listings indicate computer compatibility and specific disabilities served by their products. (JDD)

  13. Computer Intrusions and Attacks.

    ERIC Educational Resources Information Center

    Falk, Howard

    1999-01-01

    Examines some frequently encountered unsolicited computer intrusions, including computer viruses, worms, Java applications, trojan horses or vandals, e-mail spamming, hoaxes, and cookies. Also discusses virus-protection software, both for networks and for individual users. (LRW)

  14. Novel Applications of Computers

    ERIC Educational Resources Information Center

    Levi, Barbara G.

    1970-01-01

    Presents some novel applications of the computer to physics research. They include (1) a computer program for calculating Compton scattering, (2) speech simulation, (3) data analysis in spectrometry, and (4) measurement of complex alpha-particle spectrum. Bibliography. (LC)

  15. Cognitive Computing for Security.

    SciTech Connect

    Debenedictis, Erik; Rothganger, Fredrick; Aimone, James Bradley; Marinella, Matthew; Evans, Brian Robert; Warrender, Christina E.; Mickel, Patrick

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  16. Computational Toxicology (S)

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. Th...

  17. Bicycle Computers in Kinematics.

    ERIC Educational Resources Information Center

    Rich, Nathan H.

    1989-01-01

    Describes the mechanism of bicycle computers functioning as speedometers and timers. Discusses why the computers do not display the continuously changing readings and show the correct values at higher speeds. (YP)
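For readers unfamiliar with the mechanism: the computer counts reed-switch pulses from a wheel-mounted magnet and converts the interval between pulses into speed using the wheel circumference. A minimal sketch of that calculation in Python (the 2.1 m circumference and the pulse timestamps are assumed values, not figures from the article):

```python
# Illustrative only: how a bicycle computer turns pulse intervals into speed.
# The wheel circumference and pulse timestamps below are assumed values.
WHEEL_CIRCUMFERENCE_M = 2.1  # metres travelled per wheel revolution

def speed_kmh(pulse_interval_s: float) -> float:
    """Speed implied by one wheel revolution taking pulse_interval_s seconds."""
    return WHEEL_CIRCUMFERENCE_M / pulse_interval_s * 3.6  # m/s -> km/h

# Simulated reed-switch pulse timestamps (seconds); intervals shrink as the rider accelerates.
pulse_times = [0.00, 0.60, 1.15, 1.65, 2.10]
for earlier, later in zip(pulse_times, pulse_times[1:]):
    interval = later - earlier
    print(f"interval {interval:.2f} s -> {speed_kmh(interval):.1f} km/h")
```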

  18. Algorithmically specialized parallel computers

    SciTech Connect

    Snyder, L.; Jamieson, L.H.; Gannon, D.B.; Siegel, H.J.

    1985-01-01

    This book is based on a workshop which dealt with array processors. Topics considered include algorithmic specialization using VLSI, innovative architectures, signal processing, speech recognition, image processing, specialized architectures for numerical computations, and general-purpose computers.

  19. Human Computers 1947

    NASA Technical Reports Server (NTRS)

    1947-01-01

Langley's human computers at work in 1947. These women performed mathematical computations for Langley's male staff. Photograph published in Winds of Change, 75th Anniversary NASA publication (page 48), by James Schultz.

  20. Expanding Computer Service with Personal Computers.

    ERIC Educational Resources Information Center

    Bomzer, Herbert

    1983-01-01

    A planning technique, the mission justification document, and the evaluation procedures developed at Central Michigan University to ensure the orderly growth of computer-dependent resources within the constraints of tight budgets are described. (Author/MLW)

  1. Debugging embedded computer programs. [tactical missile computers

    NASA Technical Reports Server (NTRS)

    Kemp, G. H.

    1980-01-01

    Every embedded computer program must complete its debugging cycle using some system that will allow real time debugging. Many of the common items addressed during debugging are listed. Seven approaches to debugging are analyzed to evaluate how well they treat those items. Cost evaluations are also included in the comparison. The results indicate that the best collection of capabilities to cover the common items present in the debugging task occurs in the approach where a minicomputer handles the environment simulation with an emulation of some kind representing the embedded computer. This approach can be taken at a reasonable cost. The case study chosen is an embedded computer in a tactical missile. Several choices of computer for the environment simulation are discussed as well as different approaches to the embedded emulator.

  2. Nanoelectronics: Metrology and Computation

    SciTech Connect

    Lundstrom, Mark; Clark, Jason V.; Klimeck, Gerhard; Raman, Arvind

    2007-09-26

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (www.nanoHUB.org) as a specific example.

  3. Introduction to Quantum Computation

    NASA Astrophysics Data System (ADS)

    Ekert, A.

    A computation is a physical process. It may be performed by a piece of electronics or on an abacus, or in your brain, but it is a process that takes place in nature and as such it is subject to the laws of physics. Quantum computers are machines that rely on characteristically quantum phenomena, such as quantum interference and quantum entanglement in order to perform computation. In this series of lectures I want to elaborate on the computational power of such machines.

  4. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  5. Space Spurred Computer Graphics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  6. BNL ATLAS Grid Computing

    ScienceCinema

    Michael Ernst

    2010-01-08

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  7. Personal Computer Networks.

    ERIC Educational Resources Information Center

    Barkley, John

    This report develops a model of a personal computer network for office use from the standpoint of the end user. A network designed for personal computers is differentiated from personal computers which must be attached to an existing communications system. Three types of the latter networks are discussed: (1) networks which connect personal…

  8. Getting To Know Computers.

    ERIC Educational Resources Information Center

    Lundgren, Mary Beth

    Originally written for adult new readers involved in literacy programs, this book is also helpful to those individuals who want a basic book about computers. It uses the carefully controlled vocabulary with which adult new readers are familiar. Chapter 1 addresses the widespread use of computers. Chapter 2 discusses what a computer is and…

  9. Computer Center: CIBE Systems.

    ERIC Educational Resources Information Center

    Crovello, Theodore J.

    1982-01-01

Differentiates between computer systems and Computers in Biological Education (CIBE) systems (computer systems intended for use in biological education). Describes several CIBE stand-alone systems: single-user microcomputer; single-user microcomputer/video-disc; multiuser microcomputers; multiuser maxicomputer; and local and long distance computer…

  10. The fifth generation computer

    SciTech Connect

    Moto-Oka, T.; Kitsuregawa, M.

    1985-01-01

    The leader of Japan's Fifth Generation computer project, known as the 'Apollo' project, and a young computer scientist elucidate in this book the process of how the idea came about, international reactions, the basic technology, prospects for realization, and the abilities of the Fifth Generation computer. Topics considered included forecasting, research programs, planning, and technology impacts.

  11. Dietary Interviewing by Computer.

    ERIC Educational Resources Information Center

    Slack, Warner V.; And Others

    1976-01-01

    A computer based dietary interviewing program enhanced self awareness for overweight participants. In a three part interview designed for direct interaction between patient and computer, questions dealt with general dietary behavior and details of food intake. The computer assisted the patient in planning a weight reducing diet of approximately…

  12. Computed Tomography (CT) - Spine

    MedlinePlus

Computed tomography (CT) of the spine is a diagnostic imaging ... What is CT Scanning of the Spine? Computed tomography, more commonly known as a CT or CAT ...

  13. Personal Computer Communications.

    ERIC Educational Resources Information Center

    Leclerc, Gerry

    The interconnection between personal computers and other personal, mini, or mainframe computer systems is discussed. The following topics relevant to college personnel are addressed: hardware techniques for tying computers together, advantages and disadvantages of available software, the prospects for sophisticated micro/mainframe links with major…

  14. Computer Augmented Lectures

    ERIC Educational Resources Information Center

    Seitz, W. A.; Matsen, F. A.

    1974-01-01

    Discusses the use of a central computer linked to a CRT console, with display projected onto a large screen, to operate computer augmentation of lectures in large group instruction. Indicates that both introductory tutorial and computer modes are feasible in subject matter presentation. (CC)

  15. Computers in Engineering Teaching.

    ERIC Educational Resources Information Center

    Rushby, N. J.

    This bibliography cites 26 books, papers, and reports dealing with various uses of computers in engineering education; and describes several computer programs available for use in teaching aeronautical, chemical, civil, electrical and electronic, mechanical, and nuclear engineering. Each computer program entry is presented by name, author,…

  16. Computer Speeds Registration Process

    ERIC Educational Resources Information Center

    American School and University, 1977

    1977-01-01

    Academic/administrative data processing is centralized in one computer facility at East Tennessee State University. Students check computer printouts listing course availability, fill out one schedule card, present it to an operator at a computer terminal, and are able to register for classes in less than a minute. (Author/MLF)

  17. Optimizing Computer Technology Integration

    ERIC Educational Resources Information Center

    Dillon-Marable, Elizabeth; Valentine, Thomas

    2006-01-01

    The purpose of this study was to better understand what optimal computer technology integration looks like in adult basic skills education (ABSE). One question guided the research: How is computer technology integration best conceptualized and measured? The study used the Delphi method to map the construct of computer technology integration and…

  18. Computing environment logbook

    DOEpatents

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
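To make the described behavior concrete, here is a minimal sketch in Python of a logbook that records events, searches the history, and undoes selected past events. The class layout and the undo-callback design are assumptions for illustration, not the patented implementation:

```python
# Hypothetical sketch of an event logbook with search and undo (not the patent's design).
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Event:
    description: str
    undo: Callable[[], None]          # action that reverses this event
    undone: bool = False

@dataclass
class Logbook:
    history: List[Event] = field(default_factory=list)

    def log(self, description: str, undo: Callable[[], None]) -> None:
        self.history.append(Event(description, undo))

    def search(self, term: str) -> List[Event]:
        return [e for e in self.history if term.lower() in e.description.lower()]

    def undo_events(self, events: List[Event]) -> None:
        for e in events:
            if not e.undone:
                e.undo()
                e.undone = True

# Usage: log a file creation, find it in the history, and undo it.
import os, tempfile
book = Logbook()
path = os.path.join(tempfile.gettempdir(), "logbook_demo.txt")
open(path, "w").close()
book.log(f"created {path}", undo=lambda: os.remove(path))
book.undo_events(book.search("created"))
print(os.path.exists(path))  # False
```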

  19. Computer Education: Getting Started.

    ERIC Educational Resources Information Center

    Davies, Anne

    Designed to assist teachers who are not familiar with computer applications in elementary education, this guide provides information on personal skill development, computer terminology, software organization and evaluation, and troubleshooting. A tentative set of computer education objectives is outlined, and examples and strategies for effective…

  20. Personal Computers on Campus.

    ERIC Educational Resources Information Center

    Waldrop, M. Mitchell

    1985-01-01

    Examines issues involving the use of on-line databases, magnetic and optical data storage, digital telecommunications, and microcomputers on college campuses. These issues include access to computers and computer networking, and educational uses of the computers. Examples of efforts at four universities are included. (JN)

  1. The Computer Delusion.

    ERIC Educational Resources Information Center

    Oppenheimer, Todd

    1997-01-01

    Challenges research and prevailing attitudes that maintain that computers improve teaching and academic achievement. Criticizes and questions research methodology, computer literacy education, the need for computer skills to make a competitive workforce, support from the business community resulting from technology programs, and Internet use. (LRW)

  2. Computer Series, 87.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1987-01-01

    Included are two articles related to the use of computers. One activity is a computer exercise in chemical reaction engineering and applied kinetics for undergraduate college students. The second article shows how computer-assisted analysis can be used with reaction rate data. (RH)

  3. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1990-01-01

Four applications of microcomputers in the chemical laboratory are presented. Included are "Mass Spectrometer Interface with an Apple II Computer," "Interfacing the Spectronic 20 to a Computer," "A pH-Monitoring and Control System for Teaching Laboratories," and "A Computer-Aided Optical Melting Point Device." Software, instrumentation, and uses are…

  4. Quantum walk computation

    SciTech Connect

    Kendon, Viv

    2014-12-04

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer.

  5. Quantum computation: Honesty test

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2013-11-01

    Alice does not have a quantum computer so she delegates a computation to Bob, who does own one. But how can Alice check whether the computation that Bob performs for her is correct? An experiment with photonic qubits demonstrates such a verification protocol.

  6. Computer Applications for Children.

    ERIC Educational Resources Information Center

    Dulsky, Dwight; And Others

    1993-01-01

    Four articles discuss computer-assisted instruction, including (1) a middle school art and computer departments project that used LOGO to create rose window designs; (2) student journals; (3) the application of Piaget constructivism and Vygotskin social interaction to LOGO learning; and (4) computer lab writing workshops for elementary school…

  7. Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Ioannidou, Andri; Bennett, Vicki; Repenning, Alexander; Koh, Kyu Han; Basawapatna, Ashok

    2011-01-01

    The iDREAMS project aims to reinvent Computer Science education in K-12 schools, by using game design and computational science for motivating and educating students through an approach we call Scalable Game Design, starting at the middle school level. In this paper we discuss the use of Computational Thinking Patterns as the basis for our…

  8. Coping with Computing Success.

    ERIC Educational Resources Information Center

    Breslin, Richard D.

    Elements of computing success of Iona College, the challenges it currently faces, and the strategies conceived to cope with future computing needs are discussed. The college has mandated computer literacy for students and offers nine degrees in the computerized information system/management information system areas. Since planning is needed in…

  9. Writing, Thinking and Computers.

    ERIC Educational Resources Information Center

    Hartley, James

    1993-01-01

Reviews the potential of word processors for changing the ways in which students process written text and think about writing. Three levels of computer-aided writing are considered: simple word processors, computer-aided writing programs, and higher-level computer-aided processing, together with improvements in writing quality. (41 references) (LRW)

  10. Computers in Education.

    ERIC Educational Resources Information Center

    O'Neil, Carole; Pytlik, Mark

The Canadian Department of Education developed this manual to provide teachers and administrators with information about the potential use of computers. Part I describes at length the five components of the computer (input, output, storage, control, and arithmetic/logic functions) and gives a discussion of computer languages, programming, batch…

  11. Understanding Computer Terms.

    ERIC Educational Resources Information Center

    Lilly, Edward R.

    Designed to assist teachers and administrators approaching the subject of computers for the first time to acquire a feel for computer terminology, this document presents a computer term glossary on three levels. (1) The terms most frequently used, called a "basic vocabulary," are presented first in three paragraphs which explain their meanings:…

  12. Computer Aided Manufacturing.

    ERIC Educational Resources Information Center

    Insolia, Gerard

    This document contains course outlines in computer-aided manufacturing developed for a business-industry technology resource center for firms in eastern Pennsylvania by Northampton Community College. The four units of the course cover the following: (1) introduction to computer-assisted design (CAD)/computer-assisted manufacturing (CAM); (2) CAM…

  13. Education for Computers

    ERIC Educational Resources Information Center

    Heslep, Robert D.

    2012-01-01

    The computer engineers who refer to the education of computers do not have a definite idea of education and do not bother to justify the fuzzy ones to which they allude. Hence, they logically cannot specify the features a computer must have in order to be educable. This paper puts forth a non-standard, but not arbitrary, concept of education that…

  14. BNL ATLAS Grid Computing

    SciTech Connect

    Michael Ernst

    2008-10-02

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  15. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.

  16. Neural Computation and the Computational Theory of Cognition

    ERIC Educational Resources Information Center

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  17. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  18. Scalable optical quantum computer

    SciTech Connect

    Manykin, E A; Mel'nichenko, E V

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr{sup 3+}, regularly located in the lattice of the orthosilicate (Y{sub 2}SiO{sub 5}) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  19. Pediatric Computational Models

    NASA Astrophysics Data System (ADS)

    Soni, Bharat K.; Kim, Jong-Eun; Ito, Yasushi; Wagner, Christina D.; Yang, King-Hay

    A computational model is a computer program that attempts to simulate a behavior of a complex system by solving mathematical equations associated with principles and laws of physics. Computational models can be used to predict the body's response to injury-producing conditions that cannot be simulated experimentally or measured in surrogate/animal experiments. Computational modeling also provides means by which valid experimental animal and cadaveric data can be extrapolated to a living person. Widely used computational models for injury biomechanics include multibody dynamics and finite element (FE) models. Both multibody and FE methods have been used extensively to study adult impact biomechanics in the past couple of decades.

  20. An artificial muscle computer

    NASA Astrophysics Data System (ADS)

    Marc O'Brien, Benjamin; Alexander Anderson, Iain

    2013-03-01

We have built an artificial muscle computer based on Wolfram's "2, 3" Turing machine architecture, the simplest known universal Turing machine. Our computer uses artificial muscles for its instruction set, output buffers, and memory write and addressing mechanisms. The computer is very slow and large (0.15 Hz, ~1 m³); however, by using only 13 artificial muscle relays, it is capable of solving any computable problem given sufficient memory, time, and reliability. The development of this computer shows that artificial muscles can think—paving the way for soft robots with reflexes like those seen in nature.
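For context, a Turing machine of the kind referenced here reduces to a tape, a head position, a state register, and a transition table; the artificial muscle relays realize those elements physically. The sketch below is a generic simulator with a made-up two-state, three-symbol rule table; it is not Wolfram's actual "2, 3" rule set and nothing in it comes from the paper:

```python
# Generic Turing-machine simulator. The rule table below is illustrative only,
# NOT Wolfram's "2,3" machine (that specific table is not reproduced here).
from collections import defaultdict

# (state, symbol) -> (new_symbol, move, new_state); move is +1 (right) or -1 (left)
RULES = {
    ("A", 0): (1, +1, "A"),
    ("A", 1): (2, +1, "B"),
    ("A", 2): (0, -1, "B"),
    ("B", 0): (2, -1, "A"),
    ("B", 1): (1, -1, "B"),
    ("B", 2): (0, +1, "A"),
}

def run(steps: int) -> dict:
    tape = defaultdict(int)   # blank tape of 0s, unbounded in both directions
    head, state = 0, "A"
    for _ in range(steps):
        symbol = tape[head]
        new_symbol, move, state = RULES[(state, symbol)]
        tape[head] = new_symbol
        head += move
    return dict(tape)

print(run(20))   # nonzero cells written after 20 steps
```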

  1. Computer algebra and operators

    NASA Technical Reports Server (NTRS)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
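One of the capabilities listed, manipulating expressions in the algebra generated by operators, can be illustrated with a modern general-purpose computer algebra library. A small sketch using SymPy's non-commutative symbols (the library choice is mine, not the paper's):

```python
# Sketch: expanding an expression over non-commuting operator symbols with SymPy.
from sympy import symbols, expand

A, B = symbols("A B", commutative=False)

# Because A and B do not commute, the cross terms A*B and B*A stay distinct.
print(expand((A + B)**2))      # A**2 + A*B + B*A + B**2

# The commutator [A, B] = A*B - B*A does not simplify to zero.
print(expand(A*B - B*A))
```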

  2. ALMA correlator computer systems

    NASA Astrophysics Data System (ADS)

    Pisano, Jim; Amestica, Rodrigo; Perez, Jesus

    2004-09-01

We present a design for the computer systems which control, configure, and monitor the Atacama Large Millimeter Array (ALMA) correlator and process its output. Two distinct computer systems implement this functionality: a rack-mounted PC controls and monitors the correlator, and a cluster of 17 PCs process the correlator output into raw spectral results. The correlator computer systems interface to other ALMA computers via gigabit Ethernet networks utilizing CORBA and raw socket connections. ALMA Common Software provides the software infrastructure for this distributed computer environment. The control computer interfaces to the correlator via multiple CAN busses and the data processing computer cluster interfaces to the correlator via sixteen dedicated high speed data ports. An independent array-wide hardware timing bus connects to the computer systems and the correlator hardware ensuring synchronous behavior and imposing hard deadlines on the control and data processor computers. An aggregate correlator output of 1 gigabyte per second with 16 millisecond periods and computational data rates of approximately 1 billion floating point operations per second define other hard deadlines for the data processing computer cluster.
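The figures quoted above imply a per-period budget that is easy to check by hand. A back-of-the-envelope sketch over those stated numbers (no additional ALMA specifications are assumed):

```python
# Back-of-the-envelope check of the data-rate figures quoted for the correlator back end.
aggregate_rate_Bps = 1e9        # "1 gigabyte per second" aggregate correlator output
period_s = 16e-3                # "16 millisecond periods"
flops = 1e9                     # "~1 billion floating point operations per second"
data_ports = 16                 # "sixteen dedicated high speed data ports"

bytes_per_period = aggregate_rate_Bps * period_s
print(f"{bytes_per_period/1e6:.0f} MB must be absorbed every {period_s*1e3:.0f} ms")
print(f"~{aggregate_rate_Bps/data_ports/1e6:.1f} MB/s per data port on average")
print(f"~{flops*period_s/1e6:.0f} Mflop of processing budget per period")
```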

  3. Multidisciplinary computational aerosciences

    NASA Technical Reports Server (NTRS)

    Kutler, Paul

    1992-01-01

    As the challenges of single disciplinary computational physics are met, such as computational fluid dynamics, computational structural mechanics, computational propulsion, computational aeroacoustics, computational electromagnetics, etc., scientists have begun investigating the combination of these single disciplines into what is being called multidisciplinary computational aerosciences (MCAS). The combination of several disciplines not only offers simulation realism but also formidable computational challenges. The solution of such problems will require computers orders of magnitude larger than those currently available. Such computer power can only be supplied by massively parallel machines because of the current speed-of-light limitation of conventional serial systems. Even with such machines, MCAS problems will require hundreds of hours for their solution. To efficiently utilize such a machine, research is required in three areas that include parallel architectures, systems software, and applications software. The main emphasis of this paper is the applications software element. Examples that demonstrate application software for multidisciplinary problems currently being solved at NASA Ames Research Center are presented. Pacing items for MCAS are discussed such as solution methodology, physical modeling, computer power, and multidisciplinary validation experiments.

  4. Richard Feynman and computation

    NASA Astrophysics Data System (ADS)

    Hey, Tony

    1999-04-01

    The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.

  5. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  6. Computer-assisted psychotherapy

    PubMed Central

    Wright, Jesse H.; Wright, Andrew S.

    1997-01-01

    The rationale for using computers in psychotherapy includes the possibility that therapeutic software could improve the efficiency of treatment and provide access for greater numbers of patients. Computers have not been able to reliably duplicate the type of dialogue typically used in clinician-administered therapy. However, computers have significant strengths that can be used to advantage in designing treatment programs. Software developed for computer-assisted therapy generally has been well accepted by patients. Outcome studies have usually demonstrated treatment effectiveness for this form of therapy. Future development of computer tools may be influenced by changes in health care financing and rapid growth of new technologies. An integrated care delivery model incorporating the unique attributes of both clinicians and computers should be adopted for computer-assisted therapy. PMID:9292446

  7. Hyperswitch Communication Network Computer

    NASA Technical Reports Server (NTRS)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among the following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  8. Navier-Stokes Computations on Commodity Computers

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.; Faulkner, Thomas R.

    1998-01-01

    In this paper we discuss and demonstrate the feasibility of solving high-fidelity, nonlinear computational fluid dynamics (CFD) problems of practical interest on commodity machines, namely Pentium Pro PC's. Such calculations have now become possible due to the progress in computational power and memory of the off-the-shelf commodity computers, along with the growth in bandwidth and communication speeds of networks. A widely used CFD code known as TLNS3D, which was developed originally on large shared memory computers was selected for this effort. This code has recently been ported to massively parallel processor (MPP) type machines, where natural partitioning along grid blocks is adopted in which one or more blocks are distributed to each of the available processors. In this paper, a similar approach is adapted to port this code to a cluster of Pentium Pro computers. The message passing among the processors is accomplished through the use of standard message passing interface (MPI) libraries. Scaling studies indicate fairly high level of parallelism on such clusters of commodity machines, thus making solutions to Navier-Stokes equations for practical problems more affordable.
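The porting strategy described, assigning one or more grid blocks to each processor and exchanging data through standard MPI calls, can be sketched in a few lines. The sketch below uses the mpi4py binding and an assumed block count purely for illustration; it is not the TLNS3D source, which is Fortran-based:

```python
# Sketch (assumed setup, not the TLNS3D code): round-robin assignment of grid
# blocks to MPI ranks, as in the block-partitioned port described above.
# Run with, e.g.:  mpiexec -n 4 python blocks.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

n_blocks = 12                                    # assumed number of grid blocks
my_blocks = [b for b in range(n_blocks) if b % size == rank]
print(f"rank {rank}/{size} owns blocks {my_blocks}")

# Each rank reports how many blocks it holds; rank 0 gathers the totals.
counts = comm.gather(len(my_blocks), root=0)
if rank == 0:
    print("blocks per rank:", counts)
```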

  9. Computational Biology and High Performance Computing 2000

    SciTech Connect

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  10. Photonic Quantum Computing

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie

    2013-05-01

Quantum physics has revolutionized our understanding of information processing and enables computational speed-ups that are unattainable using classical computers. In this talk I will present a series of experiments in the field of photonic quantum computing. The first experiment is in the field of photonic state engineering and realizes the generation of heralded polarization-entangled photon pairs. It overcomes the limited applicability of photon-based schemes for quantum information processing tasks, which arises from the probabilistic nature of photon generation. The second experiment uses polarization-entangled photonic qubits to implement "blind quantum computing," a new concept in quantum computing. Blind quantum computing enables a nearly-classical client to access the resources of a more computationally-powerful quantum server without divulging the content of the requested computation. Finally, the concept of blind quantum computing is applied to the field of verification. A new method is developed and experimentally demonstrated, which verifies the entangling capabilities of a quantum computer based on a blind Bell test.

  11. Computers and neurosurgery.

    PubMed

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century. PMID:22985531

  12. Architecture Adaptive Computing Environment

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    2006-01-01

Architecture Adaptive Computing Environment (aCe) is a software system that includes a language, compiler, and run-time library for parallel computing. aCe was developed to enable programmers to write programs, more easily than was previously possible, for a variety of parallel computing architectures. Heretofore, it has been perceived to be difficult to write parallel programs for parallel computers and more difficult to port the programs to different parallel computing architectures. In contrast, aCe is supportable on all high-performance computing architectures. Currently, it is supported on LINUX clusters. aCe uses parallel programming constructs that facilitate writing of parallel programs. Such constructs were used in single-instruction/multiple-data (SIMD) programming languages of the 1980s, including Parallel Pascal, Parallel Forth, C*, *LISP, and MasPar MPL. In aCe, these constructs are extended and implemented for both SIMD and multiple-instruction/multiple-data (MIMD) architectures. Two new constructs incorporated in aCe are those of (1) scalar and virtual variables and (2) pre-computed paths. The scalar-and-virtual-variables construct increases flexibility in optimizing memory utilization in various architectures. The pre-computed-paths construct enables the compiler to pre-compute part of a communication operation once, rather than computing it every time the communication operation is performed.

  13. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT, J.

    2006-11-01

Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  14. Computational engine structural analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Johns, R. H.

    1986-01-01

A significant research activity at the NASA Lewis Research Center is the computational simulation of complex multidisciplinary engine structural problems. This simulation is performed using computational engine structural analysis (CESA) which consists of integrated multidisciplinary computer codes in conjunction with computer post-processing for problem-specific application. A variety of the computational simulations of specific cases are described in some detail in this paper. These case studies include: (1) aeroelastic behavior of bladed rotors, (2) high velocity impact of fan blades, (3) blade-loss transient response, (4) rotor/stator/squeeze-film/bearing interaction, (5) blade-fragment/rotor-burst containment, and (6) structural behavior of advanced swept turboprops. These representative case studies are selected to demonstrate the breadth of the problems analyzed and the role of the computer, including post-processing and graphical display of voluminous output data.

  15. Personal computers on campus.

    PubMed

    Waldrop, M M

    1985-04-26

Colleges and universities are becoming test beds for the much-heralded "information society" as they incorporate a new series of information technologies. These include on-line databases, magnetic and optical data storage, digital telecommunications, computer networks, and, most visibly and dramatically, personal computers. The transition is presenting administrators and faculty with major challenges, however. This article discusses some of the issues involved, including access to computers and to computer networking, managing the transition, and the educational uses of personal computers. A final section discusses efforts at Massachusetts Institute of Technology, Brown University, and Carnegie-Mellon University to shape a new-generation personal computer, the so-called "scholar's workstation." PMID:17746874

  16. Adiabatic topological quantum computing

    NASA Astrophysics Data System (ADS)

    Cesare, Chris; Landahl, Andrew J.; Bacon, Dave; Flammia, Steven T.; Neels, Alice

    2015-07-01

    Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev's surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.

  17. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1986-01-01

Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete Cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  18. Sensor sentinel computing device

    DOEpatents

    Damico, Joseph P.

    2016-08-02

    Technologies pertaining to authenticating data output by sensors in an industrial environment are described herein. A sensor sentinel computing device receives time-series data from a sensor by way of a wireline connection. The sensor sentinel computing device generates a validation signal that is a function of the time-series signal. The sensor sentinel computing device then transmits the validation signal to a programmable logic controller in the industrial environment.
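The record states only that the validation signal is a function of the time-series signal. One plausible choice of such a function, shown purely as an illustration (the patent does not specify this construction), is a keyed digest over the samples:

```python
# Illustrative only: a keyed digest as a "validation signal" over time-series samples.
# The patent does not specify this construction; HMAC-SHA256 is an assumed stand-in.
import hashlib
import hmac
import struct

SHARED_KEY = b"sentinel-demo-key"   # hypothetical key shared with the verifier

def validation_signal(samples: list) -> bytes:
    payload = b"".join(struct.pack("!d", float(s)) for s in samples)
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()

samples = [20.1, 20.3, 20.2, 20.6]            # assumed sensor readings
tag = validation_signal(samples)
print(tag.hex())

# The receiver (e.g., a PLC-side verifier) recomputes the digest and compares.
print(hmac.compare_digest(tag, validation_signal(samples)))   # True
```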

  19. Computer aided production engineering

    SciTech Connect

    Not Available

    1986-01-01

This book presents the following contents: CIM in avionics; computer analysis of product designs for robot assembly; a simulation decision model for manpower forecasting and its application; development of flexible manufacturing system; advances in microcomputer applications in CAD/CAM; an automated interface between CAD and process planning; CAM and computer vision; low friction pneumatic actuators for accurate robot control; robot assembly of printed circuit boards; information systems design for computer integrated manufacture; and a CAD engineering language to aid manufacture.

  20. Attitude computation system

    NASA Technical Reports Server (NTRS)

    Werking, R. D.

    1973-01-01

    An attitude computation facility for the control of unmanned satellite missions is reported. The system's major components include: the ability to transfer the attitude data from the control center to the attitude computer at a rate of 2400 bps; an attitude computation center which houses communications, closed circuit TV, graphics devices and a data evaluation area; and the use of interactive graphics devices to schedule jobs and to control program flow.

  1. Mobile computing for radiology.

    PubMed

    Auffermann, William F; Chetlen, Alison L; Sharma, Arjun; Colucci, Andrew T; DeQuesada, Ivan M; Grajo, Joseph R; Kung, Justin W; Loehfelm, Thomas W; Sherry, Steven J

    2013-12-01

    The rapid advances in mobile computing technology have the potential to change the way radiology and medicine as a whole are practiced. Several mobile computing advances have not yet found application to the practice of radiology, while others have already been applied to radiology but are not in widespread clinical use. This review addresses several areas where radiology and medicine in general may benefit from adoption of the latest mobile computing technologies and speculates on potential future applications. PMID:24200475

  2. Tracking and computing

    SciTech Connect

    Niederer, J.

    1983-01-01

    This note outlines several ways in which large scale simulation computing and programming support may be provided to the SSC design community. One aspect of the problem is getting supercomputer power without the high cost and long lead times of large scale institutional computing. Another aspect is the blending of modern programming practices with more conventional accelerator design programs in ways that do not also swamp designers with the details of complicated computer technology.

  3. New computing systems and their impact on computational mechanics

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1989-01-01

    Recent advances in computer technology that are likely to impact computational mechanics are reviewed. The technical needs for computational mechanics technology are outlined. The major features of new and projected computing systems, including supersystems, parallel processing machines, special-purpose computing hardware, and small systems are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed, and a novel partitioning strategy is outlined for maximizing the degree of parallelism on multiprocessor computers with a shared memory.

  4. Educational Computer Utilization and Computer Communications.

    ERIC Educational Resources Information Center

    Singh, Jai P.; Morgan, Robert P.

As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch processing were explored…

  5. Computational results for parallel unstructured mesh computations

    SciTech Connect

    Jones, M.T.; Plassmann, P.E.

    1994-12-31

    The majority of finite element models in structural engineering are composed of unstructured meshes. These unstructured meshes are often very large and require significant computational resources; hence they are excellent candidates for massively parallel computation. Parallel solution of the sparse matrices that arise from such meshes has been studied heavily, and many good algorithms have been developed. Unfortunately, many of the other aspects of parallel unstructured mesh computation have gone largely ignored. The authors present a set of algorithms that allow the entire unstructured mesh computation process to execute in parallel -- including adaptive mesh refinement, equation reordering, mesh partitioning, and sparse linear system solution. They briefly describe these algorithms and state results regarding their running-time and performance. They then give results from the 512-processor Intel DELTA for a large-scale structural analysis problem. These results demonstrate that the new algorithms are scalable and efficient. The algorithms are able to achieve up to 2.2 gigaflops for this unstructured mesh problem.

  6. Computing machinery and understanding.

    PubMed

    Ramscar, Michael

    2010-08-01

    How are natural symbol systems best understood? Traditional "symbolic" approaches seek to understand cognition by analogy to highly structured, prescriptive computer programs. Here, we describe some problems the traditional computational metaphor inevitably leads to, and a very different approach to computation (Ramscar, Yarlett, Dye, Denny, & Thorpe, 2010; Turing, 1950) that allows these problems to be avoided. The way we conceive of natural symbol systems depends to a large degree on the computational metaphors we use to understand them, and machine learning suggests an understanding of symbolic thought that is very different to traditional views (Hummel, 2010). The empirical question then is: Which metaphor is best? PMID:21564241

  7. Computationally efficient multibody simulations

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Jayant; Kumar, Manoj

    1994-01-01

    Computationally efficient approaches to the solution of the dynamics of multibody systems are presented in this work. The computational efficiency is derived from both the algorithmic and implementational standpoint. Order(n) approaches provide a new formulation of the equations of motion eliminating the assembly and numerical inversion of a system mass matrix as required by conventional algorithms. Computational efficiency is also gained in the implementation phase by the symbolic processing and parallel implementation of these equations. Comparison of this algorithm with existing multibody simulation programs illustrates the increased computational efficiency.

  8. Highly parallel computation

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.; Tichy, Walter F.

    1990-01-01

Among the highly parallel computing architectures required for advanced scientific computation, those designated 'MIMD' and 'SIMD' have yielded the best results to date. The present evaluation of the development status of such architectures shows that neither has attained a decisive advantage in the treatment of most near-homogeneous problems; for problems involving numerous dissimilar parts, however, currently speculative architectures such as 'neural networks' or 'data flow' machines may be required. Data flow computers are the most practical form of MIMD fine-grained parallel computers yet conceived; they automatically solve the problem of assigning virtual processors to the real processors in the machine.

  9. Computer applications in orthopaedics.

    PubMed

    Pho, R W; Lim, S Y; Pereira, B P

    1990-09-01

With the rapid developments in microprocessors, the widespread availability of computers has brought about broad applications in the field of orthopaedics. The present technology enables large quantities of data to be logically processed in a very short span of time. This has led to the development of information management database systems where relevant medical information may be retrieved very quickly and effectively. The analytical power of the computer has also been utilised in expert systems to assist in the clinical decision-making process. Computer graphics have revolutionised the visualisation of physical features of internal and external body parts, providing new and improved modalities of diagnosis. In some centres, surgical planning and rehearsals are already being carried out at the computer terminal with the use of animation and computer graphics. Computer technology has also played an active role in the field of prosthetics and rehabilitation. Intelligent robotic systems and microprocessors with functional neuromuscular stimulation have been applied to benefit the physically disabled and, in some cases, to restore some motor functions. With more collaboration between engineers, scientists and the medical community, several prototypes of computer-controlled prostheses, designed and manufactured by Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM) technology, are available today to assist amputees in their daily living and ambulatory activities. PMID:2260826

  10. Computational approaches to vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  11. Assessment of sub-milli-sievert abdominal computed tomography with iterative reconstruction techniques of different vendors

    PubMed Central

    Padole, Atul; Sainani, Nisha; Lira, Diego; Khawaja, Ranish Deedar Ali; Pourjabbar, Sarvenaz; Lo Gullo, Roberto; Otrakji, Alexi; Kalra, Mannudeep K

    2016-01-01

    AIM: To assess the diagnostic image quality of reduced dose (RD) abdominal computed tomography (CT) reconstructed with 9 iterative reconstruction techniques (IRTs) from 4 different vendors against standard of care (SD) CT. METHODS: In an Institutional Review Board approved study, 66 patients (mean age 60 ± 13 years, 44 men, and 22 women) undergoing routine abdomen CT on multi-detector CT (MDCT) scanners from vendors A, B, and C (≥ 64-row CT scanners) (22 patients each) gave written informed consent for acquisition of an additional RD CT series. Sinogram data of the RD CT were reconstructed with two vendor-specific IRTs and one vendor-neutral IRT (A-1, A-2, A-3; B-1, B-2, B-3; and C-1, C-2, C-3), and the SD CT series with filtered back projection. Subjective image evaluation was performed by two radiologists for each SD and RD CT series, blinded and independently. All RD CT series (198) were assessed first, followed by the SD CT series (66). Objective image noise was measured for SD and RD CT series. Data were analyzed by Wilcoxon signed rank, kappa, and analysis of variance tests. RESULTS: There were 13/50, 18/57 and 9/40 missed lesions (size 2-7 mm) on RD CT for vendors A, B, and C, respectively. Missed lesions included liver cysts, kidney cysts and stones, gallstones, fatty liver, and pancreatitis. There were also 5, 4, and 4 pseudo lesions (size 2-3 mm) on RD CT for vendors A, B, and C, respectively. Lesion conspicuity was sufficient for clinical diagnostic performance for 6/24 (RD-A-1), 10/24 (RD-A-2), and 7/24 (RD-A-3) lesions for vendor A; 5/26 (RD-B-1), 6/26 (RD-B-2), and 7/26 (RD-B-3) lesions for vendor B; and 4/20 (RD-C-1), 6/20 (RD-C-2), and 10/20 (RD-C-3) lesions for vendor C (P = 0.9). Mean objective image noise in the liver was significantly lower for RD A-1 compared to both RD A-2 and RD A-3 images (P < 0.001). Similarly, mean objective image noise was lower for RD B-2 (compared to RD B-1 and RD B-3) and RD C-3 (compared to RD C-1 and C-2) (P = 0.016). CONCLUSION: Regardless of IRTs and MDCT vendors

  12. Computer Confrontation: Suppes and Albrecht

    ERIC Educational Resources Information Center

    Suppes, Patrick; Albrecht, Bob

    1973-01-01

    Two well-known computer specialists argue about the function of computers in schools. Patrick Suppes believes mastery of basic skills is the prime function of computers. Bob Albrecht believes computers should be learning devices and not drill masters. (DS)

  13. Introduction to Computer Programming Languages.

    ERIC Educational Resources Information Center

    Bork, Alfred M.

    1971-01-01

    A brief introduction to computer programing explains the basic grammar of computer language as well as fundamental computer techniques. What constitutes a computer program is made clear, then three simple kinds of statements basic to the computational computer are defined: assignment statements, input-output statements, and branching statements. A…

  14. Quantum Analog Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1998-01-01

    Quantum analog computing is based upon the similarity between the mathematical formalism of quantum mechanics and the phenomena to be computed. It exploits the dynamical convergence of several competing phenomena to an attractor which can represent an extremum of a function, an image, a solution to a system of ODEs, or a stochastic process.
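
    The attractor idea has a simple conventional illustration: a dynamical system whose trajectories converge to a fixed point encoding the extremum of a function. The sketch below is a classical, assumption-laden stand-in (a plain gradient flow, not the quantum formalism of the report); the example function and step sizes are made up.

      import numpy as np

      def gradient_flow(grad, x0, dt=0.01, steps=5000):
          # Integrate dx/dt = -grad(x); the fixed point reached is a local minimum,
          # i.e. the attractor of the flow encodes an extremum of the function.
          x = np.asarray(x0, dtype=float)
          for _ in range(steps):
              x = x - dt * grad(x)
          return x

      # Illustrative function f(x, y) = (x - 1)^2 + (y + 2)^2 with minimum at (1, -2).
      grad_f = lambda p: np.array([2.0 * (p[0] - 1.0), 2.0 * (p[1] + 2.0)])
      print(gradient_flow(grad_f, [5.0, 5.0]))   # converges to approximately [1, -2]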

  15. Computations in Plasma Physics.

    ERIC Educational Resources Information Center

    Cohen, Bruce I.; Killeen, John

    1983-01-01

    Discusses contributions of computers to research in magnetic and inertial-confinement fusion, charged-particle-beam propogation, and space sciences. Considers use in design/control of laboratory and spacecraft experiments and in data acquisition; and reviews major plasma computational methods and some of the important physics problems they…

  16. Computer Processed Evaluation.

    ERIC Educational Resources Information Center

    Griswold, George H.; Kapp, George H.

    A student testing system was developed consisting of computer generated and scored equivalent but unique repeatable tests based on performance objectives for undergraduate chemistry classes. The evaluation part of the computer system, made up of four separate programs written in FORTRAN IV, generates tests containing varying numbers of multiple…

  17. Computer Series, 25.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1982-01-01

    Nine computer programs (available from the authors) are described including graphic display of molecular structures from crystallographic data, computer assisted instruction (CAI) with MATH subroutine, CAI preparation-for-chemistry course, calculation of statistical thermodynamic properties, qualitative analysis program, automated conductimetric…

  18. Profiling Computing Coordinators.

    ERIC Educational Resources Information Center

    Edwards, Sigrid; Morton, Allan

    The people responsible for managing school computing resources in Australia have become known as Computing Coordinators. To date there has been no large systematic study of the role, responsibilities and characteristics of this position. This paper represents a first attempt to provide information on the functions and attributes of the Computing…

  19. UltraScale Computing

    NASA Astrophysics Data System (ADS)

    Maynard, Jr.

    1997-08-01

    The Defense Advanced Research Projects Agency Information Technology Office (DARPA/ITO) supports research in technology for defense-critical applications. Defense Applications are always insatiable consumers of computing. Futuristic applications such as automated image interpretation/whole vehicle radar-cross-section/real-time prototyping/faster-than-real-time simulation will require computing capabilities orders-of-magnitude beyond the best performance that can be projected from contemporary scalable parallel processors. To reach beyond the silicon digital paradigm, DARPA has initiated a program in UltraScale Computing to explore the domain of innovative computational models, methods, and mechanisms. The objective is to encourage a complete re-thinking of computing. Novel architectures, program synthesis, and execution environments are needed as well as alternative underlying physical mechanisms including molecular, biological, optical and quantum mechanical processes. Development of these advanced computing technologies will offer spectacular performance and cost improvements beyond the threshold of traditional materials and processes. The talk will focus on novel approaches for employing vastly more computational units than shrinking transistors will enable and exploration of the biological options for solving computationally difficult problems.

  20. Computer Anxiety and Instruction.

    ERIC Educational Resources Information Center

    Baumgarte, Roger

    While the computer is commonly viewed as a tool for simplifying and enriching lives, many individuals react to this technology with feelings of anxiety, paranoia, and alienation. These reactions may have potentially serious career and educational consequences. Fear of computers reflects a generalized fear of current technology and is most…

  1. Computer Guided Instructional Design.

    ERIC Educational Resources Information Center

    Merrill, M. David; Wood, Larry E.

    1984-01-01

    Describes preliminary efforts to create the Lesson Design System, a computer-guided instructional design system written in Pascal for Apple microcomputers. Its content outline, strategy, display, and online lesson editors correspond roughly to instructional design phases of content and strategy analysis, display creation, and computer programing…

  2. MLJ Computer Corner.

    ERIC Educational Resources Information Center

    Brink, Dan

    1986-01-01

    Discusses the question of whether one must know how to program in order to make the most effective use of the computer in language classes. Looks at four computer languages which may be of interest to language teachers who want to learn programing: BASIC, assembler, high-level languages, and authoring systems. (SED)

  3. Computers Across the Curriculum.

    ERIC Educational Resources Information Center

    Wresch, William; Hieser, Rex

    1984-01-01

    Describes a program in which two-year college faculty were trained to program computers, the materials the faculty developed, and student reactions to these programs and to the computer assisted instructional strategy used. Indicates that faculty can learn to program and that orientation is needed to reduce student anxiety toward…

  4. Nature, computation and complexity

    NASA Astrophysics Data System (ADS)

    Binder, P.-M.; Ellis, G. F. R.

    2016-06-01

    The issue of whether the unfolding of events in the world can be considered a computation is explored in this paper. We come to different conclusions for inert and for living systems (‘no’ and ‘qualified yes’, respectively). We suggest that physical computation as we know it exists only as a tool of complex biological systems: us.

  5. Computer controlled antenna system

    NASA Technical Reports Server (NTRS)

    Raumann, N. A.

    1972-01-01

    The application of small computers using digital techniques for operating the servo and control system of large antennas is discussed. The advantages of the system are described. The techniques were evaluated with a forty foot antenna and the Sigma V computer. Programs have been completed which drive the antenna directly without the need for a servo amplifier, antenna position programmer or a scan generator.

  6. Computer Series, 78.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1986-01-01

    Presents six brief articles dealing with the use of computers in teaching various topics in chemistry. Describes hardware and software applications which relate to protein graphics, computer simulated metabolism, interfaces between microcomputers and measurement devices, courseware available for spectrophotometers, and the calculation of elemental…

  7. Overview of computer vision

    SciTech Connect

    Gevarter, W.B.

    1982-09-01

    An overview of computer vision is provided. Image understanding and scene analysis are emphasized, and pertinent aspects of pattern recognition are treated. The basic approach to computer vision systems, the techniques utilized, applications, the current existing systems and state-of-the-art issues and research requirements, who is doing it and who is funding it, and future trends and expectations are reviewed.

  8. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer systems Co-op, Tim Weatherford, performing computer graphics verification. Part of Co-op brochure.

  9. Computer Series, 97.

    ERIC Educational Resources Information Center

    Kay, Jack G.; And Others

    1988-01-01

    Describes two applications of the microcomputer for laboratory exercises. Explores radioactive decay using the Bateman equations on a Macintosh computer. Provides examples and screen dumps of data. Investigates polymer configurations using a Monte Carlo simulation on an IBM personal computer. (MVL)
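
    For readers unfamiliar with the Bateman equations mentioned above, the closed-form solution for a two-member decay chain A -> B -> (stable) is straightforward to evaluate. The sketch below uses made-up decay constants purely for illustration and is not taken from the article's Macintosh program.

      import numpy as np

      def bateman_two_step(nA0, lamA, lamB, t):
          # Number of B atoms at time t for the chain A -> B -> stable,
          # starting from nA0 atoms of A and no B (classic Bateman solution).
          return nA0 * lamA / (lamB - lamA) * (np.exp(-lamA * t) - np.exp(-lamB * t))

      t = np.linspace(0.0, 50.0, 6)               # arbitrary time grid
      print(bateman_two_step(1e6, 0.2, 0.05, t))  # B population grows, then decays away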

  10. Computer grading of examinations

    NASA Technical Reports Server (NTRS)

    Frigerio, N. A.

    1969-01-01

    A method, using IBM cards and computer processing, automates examination grading and recording and permits use of computational problems. The student generates his own answers, and the instructor has much greater freedom in writing questions than is possible with multiple choice examinations.

  11. Computers on Wheels.

    ERIC Educational Resources Information Center

    Rosemead Elementary School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: How does a school provide the computer learning experiences for students given the paucity of available funding for hardware, software, and staffing? Here is what one school, Emma W. Shuey in Rosemead, did after exploratory research on computers by a committee of teachers and administrators. The…

  12. Computer Aided Art Major.

    ERIC Educational Resources Information Center

    Gibson, Jim

    The Computer Aided Art program offered at Northern State University (Aberdeen, South Dakota) is coordinated with the traditional art major. The program is designed to familiarize students with a wide range of art-related computer hardware and software and their applications and to prepare students for problem-solving with unfamiliar…

  13. Computer Crimes in Schools.

    ERIC Educational Resources Information Center

    Telem, Moshe

    1984-01-01

    Analyzes the occurrence of computer crimes in schools, focusing on the main types of crimes possible, potential criminals in schools, and how the organizational characteristics of schools invite computer crimes. Means to counter this problem and minimize it as far as possible are suggested. (MBR)

  14. Teaching Using Computer Games

    ERIC Educational Resources Information Center

    Miller, Lee Dee; Shell, Duane; Khandaker, Nobel; Soh, Leen-Kiat

    2011-01-01

    Computer games have long been used for teaching. Current reviews lack categorization and analysis using learning models which would help instructors assess the usefulness of computer games. We divide the use of games into two classes: game playing and game development. We discuss the Input-Process-Outcome (IPO) model for the learning process when…

  15. Computers, Networks and Education.

    ERIC Educational Resources Information Center

    Kay, Alan C.

    1991-01-01

    Discussed is how globally networked, easy-to-use computers can enhance learning only within an educational environment that encourages students to question "facts" and seek challenges. The strengths and weaknesses of computers used as amplifiers for learning are described. (KR)

  16. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  17. Computer Virus Protection

    ERIC Educational Resources Information Center

    Rajala, Judith B.

    2004-01-01

    A computer virus is a program--a piece of executable code--that has the unique ability to replicate. Like biological viruses, computer viruses can spread quickly and are often difficult to eradicate. They can attach themselves to just about any type of file, and are spread by replicating and being sent from one individual to another. Simply having…

  18. Notebook Computers Increase Communication.

    ERIC Educational Resources Information Center

    Carey, Doris M.; Sale, Paul

    1994-01-01

    Project FIT (Full Inclusion through Technology) provides notebook computers for children with severe disabilities. The computers offer many input and output options. Assessing the students' equipment needs is a complex process, requiring determination of communication goals and baseline abilities, and consideration of equipment features such as…

  19. African Studies Computer Resources.

    ERIC Educational Resources Information Center

    Kuntz, Patricia S.

    African studies computer resources that are readily available in the United States with linkages to Africa are described, highlighting those most directly corresponding to African content. Africanists can use the following four fundamental computer systems: (1) Internet/Bitnet; (2) Fidonet; (3) Usenet; and (4) dial-up bulletin board services. The…

  20. Computational Modeling of Tires

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Tanner, John A. (Compiler)

    1995-01-01

    This document contains presentations and discussions from the joint UVA/NASA Workshop on Computational Modeling of Tires. The workshop attendees represented NASA, the Army and Air force, tire companies, commercial software developers, and academia. The workshop objectives were to assess the state of technology in the computational modeling of tires and to provide guidelines for future research.

  1. K-12 Computer Networking.

    ERIC Educational Resources Information Center

    ERIC Review, 1993

    1993-01-01

    The "ERIC Review" is published three times a year and announces research results, publications, and new programs relevant to each issue's theme topic. This issue explores computer networking in elementary and secondary schools via two principal articles: "Plugging into the 'Net'" (Michael B. Eisenberg and Donald P. Ely); and "Computer Networks for…

  2. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell, Ed.

    1988-01-01

    Describes three situations in which computer software was used in a chemistry laboratory. Discusses interfacing voltage output instruments with Apple II computers and using spreadsheet programs to simulate gas chromatography and analysis of kinetic data. Includes information concerning procedures, hardware, and software used in each situation. (CW)

  3. Computer Programmer/Analyst.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 25 subjects appropriate for use in a competency list for the occupation of computer programmer/analyst, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 25 units are as…

  4. Computer Support Technician.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 18 subjects appropriate for use in a competency list for the occupation of computer support technician, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 18 units are as…

  5. Teaching Computer Applications.

    ERIC Educational Resources Information Center

    Lundgren, Carol A.; And Others

    This document, which is designed to provide classroom teachers at all levels with practical ideas for a computer applications course, examines curricular considerations, teaching strategies, delivery techniques, and assessment methods applicable to a course focusing on applications of computers in business. The guide is divided into three…

  6. Connecting Kids and Computers

    ERIC Educational Resources Information Center

    Giles, Rebecca McMahon

    2006-01-01

    Exposure to cell phones, DVD players, video games, computers, digital cameras, and iPods has made today's young people more technologically advanced than those of any previous generation. As a result, parents are now concerned that their children are spending too much time in front of the computer. In this article, the author focuses her…

  7. Computer Processor Allocator

    Energy Science and Technology Software Center (ESTSC)

    2004-03-01

    The Compute Processor Allocator (CPA) provides an efficient and reliable mechanism for managing and allotting processors in a massively parallel (MP) computer. It maintains information in a database on the health, configuration, and allocation of each processor. This persistent information is factored into each allocation decision. The CPA runs in a distributed fashion to avoid a single point of failure.
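
    As a rough illustration of the idea only (hypothetical names and interface; the actual CPA is a distributed system, not this toy), an allocator can keep a per-processor record of health and current assignment and factor it into each allocation decision:

      from typing import Dict, List, Optional

      class ProcessorRecord:
          def __init__(self) -> None:
              self.healthy: bool = True                 # health status, factored into decisions
              self.allocated_to: Optional[str] = None   # job id, or None if free

      class ToyAllocator:
          def __init__(self, n: int) -> None:
              self.nodes: Dict[int, ProcessorRecord] = {i: ProcessorRecord() for i in range(n)}

          def allocate(self, job_id: str, count: int) -> List[int]:
              # Choose healthy, unallocated processors and record the assignment.
              free = [i for i, r in self.nodes.items()
                      if r.healthy and r.allocated_to is None]
              if len(free) < count:
                  raise RuntimeError("not enough healthy free processors")
              chosen = free[:count]
              for i in chosen:
                  self.nodes[i].allocated_to = job_id
              return chosen

          def release(self, job_id: str) -> None:
              for r in self.nodes.values():
                  if r.allocated_to == job_id:
                      r.allocated_to = None

      alloc = ToyAllocator(8)
      print(alloc.allocate("job-42", 4))   # e.g. [0, 1, 2, 3]
      alloc.release("job-42")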

  8. Preventing Computer Glitches

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2009-01-01

    It has been said that a computer lets a person make more mistakes faster than any other invention in human history, with the possible exceptions of handguns and tequila. Computers also make mistakes on their own, whether they're glitches, conflicts, bugs, crashes, or failures. Avoiding glitches is considerably less frustrating than trying to fix…

  9. Chippy's Computer Words.

    ERIC Educational Resources Information Center

    Willing, Kathlene R.; Girard, Suzanne

    Intended for young children just becoming familiar with computers, this naming book introduces and reinforces new computer vocabulary and concepts. The 20 words are presented alphabetically, along with illustrations, providing room for different activities in which children can match and name the pictures and words. The 20 vocabulary items are…

  10. Administration of Computer Resources.

    ERIC Educational Resources Information Center

    Franklin, Gene F.

    Computing at Stanford University has, until recently, been performed at one of five facilities. The Stanford hospital operates an IBM 370/135 mainly for administrative use. The university business office has an IBM 370/145 for its administrative needs and support of the medical clinic. Under the supervision of the Stanford Computation Center are…

  11. Computer Yearbook 72.

    ERIC Educational Resources Information Center

    1972

    Recent and expected developments in the computer industry are discussed in this 628-page yearbook, successor to "The Punched Card Annual." The first section of the report is an overview of current computer hardware and software and includes articles about future applications of mainframes, an analysis of the software industry, and a summary of the…

  12. Advances in Computational Astrophysics

    SciTech Connect

    Calder, Alan C.; Kouzes, Richard T.

    2009-03-01

    I was invited to be the guest editor for a special issue of Computing in Science and Engineering along with a colleague from Stony Brook. This is the guest editors' introduction to a special issue of Computing in Science and Engineering. Alan and I have written this introduction and have been the editors for the 4 papers to be published in this special edition.

  13. Computer Technology for Industry.

    ERIC Educational Resources Information Center

    Aviation/Space, 1982

    1982-01-01

    A special National Aeronautics and Space Administration (NASA) service is contributing to national productivity by providing industry with reusable, low-cost, government-developed computer programs. Located at the University of Georgia, NASA's Computer Software Management and Information Center (COSMIC) has developed programs for equipment…

  14. Computer Networking for Educators.

    ERIC Educational Resources Information Center

    McCain, Ted D. E.; Ekelund, Mark

    This book is intended to introduce the basic concepts of connecting computers together and to equip individuals with the technical background necessary to begin constructing small networks. For those already experienced with creating and maintaining computer networks, the book can help in considering the creation of a schoolwide network. The book…

  15. ELECTRONIC DIGITAL COMPUTER

    DOEpatents

    Stone, J.J. Jr.; Bettis, E.S.; Mann, E.R.

    1957-10-01

    The electronic digital computer is designed to solve systems involving a plurality of simultaneous linear equations. The computer can solve a system which converges rather rapidly when using Von Seidel's method of approximation and performs the summations required for solving for the unknown terms by a method of successive approximations.
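
    The "method of successive approximations" referred to here is essentially what is now called Gauss-Seidel iteration (after Ludwig von Seidel). A minimal software sketch of that sweep, for illustration only (the patent describes special-purpose hardware, not this code), looks like:

      import numpy as np

      def seidel_solve(A, b, tol=1e-10, max_iter=1000):
          # Solve A x = b by repeated sweeps; each updated component is reused
          # immediately within the same sweep. Converges rapidly when A is
          # diagonally dominant, as the abstract notes.
          A, b = np.asarray(A, float), np.asarray(b, float)
          x = np.zeros_like(b)
          for _ in range(max_iter):
              x_prev = x.copy()
              for i in range(len(b)):
                  s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
                  x[i] = (b[i] - s) / A[i, i]
              if np.max(np.abs(x - x_prev)) < tol:
                  break
          return x

      A = [[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]]
      b = [5.0, 6.0, 5.0]
      print(seidel_solve(A, b))   # agrees with np.linalg.solve(A, b)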

  16. Marketing via Computer Diskette.

    ERIC Educational Resources Information Center

    Thombs, Michael

    This report describes the development and evaluation of an interactive marketing diskette which describes the characteristics, advantages, and application procedures for each of the major computer-based graduate programs at Nova University. Copies of the diskettes were distributed at the 1988 Florida Instructional Computing Conference and were…

  17. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.
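
    As a hedged illustration of the final step (one plausible roughness metric, not necessarily the authors' algorithm), the irregularity of a tracked vessel-wall edge can be quantified as the RMS deviation from a locally smoothed version of that edge:

      import numpy as np

      def roughness_index(edge, window=11):
          # Relative roughness of a digitized vessel-wall edge: RMS deviation of the
          # tracked edge from a moving-average smoothing of itself, normalized by the
          # mean edge position. Illustrative metric only.
          edge = np.asarray(edge, float)
          kernel = np.ones(window) / window
          smooth = np.convolve(edge, kernel, mode="same")
          return float(np.sqrt(np.mean((edge - smooth) ** 2)) / np.mean(edge))

      # Synthetic edge: a nominal radius of 5 pixels with small irregularities added.
      s = np.linspace(0.0, 10.0, 200)
      edge = 5.0 + 0.2 * np.sin(15.0 * s) + 0.05 * np.random.randn(200)
      print(roughness_index(edge))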

  18. Computing in Research.

    ERIC Educational Resources Information Center

    Ashenhurst, Robert L.

    The introduction and diffusion of automatic computing facilities during the 1960's is reviewed; it is described as a time when research strategies in a broad variety of disciplines changed to take advantage of the newfound power provided by the computer. Several types of typical problems encountered by researchers who adopted the new technologies,…

  19. Who Owns Computer Software?

    ERIC Educational Resources Information Center

    Branscomb, Anne Wells

    1995-01-01

    Discusses the protection of intellectual property as it applies to computer software and its impact on private enterprise and the public good. Highlights include the role of patents, copyrights, and trade secrets; some court cases; and recommendations for alternatives to the existing legal framework for protecting computer software. (KRN)

  20. The Overdominance of Computers

    ERIC Educational Resources Information Center

    Monke, Lowell W.

    2006-01-01

    Most schools are unwilling to consider decreasing computer use at school because they fear that without screen time, students will not be prepared for the demands of a high-tech 21st century. Monke argues that having young children spend a significant amount of time on computers in school is harmful, particularly when children spend so much…

  1. Flexible Animation Computer Program

    NASA Technical Reports Server (NTRS)

    Stallcup, Scott S.

    1990-01-01

    FLEXAN (Flexible Animation), computer program animating structural dynamics on Evans and Sutherland PS300-series graphics workstation with VAX/VMS host computer. Typical application is animation of spacecraft undergoing structural stresses caused by thermal and vibrational effects. Displays distortions in shape of spacecraft. Program displays single natural mode of vibration, mode history, or any general deformation of flexible structure. Written in FORTRAN 77.

  2. Computers as Cognitive Tools.

    ERIC Educational Resources Information Center

    Lajoie, Susanne P., Ed.; Derry, Sharon J., Ed.

    This book provides exemplars of the types of computer-based learning environments represented by the theoretical camps within the field and the practical applications of the theories. The contributors discuss a variety of computer applications to learning, ranging from school-related topics such as geometry, algebra, biology, history, physics, and…

  3. Ubiquitous human computing.

    PubMed

    Zittrain, Jonathan

    2008-10-28

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring. PMID:18672463

  4. Videodisc-Computer Interfaces.

    ERIC Educational Resources Information Center

    Zollman, Dean

    1984-01-01

    Lists microcomputer-videodisc interfaces currently available from 26 sources, including home use systems connected through remote control jack and industrial/educational systems utilizing computer ports and new laser reflective and stylus technology. Information provided includes computer and videodisc type, language, authoring system, educational…

  5. Computer Communications and Learning.

    ERIC Educational Resources Information Center

    Bellman, Beryl L.

    1992-01-01

    Computer conferencing offers many opportunities for linking college students and faculty at a distance. From the Binational English and Spanish Telecommunications Network (BESTNET) has evolved a variety of bilingual video/computer/face-to-face instructional packages to serve institutions and nontraditional students on several continents. (MSE)

  6. The Economics of Computers.

    ERIC Educational Resources Information Center

    Sharpe, William F.

    A microeconomic theory is applied in this book to computer services and costs and for the benefit of those who are decision-makers in the selection, financing, and use of computers. Subtopics of the theory discussed include value and demand; revenue and profits; time and risk; and costs, inputs, and outputs. Application of the theory is explained…

  7. Computer Assisted Learning Feature.

    ERIC Educational Resources Information Center

    Davies, Peter; Minogue, Claire

    1988-01-01

    Discusses the goals of the Computer Working Party in Great Britain, presenting their assessment of current computer hardware and the market for economics software. Examines "Running the British Economy," a macroeconomic policy simulation that investigates the links between values and policy objectives and encourages questioning of economic models.…

  8. Computers, Culture, and Learning.

    ERIC Educational Resources Information Center

    Westby, Carol; Atencio, David J.

    2002-01-01

    Opinions vary regarding the impact of computers on children and value of using computers in the education of children. This article discusses the impact of technology on culture, attitudes about technological change, the nature of literacy in a technological society, and frameworks for thinking about teaching and learning with technology.…

  9. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  10. Statistical Mapping by Computer.

    ERIC Educational Resources Information Center

    Utano, Jack J.

    The function of a statistical map is to provide readers with a visual impression of the data so that they may be able to identify any geographic characteristics of the displayed phenomena. The increasingly important role played by the computer in the production of statistical maps is manifested by the varied examples of computer maps in recent…

  11. Computers and Personal Privacy.

    ERIC Educational Resources Information Center

    Ware, Willis H.

    Privacy is an issue that arises from the intersection of a demand for improved recordkeeping processes, and computing technology as the response to the demand. Recordkeeping in the United States centers on information about people. Modern day computing technology has the ability to maintain, store, and retrieve records quickly; however, this…

  12. Logic via Computer Programming.

    ERIC Educational Resources Information Center

    Wieschenberg, Agnes A.

    This paper proposed the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise and it may attract students who would otherwise stop…

  13. Decoding Technology: Computer Shortcuts

    ERIC Educational Resources Information Center

    Walker, Tim; Donohue, Chip

    2008-01-01

    For the typical early childhood administrator, there will never be enough hours in a day to finish the work that needs to be done. This includes numerous hours spent on a computer tracking enrollment, managing the budget, researching curriculum ideas online, and many other administrative tasks. Improving an administrator's computer efficiency can…

  14. Computer Series, 112.

    ERIC Educational Resources Information Center

    Birk, James P., Ed.

    1990-01-01

    Four microcomputer applications are presented including: "Computer Simulated Process of 'Lead Optimization': A Student-Interactive Program,""A PROLOG Program for the Generation of Molecular Formulas,""Determination of Inflection Points from Experimental Data," and "LAOCOON PC: NMR Simulation on a Personal Computer." Software, availability,…

  15. Hypercard Another Computer Tool.

    ERIC Educational Resources Information Center

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  16. COMPUTER MODELS/EPANET

    EPA Science Inventory

    Pipe network flow analysis was among the first civil engineering applications programmed for solution on the early commercial mainframe computers in the 1960s. Since that time, advancements in analytical techniques and computing power have enabled us to solve systems with tens o...

  17. Computer workstation speeds

    SciTech Connect

    Grcar, J.F.

    1996-06-01

    This report compares the performance of several computers. Some of the machines are discontinued, and some are anticipated, but most are currently installed at Sandia Laboratories. All the computers are personal workstations or departmental servers, except for comparison, one is a Cray C90 mainframe supercomputer (not owned by the Laboratories). A few of the computers have multiple processors, but parallelism is not tested. The time to run three programs is reported for every computer. Unlike many benchmarks, these are complete application programs. They were written and are used at Sandia Laboratories. Also SPECmarks are reported for many computers. These are industry standard performance ratings. They are in general agreement with the speeds of running the Sandia programs. This report concludes with some background material and notes about specific manufacturers.

  18. (Computer) Vision without Sight

    PubMed Central

    Manduchi, Roberto; Coughlan, James

    2012-01-01

    Computer vision holds great promise for helping persons with blindness or visual impairments (VI) to interpret and explore the visual world. To this end, it is worthwhile to assess the situation critically by understanding the actual needs of the VI population and which of these needs might be addressed by computer vision. This article reviews the types of assistive technology application areas that have already been developed for VI, and the possible roles that computer vision can play in facilitating these applications. We discuss how appropriate user interfaces are designed to translate the output of computer vision algorithms into information that the user can quickly and safely act upon, and how system-level characteristics affect the overall usability of an assistive technology. Finally, we conclude by highlighting a few novel and intriguing areas of application of computer vision to assistive technology. PMID:22815563

  19. Symmetry Effects in Computation

    NASA Astrophysics Data System (ADS)

    Yao, Andrew Chi-Chih

    2008-12-01

    The concept of symmetry has played a key role in the development of modern physics. For example, using symmetry, C.N. Yang and other physicists have greatly advanced our understanding of the fundamental laws of physics. Meanwhile, computer scientists have been pondering why some computational problems seem intractable, while others are easy. Just as in physics, the laws of computation sometimes can only be inferred indirectly by considerations of general principles such as symmetry. The symmetry properties of a function can indeed have a profound effect on how fast the function can be computed. In this talk, we present several elegant and surprising discoveries along this line, made by computer scientists using symmetry as their primary tool. Note from Publisher: This article contains the abstract only.

  20. Indirection and computer security.

    SciTech Connect

    Berg, Michael J.

    2011-09-01

    The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.

  1. (Computer) Vision without Sight.

    PubMed

    Manduchi, Roberto; Coughlan, James

    2012-01-01

    Computer vision holds great promise for helping persons with blindness or visual impairments (VI) to interpret and explore the visual world. To this end, it is worthwhile to assess the situation critically by understanding the actual needs of the VI population and which of these needs might be addressed by computer vision. This article reviews the types of assistive technology application areas that have already been developed for VI, and the possible roles that computer vision can play in facilitating these applications. We discuss how appropriate user interfaces are designed to translate the output of computer vision algorithms into information that the user can quickly and safely act upon, and how system-level characteristics affect the overall usability of an assistive technology. Finally, we conclude by highlighting a few novel and intriguing areas of application of computer vision to assistive technology. PMID:22815563

  2. Onward to Petaflops Computing

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    With programs such as the US High Performance Computing and Communications Program (HPCCP), the attention of scientists and engineers worldwide has been focused on the potential of very high performance scientific computing, namely systems that are hundreds or thousands of times more powerful than those typically available in desktop systems at any given point in time. Extending the frontiers of computing in this manner has resulted in remarkable advances, both in computing technology itself and also in the various scientific and engineering disciplines that utilize these systems. Within the next month or two, a sustained rate of 1 Tflop/s (also written 1 teraflops, or 10(exp 12) floating-point operations per second) is likely to be achieved by the 'ASCI Red' system at Sandia National Laboratory in New Mexico. With this objective in sight, it is reasonable to ask what lies ahead for high-end computing.

  3. Next-generation computers

    SciTech Connect

    Torrero, E.A.

    1985-01-01

    Developments related to tomorrow's computers are discussed, taking into account advances toward the fifth generation in Japan, the challenge to U.S. supercomputers, plans concerning the creation of supersmart computers for the U.S. military, a U.S. industry response to the Japanese challenge, a survey of U.S. and European research, Great Britain, the European Common Market, codifying human knowledge for machine reading, software engineering, the next-generation software, plans for obtaining the million-transistor chip, and fabrication issues for next-generation circuits. Other topics explored are related to a status report regarding artificial intelligence, an assessment of the technical challenges, aspects of sociotechnology, and defense advanced research projects. Attention is also given to expert systems, speech recognition, computer vision, function-level programming and automated programming, computing at the speed limit, VLSI, and superpower computers.

  4. Pen-based computers: Computers without keys

    NASA Technical Reports Server (NTRS)

    Conklin, Cheryl L.

    1994-01-01

    The National Space Transportation System (NSTS) comprises many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data are required. This paper describes how new technology, referred to as Pen-Based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-Based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end result is the elimination of Shuttle processing delays associated with data deficiencies.

  5. Computational electromagnetics and parallel dense matrix computations

    SciTech Connect

    Forsman, K.; Kettunen, L.; Gropp, W.; Levine, D.

    1995-06-01

    We present computational results using CORAL, a parallel, three-dimensional, nonlinear magnetostatic code based on a volume integral equation formulation. A key feature of CORAL is the ability to solve, in parallel, the large, dense systems of linear equations that are inherent in the use of integral equation methods. Using the Chameleon and PSLES libraries ensures portability and access to the latest linear algebra solution technology.

  6. Community Cloud Computing

    NASA Astrophysics Data System (ADS)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  7. Global computing for bioinformatics.

    PubMed

    Loewe, Laurence

    2002-12-01

    Global computing, the collaboration of idle PCs via the Internet in a SETI@home style, emerges as a new way of massive parallel multiprocessing with potentially enormous CPU power. Its relations to the broader, fast-moving field of Grid computing are discussed without attempting a review of the latter. This review (i) includes a short table of milestones in global computing history, (ii) lists opportunities global computing offers for bioinformatics, (iii) describes the structure of problems well suited for such an approach, (iv) analyses the anatomy of successful projects and (v) points to existing software frameworks. Finally, an evaluation of the various costs shows that global computing indeed has merit, if the problem to be solved is already coded appropriately and a suitable global computing framework can be found. Then, either significant amounts of computing power can be recruited from the general public, or--if employed in an enterprise-wide Intranet for security reasons--idle desktop PCs can substitute for an expensive dedicated cluster. PMID:12511066

  8. Coping with distributed computing

    SciTech Connect

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given.

  9. Computing with synthetic protocells.

    PubMed

    Courbet, Alexis; Molina, Franck; Amar, Patrick

    2015-09-01

    In this article we present a new kind of computing device that uses biochemical reaction networks as building blocks to implement logic gates. The architecture of a computing machine relies on these generic and composable building blocks, computation units, that can be used in multiple instances to perform complex boolean functions. Standard logical operations are implemented by biochemical networks, encapsulated and insulated within synthetic vesicles called protocells. These protocells are capable of exchanging energy and information with each other through transmembrane electron transfer. In the paradigm of computation we propose, protoputing, a machine can solve only one problem and therefore has to be built specifically. Thus, the programming phase in the standard computing paradigm is represented in our approach by the set of assembly instructions (specific attachments) that directs the wiring of the protocells that constitute the machine itself. To demonstrate the computing power of protocellular machines, we apply it to solve an NP-complete problem, known to be very demanding in computing power, the 3-SAT problem. We show how to program the assembly of a machine that can verify the satisfiability of a given boolean formula. Then we show how to use the massive parallelism of these machines to verify in less than 20 min all the valuations of the input variables and output a fluorescent signal when the formula is satisfiable or no signal at all otherwise. PMID:25969126
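
    The verification task described, checking all valuations of the input variables against a boolean formula, has a direct conventional counterpart. The sketch below is a sequential stand-in for the massively parallel protocell check; the example formula is made up for illustration.

      from itertools import product

      # A 3-SAT formula in CNF: literal k > 0 means variable k, k < 0 its negation.
      # This particular formula is hypothetical, chosen only for illustration.
      formula = [(1, -2, 3), (-1, 2, 3), (-1, -2, -3)]

      def satisfied(clauses, valuation):
          # True if every clause contains at least one literal made true by 'valuation'.
          return all(any((lit > 0) == valuation[abs(lit)] for lit in clause)
                     for clause in clauses)

      n_vars = max(abs(lit) for clause in formula for lit in clause)
      models = [v for v in product([False, True], repeat=n_vars)
                if satisfied(formula, {i + 1: v[i] for i in range(n_vars)})]
      print("satisfiable:", bool(models), "with", len(models), "satisfying valuations")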

  10. Computers as tools

    SciTech Connect

    Eriksson, I.V.

    1994-12-31

    The following message was recently posted on a bulletin board and clearly shows the relevance of the conference theme: "The computer and digital networks seem poised to change whole regions of human activity -- how we record knowledge, communicate, learn, work, understand ourselves and the world. What's the best framework for understanding this digitalization, or virtualization, of seemingly everything? ... Clearly, symbolic tools like the alphabet, book, and mechanical clock have changed some of our most fundamental notions -- self, identity, mind, nature, time, space. Can we say what the computer, a purely symbolic "machine," is doing to our thinking in these areas? Or is it too early to say, given how much more powerful and less expensive the technology seems destined to become in the next few decades?" (Verity, 1994) Computers certainly affect our lives and way of thinking, but what have computers to do with ethics? A narrow approach would be that on the one hand people can and do abuse computer systems and on the other hand people can be abused by them. Well-known examples of the former are computer crimes such as the theft of money, services and information. The latter can be exemplified by violation of privacy, health hazards and computer monitoring. Broadening the concept from computers to information systems (ISs) and information technology (IT) gives a wider perspective. Computers are just the hardware part of information systems, which also include software, people and data. Information technology is the concept preferred today. It extends to communication, which is an essential part of information processing. Now let us repeat the question: What has IT to do with ethics? Verity mentioned changes in "how we record knowledge, communicate, learn, work, understand ourselves and the world".

  11. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.

  12. Recognizing Computational Science

    NASA Astrophysics Data System (ADS)

    Bland-Hawthorn, J.

    2006-08-01

    There are prestigious international awards that recognize the role of theory and experiment in science and mathematics, but there are no awards of a similar stature that explicitly recognize the role of computational science in a scientific field. In 1945, John von Neumann noted that "many branches of both pure and applied mathematics are in great need of computing instruments to break the present stalemate created by the failure of the purely analytical approach to nonlinear problems." In the past few decades, great strides in mathematics and in the applied sciences can be linked to computational science.

  13. Computational Tractability - Beyond Turing?

    NASA Astrophysics Data System (ADS)

    Marcer, Peter; Rowlands, Peter

    A fundamental problem in the theory of computing concerns whether descriptions of systems at all times remain tractable, that is whether the complexity that inevitably results can be reduced to a polynomial form (P) or whether some problems lead to a non-polynomial (NP) exponential growth in complexity. Here, we propose that the universal computational rewrite system that can be shown to be responsible ultimately for the development of mathematics, physics, chemistry, biology and even human consciousness, is so structured that Nature will always be structured as P at any scale and so will be computationally tractable.

  14. Interactive computer graphics

    NASA Astrophysics Data System (ADS)

    Purser, K.

    1980-08-01

    Design layouts have traditionally been done on a drafting board by drawing a two-dimensional representation with section cuts and side views to describe the exact three-dimensional model. With the advent of computer graphics, a three-dimensional model can be created directly. The computer stores the exact three-dimensional model, which can be examined from any angle and at any scale. A brief overview of interactive computer graphics, how models are made and some of the benefits/limitations are described.

  15. Computer aided surface representation

    SciTech Connect

    Barnhill, R.E.

    1990-02-19

    The central research problem of this project is the effective representation, computation, and display of surfaces interpolating to information in three or more dimensions. If the given information is located on another surface, then the problem is to construct a "surface defined on a surface". Sometimes properties of an already defined surface are desired, which is "geometry processing". Visualization of multivariate surfaces is possible by means of contouring higher dimensional surfaces. These problems and more are discussed below. The broad sweep from constructive mathematics through computational algorithms to computer graphics illustrations is utilized in this research. The breadth and depth of this research activity makes this research project unique.
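
    A small sketch of the core task (building a surface that interpolates scattered data so it can then be contoured or displayed) using standard scientific Python tools; this is a generic illustration with synthetic data, not the project's own software:

      import numpy as np
      from scipy.interpolate import griddata

      # Scattered samples of an unknown surface z = f(x, y); synthetic data for illustration.
      rng = np.random.default_rng(0)
      pts = rng.uniform(-1.0, 1.0, size=(200, 2))
      vals = np.sin(3.0 * pts[:, 0]) * np.cos(3.0 * pts[:, 1])

      # Evaluate the interpolating surface on a regular grid, ready for contouring or display.
      gx, gy = np.meshgrid(np.linspace(-1, 1, 50), np.linspace(-1, 1, 50))
      surface = griddata(pts, vals, (gx, gy), method="cubic")
      print(surface.shape)   # (50, 50) grid of interpolated heights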

  16. NEMAR plotting computer program

    NASA Technical Reports Server (NTRS)

    Myler, T. R.

    1981-01-01

    A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.

  17. Human Computer Interaction

    NASA Astrophysics Data System (ADS)

    Bhagwani, Akhilesh; Sengar, Chitransh; Talwaniper, Jyotsna; Sharma, Shaan

    2012-08-01

    The paper basically deals with the study of HCI (Human-Computer Interaction) or BCI (Brain-Computer Interface) technology that can be used for capturing brain signals and translating them into commands that allow humans to control (just by thinking) devices such as computers, robots, rehabilitation technology and virtual reality environments. It is based on a direct communication pathway between the brain and an external device. BCIs are often aimed at assisting, augmenting, or repairing human cognitive or sensory-motor functions. The paper also deals with many advantages of BCI technology along with some of its applications and some major drawbacks.

  18. World's Most Powerful Computer

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The use of the Cray 2 supercomputer, the fastest computer in the world, at ARC is detailed. The Cray 2 can perform 250 million calculations per second and has 10 times the memory of any other computer. Ames researchers are shown creating computer simulations of aircraft airflow, waterflow around a submarine, and fuel flow inside of the Space Shuttle's engines. The video also details the Cray 2's use in calculating airflow around the Shuttle and its external rockets during liftoff for the first time and in the development of the National Aero Space Plane.

  19. Teaching Physics with Computers

    NASA Astrophysics Data System (ADS)

    Botet, R.; Trizac, E.

    2005-09-01

    Computers are now so common in our everyday life that it is difficult to imagine the computer-free scientific life of the years before the 1980s. And yet, in spite of an unquestionable rise, the use of computers in the realm of education is still in its infancy. This is not a problem with students: for the new generation, the pre-computer age seems as far in the past as the age of the dinosaurs. It may instead be more a question of teacher attitude. Traditional education is based on centuries of polished concepts and equations, while computers require us to think differently about our method of teaching, and to revise the content accordingly. Our brains do not work in terms of numbers, but use abstract and visual concepts; hence, communication between computer and man boomed when computers escaped the world of numbers to reach a visual interface. From this time on, computers have generated new knowledge and, more importantly for teaching, new ways to grasp concepts. Therefore, just as real experiments were the starting point for theory, virtual experiments can be used to understand theoretical concepts. But there are important differences. Some of them are fundamental: a virtual experiment may allow for the exploration of length and time scales together with a level of microscopic complexity not directly accessible to conventional experiments. Others are practical: numerical experiments are completely safe, unlike some dangerous but essential laboratory experiments, and are often less expensive. Finally, some numerical approaches are suited only to teaching, as the concept necessary for the physical problem, or its solution, lies beyond the scope of traditional methods. For all these reasons, computers open physics courses to novel concepts, bringing education and research closer. In addition, and this is not a minor point, they respond naturally to the basic pedagogical needs of interactivity, feedback, and individualization of instruction. This is why one can

  20. Computations underlying sensorimotor learning.

    PubMed

    Wolpert, Daniel M; Flanagan, J Randall

    2016-04-01

    The study of sensorimotor learning has a long history. With the advent of innovative techniques for studying learning at the behavioral and computational levels new insights have been gained in recent years into how the sensorimotor system acquires, retains, represents, retrieves and forgets sensorimotor tasks. In this review we highlight recent advances in the field of sensorimotor learning from a computational perspective. We focus on studies in which computational models are used to elucidate basic mechanisms underlying adaptation and skill acquisition in human behavior. PMID:26719992

  1. High Performance Computing Today

    SciTech Connect

    Dongarra, Jack; Meuer,Hans; Simon,Horst D.; Strohmaier,Erich

    2000-04-01

    In the last 50 years, the field of scientific computing has seen rapid change in vendors, architectures, technologies and the usage of systems. Despite all these changes, the evolution of performance on a large scale nevertheless appears to be a steady and continuous process. Moore's Law is often cited in this context. If the authors plot, in Figure 1, the peak performance of the various computers of the last five decades that could have been called the supercomputers of their time, they indeed see how well this law holds for almost the complete lifespan of modern computing. On average they see an increase in performance of two orders of magnitude every decade.
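    As a quick check on that figure, the quoted trend of two orders of magnitude per decade can be converted into an annual growth factor and an implied doubling time; the short sketch below is illustrative arithmetic only, not taken from the paper.

```python
# Convert "two orders of magnitude per decade" into an annual growth factor
# and a doubling time. Illustrative arithmetic only.
import math

growth_per_decade = 100.0
annual_factor = growth_per_decade ** (1.0 / 10.0)         # about 1.58x per year
doubling_years = math.log(2.0) / math.log(annual_factor)  # about 1.5 years
print(f"annual factor: {annual_factor:.2f}x, doubling time: {doubling_years:.2f} years")
```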

  2. Convergence: Computing and communications

    SciTech Connect

    Catlett, C.

    1996-12-31

    This paper highlights the operations of the National Center for Supercomputing Applications (NCSA). NCSA is developing and implementing a national strategy to create, use, and transfer advanced computing and communication tools and information technologies for science, engineering, education, and business. The primary focus of the presentation is historical and expected growth in the computing capacity, personal computer performance, and Internet and World Wide Web sites. Data are presented to show changes over the past 10 to 20 years in these areas. 5 figs., 4 tabs.

  3. Computational fluid dynamics simulation of airflow in the trachea and main bronchi for the subjects with left pulmonary artery sling

    PubMed Central

    2014-01-01

    Background Left pulmonary artery sling (LPAS) is a rare but severe congenital anomaly, in which stenoses form in the trachea and/or main bronchi. Multi-detector computed tomography (MDCT) provides useful anatomical images, but does not offer functional information. The objective of the present study is to quantitatively analyze the airflow in the trachea and main bronchi of LPAS subjects through computational fluid dynamics (CFD) simulation. Methods Five subjects (four LPAS patients, one normal control) aged 6-19 months are analyzed. The geometric model of the trachea and the two main bronchi is extracted from the MDCT images. The inlet velocity is determined based on the body weight and the inlet area. Both the geometric model and personalized inflow conditions are imported into the CFD software ANSYS. The pressure drop, mass flow ratio through the two bronchi, wall pressure, flow velocity and wall shear stress (WSS) are obtained and compared to the normal control. Results Due to the tracheal and/or bronchial stenosis, the pressure drop for the LPAS patients ranges from 78.9 to 914.5 Pa, much higher than for the normal control (0.7 Pa). The mass flow ratio through the two bronchi does not correlate with the sectional area ratio if the anomalous left pulmonary artery compresses the trachea or bronchi. It is suggested that the C-shaped trachea plays an important role in facilitating airflow into the left bronchus through inertia. For LPAS subjects, the distributions of velocities, wall pressure and WSS are less regular than for the normal control. At the stenotic site, high velocity, low wall pressure and high WSS are observed. Conclusions Using geometric models extracted from CT images and patient-specific inlet boundary conditions, CFD simulation can provide vital quantitative flow information for LPAS. Due to the stenosis, high pressure drops and inconsistent distributions of velocities, wall pressure and WSS are observed. The C-shaped trachea may
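    A minimal sketch of the kind of patient-specific inlet boundary condition described above is given below: the mean inlet velocity is obtained from a weight-scaled ventilation estimate divided by the inlet cross-sectional area. The weight-to-flow scaling and the numeric values are assumed placeholders, not the relation or data used in the study.

```python
# Hypothetical sketch: derive a patient-specific inlet velocity for the CFD
# model from body weight and inlet cross-sectional area, as described above.
# The minute-ventilation scaling is an assumed placeholder, not the paper's.

def inlet_velocity(body_weight_kg: float, inlet_area_mm2: float) -> float:
    """Mean inspiratory inlet velocity in m/s."""
    minute_ventilation_m3_s = body_weight_kg * 200e-6 / 60.0  # assume ~200 mL/kg/min
    inspiratory_flow_m3_s = 2.0 * minute_ventilation_m3_s     # inspiration ~ half the cycle
    inlet_area_m2 = inlet_area_mm2 * 1e-6
    return inspiratory_flow_m3_s / inlet_area_m2

# Example boundary condition handed to the CFD solver for a 9 kg infant.
print(f"{inlet_velocity(body_weight_kg=9.0, inlet_area_mm2=40.0):.2f} m/s")
```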

  4. Neural computation and the computational theory of cognition.

    PubMed

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. PMID:23126542

  5. Impact of Classroom Computer Use on Computer Anxiety.

    ERIC Educational Resources Information Center

    Lambert, Matthew E.; And Others

    Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…

  6. Computer surety: computer system inspection guidance. [Contains glossary

    SciTech Connect

    Not Available

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  7. Experiments in Biomolecular Computing

    NASA Astrophysics Data System (ADS)

    Kaplan, Peter; Cecchi, Guillermo; Libchaber, Albert

    1996-03-01

    We review our experiments on computing with DNA (L. M. Adleman, Science 266, 1021 (1994)), presenting findings about the technologies that will be required to make biomolecular computing useful. The advantages of using DNA for molecular computation include large information density (> 1 terabyte/mm^3), massive parallelism, and the existence of tools (enzymes) with which to manipulate the molecules. The major disadvantage is the slow cycle time (hours per biochemical step). The only demonstrated DNA computing algorithm involves first producing a pool of DNA containing all possible answers to a combinatorial question and then searching in that pool for the correct answer. We will discuss new work on the technical details of the polymerase chain reaction (PCR), a DNA amplifier, and on the construction of the pool of DNA from which the correct answer will be extracted.
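    The generate-and-filter strategy described above can be sketched in silico: enumerate a pool of candidate answers, then filter it for one that satisfies the constraints. The toy directed graph below is an invented example of the Hamiltonian-path question Adleman's experiment addressed, not data from the work reviewed.

```python
# Silicon analogue of the DNA generate-and-filter algorithm: build the pool of
# all candidate answers, then extract those that satisfy the constraints.
from itertools import permutations

nodes = ["A", "B", "C", "D"]
edges = {("A", "B"), ("B", "C"), ("C", "D"), ("A", "C"), ("B", "D")}

def hamiltonian_paths():
    for candidate in permutations(nodes):                        # the "pool" of answers
        if all(pair in edges for pair in zip(candidate, candidate[1:])):
            yield candidate                                      # the "extraction" step

print(next(hamiltonian_paths(), None))  # e.g. ('A', 'B', 'C', 'D')
```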

  8. Computers and Qualitative Research.

    ERIC Educational Resources Information Center

    Willis, Jerry; Jost, Muktha

    1999-01-01

    Discusses the use of computers in qualitative research, including sources of information; collaboration; electronic discussion groups; Web sites; Internet search engines; electronic sources of data; data collection; communicating research results; desktop publishing; hypermedia and multimedia documents; electronic publishing; holistic and…

  9. Quantum information and computation

    SciTech Connect

    Bennett, C.H.

    1995-10-01

    A new quantum theory of communication and computation is emerging, in which the stuff transmitted or processed is not classical information, but arbitrary superpositions of quantum states. © 1995 American Institute of Physics.

  10. Computers boost structural technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Venneri, Samuel L.

    1989-01-01

    Derived from matrix methods of structural analysis and finite element methods developed over the last three decades, computational structures technology (CST) blends computer science, numerical analysis, and approximation theory into structural analysis and synthesis. Recent significant advances in CST include stochastic-based modeling, strategies for performing large-scale structural calculations on new computing systems, and the integration of CST with other disciplinary modules for multidisciplinary analysis and design. New methodologies have been developed at NASA for integrated fluid-thermal structural analysis and integrated aerodynamic-structure-control design. The need for multiple views of data for different modules also led to the development of a number of sophisticated data-base management systems. For CST to play a role in the future development of structures technology and in the multidisciplinary design of future flight vehicles, major advances and computational tools are needed in a number of key areas.

  11. Cloud computing security.

    SciTech Connect

    Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.

    2010-10-01

    Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to address the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics, and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.

  12. Computer memory access technique

    NASA Technical Reports Server (NTRS)

    Zottarelli, L. J.

    1967-01-01

    Computer memory access commutator and steering gate configuration produces bipolar current pulses while still employing only the diodes and magnetic cores of the classic commutator, thereby appreciably reducing the complexity of the memory assembly.

  13. Computer-Assisted Testing.

    ERIC Educational Resources Information Center

    Edwards, John S.

    1980-01-01

    The most common functions of computer-assisted testing are item-banking, in which test items are collected and stored; test-construction, specifying item attributes and determining information required for identification of the test; and test scoring. (JN)

  14. Computational Aeroacoustics: An Overview

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    2003-01-01

    An overview of recent advances in computational aeroacoustics (CAA) is presented. CAA algorithms must be neither dispersive nor dissipative; they should propagate the waves supported by the Euler equations with the correct group velocities. Computation domains are inevitably finite in size. To avoid the reflection of acoustic and other outgoing waves at the boundaries of the computation domain, special boundary conditions must be imposed in the boundary region. These boundary conditions either absorb all the outgoing waves without reflection or allow the waves to exit smoothly. High-order schemes invariably support spurious short waves. These spurious waves tend to pollute the numerical solution. They must be selectively damped or filtered out. All these issues and relevant computation methods are briefly reviewed. Jet screech tones are known to have caused structural fatigue in military combat aircraft. Numerical simulation of the jet screech phenomenon is presented as an example of a successful application of CAA.
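    The dispersion requirement mentioned above can be illustrated with the modified wavenumber of standard central differences: where the modified wavenumber departs from the exact one, the scheme propagates waves with the wrong group velocity. The short sketch below is generic numerical analysis, not code from the overview.

```python
# Modified wavenumber k*h of 2nd- and 4th-order central differences versus the
# exact kh; the divergence at large kh is the dispersion error that dedicated
# CAA schemes are designed to suppress.
import numpy as np

kh = np.linspace(0.0, np.pi, 9)                     # up to two grid points per wavelength
k2 = np.sin(kh)                                     # 2nd-order central difference
k4 = (8.0 * np.sin(kh) - np.sin(2.0 * kh)) / 6.0    # 4th-order central difference

for x, a, b in zip(kh, k2, k4):
    print(f"kh={x:4.2f}  2nd-order={a:5.2f}  4th-order={b:5.2f}")
```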

  15. Ant-based computing.

    PubMed

    Michael, Loizos

    2009-01-01

    A biologically and physically plausible model for ants and pheromones is proposed. It is argued that the mechanisms described in this model are sufficiently powerful to reproduce the necessary components of universal computation. The claim is supported by illustrating the feasibility of designing arbitrary logic circuits, showing that the interactions of ants and pheromones lead to the expected behavior, and presenting computer simulation results to verify the circuits' working. The conclusions of this study can be taken as evidence that coherent deterministic and centralized computation can emerge from the collective behavior of simple distributed Markovian processes such as those followed by biological ants, but also, more generally, by artificial agents with limited computational and communication abilities. PMID:19239348

  16. Computer Lens Design Program

    NASA Astrophysics Data System (ADS)

    Shiue, S. G.; Chang, M. W.

    1986-02-01

    An interactive computer lens design program has been developed. It has capabilities for editing lens data, optimizing zoom lenses, evaluating image quality, and so on. A Tessar lens and an IR zoom telescope designed using this program are discussed.

  17. Goldstone (GDSCC) administrative computing

    NASA Technical Reports Server (NTRS)

    Martin, H.

    1981-01-01

    The GDSCC Data Processing Unit provides various administrative computing services for Goldstone. Those activities, including finance, manpower and station utilization, deep-space station scheduling and engineering change order (ECO) control are discussed.

  18. Computer security engineering management

    SciTech Connect

    McDonald, G.W.

    1988-01-01

    For best results, computer security should be engineered into a system during its development rather than appended later on. This paper addresses the implementation of computer security in eight stages through the life cycle of the system, starting with the definition of security policies and ending with continuing support for the security aspects of the system throughout its operational life cycle. Security policy is addressed relative to successive decomposition of security objectives (through policy, standard, and control stages) into system security requirements. This is followed by a discussion of computer security organization and responsibilities. Next the paper turns to analysis and management of security-related risks, followed by discussion of design and development of the system itself. Discussion of security test and evaluation preparations, and approval to operate (certification and accreditation), is followed by discussion of computer security training for users and by coverage of life cycle support for the security of the system.

  19. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Larrabee, C. E., Jr.; And Others

    1988-01-01

    Describes the use of computer spreadsheet programs in physical chemistry classrooms. Stresses the application of bar graphs to fundamental quantum and statistical mechanics. Lists the advantages of the use of spreadsheets and gives examples of possible uses. (ML)

  20. Computers and Early Learning.

    ERIC Educational Resources Information Center

    Banet, Bernard

    1978-01-01

    This paper discusses the effect that microelectronic technology will have on elementary education in the decades ahead, and some of the uses of computers as learning aids for young children, including interactive games, tutorial systems, creative activity, and simulation. (MP)

  1. CMS computing model evolution

    NASA Astrophysics Data System (ADS)

    Grandi, C.; Bonacorsi, D.; Colling, D.; Fisk, I.; Girone, M.

    2014-06-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  2. Biomarkers in Computational Toxicology

    EPA Science Inventory

    Biomarkers are a means to evaluate chemical exposure and/or the subsequent impacts on toxicity pathways that lead to adverse health outcomes. Computational toxicology can integrate biomarker data with knowledge of exposure, chemistry, biology, pharmacokinetics, toxicology, and e...

  3. Resurrecting the computer graveyard

    SciTech Connect

    McAdams, C.L.

    1995-02-01

    What eventually happens to all the old monitors, circuit boards, and other peripheral computer equipment when they die or simply become obsolete? Disposal has been made difficult by a US EPA landfill ban on some circuit boards with high lead content, and other computer parts that contain what some call dangerous amounts of toxic substances. With few other options, most companies simply store the obsolete equipment. Often referred to as "computer graveyards" or "dinosaur graveyards", these increasingly numerous office purgatories can seem like permanent fixtures in the modern workplace. Not to worry, according to a new and expanding branch of the recycling industry. While there are some companies cashing in on the refurbishment and resale of these old computers, a growing number are successfully recycling the component parts--selling the plastics, metal, and circuit boards--and doing it safely.

  4. Computers and Cognitive Styles.

    ERIC Educational Resources Information Center

    Chadwick, Sandra S.; Watson, J. Allen

    1986-01-01

    The computer's role in matching instructional content and method to the learning disabled student's individual cognitive style is examined for the four phases of instructional delivery: assessment, prescription, evaluation, and instruction. (DB)

  5. Computer Keyboard Savvy.

    ERIC Educational Resources Information Center

    Binderup, Denise Belick

    1988-01-01

    Elementary school students are often exposed to computer usage before they have been taught correct typing techniques. Sample exercises to teach touch-typing to young children are provided on a reproducible page. (JL)

  6. Computers Transform an Industry.

    ERIC Educational Resources Information Center

    Simich, Jack

    1982-01-01

    Describes the use of computer technology in the graphics communication industry. Areas that are examined include typesetting, color scanners, communications satellites, page make-up systems, and the business office. (CT)

  7. Computer Center: Software Review.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  8. Brain-Computer Symbiosis

    PubMed Central

    Schalk, Gerwin

    2009-01-01

    The theoretical groundwork of the 1930’s and 1940’s and the technical advance of computers in the following decades provided the basis for dramatic increases in human efficiency. While computers continue to evolve, and we can still expect increasing benefits from their use, the interface between humans and computers has begun to present a serious impediment to full realization of the potential payoff. This article is about the theoretical and practical possibility that direct communication between the brain and the computer can be used to overcome this impediment by improving or augmenting conventional forms of human communication. It is about the opportunity that the limitations of our body’s input and output capacities can be overcome using direct interaction with the brain, and it discusses the assumptions, possible limitations, and implications of a technology that I anticipate will be a major source of pervasive changes in the coming decades. PMID:18310804

  9. Computer processed LANDSAT data

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Background information and exercises are provided to: (1) establish or expand understanding of the concepts, methods, and terminology of computer processing of image producing data; (2) develop insight into the advantages of computer based image processing compared with the photointerpretation approach for processing, classifying, interpreting, and applying remote sensing data; (3) foster a broad perspective on the principles of the main techniques for image enhancement, pattern recognition, and thematic classification; (4) appreciate the pros and cons of batch and interactive modes of image analysis; (5) examine and evaluate some specific computer generated products for subscenes in Pennsylvania and New Jersey; and (6) interrelate these particular examples of output with more theoretical explanations of computer processing strategies and procedures.

  10. Computational drug discovery

    PubMed Central

    Ou-Yang, Si-sheng; Lu, Jun-yan; Kong, Xiang-qian; Liang, Zhong-jie; Luo, Cheng; Jiang, Hualiang

    2012-01-01

    Computational drug discovery is an effective strategy for accelerating and economizing drug discovery and development process. Because of the dramatic increase in the availability of biological macromolecule and small molecule information, the applicability of computational drug discovery has been extended and broadly applied to nearly every stage in the drug discovery and development workflow, including target identification and validation, lead discovery and optimization and preclinical tests. Over the past decades, computational drug discovery methods such as molecular docking, pharmacophore modeling and mapping, de novo design, molecular similarity calculation and sequence-based virtual screening have been greatly improved. In this review, we present an overview of these important computational methods, platforms and successful applications in this field. PMID:22922346
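    Of the methods listed, molecular similarity calculation is the simplest to illustrate; the sketch below ranks a toy fingerprint library against a query by Tanimoto coefficient. The fingerprints are invented bit sets, not real molecules, and the workflow is a generic similarity-based virtual screen rather than one from the review.

```python
# Tanimoto (Jaccard) similarity between binary substructure fingerprints,
# used here for a toy similarity-based virtual screen.

def tanimoto(fp_a: set, fp_b: set) -> float:
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

query = {1, 4, 7, 9, 15}                        # toy query fingerprint
library = {"mol_A": {1, 4, 7, 20},
           "mol_B": {2, 3, 5},
           "mol_C": {1, 4, 7, 9, 15, 21}}

# Rank library molecules by similarity to the query (most similar first).
for name in sorted(library, key=lambda m: tanimoto(query, library[m]), reverse=True):
    print(name, round(tanimoto(query, library[name]), 3))
```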

  11. Computer Aided Creativity.

    ERIC Educational Resources Information Center

    Proctor, Tony

    1988-01-01

    Explores the conceptual components of a computer program designed to enhance creative thinking and reviews software that aims to stimulate creative thinking. Discusses BRAIN and ORACLE, programs intended to aid in creative problem solving. (JOW)

  12. Large Scale Computing

    NASA Astrophysics Data System (ADS)

    Capiluppi, Paolo

    2005-04-01

    Large Scale Computing is acquiring an important role in the field of data analysis and treatment for many sciences and also for some social activities. The present paper discusses the characteristics of Computing when it becomes "Large Scale" and the current state of the art for some particular applications needing such large distributed resources and organization. High Energy Particle Physics (HEP) Experiments are discussed in this respect; in particular the Large Hadron Collider (LHC) Experiments are analyzed. The Computing Models of LHC Experiments represent the current prototype implementation of Large Scale Computing and describe the level of maturity of the possible deployment solutions. Some of the most recent results on the measurements of the performances and functionalities of the LHC Experiments' testing are discussed.

  13. Computer Programs (Turbomachinery)

    NASA Technical Reports Server (NTRS)

    1978-01-01

    NASA computer programs are extensively used in design of industrial equipment. Available from the Computer Software Management and Information Center (COSMIC) at the University of Georgia, these programs are employed as analysis tools in design, test and development processes, providing savings in time and money. For example, two NASA computer programs are used daily in the design of turbomachinery by Delaval Turbine Division, Trenton, New Jersey. The company uses the NASA spline interpolation routine for analysis of turbine blade vibration and the performance of compressors and condensers. A second program, the NASA print plot routine, analyzes turbine rotor response and produces graphs for project reports. The photos show examples of Delaval test operations in which the computer programs play a part. In the large photo below, a 24-inch turbine blade is undergoing test; in the smaller photo, a steam turbine rotor is being prepared for stress measurements under actual operating conditions; the "spaghetti" is wiring for test instrumentation.

  14. Reconfigurable environmentally adaptive computing

    NASA Technical Reports Server (NTRS)

    Coxe, Robin L. (Inventor); Galica, Gary E. (Inventor)

    2008-01-01

    Described are methods and apparatus, including computer program products, for reconfigurable environmentally adaptive computing technology. An environmental signal representative of an external environmental condition is received. A processing configuration is automatically selected, based on the environmental signal, from a plurality of processing configurations. A reconfigurable processing element is reconfigured to operate according to the selected processing configuration. In some examples, the environmental condition is detected and the environmental signal is generated based on the detected condition.
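    The control flow described in the abstract can be sketched as a lookup from an environmental signal to a processing configuration; the thresholds and configuration names below are invented for illustration and are not taken from the patent.

```python
# Select a processing configuration from an environmental signal and apply it
# to a reconfigurable element. Thresholds and names are illustrative only.
from dataclasses import dataclass

@dataclass
class Configuration:
    name: str
    clock_mhz: int

CONFIGS = [
    (lambda temp_c: temp_c > 70.0, Configuration("low-power", 100)),
    (lambda temp_c: temp_c > 40.0, Configuration("balanced", 400)),
    (lambda temp_c: True,          Configuration("full-throughput", 800)),
]

def select_configuration(temp_c: float) -> Configuration:
    for matches, config in CONFIGS:          # first matching predicate wins
        if matches(temp_c):
            return config

for reading in (82.5, 25.0):
    cfg = select_configuration(reading)
    print(f"sensor {reading:.1f} C -> '{cfg.name}' at {cfg.clock_mhz} MHz")
```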

  15. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1985-01-01

    Synopses are given for NASA supported work in computer science at the University of Virginia. Some areas of research include: error seeding as a testing method; knowledge representation for engineering design; analysis of faults in a multi-version software experiment; implementation of a parallel programming environment; two computer graphics systems for visualization of pressure distribution and convective density particles; task decomposition for multiple robot arms; vectorized incomplete conjugate gradient; and iterative methods for solving linear equations on the Flex/32.

  16. Computer-assisted instruction

    NASA Technical Reports Server (NTRS)

    Atkinson, R. C.

    1974-01-01

    The results are presented of a project of research and development on strategies for optimizing the instructional process, and dissemination of information about the applications of such research to the instructional medium of computer-assisted instruction. Accomplishments reported include construction of the author language INSTRUCT, construction of a practical CAI course in the area of computer science, and a number of investigations into the individualization of instruction, using the course as a vehicle.

  17. Professionalism in Computer Forensics

    NASA Astrophysics Data System (ADS)

    Irons, Alastair D.; Konstadopoulou, Anastasia

    The paper addresses the need to consider issues of professionalism in computer forensics in order to allow the discipline to develop and to ensure its credibility from the differing perspectives of practitioners, the criminal justice system, and the public. Professionalism in computer forensics must be examined and developed in order to promote the discipline and maintain its credibility.

  18. Computer Models of Proteins

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Dr. Marc Pusey (seated) and Dr. Craig Kundrot use computers to analyze x-ray maps and generate three-dimensional models of protein structures. With this information, scientists at Marshall Space Flight Center can learn how proteins are made and how they work. The computer screen depicts a protein structure as a ball-and-stick model. Other models depict the actual volume occupied by the atoms, or the ribbon-like structures that are crucial to a protein's function.

  19. Recent computational chemistry

    NASA Astrophysics Data System (ADS)

    Onishi, Taku

    2015-12-01

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules and design functional materials by computation. Because limits and problems still remain in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  20. Computer Games and Instruction

    ERIC Educational Resources Information Center

    Tobias, Sigmund, Ed.; Fletcher, J. D., Ed.

    2011-01-01

    There is intense interest in computer games. A total of 65 percent of all American households play computer games, and sales of such games increased 22.9 percent last year. The average amount of game playing time was found to be 13.2 hours per week. The popularity and market success of games is evident from both the increased earnings from games,…

  1. Recent computational chemistry

    SciTech Connect

    Onishi, Taku

    2015-12-31

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules and design functional materials by computation. Because limits and problems still remain in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  2. Partnership in Computational Science

    SciTech Connect

    Huray, Paul G.

    1999-02-24

    This is the final report for the "Partnership in Computational Science" (PICS) award in an amount of $500,000 for the period January 1, 1993 through December 31, 1993. A copy of the proposal with its budget is attached as Appendix A. This report first describes the consequent significance of the DOE award in building infrastructure of high performance computing in the Southeast and then describes the work accomplished under this grant and a list of publications resulting from it.

  3. Quantum Computing since Democritus

    NASA Astrophysics Data System (ADS)

    Aaronson, Scott

    2013-03-01

    1. Atoms and the void; 2. Sets; 3. Gödel, Turing, and friends; 4. Minds and machines; 5. Paleocomplexity; 6. P, NP, and friends; 7. Randomness; 8. Crypto; 9. Quantum; 10. Quantum computing; 11. Penrose; 12. Decoherence and hidden variables; 13. Proofs; 14. How big are quantum states?; 15. Skepticism of quantum computing; 16. Learning; 17. Interactive proofs and more; 18. Fun with the Anthropic Principle; 19. Free will; 20. Time travel; 21. Cosmology and complexity; 22. Ask me anything.

  4. SLAC B Factory computing

    SciTech Connect

    Kunz, P.F.

    1992-02-01

    As part of the research and development program in preparation for a possible B Factory at SLAC, a group has been studying various aspects of HEP computing. In particular, the group is investigating the use of UNIX for all computing, from data acquisition, through analysis, and word processing. A summary of some of the results of this study will be given, along with some personal opinions on these topics.

  5. Is thinking computable?

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    Strong artificial intelligence claims that conscious thought can arise in computers containing the right algorithms even though none of the programs or components of those computers understands what is going on. As proof, it asserts that brains are finite webs of neurons, each with a definite function governed by the laws of physics; this web has a set of equations that can be solved (or simulated) by a sufficiently powerful computer. Strong AI claims the Turing test as a criterion of success. A recent debate in Scientific American concludes that the Turing test is not sufficient, but leaves intact the underlying premise that thought is a computable process. The recent book by Roger Penrose, however, offers a sharp challenge, arguing that the laws of quantum physics may govern mental processes and that these laws may not be computable. In every area of mathematics and physics, Penrose finds evidence of nonalgorithmic human activity and concludes that mental processes are inherently more powerful than computational processes.

  6. Computing with Neural Synchrony

    PubMed Central

    Brette, Romain

    2012-01-01

    Neurons communicate primarily with spikes, but most theories of neural computation are based on firing rates. Yet, many experimental observations suggest that the temporal coordination of spikes plays a role in sensory processing. Among potential spike-based codes, synchrony appears as a good candidate because neural firing and plasticity are sensitive to fine input correlations. However, it is unclear what role synchrony may play in neural computation, and what functional advantage it may provide. With a theoretical approach, I show that the computational interest of neural synchrony appears when neurons have heterogeneous properties. In this context, the relationship between stimuli and neural synchrony is captured by the concept of synchrony receptive field, the set of stimuli which induce synchronous responses in a group of neurons. In a heterogeneous neural population, it appears that synchrony patterns represent structure or sensory invariants in stimuli, which can then be detected by postsynaptic neurons. The required neural circuitry can spontaneously emerge with spike-timing-dependent plasticity. Using examples in different sensory modalities, I show that this allows simple neural circuits to extract relevant information from realistic sensory stimuli, for example to identify a fluctuating odor in the presence of distractors. This theory of synchrony-based computation shows that relative spike timing may indeed have computational relevance, and suggests new types of neural network models for sensory processing with appealing computational properties. PMID:22719243
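    As a toy illustration of the detection step described above, a downstream neuron that responds to coincident rather than merely frequent input spikes, the sketch below counts spike pairs falling within a narrow coincidence window; the spike times and window are invented values, not the paper's simulations.

```python
# Count coincident spikes between two input neurons: a crude stand-in for a
# postsynaptic coincidence detector that responds to synchrony, not rate.

def coincidences(spikes_a, spikes_b, window=0.002):
    """Pairs of spike times (in seconds) closer together than the window."""
    return [(ta, tb) for ta in spikes_a for tb in spikes_b if abs(ta - tb) <= window]

neuron_a = [0.010, 0.052, 0.103, 0.151, 0.200]   # toy spike trains with similar rates
neuron_b = [0.011, 0.070, 0.104, 0.180, 0.201]

events = coincidences(neuron_a, neuron_b)
print(f"{len(events)} coincident events at", [round(ta, 3) for ta, _ in events])
```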

  7. EVOLUTIONARY COMPUTING PROJECT

    SciTech Connect

    C. BARRETT; C. REIDYS

    2000-09-01

    This report summarizes LDRD-funded mathematical research related to computer simulation, inspired in part by combinatorial analysis of sequence-to-structure relationships of biomolecules. Computer simulations calculate the interactions among many individual, local entities, thereby generating global dynamics. The objective of this project was to establish a mathematical basis for a comprehensive theory of computer simulations. This mathematical theory is intended to rigorously underwrite very large complex simulations, including simulation of bio- and socio-technical systems. We believe excellent progress has been made. Abstraction of three main ingredients of simulation forms the mathematical setting, called Sequential Dynamical Systems (SDS): (1) functions realized as data-local procedures represent entity state transformations, (2) a graph that expresses locality of the functions and which represents the dependencies among entities, and (3) an ordering, or schedule, according to which the entities are evaluated, e.g., updated. The research spans algebraic foundations, formal dynamical systems, computer simulation, and theoretical computer science. The theoretical approach is also deeply related to theoretical issues in parallel compilation. Numerous publications were produced, follow-on projects have been identified and are being developed programmatically, and a new area in computational algebra, SDS, was produced.
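    The three ingredients listed above can be made concrete in a few lines: a dependency graph, a local update function per vertex, and a schedule applied sequentially so that later updates see earlier ones. The majority rule and the 4-cycle below are illustrative choices, not taken from the report.

```python
# A tiny sequential dynamical system: graph + local functions + update schedule.

graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # dependency graph (a 4-cycle)
schedule = [0, 1, 2, 3]                                 # sequential update order

def local_majority(v, s):
    """Local function: majority over v and its neighbors (ties go to 1)."""
    votes = [s[v]] + [s[u] for u in graph[v]]
    return 1 if 2 * sum(votes) >= len(votes) else 0

def sds_step(s):
    s = dict(s)
    for v in schedule:            # sequential: later vertices see updated values
        s[v] = local_majority(v, s)
    return s

print(sds_step({0: 1, 1: 0, 2: 1, 3: 0}))
```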

  8. Word To Compute By: Keeping Up with Personal Computing Terminology.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1997-01-01

    Reviews current terminology related to personal computing, referring to a previous article written five years ago. Highlights include Macintosh processors; Pentium processors; computer memory; computer buses; video displays; storage devices; newer media; and miscellaneous terms and concepts. (LRW)

  9. Science Teaching and Computer Languages.

    ERIC Educational Resources Information Center

    Bork, Alfred M.

    Computer languages are analyzed and compared from the standpoint of the science teacher using computers in the classroom. Computers have three basic uses in teaching, to compute, to instruct, and to motivate; effective computer languages should be responsive to these three modes. Widely-used languages, including FORTRAN, ALGOL, PL/1, and APL, are…

  10. Computer Technology in Adult Education.

    ERIC Educational Resources Information Center

    Slider, Patty; Hodges, Kathy; Carter, Cea; White, Barbara

    This publication provides materials to help adult educators use computer technology in their teaching. Section 1, Computer Basics, contains activities and materials on these topics: increasing computer literacy, computer glossary, parts of a computer, keyboard, disk care, highlighting text, scrolling and wrap-around text, setting up text,…

  11. New Computing in Higher Education.

    ERIC Educational Resources Information Center

    Gilbert, Steven W.; Green, Kenneth C.

    1986-01-01

    With the advent of the computer revolution, major changes are underway in the ways campuses deal with computing and computers. One change is the gathering momentum of faculty members and administrators to become computer users. Another is the vast amount of individual and institutional effort invested in plans for integrating computing into the…

  12. Quantum computing with trapped ions

    SciTech Connect

    Hughes, R.J.

    1998-01-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  13. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    ERIC Educational Resources Information Center

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  14. Publishing Trends in Educational Computing.

    ERIC Educational Resources Information Center

    O'Hair, Marilyn; Johnson, D. LaMont

    1989-01-01

    Describes results of a survey of secondary school and college teachers that was conducted to determine subject matter that should be included in educational computing journals. Areas of interest included computer applications; artificial intelligence; computer-aided instruction; computer literacy; computer-managed instruction; databases; distance…

  15. Computer automated design and computer automated manufacture.

    PubMed

    Brncick, M

    2000-08-01

    The introduction of computer aided design and computer aided manufacturing into the field of prosthetics and orthotics did not arrive without concern. Many prosthetists feared that the computer would provide other allied health practitioners who had little or no experience in prosthetics the ability to fit and manage amputees. Technicians in the field felt their jobs might be jeopardized by automated fabrication techniques. This has not turned out to be the case. Prosthetists who use CAD-CAM techniques are finding they have more time for patient care and clinical assessment. CAD-CAM is another tool for them to provide better care for the patients/clients they serve. One of the factors that deterred the acceptance of CAD-CAM techniques in its early stages was that of cost. It took a significant investment in software and hardware for the prosthetists to begin to use the new systems. This new technique was not reimbursed by insurance coverage. Practitioners did not have enough information about this new technique to make a sound decision on their investment of time and money. Ironically, it is the need to hold health care costs down that may prove to be the catalyst for the increased use of CAD-CAM in the field. Providing orthoses and prostheses to patients who require them is a very labor intensive process. Practitioners are looking for better, faster, and more economical ways in which to provide their services under the pressure of managed care. CAD-CAM may be the answer. The author foresees shape sensing departments in hospitals where patients would be sent to be digitized, similar to someone going for a radiograph or an ultrasound. Afterwards, an orthosis or prosthesis could be provided from a central fabrication facility at a remote site, most likely on the same day. Not long ago, highly skilled practitioners with extensive technical ability would custom make almost every orthosis. One now practices in an atmosphere where off-the-shelf orthoses are the standard. This

  16. Parallel Computing in SCALE

    SciTech Connect

    DeHart, Mark D; Williams, Mark L; Bowman, Stephen M

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  17. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  18. Trends in Higher Education Computing.

    ERIC Educational Resources Information Center

    Thomas, Charles R.

    1984-01-01

    Discusses the effects which changes in computer technology are having on the organization, staffing, and budgets at institutions of higher education. Trends in computer hardware, computer software, and in office automation are also discussed. (JN)

  19. [Theme: Computers in Agricultural Education.].

    ERIC Educational Resources Information Center

    Leising, James G.; And Others

    1982-01-01

    This theme issue on computer applications in agriculture covers uses of computers in farm management and vocational agriculture instruction; strategies for evaluating and purchasing hardware and software; and ideas for improving the computer literacy of teachers and students. (SK)

  20. Hot, Hot, Hot Computer Careers.

    ERIC Educational Resources Information Center

    Basta, Nicholas

    1988-01-01

    Discusses the increasing need for electrical, electronic, and computer engineers; and scientists. Provides current status of the computer industry and average salaries. Considers computer chip manufacture and the current chip shortage. (MVL)

  1. PERSONAL COMPUTERS AND ENVIRONMENTAL ENGINEERING

    EPA Science Inventory

    This article discusses how personal computers can be applied to environmental engineering. After explaining some of the differences between mainframe and personal computers, we will review the development of personal computers and describe the areas of data management, interactive...

  2. Highly parallel computer architecture for robotic computation

    NASA Technical Reports Server (NTRS)

    Fijany, Amir (Inventor); Bejczy, Anta K. (Inventor)

    1991-01-01

    In a computer having a large number of single instruction multiple data (SIMD) processors, each of the SIMD processors has two sets of three individual processor elements controlled by a master control unit and interconnected among a plurality of register file units where data is stored. The register files input and output data in synchronism with a minor cycle clock under control of two slave control units controlling the register file units connected to respective ones of the two sets of processor elements. Depending upon which ones of the register file units are enabled to store or transmit data during a particular minor clock cycle, the processor elements within an SIMD processor are connected in rings or in pipeline arrays, and may exchange data with the internal bus or with neighboring SIMD processors through interface units controlled by respective ones of the two slave control units.

  3. Computational models of planning.

    PubMed

    Geffner, Hector

    2013-07-01

    The selection of the action to do next is one of the central problems faced by autonomous agents. Natural and artificial systems address this problem in various ways: action responses can be hardwired, they can be learned, or they can be computed from a model of the situation, the actions, and the goals. Planning is the model-based approach to action selection and a fundamental ingredient of intelligent behavior in both humans and machines. Planning, however, is computationally hard as the consideration of all possible courses of action is not computationally feasible. The problem has been addressed by research in Artificial Intelligence that in recent years has uncovered simple but powerful computational principles that make planning feasible. The principles take the form of domain-independent methods for computing heuristics or appraisals that enable the effective generation of goal-directed behavior even over huge spaces. In this paper, we look at several planning models, at methods that have been shown to scale up to large problems, and at what these methods may suggest about the human mind. WIREs Cogn Sci 2013, 4:341-356. doi: 10.1002/wcs.1233 The authors have declared no conflicts of interest for this article. For further resources related to this article, please visit the WIREs website. PMID:26304223
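    A minimal sketch of the heuristic idea discussed above is given below: a greedy best-first search guided by a domain-independent estimate, here simply the count of goal facts not yet achieved. The two-action toy domain is invented for illustration and is not taken from the article.

```python
# Greedy best-first search with a simple domain-independent heuristic:
# h(state) = number of goal facts not yet true.
import heapq
from itertools import count

def plan(initial, goal, actions):
    h = lambda s: len(goal - s)
    tie = count()                               # tie-breaker for the heap
    frontier = [(h(initial), next(tie), initial, [])]
    seen = {initial}
    while frontier:
        _, _, state, path = heapq.heappop(frontier)
        if goal <= state:
            return path
        for name, pre, add, delete in actions:
            if pre <= state:
                nxt = (state - delete) | add
                if nxt not in seen:
                    seen.add(nxt)
                    heapq.heappush(frontier, (h(nxt), next(tie), nxt, path + [name]))
    return None

actions = [("make-a", frozenset(), frozenset({"a"}), frozenset()),
           ("make-b", frozenset({"a"}), frozenset({"b"}), frozenset())]
print(plan(frozenset(), frozenset({"a", "b"}), actions))   # ['make-a', 'make-b']
```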

  4. Extensible Computational Chemistry Environment

    Energy Science and Technology Software Center (ESTSC)

    2012-08-09

    ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of researchers being able to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first of kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  5. Verifiable Quantum Computing

    NASA Astrophysics Data System (ADS)

    Kashefi, Elham

    Over the next five to ten years we will see a state of flux as quantum devices become part of the mainstream computing landscape. However adopting and applying such a highly variable and novel technology is both costly and risky as this quantum approach has an acute verification and validation problem: On the one hand, since classical computations cannot scale up to the computational power of quantum mechanics, verifying the correctness of a quantum-mediated computation is challenging; on the other hand, the underlying quantum structure resists classical certification analysis. Our grand aim is to settle these key milestones to make the translation from theory to practice possible. Currently the most efficient ways to verify a quantum computation is to employ cryptographic methods. I will present the current state of the art of various existing protocols where generally there exists a trade-off between the practicality of the scheme versus their generality, trust assumptions and security level. EK gratefully acknowledges funding through EPSRC Grants EP/N003829/1 and EP/M013243/1.

  6. Optical computer motherboards

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz P.; Xu, Guoda; Bartha, John M.; Gruntman, Michael A.

    1997-09-01

    In this paper, we investigate the application of precision plastic optics into a communication/computer sub-system, such as a hybrid computer motherboard. We believe that using optical waveguides for next-generation computer motherboards can provide a high performance alternative for present multi-layer printed circuit motherboards. In response to this demand, we suggest our novel concept of a hybrid motherboard based on an internal-fiber-coupling (IFC) wavelength-division-multiplexing (WDM) optical backplane. The IFC/WDM backplane provides dedicated Tx/Rx connections, and applies low-cost, high-performance components, including CD LDs, GRIN plastic fibers, molding housing, and nonimaging optics connectors. Preliminary motherboard parameters are: speed 100 MHz/100 m, or 1 GHz/10 m; fiber loss approximately 0.01 dB/m; almost zero fan-out/fan-in optical power loss, and eight standard wavelength channels. The proposed hybrid computer motherboard, based on innovative optical backplane technology, should solve low-speed, low-parallelism bottlenecks in present electric computer motherboards.

  7. Computer Use and Computer Anxiety in Older Korean Americans.

    PubMed

    Yoon, Hyunwoo; Jang, Yuri; Xie, Bo

    2016-09-01

    Responding to the limited literature on computer use in ethnic minority older populations, the present study examined predictors of computer use and computer anxiety in older Korean Americans. Separate regression models were estimated for computer use and computer anxiety with the common sets of predictors: (a) demographic variables (age, gender, marital status, and education), (b) physical health indicators (chronic conditions, functional disability, and self-rated health), and (c) sociocultural factors (acculturation and attitudes toward aging). Approximately 60% of the participants were computer-users, and they had significantly lower levels of computer anxiety than non-users. A higher likelihood of computer use and lower levels of computer anxiety were commonly observed among individuals with younger age, male gender, advanced education, more positive ratings of health, and higher levels of acculturation. In addition, positive attitudes toward aging were found to reduce computer anxiety. Findings provide implications for developing computer training and education programs for the target population. PMID:25698717

  8. Guide to computing at ANL

    SciTech Connect

    Heiberger, A.A.

    1990-05-01

    This guide to computing at ANL acquaints you with the various computers, computing systems, and computing procedures at Argonne and assists you in selecting the computer, system, network, and software with which you can do your work most efficiently. The chapters discuss available computers and services; getting started; sources of information; application software; mathematical libraries and utilities; programming languages; graphics; storing your data on disk and tape files; and your user voice. (LSP)

  9. Personal computers on Ethernet

    NASA Technical Reports Server (NTRS)

    Kao, R.

    1988-01-01

    Many researchers in the Division have projects which require transferring large files between their personal computers (PC) and VAX computers in the Laboratory for Oceans Computing Facility (LOCF). Since the Ethernet local area network provides high speed communication channels which make file transfers (among other capabilities) practical, a network plan was assembled to connect IBM and IBM-compatible PC's to Ethernet for participating personnel. The design employs ThinWire Ethernet technology. A simplified configuration diagram is shown. A DEC multiport repeater (DEMPR) is used for connection of ThinWire Ethernet segments. One port of DEMPR is connected to a H4000 transceiver and the transceiver is clamped onto the Goddard Ethernet backbone coaxial cable so that the PC's can be optionally on the SPAN network. All these common elements were successfully installed and tested.

  10. Optoelectronic Reservoir Computing

    PubMed Central

    Paquot, Y.; Duport, F.; Smerieri, A.; Dambre, J.; Schrauwen, B.; Haelterman, M.; Massar, S.

    2012-01-01

    Reservoir computing is a recently introduced, highly efficient bio-inspired approach for processing time-dependent data. The basic scheme of reservoir computing consists of a nonlinear recurrent dynamical system coupled to a single input layer and a single output layer. Within these constraints many implementations are possible. Here we report an optoelectronic implementation of reservoir computing based on a recently proposed architecture consisting of a single nonlinear node and a delay line. Our implementation is sufficiently fast for real-time information processing. We illustrate its performance on tasks of practical importance such as nonlinear channel equalization and speech recognition, and obtain results comparable to state-of-the-art digital implementations. PMID:22371825
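
    To make the scheme concrete, the following Python sketch implements reservoir computing in its generic software form: a fixed random recurrent reservoir driven by the input, with only a linear readout trained by ridge regression. The reservoir size, tanh nonlinearity, delay task, and ridge parameter are illustrative assumptions and do not reproduce the single-node optoelectronic hardware reported in the record.

      # Minimal reservoir-computing sketch: fixed random reservoir, trained linear readout.
      import numpy as np

      rng = np.random.default_rng(0)
      n_res, n_steps = 100, 2000

      # Input: a random signal; target: the same signal delayed by 5 steps (toy task)
      u = rng.uniform(-1, 1, n_steps)
      target = np.roll(u, 5)

      W_in = rng.uniform(-0.5, 0.5, n_res)                 # fixed input weights
      W = rng.normal(0, 1, (n_res, n_res))
      W *= 0.9 / max(abs(np.linalg.eigvals(W)))            # scale spectral radius below 1

      # Drive the reservoir and collect its states
      x = np.zeros(n_res)
      states = np.zeros((n_steps, n_res))
      for t in range(n_steps):
          x = np.tanh(W @ x + W_in * u[t])
          states[t] = x

      # Train only the linear readout, by ridge regression on the collected states
      ridge = 1e-6
      W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                              states.T @ target)

      prediction = states @ W_out
      print("readout NMSE:", np.mean((prediction - target) ** 2) / np.var(target))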

  11. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with. PMID:27090952

  12. Computer networking at FERMILAB

    SciTech Connect

    Chartrand, G.

    1986-05-01

    Management aspects of data communications facilities at Fermilab are described. Local area networks include Ferminet, a broadband CATV system which serves as a backbone-type carrier for high-speed data traffic between major network nodes; the Micom network, four Micom Micro-600/2A port selectors connected via private twisted-pair cables, dedicated telephone circuits, or Micom 800/2 statistical multiplexors; and Decnet/Ethernet, several small local area networks which provide host-to-host communications for about 35 VAX computer systems. Wide area (off site) computer networking includes an off-site Micom network which provides access to all of Fermilab's computer systems for 10 universities via leased lines or modem; Tymnet, used by many European and Japanese collaborations; Physnet, used for shared data processing task communications by large collaborations of universities; Bitnet, used for file transfer, electronic mail, and communications with CERN; and Mfenet, for access to supercomputers. Plans to participate in Hepnet are also addressed. 3 figs. (DWL)

  13. The future of computing

    NASA Astrophysics Data System (ADS)

    Simmons, Michelle

    2016-05-01

    Down-scaling has been the leading paradigm of the semiconductor industry since the invention of the first transistor in 1947. However miniaturization will soon reach the ultimate limit, set by the discreteness of matter, leading to intensified research in alternative approaches for creating logic devices. This talk will discuss the development of a radical new technology for creating atomic-scale devices which is opening a new frontier of research in electronics globally. We will introduce single atom transistors where we can measure both the charge and spin of individual dopants with unique capabilities in controlling the quantum world. To this end, we will discuss how we are now demonstrating atom by atom, the best way to build a quantum computer - a new type of computer that exploits the laws of physics at very small dimensions in order to provide an exponential speed up in computational processing power.

  14. DNA computing in microreactors

    NASA Astrophysics Data System (ADS)

    Wagler, Patrick; van Noort, Danny; McCaskill, John S.

    2001-11-01

    The goal of this research is to improve the modular stability and programmability of DNA-based computers and, in a second step, to move towards optically programmable DNA computing. The main focus here is on hydrodynamic stability. Clockable microreactors can be connected in various ways to solve combinatorial optimisation problems, such as Maximum Clique or 3-SAT. This work demonstrates by construction how one micro-reactor design can be programmed to solve any instance of Maximum Clique up to its given maximum size (N). It reports on an implementation of the architecture proposed previously. This contrasts with conventional DNA computing, where the individual sequence of biochemical operations depends on the specific problem. In this pilot study we are tackling a graph for the Maximum Clique problem with N
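
    For reference, a hedged classical brute-force sketch of the Maximum Clique problem itself follows (Python); it shows only what the microreactor architecture is programmed to solve, not the DNA implementation, and the small example graph is invented for illustration.

      # Brute-force Maximum Clique reference (exponential; fine for tiny illustrative graphs).
      from itertools import combinations

      def max_clique(vertices, edges):
          """Return one largest subset of vertices in which every pair is connected."""
          edge_set = {frozenset(e) for e in edges}
          for size in range(len(list(vertices)), 0, -1):        # try largest sizes first
              for subset in combinations(vertices, size):
                  if all(frozenset(p) in edge_set for p in combinations(subset, 2)):
                      return set(subset)
          return set()

      example_edges = [(0, 1), (0, 2), (1, 2), (2, 3), (1, 3), (3, 4)]
      print(max_clique(range(5), example_edges))                # prints a 3-clique, e.g. {0, 1, 2}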

  15. The Computing Grids

    NASA Astrophysics Data System (ADS)

    Govoni, P.

    2009-12-01

    Since the beginning of the millennium, High Energy Physics research institutions like CERN and INFN pioneered several projects aimed at exploiting the synergy among computing power, storage and network resources, and creating an infrastructure of distributed computing on a worldwide scale. In the year 2000, after the Monarch project [ http://monarc.web.cern.ch/MONARC/], DataGrid started [ http://eu-datagrid.web.cern.ch/eu-datagrid/] aimed at providing High Energy Physics with the computing power needed for the LHC enterprise. This program evolved into the EU DataGrid project, that implemented the first actual prototype of a Grid middleware running on a testbed environment. The next step consisted in the application to the LHC experiments, with the LCG project [ http://lcg.web.cern.ch/LCG/], in turn followed by the EGEE [ http://www.eu-egee.org/] and EGEE II programs.

  16. Computers for symbolic processing

    NASA Technical Reports Server (NTRS)

    Wah, Benjamin W.; Lowrie, Matthew B.; Li, Guo-Jie

    1989-01-01

    A detailed survey on the motivations, design, applications, current status, and limitations of computers designed for symbolic processing is provided. Symbolic processing computations are performed at the word, relation, or meaning levels, and the knowledge used in symbolic applications may be fuzzy, uncertain, indeterminate, and ill represented. Various techniques for knowledge representation and processing are discussed from both the designers' and users' points of view. The design and choice of a suitable language for symbolic processing and the mapping of applications into a software architecture are then considered. The process of refining the application requirements into hardware and software architectures is treated, and state-of-the-art sequential and parallel computers designed for symbolic processing are discussed.

  17. Computational Systems Biology

    SciTech Connect

    McDermott, Jason E.; Samudrala, Ram; Bumgarner, Roger E.; Montogomery, Kristina; Ireton, Renee

    2009-05-01

    Computational systems biology is the term that we use to describe computational methods to identify, infer, model, and store relationships between the molecules, pathways, and cells (“systems”) involved in a living organism. Based on this definition, the field of computational systems biology has been in existence for some time. However, the recent confluence of high-throughput methodology for biological data gathering, genome-scale sequencing, and computational processing power has driven a reinvention and expansion of this field. The expansions include not only modeling of small metabolic (Ishii, 2004; Ekins, 2006; Lafaye, 2005) and signaling systems (Stevenson-Paulik, 2006; Lafaye, 2005) but also modeling of the relationships between biological components in very large systems, including whole cells and organisms (Ideker, 2001; Pe'er, 2001; Pilpel, 2001; Ideker, 2002; Kelley, 2003; Shannon, 2003; Ideker, 2004; Schadt, 2003; Schadt, 2006; McDermott, 2002; McDermott, 2005). Generally these models provide a general overview of one or more aspects of these systems and leave the determination of details to experimentalists focused on smaller subsystems. The promise of such approaches is that they will elucidate patterns, relationships, and general features that are not evident from examining specific components or subsystems. These predictions are either interesting in and of themselves (for example, the identification of an evolutionary pattern), or are interesting and valuable to researchers working on a particular problem (for example, highlighting a previously unknown functional pathway). Two events have occurred to bring the field of computational systems biology to the forefront. One is the advent of high-throughput methods that have generated large amounts of information about particular systems in the form of genetic studies, gene expression analyses (both protein and

  18. Visual Computing Environment Workshop

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles (Compiler)

    1998-01-01

    The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.

  19. Human-computer interface

    DOEpatents

    Anderson, Thomas G.

    2004-12-21

    The present invention provides a method of human-computer interfacing. Force feedback allows intuitive navigation and control near a boundary between regions in a computer-represented space. For example, the method allows a user to interact with a virtual craft, then push through the windshield of the craft to interact with the virtual world surrounding the craft. As another example, the method allows a user to feel transitions between different control domains of a computer representation of a space. The method can provide for force feedback that increases as a user's locus of interaction moves near a boundary, then perceptibly changes (e.g., abruptly drops or changes direction) when the boundary is traversed.
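
    The boundary behaviour described (force that grows as the locus of interaction nears a boundary, then drops when the boundary is traversed) can be illustrated with a minimal Python sketch; the force law, stiffness, and engagement range below are illustrative assumptions rather than the patented method.

      # Toy force-feedback law for a traversable boundary (e.g. the virtual windshield).
      def boundary_force(distance_to_boundary, stiffness=50.0, engage_range=0.1):
          """Feedback force from signed distance to the boundary (positive = not yet traversed)."""
          if distance_to_boundary < 0:            # boundary traversed: force abruptly drops
              return 0.0
          if distance_to_boundary > engage_range: # too far away to feel the boundary
              return 0.0
          return stiffness * (engage_range - distance_to_boundary)

      for d in [0.20, 0.08, 0.04, 0.01, -0.01]:
          print(f"distance {d:+.2f} m -> force {boundary_force(d):.2f} N")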

  20. Constructing computer virus phylogenies

    SciTech Connect

    Goldberg, L.A.; Goldberg, P.W.; Phillips, C.A.; Sorkin, G.B.

    1996-03-01

    There has been much recent algorithmic work on the problem of reconstructing the evolutionary history of biological species. Computer virus specialists are interested in finding the evolutionary history of computer viruses--a virus is often written using code fragments from one or more other viruses, which are its immediate ancestors. A phylogeny for a collection of computer viruses is a directed acyclic graph whose nodes are the viruses and whose edges map ancestors to descendants and satisfy the property that each code fragment is "invented" only once. To provide a simple explanation for the data, we consider the problem of constructing such a phylogeny with a minimal number of edges. In general, this optimization problem cannot be solved in quasi-polynomial time unless NQP=QP; we present positive and negative results for associated approximation problems. When tree solutions exist, they can be constructed and randomly sampled in polynomial time.
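
    The "invented only once" constraint can be made concrete with a small Python sketch that checks a candidate phylogeny: each virus is modelled as a set of code fragments, and a fragment counts as invented at every node none of whose ancestors already contains it. The toy viruses and candidate DAG below are illustrative, not data from the paper.

      # Check the "each code fragment invented once" property of a candidate phylogeny DAG.
      from collections import defaultdict

      viruses = {                      # virus name -> code fragments it contains
          "A": {"boot", "payload1"},
          "B": {"boot", "payload1", "stealth"},
          "C": {"boot", "payload2"},
      }
      parents = {"A": [], "B": ["A"], "C": ["A"]}    # candidate phylogeny: A -> B, A -> C

      def ancestors(v):
          """All nodes reachable from v by following parent edges."""
          seen, stack = set(), list(parents[v])
          while stack:
              p = stack.pop()
              if p not in seen:
                  seen.add(p)
                  stack.extend(parents[p])
          return seen

      invention_count = defaultdict(int)
      for v, fragments in viruses.items():
          anc = ancestors(v)
          inherited = set().union(*(viruses[a] for a in anc)) if anc else set()
          for f in fragments - inherited:            # fragment first appears at this node
              invention_count[f] += 1

      # In a valid phylogeny every fragment maps to exactly 1
      print(dict(invention_count))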

  1. Auto covariance computer

    NASA Technical Reports Server (NTRS)

    Hepner, T. E.; Meyers, J. F. (Inventor)

    1985-01-01

    A laser velocimeter covariance processor which calculates the auto covariance and cross covariance functions for a turbulent flow field based on Poisson sampled measurements in time from a laser velocimeter is described. The device will process a block of data that is up to 4096 data points in length and return a 512 point covariance function with 48-bit resolution along with a 512 point histogram of the interarrival times which is used to normalize the covariance function. The device is designed to interface and be controlled by a minicomputer from which the data is received and the results returned. A typical 4096 point computation takes approximately 1.5 seconds to receive the data, compute the covariance function, and return the results to the computer.
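
    A hedged software analogue of the processor's computation is sketched below in Python: products of velocity fluctuations from Poisson-sampled data are accumulated into lag bins, and a histogram of pair separations normalizes each bin, mirroring the 512-point covariance function and interarrival-time histogram described. The synthetic signal, bin count, and lag window are illustrative assumptions.

      # Slotted auto covariance estimate for randomly (Poisson) sampled data.
      import numpy as np

      rng = np.random.default_rng(1)
      n_samples, n_bins, max_lag = 4096, 512, 0.1          # max_lag in seconds

      times = np.cumsum(rng.exponential(1e-3, n_samples))  # Poisson sampling instants
      velocity = np.sin(2 * np.pi * 30 * times) + 0.1 * rng.normal(size=n_samples)
      fluct = velocity - velocity.mean()

      cov = np.zeros(n_bins)
      hist = np.zeros(n_bins, dtype=int)                   # pair-separation histogram
      for i in range(n_samples):
          j = i + 1
          while j < n_samples and times[j] - times[i] < max_lag:
              b = int((times[j] - times[i]) / max_lag * n_bins)
              cov[b] += fluct[i] * fluct[j]                # accumulate product in its lag bin
              hist[b] += 1
              j += 1

      # Normalize each lag bin by the number of pairs that landed in it
      cov = np.divide(cov, hist, out=np.zeros_like(cov), where=hist > 0)
      print("shortest-lag bin ≈ signal variance:", cov[0], fluct.var())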

  2. Computer integrated laboratory testing

    NASA Technical Reports Server (NTRS)

    Dahl, Charles C.

    1992-01-01

    The objective is the integration of computers into the Engineering Materials Science Laboratory course, where existing test equipment is not computerized. The first lab procedure is to demonstrate and produce a material phase change curve. The second procedure is a demonstration of the modulus of elasticity and the related stress-strain curve, plastic performance, and maximum and failure strength. Recording data with sensors connected to a data logger (which adds a time base), with the data logger in turn connected to a computer, places the materials labs into a computer-integrated mode with minimum expense and maximum flexibility. The sensor signals are input into a spreadsheet for tabular records, curve generation, and graph printing.
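
    As an example of the post-processing the spreadsheet stage performs for the second procedure, the Python sketch below derives the modulus of elasticity as the slope of the initial linear region of a stress-strain curve and reads off the maximum stress; the specimen dimensions and logged data are invented for illustration.

      # Young's modulus from logged force/extension data (illustrative numbers only).
      import numpy as np

      area = 78.5e-6                    # cross-section in m^2 (assumed 10 mm diameter rod)
      gauge_length = 0.05               # m

      force = np.array([0, 2e3, 4e3, 6e3, 8e3, 9e3, 9.5e3])                       # N
      extension = np.array([0, 0.5e-4, 1.0e-4, 1.5e-4, 2.2e-4, 3.5e-4, 6.0e-4])   # m

      stress = force / area                       # Pa
      strain = extension / gauge_length           # dimensionless

      # Fit only the initial (elastic) points to get the modulus of elasticity
      E = np.polyfit(strain[:4], stress[:4], 1)[0]
      print(f"Young's modulus ≈ {E/1e9:.1f} GPa, max stress ≈ {stress.max()/1e6:.0f} MPa")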

  3. Stopping computer crimes

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    Two new books about intrusions and computer viruses remind us that attacks against our computers on networks are the actions of human beings. Cliff Stoll's book about the hacker who spent a year, beginning in Aug. 1986, attempting to use the Lawrence Berkeley Computer as a stepping-stone for access to military secrets is a spy thriller that illustrates the weaknesses of our password systems and the difficulties in compiling evidence against a hacker engaged in espionage. Pamela Kane's book about viruses that attack IBM PC's shows that viruses are the modern version of the old problem of a Trojan horse attack. It discusses the most famous viruses and their countermeasures, and it comes with a floppy disk of utility programs that will disinfect your PC and thwart future attack.

  4. Computer cast blast modelling

    SciTech Connect

    Chung, S.; McGill, M.; Preece, D.S.

    1994-12-31

    Cast blasting can be designed to utilize explosive energy effectively and economically for coal mining operations to remove overburden material. This paper compares two blast models known as DMC (Distinct Motion Code) and SABREX (Scientific Approach to Breaking Rock with Explosives). DMC applies discrete spherical elements interacting with the flow of explosive gases and explicit time integration to track particle motion resulting from a blast. The input to this model includes multi-layer rock properties, and both loading geometry and explosives equation-of-state parameters. It enables the user to have a wide range of control over drill pattern and explosive loading design parameters. SABREX assumes that the heave process is controlled by the explosive gases, which determine the velocity and time of initial movement of blocks within the burden, and then tracks the motion of the blocks until they come to rest. In order to reduce computing time, the in-flight collisions of blocks are not considered and the motion of the first row is made to limit the motion of subsequent rows. Although modelling a blast is a complex task, the advance in computer technology has increased the computing power of small workstations as well as personal computers (PCs) to permit a much shorter turn-around time for complex computations. The DMC can perform a blast simulation in 0.5 hours on the SUN SPARCstation 10-41, while the new SABREX 3.5 produces results of a cast blast in ten seconds on a 486 PC. Predicted percentages of cast and face velocities from both computer codes compare well with the measured results from a full scale cast blast.

  5. DNA computing on surfaces

    NASA Astrophysics Data System (ADS)

    Liu, Qinghua; Wang, Liman; Frutos, Anthony G.; Condon, Anne E.; Corn, Robert M.; Smith, Lloyd M.

    2000-01-01

    DNA computing was proposed as a means of solving a class of intractable computational problems in which the computing time can grow exponentially with problem size (the 'NP-complete' or non-deterministic polynomial time complete problems). The principle of the technique has been demonstrated experimentally for a simple example of the Hamiltonian path problem (in this case, finding an airline flight path between several cities, such that each city is visited only once). DNA computational approaches to the solution of other problems have also been investigated. One technique involves the immobilization and manipulation of combinatorial mixtures of DNA on a support. A set of DNA molecules encoding all candidate solutions to the computational problem of interest is synthesized and attached to the surface. Successive cycles of hybridization operations and exonuclease digestion are used to identify and eliminate those members of the set that are not solutions. Upon completion of all the multi-step cycles, the solution to the computational problem is identified using a polymerase chain reaction to amplify the remaining molecules, which are then hybridized to an addressed array. The advantages of this approach are its scalability and potential to be automated (the use of solid-phase formats simplifies the complex repetitive chemical processes, as has been demonstrated in DNA and protein synthesis). Here we report the use of this method to solve an NP-complete problem. We consider a small example of the satisfiability problem (SAT), in which the values of a set of boolean variables satisfying certain logical constraints are determined.
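
    The cycle of attach, hybridize-to-protect, and digest can be simulated in a few lines; the Python sketch below keeps, for each clause, only the candidate assignments that satisfy it, and whatever survives all cycles is a solution. The three-variable formula is an illustrative example, not the instance solved in the paper.

      # In-silico analogue of surface-based SAT: destroy non-satisfying candidates clause by clause.
      from itertools import product

      # CNF clauses: each literal is (variable index, required value)
      clauses = [
          [(0, True), (1, False)],          # x0 OR NOT x1
          [(1, True), (2, True)],           # x1 OR x2
          [(0, False), (2, False)],         # NOT x0 OR NOT x2
      ]

      # "Synthesize and attach" every candidate assignment to the surface
      surface = {bits for bits in product([False, True], repeat=3)}

      # One hybridization/digestion cycle per clause: unsatisfying strands are eliminated
      for clause in clauses:
          surface = {bits for bits in surface
                     if any(bits[var] == val for var, val in clause)}

      # "Amplify and read out" the surviving strands (the satisfying assignments)
      print(sorted(surface))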

  6. Computers, Nanotechnology and Mind

    NASA Astrophysics Data System (ADS)

    Ekdahl, Bertil

    2008-10-01

    In 1958, two years after the Dartmouth conference, where the term artificial intelligence was coined, Herbert Simon and Allen Newell asserted the existence of "machines that think, that learn and create." They were further prophesying that the machines' capacity would increase and be on par with the human mind. Now, 50 years later, computers perform many more tasks than one could imagine in the 1950s, but virtually no computer can do more than could the first digital computer, developed by John von Neumann in the 1940s. Computers still follow algorithms; they do not create them. However, the development of nanotechnology seems to have given rise to new hopes. With nanotechnology two things are supposed to happen. Firstly, due to the small scale it will be possible to construct huge computer memories, which are supposed to be the precondition for building an artificial brain; secondly, nanotechnology will make it possible to scan the brain, which in turn will make reverse engineering possible; the mind will be decoded by studying the brain. The consequence of such a belief is that the brain is no more than a calculator, i.e., all that the mind can do is in principle the result of arithmetical operations. Computers are equivalent to formal systems, which in turn was an answer to an idea by Hilbert that proofs should contain ideal statements to which operations cannot be applied in a contentual way. The advocates of artificial intelligence will place content in a machine that is developed not only to be free of content but also cannot contain content. In this paper I argue that the hope for artificial intelligence is in vain.

  7. Computational evolution: taking liberties.

    PubMed

    Correia, Luís

    2010-09-01

    Evolution has, for a long time, inspired computer scientists to produce computer models mimicking its behavior. Evolutionary algorithm (EA) is one of the areas where this approach has flourished. EAs have been used to model and study evolution, but they have been especially developed for their aptitude as optimization tools for engineering. Developed models are quite simple in comparison with their natural sources of inspiration. However, since EAs run on computers, we have the freedom, especially in optimization models, to test approaches both realistic and outright speculative, from the biological point of view. In this article, we discuss different common evolutionary algorithm models, and then present some alternatives of interest. These include biologically inspired models, such as co-evolution and, in particular, symbiogenetics and outright artificial operators and representations. In each case, the advantages of the modifications to the standard model are identified. The other area of computational evolution, which has allowed us to study basic principles of evolution and ecology dynamics, is the development of artificial life platforms for open-ended evolution of artificial organisms. With these platforms, biologists can test theories by directly manipulating individuals and operators, observing the resulting effects in a realistic way. An overview of the most prominent of such environments is also presented. If instead of artificial platforms we use the real world for evolving artificial life, then we are dealing with evolutionary robotics (ERs). A brief description of this area is presented, analyzing its relations to biology. Finally, we present the conclusions and identify future research avenues in the frontier of computation and biology. Hopefully, this will help to draw the attention of more biologists and computer scientists to the benefits of such interdisciplinary research. PMID:20532997

  8. Computer Programs for Construction

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A NASA computer program aids Hudson Engineering Corporation, Houston, Texas, in the design and construction of huge petrochemical processing plants like the one shown, which is located at Ju'aymah, Saudi Arabia. The pipes handling the flow of chemicals are subject to a variety of stresses, such as weight and variations in temperature. Hudson Engineering uses a COSMIC piping flexibility analysis computer program to analyze and ensure the necessary strength and flexibility of the pipes. This program helps the company realize substantial savings in reduced engineering time.

  9. Computationally efficient Bayesian tracking

    NASA Astrophysics Data System (ADS)

    Aughenbaugh, Jason; La Cour, Brian

    2012-06-01

    In this paper, we describe the progress we have achieved in developing a computationally efficient, grid-based Bayesian fusion tracking system. In our approach, the probability surface is represented by a collection of multidimensional polynomials, each computed adaptively on a grid of cells representing state space. Time evolution is performed using a hybrid particle/grid approach and knowledge of the grid structure, while sensor updates use a measurement-based sampling method with a Delaunay triangulation. We present an application of this system to the problem of tracking a submarine target using a field of active and passive sonar buoys.
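
    In its simplest form, grid-based Bayesian tracking keeps the posterior over target state on a grid of cells, diffuses it in the prediction step, and multiplies in a measurement likelihood in the update step. The Python sketch below illustrates that cycle in one dimension; the polynomial-per-cell representation, hybrid particle/grid evolution, and sonar-specific likelihoods of the actual system are not reproduced, and all constants are illustrative.

      # 1-D grid-based Bayes filter: predict (shift + blur), then update (multiply likelihood).
      import numpy as np

      cells = np.linspace(0.0, 100.0, 201)                 # state-space grid (e.g. range, m)
      posterior = np.full(cells.size, 1.0 / cells.size)    # uniform prior

      def predict(p, drift=1.0, diffusion=2.0):
          """Shift the density by the assumed target drift and blur it (process noise)."""
          shifted = np.interp(cells - drift, cells, p, left=0.0, right=0.0)
          kernel = np.exp(-0.5 * ((cells - cells.mean()) / diffusion) ** 2)
          blurred = np.convolve(shifted, kernel / kernel.sum(), mode="same")
          return blurred / blurred.sum()

      def update(p, measurement, sigma=3.0):
          """Multiply in a Gaussian measurement likelihood and renormalize."""
          likelihood = np.exp(-0.5 * ((cells - measurement) / sigma) ** 2)
          p = p * likelihood
          return p / p.sum()

      true_pos = 40.0
      rng = np.random.default_rng(2)
      for step in range(10):
          true_pos += 1.0                                  # target drifts 1 m per step
          posterior = predict(posterior)
          posterior = update(posterior, true_pos + rng.normal(0, 3.0))

      print("MAP estimate:", cells[np.argmax(posterior)], "true position:", true_pos)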

  10. Computational quantum chemistry website

    SciTech Connect

    1997-08-22

    This report contains the contents of a web page related to research on the development of quantum chemistry methods for computational thermochemistry and the application of quantum chemistry methods to problems in material chemistry and chemical sciences. Research programs highlighted include: Gaussian-2 theory; Density functional theory; Molecular sieve materials; Diamond thin-film growth from buckyball precursors; Electronic structure calculations on lithium polymer electrolytes; Long-distance electronic coupling in donor/acceptor molecules; and Computational studies of NOx reactions in radioactive waste storage.

  11. Adventures in Computational Grids

    NASA Technical Reports Server (NTRS)

    Walatka, Pamela P.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Sometimes one supercomputer is not enough. Or your local supercomputers are busy, or not configured for your job. Or you don't have any supercomputers. You might be trying to simulate worldwide weather changes in real time, requiring more compute power than you could get from any one machine. Or you might be collecting microbiological samples on an island, and need to examine them with a special microscope located on the other side of the continent. These are the times when you need a computational grid.

  12. Computer networking for scientists.

    PubMed

    Jennings, D M; Landweber, L H; Fuchs, I H; Farber, D J; Adrion, W R

    1986-02-28

    Scientific research has always relied on communication for gathering and providing access to data; for exchanging information; for holding discussions, meetings, and seminars; for collaborating with widely dispersed researchers; and for disseminating results. The pace and complexity of modern research, especially collaborations of researchers in different institutions, has dramatically increased scientists' communications needs. Scientists now need immediate access to data and information, to colleagues and collaborators, and to advanced computing and information services. Furthermore, to be really useful, communication facilities must be integrated with the scientist's normal day-to-day working environment. Scientists depend on computing and communications tools and are handicapped without them. PMID:17740290

  13. Making maps with computers

    USGS Publications Warehouse

    Guptill, S.C.; Starr, L.E.

    1988-01-01

    Soon after their introduction in the 1950s, digital computers were used for various phases of the mapping process, especially for trigonometric calculations of survey data and for orientation of aerial photographs on map manuscripts. In addition, computer-controlled plotters were used to draw simple outline maps. The process of collecting data for the plotters was slow, and the resulting maps were not as precise as those produced by the best manual cartography. Only during the 1980s has it become technologically feasible and cost-effective to assemble and use the data required to automate the mapping process. -from Authors

  14. Exercises in Molecular Computing

    PubMed Central

    2014-01-01

    Conspectus The successes of electronic digital logic have transformed every aspect of human life over the last half-century. The word “computer” now signifies a ubiquitous electronic device, rather than a human occupation. Yet evidently humans, large assemblies of molecules, can compute, and it has been a thrilling challenge to develop smaller, simpler, synthetic assemblies of molecules that can do useful computation. When we say that molecules compute, what we usually mean is that such molecules respond to certain inputs, for example, the presence or absence of other molecules, in a precisely defined but potentially complex fashion. The simplest way for a chemist to think about computing molecules is as sensors that can integrate the presence or absence of multiple analytes into a change in a single reporting property. Here we review several forms of molecular computing developed in our laboratories. When we began our work, combinatorial approaches to using DNA for computing were used to search for solutions to constraint satisfaction problems. We chose to work instead on logic circuits, building bottom-up from units based on catalytic nucleic acids, focusing on DNA secondary structures in the design of individual circuit elements, and reserving the combinatorial opportunities of DNA for the representation of multiple signals propagating in a large circuit. Such circuit design directly corresponds to the intuition about sensors transforming the detection of analytes into reporting properties. While this approach was unusual at the time, it has been adopted since by other groups working on biomolecular computing with different nucleic acid chemistries. We created logic gates by modularly combining deoxyribozymes (DNA-based enzymes cleaving or combining other oligonucleotides), in the role of reporting elements, with stem–loops as input detection elements. For instance, a deoxyribozyme that normally exhibits an oligonucleotide substrate recognition region is

  15. Computation in gene networks

    NASA Astrophysics Data System (ADS)

    Ben-Hur, Asa; Siegelmann, Hava T.

    2004-03-01

    Genetic regulatory networks have the complex task of controlling all aspects of life. Using a model of gene expression by piecewise linear differential equations, we show that this process can be considered as a process of computation. This is demonstrated by showing that this model can simulate memory-bounded Turing machines. The simulation is robust with respect to perturbations of the system, an important property for both analog computers and biological systems. Robustness is achieved using a condition that ensures that the model equations, which are generally chaotic, follow a predictable dynamics.
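
    A minimal Python sketch of such a piecewise linear (Glass-type) gene network follows: each concentration decays linearly and is produced at a rate switched by threshold functions of the other genes. The two-gene mutual-repression circuit and all rate constants are illustrative assumptions, not the construction used to simulate Turing machines in the paper.

      # Piecewise linear gene-expression model: linear decay plus thresholded production.
      import numpy as np

      def step(x, theta):
          """On/off regulation: 1 if the concentration exceeds the threshold, else 0."""
          return 1.0 if x > theta else 0.0

      def derivatives(x, k=2.0, gamma=1.0, theta=1.0):
          # Gene 0 is produced when gene 1 is OFF, and vice versa (mutual repression)
          return np.array([
              k * (1.0 - step(x[1], theta)) - gamma * x[0],
              k * (1.0 - step(x[0], theta)) - gamma * x[1],
          ])

      x = np.array([0.2, 1.5])                 # initial concentrations
      dt = 0.01
      for _ in range(2000):                    # simple Euler integration
          x = x + dt * derivatives(x)

      print("steady state:", x)                # one gene settles high (~k/gamma), the other low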

  16. Highly parallel computation

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.; Tichy, Walter F.

    1990-01-01

    Highly parallel computing architectures are the only means to achieve the computation rates demanded by advanced scientific problems. A decade of research has demonstrated the feasibility of such machines, and current research focuses on which architectures are best suited to particular classes of problems. The architectures designated as multiple instruction multiple datastream (MIMD) and single instruction multiple datastream (SIMD) have produced the best results to date; neither shows a decisive advantage for most near-homogeneous scientific problems. For scientific problems with many dissimilar parts, more speculative architectures such as neural networks or data flow may be needed.

  17. [The Computing Teacher. Selected Articles on Computer Literacy.

    ERIC Educational Resources Information Center

    Moursund, David; And Others

    1985-01-01

    This document consists of a compilation of nine articles, on computer literacy, that have been extracted from the 1984-1985 issues of the journal "The Computing Teacher". The articles include: (1) "ICLEP (Individual Computer Literacy Education Plan): A Powerful Idea" (David Moursund); (2) "Computers, Kids, and Values" (Stephen J. Taffee); (3)…

  18. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    ERIC Educational Resources Information Center

    Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratories facilities into a distributed high performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…

  19. Computer Piracy and the Myth of Computer Innocence.

    ERIC Educational Resources Information Center

    Weintraub, William

    1986-01-01

    Outlines the important role that schools have in teaching students ethical considerations in computer use. Describes how the Cupertino Union School District developed a computer curriculum that includes ethics as a major component. Includes computer listings of crime definitions and a computer ethics policy. (MD)

  20. Center for computer security: Computer Security Group conference. Summary

    SciTech Connect

    Not Available

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  1. Computer Availability, Computer Experience and Technophobia among Public School Teachers.

    ERIC Educational Resources Information Center

    Rosen, Larry D.; Weil, Michelle M.

    1995-01-01

    Describes a study that examined technophobia in elementary and secondary public school teachers as an explanation for low levels of computer utilization. Highlights include empirical studies of technophobia; technophobia interventions; demographic differences; computer availability and use; computer anxiety; computer attitudes; and predictive…

  2. The Relationship between Computer Anxiety and Computer Self-Efficacy

    ERIC Educational Resources Information Center

    Simsek, Ali

    2011-01-01

    This study examined the relationship between computer anxiety and computer self-efficacy of students and teachers in elementary and secondary schools. The sample included a total of 845 subjects from two private school systems in Turkey. The Oetting's Computer Anxiety Scale was used to measure computer anxiety whereas the Murphy's Computer…

  3. EDITORIAL: Computational materials science Computational materials science

    NASA Astrophysics Data System (ADS)

    Kahl, Gerhard; Kresse, Georg

    2011-10-01

    Special issue in honour of Jürgen Hafner. On 30 September 2010, Jürgen Hafner, one of the most prominent and influential members within the solid state community, retired. His remarkably broad scientific oeuvre has made him one of the founding fathers of modern computational materials science: more than 600 scientific publications, numerous contributions to books, and a highly cited monograph, which has become a standard reference in the theory of metals, witness not only the remarkable productivity of Jürgen Hafner but also his impact in theoretical solid state physics. In an effort to duly acknowledge Jürgen Hafner's lasting impact in this field, a Festsymposium was held on 27-29 September 2010 at the Universität Wien. The organizers of this symposium (and authors of this editorial) are proud to say that a large number of highly renowned scientists in theoretical condensed matter theory—co-workers, friends and students—accepted the invitation to this celebration of Hafner's jubilee. Some of these speakers also followed our invitation to submit their contribution to this Festschrift, published in Journal of Physics: Condensed Matter, a journal which Jürgen Hafner served in 2000-2003 and 2003-2006 as a member of the Advisory Editorial Board and a member of the Executive Board, respectively. In the subsequent article, Volker Heine, friend and co-worker of Jürgen Hafner over many decades, gives an account of Hafner's impact in the field of theoretical condensed matter physics. Computational materials science contents: 'Theoretical study of structural, mechanical and spectroscopic properties of boehmite (γ-AlOOH)' by D Tunega, H Pašalić, M H Gerzabek and H Lischka; 'Ethylene epoxidation catalyzed by chlorine-promoted silver oxide' by M O Ozbek, I Onal and R A Van Santen; 'First-principles study of Cu2ZnSnS4 and the related band offsets for photovoltaic applications' by A Nagoya, R Asahi and G Kresse; 'Renormalization group study of random quantum magnets' by István A Kovács and

  4. An introduction to computer viruses

    SciTech Connect

    Brown, D.R.

    1992-03-01

    This report on computer viruses is based upon a thesis written for the Master of Science degree in Computer Science from the University of Tennessee in December 1989 by David R. Brown. This thesis is entitled An Analysis of Computer Virus Construction, Proliferation, and Control and is available through the University of Tennessee Library. This paper contains an overview of the computer virus arena that can help the reader to evaluate the threat that computer viruses pose. The extent of this threat can only be determined by evaluating many different factors. These factors include the relative ease with which a computer virus can be written, the motivation involved in writing a computer virus, the damage and overhead incurred by infected systems, and the legal implications of computer viruses, among others. Based upon the research, the development of a computer virus seems to require more persistence than technical expertise. This is a frightening proclamation to the computing community. The education of computer professionals to the dangers that viruses pose to the welfare of the computing industry as a whole is stressed as a means of inhibiting the current proliferation of computer virus programs. Recommendations are made to assist computer users in preventing infection by computer viruses. These recommendations support solid general computer security practices as a means of combating computer viruses.

  5. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2014-12-30

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.
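
    A purely illustrative Python simulation of the claimed sequence follows: each node derives its transmission latency from the root, and when the root's pulse arrives each node adopts that latency as its time base, so all local clocks share the same origin. The tree shape and per-hop latency are invented for the example; no actual barrier or network primitives are used.

      # Illustrative simulation of time-base synchronization over a tree of compute nodes.
      per_hop_latency = 5                          # arbitrary time units per tree hop

      # node -> parent in the collective (tree) network; node 0 is the root
      parent = {0: None, 1: 0, 2: 0, 3: 1, 4: 1, 5: 2}

      def latency_from_root(node):
          """Transmission latency from the root to this node (hops times per-hop latency)."""
          hops = 0
          while parent[node] is not None:
              node = parent[node]
              hops += 1
          return hops * per_hop_latency

      # After the barriers, the root sends a pulse; each node sets its time base to its latency,
      # so every node's clock reads the same value at the same physical instant.
      pulse_sent_at = 1000                          # root's clock when the pulse goes out
      for node in parent:
          latency = latency_from_root(node)
          arrival = pulse_sent_at + latency         # physical time the pulse reaches the node
          time_base = latency                       # value the node adopts on receipt
          print(f"node {node}: pulse arrives at t={arrival}, time base set to {time_base}")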

  6. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2015-01-27

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.

  7. Computationally efficient control allocation

    NASA Technical Reports Server (NTRS)

    Durham, Wayne (Inventor)

    2001-01-01

    A computationally efficient method for calculating near-optimal solutions to the three-objective, linear control allocation problem is disclosed. The control allocation problem is that of distributing the effort of redundant control effectors to achieve some desired set of objectives. The problem is deemed linear if control effectiveness is affine with respect to the individual control effectors. The optimal solution is that which exploits the collective maximum capability of the effectors within their individual physical limits. Computational efficiency is measured by the number of floating-point operations required for solution. The method presented returned optimal solutions in more than 90% of the cases examined; non-optimal solutions returned by the method were typically much less than 1% different from optimal, and the errors tended to become smaller than 0.01% as the number of controls was increased. The magnitude of the errors returned by the present method was much smaller than those that resulted from either pseudoinverse or cascaded generalized inverse solutions. The computational complexity of the method presented varied linearly with increasing numbers of controls; the number of required floating-point operations increased from 5.5 to seven times faster than did the minimum-norm solution (the pseudoinverse), and at about the same rate as did the cascaded generalized inverse solution. The computational requirements of the method presented were much better than those of previously described facet-searching methods, which increase in proportion to the square of the number of controls.
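
    For comparison, the minimum-norm baseline mentioned in the abstract can be sketched directly: the pseudoinverse solution of the linear allocation problem followed by clipping to effector limits. The Python example below is that baseline only, not the patented facet-based method, and the effectiveness matrix, limits, and demanded moments are illustrative numbers.

      # Minimum-norm (pseudoinverse) control allocation with effector limit clipping.
      import numpy as np

      B = np.array([[1.0, 0.8, 0.0, -0.3],        # control effectiveness: 3 moments,
                    [0.0, 0.5, 1.0,  0.2],        # 4 redundant effectors
                    [0.2, 0.0, 0.3,  1.0]])
      u_min, u_max = -0.5, 0.5                    # effector deflection limits (rad)
      d_desired = np.array([0.4, -0.2, 0.3])      # demanded roll/pitch/yaw moments

      u = np.linalg.pinv(B) @ d_desired           # minimum-norm allocation
      u_clipped = np.clip(u, u_min, u_max)        # enforce physical limits

      print("allocation:", np.round(u_clipped, 3))
      # After clipping, the achieved moments may fall short of the demand, which is
      # exactly the shortcoming that more capable allocation methods address.
      print("achieved moments:", np.round(B @ u_clipped, 3), "desired:", d_desired)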

  8. Computers for the Disabled.

    ERIC Educational Resources Information Center

    Lazzaro, Joseph J.

    1993-01-01

    Describes adaptive technology for personal computers that accommodate disabled users and may require special equipment including hardware, memory, expansion slots, and ports. Highlights include vision aids, including speech synthesizers, magnification, braille, and optical character recognition (OCR); hearing adaptations; motor-impaired…

  9. Computational gearing mechanics

    NASA Technical Reports Server (NTRS)

    Huston, Ronald L.

    1992-01-01

    The Final Report on computational gear mechanics is presented. This is an expository report summarizing the research efforts and results. Research on gear geometry, gear stress, and gear dynamics is discussed. Current research and planned future efforts are also discussed. A comprehensive bibliography is presented.

  10. Computing and data processing

    NASA Technical Reports Server (NTRS)

    Smarr, Larry; Press, William; Arnett, David W.; Cameron, Alastair G. W.; Crutcher, Richard M.; Helfand, David J.; Horowitz, Paul; Kleinmann, Susan G.; Linsky, Jeffrey L.; Madore, Barry F.

    1991-01-01

    The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth dimension supercomputers.

  11. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.

    1987-01-01

    The Hypercube Matrix Computation (Year 1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms utilized would be different, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations to solve for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, the problem size possible on the hypercube with 128 megabytes of memory in a 32-node configuration was compared with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.

  12. The Computer and Recreation.

    ERIC Educational Resources Information Center

    Edwards, Paul

    The paper examines the applications of microcomputers to recreation programing for blind persons. The accessibility of microcomputers to this population is discussed, and the advantages as well as disadvantages of speech synthesis equipment are noted. Information is presented on the modification of hardware for Radio Shack and Apple computers.…

  13. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1989-01-01

    Described are the uses of a spreadsheet program to compute the free energy change associated with a reaction; to model acid/base equilibria; and to solve equilibrium constant expressions. Stressed are the advantages derived from having the spreadsheet perform the multiple, trivial calculations. (CW)
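
    The free-energy calculation mentioned is the kind of repetitive evaluation a spreadsheet (or the short Python sketch below) handles easily: the standard free energy change from an equilibrium constant, ΔG° = -RT ln K, computed over several temperatures. The K values are illustrative.

      # Standard free energy change from equilibrium constants at several temperatures.
      import math

      R = 8.314            # gas constant, J / (mol K)
      K_values = {298.15: 1.8e-5, 310.15: 2.3e-5, 323.15: 3.0e-5}   # T (K) -> K_eq (illustrative)

      for T, K in K_values.items():
          dG = -R * T * math.log(K)                 # ΔG° = -RT ln K
          print(f"T = {T:.2f} K, K = {K:.2e}, ΔG° = {dG/1000:.1f} kJ/mol")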

  14. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Collins, Michael J.; Vitz, Ed

    1988-01-01

    Examines two computer interfaced lab experiments: 1) discusses the automation of a Perkin Elmer 337 infrared spectrophotometer noting the mechanical and electronic changes needed; 2) uses the Gouy method and Lotus Measure software to automate magnetic susceptibility determinations. Methodology is described. (MVL)

  15. International computer vision directory

    SciTech Connect

    Flora, P.C.

    1986-01-01

    This book contains information on computerized automation technologies. State-of-the-art computer vision systems for many areas of industrial use are covered. Other topics discussed include the following: automated inspection systems; robot/vision systems; vision process control; cameras (vidicon and solid state); vision peripherals and components; and pattern processors.

  16. Computer/Information Science

    ERIC Educational Resources Information Center

    Birman, Ken; Roughgarden, Tim; Seltzer, Margo; Spohrer, Jim; Stolterman, Erik; Kearsley, Greg; Koszalka, Tiffany; de Jong, Ton

    2013-01-01

    Scholars representing the field of computer/information science were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Ken Birman, Jennifer Rexford, Tim Roughgarden, Margo Seltzer, Jim Spohrer, and…

  17. Computationally modeling interpersonal trust

    PubMed Central

    Lee, Jin Joo; Knox, W. Bradley; Wormwood, Jolie B.; Breazeal, Cynthia; DeSteno, David

    2013-01-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to investigate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust. PMID:24363649
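
    The hidden-Markov-model idea can be illustrated with a small Python sketch: the forward algorithm scores an observed sequence of nonverbal cues under two competing models, one built to emulate a high-trust and one a low-trust interaction. The states, cue alphabet, and all probabilities below are invented for the example and are not the parameters learned in the study.

      # Score a sequence of nonverbal cues under two hand-built HMMs (scaled forward algorithm).
      import numpy as np

      cues = ["lean_back", "face_touch", "cross_arms", "hand_touch"]   # observation alphabet

      def forward_log_likelihood(obs, start, trans, emit):
          """Log-likelihood of an observation sequence under an HMM."""
          alpha = start * emit[:, obs[0]]
          log_like = np.log(alpha.sum())
          alpha /= alpha.sum()
          for o in obs[1:]:
              alpha = (alpha @ trans) * emit[:, o]
              log_like += np.log(alpha.sum())
              alpha /= alpha.sum()
          return log_like

      # Two 2-state models with different tendencies to emit the trust-related cues
      high_trust = dict(
          start=np.array([0.7, 0.3]),
          trans=np.array([[0.8, 0.2], [0.3, 0.7]]),
          emit=np.array([[0.4, 0.1, 0.1, 0.4],
                         [0.3, 0.2, 0.2, 0.3]]),
      )
      low_trust = dict(
          start=np.array([0.3, 0.7]),
          trans=np.array([[0.6, 0.4], [0.2, 0.8]]),
          emit=np.array([[0.2, 0.3, 0.3, 0.2],
                         [0.1, 0.4, 0.4, 0.1]]),
      )

      observed = [cues.index(c) for c in ["cross_arms", "face_touch", "cross_arms", "lean_back"]]
      for name, model in [("high-trust", high_trust), ("low-trust", low_trust)]:
          print(name, forward_log_likelihood(observed, **model))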

  18. Computer Modeling and Simulation

    SciTech Connect

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  19. Cloud Computing Explained

    ERIC Educational Resources Information Center

    Metz, Rosalyn

    2010-01-01

    While many talk about the cloud, few actually understand it. Three organizations' definitions come to the forefront when defining the cloud: Gartner, Forrester, and the National Institutes of Standards and Technology (NIST). Although both Gartner and Forrester provide definitions of cloud computing, the NIST definition is concise and uses…

  20. Computing in the Clouds

    ERIC Educational Resources Information Center

    Johnson, Doug

    2010-01-01

    Web-based applications offer teachers, students, and school districts a convenient way to accomplish a wide range of tasks, from accounting to word processing, for free. Cloud computing has the potential to offer staff and students better services at a lower cost than the technology deployment models they're using now. Saving money and improving…