Science.gov

Sample records for computed tomography-a preliminary

  1. Preliminary Experimental Results on Controlled Cardiac Computed Tomography: A Phantom Study

    PubMed Central

    Lu, Yang; Cai, Zhijun; Wang, Ge; Zhao, Jun; Bai, Er-Wei

    2010-01-01

    In this paper, we present preliminary experimental results on controlled cardiac computed tomography (CT), which aims to reduce motion artifacts by controlling the x-ray source rotation speed. An innovative cardiac phantom enabled us to perform this experiment without modifying the scanner. This is the first cardiac CT experiment with a speed-controlled x-ray source. Experimental results demonstrate that the proposed method successfully separates the phantom images at different phases (improving the temporal resolution) through controlling the x-ray source speed. PMID:19696470

  2. Age estimation by pulp-to-tooth area ratio using cone-beam computed tomography: A preliminary analysis

    PubMed Central

    Rai, Arpita; Acharya, Ashith B.; Naikmasur, Venkatesh G.

    2016-01-01

    Background: Age estimation of living or deceased individuals is an important aspect of forensic sciences. Conventionally, the pulp-to-tooth area ratio (PTR) measured from periapical radiographs has been utilized as a nondestructive method of age estimation. Cone-beam computed tomography (CBCT) is a new method to acquire three-dimensional images of the teeth in living individuals. Aims: The present study investigated age estimation based on the PTR of the maxillary canines measured in three planes obtained from CBCT image data. Settings and Design: Sixty subjects aged 20–85 years were included in the study. Materials and Methods: For each tooth, mid-sagittal, mid-coronal, and three axial sections—at the cementoenamel junction (CEJ), at the one-fourth root level from the CEJ, and at mid-root—were assessed. PTR was calculated using AutoCAD software after outlining the pulp and tooth. Statistical Analysis Used: All statistical analyses were performed using SPSS 17.0 software. Results and Conclusions: Linear regression analysis showed that only the PTR in the axial plane at the CEJ had a significant age correlation (r = 0.32; P < 0.05), probably because of the clearer demarcation of the pulp and tooth outlines at this level. PMID:28123269
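
    A minimal sketch of the kind of PTR-based age regression described above, assuming made-up PTR/age arrays; the helper pulp_to_tooth_area_ratio and all numeric values are illustrative, not the study's data or model.

    ```python
    # Sketch of pulp-to-tooth area ratio (PTR) age regression; all values are placeholders.
    import numpy as np
    from scipy import stats

    def pulp_to_tooth_area_ratio(pulp_area_mm2, tooth_area_mm2):
        """PTR is the ratio of pulp area to total tooth area on a given section."""
        return pulp_area_mm2 / tooth_area_mm2

    # Hypothetical PTR values measured on the axial section at the CEJ, with known ages.
    ages = np.array([22, 28, 35, 41, 47, 55, 63, 70, 78, 84], dtype=float)
    ptr = np.array([0.095, 0.110, 0.102, 0.118, 0.108, 0.125, 0.115, 0.130, 0.122, 0.135])

    # Linear regression of age on PTR (age = a + b * PTR), mirroring the study's analysis step.
    res = stats.linregress(ptr, ages)
    print(f"age ≈ {res.intercept:.1f} + {res.slope:.1f} * PTR  (r = {res.rvalue:.2f})")
    ```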

  3. 99mTc-IgG-Lung Scintigraphy in the Assessment of Pulmonary Involvement in Interstitial Lung Disease and Its Comparison With Pulmonary Function Tests and High-Resolution Computed Tomography: A Preliminary Study

    PubMed Central

    Bahtouee, Mehrzad; Saberifard, Jamshid; Javadi, Hamid; Nabipour, Iraj; Malakizadeh, Hasan; Monavvarsadegh, Gholamhossein; Ilkhani Pak, Hoda; Sadeghi, Azadeh; Assadi, Majid

    2015-01-01

    Background: The discrimination of inactive inflammatory processes from the active form of the disease is of great importance in the management of interstitial lung disease (ILD). Objectives: The aim of this study was to determine the efficacy of the 99mTc-IgG scan for detecting the severity of disease compared to high-resolution computed tomography (HRCT) and pulmonary function tests (PFT). Patients and Methods: Eight known cases of ILD, including four cases of mustard gas (MG) intoxication and four patients with ILD of unknown cause, were included in this study. A population of six patients without lung disease was considered as the control group. The patients underwent PFT and HRCT, followed by a 99mTc-IgG scan, and were followed up for one year. The 99mTc-IgG scan assessment of IgG uptake was accomplished both qualitatively (subjectively) and semiquantitatively. Results: All eight ILD patients demonstrated a strong increase in 99mTc-IgG uptake in the lungs compared to the control patients. The 99mTc-IgG scan scores were higher in the patient group (0.64 [95% confidence interval (CI) = 0.61-0.69]) than in the control group (0.35 [95% CI = 0.28-0.40]; P < 0.05). In patients, a statistically significant positive correlation was detected between the 99mTc-IgG scan and HRCT scores (Spearman's correlation coefficient = 0.92, P < 0.008). The 99mTc-human immunoglobulin (HIG) scores were not significantly correlated with PFT findings (including FVC, FEV1, FEV1/FVC), O2 saturation, or age (P values > 0.05). There were no significant correlations between the 99mTc-IgG score and HRCT patterns, including ground glass opacity, reticular fibrosis, and honeycombing (P > 0.05). Conclusion: The present results confirmed that the 99mTc-IgG scan could be applied to detect the severity of pulmonary involvement, which was well correlated with HRCT findings. These data also showed that the 99mTc-IgG scan might be used as a complement to HRCT in the functional evaluation
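
    A short sketch of the semiquantitative step and the correlation test named above (a lung-to-background uptake ratio, then Spearman correlation against an HRCT severity score). All arrays are placeholders, not the study's measurements.

    ```python
    # Semiquantitative 99mTc-IgG score and its Spearman correlation with HRCT grades.
    import numpy as np
    from scipy.stats import spearmanr

    lung_counts = np.array([520, 610, 480, 700, 660, 590, 540, 630], dtype=float)  # placeholder ROI counts
    bkg_counts = np.array([810, 890, 800, 950, 930, 870, 820, 900], dtype=float)   # placeholder background counts
    uptake_ratio = lung_counts / bkg_counts       # semiquantitative uptake score per patient

    hrct_score = np.array([4, 7, 3, 9, 8, 6, 5, 7], dtype=float)                   # hypothetical HRCT grades

    rho, p = spearmanr(uptake_ratio, hrct_score)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
    ```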

  4. 99mTc-MIBI Lung Scintigraphy in the Assessment of Pulmonary Involvement in Interstitial Lung Disease and Its Comparison With Pulmonary Function Tests and High-Resolution Computed Tomography: A Preliminary Study.

    PubMed

    Bahtouee, Mehrzad; Saberifard, Jamshid; Javadi, Hamid; Nabipour, Iraj; Raeisi, Alireza; Assadi, Majid; Eftekhari, Mohammad

    2015-11-01

    The differentiation of active inflammatory processes from an inactive form of the disease is of great value in the management of interstitial lung disease (ILD). The aim of this investigation was to assess the efficacy of 99mTc-methoxy-isobutyl-isonitrile (99mTc-MIBI) scans in distinguishing the severity of the disease compared to radiological and clinical parameters. In total, 19 known cases of ILD were included in this study and were followed up for 1 year. Five patients without lung disease were considered as the control group. The patients underwent pulmonary function tests (PFTs) and high-resolution computed tomography scans, followed by 99mTc-MIBI scanning. The 99mTc-MIBI scans were analyzed both qualitatively (subjectively) and semiquantitatively. All 19 ILD patients demonstrated a strong increase in 99mTc-MIBI uptake in the lungs compared to the control group. The 99mTc-MIBI scan scores were higher in the patient group in both the early phase (0.24 [0.19-0.31] vs 0.11 [0.10-0.15], P < 0.05) and the delayed phase (0.15 [0.09-0.27] vs 0.04 [0.01-0.09], P < 0.05) compared with the control group. A positive correlation was detected between the 99mTc-MIBI scan and the high-resolution computed tomography (HRCT) scores (Spearman's correlation coefficient = 0.65, P < 0.02) in the early phase but not in the delayed phase in patients (P > 0.14). The 99mTc-MIBI scan scores were not significantly correlated with the PFT findings (P > 0.05). In total, 5 patients died and 14 patients were still alive over the 1-year follow-up period. There was also a significant difference in the uptake intensity of 99mTc-MIBI by outcome in both the early phase (dead: 0.32 [0.29-0.43] vs alive: 0.21 [0.18-0.24], P < 0.05) and the delayed phase (dead: 0.27 [0.22-0.28] vs alive: 0.10 [0.07-0.19], P < 0.05). The washout rate was calculated over ~40 min, from 20 min up to 60 min, and was significantly different between the two study groups (ILD: 46.61 [15.61-50.39] vs NL: 70.91 [27.09-116.36], P = 0.04). The

  5. Nanoparticle Contrast Agents for Computed Tomography: A Focus on Micelles

    PubMed Central

    Cormode, David P.; Naha, Pratap C.; Fayad, Zahi A.

    2014-01-01

    Computed tomography (CT) is an X-ray based whole body imaging technique that is widely used in medicine. Clinically approved contrast agents for CT are iodinated small molecules or barium suspensions. Over the past seven years there has been a great increase in the development of nanoparticles as CT contrast agents. Nanoparticles have several advantages over small molecule CT contrast agents, such as long blood-pool residence times, and the potential for cell tracking and targeted imaging applications. Furthermore, there is a need for novel CT contrast agents, due to the growing population of renally impaired patients and patients hypersensitive to iodinated contrast. Micelles and lipoproteins, a micelle-related class of nanoparticle, have notably been adapted as CT contrast agents. In this review we discuss the principles of CT image formation and the generation of CT contrast. We discuss the progress in developing non-targeted, targeted and cell tracking nanoparticle CT contrast agents. We feature agents based on micelles and used in conjunction with spectral CT. The large contrast agent doses needed will necessitate careful toxicology studies prior to clinical translation. However, the field has seen tremendous advances in the past decade and we expect many more advances to come in the next decade. PMID:24470293

  6. Neutron stimulated emission computed tomography: a Monte Carlo simulation approach.

    PubMed

    Sharma, A C; Harrawood, B P; Bender, J E; Tourassi, G D; Kapadia, A J

    2007-10-21

    A Monte Carlo simulation has been developed for neutron stimulated emission computed tomography (NSECT) using the GEANT4 toolkit. NSECT is a new approach to biomedical imaging that allows spectral analysis of the elements present within the sample. In NSECT, a beam of high-energy neutrons interrogates a sample and the nuclei in the sample are stimulated to an excited state by inelastic scattering of the neutrons. The characteristic gammas emitted by the excited nuclei are captured in a spectrometer to form multi-energy spectra. Currently, a tomographic image is formed using a collimated neutron beam to define the line integral paths for the tomographic projections. These projection data are reconstructed to form a representation of the distribution of individual elements in the sample. To facilitate the development of this technique, a Monte Carlo simulation model has been constructed from the GEANT4 toolkit. This simulation includes modeling of the neutron beam source and collimation, the samples, the neutron interactions within the samples, the emission of characteristic gammas, and the detection of these gammas in a Germanium crystal. In addition, the model allows the absorbed radiation dose to be calculated for internal components of the sample. NSECT presents challenges not typically addressed in Monte Carlo modeling of high-energy physics applications. In order to address issues critical to the clinical development of NSECT, this paper will describe the GEANT4 simulation environment and three separate simulations performed to accomplish three specific aims. First, comparison of a simulation to a tomographic experiment will verify the accuracy of both the gamma energy spectra produced and the positioning of the beam relative to the sample. Second, parametric analysis of simulations performed with different user-defined variables will determine the best way to effectively model low energy neutrons in tissue, which is a concern with the high hydrogen content in

  7. Column flotation monitoring based on electrical capacitance volume tomography: A preliminary study

    NASA Astrophysics Data System (ADS)

    Haryono, Didied; Harjanto, Sri; Nugraha, Harisma; Huda, Mahfudz Al; Taruno, Warsito Purwo

    2017-01-01

    A preliminary study of monitoring the column flotation process using electrical capacitance volume tomography (ECVT) was conducted. ECVT is a monitoring technique based on capacitance measurements; it is used to understand the phenomena occurring inside the column through three-dimensional (3-D) images. A linear back projection (LBP) algorithm was used to reconstruct the 3-D ECVT images from all measurement data obtained in this study. As a preliminary study, the effect of gas injection in a two-phase (liquid and gas) system was examined to assess the feasibility of the ECVT system for monitoring the column flotation process. The experiments were conducted using a flotation column 5 cm in diameter and 150 cm in height, with a sparger installed at the bottom of the column to inject air. A 32-channel rectangular ECVT sensor was installed 13 cm above the sparger and placed around the column. Gas injection rates of 2-7 l/min, in 1 l/min increments, were used, and all experiments were conducted at room temperature. Based on the signal and image analysis, the signals and 3-D ECVT images showed differences when the gas injection was varied: an increase in gas injection decreased the fluctuation of the signal intensity, which was reflected in the 3-D ECVT images. The average signals obtained by ECVT over the gas injection variations ranged from 440.09 to 453.62 mV, from high to low gas injection. Based on these results, ECVT has promise as an imaging tool for monitoring the column flotation process, and analysis of the 3-D images generated by the ECVT system may allow metallurgical performance to be assessed in further research.
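
    A minimal sketch of the linear back projection (LBP) reconstruction step mentioned above, assuming a random placeholder sensitivity matrix; in practice the sensitivity maps come from an electrostatic model of the 32-channel sensor, and the grid size here is arbitrary.

    ```python
    # Linear back projection (LBP) for capacitance volume tomography (toy example).
    import numpy as np

    rng = np.random.default_rng(0)
    n_meas, n_vox = 496, 20 * 20 * 20          # 32 electrodes -> 32*31/2 = 496 pairs; 20^3 voxel grid
    S = rng.random((n_meas, n_vox))            # placeholder sensitivity matrix

    c_low = rng.random(n_meas)                 # calibration frame: column filled with low-permittivity phase
    c_high = c_low + 0.5 + rng.random(n_meas)  # calibration frame: high-permittivity phase
    c_meas = c_low + 0.4 * (c_high - c_low)    # a measurement frame somewhere in between

    lam = (c_meas - c_low) / (c_high - c_low)        # normalized capacitances in [0, 1]
    g = (S.T @ lam) / (S.T @ np.ones(n_meas))        # LBP: back-project and normalize
    volume = g.reshape(20, 20, 20)                   # 3-D phase-fraction estimate
    print(volume.mean())
    ```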

  8. Standardized medical terminology for cardiac computed tomography: a report of the Society of Cardiovascular Computed Tomography.

    PubMed

    Weigold, Wm Guy; Abbara, Suhny; Achenbach, Stephan; Arbab-Zadeh, Armin; Berman, Daniel; Carr, J Jeffrey; Cury, Ricardo C; Halliburton, Sandra S; McCollough, Cynthia H; Taylor, Allen J

    2011-01-01

    Since the emergence of cardiac computed tomography (CT) at the turn of the 21st century, there has been an exponential growth in research and clinical development of the technique, with contributions from investigators and clinicians from varied backgrounds: physics and engineering, informatics, cardiology, and radiology. However, terminology for the field is not unified. As a consequence, there are multiple abbreviations for some terms, multiple terms for some concepts, and some concepts that lack clear definitions and/or usage. In an effort to aid the work of all those who seek to contribute to the literature, clinical practice, and investigation of the field, the Society of Cardiovascular Computed Tomography sets forth a standard set of medical terms commonly used in clinical and investigative practice of cardiac CT.

  9. Upper crustal structure beneath East Java from ambient noise tomography: A preliminary result

    SciTech Connect

    Martha, Agustya Adi; Widiyantoro, Sri; Cummins, Phil; Saygin, Erdinc; Masturyono

    2015-04-24

    East Java has a fairly complex geological structure. Physiographically, East Java can be divided into three zones, i.e. the Southern Mountains zone in the southern part, the Kendeng zone in the middle part, and the Rembang zone in the northern part. Most of the seismic hazards in this region are due to processes in the upper crust. In this study, the Ambient Noise Tomography (ANT) method is used to image the upper crustal structure beneath East Java. We have used seismic waveform data recorded by 8 Meteorological, Climatological and Geophysical Agency (BMKG) stationary seismographic stations and 16 portable seismographs installed for 2 to 8 weeks. The data were processed to obtain waveforms from noise cross-correlation between pairs of seismographic stations. Our preliminary results indicate that the Kendeng zone, an area of low gravity anomaly, is associated with a low velocity zone. On the other hand, the southern mountain range, which has a high gravity anomaly, is related to a high velocity anomaly as shown by our tomographic images.

  10. Upper crustal structure beneath East Java from ambient noise tomography: A preliminary result

    NASA Astrophysics Data System (ADS)

    Martha, Agustya Adi; Widiyantoro, Sri; Cummins, Phil; Saygin, Erdinc; Masturyono

    2015-04-01

    East Java has a fairly complex geological structure. Physiographically, East Java can be divided into three zones, i.e. the Southern Mountains zone in the southern part, the Kendeng zone in the middle part, and the Rembang zone in the northern part. Most of the seismic hazards in this region are due to processes in the upper crust. In this study, the Ambient Noise Tomography (ANT) method is used to image the upper crustal structure beneath East Java. We have used seismic waveform data recorded by 8 Meteorological, Climatological and Geophysical Agency (BMKG) stationary seismographic stations and 16 portable seismographs installed for 2 to 8 weeks. The data were processed to obtain waveforms from noise cross-correlation between pairs of seismographic stations. Our preliminary results indicate that the Kendeng zone, an area of low gravity anomaly, is associated with a low velocity zone. On the other hand, the southern mountain range, which has a high gravity anomaly, is related to a high velocity anomaly as shown by our tomographic images.
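
    A small sketch of the noise cross-correlation step underlying ambient noise tomography: cross-correlating simultaneous records from two stations to estimate the inter-station travel time. The synthetic traces, sampling rate, and delay below are placeholders.

    ```python
    # Toy ambient-noise cross-correlation between two stations.
    import numpy as np
    from scipy.signal import correlate, correlation_lags

    fs = 20.0                                   # sampling rate (Hz), assumed
    t = np.arange(0, 3600, 1 / fs)              # one hour of data
    rng = np.random.default_rng(1)
    noise = rng.standard_normal(t.size)

    # Station B sees a delayed, attenuated version of the same noise field (toy model).
    delay_s = 12.0
    sta_a = noise
    sta_b = 0.6 * np.roll(noise, int(delay_s * fs)) + 0.3 * rng.standard_normal(t.size)

    xcorr = correlate(sta_b, sta_a, mode="full")
    lags = correlation_lags(sta_b.size, sta_a.size, mode="full") / fs
    print(f"peak at lag = {lags[np.argmax(xcorr)]:.2f} s")   # ~ +12 s travel-time estimate
    ```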

  11. Coronary artery imaging with multidetector computed tomography: a call for an evidence-based, multidisciplinary approach.

    PubMed

    Schoenhagen, Paul; Stillman, Arthur E; Garcia, Mario J; Halliburton, Sandra S; Tuzcu, E Murat; Nissen, Steven E; Modic, Michael T; Lytle, Bruce W; Topol, Eric J; White, Richard D

    2006-05-01

    Modern multidetector computed tomography systems are capable of a comprehensive assessment of the cardiovascular system, including noninvasive assessment of coronary anatomy. Multidetector computed tomography is expected to advance the role of noninvasive imaging for coronary artery disease, but clinical experience is still limited. Clinical guidelines are necessary to standardize scanner technology and appropriate clinical applications for coronary computed tomographic angiography. Further evaluation of this evolving technology will benefit from cooperation between different medical specialties, imaging scientists, and manufacturers of multidetector computed tomography systems, supporting multidisciplinary teams focused on the diagnosis and treatment of early and advanced stages of coronary artery disease. This cooperation will provide the necessary education, training, and guidelines for physicians and technologists assuring standard of care for their patients.

  12. Diffraction scattering computed tomography: a window into the structures of complex nanomaterials

    PubMed Central

    Birkbak, M. E.; Leemreize, H.; Frølich, S.; Stock, S. R.

    2015-01-01

    Modern functional nanomaterials and devices are increasingly composed of multiple phases arranged in three dimensions over several length scales. Therefore there is a pressing demand for improved methods for structural characterization of such complex materials. An excellent emerging technique that addresses this problem is diffraction/scattering computed tomography (DSCT). DSCT combines the merits of diffraction and/or small angle scattering with computed tomography to allow imaging the interior of materials based on the diffraction or small angle scattering signals. This allows, e.g., one to distinguish the distributions of polymorphs in complex mixtures. Here we review this technique and give examples of how it can shed light on modern nanoscale materials. PMID:26505175

  13. Virtual tomography: a new approach to efficient human-computer interaction for medical imaging

    NASA Astrophysics Data System (ADS)

    Teistler, Michael; Bott, Oliver J.; Dormeier, Jochen; Pretschner, Dietrich P.

    2003-05-01

    By utilizing virtual reality (VR) technologies, the computer system virtusMED implements the concept of virtual tomography for exploring medical volumetric image data. Photographic data from a virtual patient as well as CT or MRI data from real patients are visualized within a virtual scene. The view of this scene is determined either by a conventional computer mouse, a head-mounted display or a freely movable flat panel. A virtual examination probe is used to generate oblique tomographic images which are computed from the given volume data. In addition, virtual models can be integrated into the scene, such as anatomical models of bones and inner organs. virtusMED has been shown to be a valuable tool for learning human anatomy and for understanding the principles of medical imaging such as sonography. Furthermore, its utilization to improve CT- and MRI-based diagnosis is very promising. Compared to VR systems of the past, the standard PC-based system virtusMED is a cost-efficient and easily maintained solution providing a highly intuitive, time-saving user interface for medical imaging.
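
    A minimal sketch of the "virtual examination probe" idea: resampling an oblique plane from a volume with trilinear interpolation. The volume, plane position, and orientation below are arbitrary placeholders, and oblique_slice is a hypothetical helper, not part of virtusMED.

    ```python
    # Extract an oblique tomographic slice from a volume via trilinear interpolation.
    import numpy as np
    from scipy.ndimage import map_coordinates

    vol = np.random.rand(128, 128, 128).astype(np.float32)   # placeholder volume, index order (z, y, x)

    def oblique_slice(volume, center, u, v, size=64, spacing=1.0):
        """Sample a size x size plane centred at `center`, spanned by unit vectors u and v."""
        u = np.asarray(u, dtype=float) / np.linalg.norm(u)
        v = np.asarray(v, dtype=float) / np.linalg.norm(v)
        s = (np.arange(size) - size / 2) * spacing
        a, b = np.meshgrid(s, s, indexing="ij")
        pts = center + a[..., None] * u + b[..., None] * v    # (size, size, 3) voxel coordinates
        coords = pts.reshape(-1, 3).T                         # shape (3, N) for map_coordinates
        plane = map_coordinates(volume, coords, order=1, mode="nearest")
        return plane.reshape(size, size)

    img = oblique_slice(vol, center=np.array([64.0, 64.0, 64.0]), u=[1, 0, 0], v=[0, 0.7, 0.7])
    print(img.shape)
    ```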

  14. High precision semi-automated vertebral height measurement using computed tomography: A phantom study.

    PubMed

    Tan, Sovira; Yao, Jianhua; Yao, Lawrence; Ward, Michael M

    2012-01-01

    The measurement of vertebral heights is necessary for the evaluation of many disorders affecting the spine. High precision is particularly important for longitudinal studies where subtle changes are to be detected. Computed tomography (CT) is the modality of choice for high-precision studies. Radiography and dual-energy X-ray absorptiometry (DXA) use 2D images to assess 3D structures, which can result in poor visualization due to the superimposition of extraneous anatomical objects in the same 2D space. We present a semi-automated computer algorithm to measure vertebral heights in the 3D space of a CT scan. The algorithm segments the vertebral bodies, extracts their end plates and computes vertebral heights as the mean distance between end plates. We evaluated the precision of our algorithm using repeat scans of an anthropomorphic vertebral phantom. Our method has high precision, with a coefficient of variation of only 0.197% and Bland-Altman 95% limits of agreement of [-0.11, 0.13] mm. For local heights (anterior, middle, posterior) the algorithm was up to 4.2 times more precise than a manual mid-sagittal plane method.
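
    A brief sketch of the two precision metrics quoted above (coefficient of variation and Bland-Altman 95% limits of agreement) computed from repeat measurements; the height values are illustrative, not the phantom data.

    ```python
    # Precision metrics for repeat vertebral height measurements (placeholder values).
    import numpy as np

    scan1 = np.array([24.12, 23.98, 25.04, 24.55, 23.87])   # heights (mm), first scan
    scan2 = np.array([24.15, 23.95, 25.09, 24.50, 23.90])   # heights (mm), repeat scan

    pairs = np.stack([scan1, scan2])                         # shape: (2 scans, n heights)
    cv_percent = 100 * (pairs.std(axis=0, ddof=1) / pairs.mean(axis=0)).mean()

    diff = scan2 - scan1
    loa = (diff.mean() - 1.96 * diff.std(ddof=1), diff.mean() + 1.96 * diff.std(ddof=1))
    print(f"CV = {cv_percent:.3f}%, Bland-Altman 95% LoA = [{loa[0]:.2f}, {loa[1]:.2f}] mm")
    ```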

  15. Stafne bone cavity and cone-beam computed tomography: a report of two cases.

    PubMed

    Venkatesh, Elluru

    2015-06-01

    In 1942, Stafne reported 35 asymptomatic, radiolucent cavities that were unilaterally located in the posterior region of the mandible between the mandibular angle and the third molar, and below the mandibular canal. The term Stafne bone cavity (SBC) is now used for such asymptomatic lingual bone depressions of the lower jaw. Since then there have been many reports of SBCs, but very few studies have used cone-beam computed tomography (CBCT) for their diagnosis. The aim of this paper is to describe the clinical and radiological characteristics of two cases of SBCs and the importance of limited CBCT in confirming the diagnosis.

  16. Stimulated dual-band infrared computed tomography: A tool to inspect the aging infrastructure

    SciTech Connect

    Del Grande, N.K.; Durbin, P.F.

    1995-06-27

    The authors have developed stimulated dual-band infrared (IR) computed tomography as a tool to inspect the aging infrastructure. The system has the potential to locate and quantify structural damage within airframes and bridge decks. Typically, dual-band IR detection methods improve the signal-to-noise ratio by a factor of ten, compared to single-band IR detection methods. They conducted a demonstration at Boeing using a uniform pulsed-heat source to stimulate IR images of hidden defects in the 727 fuselage. The dual-band IR camera and image processing system produced temperature, thermal inertia, and cooling-rate maps. In combination, these maps characterized the defect site, size, depth, thickness and type. The authors quantified the percent metal loss from corrosion above a threshold of 5%, with overall uncertainties of 3%. Also, they conducted a feasibility study of dual-band IR thermal imaging for bridge deck inspections. They determined the sites and relative concrete displacement of 2-in. and 4-in. deep delaminations from thin styrofoam implants in asphalt-covered concrete slabs. They demonstrated the value of dual-band IR computed tomography to quantify structural damage within flash-heated airframes and naturally-heated bridge decks.

  17. Radiation doses in cone-beam breast computed tomography: A Monte Carlo simulation study

    SciTech Connect

    Yi Ying; Lai, Chao-Jen; Han Tao; Zhong Yuncheng; Shen Youtao; Liu Xinming; Ge Shuaiping; You Zhicheng; Wang Tianpeng; Shaw, Chris C.

    2011-02-15

    Purpose: In this article, we describe a method to estimate the spatial dose variation, average dose and mean glandular dose (MGD) for a real breast using Monte Carlo simulation based on cone beam breast computed tomography (CBBCT) images. We present and discuss the dose estimation results for 19 mastectomy breast specimens, 4 homogeneous breast models, 6 ellipsoidal phantoms, and 6 cylindrical phantoms. Methods: To validate the Monte Carlo method for dose estimation in CBBCT, we compared the Monte Carlo dose estimates with the thermoluminescent dosimeter measurements at various radial positions in two polycarbonate cylinders (11- and 15-cm in diameter). Cone-beam computed tomography (CBCT) images of 19 mastectomy breast specimens, obtained with a bench-top experimental scanner, were segmented and used to construct 19 structured breast models. Monte Carlo simulation of CBBCT with these models was performed and used to estimate the point doses, average doses, and mean glandular doses for unit open air exposure at the iso-center. Mass based glandularity values were computed and used to investigate their effects on the average doses as well as the mean glandular doses. Average doses for 4 homogeneous breast models were estimated and compared to those of the corresponding structured breast models to investigate the effect of tissue structures. Average doses for ellipsoidal and cylindrical digital phantoms of identical diameter and height were also estimated for various glandularity values and compared with those for the structured breast models. Results: The absorbed dose maps for structured breast models show that doses in the glandular tissue were higher than those in the nearby adipose tissue. Estimated average doses for the homogeneous breast models were almost identical to those for the structured breast models (p=1). Normalized average doses estimated for the ellipsoidal phantoms were similar to those for the structured breast models (root mean square (rms

  18. Rare appearance of an odontogenic myxoma in cone-beam computed tomography: a case report

    PubMed Central

    Dabbaghi, Arash; Nikkerdar, Nafiseh; Bayati, Soheyla; Golshah, Amin

    2016-01-01

    Odontogenic myxoma (OM) is an infiltrative benign bone tumor that occurs almost exclusively in the facial skeleton. The radiographic characteristics of odontogenic myxoma may produce several patterns, making diagnosis difficult. Cone-beam computed tomography (CBCT) may prove extremely useful in clarifying the intraosseous extent of the tumor and its effects on surrounding structures. Here, we report a case of odontogenic myxoma of the mandible in a 27-year-old female. The patient exhibited a slight swelling in the left mandible. Surgical resection was performed. No recurrence was noted. In the CBCT sections, we observed perforation of the cortical plate and a radiopaque line extending from the periosteum, resembling a "sunray" appearance—a rare feature of OM—which could not be assessed by panoramic radiography. PMID:27092217

  19. Neutron Stimulated Emission Computed Tomography: A New Technique for Spectroscopic Medical Imaging

    NASA Astrophysics Data System (ADS)

    Kapadia, A. J.

    Neutron stimulated emission computed tomography (NSECT) is being developed as a new medical-imaging technique to quantify spatial distributions of elements in a sample through inelastic scattering of fast neutrons and detection of the resulting gamma rays. It has the potential to diagnose several disorders in the human body that are characterized by changes in element concentration in the diseased tissue. NSECT is sensitive to several naturally occurring elements in the human body that demonstrate concentration changes in the presence of diseases. NSECT, therefore, has the potential to noninvasively diagnose such disorders with radiation dose that is comparable to other ionizing imaging modalities. This chapter discusses the development and progress of NSECT and presents an overview of the current status of the imaging technique.

  20. Head trauma evaluated by magnetic resonance and computed tomography: a comparison

    SciTech Connect

    Han, J.G.; Kaufman, B.; Alfidi, R.J.

    1984-01-01

    Magnetic resonance (MR) images and computed tomograms of 25 patients with head trauma were compared. MR proved to be superior in many ways for demonstrating extracerebral as well as intracerebral traumatic lesions. Isodense subdural hematomas, which present a diagnostic dilemma on CT images, were clearly seen on MR regardless of their varying CT densities. In a case of epidural hematoma, the dura mater was shown directly as nearly devoid of signal on MR. Direct coronal images provided excellent visualization of extracerebral collections along the peritentorial space and subtemporal area. In a patient with intracerebral hematoma, CT failed to demonstrate residual parenchymal changes in a 3-month follow-up study, but MR clearly depicted the abnormalities. The superiority of MR over CT was also well illustrated in a patient with post-traumatic osteomyelitis of the calvarium.

  1. [Dynamic cerebral computed tomography. A contribution to the nosology of cerebral space-occupying processes?].

    PubMed

    Westphal, M

    1983-12-01

    Dynamic cerebral studies were carried out in 21 patients with cerebral abnormalities. Thirteen had tumours and eight showed vascular abnormalities. In most cases the diagnosis was confirmed by histology, but occasionally by angiography or by computed tomography and the clinical course. Dynamic cerebral studies were performed, involving the production of concentration-time curves following bolus injection of ordinary contrast medium. The type of contrast enhancement gave a better indication of the nature of the lesion. The method can be used together with the more common type of investigations, such as plain scans and contrast scans. The small number of patients requires further studies with larger numbers; for this a multi-centric study would be suitable.

  2. Measurement of breast tissue composition with dual energy cone-beam computed tomography: A postmortem study

    PubMed Central

    Ding, Huanjun; Ducote, Justin L.; Molloi, Sabee

    2013-01-01

    Purpose: To investigate the feasibility of a three-material compositional measurement of water, lipid, and protein content of breast tissue with dual kVp cone-beam computed tomography (CT) for diagnostic purposes. Methods: Simulations were performed on a flat panel-based computed tomography system with a dual kVp technique in order to guide the selection of experimental acquisition parameters. The expected errors induced by using the proposed calibration materials were also estimated by simulation. Twenty pairs of postmortem breast samples were imaged with a flat-panel based dual kVp cone-beam CT system, followed by image-based material decomposition using calibration data obtained from a three-material phantom consisting of water, vegetable oil, and polyoxymethylene plastic. The tissue samples were then chemically decomposed into their respective water, lipid, and protein contents after imaging to allow direct comparison with data from dual energy decomposition. Results: Guided by results from simulation, the beam energies for the dual kVp cone-beam CT system were selected to be 50 and 120 kVp with the mean glandular dose divided equally between each exposure. The simulation also suggested that the use of polyoxymethylene as the calibration material for the measurement of pure protein may introduce an error of −11.0%. However, the tissue decomposition experiments, which employed a calibration phantom made out of water, oil, and polyoxymethylene, exhibited strong correlation with data from the chemical analysis. The average root-mean-square percentage error for water, lipid, and protein contents was 3.58% as compared with chemical analysis. Conclusions: The results of this study suggest that the water, lipid, and protein contents can be accurately measured using dual kVp cone-beam CT. The tissue compositional information may improve the sensitivity and specificity for breast cancer diagnosis. PMID:23718593

  3. Measurement of breast tissue composition with dual energy cone-beam computed tomography: A postmortem study

    SciTech Connect

    Ding Huanjun; Ducote, Justin L.; Molloi, Sabee

    2013-06-15

    Purpose: To investigate the feasibility of a three-material compositional measurement of water, lipid, and protein content of breast tissue with dual kVp cone-beam computed tomography (CT) for diagnostic purposes. Methods: Simulations were performed on a flat panel-based computed tomography system with a dual kVp technique in order to guide the selection of experimental acquisition parameters. The expected errors induced by using the proposed calibration materials were also estimated by simulation. Twenty pairs of postmortem breast samples were imaged with a flat-panel based dual kVp cone-beam CT system, followed by image-based material decomposition using calibration data obtained from a three-material phantom consisting of water, vegetable oil, and polyoxymethylene plastic. The tissue samples were then chemically decomposed into their respective water, lipid, and protein contents after imaging to allow direct comparison with data from dual energy decomposition. Results: Guided by results from simulation, the beam energies for the dual kVp cone-beam CT system were selected to be 50 and 120 kVp with the mean glandular dose divided equally between each exposure. The simulation also suggested that the use of polyoxymethylene as the calibration material for the measurement of pure protein may introduce an error of -11.0%. However, the tissue decomposition experiments, which employed a calibration phantom made out of water, oil, and polyoxymethylene, exhibited strong correlation with data from the chemical analysis. The average root-mean-square percentage error for water, lipid, and protein contents was 3.58% as compared with chemical analysis. Conclusions: The results of this study suggest that the water, lipid, and protein contents can be accurately measured using dual kVp cone-beam CT. The tissue compositional information may improve the sensitivity and specificity for breast cancer diagnosis.
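
    A minimal sketch of image-based three-material decomposition with two beam energies plus a volume-conservation constraint, the general scheme described in both records above. The calibration matrix values are placeholders, not the study's 50/120 kVp calibration data.

    ```python
    # Two-energy, three-material (water/lipid/protein) decomposition for one voxel.
    import numpy as np

    # Columns: pure water, pure lipid, pure protein (calibrated basis values, placeholders).
    # Rows: effective attenuation at the low kVp, at the high kVp, and the constraint sum(f) = 1.
    A = np.array([[1.00, 0.75, 1.30],
                  [1.00, 0.90, 1.10],
                  [1.00, 1.00, 1.00]])

    def decompose(mu_low, mu_high):
        """Return (water, lipid, protein) volume fractions for one voxel."""
        b = np.array([mu_low, mu_high, 1.0])
        return np.linalg.solve(A, b)

    print(decompose(0.955, 0.98))   # -> approximately [0.6, 0.3, 0.1] with these placeholder values
    ```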

  4. Image-Guided Drug Delivery with Single-Photon Emission Computed Tomography: A Review of Literature

    PubMed Central

    Chakravarty, Rubel; Hong, Hao; Cai, Weibo

    2014-01-01

    Tremendous resources are being invested all over the world for prevention, diagnosis, and treatment of various types of cancer. Successful cancer management depends on accurate diagnosis of the disease along with a precise therapeutic protocol. Conventional systemic drug delivery approaches generally cannot completely remove the competent cancer cells without surpassing the toxicity limits to normal tissues. Therefore, development of efficient drug delivery systems holds prime importance in medicine and healthcare. Also, molecular imaging can play an increasingly important and revolutionizing role in disease management. Synergistic use of molecular imaging and targeted drug delivery approaches provides unique opportunities in a relatively new area called 'image-guided drug delivery' (IGDD). Single-photon emission computed tomography (SPECT) is the most widely used nuclear imaging modality in the clinical context and is increasingly being used to guide targeted therapeutics. Innovations in material science have fueled the development of efficient drug carriers based on polymers, liposomes, micelles, dendrimers, microparticles, nanoparticles, etc. Efficient utilization of these drug carriers along with SPECT imaging technology has the potential to transform patient care by personalizing therapy to the individual patient, lessening the invasiveness of conventional treatment procedures and rapidly monitoring the therapeutic efficacy. SPECT-IGDD is not only effective for treatment of cancer but might also find utility in the management of several other diseases. Herein, we provide a concise overview of the latest advances in SPECT-IGDD procedures and discuss the challenges and opportunities for advancement of the field. PMID:25182469

  5. [Brain abscesses. Value of computed tomography. A review of seven cases (author's transl)].

    PubMed

    Leriche, B; Boucetta, M; Jourdan, P; Desgeorges, M

    1980-01-01

    Based on a series of seven cases of subtentorial abscess, the authors analyze the results of different methods of exploration. Though in certain clinical conditions (intracranial hypertension and meningeal infections) the etiology is of no consequence, in most cases a definite diagnosis of a space-occupying lesion can be made by the use of EEG, arteriography, and scintigraphy examinations, without establishing the precise nature of the affection. As expected, computed tomography appears to be the most reliable examination. Diagnosis was confirmed by this method in 6 of the 7 cases, and it also enabled the number, size, and location of the lesions to be determined. Typical appearances after injection of an iodized contrast medium revealed the development of an abscess following the intracerebral infection and determined the time for surgical intervention. In spite of intensive care and antibiotic therapy, an abscess remains a "delayed-action bomb" with a poor prognosis, requiring drainage or surgical excision as soon as conditions are appropriate. The mortality and morbidity of this rare, and therefore poorly recognized, affection should improve with systematic use of the scanner during meningeal infections.

  6. Assessment of Mandibular Distraction Regenerate Using Ultrasonography and Cone Beam Computed Tomography: A Clinical Study

    PubMed Central

    Dabas, Jitender; Mohanty, Sujata; Chaudhary, Zainab; Rani, Amita

    2015-01-01

    Distraction osteogenesis (DO) is becoming a popular method of reconstruction for maxillofacial bony deformities or defects secondary to trauma or surgical tumor ablation. However, the technique is very sensitive in terms of the rate and rhythm of distraction. Because of this, there is a need for monitoring of the distraction regenerate during the distraction as well as the consolidation period. The present study was conducted to assess the regenerate using two imaging modalities, namely, ultrasonography (USG) and cone beam computed tomography (CBCT) to determine their relative efficacies and to weigh their clinical usefulness in assessment of DO regenerate. The study was conducted on 12 patients (18 sites) who underwent mandibular distraction for correction of facial deformities. The results showed that overall USG correlated better with the condition of regenerate (r = 0.606) as compared with CBCT (r = 0.476). However, USG was less effective as compared with CBCT in assessing the regenerate once corticomedullary differentiation occurred in the bone. PMID:26889351

  7. Computed tomography: a powerful imaging technique in the fields of dimensional metrology and quality control

    NASA Astrophysics Data System (ADS)

    Probst, Gabriel; Boeckmans, Bart; Dewulf, Wim; Kruth, Jean-Pierre

    2016-05-01

    X-ray computed tomography (CT) is slowly conquering its space in the manufacturing industry for dimensional metrology and quality control purposes. Its main advantage is its non-invasive and non-destructive character. Currently, CT is the only measurement technique that allows full 3D visualization of both inner and outer features of an object through a contactless probing system. Using hundreds of radiographs, acquired while rotating the object, a 3D representation is generated and dimensions can be verified. In this research, this non-contact technique was used for the inspection of assembled components: a dental cast model with 8 implants connected by a screw-retained titanium bar. The retained bar includes a mating interface connection that should ensure a perfect fit without residual stresses when the connection is fixed with screws. CT was used to inspect the mating interfaces between these two components. Gaps at the connections can lead to bacterial growth and potential inconvenience for the patient, who would have to face new surgery to replace his/her prosthesis. With the aid of CT, flaws in the design or manufacturing process that could lead to gaps at the connections could be assessed.

  8. Three-dimensional maxillary and mandibular regional superimposition using cone beam computed tomography: a validation study.

    PubMed

    Koerich, L; Burns, D; Weissheimer, A; Claus, J D P

    2016-05-01

    This study aimed to validate a novel method for fast regional superimposition of cone beam computed tomography (CBCT) scans. The method can be used with smaller field of view scans, thereby allowing for a lower radiation dose. This retrospective study used two dry skulls and secondary data from 15 patients who had more than one scan taken using the same machine. Two observers tested two types of regional voxel-based superimposition: maxillary and mandibular. The registration took 10-15 s. Three-dimensional surface models of the maxillas and mandibles were generated via standardized threshold segmentation, and the accuracy and reproducibility of the superimpositions were assessed using the iterative closest point technique to measure the root mean square (RMS) distance between the images. Five areas were measured, and an RMS ≤ 0.25 was considered successful. Descriptive statistics and the intra-class correlation coefficient (ICC) were used to compare the intra-observer measurement reproducibility. The ICC was ≥ 0.980 for all of the variables and the highest RMS found was 0.241. The inter-observer reproducibility was assessed case by case and was perfect (RMS 0) for 68% (23 out of 34) of the superimpositions and not clinically significant (RMS ≤ 0.25) for the other 32%. The method is fast, accurate, and reproducible, and is an alternative to cranial base superimposition.
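
    A small sketch of the RMS surface-distance check used to grade each superimposition: nearest-neighbour distances from one segmented surface to the other, then their root mean square. The point clouds below are synthetic stand-ins for the threshold-segmented surface models.

    ```python
    # RMS closest-point distance between two registered surface models (toy data).
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(2)
    surface_t0 = rng.random((5000, 3)) * 50.0                          # vertices (mm) of first model
    surface_t1 = surface_t0 + rng.normal(0, 0.05, surface_t0.shape)    # registered second model

    d, _ = cKDTree(surface_t1).query(surface_t0)                       # closest-point distances
    rms = np.sqrt(np.mean(d ** 2))
    print(f"RMS = {rms:.3f}  ->  {'success' if rms <= 0.25 else 'review'}")
    ```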

  9. The 100 most-cited original articles in cardiac computed tomography: A bibliometric analysis.

    PubMed

    O'Keeffe, Michael E; Hanna, Tarek N; Holmes, Davis; Marais, Olivia; Mohammed, Mohammed F; Clark, Sheldon; McLaughlin, Patrick; Nicolaou, Savvas; Khosa, Faisal

    2016-01-01

    Bibliometric analysis is the application of statistical methods to analyze quantitative data about scientific publications. It can evaluate research performance, author productivity, and manuscript impact. To the best of our knowledge, no bibliometric analysis has focused on cardiac computed tomography (CT). The purpose of this paper was to compile a list of the 100 most-cited articles related to the cardiac CT literature using Scopus and Web of Science (WOS). A list of the 100 most-cited articles was compiled in order of citation frequency, as well as a list of the top 10 most-cited guideline and review articles and the 20 most-cited articles of the years 2014-2015. The database of the 100 most-cited articles was analyzed to identify characteristics of highly cited publications. For each manuscript, the number of authors, study design, size of patient cohort and departmental affiliations were cataloged. The 100 most-cited articles were published from 1990 to 2012, with the majority (53) published between 2005 and 2009. The total number of citations varied from 3354 to 196, and the number of citations per year varied from 9.5 to 129.0, with a median and mean of 30.9 and 38.7, respectively. The majority of publications had a patient sample size of 200 or less. The USA and Germany were the nations with the highest number of frequently cited publications. This bibliometric analysis provides insights on the most-cited articles published on the subject of cardiac CT and calcium volume, thus helping to characterize the field and guide future research.

  10. Respiratory triggered 4D cone-beam computed tomography: A novel method to reduce imaging dose

    SciTech Connect

    Cooper, Benjamin J.; O'Brien, Ricky T.; Keall, Paul J.; Balik, Salim; Hugo, Geoffrey D.

    2013-04-15

    Purpose: A novel method called respiratory triggered 4D cone-beam computed tomography (RT 4D CBCT) is described whereby imaging dose can be reduced without degrading image quality. RT 4D CBCT utilizes a respiratory signal to trigger projections such that only a single projection is assigned to a given respiratory bin for each breathing cycle. In contrast, commercial 4D CBCT does not actively use the respiratory signal to minimize image dose. Methods: To compare RT 4D CBCT with conventional 4D CBCT, 3600 CBCT projections of a thorax phantom were gathered and reconstructed to generate a ground truth CBCT dataset. Simulation pairs of conventional 4D CBCT acquisitions and RT 4D CBCT acquisitions were developed assuming a sinusoidal respiratory signal which governs the selection of projections from the pool of 3600 original projections. The RT 4D CBCT acquisition triggers a single projection when the respiratory signal enters a desired acquisition bin; the conventional acquisition does not use a respiratory trigger and projections are acquired at a constant frequency. Acquisition parameters studied were breathing period, acquisition time, and imager frequency. The performance of RT 4D CBCT using phase based and displacement based sorting was also studied. Image quality was quantified by calculating difference images of the test dataset from the ground truth dataset. Imaging dose was calculated by counting projections. Results: Using phase based sorting RT 4D CBCT results in 47% less imaging dose on average compared to conventional 4D CBCT. Image quality differences were less than 4% at worst. Using displacement based sorting RT 4D CBCT results in 57% less imaging dose on average, than conventional 4D CBCT methods; however, image quality was 26% worse with RT 4D CBCT. Conclusions: Simulation studies have shown that RT 4D CBCT reduces imaging dose while maintaining comparable image quality for phase based 4D CBCT; image quality is degraded for displacement based RT 4D
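
    A toy simulation of the respiratory-triggering idea described above: with a sinusoidal respiratory signal, fire at most one projection per phase bin per breathing cycle and compare the projection count against a constant-frequency acquisition. The period, acquisition time, imager frequency, and bin count are illustrative, not the study's parameters.

    ```python
    # Respiratory-triggered vs conventional 4D CBCT projection counts (toy model).
    import numpy as np

    period, acq_time, imager_hz, n_bins = 4.0, 240.0, 5.5, 10
    t = np.arange(0.0, acq_time, 1.0 / imager_hz)          # candidate projection times
    phase = (t % period) / period                          # respiratory phase in [0, 1)
    bin_idx = (phase * n_bins).astype(int)
    cycle_idx = (t // period).astype(int)

    triggered = set()
    rt_projections = 0
    for b, c in zip(bin_idx, cycle_idx):
        if (b, c) not in triggered:        # first time this phase bin is entered in this cycle
            triggered.add((b, c))
            rt_projections += 1            # RT 4D CBCT fires a single projection

    conventional = t.size                  # conventional acquisition keeps every frame
    saving = 100 * (1 - rt_projections / conventional)
    print(f"RT: {rt_projections} projections vs conventional: {conventional} ({saving:.0f}% fewer)")
    ```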

  11. Segmentation and quantification of materials with energy discriminating computed tomography: A phantom study

    SciTech Connect

    Le, Huy Q.; Molloi, Sabee

    2011-01-15

    Purpose: To experimentally investigate whether a computed tomography (CT) system based on CdZnTe (CZT) detectors in conjunction with a least-squares parameter estimation technique can be used to decompose four different materials. Methods: The material decomposition process was divided into a segmentation task and a quantification task. A least-squares minimization algorithm was used to decompose materials with five measurements of the energy dependent linear attenuation coefficients. A small field-of-view energy discriminating CT system was built. The CT system consisted of an x-ray tube, a rotational stage, and an array of CZT detectors. The CZT array was composed of 64 pixels, each of which is 0.8x0.8x3 mm. Images were acquired at 80 kVp in fluoroscopic mode at 50 ms per frame. The detector resolved the x-ray spectrum into energy bins of 22-32, 33-39, 40-46, 47-56, and 57-80 keV. Four phantoms were constructed from polymethylmethacrylate (PMMA), polyethylene, polyoxymethylene, hydroxyapatite, and iodine. Three phantoms were composed of three materials with embedded hydroxyapatite (50, 150, 250, and 350 mg/ml) and iodine (4, 8, 12, and 16 mg/ml) contrast elements. One phantom was composed of four materials with embedded hydroxyapatite (150 and 350 mg/ml) and iodine (8 and 16 mg/ml). Calibrations consisted of PMMA phantoms with either hydroxyapatite (100, 200, 300, 400, and 500 mg/ml) or iodine (5, 15, 25, 35, and 45 mg/ml) embedded. Filtered backprojection and a ramp filter were used to reconstruct images from each energy bin. Material segmentation and quantification were performed and compared between different phantoms. Results: All phantoms were decomposed accurately, but some voxels in the base material regions were incorrectly identified. Average quantification errors of hydroxyapatite/iodine were 9.26/7.13%, 7.73/5.58%, and 12.93/8.23% for the three-material PMMA, polyethylene, and polyoxymethylene phantoms, respectively. The average errors for the four

  12. Quantitative micro-computed tomography: a non-invasive method to assess equivalent bone mineral density.

    PubMed

    Nazarian, Ara; Snyder, Brian D; Zurakowski, David; Müller, Ralph

    2008-08-01

    One of the many applications of micro-computed tomography (microCT) is to accurately visualize and quantify cancellous bone microstructure. However, microCT-based assessment of bone mineral density has yet to be thoroughly investigated. Specifically, the effects of varying imaging parameters, such as tube voltage (kVp), current (μA), integration time (ms), object to X-ray source distance (mm), projection number, detector array size and imaging media (surrounding the specimen), on the relationship between equivalent tissue density (ρEQ) and its linear attenuation coefficient (μ) have received little attention. In this study, in-house manufactured dipotassium hydrogen phosphate (K2HPO4) liquid calibration phantoms were employed in addition to resin-embedded hydroxyapatite solid calibration phantoms supplied by Scanco Medical AG. Variations in current, integration time and projection number had no effect on the conversion relationship between μ and ρEQ for the K2HPO4 and Scanco calibration phantoms [p > 0.05 for all cases]. However, as expected, variations in scanning tube voltage, object to X-ray source distance, detector array size and imaging media (referring to the solution that surrounds the specimen in the imaging vial) significantly affected the conversion relationship between μ and ρEQ for the K2HPO4 and Scanco calibration phantoms [p < 0.05 for all cases]. A multivariate linear regression approach was used to estimate ρEQ based on the attenuation coefficient, tube voltage, object to X-ray source distance, detector array size and imaging media for the K2HPO4 liquid calibration phantoms, explaining 90% of the variation in ρEQ. Furthermore, equivalent density values of bovine cortical bone samples (converted from attenuation coefficient to equivalent density using the K2HPO4 liquid calibration phantoms) correlated highly [R² = 0.92] with the ash densities of the samples. In conclusion, Scanco calibration phantoms can be used to assess equivalent
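
    A sketch of the multivariate linear calibration step described above: predicting equivalent density from the attenuation coefficient plus the scan settings found to matter (tube voltage, object-to-source distance, detector array size, imaging medium). The calibration table is synthetic, not the K2HPO4 phantom data.

    ```python
    # Multivariate linear regression for equivalent density (synthetic calibration data).
    import numpy as np

    # Columns: mu (1/cm), kVp, distance (mm), detector size (px), medium flag (0 = air, 1 = water)
    X_raw = np.array([[0.45, 55, 120, 1024, 0],
                      [0.52, 55, 120, 1024, 1],
                      [0.38, 70, 120, 2048, 0],
                      [0.41, 70, 140, 1024, 1],
                      [0.60, 45, 140, 2048, 0],
                      [0.57, 45, 120, 1024, 1],
                      [0.48, 55, 140, 2048, 1],
                      [0.43, 70, 120, 1024, 0]], dtype=float)
    rho_eq = np.array([297.1, 352.5, 245.8, 275.3, 405.2, 391.5, 334.8, 272.7])   # mg/cm^3, synthetic

    X = np.column_stack([np.ones(len(X_raw)), X_raw])       # add intercept term
    beta, *_ = np.linalg.lstsq(X, rho_eq, rcond=None)

    new_scan = np.array([1.0, 0.50, 55, 120, 1024, 0])      # a new measurement under known settings
    print(f"predicted rho_EQ = {beta @ new_scan:.1f} mg/cm^3")
    ```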

  13. Segmentation and quantification of materials with energy discriminating computed tomography: A phantom study

    PubMed Central

    Le, Huy Q.; Molloi, Sabee

    2011-01-01

    Purpose: To experimentally investigate whether a computed tomography (CT) system based on CdZnTe (CZT) detectors in conjunction with a least-squares parameter estimation technique can be used to decompose four different materials. Methods: The material decomposition process was divided into a segmentation task and a quantification task. A least-squares minimization algorithm was used to decompose materials with five measurements of the energy dependent linear attenuation coefficients. A small field-of-view energy discriminating CT system was built. The CT system consisted of an x-ray tube, a rotational stage, and an array of CZT detectors. The CZT array was composed of 64 pixels, each of which is 0.8×0.8×3 mm. Images were acquired at 80 kVp in fluoroscopic mode at 50 ms per frame. The detector resolved the x-ray spectrum into energy bins of 22–32, 33–39, 40–46, 47–56, and 57–80 keV. Four phantoms were constructed from polymethylmethacrylate (PMMA), polyethylene, polyoxymethylene, hydroxyapatite, and iodine. Three phantoms were composed of three materials with embedded hydroxyapatite (50, 150, 250, and 350 mg∕ml) and iodine (4, 8, 12, and 16 mg∕ml) contrast elements. One phantom was composed of four materials with embedded hydroxyapatite (150 and 350 mg∕ml) and iodine (8 and 16 mg∕ml). Calibrations consisted of PMMA phantoms with either hydroxyapatite (100, 200, 300, 400, and 500 mg∕ml) or iodine (5, 15, 25, 35, and 45 mg∕ml) embedded. Filtered backprojection and a ramp filter were used to reconstruct images from each energy bin. Material segmentation and quantification were performed and compared between different phantoms. Results: All phantoms were decomposed accurately, but some voxels in the base material regions were incorrectly identified. Average quantification errors of hydroxyapatite∕iodine were 9.26∕7.13%, 7.73∕5.58%, and 12.93∕8.23% for the three-material PMMA, polyethylene, and polyoxymethylene phantoms, respectively. The
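
    A minimal sketch of the least-squares decomposition step shared by the two records above: given the measured attenuation of one voxel in five energy bins, estimate the contribution of each basis material. The basis attenuation values are placeholders, not calibrated CZT data.

    ```python
    # Least-squares material decomposition from multi-bin attenuation measurements.
    import numpy as np

    # Rows: the five energy bins (22-32, 33-39, 40-46, 47-56, 57-80 keV).
    # Columns: basis materials (PMMA, hydroxyapatite, iodine), attenuation per unit amount (placeholders).
    M = np.array([[0.230, 1.10, 9.0],
                  [0.210, 0.80, 6.5],
                  [0.200, 0.62, 5.0],
                  [0.195, 0.50, 3.8],
                  [0.185, 0.38, 2.4]])

    true_mix = np.array([1.0, 0.15, 0.008])     # synthetic voxel: PMMA + 150 mg/ml HA + 8 mg/ml iodine
    mu_measured = M @ true_mix                  # noise-free measurement for the toy example

    coeffs, *_ = np.linalg.lstsq(M, mu_measured, rcond=None)
    print(coeffs)                               # recovers ~[1.0, 0.15, 0.008] in this noise-free case
    ```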

  14. Preliminary Phase Field Computational Model Development

    SciTech Connect

    Li, Yulan; Hu, Shenyang Y.; Xu, Ke; Suter, Jonathan D.; McCloy, John S.; Johnson, Bradley R.; Ramuhalli, Pradeep

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using the monocrystalline Fe (i.e., ferrite) film as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in
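
    A minimal sketch of the Landau-Lifshitz-Gilbert dynamics at the heart of such phase-field micromagnetic models, reduced here to a single macrospin in a constant effective field; the models in the report resolve space and microstructure, and the parameter values below are generic, not those of the Fe films.

    ```python
    # Single-macrospin Landau-Lifshitz-Gilbert (LLG) relaxation (explicit Euler, renormalized).
    import numpy as np

    gamma = 2.211e5                        # gyromagnetic ratio (m A^-1 s^-1)
    alpha = 0.05                           # Gilbert damping constant
    H_eff = np.array([0.0, 0.0, 5.0e4])    # effective field (A/m), here just a fixed applied field

    m = np.array([1.0, 0.0, 0.0])          # unit magnetization, initially along +x
    dt = 5e-13                             # time step (s)

    for _ in range(50000):
        mxH = np.cross(m, H_eff)
        dmdt = -gamma / (1 + alpha**2) * (mxH + alpha * np.cross(m, mxH))
        m = m + dt * dmdt
        m /= np.linalg.norm(m)             # keep |m| = 1

    print(m)                               # precesses and relaxes toward the field direction (+z)
    ```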

  15. A computer controlled pulsatile pump: preliminary study.

    PubMed

    Zwarts, M S; Topaz, S R; Jones, D N; Kolff, W J

    1996-12-01

    A Stepper Motor Driven Reciprocating Pump (SDRP) can replace roller pumps and rotary pumps for cardiopulmonary bypass, hemodialysis and regional perfusion. The blood-pumping ventricles are basically the same as the ventricles used for air-driven artificial hearts and ventricular assist devices. The electric stepper motor uses a flexible linkage belt to produce a reciprocating movement, which pushes a hard sphere into the diaphragm of the blood ventricles. The SDRP generates pulsatile flow and has a small priming volume. The preset power level of the motor driver limits the maximum potential outflow pressure, so the driver acts as a safety device. A double pump can be made by connecting two fluid pumping chambers to opposing sides of the motor base. Each pump generates pulsatile flow. Pressure and flow studies with water were undertaken. Preliminary blood studies showed low hemolysis, even when circulating a small amount of blood for up to 16 hours.

  16. Hand-Held Computer Programs for Preliminary Helicopter Design.

    DTIC Science & Technology

    1982-10-01

    programmable calculator a series of programs that give acceptable results during the preliminary phases of the helicopter design process. The project consists of three parts. The first part consists of several short programs and their subroutine form. These programs and subroutines compute density altitude, density, disc area, solidity, tip velocity, induced velocity, coefficient of thrust, tip loss factor, equivalent chord, and ground effect. The second part consists of major subroutines. These subroutines compute profile power, induced power, climb power, parasite power,
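
    A small sketch of the kind of quick rotor-sizing relations such calculator programs implement (disc area, solidity, tip speed, thrust coefficient, hover induced velocity from momentum theory). The function, units, and input values are an illustrative reconstruction, not taken from the report.

    ```python
    # Preliminary rotor sizing quantities (standard momentum-theory relations, illustrative inputs).
    import math

    def rotor_preliminaries(radius_ft, n_blades, chord_ft, rpm, thrust_lb, rho_slug_ft3=0.002377):
        disc_area = math.pi * radius_ft ** 2                          # A = pi * R^2 (ft^2)
        solidity = n_blades * chord_ft / (math.pi * radius_ft)        # sigma = N * c / (pi * R)
        tip_speed = rpm * 2 * math.pi / 60 * radius_ft                # V_tip = Omega * R (ft/s)
        ct = thrust_lb / (rho_slug_ft3 * disc_area * tip_speed ** 2)  # C_T = T / (rho * A * V_tip^2)
        v_induced = math.sqrt(thrust_lb / (2 * rho_slug_ft3 * disc_area))  # hover induced velocity (ft/s)
        return disc_area, solidity, tip_speed, ct, v_induced

    print(rotor_preliminaries(radius_ft=22.0, n_blades=2, chord_ft=1.75, rpm=324, thrust_lb=8000))
    ```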

  17. Computing and information sciences preliminary engineering design study

    SciTech Connect

    Schroeder, J O; Pearson, E W; Thomas, J J; Brothers, J W; Campbell, W K; DeVaney, D M; Jones, D R; Littlefield, R J; Peterson, M J

    1991-04-01

    This document presents the preliminary design concept for the integrated computing and information system to be included in the Environmental and Molecular Sciences Laboratory (EMSL) at the Pacific Northwest Laboratory, Richland, Washington, for the US Department of Energy (DOE). The EMSL is scheduled for completion and occupancy in 1994 or 1995 and will support the DOE environmental mission, in particular hazardous waste remediation. The focus of the report is on the Computing and Information Sciences engineering task of providing a fully integrated state-of-the-art computing environment for simulation, experimentation and analysis in support of molecular research. The EMSL will house two major research organizations, the Molecular Sciences Research Center (MSRC) and part of the Environmental Sciences Research Center (ESRC). Included in the report is a preliminary description of the computing and information system to be included. The proposed system architecture is based on a preliminary understanding of the EMSL users' needs for computational resources. As users understand more about the scientific challenges they face, the definition of the functional requirements will change. At the same time, the engineering team will be gaining experience with new computing technologies. Accordingly, the design architecture must evolve to reflect this new understanding of functional requirements and enabling technologies. 3 figs., 2 tabs.

  18. Modeling the complete Otto cycle: Preliminary version. [computer programming

    NASA Technical Reports Server (NTRS)

    Zeleznik, F. J.; Mcbride, B. J.

    1977-01-01

    A description is given of the equations and the computer program being developed to model the complete Otto cycle. The program incorporates such important features as: (1) heat transfer, (2) finite combustion rates, (3) complete chemical kinetics in the burned gas, (4) exhaust gas recirculation, and (5) manifold vacuum or supercharging. Changes in thermodynamic, kinetic and transport data as well as model parameters can be made without reprogramming. Preliminary calculations indicate that: (1) chemistry and heat transfer significantly affect composition and performance, (2) there seems to be a strong interaction among model parameters, and (3) a number of cycles must be calculated in order to obtain steady-state conditions.
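
    For orientation, the air-standard idealization that the full program goes far beyond (it adds heat transfer, finite-rate combustion, kinetics and exhaust gas recirculation) reduces to a one-line efficiency formula; a minimal sketch under ideal-gas, constant specific heat assumptions:

        # Air-standard (ideal) Otto cycle: a much-simplified reference calculation.
        # The NASA program adds heat transfer, finite-rate combustion, kinetics, EGR, etc.
        GAMMA = 1.4          # ratio of specific heats for air (assumed constant)

        def otto_efficiency(compression_ratio: float) -> float:
            """Thermal efficiency of the ideal air-standard Otto cycle."""
            return 1.0 - compression_ratio ** (1.0 - GAMMA)

        if __name__ == "__main__":
            for r in (6, 8, 10, 12):
                print(f"r = {r:2d}  ->  ideal efficiency = {otto_efficiency(r):.3f}")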

  19. Diagnosis of simulated condylar bone defects using panoramic radiography, spiral tomography and cone-beam computed tomography: A comparison study

    PubMed Central

    Salemi, Fatemeh; Shokri, Abbas; Baharvand, Maryam

    2015-01-01

    Objectives: Radiographic examination is one of the most important parts of the clinical assessment routine for temporomandibular disorders. The aim of this study was to compare the diagnostic accuracy of cone-beam computed tomography (CBCT) with panoramic radiography and spiral computed tomography for the detection of simulated mandibular condyle bone lesions. Study Design: The sample consisted of 10 TMJs from 5 dried human skulls. Simulated erosive and osteophytic lesions were created in 3 different sizes using a round diamond bur and bone chips, respectively. Panoramic radiography, spiral tomography and cone-beam computed tomography were used in defect detection. Data were statistically analyzed with the Mann-Whitney test. The reliability and degree of agreement between two observers were also determined by means of Cohen's kappa analysis. Results: CBCT was statistically significantly superior to the other studied techniques in the detection of both erosive and osteophytic lesions of different sizes. There were significant differences between spiral tomography and panoramic radiography in the correct detection of both erosive and osteophytic lesions 1 mm and 1.5 mm in size. However, there were no significant differences between spiral tomography and panoramic radiography in the correct detection of both erosive and osteophytic lesions 0.5 mm in size. Conclusions: CBCT images provide greater diagnostic accuracy than spiral tomography and panoramic radiography in the detection of condylar bone erosions and osteophytes. Key words: Bone defect, condyle, CBCT, panoramic radiography. PMID:25810839
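
    Inter-observer agreement of the kind reported here is conventionally summarized with Cohen's kappa; a small sketch with hypothetical ratings (not the study's data):

        import numpy as np

        def cohens_kappa(ratings_a, ratings_b):
            """Cohen's kappa for two observers rating the same set of items."""
            a = np.asarray(ratings_a)
            b = np.asarray(ratings_b)
            categories = np.union1d(a, b)
            p_o = np.mean(a == b)                                   # observed agreement
            p_e = sum(np.mean(a == c) * np.mean(b == c)             # chance agreement
                      for c in categories)
            return (p_o - p_e) / (1.0 - p_e)

        # Hypothetical presence/absence calls by two observers on 10 condyles
        obs1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
        obs2 = [1, 1, 0, 0, 0, 0, 1, 1, 1, 1]
        print(f"kappa = {cohens_kappa(obs1, obs2):.2f}")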

  20. Emission Computed Tomography: A New Technique for the Quantitative Physiologic Study of Brain and Heart in Vivo

    DOE R&D Accomplishments Database

    Phelps, M. E.; Hoffman, E. J.; Huang, S. C.; Schelbert, H. R.; Kuhl, D. E.

    1978-01-01

    Emission computed tomography can provide a quantitative in vivo measurement of regional tissue radionuclide tracer concentrations. This capability, when combined with physiologic models and radioactively labeled physiologic tracers that behave in a predictable manner, allows measurement of a wide variety of physiologic variables. This integrated technique has been referred to as Physiologic Tomography (PT). PT requires labeled compounds which trace physiologic processes in a known and predictable manner, and physiologic models which are appropriately formulated and validated to derive physiologic variables from ECT data. In order to effectively achieve this goal, PT requires an ECT system that is capable of performing truly quantitative or analytical measurements of tissue tracer concentrations and which has been well characterized in terms of spatial resolution, sensitivity and signal-to-noise ratios in the tomographic image. This paper illustrates the capabilities of emission computed tomography and provides examples of physiologic tomography for the regional measurement of cerebral and myocardial metabolic rate for glucose, regional measurement of cerebral blood volume, gated cardiac blood pools and capillary perfusion in brain and heart. Studies on patients with stroke and myocardial ischemia are also presented.

  1. Clival lesion incidentally discovered on cone-beam computed tomography: A case report and review of the literature

    PubMed Central

    Tadinada, Aditya; Rengasamy, Kandasamy; Fellows, Douglas; Lurie, Alan G.

    2014-01-01

    An osteolytic lesion with a small central area of mineralization and sclerotic borders was discovered incidentally in the clivus on the cone-beam computed tomography (CBCT) of a 27-year-old male patient. This benign appearance indicated a primary differential diagnosis of non-aggressive lesions such as fibro-osseous lesions and arrested pneumatization. Further, on magnetic resonance imaging (MRI), the lesion showed a homogeneously low T1 signal intensity with mild internal enhancement after gadolinium administration and a heterogeneous T2 signal intensity. These signal characteristics might be attributed to the fibrous tissues, chondroid matrix, calcific material, or cystic component of the lesion; thus, chondroblastoma and chondromyxoid fibroma were added to the differential diagnosis. Although this report was limited by the lack of a final diagnosis and the patient being lost to follow-up, this incidental skull base finding underscores the importance of having the entire CBCT volume interpreted by a qualified oral and maxillofacial radiologist. PMID:24944968

  2. Bone Mineral Density Estimations From Routine Multidetector Computed Tomography: A Comparative Study of Contrast and Calibration Effects

    PubMed Central

    Kaesmacher, Johannes; Liebl, Hans; Baum, Thomas; Kirschke, Jan Stefan

    2017-01-01

    Introduction Phantom-based (synchronous and asynchronous) and phantomless (internal tissue calibration based) assessment of bone mineral density (BMD) in routine MDCT (multidetector computed tomography) examinations potentially allows for diagnosis of osteoporosis. Although recent studies investigated the effects of contrast-medium application on phantom-calibrated BMD measurements, it remains uncertain to what extent internal tissue-calibrated BMD measurements are also susceptible to contrast-medium-associated density variation. The present study is the first to systematically evaluate BMD variations related to contrast application comparing different calibration techniques. Purpose To compare the predictive performance of different calibration techniques for BMD measurements obtained from triphasic contrast-enhanced MDCT. Materials and Methods Bone mineral density was measured on nonenhanced (NE), arterial (AR) and portal-venous (PV) contrast phase MDCT images of 46 patients using synchronous (SYNC) and asynchronous (ASYNC) phantom calibration as well as internal calibration (IC). Quantitative computed tomography (QCT) served as the criterion standard. Density variations were analyzed for each contrast phase and calibration technique, and respective linear fitting was performed. Results Both asynchronous calibration-derived BMD values (NE-ASYNC) and values estimated using IC (NE-IC) on NE MDCT images did reasonably well in predicting QCT BMD (root-mean-square deviation, 8.0% and 7.8%, respectively). Average NE-IC BMD was 2.7% lower when compared with QCT (P = 0.017), whereas no difference could be found for NE-ASYNC (P = 0.957). All average BMD estimates derived from contrast-enhanced scans differed significantly from QCT BMD (all P < 0.005) and led to notable systematic BMD biases (mean difference > 6.0 mg/mL). All regression fits revealed a consistent linear dependency (R2 range, 0.861–0.963). Overall accuracy and goodness of fit tended to decrease from AR to
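
    In outline, all of these calibration schemes map CT numbers measured in reference materials of known equivalent density onto bone mineral density through a linear fit; a minimal sketch with made-up reference values (not the study's calibration data):

        import numpy as np

        # Hypothetical calibration samples: mean HU measured in reference regions
        # (phantom rods for synchronous/asynchronous calibration, or internal tissues
        # such as adipose and blood for internal calibration) with assumed equivalent
        # densities in mg/mL hydroxyapatite. All numbers below are illustrative.
        ref_hu      = np.array([-95.0, 55.0, 210.0])   # illustrative HU values
        ref_density = np.array([0.0, 100.0, 200.0])    # illustrative mg/mL equivalents

        # Least-squares linear fit: density = slope * HU + intercept
        slope, intercept = np.polyfit(ref_hu, ref_density, 1)

        def hu_to_bmd(hu_value: float) -> float:
            """Convert a trabecular ROI mean HU to a BMD estimate (mg/mL)."""
            return slope * hu_value + intercept

        print(f"BMD estimate for an ROI of 120 HU: {hu_to_bmd(120.0):.1f} mg/mL")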

  3. Glenohumeral joint kinematics measured by intracortical pins, reflective markers, and computed tomography: A novel technique to assess acromiohumeral distance.

    PubMed

    Dal Maso, Fabien; Blache, Yoann; Raison, Maxime; Lundberg, Arne; Begon, Mickaël

    2016-08-01

    The combination of biplane fluoroscopy and CT scan provides accurate 3D measurement of the acromiohumeral distance (AHD) during dynamic tasks. However, participants performed only two and six trials in previous experiments to respect the recommended radiation exposure per year. Our objective was to propose a technique to assess the AHD in 3D during dynamic tasks without this limitation. The AHD was computed from glenohumeral kinematics obtained using markers fitted to pins drilled into the scapula and the humerus, combined with 3D bone geometry obtained using CT scan. Four participants performed range-of-motion, daily-living, and sports activities. Sixty-six out of 158 trials performed by each participant were analyzed. Two participants were not considered due to experimental issues. AHD decreased with arm elevation. Overall, the smallest AHD occurred in abduction (1.1 mm (P1) and 1.2 mm (P2)). The smallest AHD during activities of daily living were 2.4 mm (P1) and 3.1 mm (P2). During sports activities, it was 2.8 mm (P1) and 1.1 mm (P2). The humeral head greater and lesser tuberosities came nearest to the acromion. The proposed technique increases the number of trials acquired during one experiment compared to previous approaches. The identification of movements maximizing AHD is possible, which may provide benefits for shoulder rehabilitation.

  4. Evaluation of stability after pre-orthodontic orthognathic surgery using cone-beam computed tomography: A comparison with conventional treatment

    PubMed Central

    Ann, Hye-Rim; Jung, Young-Soo; Lee, Kee-Joon

    2016-01-01

    Objective The aim of this study was to evaluate the skeletal and dental changes after intraoral vertical ramus osteotomy (IVRO) with and without presurgical orthodontics by using cone-beam computed tomography (CBCT). Methods This retrospective cohort study included 24 patients (mean age, 22.1 years) with skeletal Class III malocclusion who underwent bimaxillary surgery with IVRO. The patients were divided into the preorthodontic orthognathic surgery (POGS) group (n = 12) and conventional surgery (CS) group (n = 12). CBCT images acquired preoperatively, 1 month after surgery, and 1 year after surgery were analyzed to compare the intergroup differences in postoperative three-dimensional movements of the maxillary and mandibular landmarks and the changes in lateral cephalometric variables. Results Baseline demographics (sex and age) were similar between the two groups (6 men and 6 women in each group). During the postsurgical period, the POGS group showed more significant upward movement of the mandible (p < 0.05) than did the CS group. Neither group showed significant transverse movement of any of the skeletal landmarks. Moreover, none of the dental and skeletal variables showed significant intergroup differences 1 year after surgery. Conclusions Compared with CS, POGS with IVRO resulted in significantly different postsurgical skeletal movement in the mandible. Although both groups showed similar skeletal and dental outcomes at 1 year after surgery, upward movement of the mandible during the postsurgical period should be considered to ensure a more reliable outcome after POGS. PMID:27668193

  5. Support vector machine model for diagnosis of lymph node metastasis in gastric cancer with multidetector computed tomography: a preliminary study

    PubMed Central

    2011-01-01

    Background Lymph node metastasis (LNM) of gastric cancer is an important prognostic factor regarding long-term survival. However, the imaging techniques commonly used for the stomach cannot satisfactorily assess gastric cancer lymph node status, as they cannot achieve both high sensitivity and specificity. As a machine-learning method, the support vector machine (SVM) has the potential to address this complex issue. Methods The institutional review board approved this retrospective study. A total of 175 consecutive patients with gastric cancer who underwent MDCT before surgery were included. We evaluated tumor and lymph node indicators on CT images, including serosal invasion, tumor classification, tumor maximum diameter, number of lymph nodes, maximum lymph node size and lymph node station, which reflect the biological behavior of gastric cancer. Univariate analysis was used to analyze the relationship between the six image indicators and LNM. An SVM model was built with these indicators as inputs; the output was whether the patient's lymph node metastasis was positive or negative, as confirmed by surgery and histopathology. A standard machine-learning technique, k-fold cross-validation (5-fold in our study), was used to train and test the SVM models. We evaluated the diagnostic capability of the SVM models for lymph node metastasis with receiver operating characteristic (ROC) curves. A radiologist also classified the lymph node metastasis of each patient using maximum lymph node size on CT images as the criterion. We compared the areas under the ROC curves (AUC) of the radiologist and the SVM models. Results Of the 175 cases, 134 had lymph node metastasis and 41 did not. All six image indicators showed statistically significant differences between the LNM-negative and LNM-positive groups. The mean sensitivity, specificity and AUC of the SVM models with 5-fold cross-validation were 88.5%, 78.5% and 0.876, respectively, whereas the corresponding values for the radiologist classifying lymph node metastasis by maximum lymph node size were only 63.4%, 75.6% and 0.757. Each SVM model of the 5-fold cross-validation performed significantly better than the radiologist. Conclusions Based on biological behavior information of gastric cancer on MDCT images, an SVM model can help diagnose lymph node metastasis preoperatively. PMID:21223564
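
    A minimal scikit-learn sketch of the kind of pipeline described (an SVM over the six CT-derived indicators, evaluated with 5-fold cross-validation and ROC AUC); the feature matrix below is random placeholder data, not the study's:

        import numpy as np
        from sklearn.model_selection import cross_val_score, StratifiedKFold
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # Placeholder data: 175 patients x 6 CT indicators (serosal invasion, tumor
        # classification, max diameter, number of nodes, max node size, node station).
        X = rng.normal(size=(175, 6))
        y = rng.integers(0, 2, size=175)          # 1 = LNM positive, 0 = negative

        model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
        auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
        print(f"5-fold ROC AUC: {auc.mean():.3f} +/- {auc.std():.3f}")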

  6. Little impact of tsunami-stricken nuclear accident on awareness of radiation dose of cardiac computed tomography: A questionnaire study

    PubMed Central

    2013-01-01

    Background With the increased use of cardiac computed tomography (CT), radiation dose remains a major issue, although physicians are trying to reduce the substantial risks associated with use of this diagnostic tool. This study was performed to investigate recognition of the level of radiation exposure from cardiac CT and the differences in the level of awareness of radiation before and after the Fukushima nuclear plant accident. Methods We asked 30 physicians who were undergoing training in internal medicine to determine the equivalent doses of radiation for common radiological examinations when a normal chest X-ray is accepted as one unit; questions about the absolute radiation dose of cardiac CT data were also asked. Results According to the results, 86.6% of respondents believed the exposure to be 1 mSv at most, and 93.3% thought that the exposure was less than that of 100 chest X-rays. This finding indicates that their perceptions were far lower than the actual amounts. Even after the occurrence of such a large nuclear disaster in Fukushima, there were no significant differences in the same subjects’ overall awareness of radiation amounts. Conclusions Even after such a major social issue as the Fukushima nuclear accident, the level of awareness of the accurate radiation amount used in 64-channel multidetector CT (MDCT) by clinical physicians who order this test was not satisfactory. Thus, there is a need for the development of effective continuing education programs to improve awareness of radiation from ionizing radiation devices, including cardiac CT, and emphasis on risk-benefit evaluation based on accurate knowledge during medical training. PMID:23631688

  7. Oral Midazolam-Ketamine versus Midazolam alone for Procedural Sedation of Children Undergoing Computed Tomography; a Randomized Clinical Trial

    PubMed Central

    Majidinejad, Saeed; Taherian, Keramat; Esmailian, Mehrdad; Khazaei, Mehdi; Samaie, Vajihe

    2015-01-01

    Introduction: Motion artifacts are a common problem in pediatric radiographic studies and are a common indication for pediatric procedural sedation. This study aimed to compare the combination of oral midazolam and ketamine (OMK) with oral midazolam alone (OM) as procedural sedatives among children undergoing computed tomography (CT) imaging. Methods: The study population comprised six-month- to six-year-old patients with medium-risk minor head trauma who were scheduled to undergo brain CT imaging. Patients were randomly allocated to two groups: one group received 0.5 mg/kg midazolam (OM group; n = 33) orally and the other received 0.2 mg/kg midazolam and 5 mg/kg ketamine orally (OMK group; n = 33). The vital signs were monitored and recorded at regular intervals. The primary outcome measure was the success rate of each drug in achieving adequate sedation. Secondary outcome measures were the time to achieve adequate sedation, the time to discharge from the radiology department, and the incidence of adverse events. Results: Adequate sedation was achieved in five patients (15.2%) in the OM group and 15 patients (45.5%) in the OMK group, a statistically significant difference between the groups (p = 0.015). No significant difference was noted between the OM and OMK groups with respect to the time to achieve adequate sedation (33.80 ± 7.56 and 32.87 ± 10.18 minutes, respectively; p = 0.854) or the time to discharge from the radiology department (89.60 ± 30.22 and 105.27 ± 21.98 minutes, respectively; p = 0.223). The complications were minor and similar among patients in both groups. Conclusion: This study demonstrated that, in comparison with OM, OMK was more effective in producing a satisfactory level of sedation in children undergoing CT examinations without additional complications; however, neither of the two regimens fully met clinical needs for procedural sedation. PMID:26495384

  8. Diagnosis of asbestosis by a time expanded wave form analysis, auscultation and high resolution computed tomography: a comparative study.

    PubMed Central

    al Jarad, N; Strickland, B; Bothamley, G; Lock, S; Logan-Sinclair, R; Rudd, R M

    1993-01-01

    BACKGROUND--Crackles are a prominent clinical feature of asbestosis and may be an early sign of the condition. Auscultation, however, is subjective and interexaminer disagreement is a problem. Computerised lung sound analysis can visualise, store, and analyse lung sounds and disagreement on the presence of crackles is minimal. High resolution computed tomography (HRCT) is superior to chest radiography in detecting early signs of asbestosis. The aim of this study was to compare clinical auscultation, time expanded wave form analysis (TEW), chest radiography, and HRCT in detecting signs of asbestosis in asbestos workers. METHODS--Fifty three asbestos workers (51 men and two women) were investigated. Chest radiography and HRCT were assessed by two independent readers for detection of interstitial opacities. HRCT was performed in the supine position with additional sections at the bases in the prone position. Auscultation for persistent fine inspiratory crackles was performed by two independent examiners unacquainted with the diagnosis. TEW analysis was obtained from a 33 second recording of lung sounds over the lung bases. TEW and auscultation were performed in a control group of 13 subjects who had a normal chest radiograph. There were 10 current smokers and three previous smokers. In asbestos workers the extent of pulmonary opacities on the chest radiograph was scored according to the International Labour Office (ILO) scale. Patients were divided into two groups: 21 patients in whom the chest radiograph was > 1/0 (group 1) and 32 patients in whom the chest radiograph was scored ≤ 1/0 (group 2) on the ILO scale. RESULTS--In patients with an ILO score of ≤ 1/0, repetitive mid to late inspiratory crackles were detected by auscultation in seven (22%) patients and by TEW in 14 (44%). HRCT detected definite interstitial opacities in 11 (34%) and gravity dependent subpleural lines in two (6%) patients. All but two patients with evidence of interstitial disease or

  9. Estimation of effective doses to adult and pediatric patients from multislice computed tomography: A method based on energy imparted

    SciTech Connect

    Theocharopoulos, Nicholas; Damilakis, John; Perisinakis, Kostas; Tzedakis, Antonis; Karantanas, Apostolos; Gourtsoyiannis, Nicholas

    2006-10-15

    The purpose of this study is to provide a method and required data for the estimation of effective dose (E) values to adult and pediatric patients from computed tomography (CT) scans of the head, chest, abdomen, and pelvis, performed on multi-slice scanners. Mean section radiation dose (d_m) to cylindrical water phantoms of varying radius, normalized over the CT dose index free-in-air (CTDI_F), was calculated for the head and body scanning modes of a multislice scanner with use of Monte Carlo techniques. Patients were modeled as equivalent water phantoms and the energy imparted (ε) to simulated pediatric and adult patients was calculated on the basis of measured CTDI_F values. Body-region-specific energy-imparted-to-effective-dose conversion coefficients (E/ε) for adult male and female patients were generated from previous data. Effective doses to patients aged newborn to adult were derived for all available helical and axial beam collimations, taking into account age-specific patient mass and scanning length. Depending on high voltage, body region, and patient sex, E/ε values ranged from 0.008 mSv/mJ for head scans to 0.024 mSv/mJ for chest scans. When scanned with the same technique factors as the adults, pediatric patients absorb as little as 5% of the energy imparted to adults, but corresponding effective dose values are up to a factor of 1.6 higher. On average, pediatric patients absorb 44% less energy per examination but have a 24% higher effective dose, compared with adults. In clinical practice, effective dose values to pediatric patients are 2.5 to 10 times lower than in adults due to the adaptation of tube current. A method is provided for the calculation of effective dose to adult and pediatric patients on the basis of individual patient characteristics such as sex, mass, dimensions, and density of imaged anatomy, and the technical features of modern multislice scanners. It allows the optimum selection of scanning
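
    Operationally, the method amounts to multiplying the energy imparted by a region- and sex-specific conversion coefficient, E = (E/ε) · ε; a minimal numeric sketch using the coefficient range quoted above (the energy-imparted value is an assumed example, not from the study):

        # Effective dose from energy imparted: E = (E/epsilon) * epsilon.
        # Conversion coefficients are the range quoted in the abstract
        # (0.008 mSv/mJ for head scans, 0.024 mSv/mJ for chest scans);
        # the energy-imparted value is an illustrative assumption.
        E_PER_EPSILON = {"head": 0.008, "chest": 0.024}   # mSv per mJ

        def effective_dose(region: str, energy_imparted_mj: float) -> float:
            """Effective dose (mSv) for a body region and energy imparted (mJ)."""
            return E_PER_EPSILON[region] * energy_imparted_mj

        print(f"Chest scan, 250 mJ imparted: {effective_dose('chest', 250.0):.1f} mSv")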

  10. A Preliminary Study of the Burgers Equation with Symbolic Computation

    NASA Astrophysics Data System (ADS)

    Derickson, Russell G.; Pielke, Roger A.

    2000-07-01

    A novel approach based on recursive symbolic computation is introduced for the approximate analytic solution of the Burgers equation. Once obtained, appropriate numerical values can be inserted into the symbolic solution to explore parametric variations. The solution is valid for both inviscid and viscous cases, covering the range of Reynolds number from 500 to infinity, whereas current direct numerical simulation (DNS) methods are limited to Reynolds numbers no greater than 4000. What further distinguishes the symbolic approach from numerical and traditional analytic techniques is the ability to reveal and examine direct nonlinear interactions between waves, including the interplay between inertia and viscosity. Thus, preliminary efforts suggest that symbolic computation may be quite effective in unveiling the “anatomy” of the myriad interactions that underlie turbulent behavior. However, due to the tendency of nonlinear symbolic operations to produce combinatorial explosion, future efforts will require the development of improved filtering processes to select and eliminate computations leading to negligible high order terms. Indeed, the initial symbolic computations present the character of turbulence as a problem in combinatorics. At present, results are limited in time evolution, but reveal the beginnings of the well-known “saw tooth” waveform that occurs in the inviscid case (i.e., Re=∞). Future efforts will explore more fully developed 1-D flows and investigate the potential to extend symbolic computations to 2-D and 3-D. Potential applications include the development of improved subgrid scale (SGS) parameterizations for large eddy simulation (LES) models, and studies that complement DNS in exploring fundamental aspects of turbulent flow behavior.
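
    For flavour, a short SymPy check that symbolic manipulation of the Burgers equation is tractable, here verifying a known travelling-wave solution (this is a generic illustration, not the paper's recursive solution scheme):

        import sympy as sp

        x, t, nu, c = sp.symbols("x t nu c", positive=True)

        # Burgers equation: u_t + u*u_x = nu*u_xx.
        # Known travelling-wave (Taylor shock) solution:
        u = c * (1 - sp.tanh(c * (x - c * t) / (2 * nu)))

        residual = sp.diff(u, t) + u * sp.diff(u, x) - nu * sp.diff(u, x, 2)
        print(sp.simplify(residual))   # prints 0, confirming u solves the equation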

  11. Modification of ACSYNT aircraft computer program for preliminary design

    NASA Technical Reports Server (NTRS)

    Biezad, Daniel J.; Rojos-Oviedo, Ruben

    1994-01-01

    This paper presents the development of a computer simulation of agility flight test techniques. Its purpose is to evaluate the agility of aircraft configurations early in the preliminary design phase. The simulation module is integrated into the NASA Ames aircraft synthesis design code. Trade studies using the agility module embedded within the design code to simulate the combat cycle time agility metric are illustrated using a Northrop F-20 aircraft model. Results show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can also compare the agility potential between different configurations and has the capability to optimize agility performance early in the design process.

  12. PASS: A computer program for Preliminary Aircraft Structural Synthesis

    NASA Technical Reports Server (NTRS)

    Johnson, E. H.

    1977-01-01

    A computer code for Preliminary Aircraft Structural Synthesis provides rapid and accurate analysis for aircraft structures that can be adequately modeled by beam finite elements. The philosophy used in developing the program was to provide a basic framework that can be used for structural synthesis. It is anticipated that a user will need to add detail to this framework in order to perform his specific task. With this philosophy in mind, the program was written so that it is easily divided into segments, thereby making it readily adaptable. The theoretical portion of this manual describes the basic structure of the program and details the development of the unique beam element that is used. The present capability of the algorithm is stated and suggestions are made regarding enhancements to this capability. User information is also given that provides an overview of the program's construction, identifies the required inputs, describes the program output, provides some comments on the program use, and exhibits results for a simple example.

  13. Feasibility of capillary velocity assessment by statistical means using dual-beam spectral-domain Optical Coherence Tomography: a preliminary study.

    PubMed

    Daly, Susan M; Silien, Christophe; Leahy, Martin J

    2013-09-01

    The assessment of vascular dynamics has been shown to yield both qualitative and quantitative metrics and thus play a pivotal role in the diagnosis and prognosis of various diseases, which may manifest as microcirculatory irregularities. Optical Coherence Tomography (OCT) is an established imaging modality which utilises the principle of optical interferometry to distinguish between spatial changes in refractive index and thus formulate a multi-dimensional representation of a specimen in vivo. Nonetheless, difficulties remain in obtaining accurate data (morphological and/or transient) in an environment which is subject to such large biological variability. In an effort to address the issue of angular dependence as with Doppler based analysis, a dual-beam Spectral-domain OCT system for quasi-simultaneous specimen scanning is described. A statistical based method of phase correlation is outlined which is capable of quantifying velocity values in addition to the ability to discern bidirectionality, without the necessity of angular computation.

  14. Preliminary performance assessment of computer automated facial approximations using computed tomography scans of living individuals.

    PubMed

    Parks, Connie L; Richard, Adam H; Monson, Keith L

    2013-12-10

    ReFace (Reality Enhancement Facial Approximation by Computational Estimation) is a computer-automated facial approximation application jointly developed by the Federal Bureau of Investigation and GE Global Research. The application derives a statistically based approximation of a face from an unidentified skull using a dataset of ~400 human head computed tomography (CT) scans of living adult American individuals from four ancestry groups: African, Asian, European and Hispanic (self-identified). To date, only one unpublished subjective recognition study has been conducted using ReFace approximations. It indicated that approximations produced by ReFace were recognized above chance rates (10%). This preliminary study assesses: (i) the recognizability of five ReFace approximations; (ii) the recognizability of CT-derived skin surface replicas of the same individuals whose skulls were used to create the ReFace approximations; and (iii) the relationship between recognition performance and resemblance ratings of target individuals. All five skin surface replicas were recognized at rates statistically significantly above chance (22-50%). Four of five ReFace approximations were recognized above chance (5-18%), although with statistical significance only at the higher rate. Such results suggest reconsideration of the usefulness of the type of output format utilized in this study, particularly in regard to facial approximations employed as a means of identifying unknown individuals.
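
    Whether a recognition rate exceeds the 10% chance level is naturally checked with a one-sided binomial test; a small sketch with hypothetical counts (the abstract does not give the raw counts):

        from scipy.stats import binomtest

        # Hypothetical example: 15 of 40 participants recognized a target face,
        # against a chance rate of 10% (one target among ten candidates).
        result = binomtest(k=15, n=40, p=0.10, alternative="greater")
        print(f"recognition rate = {15/40:.2f}, one-sided p = {result.pvalue:.4f}")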

  15. Application of Computational Fluid Dynamics to a Preliminary Extended Area Protection System (EAPS) Projectile

    DTIC Science & Technology

    2006-09-01

    Computational model of a preliminary EAPS projectile configuration. The grids for this study were created using GRIDGEN (Pointwise, Inc., Gridgen Version 15).

  16. Software Requirements for Embedded Computers: A Preliminary Report.

    DTIC Science & Technology

    1980-03-01

    Certain kinds of problems appeared less frequently in programs that used Phase-0 contracts, although more work is needed before strong recommendations can be made. The initial RFP can be for a Phase-0 contract or for an FSD contract, and preliminary software product specifications are also available.

  17. Computer-Generated Geometry Instruction: A Preliminary Study

    ERIC Educational Resources Information Center

    Kang, Helen W.; Zentall, Sydney S.

    2011-01-01

    This study hypothesized that increased intensity of graphic information, presented in computer-generated instruction, could be differentially beneficial for students with hyperactivity and inattention by improving their ability to sustain attention and hold information in-mind. To this purpose, 18 2nd-4th grade students, recruited from general…

  18. Development of X-TOOLSS: Preliminary Design of Space Systems Using Evolutionary Computation

    NASA Technical Reports Server (NTRS)

    Schnell, Andrew R.; Hull, Patrick V.; Turner, Mike L.; Dozier, Gerry; Alverson, Lauren; Garrett, Aaron; Reneau, Jarred

    2008-01-01

    Evolutionary computational (EC) techniques such as genetic algorithms (GA) have been identified as promising methods to explore the design space of mechanical and electrical systems at the earliest stages of design. In this paper the authors summarize their research in the use of evolutionary computation to develop preliminary designs for various space systems. An evolutionary computational solver developed over the course of the research, X-TOOLSS (Exploration Toolset for the Optimization of Launch and Space Systems) is discussed. With the success of early, low-fidelity example problems, an outline of work involving more computationally complex models is discussed.
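
    A minimal genetic-algorithm sketch of the kind of evolutionary search such a toolset wraps around engineering models; the objective function here is a stand-in, not one of the NASA design problems:

        import random

        random.seed(1)

        def fitness(x):
            """Stand-in objective: minimize a simple quadratic (lower is better)."""
            return sum((xi - 3.0) ** 2 for xi in x)

        def evolve(pop_size=40, n_genes=4, generations=60,
                   mutation_sigma=0.3, bounds=(-10.0, 10.0)):
            pop = [[random.uniform(*bounds) for _ in range(n_genes)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness)                 # rank by objective
                parents = pop[: pop_size // 2]        # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, n_genes)        # one-point crossover
                    child = [g + random.gauss(0.0, mutation_sigma)
                             for g in a[:cut] + b[cut:]]      # Gaussian mutation
                    children.append(child)
                pop = parents + children
            return min(pop, key=fitness)

        best = evolve()
        print("best design vector:", [round(g, 3) for g in best])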

  19. Analyzing high energy physics data using database computing: Preliminary report

    NASA Technical Reports Server (NTRS)

    Baden, Andrew; Day, Chris; Grossman, Robert; Lifka, Dave; Lusk, Ewing; May, Edward; Price, Larry

    1991-01-01

    A proof of concept system is described for analyzing high energy physics (HEP) data using data base computing. The system is designed to scale up to the size required for HEP experiments at the Superconducting SuperCollider (SSC) lab. These experiments will require collecting and analyzing approximately 10 to 100 million 'events' per year during proton colliding beam collisions. Each 'event' consists of a set of vectors with a total length of approx. one megabyte. This represents an increase of approx. 2 to 3 orders of magnitude in the amount of data accumulated by present HEP experiments. The system is called the HEPDBC System (High Energy Physics Database Computing System). At present, the Mark 0 HEPDBC System is completed, and can produce analysis of HEP experimental data approx. an order of magnitude faster than current production software on data sets of approx. 1 GB. The Mark 1 HEPDBC System is currently undergoing testing and is designed to analyze data sets 10 to 100 times larger.

  20. A preliminary study of molecular dynamics on reconfigurable computers

    SciTech Connect

    Wolinski, C.; Trouw, F. R.; Gokhale, M.

    2003-01-01

    In this paper we investigate the performance of platform FPGAs on a compute-intensive, floating-point-intensive supercomputing application, Molecular Dynamics (MD). MD is a popular simulation technique to track interacting particles through time by integrating their equations of motion. One part of the MD algorithm was implemented using the Fabric Generator (FG) [11] and mapped onto several reconfigurable logic arrays. FG is a Java-based toolset that greatly accelerates construction of the fabrics from an abstract, technology-independent representation. Our experiments used technology-independent IEEE 32-bit floating point operators so that the design could be easily re-targeted. Experiments were performed using both non-pipelined and pipelined floating point modules. We present results for the Altera Excalibur ARM System on a Programmable Chip (SoPC), the Altera Stratix EP1S80, and the Xilinx Virtex-II Pro 2VP50. The best results obtained were 5.69 GFlops at 80 MHz (Altera Stratix EP1S80) and 4.47 GFlops at 82 MHz (Xilinx Virtex-II Pro 2VP50). Assuming a 10 W power budget, these results compare very favorably to a 4 Gflop/40 W processing/power rate for a modern Pentium, suggesting that reconfigurable logic can achieve high performance at low power on floating-point-intensive applications.

  1. Description and preliminary studies of a computer drawn instrument landing approach display

    NASA Technical Reports Server (NTRS)

    Adams, J. J.; Lallman, F. J.

    1978-01-01

    A computer drawn instrument landing approach display, which shows a box located on the desired path, aligned with the path, and moving along the path at a selected distance ahead of the aircraft, was examined. Vertical and lateral displacements from the desired path and aircraft altitude information are used as inputs to the computer. A preliminary simulation study with pilot subjects has shown that the pilots find the display very easy to use, and they achieved better performance scores with the box display than with a cross pointer instrument landing display.

  2. Variable uptake feature of focal nodular hyperplasia in Tc-99m phytate hepatic scintigraphy/single-photon emission computed tomography-A parametric analysis.

    PubMed

    Hsu, Yu-Ling; Chen, Yu-Wen; Lin, Chia-Yang; Lai, Yun-Chang; Chen, Shinn-Cherng; Lin, Zu-Yau

    2015-12-01

    Tc-99m phytate hepatic scintigraphy remains the standard method for evaluating the functional features of Kupffer cells. In this study, we demonstrate the variable uptake feature of focal nodular hyperplasia (FNH) in Tc-99m phytate scintigraphy. We reviewed all patients who underwent Tc-99m phytate hepatic scintigraphy between 2008 and 2012 in Kaohsiung Medical University Hospital, Kaohsiung, Taiwan. Cases with FNH were diagnosed on the basis of pathology or at least one or more prior imaging with a periodic clinical follow-up. All patients received a standard protocol of dynamic flow study and planar and Tc-99m phytate single-photon emission computed tomography (E. CAM; Siemens). The correlation of variable nodular radioactivity with parameters such as tumor size and localization was analyzed. In total, 15 lesions of 14 patients in the clinic were diagnosed as FNH. The tumor size was approximately 2.9-7.4 cm (mean size 4.6 cm). Four lesions were larger than 5 cm. The major anatomic distribution was in the right hepatic lobe (10 lesions), particularly in the superior segments (7 lesions). Tc-99m phytate single-photon emission computed tomography imaging for determining the functional features of Kupffer cells included cool/cold (8 lesions), isoradioactive/warm (6 lesions), and hot (1 lesion) patterns of uptake. We did not observe any statistically significant correlation between variable nodular radioactivity and tumor size (p=0.68) or localization (p=0.04). Herein, we demonstrate the variable uptake feature of FNH in Tc-99m phytate scintigraphy. In small FNH tumors (< 5 cm), increased or equal uptake still provided specificity for the differential diagnosis of hepatic solid tumors.

  3. Preliminary Computational Analysis of the (HIRENASD) Configuration in Preparation for the Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Chwalowski, Pawel; Florance, Jennifer P.; Heeg, Jennifer; Wieseman, Carol D.; Perry, Boyd P.

    2011-01-01

    This paper presents preliminary computational aeroelastic analysis results generated in preparation for the first Aeroelastic Prediction Workshop (AePW). These results were produced using FUN3D software developed at NASA Langley and are compared against the experimental data generated during the HIgh REynolds Number Aero-Structural Dynamics (HIRENASD) Project. The HIRENASD wind-tunnel model was tested in the European Transonic Windtunnel in 2006 by Aachen University's Department of Mechanics with funding from the German Research Foundation. The computational effort discussed here was performed (1) to obtain a preliminary assessment of the ability of the FUN3D code to accurately compute physical quantities experimentally measured on the HIRENASD model and (2) to translate the lessons learned from the FUN3D analysis of HIRENASD into a set of initial guidelines for the first AePW, which includes test cases for the HIRENASD model and its experimental data set. This paper compares the computational and experimental results obtained at Mach 0.8 for a Reynolds number of 7 million based on chord, corresponding to the HIRENASD test conditions No. 132 and No. 159. Aerodynamic loads and static aeroelastic displacements are compared at two levels of the grid resolution. Harmonic perturbation numerical results are compared with the experimental data using the magnitude and phase relationship between pressure coefficients and displacement. A dynamic aeroelastic numerical calculation is presented at one wind-tunnel condition in the form of the time history of the generalized displacements. Additional FUN3D validation results are also presented for the AGARD 445.6 wing data set. This wing was tested in the Transonic Dynamics Tunnel and is commonly used in the preliminary benchmarking of computational aeroelastic software.

  4. Micro- and nano-X-ray computed-tomography: A step forward in the characterization of the pore network of a leached cement paste

    SciTech Connect

    Bossa, Nathan; Chaurand, Perrine; Vicente, Jérôme; Borschneck, Daniel; Levard, Clément; Aguerre-Chariol, Olivier; Rose, Jérôme

    2015-01-15

    The pore structure of leached cement pastes (w/c = 0.5) was studied for the first time from the micro-scale down to the nano-scale by combining micro- and nano-X-ray computed tomography (micro- and nano-CT). This allowed assessment of the 3D heterogeneity of the pore network along the cement profile (from the core to the altered layer) over almost the entire range of cement pore sizes, i.e., from capillary to gel pores. We successfully quantified an increase of porosity in the altered layer at both resolutions. Porosity increases from 1.8 to 6.1% at the micro-scale (voxel = 1.81 μm) and from 18 to 58% at the nano-scale (voxel = 63.5 nm). The combination of both CT modalities circumvented weaknesses inherent in each investigation scale. In addition, the connectivity and channel size of the pore network were also evaluated to obtain a complete 3D pore network characterization at both scales.
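
    Once the reconstructed volume is segmented into pore and solid voxels, porosity at a given resolution reduces to a voxel-count ratio; a minimal sketch on a synthetic volume (the threshold and data are illustrative; only the 1.81 μm voxel size is taken from the abstract):

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic 8-bit "reconstructed" volume standing in for a CT stack;
        # in practice this would be loaded from the scanner's reconstruction.
        volume = rng.integers(0, 256, size=(200, 200, 200), dtype=np.uint8)

        PORE_THRESHOLD = 60           # illustrative grey-value threshold for pores
        pore_mask = volume < PORE_THRESHOLD

        porosity = pore_mask.sum() / pore_mask.size
        voxel_size_um = 1.81          # micro-CT voxel edge length quoted above (um)
        print(f"porosity at {voxel_size_um} um voxels: {100 * porosity:.1f} %")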

  5. Validation of the Australian diagnostic reference levels for paediatric multi detector computed tomography: a comparison of RANZCR QUDI data and subsequent NDRLS data from 2012 to 2015.

    PubMed

    Anna, Hayton; Wallace, Anthony; Thomas, Peter

    2017-03-01

    The national diagnostic reference level service (NDRLS) was launched in 2011; however, no paediatric data were submitted during its first calendar year of operation. As such, Australian national diagnostic reference levels (DRLs) for paediatric multidetector computed tomography (MDCT) were established using data obtained from a Royal Australian and New Zealand College of Radiologists (RANZCR) Quality Use of Diagnostic Imaging (QUDI) study. Paediatric data were submitted to the NDRLS in 2012 through 2015. An analysis has been made of the NDRLS paediatric data using the same method as was used to analyse the QUDI data to establish the Australian national paediatric DRLs for MDCT. An analysis of the paediatric NDRLS data has also been made using the method used to calculate the Australian national adult DRLs for MDCT. A comparison between the QUDI data and subsequent NDRLS data shows the NDRLS data to be lower on average for the head and AbdoPelvis protocols and similar for the chest protocol. Using an average of NDRLS data submitted between 2012 and 2015, implications for updated paediatric DRLs are considered.

  6. Comparative evaluation of soft and hard tissue dimensions in the anterior maxilla using radiovisiography and cone beam computed tomography: A pilot study

    PubMed Central

    Mallikarjun, Savita; Babu, Harsha Mysore; Das, Sreedevi; Neelakanti, Abhilash; Dawra, Charu; Shinde, Sachin Vaijnathrao

    2016-01-01

    Aims: To assess and compare the thickness of gingiva in the anterior maxilla using radiovisiography (RVG) and cone beam computed tomography (CBCT) and its correlation with the thickness of underlying alveolar bone. Settings and Design: This cross-sectional study included 10 male subjects in the age group of 20–45 years. Materials and Methods: After analyzing the width of keratinized gingiva of the maxillary right central incisor, the radiographic assessment was done using a modified technique for RVG and CBCT, to measure the thickness of both the labial gingiva and labial plate of alveolar bone at 4 predetermined locations along the length of the root in each case. Statistical Analysis Used: Statistical analysis was performed using Student's t-test and Pearson's correlation test, with the help of statistical software (SPSS V13). Results: No statistically significant differences were obtained in the measurement made using RVG and CBCT. The results of the present study also failed to reveal any significant correlation between the width of gingiva and the alveolar bone in the maxillary anterior region. Conclusions: Within the limitations of this study, it can be concluded that both CBCT and RVG can be used as valuable tools in the assessment of the soft and hard tissue dimensions. PMID:27143830

  7. INCIDENTAL AND NONINCIDENTAL CANINE THYROID TUMORS ASSESSED BY MULTIDETECTOR ROW COMPUTED TOMOGRAPHY: A SINGLE-CENTRE CROSS SECTIONAL STUDY IN 4520 DOGS.

    PubMed

    Bertolini, Giovanna; Drigo, Michele; Angeloni, Luca; Caldin, Marco

    2017-02-09

    Thyroid nodules are common in dogs and are increasingly likely to be detected with the increased use of advanced imaging modalities. An unsuspected, nonpalpable, asymptomatic lesion, defined as a thyroid incidentaloma, may be discovered on an imaging study unrelated to the thyroid gland. The objective of this single-center cross-sectional study was to assess the prevalence and computed tomography (CT) characteristics of incidental and nonincidental thyroid tumors in a large population of dogs, using prospective recruitment of patients undergoing CT examination for various reasons during the period of 2005-2015. Unilateral or bilateral thyroid masses were detected in 96/4520 dogs (prevalence, 2.12%; 95% confidence interval [CI], 1.70-2.54%). Seventy-nine (82.3%) lesions were malignant and 17 (17.7%) were benign. Masses were discovered incidentally in 34/96 dogs (overall prevalence of incidentaloma, 0.76%; 95% CI, 0.51-1.02), and 24 (70.6%) of these 34 masses were thyroid carcinomas. Among the CT variables assessed, mineralization, vascular invasion, and tissue invasion were detected only in malignant tumors. Intratumoral vascularization was significantly associated with the presence of thyroid malignancy (P < 0.001). Although incidental thyroid nodules in dogs are relatively rare, they are often malignant. Findings indicated that the neck should be thoroughly assessed in middle-aged and old patients undergoing body CT for various reasons. Thyroid nodules detected incidentally on CT should be sampled to avoid missing thyroid cancer.
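
    The prevalence estimates and confidence intervals quoted above are consistent with a normal-approximation (Wald) interval; a quick check (the first interval is reproduced exactly, the second to within rounding):

        from math import sqrt

        def wald_ci(k: int, n: int, z: float = 1.96):
            """Normal-approximation 95% CI for a proportion k/n."""
            p = k / n
            half_width = z * sqrt(p * (1 - p) / n)
            return p - half_width, p + half_width

        for label, k in (("any thyroid mass", 96), ("incidentaloma", 34)):
            lo, hi = wald_ci(k, 4520)
            print(f"{label}: {100*k/4520:.2f}% (95% CI {100*lo:.2f}-{100*hi:.2f}%)")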

  8. Radiation doses for pregnant women in the late pregnancy undergoing fetal-computed tomography: a comparison of dosimetry and Monte Carlo simulations.

    PubMed

    Matsunaga, Yuta; Kawaguchi, Ai; Kobayashi, Masanao; Suzuki, Shigetaka; Suzuki, Shoichi; Chida, Koichi

    2016-09-19

    The purposes of this study were (1) to compare the radiation doses for 320- and 80-row fetal-computed tomography (CT), estimated using thermoluminescent dosimeters (TLDs) and the ImPACT Calculator (hereinafter referred to as the "CT dosimetry software"), for a woman in her late pregnancy and her fetus and (2) to estimate the overlapped fetal radiation dose from a 320-row CT examination using two different estimation methods of the CT dosimetry software. The direct TLD data in the present study were obtained from a previous study. The exposure parameters used for TLD measurements were entered into the CT dosimetry software, and the appropriate radiation dose for the pregnant woman and her fetus was estimated. When the whole organs (e.g., the colon, small intestine, and ovaries) and the fetus were included in the scan range, the difference in the estimated doses between the TLD measurement and the CT dosimetry software measurement was <1 mGy (<23 %) in both CT units. In addition, when the whole organs were within the scan range, the CT dosimetry software was used for evaluating the fetal radiation dose and organ-specific doses for the woman in the late pregnancy. The conventional method using the CT dosimetry software cannot take into account the overlap between volumetric sections. Therefore, the conventional method using a 320-row CT unit in a wide-volume mode might result in the underestimation of radiation doses for the fetus and the colon, small intestine, and ovaries.

  9. Fluorodeoxyglucose positron emission tomography-computed tomography: a novel approach for the diagnosis of cholecystitis for equivocal diagnoses after ultrasound imaging.

    PubMed

    Nasseri, Yosef; Ourian, Ariel J; Waxman, Alan; D'Angolo, Alessandro; Thomson, Louise E; Margulies, Daniel R

    2012-10-01

    Although hepatobiliary iminodiacetic acid (HIDA) scan is often used when the diagnosis of cholecystitis remains questionable after ultrasound, it carries a high false-positive rate and has other limitations. Fluorodeoxyglucose positron emission tomography-computed tomography (18FDG PET-CT) has recently gained enthusiasm for its ability to detect infection and inflammation. In this study, we evaluate the accuracy of 18FDG PET-CT in diagnosing cholecystitis. Nineteen patients with suspected cholecystitis (Group S) underwent PET-CT and 10 had positive PET-CT findings. Of these 10, nine underwent cholecystectomies, and pathology confirmed cholecystitis in all nine. One patient was managed nonoperatively as a result of multiple comorbidities. Of the nine patients with negative PET-CT, six were managed nonoperatively, safely discharged, and had no readmissions at 3-month follow-up. The other three patients with negative PET-CT underwent cholecystectomies, and two showed no cholecystitis on pathology. The third had mild to moderate cholecystitis with focal mucosal erosion/ulceration without gallbladder wall thickening on pathology. 18FDG PET-CT detected gallbladder inflammation in all but one patient with pathology-proven cholecystitis with a sensitivity and specificity of 0.90 and 1.00, respectively. 18FDG-PET-CT appears to be a promising, rapid, direct, and accurate test in diagnosing cholecystitis and could replace HIDA scan in cases that remain equivocal after ultrasound.

  10. Non-invasive Assessment of Lower Limb Geometry and Strength Using Hip Structural Analysis and Peripheral Quantitative Computed Tomography: A Population-Based Comparison.

    PubMed

    Litwic, A E; Clynes, M; Denison, H J; Jameson, K A; Edwards, M H; Sayer, A A; Taylor, P; Cooper, C; Dennison, E M

    2016-02-01

    Hip fracture is the most significant complication of osteoporosis in terms of mortality, long-term disability and decreased quality of life. In recent years, different techniques have been developed to assess lower limb strength and ultimately fracture risk. Here we examine relationships between two measures of lower limb bone geometry and strength: proximal femoral geometry and tibial peripheral quantitative computed tomography. We studied a sample of 431 women and 488 men aged 59-71 years. The hip structural analysis (HSA) programme was employed to measure the structural geometry of the left hip for each DXA scan obtained using a Hologic QDR 4500 instrument, while pQCT measurements of the tibia were obtained using a Stratec 2000 instrument in the same population. We observed strong sex differences in proximal femoral geometry at the narrow neck, intertrochanteric and femoral shaft regions. There were significant associations between pQCT-derived measures of bone geometry (tibial width, endocortical diameter and cortical thickness) and bone strength (strength-strain index) and each corresponding HSA variable (all p < 0.001) in both men and women. These results demonstrate strong correlations between two different methods of assessment of lower limb bone strength: HSA and pQCT. Validation in prospective cohorts to study associations of each with incident fracture is now indicated.

  11. Analysis of C-shaped canal systems in mandibular second molars using surgical operating microscope and cone beam computed tomography: A clinical approach

    PubMed Central

    Chhabra, Sanjay; Yadav, Seema; Talwar, Sangeeta

    2014-01-01

    Aims: This study aimed to acquire a better understanding of C-shaped canal systems in mandibular second molar teeth through a clinical approach using sophisticated techniques such as the surgical operating microscope and cone beam computed tomography (CBCT). Materials and Methods: A total of 42 extracted mandibular second molar teeth with fused roots and longitudinal grooves were collected randomly from a native Indian population. The pulp chamber floors of all specimens were examined under a surgical operating microscope and classified into four types (Min's method). Subsequently, samples were subjected to a CBCT scan after insertion of K-files (size #10 or 15) into each canal orifice and evaluated using the cross-sectional and 3-dimensional images in consultation with a dental radiologist so as to obtain more accurate results. The minimum distance between the external root surface on the groove and the initial file placed in the canal was also measured at different levels and statistically analyzed. Results: Of the 42 teeth, the largest number of samples (15) belonged to the Type II category. A total of 100 files were inserted into 86 orifices of the various types of specimens. Evaluation of the CBCT scan images revealed that a total of 21 canals were missing completely or partially at different levels. The mean values for the minimum thickness were highest at the coronal level, followed by the middle and apical third levels, in all categories. The lowest values were obtained for teeth in the Type III category at all three levels. Conclusions: The present study revealed anatomical variations of the C-shaped canal system in mandibular second molars. The prognosis of such complex canal anatomies can be improved by the simultaneous employment of modern techniques such as the surgical operating microscope and CBCT. PMID:24944447

  12. A pilot study evaluating shaved cavity margins with micro-computed tomography: a novel method for predicting lumpectomy margin status intraoperatively.

    PubMed

    Tang, Rong; Coopey, Suzanne B; Buckley, Julliette M; Aftreth, Owen P; Fernandez, Leopoldo J; Brachtel, Elena F; Michaelson, James S; Gadd, Michele A; Specht, Michelle C; Koerner, Frederick C; Smith, Barbara L

    2013-01-01

    Microscopically clear lumpectomy margins are essential in breast conservation, as involved margins increase local recurrence. Currently, 18-50% of lumpectomies have close or positive margins that require re-excision. We assessed the ability of micro-computed tomography (micro-CT) to evaluate lumpectomy shaved cavity margins (SCM) intraoperatively to determine if this technology could rapidly identify margin involvement by tumor and reduce re-excision rates. Twenty-five SCM from six lumpectomies were evaluated with a Skyscan 1173 tabletop micro-CT scanner (Skyscan, Belgium), scanning three SCM at a time with a 7-minute protocol, and micro-CT results were compared to histopathological results. Images of the SCM were evaluated for radiographic signs of breast cancer including clustered microcalcifications and spiculated masses. SCM were negative by micro-CT in 19/25 (76%) and negative (≥2 mm) by histopathology in 19/25 (76%). Margin status by micro-CT was concordant with histopathology in 23/25 (92%). Micro-CT overestimated margin involvement in 1/25 and underestimated margin involvement in 1/25. Micro-CT had an 83.3% positive predictive value, a 94.7% negative predictive value, 83.3% sensitivity, and 94.7% specificity for evaluation of SCM. Evaluation of SCM by micro-CT is an accurate and promising method of intraoperative margin assessment in breast cancer patients. The scanning time required is short enough to permit real-time feedback to the operating surgeon, allowing immediate directed re-excision.
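
    The diagnostic indices quoted can be recomputed from the 2x2 counts implied by the abstract (5 true-positive, 1 false-positive, 1 false-negative and 18 true-negative margins); a quick check:

        def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
            """Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv":         tp / (tp + fp),
                "npv":         tn / (tn + fn),
            }

        # Counts implied by the abstract: micro-CT called 6 margins positive (1 falsely)
        # and 19 negative (1 falsely), with histopathology as the reference standard.
        for name, value in diagnostic_metrics(tp=5, fp=1, fn=1, tn=18).items():
            print(f"{name}: {100 * value:.1f}%")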

  13. Preliminary design methods for fiber reinforced composite structures employing a personal computer

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1986-01-01

    The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
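
    The classical-lamination-theory step described above can be illustrated with a short, self-contained sketch. This is not the NASA program itself; the ply properties, layup and thickness below are hypothetical example values, and only the effective in-plane moduli of a symmetric laminate are computed.

```python
# Minimal sketch: effective in-plane moduli of a laminate via classical
# lamination theory (CLT). Ply properties below are illustrative only.
import numpy as np

def q_matrix(E1, E2, G12, nu12):
    """Reduced stiffness matrix Q of a unidirectional ply (plane stress)."""
    nu21 = nu12 * E2 / E1
    denom = 1.0 - nu12 * nu21
    return np.array([[E1 / denom, nu12 * E2 / denom, 0.0],
                     [nu12 * E2 / denom, E2 / denom, 0.0],
                     [0.0, 0.0, G12]])

def q_bar(Q, theta_deg):
    """Transform Q to laminate axes for a ply at angle theta (Qbar = T^-1 Q R T R^-1)."""
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    T = np.array([[c * c, s * s, 2 * c * s],
                  [s * s, c * c, -2 * c * s],
                  [-c * s, c * s, c * c - s * s]])
    R = np.diag([1.0, 1.0, 2.0])   # tensor-to-engineering strain conversion
    return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

def effective_moduli(plies, t_ply):
    """Effective Ex, Ey of a symmetric laminate of equal-thickness plies."""
    A = sum(q_bar(q_matrix(*p["props"]), p["angle"]) * t_ply for p in plies)
    h = t_ply * len(plies)
    a = np.linalg.inv(A)           # in-plane compliance of the laminate
    return 1.0 / (h * a[0, 0]), 1.0 / (h * a[1, 1])

# Hypothetical carbon/epoxy ply: E1, E2, G12 in GPa, nu12 dimensionless.
props = (140.0, 10.0, 5.0, 0.3)
layup = [{"angle": a, "props": props} for a in (0, 45, -45, 90, 90, -45, 45, 0)]
Ex, Ey = effective_moduli(layup, t_ply=0.125)   # ply thickness in mm
print(f"Effective Ex = {Ex:.1f} GPa, Ey = {Ey:.1f} GPa")
```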

  14. A preliminary transient-fault experiment on the SIFT computer system

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Elks, Carl R.

    1987-01-01

    This paper presents the results of a preliminary experiment to study the effectiveness of a fault-tolerant system's ability to handle transient faults. The primary goal of the experiment was to develop the techniques to measure the parameters needed for a reliability analysis of the SIFT computer system which includes the effects of transient faults. A key aspect of such an analysis is the determination of the effectiveness of the operating system's ability to discriminate between transient and permanent faults. A detailed description of the preliminary transient fault experiment, along with the results from 297 transient fault injections, is given. Although not enough data was obtained to draw statistically significant conclusions, the foundation has been laid for a large-scale transient fault experiment.

  15. In vivo bioprinting for computer- and robotic-assisted medical intervention: preliminary study in mice.

    PubMed

    Keriquel, Virginie; Guillemot, Fabien; Arnault, Isabelle; Guillotin, Bertrand; Miraux, Sylvain; Amédée, Joëlle; Fricain, Jean-Christophe; Catros, Sylvain

    2010-03-01

    We present the first attempt to apply bioprinting technologies in the perspective of computer-assisted medical interventions. A workstation dedicated to high-throughput biological laser printing has been designed. Nano-hydroxyapatite (n-HA) was printed in the mouse calvaria defect model in vivo. Critical size bone defects were performed in OF-1 male mice calvaria with a 4 mm diameter trephine. Prior to laser printing experiments, the absence of inflammation due to laser irradiation onto mice dura mater was shown by means of magnetic resonance imaging. Procedures for in vivo bioprinting and results obtained using decalcified sections and x-ray microtomography are discussed. Although heterogeneous, these preliminary results demonstrate that in vivo bioprinting is possible. Bioprinting may prove to be helpful in the future for medical robotics and computer-assisted medical interventions.

  16. The Square Kilometre Array Science Data Processor. Preliminary compute platform design

    NASA Astrophysics Data System (ADS)

    Broekema, P. C.; van Nieuwpoort, R. V.; Bal, H. E.

    2015-07-01

    The Square Kilometre Array is a next-generation radio-telescope, to be built in South Africa and Western Australia. It is currently in its detailed design phase, with procurement and construction scheduled to start in 2017. The SKA Science Data Processor is the high-performance computing element of the instrument, responsible for producing science-ready data. This is a major IT project, with the Science Data Processor expected to challenge the computing state-of-the art even in 2020. In this paper we introduce the preliminary Science Data Processor design and the principles that guide the design process, as well as the constraints to the design. We introduce a highly scalable and flexible system architecture capable of handling the SDP workload.

  17. Computer-based cognitive intervention for dementia: preliminary results of a randomized clinical trial.

    PubMed

    Galante, E; Venturini, G; Fiaccadori, C

    2007-01-01

    Dementia is a highly invalidating condition and, given the progressive aging of the population, one of the major issues that health systems will have to face in future years. Recently there has been an increase in the potential of diagnostic tools and pharmacological treatments for dementia; moreover, considerable interest has been expressed regarding non-pharmacological interventions. However, the current evidence in support of non-pharmacological treatments in patients affected by dementia still does not allow definitive conclusions to be drawn on the most effective treatment to apply, largely because of methodological difficulties and limitations of the studies carried out so far, due to the complex nature of the disease. To address this need, we carried out a single-blind randomized controlled study on the efficacy of computer-based cognitive rehabilitation in patients with mild cognitive decline. Here we present preliminary data on 11 patients with a diagnosis of Alzheimer's Disease (AD) and mild cognitive decline randomly assigned to the treatment (a) or control (b) condition (i.e. specific vs. nonspecific treatment). The specific treatment (a) consisted of a cycle of 12 individual sessions of computer exercises, while the control condition (b) consisted of sessions of semi-structured interviews with patients, conducted with the same frequency and over the same time period as (a). Cognitive, behavioural and functional assessment was performed by an expert evaluator, blinded to the patients' group allocation. Preliminary results show a significant performance decline only in the control group at the 9-month follow-up compared to both baseline and the 3-month follow-up. Our results suggest that computer-based cognitive training in patients with AD and mild cognitive decline is effective at least in delaying the continuous progression of cognitive impairment in AD.

  18. On the computation of preliminary orbits for Earth satellites with radar observations

    NASA Astrophysics Data System (ADS)

    Gronchi, G. F.; Dimare, L.; Bracali Cioci, D.; Ma, H.

    2015-08-01

    We introduce a new method to perform preliminary orbit determination for satellites on low Earth orbits (LEO). This method works with tracks of radar observations: each track is composed of n ≥ 4 topocentric position vectors per pass of the satellite, taken at very short time intervals. We assume very accurate values for the range ρ, while the angular positions (i.e. the line of sight, given by the pointing of the antenna) are less accurate. We wish to correct the errors in the angular positions already at the stage of computing a preliminary orbit. With the information contained in a pair of radar tracks, using the laws of the two-body dynamics, we can write eight equations in eight unknowns. The unknowns are the components of the topocentric velocity orthogonal to the line of sight at the two mean epochs of the tracks, and the corrections Δ to be applied to the angular positions. We take advantage of the fact that the components of Δ are typically small. We show the results of some tests, performed with simulated observations, and compare this method with Gibbs' and the Keplerian integrals methods.
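
    For comparison purposes, the classical Gibbs method mentioned above can be sketched in a few lines: given three (nearly) coplanar geocentric position vectors on the same orbit, it returns the velocity at the middle epoch from two-body dynamics. The position vectors below are hypothetical placeholders, not actual radar-derived data.

```python
# Minimal sketch of Gibbs' classical method: recover the velocity at the
# middle observation from three coplanar geocentric position vectors.
import numpy as np

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def gibbs(r1, r2, r3):
    """Velocity (km/s) at the middle position r2; r1, r2, r3 in km,
    assumed (nearly) coplanar and ordered in time."""
    r1, r2, r3 = map(np.asarray, (r1, r2, r3))
    n1, n2, n3 = map(np.linalg.norm, (r1, r2, r3))
    D = np.cross(r1, r2) + np.cross(r2, r3) + np.cross(r3, r1)
    N = n1 * np.cross(r2, r3) + n2 * np.cross(r3, r1) + n3 * np.cross(r1, r2)
    S = r1 * (n2 - n3) + r2 * (n3 - n1) + r3 * (n1 - n2)
    # v2 follows from the two-body dynamics linking the three positions
    return np.sqrt(MU / (np.linalg.norm(N) * np.linalg.norm(D))) * \
           (np.cross(D, r2) / n2 + S)

# Hypothetical, coplanar LEO position vectors (km); real inputs would come
# from the radar tracks described in the abstract.
r1 = [7000.0, 0.0, 0.0]
r2 = [6900.0, 1200.0, 100.0]
r3 = [6700.0, 2400.0, 200.0]
print("v2 (km/s):", gibbs(r1, r2, r3))
```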

  19. Preliminary efficacy of a computer-delivered HIV prevention intervention for African American teenage females.

    PubMed

    Klein, Charles H; Card, Josefina J

    2011-12-01

    This study translated SiHLE (Sisters Informing, Healing, Living, and Empowering), a 12-hour Centers for Disease Control and Prevention evidence-based group-level intervention for African American females 14-18 years of age, into a 2-hour computer-delivered individual-level intervention. A randomized controlled trial (n = 178) was conducted to examine the efficacy of the new Multimedia SiHLE intervention. Average condom-protected sex acts (proportion of vaginal sex acts with condoms, last 90 days) for sexually active participants receiving Multimedia SiHLE rose from M = 51% at baseline to M = 71% at 3-month follow-up (t = 2.06, p = .05); no statistically significant difference was found in the control group. Non-sexually active intervention group participants reported a significant increase in condom self-efficacy (t = 2.36, p = .02); no statistically significant difference was found in the control group. The study provides preliminary support for the efficacy of a computer-delivered adaptation of a proven HIV prevention program for African American teenage women. This is consistent with meta-analyses that have shown that computer-delivered interventions, which can often be disseminated at lower per-capita cost than human-delivered interventions, can influence HIV risk behaviors in positive fashion.

  20. Computer-assisted intraosseous anaesthesia for molar and incisor hypomineralisation teeth. A preliminary study.

    PubMed

    Cabasse, C; Marie-Cousin, A; Huet, A; Sixou, J L

    2015-03-01

    Anesthetizing MIH (Molar and Incisor Hypomineralisation) teeth is one of the major challenges in paediatric dentistry. Computer-assisted intraosseous (IO) injection (CAIO) of 4% articaine with 1:200,000 epinephrine (Alphacaine, Septodont) has been shown to be an efficient way to anesthetize teeth in children. The aim of this study was to assess the efficacy of this method with MIH teeth. This preliminary study was performed using the Quick Sleeper system (Dental Hi Tec, Cholet, France) that allows computer-controlled rotation of the needle to penetrate the bone and computer-controlled injection of the anaesthetic solution. Thirty-nine patients of the Department of Paediatric Dentistry were included, allowing 46 sessions (including 32 mandibular first permanent molars) to be assessed. CAIO showed efficacy in 93.5% (43/46) of cases. Failures (3) were due to the inability to reach the spongy bone (1) or to achieve anaesthesia (2). This prospective study confirms that CAIO anaesthesia is a promising method for anesthetizing teeth with MIH and could therefore be used routinely by trained practitioners.

  1. Computational analysis and preliminary redesign of the nozzle contour of the Langley hypersonic CF4 tunnel

    NASA Technical Reports Server (NTRS)

    Thompson, R. A.; Sutton, Kenneth

    1987-01-01

    A computational analysis, modification, and preliminary redesign study was performed on the nozzle contour of the Langley Hypersonic CF4 Tunnel. This study showed that the existing nozzle was contoured incorrectly for the design operating condition, and this error was shown to produce the measured disturbances in the exit flow field. A modified contour was designed for the current nozzle downstream of the maximum turning point that would provide a uniform exit flow. New nozzle contours were also designed for an exit Mach number and Reynolds number combination which matches that attainable in the Langley 20-Inch Mach 6 Tunnel. Two nozzle contours were designed: one having the same exit radius but a larger mass flow rate than that of the existing CF4 Tunnel, and the other having the same mass flow rate but a smaller exit radius than that of the existing CF4 Tunnel.

  2. Development of an online automatic computed radiography dose data mining program: a preliminary study.

    PubMed

    Ng, Curtise K C; Sun, Zhonghua

    2010-01-01

    Recent studies have reported the computed radiography (CR) dose creep problem and therefore the need to have monitoring processes in place in clinical departments. The objective of this study is to provide a better technological solution to implement a regular CR dose monitoring process. An online automatic CR dose data mining program which can be applied to different systems was developed based on freeware and existing software on the Picture Archiving and Communication System (PACS) server. The program was tested with 69 CR images. This preliminary study shows that the program addresses the major weaknesses of some existing studies, including the involvement of manual procedures in the monitoring process and applicability to only a single manufacturer's CR images. The proposed method provides an efficient and effective solution to implement a CR dose monitoring program regularly in busy clinical departments to regulate the dose creep problem so as to reinforce the 'As Low As Reasonably Achievable' (ALARA) principle.
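
    A hedged sketch of the kind of automated harvesting such a program performs is shown below. It is not the authors' program; it simply reads exposure-related DICOM header tags from CR files with pydicom and writes them to a CSV log. Tag availability and meaning vary by CR vendor, and the file paths are hypothetical.

```python
# Hedged sketch: harvest exposure-related DICOM tags from CR images so a
# department can trend dose indicators over time.
import csv
from pathlib import Path

import pydicom

def collect_dose_records(image_dir, out_csv):
    rows = []
    for path in Path(image_dir).glob("*.dcm"):
        # Header-only read; pixel data is not needed for dose monitoring.
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        rows.append({
            "file": path.name,
            "study_date": ds.get("StudyDate", ""),
            "body_part": ds.get("BodyPartExamined", ""),
            "kvp": ds.get("KVP", ""),
            "exposure_mAs": ds.get("Exposure", ""),
            "exposure_index": ds.get("ExposureIndex", ""),  # if the vendor provides it
            "sensitivity": ds.get("Sensitivity", ""),        # CR S-value, if provided
        })
    if not rows:
        return
    with open(out_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

# Hypothetical paths; in practice the images would be pulled from the PACS.
collect_dose_records("cr_images/", "cr_dose_log.csv")
```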

  3. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 8 2011-04-01 2011-04-01 false Election with respect to life insurance reserves... Provisions § 1.818-4 Election with respect to life insurance reserves computed on preliminary term basis. (a) In general. Section 818(c) permits a life insurance company issuing contracts with respect to...

  4. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 8 2013-04-01 2013-04-01 false Election with respect to life insurance reserves... Provisions § 1.818-4 Election with respect to life insurance reserves computed on preliminary term basis. (a) In general. Section 818(c) permits a life insurance company issuing contracts with respect to...

  5. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 8 2014-04-01 2014-04-01 false Election with respect to life insurance reserves... Provisions § 1.818-4 Election with respect to life insurance reserves computed on preliminary term basis. (a) In general. Section 818(c) permits a life insurance company issuing contracts with respect to...

  6. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... Section 818(c) permits a life insurance company issuing contracts with respect to which the life insurance... section 801 (relating to the definition of a life insurance company). If such an election is made, the... computed on preliminary term basis. If a life insurance company makes an election under section 818(c)...

  7. A preliminary study on the short-term efficacy of chairside computer-aided design/computer-assisted manufacturing- generated posterior lithium disilicate crowns.

    PubMed

    Reich, Sven; Fischer, Sören; Sobotta, Bernhard; Klapper, Horst-Uwe; Gozdowski, Stephan

    2010-01-01

    The purpose of this preliminary study was to evaluate the clinical performance of chairside-generated crowns over a preliminary time period of 24 months. Forty-one posterior crowns made of a machinable lithium disilicate ceramic for full-contour crowns were inserted in 34 patients using a chairside computer-aided design/computer-assisted manufacturing technique. The crowns were evaluated at baseline and after 6, 12, and 24 months according to modified United States Public Health Service criteria. After 2 years, all reexamined crowns (n = 39) were in situ; one abutment exhibited secondary caries and two abutments received root canal treatment. Within the limited observation period, the crowns revealed clinically satisfying results.

  8. Computer-aided detection of HER2 amplification status using FISH images: a preliminary study

    NASA Astrophysics Data System (ADS)

    Zheng, Bin; Wang, Xiao-Hui; Surti, Urvashi; Bhargava, Rohit; Gur, David

    2009-02-01

    The amplification status of human epidermal growth factor receptors 2 (HER2) genes is strongly associated with clinical outcome in patients with breast cancer. The American Society of Clinical Oncology Tumor Marker Guidelines Panel has recommended routine testing of HER2 status on all newly diagnosed metastatic breast cancers since 2001. Although fluorescent in situ hybridization (FISH) technology provides superior accuracy as compared with other approaches, current manual FISH analysis methods are somewhat subjective, tedious, and may introduce interreader variability. The goal of this preliminary study is to develop and test a computer-aided detection (CAD) scheme to assess HER2 status using FISH images. Forty FISH images were selected for this study from our genetic laboratory. The CAD scheme first applies an adaptive, iterative threshold method followed by a labeling algorithm to segment cells of possible interest. A set of classification rules is then used to identify analyzable interphase cells and discard nonanalyzable cells due to cell overlapping and/or other image staining debris (or artifacts). The scheme then maps the detected analyzable cells onto two other gray scale images corresponding to the red and green color of the original image followed by application of a raster scan and labeling algorithms to separately detect the HER-2/neu ("red") and CEP17 ("green") FISH signals. A simple distance based criterion is applied to detect and merge split FISH signals within each cell. The CAD scheme computes the ratio between independent "red" and "green" FISH signals of all analyzable cells identified on an image. If the ratio is >= 2.0, the FISH image is assumed to have been acquired from a HER2+ case; otherwise, the FISH image is assumed to have been acquired from HER2- case. When we applied the CAD scheme to the testing dataset, the average computed HER2 amplification ratios were 1.06+/-0.25 and 2.53+/-0.81 for HER2- and HER2+ samples, respectively. The
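
    The final ratio step of such a scheme can be sketched as follows, assuming the analyzable cells have already been segmented into a boolean mask. This is a simplified illustration using scikit-image, not the authors' CAD scheme; the split-signal merging and cell classification rules described above are omitted.

```python
# Hedged sketch of the final ratio step: count red (HER2) and green (CEP17)
# FISH spots inside already-segmented analyzable cells and classify the image.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label

def count_spots(channel, cell_mask):
    """Count connected bright spots of one colour channel inside the cells."""
    vals = channel[cell_mask]
    spots = (channel > threshold_otsu(vals)) & cell_mask
    return label(spots).max()          # number of connected components

def her2_status(rgb_image, cell_mask):
    red, green = rgb_image[..., 0], rgb_image[..., 1]
    n_her2 = count_spots(red, cell_mask)
    n_cep17 = count_spots(green, cell_mask)
    ratio = n_her2 / max(n_cep17, 1)
    return ratio, ("HER2+" if ratio >= 2.0 else "HER2-")

# Usage (hypothetical arrays): rgb_image is a float RGB FISH image and
# cell_mask a boolean mask of the analyzable interphase cells.
# ratio, status = her2_status(rgb_image, cell_mask)
```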

  9. Wolter X-Ray Microscope Computed Tomography Ray-Trace Model with Preliminary Simulation Results

    SciTech Connect

    Jackson, J A

    2006-02-27

    It is proposed to build a Wolter X-ray Microscope Computed Tomography System in order to characterize objects to sub-micrometer resolution. Wolter Optics Systems use hyperbolic, elliptical, and/or parabolic mirrors to reflect x-rays in order to focus or magnify an image. Wolter Optics have been used as telescopes and as microscopes. As microscopes they have been used for a number of purposes such as measuring emission x-rays and x-ray fluorescence of thin biological samples. Standard Computed Tomography (CT) Systems use 2D radiographic images, from a series of rotational angles, acquired by passing x-rays through an object to reconstruct a 3D image of the object. The x-ray paths in a Wolter X-ray Microscope will be considerably different from those of a standard CT system. There is little information about the 2D radiographic images that can be expected from such a system. There are questions about the quality, resolution and focusing range of an image created with such a system. It is not known whether characterization information can be obtained from these images and whether these 2D images can be reconstructed to 3D images of the object. A code has been developed to model the 2D radiographic image created by an object in a Wolter X-ray Microscope. This code simply follows the x-ray through the object and optics. There is no modeling at this point of other effects, such as scattering, reflection losses etc. Any object, of appropriate size, can be used in the model code. A series of simulations using a number of different objects was run to study the effects of the optics. The next step will be to use this model to reconstruct an object from the simulated data. Funding for the project ended before this goal could be accomplished. The following documentation includes: (1) background information on current X-ray imaging systems, (2) background on Wolter Optics, (3) description of the Wolter System being used, (4) purpose, limitations and development of the modeling
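
    The core of such a ray-following model, stripped of the Wolter optics, is the accumulation of attenuation line integrals through the object. The sketch below simulates parallel-beam radiographs of a voxelised test object via the Beer-Lambert law; it is a simplified stand-in, not the reported code, and the test object and voxel size are hypothetical.

```python
# Hedged sketch: simulate 2D radiographs of a voxelised object as attenuation
# line integrals (Beer-Lambert), ignoring the Wolter optics, scattering and
# reflection losses, as the abstract's model also does.
import numpy as np
from scipy import ndimage

def radiograph(mu_volume, angle_deg, voxel_size):
    """Parallel-beam projection of a 3D attenuation map (1/mm) at one angle."""
    rotated = ndimage.rotate(mu_volume, angle_deg, axes=(1, 2),
                             reshape=False, order=1)
    path_integral = rotated.sum(axis=2) * voxel_size   # integrate along the rays
    return np.exp(-path_integral)                       # transmitted fraction

# Hypothetical test object: a 64^3 volume containing an attenuating sphere.
z, y, x = np.mgrid[-32:32, -32:32, -32:32]
mu = np.where(x**2 + y**2 + z**2 < 20**2, 0.05, 0.0)   # 0.05 /mm inside sphere
images = [radiograph(mu, a, voxel_size=1.0) for a in range(0, 180, 15)]
```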

  10. Computer-assisted learning in anatomy at the international medical school in Debrecen, Hungary: a preliminary report.

    PubMed

    Kish, Gary; Cook, Samuel A; Kis, Gréta

    2013-01-01

    The University of Debrecen's Faculty of Medicine has an international, multilingual student population with anatomy courses taught in English to all but Hungarian students. An elective computer-assisted gross anatomy course, the Computer Human Anatomy (CHA), has been taught in English at the Anatomy Department since 2008. This course focuses on an introduction to anatomical digital images along with clinical cases. This low-budget course has a large visual component using images from magnetic resonance imaging and computer axial tomogram scans, ultrasound clinical studies, and readily available anatomy software that presents topics which run in parallel to the university's core anatomy curriculum. From the combined computer images and CHA lecture information, students are asked to solve computer-based clinical anatomy problems in the CHA computer laboratory. A statistical comparison was undertaken of core anatomy oral examination performances of English program first-year medical students who took the elective CHA course and those who did not in the three academic years 2007-2008, 2008-2009, and 2009-2010. The results of this study indicate that the CHA-enrolled students improved their performance on required anatomy core curriculum oral examinations (P < 0.001), suggesting that computer-assisted learning may play an active role in anatomy curriculum improvement. These preliminary results have prompted ongoing evaluation of what specific aspects of CHA are valuable and which students benefit from computer-assisted learning in a multilingual and diverse cultural environment.

  11. Preliminary validation of a new methodology for estimating dose reduction protocols in neonatal chest computed radiographs

    NASA Astrophysics Data System (ADS)

    Don, Steven; Whiting, Bruce R.; Hildebolt, Charles F.; Sehnert, W. James; Ellinwood, Jacquelyn S.; Töpfer, Karin; Masoumzadeh, Parinaz; Kraus, Richard A.; Kronemer, Keith A.; Herman, Thomas; McAlister, William H.

    2006-03-01

    The risk of radiation exposure is greatest for pediatric patients and, thus, there is a great incentive to reduce the radiation dose used in diagnostic procedures for children to "as low as reasonably achievable" (ALARA). Testing of low-dose protocols presents a dilemma, as it is unethical to repeatedly expose patients to ionizing radiation in order to determine optimum protocols. To overcome this problem, we have developed a computed-radiography (CR) dose-reduction simulation tool that takes existing images and adds synthetic noise to create realistic images that correspond to images generated with lower doses. The objective of our study was to determine the extent to which simulated, low-dose images corresponded with original (non-simulated) low-dose images. To make this determination, we created pneumothoraces of known volumes in five neonate cadavers and obtained images of the neonates at 10 mR, 1 mR and 0.1 mR (as measured at the cassette plate). The 10-mR exposures were considered "relatively-noise-free" images. We used these 10-mR images and our simulation tool to create simulated 0.1- and 1-mR images. For the simulated and original images, we identified regions of interest (ROI) of the entire chest, free-in-air region, and liver. We compared the means and standard deviations of the ROI grey-scale values of the simulated and original images with paired t tests. We also had observers rate simulated and original images for image quality and for the presence or absence of pneumothoraces. There were no statistically significant differences in grey-scale-value means or standard deviations between the simulated and original entire-chest ROIs. The observer performance suggests that an exposure >=0.2 mR is required to detect the presence or absence of pneumothoraces. These preliminary results indicate that the use of the simulation tool is promising for achieving ALARA exposures in children.
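
    One common way to build such a dose-reduction simulation, sketched below under the assumption of quantum-limited (Poisson) noise and a linearised detector response, is to rescale the high-dose image to the expected quanta at the target dose and resample it. This is a generic illustration, not the authors' validated tool; the gain constant is a hypothetical calibration value.

```python
# Hedged sketch: simulate a lower-dose CR image from a high-dose ("relatively
# noise-free") acquisition by rescaling to expected quanta and resampling with
# Poisson noise.
import numpy as np

def simulate_low_dose(image_high, dose_ratio, gain=1.0, rng=None):
    """image_high: linearised high-dose image (values proportional to exposure).
    dose_ratio: target dose / original dose, e.g. 1 mR from 10 mR -> 0.1.
    gain: hypothetical detector calibration (quanta per pixel value)."""
    rng = np.random.default_rng() if rng is None else rng
    quanta = np.clip(image_high * gain * dose_ratio, 0, None)  # expected counts
    noisy_quanta = rng.poisson(quanta).astype(float)           # resample noise
    return noisy_quanta / (gain * dose_ratio)                  # back to original scale

# Usage with a hypothetical 10 mR image to emulate a 1 mR acquisition:
# low_dose = simulate_low_dose(img_10mR, dose_ratio=1.0 / 10.0, gain=5.0)
```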

  12. Reducing Foreign Language Communication Apprehension with Computer-Mediated Communication: A Preliminary Study

    ERIC Educational Resources Information Center

    Arnold, Nike

    2007-01-01

    Many studies (e.g., [Beauvois, M.H., 1998. "E-talk: Computer-assisted classroom discussion--attitudes and motivation." In: Swaffar, J., Romano, S., Markley, P., Arens, K. (Eds.), "Language learning online: Theory and practice in the ESL and L2 computer classroom." Labyrinth Publications, Austin, TX, pp. 99-120; Bump, J., 1990. "Radical changes in…

  13. How Well Can a Computer Program Teach German Culture? Some Preliminary Findings from EthnoDeutsch.

    ERIC Educational Resources Information Center

    Ashby, Wendy; Ostertag, Veronica

    2002-01-01

    Investigates the effectiveness of an interactive, computer-mediated instructional segment designed to educate students about ethnicity in German-speaking countries. Fifty-two intermediate German students worked with computer-mediated segments and rated the segments' effectiveness on a Likert-scale questionnaire. (AS)

  14. Preliminary Computational Fluid Dynamics (CFD) Simulation of EIIB Push Barge in Shallow Water

    NASA Astrophysics Data System (ADS)

    Beneš, Petr; Kollárik, Róbert

    2011-12-01

    This study presents a preliminary CFD simulation of the EIIb push barge in inland conditions using the CFD software Ansys Fluent. RANSE (Reynolds-Averaged Navier-Stokes Equation) methods are used to solve the viscous, turbulent flow around the ship hull. Different RANSE methods are compared in ship resistance calculations in order to select appropriate methods and discard inappropriate ones. The study further describes the creation of a geometrical model that reflects the exact water-depth-to-draft ratio in shallow water conditions, grid generation, the setup of the mathematical model in Fluent, and the evaluation of the simulation results.

  15. Preliminary study of the use of the STAR-100 computer for transonic flow calculations

    NASA Technical Reports Server (NTRS)

    Keller, J. D.; Jameson, A.

    1977-01-01

    An explicit method for solving the transonic small-disturbance potential equation is presented. This algorithm, which is suitable for the new vector-processor computers such as the CDC STAR-100, is compared to successive line over-relaxation (SLOR) on a simple test problem. The convergence rate of the explicit scheme is slower than that of SLOR; however, the efficiency of the explicit scheme on the STAR-100 computer is sufficient to overcome the slower convergence rate and allows an overall speedup compared to SLOR on the CYBER 175 computer.
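
    The convergence-rate trade-off reported above can be illustrated on a model problem. The toy comparison below pits a fully explicit (Jacobi) sweep against point successive over-relaxation on a small Laplace problem; point SOR stands in for SLOR here, and the grid, boundary values and relaxation factor are arbitrary choices, not the transonic solver's.

```python
# Toy comparison: explicit (Jacobi) sweeps converge more slowly than
# successive over-relaxation, but each sweep vectorizes trivially.
import numpy as np

def relax(n=64, omega=1.8, tol=1e-6, max_iter=20000, explicit=False):
    u = np.zeros((n, n))
    u[0, :] = 1.0                      # simple Dirichlet boundary condition
    for it in range(max_iter):
        old = u.copy()
        if explicit:                   # Jacobi: all updates use old values
            u[1:-1, 1:-1] = 0.25 * (old[:-2, 1:-1] + old[2:, 1:-1] +
                                    old[1:-1, :-2] + old[1:-1, 2:])
        else:                          # point SOR: sweep and update in place
            for i in range(1, n - 1):
                for j in range(1, n - 1):
                    gs = 0.25 * (u[i-1, j] + u[i+1, j] + u[i, j-1] + u[i, j+1])
                    u[i, j] += omega * (gs - u[i, j])
        if np.max(np.abs(u - old)) < tol:
            return it + 1
    return max_iter

print("Jacobi iterations:", relax(explicit=True))
print("SOR iterations:   ", relax(explicit=False))
```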

  16. Protocol Analysis of Man-Computer Languages: Design and Preliminary Findings

    DTIC Science & Technology

    1975-07-01

    Keywords: application-oriented language design, man-computer language design, man-machine communication, message processing. Appendices: C - Keyword Language Form; D - Positional Language Form; E - English-like Language Form; F - Display Handouts. Given a population of users and a category of applications, our problem is to find an on-line computer language that will…

  17. The difference between playing games with and without the computer: a preliminary view.

    PubMed

    Antonietti, Alessandro; Mellone, Rosa

    2003-03-01

    The authors address the question of whether associations between video games and cognitive and metacognitive variables depend either on the features of the computer or on the content of the game that the computer allows one to play. An experiment to separate these two kinds of effects was carried out by using a traditional version and a computer-supported version of Pegopolis, a solitaire game. The two versions were exactly the same except that they were played by moving pieces either on a real board or on a virtual computer-presented board. The performance levels and strategies followed during the game by the 40 undergraduates who took part in the experiment were not significantly different in the real and virtual conditions. None of the participants transferred playing strategies or practice from one version of the game to the other. Scores were not affected by gender or by the studies pursued by participants, the habit of playing games in the traditional manner or playing video games, or intelligence. Retrospective reports did not support differences in the subjective experience between the two versions. Results showed that video games, when they do not make much use of the computer's special features, produce effects because of the situations they simulate rather than because of features of the computer itself.

  18. Comparison of different methods to compute a preliminary orbit of Space Debris using radar observations

    NASA Astrophysics Data System (ADS)

    Ma, Hélène; Gronchi, Giovanni F.

    2014-07-01

    We present a new method of preliminary orbit determination for space debris using radar observations, which we call Infang. We can perform a linkage of two sets of four observations collected at close times. The context is characterized by the accuracy of the range ρ, whereas the right ascension α and the declination δ are much more inaccurate due to observational errors. This method can correct α, δ, assuming the exact knowledge of the range ρ. Considering no perturbations from the J2 effect, but including errors in the observations, we can compare the new method, the classical method of Gibbs, and the more recent Keplerian integrals method. The development of Infang is still ongoing; the method will be further improved and tested.

  19. Preliminary Computational Study for Future Tests in the NASA Ames 9 foot x 7 foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Pearl, Jason M.; Carter, Melissa B.; Elmiligui, Alaa A.; WInski, Courtney S.; Nayani, Sudheer N.

    2016-01-01

    The NASA Advanced Air Vehicles Program, Commercial Supersonics Technology Project seeks to advance tools and techniques to make over-land supersonic flight feasible. In this study, preliminary computational results are presented for future tests in the NASA Ames 9 foot x 7 foot supersonic wind tunnel to be conducted in early 2016. Shock-plume interactions and their effect on pressure signature are examined for six model geometries. Near-field pressure signatures are assessed using the CFD code USM3D to model the proposed test geometries in free-air. Additionally, results obtained using the commercial grid generation software Pointwise (registered trademark) are compared to results using VGRID, the NASA Langley Research Center in-house mesh generation program.

  20. Psychological underpinnings of intrafamilial computer-mediated communication: a preliminary exploration of CMC uptake with parents and siblings.

    PubMed

    Goby, Valerie Priscilla

    2011-06-01

    This preliminary study investigates the uptake of computer-mediated communication (CMC) with parents and siblings, an area on which no research appears to have been conducted. Given the lack of relevant literature, grounded theory methodology was used and online focus group discussions were conducted in an attempt to generate suitable hypotheses for further empirical studies. Codification of the discussion data revealed various categories of meaning, namely: a perceived inappropriateness of CMC with members of family of origin; issues relating to the family generational gap; the nature of the offline sibling/parent relationship; the non-viability of online affordances such as planned self-disclosure, deception, identity construction; and disinhibition in interactions with family-of-origin members. These themes could be molded into hypotheses to assess the psychosocial limitations of CMC and to determine if it can indeed become a ubiquitous alternative to traditional communication modes as some scholars have claimed.

  1. Computer-mediated communication and the Gallaudet University community: a preliminary report.

    PubMed

    Hogg, Nanette M; Lomicky, Carol S; Weiner, Stephen F

    2008-01-01

    The study examined the use of computer-mediated communication (CMC) among individuals involved in a conflict sparked by the appointment of an administrator as president-designate of Gallaudet University in 2006. CMC was defined as forms of communication used for transmitting (sharing) information through networks with digital devices. There were 662 survey respondents. Respondents reported overwhelmingly (98%) that they used CMC to communicate. Students and alumni reported CMC use in larger proportions than any other group. The favorite devices among all respondents were Sidekicks, stationary computers, and laptops. Half of all respondents also reported using some form of video device. Nearly all reported using e-mail; respondents also identified Web surfing, text messaging, and blogging as popular CMC activities. The authors plan another article reporting on computer and electronic technology use as a mechanism connecting collective identity to social movements.

  2. SIFT - A preliminary evaluation. [Software Implemented Fault Tolerant computer for aircraft control

    NASA Technical Reports Server (NTRS)

    Palumbo, D. L.; Butler, R. W.

    1983-01-01

    This paper presents the results of a performance evaluation of the SIFT computer system conducted in the NASA AIRLAB facility. The essential system functions are described and compared to both earlier design proposals and subsequent design improvements. The functions supporting fault tolerance are found to consume significant computing resources. With SIFT's specimen task load, scheduled at a 30-Hz rate, the executive tasks, such as reconfiguration, clock synchronization and interactive consistency, require 55 percent of the available task slots. Other system overhead (e.g., voting and scheduling) uses an average of 50 percent of each remaining task slot.

  3. Computer-Mediated Communication and the Gallaudet University Community: A Preliminary Report

    ERIC Educational Resources Information Center

    Hogg, Nanette M.; Lomicky, Carol S.; Weiner, Stephen F.

    2008-01-01

    The study examined the use of computer-mediated communication (CMC) among individuals involved in a conflict sparked by the appointment of an administrator as president-designate of Gallaudet University in 2006. CMC was defined to comprise forms of communication used for transmitting (sharing) information through networks with digital devices.…

  4. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    SciTech Connect

    Helton, J.C.; Anderson, D.R.; Baker, B.L.

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs.
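
    The remark that each compliance probability amounts to a large numerical integration can be illustrated with a Monte Carlo stand-in: sample the uncertain inputs, run a performance model, and count the fraction of outcomes below the regulatory threshold. The performance model, input distributions and threshold below are hypothetical placeholders, not the WIPP models.

```python
# Hedged illustration: estimating a compliance probability as a Monte Carlo
# integral over uncertain inputs.
import numpy as np

rng = np.random.default_rng(0)

def normalized_release(permeability, solubility, seal_quality):
    """Hypothetical stand-in for the system performance model."""
    return permeability * solubility / (1.0 + 5.0 * seal_quality)

def compliance_probability(n_samples=100_000, threshold=1.0):
    k = rng.lognormal(mean=-1.0, sigma=0.5, size=n_samples)   # permeability
    s = rng.lognormal(mean=0.0, sigma=0.3, size=n_samples)    # solubility
    q = rng.uniform(0.0, 1.0, size=n_samples)                 # seal quality
    releases = normalized_release(k, s, q)
    return np.mean(releases < threshold)   # fraction of compliant outcomes

print("Estimated compliance probability:", compliance_probability())
```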

  5. Integrating Computer Algebra Systems in Post-Secondary Mathematics Education: Preliminary Results of a Literature Review

    ERIC Educational Resources Information Center

    Buteau, Chantal; Marshall, Neil; Jarvis, Daniel; Lavicza, Zsolt

    2010-01-01

    We present results of a literature review pilot study (326 papers) regarding the use of Computer Algebra Systems (CAS) in tertiary mathematics education. Several themes that have emerged from the review are discussed: diverse uses of CAS, benefits to student learning, issues of integration and mathematics learning, common and innovative usage of…

  6. Monitor Tone Generates Stress in Computer and VDT Operators: A Preliminary Study.

    ERIC Educational Resources Information Center

    Dow, Caroline; Covert, Douglas C.

    A near-ultrasonic pure tone of 15,570 Hertz generated by flyback transformers in computer and video display terminal (VDT) monitors may cause severe non-specific irritation or stress disease in operators. Women hear higher frequency sounds than men and are twice as sensitive to "too loud" noise. Pure tones at high frequencies are more…

  7. Computer code for preliminary sizing analysis of axial-flow turbines

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    This mean diameter flow analysis uses a stage average velocity diagram as the basis for the computational efficiency. Input design requirements include power or pressure ratio, flow rate, temperature, pressure, and rotative speed. Turbine designs are generated for any specified number of stages and for any of three types of velocity diagrams (symmetrical, zero exit swirl, or impulse) or for any specified stage swirl split. Exit turning vanes can be included in the design. The program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, flow angles, and last stage absolute and relative Mach numbers. An analysis is presented along with a description of the computer program input and output with sample cases. The analysis and code presented herein are modifications of those described in NASA-TN-D-6702. These modifications improve modeling rigor and extend code applicability.
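
    The stage-average velocity-diagram approach rests on the Euler turbine work equation; a minimal sketch is given below. The blade speed, stage loading and mass flow are hypothetical example numbers, and the actual code's loss and efficiency correlations are not represented.

```python
# Hedged sketch of the Euler work relation underlying a mean-diameter,
# stage-average velocity-diagram analysis. Example numbers are hypothetical.
import math

def stage_specific_work(U, c_u_in, c_u_out=0.0):
    """Euler turbine equation: w = U * (c_u_in - c_u_out), in J/kg.
    A zero-exit-swirl diagram corresponds to c_u_out = 0."""
    return U * (c_u_in - c_u_out)

mean_diameter = 0.6                              # m
speed_rpm = 12000.0
U = math.pi * mean_diameter * speed_rpm / 60.0   # mean blade speed, m/s
w = stage_specific_work(U, c_u_in=1.6 * U)       # assumed stage loading ~1.6
print(f"U = {U:.0f} m/s, specific work = {w / 1000:.0f} kJ/kg")
print(f"Power for 20 kg/s flow: {20.0 * w / 1e6:.1f} MW")
```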

  8. High-accuracy computation of Delta V magnitude probability densities - Preliminary remarks

    NASA Technical Reports Server (NTRS)

    Chadwick, C.

    1986-01-01

    This paper describes an algorithm for the high-accuracy computation of some statistical quantities of the magnitude of a random trajectory correction maneuver (TCM). The trajectory correction velocity increment Delta V is assumed to be a three-component random vector with each component being a normally distributed random scalar having a possibly nonzero mean. Knowledge of the statistical properties of the magnitude of a random TCM is important in the planning and execution of maneuver strategies for deep-space missions such as Galileo. The current algorithm involves the numerical integration of a set of differential equations. This approach allows the computation of density functions for specific Delta V magnitude distributions to high accuracy without first having to generate large numbers of random samples. Possible applications of the algorithm to maneuver planning, planetary quarantine evaluation, and guidance success probability calculations are described.
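
    As a point of contrast, the brute-force alternative that the algorithm is designed to avoid, generating large numbers of random samples, is easy to sketch and remains useful as a cross-check. The mean vector and covariance below are hypothetical.

```python
# Hedged sketch: brute-force Monte Carlo estimate of the Delta V magnitude
# distribution for a 3-component Gaussian maneuver error. The abstract's
# algorithm obtains the density by numerical integration instead.
import numpy as np

rng = np.random.default_rng(1)

mean = np.array([1.0, 0.5, 0.0])        # hypothetical mean Delta V components, m/s
cov = np.diag([0.2, 0.1, 0.05]) ** 2    # hypothetical (diagonal) covariance

samples = rng.multivariate_normal(mean, cov, size=200_000)
magnitudes = np.linalg.norm(samples, axis=1)

print("mean |dV| =", magnitudes.mean(), "m/s")
print("99th percentile |dV| =", np.percentile(magnitudes, 99), "m/s")
```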

  9. An Interactive Computer Program for the Preliminary Design and Analysis of Marine Reduction Gears.

    DTIC Science & Technology

    1982-03-01

    axis and simple epicyclic reduction gears. Conclusions and Recommendations: Computer aided design (CAD) is an important and useful tool for engineers. As computer technology continues to expand, CAD …

  10. Cone-Beam Computed Tomography Evaluation of Mental Foramen Variations: A Preliminary Study

    PubMed Central

    Sheikhi, Mahnaz; Karbasi Kheir, Mitra; Hekmatian, Ehsan

    2015-01-01

    Background. The mental foramen is important in surgical operations involving the premolars because it transmits the mental nerve and vessels. This study evaluated the variations of the mental foramen by cone-beam computed tomography in a selected Iranian population. Materials and Methods. A total of 180 cone-beam computed tomography projections were analyzed in terms of shape, size, direction, and horizontal and vertical positions of the mental foramen on the right and left sides. Results. The most common shape was oval, the most common opening direction was posterior-superior, the most common horizontal position was in line with the second premolar, and the most common vertical position was apical to the adjacent dental root. The mean foramen diameter was 3.59 mm. Conclusion. In addition to the most common types of mental foramen, other variations exist as well. This underlines the significance of preoperative radiographic examinations, especially 3-dimensional imaging, to prevent nerve damage. PMID:26609432

  11. Preliminary assessment of Tongue Drive System in medium term usage for computer access and wheelchair control.

    PubMed

    Yousefi, Behnaz; Huo, Xueliang; Ghovanloo, Maysam

    2011-01-01

    The Tongue Drive System (TDS) is a wireless, wearable assistive technology that enables individuals with severe motor impairments to access computers, drive wheelchairs, and control their environments using tongue motion. In this paper, we have evaluated the TDS performance as a computer input device using ISO 9241-9 standard tasks for pointing and selecting, based on the well-known Fitts' law, and as a powered wheelchair controller through an obstacle course navigation task. Nine able-bodied subjects who already had tongue piercings participated in this trial over 5 sessions during 5 weeks, allowing us to study the TDS learning process and its current limiting factors. Subjects wore titanium tongue rings in the form of a barbell with a small rare-earth magnetic tracer hermetically sealed inside the upper ball. Comparing the results of the 1st and 5th sessions showed that subjects' performance improved on all measures through the 5 sessions, demonstrating the effects of learning.
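
    The ISO 9241-9 style summary statistic behind such pointing evaluations is Fitts'-law throughput; a minimal sketch (Shannon formulation) is shown below. The target geometry and movement times are hypothetical, and the effective-width correction often used in full analyses is omitted.

```python
# Hedged sketch of the Fitts'-law throughput computation commonly used with
# ISO 9241-9 pointing tasks (Shannon formulation).
import math

def throughput(distance, width, movement_time_s):
    """Index of difficulty ID = log2(D/W + 1) bits; throughput = ID / MT."""
    index_of_difficulty = math.log2(distance / width + 1.0)
    return index_of_difficulty / movement_time_s

# One hypothetical condition: 400-pixel target distance, 40-pixel width,
# with session-by-session mean movement times in seconds.
times = [1.9, 1.7, 1.6, 1.5, 1.4]
for session, mt in enumerate(times, start=1):
    print(f"session {session}: throughput = {throughput(400, 40, mt):.2f} bits/s")
```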

  12. A Comparison between the Occurrence of Pauses, Repetitions and Recasts under Conditions of Face-to-Face and Computer-Mediated Communication: A Preliminary Study

    ERIC Educational Resources Information Center

    Cabaroglu, Nese; Basaran, Suleyman; Roberts, Jon

    2010-01-01

    This study compares pauses, repetitions and recasts in matched task interactions under face-to-face and computer-mediated conditions. Six first-year English undergraduates at a Turkish University took part in Skype-based voice chat with a native speaker and face-to-face with their instructor. Preliminary quantitative analysis of transcripts showed…

  13. Preliminary results of very fast computation of Moment Magnitude and focal mechanism in the context of tsunami warning

    NASA Astrophysics Data System (ADS)

    Schindelé, François; Roch, Julien; Rivera, Luis

    2015-04-01

    Various methodologies have recently been developed to compute the moment magnitude and the focal mechanism, thanks to real-time access to numerous broad-band seismic data. Several methods were implemented at the CENALT, in particular the W-phase method developed by H. Kanamori and L. Rivera. For earthquakes of magnitudes in the range 6.5-9.0, this method provides accurate results in less than 40 minutes. In the context of tsunami warning in the Mediterranean, a small basin that can be impacted in less than one hour and where some sources, although small, have high tsunami potential (Boumerdes 2003), a comprehensive tsunami warning system should include very fast computation of the seismic parameters. The value of Mw, the focal depth and the type of fault (reverse, normal, strike-slip) are the most relevant parameters for tsunami warning. Preliminary results will be presented using data from the North-eastern and Mediterranean region for the recent period 2010-2014. This work is funded by the project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), FP7-ENV2013 6.4-3, Grant 603839.
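
    Once a W-phase (or other) inversion yields the scalar seismic moment M0, the moment magnitude follows from the standard relation below (IASPEI convention, M0 in N·m); the example moment is hypothetical.

```python
# The standard moment-magnitude relation used to turn a scalar seismic moment
# M0 into Mw once a source inversion has been performed.
import math

def moment_magnitude(m0_newton_metre):
    """Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m (IASPEI convention)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metre) - 9.1)

# Hypothetical example: M0 = 3.5e19 N*m gives Mw of roughly 7.0.
print(f"Mw = {moment_magnitude(3.5e19):.2f}")
```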

  14. Functionalized synchrotron in-line phase-contrast computed tomography: a novel approach for simultaneous quantification of structural alterations and localization of barium-labelled alveolar macrophages within mouse lung samples.

    PubMed

    Dullin, Christian; dal Monego, Simeone; Larsson, Emanuel; Mohammadi, Sara; Krenkel, Martin; Garrovo, Chiara; Biffi, Stefania; Lorenzon, Andrea; Markus, Andrea; Napp, Joanna; Salditt, Tim; Accardo, Agostino; Alves, Frauke; Tromba, Giuliana

    2015-01-01

    Functionalized computed tomography (CT) in combination with labelled cells is virtually non-existent due to the limited sensitivity of X-ray-absorption-based imaging, but would be highly desirable to realise cell tracking studies in entire organisms. In this study we applied in-line free propagation X-ray phase-contrast CT (XPCT) in an allergic asthma mouse model to assess structural changes as well as the biodistribution of barium-labelled macrophages in lung tissue. Alveolar macrophages that were barium-sulfate-loaded and fluorescent-labelled were instilled intratracheally into asthmatic and control mice. Mice were sacrificed after 24 h, lungs were kept in situ, inflated with air and scanned utilizing XPCT at the SYRMEP beamline (Elettra Synchrotron Light Source, Italy). Single-distance phase retrieval was used to generate data sets with ten times greater contrast-to-noise ratio than absorption-based CT (in our setup), thus allowing us to depict and quantify structural hallmarks of asthmatic lungs such as reduced air volume, obstruction of airways and increased soft-tissue content. Furthermore, we found a higher concentration as well as a specific accumulation of the barium-labelled macrophages in asthmatic lung tissue. It is believed that XPCT will be beneficial in preclinical asthma research both for the assessment of therapeutic response and for the analysis of the role of the recruitment of macrophages to inflammatory sites.

  15. Functionalized synchrotron in-line phase-contrast computed tomography: a novel approach for simultaneous quantification of structural alterations and localization of barium-labelled alveolar macrophages within mouse lung samples

    PubMed Central

    Dullin, Christian; dal Monego, Simeone; Larsson, Emanuel; Mohammadi, Sara; Krenkel, Martin; Garrovo, Chiara; Biffi, Stefania; Lorenzon, Andrea; Markus, Andrea; Napp, Joanna; Salditt, Tim; Accardo, Agostino; Alves, Frauke; Tromba, Giuliana

    2015-01-01

    Functionalized computed tomography (CT) in combination with labelled cells is virtually non-existent due to the limited sensitivity of X-ray-absorption-based imaging, but would be highly desirable to realise cell tracking studies in entire organisms. In this study we applied in-line free propagation X-ray phase-contrast CT (XPCT) in an allergic asthma mouse model to assess structural changes as well as the biodistribution of barium-labelled macrophages in lung tissue. Alveolar macrophages that were barium-sulfate-loaded and fluorescent-labelled were instilled intratracheally into asthmatic and control mice. Mice were sacrificed after 24 h, lungs were kept in situ, inflated with air and scanned utilizing XPCT at the SYRMEP beamline (Elettra Synchrotron Light Source, Italy). Single-distance phase retrieval was used to generate data sets with ten times greater contrast-to-noise ratio than absorption-based CT (in our setup), thus allowing us to depict and quantify structural hallmarks of asthmatic lungs such as reduced air volume, obstruction of airways and increased soft-tissue content. Furthermore, we found a higher concentration as well as a specific accumulation of the barium-labelled macrophages in asthmatic lung tissue. It is believed that XPCT will be beneficial in preclinical asthma research both for the assessment of therapeutic response and for the analysis of the role of the recruitment of macrophages to inflammatory sites. PMID:25537601

  16. X-ray phase computed tomography for nanoparticulated imaging probes and therapeutics: preliminary feasibility study

    NASA Astrophysics Data System (ADS)

    Tang, Xiangyang; Yang, Yi; Tang, Shaojie

    2011-03-01

    With the scientific progress in cancer biology, pharmacology and biomedical engineering, nano-biotechnology-based imaging probes and therapeutic agents (namely probes/agents) - a form of theranostics - are among the strategic solutions bearing the hope for the cure of cancer. The key feature distinguishing the nanoparticulated probes/agents from their conventional counterparts is their targeting capability. A large surface-to-volume ratio in nanoparticulated probes/agents enables the accommodation of multiple targeting, imaging and therapeutic components to cope with the intra- and inter-tumor heterogeneity. Most nanoparticulated probes/agents are synthesized with low atomic number materials and thus their x-ray attenuation is very similar to that of biological tissues. However, their microscopic structures are very different, which may result in significant differences in their refractive properties. Recently, investigations of x-ray grating-based differential phase-contrast (DPC) CT have demonstrated its advantages over conventional attenuation-based CT in differentiating low-atomic-number materials. We believe that a synergy of x-ray grating-based DPC CT and nanoparticulated imaging probes and therapeutic agents may play a significant role in extensive preclinical and clinical applications, or even become a modality for molecular imaging. Hence, we propose to image the refractive property of nanoparticulated imaging probes and therapeutic agents using x-ray grating-based DPC CT. In this work, we conduct a preliminary feasibility study focused on characterizing the contrast-to-noise ratio (CNR) and contrast-detail behavior of the x-ray grating-based DPC CT. The obtained data may be instructive to the architecture design and performance optimization of the x-ray grating-based DPC CT for imaging biomarker-targeted imaging probes and therapeutic agents, and even informative to the translation of preclinical research in theranostics into clinical applications.
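
    The contrast-to-noise ratio figure of merit characterized in this work can be stated compactly; a minimal sketch is given below, with hypothetical regions of interest taken from a reconstructed slice.

```python
# Minimal sketch of the contrast-to-noise ratio (CNR) figure of merit,
# computed from object and background ROIs of a reconstructed slice.
import numpy as np

def cnr(object_roi, background_roi):
    """CNR = |mean(object) - mean(background)| / std(background)."""
    return abs(object_roi.mean() - background_roi.mean()) / background_roi.std()

# Usage with hypothetical ROIs extracted from a phase-contrast reconstruction:
# value = cnr(slice_img[120:140, 120:140], slice_img[10:30, 10:30])
```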

  17. Synopsis of some preliminary computational studies related to unsaturated zone transport at Area G

    SciTech Connect

    Vold, E.

    1998-03-01

    Computational transport models are described with applications in three problem areas related to unsaturated zone moisture movement beneath Area G. These studies may be used to support the ongoing maintenance of the site Performance Assessment. The three areas include: a 1-D transient analysis with average tuff hydraulic properties in the near surface region with computed results compared to field data; the influence on near surface transient moisture percolation due to realistic distributions in hydraulic properties derived statistically from the observed variance in the field data; and the west to east moisture flow in a 2-D steady geometry approximation of the Pajarito Plateau. Results indicate that a simple transient model for transport of moisture volume fraction fits field data well compared to a moisture pulse observed in the active disposal unit, pit 37. Using realistic infiltration boundary conditions for summer showers and for spring snow melt conditions, the computed moisture pulses show significant propagation to less than 10-ft depth. Next, the hydraulic properties were varied on a 2-D grid using statistical distributions based on the field data means and variances for the hydraulic parameters. Near surface transient percolation in these conditions shows a qualitatively realistic percolation with a spatially variable wave front moving into the tuff; however, the flow does not channel into preferred paths and suggests there is no formation of fast paths which could enhance transportation of contaminants. Finally, moisture transport is modeled through an unsaturated 2-D slice representing the upper stratigraphic layers beneath Area G and a west-to-east cut of several miles to examine possible lateral movement from the west where percolation is assumed to be greater than at Area G. Results show some west-to-east moisture flux consistent with the assumed profile for the percolation boundary conditions.

  18. Applications of computer assisted surgery and medical robotics at the ISSSTE, México: preliminary results.

    PubMed

    Mosso, José Luis; Pohl, Mauricio; Jimenez, Juan Ramon; Valdes, Raquel; Yañez, Oscar; Medina, Veronica; Arambula, Fernando; Padilla, Miguel Angel; Marquez, Jorge; Gastelum, Alfonso; Mosso, Alejo; Frausto, Juan

    2007-01-01

    We present the first results of four projects from the second phase of the Mexican project on Computer Assisted Surgery and Medical Robotics, supported by the Mexican National Council of Science and Technology (Consejo Nacional de Ciencia y Tecnología) under grant SALUD-2002-C01-8181. The projects are being developed by three universities (UNAM, UAM, ITESM), and the goal of this project is to set up a laboratory in a hospital of the ISSSTE to provide services to endoscopic surgeons, urologists, gastrointestinal endoscopists and neurosurgeons.

  19. Feature Extraction on Brain Computer Interfaces using Discrete Dyadic Wavelet Transform: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Gareis, I.; Gentiletti, G.; Acevedo, R.; Rufiner, L.

    2011-09-01

    The purpose of this work is to evaluate different feature extraction alternatives for detecting the event-related evoked potential signal in brain-computer interfaces, seeking to minimize the time required and the classification error, in terms of the sensitivity and specificity of the method, and looking for alternatives to coherent averaging. In this context, the results obtained by performing the feature extraction with the discrete dyadic wavelet transform using different mother wavelets are presented. A single-layer perceptron was used for classification. The results obtained with and without the wavelet decomposition were compared, showing an improvement in the classification rate, specificity and sensitivity for the feature vectors obtained using some mother wavelets.
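
    A minimal sketch of the described pipeline, a discrete dyadic wavelet decomposition for features followed by a single-layer perceptron, is shown below using PyWavelets and scikit-learn. The mother wavelet, decomposition level and synthetic "epochs" are placeholders, not the study's EEG data or chosen settings.

```python
# Hedged sketch: dyadic wavelet coefficients of single-trial epochs as
# features, classified with a single-layer perceptron.
import numpy as np
import pywt
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

def wavelet_features(epoch, wavelet="db4", level=4):
    coeffs = pywt.wavedec(epoch, wavelet, level=level)
    return np.concatenate(coeffs)

# Synthetic stand-in data: 200 epochs of 256 samples with binary labels;
# label-1 epochs receive a crude "evoked potential" bump.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 256))
labels = rng.integers(0, 2, size=200)
epochs[labels == 1, 100:140] += 0.8

X = np.array([wavelet_features(e) for e in epochs])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = Perceptron(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```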

  20. A preliminary computer pattern analysis of satellite images of mature extratropical cyclones

    NASA Technical Reports Server (NTRS)

    Burfeind, Craig R.; Weinman, James A.; Barkstrom, Bruce R.

    1987-01-01

    This study has applied computerized pattern analysis techniques to the location and classification of features of several mature extratropical cyclones that were depicted in GOES satellite images. These features include the location of the center of the cyclone vortex core and the location of the associated occluded front. The cyclone type was classified in accord with the scheme of Troup and Streten. The present analysis was implemented on a personal computer; results were obtained within approximately one or two minutes without the intervention of an analyst.

  1. In-tank fluid sloshing effects during earthquakes: A preliminary computational simulation

    SciTech Connect

    Park, J.E.; Rezvani, M.A.

    1995-04-01

    Hundreds of underground radioactive waste storage tanks are located at Department of Energy (DOE) sites. At present, no technique for evaluating the pressure loads due to the impact of earthquake generated waves on the side walls and dome of the tanks is known if the wave breaks back on itself. This paper presents the results of two-dimensional Computational Fluid Dynamics (CFD) calculations of the motion of waves in a generic rectangular tank as the result of accelerations recorded during an earthquake. The advantages and limitations of this technique and methods for avoiding the limitations will be discussed.

  2. Group training with healthy computing practices to prevent repetitive strain injury (RSI): a preliminary study.

    PubMed

    Peper, Erik; Gibney, Katherine H; Wilson, Vietta E

    2004-12-01

    This pilot study investigated whether group training, in which participants become role models and coaches, would reduce discomfort as compared to a nontreatment Control Group. Sixteen experimental participants attended 6 weekly 2-hr group sessions of a Healthy Computing program, whereas 12 control participants received no training. None of the participants had reported symptoms to their supervisors, nor were they receiving medical treatment for repetitive strain injury prior to the program. The program included training in ergonomic principles, psychophysiological awareness and control, sEMG practice at the workstation, and coaching coworkers. Using two-tailed t tests to analyze the data, the Experimental Group reported (1) a significant overall reduction in most body symptoms as compared to the Control Group and (2) a significant increase in positive work-style habits, such as taking breaks at the computer, as compared to the Control Group. This study suggests that employees could improve health and work-style patterns through a holistic training program delivered in a group format and followed by individual practice.

  3. A Statistical Model and Computer program for Preliminary Calculations Related to the Scaling of Sensor Arrays

    SciTech Connect

    Max Morris

    2001-04-01

    Recent advances in sensor technology and engineering have made it possible to assemble many related sensors in a common array, often of small physical size. Sensor arrays may report an entire vector of measured values in each data collection cycle, typically one value per sensor per sampling time. The larger quantities of data provided by larger arrays certainly contain more information; however, experience suggests that in some cases dramatic increases in array size do not lead to corresponding improvements in the practical value of the data. The work leading to this report was motivated by the need to develop computational planning tools to approximate the relative effectiveness of arrays of different size (or scale) in a wide variety of contexts. The basis of the work is a statistical model of a generic sensor array. It includes features representing measurement error, both common to all sensors and independent from sensor to sensor, and the stochastic relationships between the quantities to be measured by the sensors. The model can be used to assess the effectiveness of hypothetical arrays in classifying objects or events from two classes. A computer program is presented for evaluating the misclassification rates that can be expected when arrays are calibrated using a given number of training samples, or the number of training samples required to attain a given level of classification accuracy. The program is also available via email from the first author for a limited time.
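
    The following sketch illustrates the kind of model the report describes: each sensor reading combines a class-dependent signal, an error component common to all sensors, and an independent per-sensor error, and the expected misclassification rate is estimated as a function of array size. All distributions, parameter values, and the use of a linear discriminant classifier are assumptions made here for illustration.

        # Sketch of a generic sensor-array model: class signal + common error +
        # independent per-sensor error, with misclassification rate vs. array size.
        # Parameters are illustrative assumptions, not values from the report.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(1)

        def simulate(n_sensors, n_train=100, n_test=2000,
                     delta=1.0, sigma_common=0.5, sigma_indep=2.0):
            def draw(n):
                labels = rng.integers(0, 2, n)
                signal = delta * labels[:, None] * np.ones(n_sensors)
                common = sigma_common * rng.standard_normal((n, 1))        # shared error
                indep = sigma_indep * rng.standard_normal((n, n_sensors))  # per-sensor error
                return signal + common + indep, labels

            X_tr, y_tr = draw(n_train)
            X_te, y_te = draw(n_test)
            clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
            return 1.0 - clf.score(X_te, y_te)   # misclassification rate

        for m in (1, 4, 16, 64):
            print(f"{m:3d} sensors -> error ~ {simulate(m):.3f}")

    Because the error component shared by all sensors never averages out, the simulated misclassification rate shows diminishing returns as the array grows, which is the kind of behaviour the report sets out to quantify.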

  4. The dissemination of computer-based psychological treatment: a preliminary analysis of patient and clinician perceptions.

    PubMed

    Carper, Matthew M; McHugh, R Kathryn; Barlow, David H

    2013-03-01

    Computerized cognitive behavioral therapy is an efficacious treatment for anxiety and depression with the potential to improve access to evidence-based care. However, its adoption in clinical practice in the US has been low and thus there is a need for identification of barriers to its use. We examined treatment-seeking patient (n = 55) and clinician (n = 26) perceptions of computer-based psychological treatment (CBPT) using Diffusion of Innovations theory as a conceptual framework. Diffusion of Innovations theory emphasizes potential adopter perceptions as being key to understanding adoption decisions, thus making it an ideal framework for evaluating barriers to use. Overall, treatment-seeking patients held slightly negative perceptions of CBPT, while clinicians' perceptions were more neutral. In both groups, perceptions of observability (seeing or hearing about the treatment in use) were rated lowest. Implications for dissemination efforts and suggestions for future research are discussed.

  5. Development of a Computer Program for Analyzing Preliminary Aircraft Configurations in Relationship to Emerging Agility Metrics

    NASA Technical Reports Server (NTRS)

    Bauer, Brent

    1993-01-01

    This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. This paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.

  6. Using Gender Schema Theory to Examine Gender Equity in Computing: a Preliminary Study

    NASA Astrophysics Data System (ADS)

    Agosto, Denise E.

    Women continue to constitute a minority of computer science majors in the United States and Canada. One possible contributing factor is that most Web sites, CD-ROMs, and other digital resources do not reflect girls' design and content preferences. This article describes a pilot study that considered whether gender schema theory can serve as a framework for investigating girls' Web site design and content preferences. Eleven 14- and 15-year-old girls participated in the study. The methodology included the administration of the Children's Sex-Role Inventory (CSRI), Web-surfing sessions, interviews, and data analysis using iterative pattern coding. On the basis of their CSRI scores, the participants were divided into feminine-high (FH) and masculine-high (MH) groups. Data analysis uncovered significant differences in the criteria the groups used to evaluate Web sites. The FH group favored evaluation criteria relating to graphic and multimedia design, whereas the MH group favored evaluation criteria relating to subject content. Models of the two groups' evaluation criteria are presented, and the implications of the findings are discussed.

  7. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk from criminal hackers. Presently, most vulnerability research uses data from software vendors and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e., vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data are always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
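
    A sketch of the distribution-fitting step named in the abstract (log-normal for arrivals, exponential for deletions) using SciPy follows; the synthetic samples and parameter values are placeholders rather than the authors' data.

        # Sketch: fit the two distribution families named in the abstract
        # (log-normal for arrivals, exponential for deletions) with SciPy.
        # The synthetic samples are placeholders for real "wild" vulnerability data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        arrivals = rng.lognormal(mean=2.0, sigma=0.7, size=500)   # hypothetical daily arrivals
        deletions = rng.exponential(scale=5.0, size=500)          # hypothetical daily deletions

        # Fit; fixing loc=0 keeps both fits in their standard parameterizations.
        shape, loc, scale = stats.lognorm.fit(arrivals, floc=0)
        lam = 1.0 / stats.expon.fit(deletions, floc=0)[1]

        print(f"log-normal fit: sigma={shape:.2f}, scale={scale:.2f}")
        print(f"exponential fit: rate={lam:.3f} per day")

        # Goodness-of-fit check (Kolmogorov-Smirnov) for each fitted model.
        print(stats.kstest(arrivals, "lognorm", args=(shape, loc, scale)))
        print(stats.kstest(deletions, "expon", args=(0, 1.0 / lam)))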

  8. Automatic, computer-based speech assessment on edentulous patients with and without complete dentures - preliminary results.

    PubMed

    Stelzle, F; Ugrinovic, B; Knipfer, C; Bocklet, T; Nöth, E; Schuster, M; Eitner, S; Seiss, M; Nkenke, E

    2010-03-01

    Dental rehabilitation of edentulous patients with complete dentures includes not only aesthetics and mastication of food, but also speech quality. The aim of this study was to introduce and validate a computer-based speech recognition system (ASR) for automatic speech assessment in edentulous patients after dental rehabilitation with complete dentures. To examine the impact of dentures on speech production, the speech outcome of edentulous patients with and without complete dentures was compared. Twenty-eight patients reading a standardized text were recorded twice - with and without their complete dentures in situ. A control group of 40 healthy subjects with natural dentition was recorded under the same conditions. Speech quality was evaluated by means of a polyphone-based ASR in terms of word accuracy (WA), expressed as a percentage. Speech acceptability assessment by expert listeners and the automatic rating of the WA by the ASR showed a high correlation (corr = 0.71). Word accuracy was significantly reduced in edentulous speakers (55.42 ± 13.1) compared to the control group's WA (69.79 ± 10.6). On the other hand, wearing complete dentures significantly increased the WA of the edentulous patients (60.00 ± 15.6). Speech production quality is significantly reduced after complete loss of teeth. Reconstitution of speech production quality is an important part of dental rehabilitation and can be improved for edentulous patients by means of complete dentures. The ASR has proven to be a useful and easily applicable tool for automatic speech assessment in a standardized way.
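
    Word accuracy in ASR evaluation is conventionally computed from a Levenshtein alignment of the recognized word sequence against the reference text as WA = (N - S - D - I)/N, where S, D, and I are substitutions, deletions, and insertions. The sketch below implements that standard definition; it is the usual formula rather than code from the study.

        # Sketch: word accuracy (WA) from a Levenshtein alignment of the recognized
        # word sequence against the reference text, WA = (N - S - D - I) / N.
        def word_accuracy(reference: str, hypothesis: str) -> float:
            ref, hyp = reference.split(), hypothesis.split()
            n, m = len(ref), len(hyp)
            # d[i][j] = minimal edit cost between ref[:i] and hyp[:j]
            d = [[0] * (m + 1) for _ in range(n + 1)]
            for i in range(n + 1):
                d[i][0] = i
            for j in range(m + 1):
                d[0][j] = j
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
                    d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
            # Edit distance lumps S + D + I together; WA can therefore go below 0.
            return (n - d[n][m]) / n

        print(word_accuracy("the patient reads a standardized text",
                            "the patient reads standardized text"))  # one deletion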

  9. A preliminary 3D computed tomography study of the human maxillary sinus and nasal cavity.

    PubMed

    Butaric, Lauren N; McCarthy, Robert C; Broadfield, Douglas C

    2010-11-01

    Despite centuries of investigation, the function of the maxillary sinus (MS) and underlying patterns governing its form remain elusive. In this study, we articulate a methodology for collecting volumetric data for the MS and nasal cavity (NC) from computed tomography (CT) scans and report details for a small sample of 39 dried human crania of known ecogeographic provenience useful for assessing variation in MS size and shape. We use scaling analyses to preliminarily test the hypothesis that volumes of the nasal cavity (NCV) and maxillary sinus (MSV) are inversely correlated such that the NC covaries with size of the face, whereas the MS "fills in" the leftover space [proposed by Shea: Am J Phys Anthropol 47 (1977):289-300]. Against expectation, MSV is not significantly correlated with NCV or any cranial size variable. NCV, on the other hand, scales isometrically with facial size. The results of this pilot study suggest that NCV covaries with facial size, but that the MS does not simply fill in the leftover space in the face. The role, if any, of the MSs in midfacial function and architecture remains unclear. Larger sample sizes, additional environmental variables, and assessment of MS and NC shape are necessary to resolve this issue.

  10. Preliminary validation of computational procedures for a new atmospheric ionizing radiation (AIR) model.

    PubMed

    Clem, John M; De Angelis, Giovanni; Goldhagen, Paul; Wilson, John W

    2003-01-01

    A new computational procedure to determine particle fluxes in the Earth's atmosphere is presented. The primary cosmic ray spectrum has been modeled through an analysis of simultaneous proton and helium measurements made on high altitude balloon flights and spacecraft. An improved global fit to the data was achieved by applying a unique technique utilizing the Fokker-Planck equation with a non-linear rigidity-dependent diffusion coefficient. The propagation of primary particles through the Earth's atmosphere is calculated with a three-dimensional Monte Carlo transport program called FLUKA. Primary protons and helium nuclei (alphas) are generated within the rigidity range of 0.5 GV-20 TV, distributed uniformly in cos²θ. For a given location, primaries above the effective cutoff rigidity are transported through the atmosphere. Alpha particles are initially transported with a separate package called HEAVY to simulate fragmentation. This package interfaces with FLUKA to provide interaction starting points for each nucleon originating from a helium nucleus. Results from this calculation are presented and compared to measurements.

  11. Effectiveness of systematic articulation training program accessing computers (SATPAC) approach to remediate dentalized and interdental /s, z/: a preliminary study.

    PubMed

    Flipsen, Peter; Sacks, Stephen; Neils-Strunjas, Jean

    2013-10-01

    Traditional methods for treating speech distortion errors in older school-age children have tended to yield mixed success. The current study was a preliminary evaluation of an alternative approach called the Systematic Articulation Training Program Accessing Computers (SATPAC), which was tested for the remediation of /s/ and /z/. Procedures involved a sequence of phonetic placement and/or oral-motor placement cues as needed to establish the targets, followed by concentrated drill structured around a facilitating-context nonsense word and then advanced to more natural contexts. Participants were 18 children aged 6 years, 9 months to 11 years, 10 months. Treatment involved individual 10-minute sessions once per week with an experienced speech-language pathologist. Group A (n = 9) received 15 weeks of treatment, while treatment was delayed for Group B (n = 9); the groups were then reversed. During period one, Group A (treated) significantly improved their accuracy of /s, z/ in spontaneous speech, while Group B (untreated) showed no significant change. During period two, Group B improved significantly when treatment was applied. The majority of the participants retained proficiency two years later.

  12. Preliminary results of BRAVO project: brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks.

    PubMed

    Bergamasco, Massimo; Frisoli, Antonio; Fontana, Marco; Loconsole, Claudio; Leonardis, Daniele; Troncossi, Marco; Foumashi, Mohammad Mozaffari; Parenti-Castelli, Vincenzo

    2011-01-01

    This paper presents the preliminary results of the project BRAVO (Brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks). The objective of this project is to define a new approach to the development of assistive and rehabilitative robots for motor impaired users to perform complex visuomotor tasks that require a sequence of reaches, grasps and manipulations of objects. BRAVO aims at developing new robotic interfaces and HW/SW architectures for rehabilitation and regain/restoration of motor function in patients with upper limb sensorimotor impairment through extensive rehabilitation therapy and active assistance in the execution of Activities of Daily Living. The final system developed within this project will include a robotic arm exoskeleton and a hand orthosis that will be integrated together for providing force assistance. The main novelty that BRAVO introduces is the control of the robotic assistive device through the active prediction of intention/action. The system will actually integrate the information about the movement carried out by the user with a prediction of the performed action through an interpretation of current gaze of the user (measured through eye-tracking), brain activation (measured through BCI) and force sensor measurements.

  13. Optical computed tomography utilizing a rotating mirror and Fresnel lenses: operating principles and preliminary results

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Wuu, Cheng-Shie

    2013-02-01

    The performance of a fast optical computed tomography (CT) scanner based on a point laser source, a small area photodiode detector, and two optical-grade Fresnel lenses is evaluated. The OCTOPUS™-10× optical CT scanner (MGS Research Inc., Madison, CT) is an upgrade of the OCTOPUS™ research scanner with improved design for faster motion of the laser beam and a faster data acquisition process. The motion of the laser beam in the new configuration is driven by the rotational motion of a scanning mirror. The center of the scanning mirror and the center of the photodiode detector are adjusted to be on the focal points of two coaxial Fresnel lenses. A glass water tank is placed between the two Fresnel lenses to house gel phantoms and matching liquids. The laser beam scans over the water tank in parallel beam geometry for projection data as the scanning mirror completes each revolution in less than 0.1 s. Signal sampling is performed independently of the motion of the scanning mirror, to reduce the processing time for the synchronization of the stepper motors and the data acquisition board. An in-house developed reference image normalization mechanism is added to the image reconstruction program to correct the non-uniform light transmitting property of the Fresnel lenses. Technical issues with regard to the new design of the scanner are addressed, including projection data extraction from raw data samples, non-uniform pixel averaging and reference image normalization. To evaluate the dosimetric accuracy of the scanner, the reconstructed images from a 16 MeV, 6 cm × 6 cm electron field irradiation were compared with those from the Eclipse treatment planning system (Varian Corporation, Palo Alto, CA). The spatial resolution of the scanner is demonstrated to be of sub-millimeter accuracy. The effectiveness of the reference normalization method for correcting the non-uniform light transmitting property of the Fresnel lenses is analyzed.

  14. Optical computed tomography utilizing a rotating mirror and Fresnel lenses: operating principles and preliminary results.

    PubMed

    Xu, Y; Wuu, Cheng-Shie

    2013-02-07

    The performance of a fast optical computed tomography (CT) scanner based on a point laser source, a small area photodiode detector, and two optical-grade Fresnel lenses is evaluated. The OCTOPUS™-10× optical CT scanner (MGS Research Inc., Madison, CT) is an upgrade of the OCTOPUS™ research scanner with improved design for faster motion of the laser beam and a faster data acquisition process. The motion of the laser beam in the new configuration is driven by the rotational motion of a scanning mirror. The center of the scanning mirror and the center of the photodiode detector are adjusted to be on the focal points of two coaxial Fresnel lenses. A glass water tank is placed between the two Fresnel lenses to house gel phantoms and matching liquids. The laser beam scans over the water tank in parallel beam geometry for projection data as the scanning mirror completes each revolution in less than 0.1 s. Signal sampling is performed independently of the motion of the scanning mirror, to reduce the processing time for the synchronization of the stepper motors and the data acquisition board. An in-house developed reference image normalization mechanism is added to the image reconstruction program to correct the non-uniform light transmitting property of the Fresnel lenses. Technical issues with regard to the new design of the scanner are addressed, including projection data extraction from raw data samples, non-uniform pixel averaging and reference image normalization. To evaluate the dosimetric accuracy of the scanner, the reconstructed images from a 16 MeV, 6 cm × 6 cm electron field irradiation were compared with those from the Eclipse treatment planning system (Varian Corporation, Palo Alto, CA). The spatial resolution of the scanner is demonstrated to be of sub-millimeter accuracy. The effectiveness of the reference normalization method for correcting the non-uniform light transmitting property of the Fresnel lenses is analyzed.
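
    The processing chain described in both records, parallel-beam projections corrected by a reference (no-phantom) scan and reconstructed with filtered backprojection, can be sketched as follows. The synthetic phantom, the assumed lens transmission profile, and the use of scikit-image stand in for the scanner's own acquisition and reconstruction software.

        # Sketch: reference-image normalization of parallel-beam laser projections
        # followed by filtered backprojection. Synthetic data and scikit-image stand
        # in for the scanner's actual acquisition and reconstruction software.
        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon, rescale

        mu = 0.02 * rescale(shepp_logan_phantom(), 0.5)      # hypothetical attenuation map
        theta = np.linspace(0.0, 180.0, 180, endpoint=False)

        line_integrals = radon(mu, theta=theta)              # ideal parallel-beam projections
        lens_profile = 1.0 + 0.2 * np.sin(                   # assumed non-uniform lens transmission
            np.linspace(0, np.pi, line_integrals.shape[0]))[:, None]

        I0 = lens_profile * 1000.0                           # reference (no-phantom) scan
        I = I0 * np.exp(-line_integrals)                     # scan with gel phantom in place

        sinogram = np.log(I0 / I)                            # normalization removes the lens profile
        reconstruction = iradon(sinogram, theta=theta, filter_name="ramp")
        print("max reconstruction error:", np.abs(reconstruction - mu).max())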

  15. Cloud Computing-based Platform for Drought Decision-Making using Remote Sensing and Modeling Products: Preliminary Results for Brazil

    NASA Astrophysics Data System (ADS)

    Vivoni, E.; Mascaro, G.; Shupe, J. W.; Hiatt, C.; Potter, C. S.; Miller, R. L.; Stanley, J.; Abraham, T.; Castilla-Rubio, J.

    2012-12-01

    Droughts and their hydrological consequences are a major threat to food security throughout the world. In arid and semiarid regions dependent on irrigated agriculture, prolonged droughts lead to significant and recurring economic and social losses. In this contribution, we present preliminary results on integrating a set of multi-resolution drought indices into a cloud computing-based visualization platform. We focused our initial efforts on Brazil due to a severe, ongoing drought in a large agricultural area in the northeastern part of the country. The online platform includes drought products developed from: (1) a MODIS-based water stress index (WSI) based on inferences from normalized difference vegetation index and land surface temperature fields, (2) a volumetric water content (VWC) index obtained from application of the NASA CASA model, and (3) a set of AVHRR-based vegetation health indices obtained from NOAA/NESDIS. The drought indices are also presented as anomalies with respect to a baseline period. Since our main objective is to engage stakeholders and decision-makers in Brazil, we incorporated other relevant geospatial data into the platform, including irrigation areas, dams and reservoirs, administrative units and annual climate information. We also present a set of use cases developed to help stakeholders explore, query, and provide feedback, which allowed fine-tuning of the drought product delivery, presentation, and analysis tools. Finally, we discuss potential next steps in the development of the online platform, including applications at finer resolutions in specific basins and at a coarser global scale.
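
    The anomaly layers mentioned above follow the usual pattern of standardizing the current index value against a baseline climatology. A minimal sketch with synthetic data and an assumed baseline period (not the project's actual products or grid) is shown below.

        # Sketch: standardized anomaly of a drought index (e.g. a MODIS-derived water
        # stress index) relative to a baseline climatology, per pixel and per month.
        # Synthetic data; not the platform's actual products.
        import numpy as np

        rng = np.random.default_rng(3)
        # Hypothetical monthly index cube: (years, months, rows, cols)
        wsi = rng.random((12, 12, 50, 50))

        baseline = wsi[:10]                        # assumed baseline period: first 10 years
        clim_mean = baseline.mean(axis=0)          # per-month, per-pixel climatology
        clim_std = baseline.std(axis=0) + 1e-9     # avoid division by zero

        def standardized_anomaly(year_index):
            """(value - baseline mean) / baseline std, month by month."""
            return (wsi[year_index] - clim_mean) / clim_std

        anom = standardized_anomaly(11)            # most recent year
        print("fraction of pixel-months below -1 sigma:", (anom < -1).mean().round(3))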

  16. The Uses of Computers in High Schools. A Teacher's Manual Prepared by the Teachers of the Huntington Computer Project. Preliminary Version.

    ERIC Educational Resources Information Center

    Braun, Ludwig; Visich, Marian, Jr.

    The material presented in this three-volume manual is a summary of the history and experiences of the Huntington Computer Project, and the purpose of the manual is to assist high school teachers and other educators who wish to explore the uses of computers in high schools. The first volume contains an introductory section which gives the…

  17. Preliminary Analysis of a Randomized Trial of Computer Attention Training in Children with Attention-Deficit/Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Steiner, N.; Sidhu, T. K.; Frenette, E. C.; Mitchell, K.; Perrin, E. C.

    2011-01-01

    Clinically significant attention problems among children present a significant obstacle to increasing student achievement. Computer-based attention training holds great promise as a way for schools to address this problem. The aim of this project is to evaluate the efficacy of two computer-based attention training systems in schools. One program…

  18. Computer-Assisted Learning in Anatomy at the International Medical School in Debrecen, Hungary: A Preliminary Report

    ERIC Educational Resources Information Center

    Kish, Gary; Cook, Samuel A.; Kis, Greta

    2013-01-01

    The University of Debrecen's Faculty of Medicine has an international, multilingual student population with anatomy courses taught in English to all but Hungarian students. An elective computer-assisted gross anatomy course, the Computer Human Anatomy (CHA), has been taught in English at the Anatomy Department since 2008. This course focuses on an…

  19. Monitoring the Microgravity Environment Quality On-board the International Space Station Using Soft Computing Techniques. Part 2; Preliminary System Performance Results

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Lin, Paul P.; Weiss, Daniel S.

    2002-01-01

    This paper presents the preliminary performance results of the artificial intelligence monitoring system in full operational mode, using near real time acceleration data downlinked from the International Space Station. A preliminary microgravity environment characterization analysis for the International Space Station (Increment-2), obtained using the monitoring system, is presented. Also presented is a comparison between the system's predicted performance, based on ground test data for the U.S. laboratory module "Destiny," and its actual on-orbit performance, using measured acceleration data from that module. Finally, preliminary on-orbit disturbance magnitude levels are presented for the Experiment of Physics of Colloids in Space and compared with ground test data. The ground test data for the Experiment of Physics of Colloids in Space were acquired from the Microgravity Emission Laboratory, located at the NASA Glenn Research Center, Cleveland, Ohio. The monitoring system was developed by the NASA Glenn Principal Investigator Microgravity Services Project to help principal investigator teams identify the primary vibratory disturbance sources active, at any moment in time, on board the International Space Station that might affect the microgravity environment to which their experiments are exposed. From the Principal Investigator Microgravity Services web site, principal investigator teams can monitor in near real time, via a dynamic graphical display implemented in Java, which events are active (such as crew activities, pumps, fans, centrifuges, compressors, crew exercise, and structural modes) and decide whether or not to run their experiments, whenever that is an option, based on the acceleration magnitude and frequency sensitivity associated with each experiment. This monitoring system detects primarily the vibratory disturbance sources. The system has built-in capability to detect both known

  20. Atomic-scale tomography: a 2020 vision.

    PubMed

    Kelly, Thomas F; Miller, Michael K; Rajan, Krishna; Ringer, Simon P

    2013-06-01

    Atomic-scale tomography (AST) is defined and its place in microscopy is considered. Arguments are made that AST, as defined, would be the ultimate microscopy. The available pathways for achieving AST are examined and we conclude that atom probe tomography (APT) may be a viable basis for AST on its own and that APT in conjunction with transmission electron microscopy is a likely path as well. Some possible configurations of instrumentation for achieving AST are described. The concept of metaimages is introduced where data from multiple techniques are melded to create synergies in a multidimensional data structure. When coupled with integrated computational materials engineering, structure-properties microscopy is envisioned. The implications of AST for science and technology are explored.

  1. Positron Emission Tomography: A Basic Analysis

    NASA Astrophysics Data System (ADS)

    Kerbacher, M. E.; Deaton, J. W.; Phinney, L. C.; Mitchell, L. J.; Duggan, J. L.

    2007-10-01

    Positron Emission Tomography is useful in detecting biological abnormalities. The technique involves attaching radiotracers to a material used inside the body, in many cases glucose. Glucose is absorbed most readily in areas of unusual cell growth or uptake of nutrients, so through natural processes the treated glucose highlights regions of tumors and other degenerative disorders such as Alzheimer's disease. The higher the concentration of isotopes, the more dynamic the area. Isotopes commonly used as tracers are 11C, 18F, 13N, and 15O due to their easy production and short half-lives. Once the tracers have saturated an area of tissue, they are detected using coincidence detectors collinear with the individual isotopes. As an isotope decays it emits a positron which, upon annihilating an electron, produces two oppositely directed gamma rays. The PET machine consists of several pairs of detectors, each 180 degrees from its partner detector. When the oppositely positioned detectors are collinear with the area of the isotope, a computer registers the location of the isotope and can compile an image of the activity of the highlighted area based on the position and strength of the isotopes.
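
    The coincidence geometry described above (two nearly collinear annihilation photons detected by opposed detectors on a ring) defines a line of response that is conventionally binned by its radial offset and normal angle. The sketch below computes those sinogram coordinates for an idealized ring; the detector count and ring radius are made-up values.

        # Sketch: converting a coincidence between two detectors on an idealized PET
        # ring into sinogram coordinates (radial offset s, normal angle phi) of the
        # line of response. Ring radius and detector count are made-up values.
        import numpy as np

        N_DETECTORS = 360      # assumed number of detectors on the ring
        RING_RADIUS = 40.0     # cm, assumed

        def lor_sinogram_coords(det_a: int, det_b: int):
            """Return (s, phi) of the chord joining two detectors on the ring."""
            theta_a = 2 * np.pi * det_a / N_DETECTORS
            theta_b = 2 * np.pi * det_b / N_DETECTORS
            phi = 0.5 * (theta_a + theta_b)                     # angle of the chord's normal
            s = RING_RADIUS * np.cos(0.5 * (theta_a - theta_b)) # signed radial offset
            return s, phi

        # A coincidence between exactly opposite detectors passes through the center.
        print(lor_sinogram_coords(0, 180))   # s ~ 0
        print(lor_sinogram_coords(10, 150))  # off-center line of response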

  2. Modeling resident error-making patterns in detection of mammographic masses using computer-extracted image features: preliminary experiments

    NASA Astrophysics Data System (ADS)

    Mazurowski, Maciej A.; Zhang, Jing; Lo, Joseph Y.; Kuzmiak, Cherie M.; Ghate, Sujata V.; Yoon, Sora

    2014-03-01

    Providing high quality mammography education to radiology trainees is essential, as good interpretation skills potentially ensure the highest benefit of screening mammography for patients. We have previously proposed a computer-aided education system that utilizes trainee models, which relate human-assessed image characteristics to interpretation error. We proposed that these models be used to identify the most difficult and therefore the most educationally useful cases for each trainee. In this study, as a next step in our research, we propose to build trainee models that utilize features that are automatically extracted from images using computer vision algorithms. To predict error, we used a logistic regression which accepts imaging features as input and returns error as output. Reader data from 3 experts and 3 trainees were used. Receiver operating characteristic analysis was applied to evaluate the proposed trainee models. Our experiments showed that, for three trainees, our models were able to predict error better than chance. This is an important step in the development of adaptive computer-aided education systems since computer-extracted features will allow for faster and more extensive search of imaging databases in order to identify the most educationally beneficial cases.
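
    The modeling step described here, a logistic regression from computer-extracted image features to the probability of a trainee error evaluated by ROC analysis, can be sketched as follows; the features, labels, and cross-validation scheme are synthetic illustrations rather than the study's data.

        # Sketch: logistic regression mapping computer-extracted image features to the
        # probability of a trainee error, evaluated with ROC analysis. Synthetic data.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(4)
        n_cases, n_features = 300, 6
        X = rng.standard_normal((n_cases, n_features))       # e.g. mass size, contrast, texture
        logit = 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 1, n_cases)
        y = (logit > 0).astype(int)                          # 1 = trainee misses/mislabels the case

        model = LogisticRegression(max_iter=1000)
        # Cross-validated predicted probabilities keep the AUC estimate honest.
        p_error = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
        print("AUC (above 0.5 = better than chance):", round(roc_auc_score(y, p_error), 3))

        # Cases with the highest predicted error probability would be flagged as
        # the most educationally useful for that trainee.
        hardest = np.argsort(p_error)[-5:]
        print("indices of the 5 'hardest' cases:", hardest)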

  3. The Opinions of the Kindergarten Teachers in Relation to the Introduction of Computers to Nursery Schools: Preliminary Approach

    ERIC Educational Resources Information Center

    Sivropoulou, Irene; Tsapakidou, Aggeliki; Kiridis, Argyris

    2009-01-01

    Computers were introduced in Greek kindergartens of our country with the new curricula for kindergarten, "Inter-disciplinary Integrated Framework of Study Programs," ("Official Journal of the Hellenic Republic," 376 t.B/18-10-2001, article 6), in order to contribute to the spherical growth of children and to extend their…

  4. Guideline for minimizing radiation exposure during acquisition of coronary artery calcium scans with the use of multidetector computed tomography: a report by the Society for Atherosclerosis Imaging and Prevention Tomographic Imaging and Prevention Councils in collaboration with the Society of Cardiovascular Computed Tomography.

    PubMed

    Voros, Szilard; Rivera, Juan J; Berman, Daniel S; Blankstein, Ron; Budoff, Matthew J; Cury, Ricardo C; Desai, Milind Y; Dey, Damini; Halliburton, Sandra S; Hecht, Harvey S; Nasir, Khurram; Santos, Raul D; Shapiro, Michael D; Taylor, Allen J; Valeti, Uma S; Young, Phillip M; Weissman, Gaby

    2011-01-01

    Coronary artery calcium (CAC) scanning is an important tool for risk stratification in intermediate-risk, asymptomatic subjects without previous coronary disease. However, the clinical benefit of improved risk prediction needs to be balanced against the risk of the use of ionizing radiation. Although there is increasing emphasis on the need to obtain CAC scans at low-radiation exposure to the patient, very few practical documents exist to aid laboratories and health care professionals on how to obtain such low-radiation scans. The Tomographic Imaging Council of the Society for Atherosclerosis Imaging and Prevention, in collaboration with the Prevention Council and the Society of Cardiovascular Computed Tomography, created a task force and writing group to generate a practical document to address parameters that can be influenced by careful attention to image acquisition. Patient selection for CAC scanning should be based on national guidelines. It is recommended that laboratories performing CAC examinations monitor radiation exposure (dose-length-product [DLP]) and effective radiation dose (E) in all patients. DLP should be <200 mGy × cm; E should average 1.0-1.5 mSv and should be <3.0 mSv. On most scanner platforms, CAC imaging should be performed in an axial mode with prospective electrocardiographic triggering, using tube voltage of 120 kVp. Tube current should be carefully selected on the basis of patient size, potentially using chest lateral width measured on the topogram. Scan length should be limited for the coverage of the heart only. When patients and imaging parameters are selected appropriately, CAC scanning can be performed with low levels of radiation exposure.
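
    As a rough companion to the dose targets above: effective dose is commonly estimated from DLP with a region-specific conversion coefficient (a chest value of about 0.014 mSv per mGy·cm is widely quoted, but it is an assumption here, not a number taken from this guideline). A small check against the stated limits might look like the sketch below.

        # Sketch: estimating effective dose from DLP with an assumed chest conversion
        # coefficient and checking it against the guideline's stated targets
        # (DLP < 200 mGy*cm, E averaging 1.0-1.5 mSv, E < 3.0 mSv).
        K_CHEST = 0.014  # mSv per mGy*cm (assumed conversion coefficient, not from the guideline)

        def check_cac_scan(dlp_mgy_cm: float, k: float = K_CHEST) -> dict:
            effective_dose = k * dlp_mgy_cm
            return {
                "DLP (mGy*cm)": dlp_mgy_cm,
                "E (mSv)": round(effective_dose, 2),
                "DLP under 200": dlp_mgy_cm < 200.0,
                "E under 3.0 mSv": effective_dose < 3.0,
            }

        print(check_cac_scan(90.0))    # a typical low-dose CAC acquisition
        print(check_cac_scan(250.0))   # would exceed the recommended DLP ceiling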

  5. An analysis of Space Shuttle countdown activities: Preliminaries to a computational model of the NASA Test Director

    NASA Technical Reports Server (NTRS)

    John, Bonnie E.; Remington, Roger W.; Steier, David M.

    1991-01-01

    Before all systems are go just prior to the launch of a space shuttle, thousands of operations and tests have been performed to ensure that all shuttle and support subsystems are operational and ready for launch. These steps, which range from activating the orbiter's flight computers to removing the launch pad from the itinerary of the NASA tour buses, are carried out by launch team members at various locations and with highly specialized fields of expertise. The responsibility for coordinating these diverse activities rests with the NASA Test Director (NTD) at NASA-Kennedy. The behavior of the NTD is being studied with the goal of building a detailed computational model of that behavior; the results of that analysis to date are given. The NTD's performance is described in detail, as a team member who must coordinate a complex task through efficient audio communication, as well as an individual taking notes and consulting manuals. A model of the routine cognitive skill used by the NTD to follow the launch countdown procedure manual was implemented using the Soar cognitive architecture. Several examples are given of how such a model could aid in evaluating proposed computer support systems.

  6. Design of a Tablet Computer App for Facilitation of a Molecular Blood Culture Test in Clinical Microbiology and Preliminary Usability Evaluation

    PubMed Central

    Pape-Haugaard, Louise; Meltzer, Michelle C; Fuchs, Martin; Schønheyder, Henrik C; Hejlesen, Ole

    2016-01-01

    Background User mobility is an important aspect of the development of clinical information systems for health care professionals. Mobile phones and tablet computers have obtained widespread use by health care professionals, offering an opportunity for supporting the access to patient information through specialized applications (apps) while supporting the mobility of the users. The use of apps for mobile phones and tablet computers may support workflow of complex tasks, for example, molecular-based diagnostic tests in clinical microbiology. Multiplex Blood Culture Test (MuxBCT) is a molecular-based diagnostic test used for rapid identification of pathogens in positive blood cultures. To facilitate the workflow of the MuxBCT, a specialized tablet computer app was developed as an accessory to the diagnostic test. The app aims to reduce the complexity of the test by step-by-step guidance of microscopy and to assist users in reaching an exact bacterial or fungal diagnosis based on blood specimen observations and controls. Additionally, the app allows for entry of test results, and communication thereof to the laboratory information system (LIS). Objective The objective of the study was to describe the design considerations of the MuxBCT app and the results of a preliminary usability evaluation. Methods The MuxBCT tablet app was developed and set up for use in a clinical microbiology laboratory. A near-live simulation study was conducted in the clinical microbiology laboratory to evaluate the usability of the MuxBCT app. The study was designed to achieve a high degree of realism as participants carried out a scenario representing the context of use for the MuxBCT app. As the MuxBCT was under development, the scenario involved the use of molecular blood culture tests similar to the MuxBCT for identification of microorganisms from positive blood culture samples. The study participants were observed, and their interactions with the app were recorded. After the study, the

  7. Design and preliminary evaluation of the FINGER rehabilitation robot: controlling challenge and quantifying finger individuation during musical computer game play

    PubMed Central

    2014-01-01

    Background This paper describes the design and preliminary testing of FINGER (Finger Individuating Grasp Exercise Robot), a device for assisting in finger rehabilitation after neurologic injury. We developed FINGER to assist stroke patients in moving their fingers individually in a naturalistic curling motion while playing a game similar to Guitar Hero®. The goal was to make FINGER capable of assisting with motions where precise timing is important. Methods FINGER consists of a pair of stacked single degree-of-freedom 8-bar mechanisms, one for the index and one for the middle finger. Each 8-bar mechanism was designed to control the angle and position of the proximal phalanx and the position of the middle phalanx. Target positions for the mechanism optimization were determined from trajectory data collected from 7 healthy subjects using color-based motion capture. The resulting robotic device was built to accommodate multiple finger sizes and finger-to-finger widths. For initial evaluation, we asked individuals with a stroke (n = 16) and without impairment (n = 4) to play a game similar to Guitar Hero® while connected to FINGER. Results Precision design, low friction bearings, and separate high speed linear actuators allowed FINGER to individually actuate the fingers with a high bandwidth of control (−3 dB at approximately 8 Hz). During the tests, we were able to modulate the subject’s success rate at the game by automatically adjusting the controller gains of FINGER. We also used FINGER to measure subjects’ effort and finger individuation while playing the game. Conclusions Test results demonstrate the ability of FINGER to motivate subjects with an engaging game environment that challenges individuated control of the fingers, automatically control assistance levels, and quantify finger individuation after stroke. PMID:24495432

  8. Pledget-Armed Sutures Affect the Haemodynamic Performance of Biologic Aortic Valve Substitutes: A Preliminary Experimental and Computational Study.

    PubMed

    Capelli, Claudio; Corsini, Chiara; Biscarini, Dario; Ruffini, Francesco; Migliavacca, Francesco; Kocher, Alfred; Laufer, Guenther; Taylor, Andrew M; Schievano, Silvia; Andreas, Martin; Burriesci, Gaetano; Rath, Claus

    2017-03-01

    Surgical aortic valve replacement is the most common procedure for the treatment of severe aortic stenosis. Bioprosthetic valves are traditionally sewn into the aortic root by means of pledget-armed sutures during open-heart surgery. Recently, novel bioprostheses which include a stent-based anchoring system have been introduced to allow rapid implantation, thereby reducing the duration and invasiveness of the intervention. Different hemodynamic effects have been reported clinically for the two technologies. The aim of this study was therefore to investigate whether the differences in hemodynamic performance are an effect of the different anchoring systems. Two commercially available bioprosthetic aortic valves, one sewn in with pledget-armed sutures and one rapid-deployment, were tested in this study by means of a combined approach of experimental and computational tools. In vitro experiments were performed to evaluate the overall hydrodynamic performance under identical standard conditions; computational fluid dynamics analyses were set up to explore local flow variations due to the different designs of the anchoring system. The results showed that the performance of cardiac valve substitutes is negatively affected by the presence of pledget-armed sutures, which cause flow disturbances that in turn increase the mean pressure gradient and decrease the effective orifice area. The combined approach of experiments and numerical simulations can be effectively used to quantify the detailed relationship between local fluid dynamics and the overall performance associated with different valve technologies.

  9. Changing Community Health Behaviors with a Health Promotion Computer Network: Preliminary Findings from Stanford Health-Net

    PubMed Central

    Robinson, Thomas N.; Walters, Paul A.

    1987-01-01

    Computer-based health education has been employed in many settings. However, data on resultant behavior change are lacking. A randomized, controlled, prospective study was performed to test the efficacy of Stanford Health-Net in changing community health behaviors. Graduate and undergraduate students (N=1003) were randomly assigned to treatment and control conditions. The treatment group received access to Health-Net, a health promotion computer network emphasizing specific self-care and preventive strategies. Over a four month intervention period, 26% of the treatment group used Health-Net an average of 6.4 times each (range 1 to 97). Users rated Health-Net favorably. The mean number of ambulatory medical visits decreased 22.5% more in the treatment group than in the control group (P<.05), while hospitalizations did not differ significantly between groups. In addition, perceived self-efficacy for preventing the acquisition of an STD and herpes increased 577% (P<.05) and 261% (P<.01) more, respectively, in the treatment group than in the control group. These findings suggest that access to Stanford Health-Net can result in significant health behavior change. The advantages of the network approach make it a potential model for other communities.

  10. Computer simulations of comet- and asteroidlike bodies passing through the Venusian atmosphere: Preliminary results on atmospheric and ground shock effects

    NASA Technical Reports Server (NTRS)

    Roddy, D.; Hatfield, D.; Hassig, P.; Rosenblatt, M.; Soderblom, L.; Dejong, E.

    1992-01-01

    We have completed computer simulations that model shock effects in the venusian atmosphere caused during the passage of two cometlike bodies 100 m and 1000 m in diameter and an asteroidlike body 10 km in diameter. Our objective is to examine hypervelocity-generated shock effects in the venusian atmosphere for bodies of different types and sizes in order to understand the following: (1) their deceleration and depth of penetration through the atmosphere; and (2) the onset of possible ground-surface shock effects such as splotches, craters, and ejecta formations. The three bodies were chosen to include both a range of general conditions applicable to Venus as well as three specific cases of current interest. These calculations use a new multiphase computer code (DICE-MAZ) designed by California Research & Technology for shock-dynamics simulations in complex environments. The code was tested and calibrated in large-scale explosion, cratering, and ejecta research. It treats a wide range of different multiphase conditions, including material types (vapor, melt, solid), particle-size distributions, and shock-induced dynamic changes in velocities, pressures, temperatures (internal energies), densities, and other related parameters, all of which were recorded in our calculations.

  11. A preliminary study of a cloud-computing model for chronic illness self-care support in an underdeveloped country

    PubMed Central

    Piette, John D.; Mendoza-Avelares, Milton O.; Ganser, Martha; Mohamed, Muhima; Marinec, Nicolle; Krishnan, Sheila

    2013-01-01

    Background Although interactive voice response (IVR) calls can be an effective tool for chronic disease management, many regions of the world lack the infrastructure to provide these services. Objective This study evaluated the feasibility and potential impact of an IVR program using a cloud-computing model to improve diabetes management in Honduras. Methods A single group, pre-post study was conducted between June and August 2010. The telecommunications infrastructure was maintained on a U.S. server, and calls were directed to patients’ cell phones using VoIP. Eighty-five diabetes patients in Honduras received weekly IVR disease management calls for six weeks, with automated follow-up emails to clinicians, and voicemail reports to family caregivers. Patients completed interviews at enrollment and a six week follow-up. Other measures included patients’ glycemic control (A1c) and data from the IVR calling system. Results 55% of participants completed the majority of their IVR calls and 33% completed 80% or more. Higher baseline blood pressures, greater diabetes burden, greater distance from the clinic, and better adherence were related to higher call completion rates. Nearly all participants (98%) reported that because of the program, they improved in aspects of diabetes management such as glycemic control (56%) or foot care (89%). Mean A1c’s decreased from 10.0% at baseline to 8.9% at follow-up (p<.01). Most participants (92%) said that if the service were available in their clinic they would use it again. Conclusions Cloud computing is a feasible strategy for providing IVR services globally. IVR self-care support may improve self-care and glycemic control for patients in under-developed countries. PMID:21565655

  12. Preliminary evaluation of the dosimetric accuracy of cone-beam computed tomography for cases with respiratory motion

    NASA Astrophysics Data System (ADS)

    Kim, Dong Wook; Bae, Sunhyun; Chung, Weon Kuu; Lee, Yoonhee

    2014-04-01

    Cone-beam computed tomography (CBCT) images are currently used for patient positioning and adaptive dose calculation; however, the degree of CBCT uncertainty in cases of respiratory motion remains an interesting issue. This study evaluated the uncertainty of CBCT-based dose calculations for a moving target. Using a phantom, we estimated differences in the geometries and the Hounsfield units (HU) between CT and CBCT. The calculated dose distributions based on CT and CBCT images were also compared using a radiation treatment planning system, and the comparison included cases with respiratory motion. The geometrical uncertainties of the CT and the CBCT images were less than 0.15 cm. The HU differences between CT and CBCT images for standard-dose-head, high-quality-head, normal-pelvis, and low-dose-thorax modes were 31, 36, 23, and 33 HU, respectively. The gamma (3%, 0.3 cm)-dose distribution between CT and CBCT was greater than 1 in 99% of the area. The gamma-dose distribution between CT and CBCT during respiratory motion was also greater than 1 in 99% of the area. The uncertainty of the CBCT-based dose calculation was evaluated for cases with respiratory motion. In conclusion, image distortion due to motion did not significantly influence dosimetric parameters.
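
    The gamma comparison used above combines a dose-difference criterion with a distance-to-agreement criterion, with gamma values at or below 1 usually taken as agreement. A simplified 1D global-gamma sketch (3% dose difference, 3 mm distance, illustrative profiles rather than the study's CT/CBCT dose grids) follows.

        # Sketch: simplified global gamma evaluation (3% dose difference, 3 mm
        # distance-to-agreement) on 1D dose profiles. Illustrative only; the study
        # compared full CT- and CBCT-based dose distributions.
        import numpy as np

        def gamma_1d(ref_dose, eval_dose, positions, dd=0.03, dta=3.0):
            """Return the gamma value at each reference point (<= 1 means passing)."""
            ref_dose = np.asarray(ref_dose, float)
            eval_dose = np.asarray(eval_dose, float)
            positions = np.asarray(positions, float)
            dose_norm = dd * ref_dose.max()               # global dose criterion
            gammas = np.empty_like(ref_dose)
            for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
                dist2 = ((positions - x_r) / dta) ** 2
                dose2 = ((eval_dose - d_r) / dose_norm) ** 2
                gammas[i] = np.sqrt(np.min(dist2 + dose2))
            return gammas

        x = np.arange(0.0, 100.0, 1.0)                    # mm
        ref = 100.0 * np.exp(-0.5 * ((x - 50) / 15) ** 2) # hypothetical profile
        ev = np.roll(ref, 1) * 1.01                       # 1 mm shift + 1% scaling
        g = gamma_1d(ref, ev, x)
        print("gamma pass rate:", round((g <= 1).mean() * 100, 1), "%")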

  13. Preliminary assessment of facial soft tissue thickness utilizing three-dimensional computed tomography models of living individuals.

    PubMed

    Parks, Connie L; Richard, Adam H; Monson, Keith L

    2014-04-01

    Facial approximation is the technique of developing a representation of the face from the skull of an unknown individual. Facial approximation relies heavily on average craniofacial soft tissue depths. For more than a century, researchers have employed a broad array of tissue depth collection methodologies, a practice which has resulted in a lack of standardization in craniofacial soft tissue depth research. To combat such methodological inconsistencies, Stephan and Simpson 2008 [15] examined and synthesized a large number of previously published soft tissue depth studies. Their comprehensive meta-analysis produced a pooled dataset of averaged tissue depths and a simplified methodology, which the researchers suggest be utilized as a minimum standard protocol for future craniofacial soft tissue depth research. The authors of the present paper collected craniofacial soft tissue depths using three-dimensional models generated from computed tomography scans of living males and females of four self-identified ancestry groups from the United States ranging in age from 18 to 62 years. This paper assesses the differences between: (i) the pooled mean tissue depth values from the sample utilized in this paper and those published by Stephan 2012 [21] and (ii) the mean tissue depth values of two demographically similar subsets of the sample utilized in this paper and those published by Rhine and Moore 1984 [16]. Statistical test results indicate that the tissue depths collected from the sample evaluated in this paper are significantly and consistently larger than those published by Stephan 2012 [21]. Although a lack of published variance data by Rhine and Moore 1984 [16] precluded a direct statistical assessment, a substantive difference was also concluded. Further, the dataset presented in this study is representative of modern American adults and is, therefore, appropriate for use in constructing contemporary facial approximations.

  14. Biological characterization of the skin of shortfin mako shark Isurus oxyrinchus and preliminary study of the hydrodynamic behaviour through computational fluid dynamics.

    PubMed

    Díez, G; Soto, M; Blanco, J M

    2015-07-01

    This study characterized the morphology, density and orientation of the dermal denticles along the body of a shortfin mako shark Isurus oxyrinchus and identified the hydrodynamic parameters of its body through a computational fluid-dynamics model. The study showed great variability in the morphology, size, shape, orientation and density of dermal denticles along the body of I. oxyrinchus. Density was significantly higher in the dorsal and ventral areas of the body, and the highest angular deviations were found in the lower part of the mouth and in the areas between the pre-caudal pit and the second dorsal and pelvic fins. A detailed three-dimensional geometry was built from a scanned body of a shark to evaluate hydrodynamic properties such as the drag coefficient, lift coefficient and superficial (skin) friction coefficient, together with the flow velocity field, for different roughness coefficients simulating the effect of the dermal denticles. This preliminary approach provided detailed information on the denticle-flow interactions. As the height of the denticles was increased, flow velocity and the effect of lift decreased whereas drag increased. The highest peaks of the skin friction coefficient were observed around the pectoral fins.

  15. On-line computer system to minimize laser injuries during surgery: preliminary system layout and proposal of the key features.

    PubMed

    Canestri, F

    1999-01-01

    The aim of this paper is to investigate some new user interface ideas and related application packages which aim to improve the degree of safety in an operating room during surgical operations in which an invasive laser beam is deployed. The overall value of the proposition is that a means is provided which ensures the successful completion of the surgical case while minimizing the risk of thermal and mechanical injuries to healthy tissues adjacent to the surgical field. According to surgeons operating with a variety of CO2 lasers available at both the National Cancer Institute in Milan, Italy, and the Sackler School of Medicine, Tel Aviv University, Israel, each laser device presents different cutting and coagulation properties. In order to identify which 'ideal' procedure might corroborate the subjective impression of each surgeon, and also to provide one common tool to ensure procedures with a high level of safety, the author has worked for several months with surgeons and technicians of both institutions to define the general design of a new on-line surgical operation planning and design system to be used during pre-operative briefing activities and also as a consultation tool during the operation. This software package will be developed and tested with both 'C' and FORTRAN compilers running on a commercially available PC which drives a continuous wave (CW) CO2 laser device via its Instrument Bus interface. The present proposal describes the details of a software package called LCA (Laser-beam Controller and Adviser) which performs several controls in parallel on the key output parameters of a laser beam device during its utilization in delicate surgical operations. The performance required of this device during a given surgical operation is pre-simulated and compared against well-known safety limits, which are stored in the computer's mass storage. If the surgeon's decisions about the laser device set-up are considered to be too close to the

  16. SUPER-RESOLUTION ULTRASOUND TOMOGRAPHY: A PRELIMINARY STUDY WITH A RING ARRAY

    SciTech Connect

    HUANG, LIANJIE; SIMONETTI, FRANCESCO; DURIC, NEBOJSA; RAMA, OLSI

    2007-01-18

    Ultrasound tomography attempts to retrieve the structure of an object by exploiting the interaction of acoustic waves with the object. A fundamental limit of ultrasound tomography is that features cannot be resolved if they are spaced less than λ/2 apart, where λ is the wavelength of the probing wave, regardless of the degree of accuracy of the measurements. Therefore, since the attenuation of the probing wave with propagation distance increases as λ decreases, resolution has to be traded against imaging depth. Recently, it has been shown that the λ/2 limit is a consequence of the Born approximation (implicit in the imaging algorithms currently employed), which neglects the distortion of the probing wavefield as it travels through the medium to be imaged. On the other hand, such distortion, which is due to the multiple scattering phenomenon, can encode unlimited resolution in the radiating component of the scattered field. Previously, a resolution better than λ/3 has been reported in these proceedings [F. Simonetti, pp. 126 (2006)] in the case of elastic wave probing. In this paper, the authors demonstrate experimentally a resolution better than λ/4 for objects immersed in a water bath probed by means of a ring array which excites and detects pressure waves in a full-view configuration.
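
    The diffraction limit discussed here follows directly from the probing wavelength, and the resolution/penetration trade-off can be illustrated with a short calculation. The water-like sound speed assumed below (~1480 m/s) is an illustrative value, not one taken from the paper.

        # Sketch: the classical lambda/2 resolution limit vs. probing frequency,
        # assuming a water-like sound speed. Purely illustrative of the
        # resolution/penetration trade-off discussed in the abstract.
        C_WATER = 1480.0  # m/s, assumed speed of sound

        def half_wavelength_mm(freq_hz: float, c: float = C_WATER) -> float:
            return (c / freq_hz) / 2.0 * 1000.0

        for f_mhz in (0.5, 1.0, 2.0, 5.0):
            print(f"{f_mhz:4.1f} MHz -> lambda/2 = {half_wavelength_mm(f_mhz * 1e6):.2f} mm")

    Higher frequencies shrink the lambda/2 limit but attenuate faster with depth, which is the trade-off the authors seek to sidestep by exploiting multiple scattering.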

  17. Computer simulations of large asteroid impacts into oceanic and continental sites--preliminary results on atmospheric, cratering and ejecta dynamics

    USGS Publications Warehouse

    Roddy, D.J.; Schuster, S.H.; Rosenblatt, M.; Grant, L.B.; Hassig, P.J.; Kreyenhagen, K.N.

    1987-01-01

    Computer simulations have been completed that describe passage of a 10-km-diameter asteroid through the Earth's atmosphere and the subsequent cratering and ejecta dynamics caused by impact of the asteroid into both oceanic and continental sites. The asteroid was modeled as a spherical body moving vertically at 20 km/s with a kinetic energy of 2.6 × 10^30 ergs (6.2 × 10^7 Mt). Detailed material modeling of the asteroid, ocean, crustal units, sedimentary unit, and mantle included effects of strength and fracturing, generic asteroid and rock properties, porosity, saturation, lithostatic stresses, and geothermal contributions, each selected to simulate impact and geologic conditions that were as realistic as possible. Calculation of the passage of the asteroid through a U.S. Standard Atmosphere showed development of a strong bow shock wave followed by a highly shock compressed and heated air mass. Rapid expansion of this shocked air created a large low-density region that also expanded away from the impact area. Shock temperatures in air reached ~20,000 K near the surface of the uplifting crater rim and were as high as ~2000 K at more than 30 km range and 10 km altitude. Calculations to 30 s showed that the shock fronts in the air and in most of the expanding shocked air mass preceded the formation of the crater, ejecta, and rim uplift and did not interact with them. As cratering developed, uplifted rim and target material were ejected into the very low density, shock-heated air immediately above the forming crater, and complex interactions could be expected. Calculations of the impact events showed equally dramatic effects on the oceanic and continental targets through an interval of 120 s. Despite geologic differences in the targets, both cratering events developed comparable dynamic flow fields and by ~29 s had formed similar-sized transient craters ~39 km deep and ~62 km across. Transient-rim uplift of ocean and crust reached a maximum altitude of nearly

  18. SU-E-I-74: Image-Matching Technique of Computed Tomography Images for Personal Identification: A Preliminary Study Using Anthropomorphic Chest Phantoms

    SciTech Connect

    Matsunobu, Y; Shiotsuki, K; Morishita, J

    2015-06-15

    Purpose: Fingerprints, dental impressions, and DNA are used to identify unidentified bodies in forensic medicine. Cranial computed tomography (CT) images and/or dental radiographs are also used for identification. Radiological identification is important, particularly in the absence of comparative fingerprints, dental impressions, and DNA samples. The development of an automated radiological identification system for unidentified bodies is desirable. We investigated the potential usefulness of bone structure for matching chest CT images. Methods: CT images of three anthropomorphic chest phantoms were obtained on different days in various settings. One of the phantoms was assumed to be an unidentified body. The bone image and the bone image with soft tissue (BST image) were extracted from the CT images. To examine the usefulness of the bone image and/or the BST image, the similarities between the two-dimensional (2D) or three-dimensional (3D) images of the same and different phantoms were evaluated in terms of the normalized cross-correlation value (NCC). Results: For the 2D and 3D BST images, the NCCs obtained from the same phantom assumed to be an unidentified body (2D, 0.99; 3D, 0.93) were higher than those for the different phantoms (2D, 0.95 and 0.91; 3D, 0.89 and 0.80). The NCCs for the same phantom (2D, 0.95; 3D, 0.88) were greater compared to those of the different phantoms (2D, 0.61 and 0.25; 3D, 0.23 and 0.10) for the bone image. The difference in the NCCs between the same and different phantoms tended to be larger for the bone images than for the BST images. These findings suggest that the image-matching technique is more useful when utilizing the bone image than when utilizing the BST image to identify different people. Conclusion: This preliminary study indicated that evaluating the similarity of bone structure in 2D and 3D images is potentially useful for identifying an unidentified body.
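    The similarity measure used above is the normalized cross-correlation (NCC). A minimal sketch of a zero-mean NCC between two equally sized images (2D or 3D arrays) is given below; the study's actual preprocessing (bone/soft-tissue extraction and any registration step) is not described here and is omitted.

    ```python
    import numpy as np

    def normalized_cross_correlation(image_a, image_b):
        """Zero-mean normalized cross-correlation between two arrays of equal
        shape; returns a similarity value in the range [-1, 1]."""
        a = np.ravel(image_a).astype(float)
        b = np.ravel(image_b).astype(float)
        a -= a.mean()
        b -= b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(np.dot(a, b) / denom) if denom > 0 else 0.0

    # Illustrative use: identical volumes give NCC = 1.0, unrelated noise gives ~0.
    rng = np.random.default_rng(0)
    vol = rng.normal(size=(32, 32, 32))
    print(normalized_cross_correlation(vol, vol))                         # 1.0
    print(normalized_cross_correlation(vol, rng.normal(size=vol.shape)))  # ~0
    ```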

  19. Flat panel detector-based cone beam computed tomography with a circle-plus-two-arcs data acquisition orbit: preliminary phantom study.

    PubMed

    Ning, Ruola; Tang, Xiangyang; Conover, David; Yu, Rongfeng

    2003-07-01

    Cone beam computed tomography (CBCT) has been investigated in the past two decades due to its potential advantages over fan beam CT. These advantages include (a) great improvement in data acquisition efficiency, spatial resolution, and spatial resolution uniformity, (b) substantially better utilization of x-ray photons generated by the x-ray tube compared to fan beam CT, and (c) significant advancement in clinical three-dimensional (3D) CT applications. However, most past studies of CBCT have focused on cone beam data acquisition theories and reconstruction algorithms. The recent development of x-ray flat panel detectors (FPD) has made CBCT imaging feasible and practical. This paper reports a newly built flat panel detector-based CBCT prototype scanner and presents the results of the preliminary evaluation of the prototype through a phantom study. The prototype consisted of an x-ray tube, a flat panel detector, a GE 8800 CT gantry, a patient table and a computer system. The prototype was constructed by modifying a GE 8800 CT gantry such that both a single-circle cone beam acquisition orbit and a circle-plus-two-arcs orbit can be achieved. With a circle-plus-two-arcs orbit, a complete set of cone beam projection data can be obtained, consisting of a set of circle projections and a set of arc projections. Using the prototype scanner, the set of circle projections was acquired by rotating the x-ray tube and the FPD together on the gantry, and the set of arc projections was obtained by tilting the gantry while the x-ray tube and detector were at the 12 and 6 o'clock positions, respectively. A filtered backprojection exact cone beam reconstruction algorithm based on a circle-plus-two-arcs orbit was used for cone beam reconstruction from both the circle and arc projections. The system was first characterized in terms of the linearity and dynamic range of the detector. Then the uniformity, spatial resolution and low contrast resolution were assessed using

  20. Crosswell seismic reflection/diffraction tomography: A reservoir characterization application

    SciTech Connect

    Tura, M.A.C. (Dept. of Earth Sciences); Greaves, R.J. (Earth Resources Lab.); Beydoun, W.B.

    1994-03-01

    A crosswell seismic experiment at the San Emidio oil field in Bakersfield, California, is carried out to evaluate crosswell reflection/diffraction tomography and to image the interwell region to locate a possible pinchout zone. In this experiment, the two wells used are 2,500 ft (762 m) apart, and the zone to be imaged is 11,000 ft (3,350 m) to 13,000 ft (3,960 m) deep. At these distances, this experiment forms the first large-scale reservoir characterization application of crosswell reflection/diffraction tomography. A subset of the intended data, consisting of two common receiver gathers and one common shot gather, was collected at the San Emidio oil field. The crosswell data display a wide variety of wave modes including tube waves, singly and multiply reflected/diffracted waves, and refracted waves. The data are processed using frequency filters, median filters, and spatial muting filters to enhance the reflected/diffracted energy. Following the encouraging results obtained from synthetic data, the ERBMI method with a smooth background velocity model is then used to image the processed field data. Images obtained from the crosswell data show a good match with the reflected field in the zero-offset VSPs and with migrated surface seismic data. From the interpretation of these images, the potential of this crosswell seismic method for answering questions regarding reservoir continuity and the existence of pinchout zones can be seen.

  1. Characterization of waste drums using nonintrusive active and passive computed tomography

    SciTech Connect

    Roberson, G.P.; Martz, H.E.; Decman, D.J.; Camp, D.C.; Azevedo, S.G.; Keto, E.R.

    1994-08-01

    We have developed a data acquisition scanner for gamma-ray nondestructive assay (NDA) active and passive computed tomography (A&PCT), along with associated computational techniques for image reconstruction, analysis, and display. We are using this scanner to acquire data sets of mock-waste drums at Lawrence Livermore National Laboratory (LLNL). In this paper, we discuss issues associated with gamma-ray spectroscopy assay and NDA imaging, describe the design and construction of an NDA drum scanner, and report on code development for image reconstruction. We also present representative A&PCT assay results of well characterized mock-waste drums. These preliminary results suggest that A&PCT imaging can be used to produce accurate absolute assays of radioactivity in real-waste drums.

  2. ACCF/SCCT/ACR/AHA/ASE/ASNC/NASCI/SCAI/SCMR 2010 appropriate use criteria for cardiac computed tomography. A report of the American College of Cardiology Foundation Appropriate Use Criteria Task Force, the Society of Cardiovascular Computed Tomography, the American College of Radiology, the American Heart Association, the American Society of Echocardiography, the American Society of Nuclear Cardiology, the North American Society for Cardiovascular Imaging, the Society for Cardiovascular Angiography and Interventions, and the Society for Cardiovascular Magnetic Resonance.

    PubMed

    Taylor, Allen J; Cerqueira, Manuel; Hodgson, John McB; Mark, Daniel; Min, James; O'Gara, Patrick; Rubin, Geoffrey D; Kramer, Christopher M; Berman, Daniel; Brown, Alan; Chaudhry, Farooq A; Cury, Ricardo C; Desai, Milind Y; Einstein, Andrew J; Gomes, Antoinette S; Harrington, Robert; Hoffmann, Udo; Khare, Rahul; Lesser, John; McGann, Christopher; Rosenberg, Alan; Schwartz, Robert; Shelton, Marc; Smetana, Gerald W; Smith, Sidney C

    2010-11-23

    The American College of Cardiology Foundation (ACCF), along with key specialty and subspecialty societies, conducted an appropriate use review of common clinical scenarios where cardiac computed tomography (CCT) is frequently considered. The present document is an update to the original CCT/cardiac magnetic resonance (CMR) appropriateness criteria published in 2006, written to reflect changes in test utilization, to incorporate new clinical data, and to clarify CCT use where omissions or lack of clarity existed in the original criteria (1). The indications for this review were drawn from common applications or anticipated uses, as well as from current clinical practice guidelines. Ninety-three clinical scenarios were developed by a writing group and scored by a separate technical panel on a scale of 1 to 9 to designate appropriate use, inappropriate use, or uncertain use. In general, use of CCT angiography for diagnosis and risk assessment in patients with low or intermediate risk or pretest probability for coronary artery disease (CAD) was viewed favorably, whereas testing in high-risk patients, routine repeat testing, and general screening in certain clinical scenarios were viewed less favorably. Use of noncontrast computed tomography (CT) for calcium scoring was rated as appropriate within intermediate- and selected low-risk patients. Appropriate applications of CCT are also within the category of cardiac structural and functional evaluation. It is anticipated that these results will have an impact on physician decision making, performance, and reimbursement policy, and that they will help guide future research.

  3. ACCF/SCCT/ACR/AHA/ASE/ASNC/NASCI/SCAI/SCMR 2010 Appropriate Use Criteria for Cardiac Computed Tomography. A Report of the American College of Cardiology Foundation Appropriate Use Criteria Task Force, the Society of Cardiovascular Computed Tomography, the American College of Radiology, the American Heart Association, the American Society of Echocardiography, the American Society of Nuclear Cardiology, the North American Society for Cardiovascular Imaging, the Society for Cardiovascular Angiography and Interventions, and the Society for Cardiovascular Magnetic Resonance.

    PubMed

    Taylor, Allen J; Cerqueira, Manuel; Hodgson, John McB; Mark, Daniel; Min, James; O'Gara, Patrick; Rubin, Geoffrey D

    2010-01-01

    The American College of Cardiology Foundation (ACCF), along with key specialty and subspecialty societies, conducted an appropriate use review of common clinical scenarios where cardiac computed tomography (CCT) is frequently considered. The present document is an update to the original CCT/cardiac magnetic resonance (CMR) appropriateness criteria published in 2006, written to reflect changes in test utilization, to incorporate new clinical data, and to clarify CCT use where omissions or lack of clarity existed in the original criteria (1). The indications for this review were drawn from common applications or anticipated uses, as well as from current clinical practice guidelines. Ninety-three clinical scenarios were developed by a writing group and scored by a separate technical panel on a scale of 1 to 9 to designate appropriate use, inappropriate use, or uncertain use. In general, use of CCT angiography for diagnosis and risk assessment in patients with low or intermediate risk or pretest probability for coronary artery disease (CAD) was viewed favorably, whereas testing in high-risk patients, routine repeat testing, and general screening in certain clinical scenarios were viewed less favorably. Use of noncontrast computed tomography (CT) for calcium scoring was rated as appropriate within intermediate- and selected low-risk patients. Appropriate applications of CCT are also within the category of cardiac structural and functional evaluation. It is anticipated that these results will have an impact on physician decision making, performance, and reimbursement policy, and that they will help guide future research.

  4. ACCF/SCCT/ACR/AHA/ASE/ASNC/NASCI/SCAI/SCMR 2010 Appropriate Use Criteria for Cardiac Computed Tomography. A Report of the American College of Cardiology Foundation Appropriate Use Criteria Task Force, the Society of Cardiovascular Computed Tomography, the American College of Radiology, the American Heart Association, the American Society of Echocardiography, the American Society of Nuclear Cardiology, the North American Society for Cardiovascular Imaging, the Society for Cardiovascular Angiography and Interventions, and the Society for Cardiovascular Magnetic Resonance.

    PubMed

    Taylor, Allen J; Cerqueira, Manuel; Hodgson, John McB; Mark, Daniel; Min, James; O'Gara, Patrick; Rubin, Geoffrey D

    2010-11-23

    The American College of Cardiology Foundation, along with key specialty and subspecialty societies, conducted an appropriate use review of common clinical scenarios where cardiac computed tomography (CCT) is frequently considered. The present document is an update to the original CCT/cardiac magnetic resonance appropriateness criteria published in 2006, written to reflect changes in test utilization, to incorporate new clinical data, and to clarify CCT use where omissions or lack of clarity existed in the original criteria. The indications for this review were drawn from common applications or anticipated uses, as well as from current clinical practice guidelines. Ninety-three clinical scenarios were developed by a writing group and scored by a separate technical panel on a scale of 1 to 9 to designate appropriate use, inappropriate use, or uncertain use. In general, use of CCT angiography for diagnosis and risk assessment in patients with low or intermediate risk or pretest probability for coronary artery disease was viewed favorably, whereas testing in high-risk patients, routine repeat testing, and general screening in certain clinical scenarios were viewed less favorably. Use of noncontrast computed tomography for calcium scoring was rated as appropriate within intermediate- and selected low-risk patients. Appropriate applications of CCT are also within the category of cardiac structural and functional evaluation. It is anticipated that these results will have an impact on physician decision making, performance, and reimbursement policy, and that they will help guide future research.

  5. Computer Assisted Instruction and Bibliographic Instruction: Preliminary Data on the Use of PLATO in the BI Program of the Humanities and Social Sciences Library, University of Alberta.

    ERIC Educational Resources Information Center

    Champion, Brian

    In response to requests from the University of Alberta Department of Computing Services for PLATO (Programmed Logic for Automatic Teaching Operations) applications in structured learning situations, a program for computer-assisted bibliographic instruction (BI) was developed. The program is divided into the following six units: (1) Introduction;…

  6. Preliminary Results on Studying of Meteorites from Geological Museum of Kazan University by X-Ray Fluorescence and Computed X-Ray Tomography

    NASA Astrophysics Data System (ADS)

    Kuzina, D. M.; Nurgaliev, D. K.; Gareev, B. I.; Batalin, G. A.; Silantev, V. V.; Statsenko, E. O.

    2017-02-01

    Micro X-ray fluorescence and X-ray computed tomography were used to study meteorites (particularly chondrules and iron-nickel alloys) from the Geological Museum (Kazan), their elemental composition, and the distribution of these objects within the meteorite body.

  7. Characterizing atherosclerotic plaque with computed tomography: a contrast-detail study

    NASA Astrophysics Data System (ADS)

    Kasraie, Nima; Clarke, Geoffrey D.

    2012-02-01

    Plaque characterization may benefit from the increasing distinctiveness of the attenuating properties of different soft plaque components at lower energies. Due to the relatively slight increase in the CT number of non-adipose soft plaque at lower tube voltage settings vs. adipose plaque, a higher contrast between atheromatous adipose and non-adipose plaque may become visible with modern 64-slice systems. A contrast-detail (C-D) phantom, with varying plaque composition as the contrast-generating method, was imaged on a commercial 64-slice MDCT system using 80, 120, and 140 kVp settings. The same phantom was also imaged on a Cone Beam CT (CBCT) system with a lower tube voltage of 75 kVp. The results of experiments from four different observers on three different plaque types (lipid, fiber, calcific) indicate that CT attenuation within lipid cores and fibrous masses varies not only with the percentage of lipid or fiber present, but also with the size of the cores. Furthermore, the C-D curve analysis for all three plaque types reveals that while noise constraints prevent visible differentiation of soft plaque at current conventional 64-slice MDCT settings, CBCT exhibits visible contrast detectability superior to that of its conventional counterpart, with the latter having appreciably better resolution limits and beneficial lower tube voltages. This low-voltage CT technique has the potential to be useful in composition-based diagnosis of vulnerable carotid atherosclerotic plaque.

  8. Prognostic Value of Epicardial Fat Volume Measurements by Computed Tomography: A Systematic Review of the Literature

    PubMed Central

    Spearman, James V.; Renker, Matthias; Schoepf, U. Joseph; Krazinski, Aleksander W.; Herbert, Teri L.; De Cecco, Carlo N.; Nietert, Paul J.; Meinel, Felix G.

    2015-01-01

    Objectives To perform a systematic review of the growing body of literature evaluating the prognostic value of epicardial fat volume (EFV) quantified by cross-sectional imaging for adverse clinical outcomes. Methods Two independent reviewers performed systematic searches on both PubMed and Scopus using search terms developed with a medical librarian. Peer-reviewed articles were selected based on the inclusion of outcome data, utilization of epicardial fat volume, and sufficient reporting for analysis. Results A total of 411 studies were evaluated, with 9 studies meeting the inclusion criteria. In all, the studies evaluated 10,252 patients. All 9 studies were based on CT measurements. Seven studies evaluated the prognostic value of EFV unadjusted for calcium score, and 6 of these studies found a significant association between EFV and clinical outcomes. Seven studies evaluated the incremental value of EFV beyond calcium scoring, and 6 of these studies found a significant association. Conclusions The majority of studies suggest that EFV quantification is significantly associated with clinical outcomes and provides incremental prognostic value over coronary artery calcium scoring. Future research should use a binary cut-off of 125 mL for evaluation of EFV to provide consistency with other research. PMID:25925354

  9. Effects of computer-based graphic organizers to solve one-step word problems for middle school students with mild intellectual disability: A preliminary study.

    PubMed

    Sheriff, Kelli A; Boon, Richard T

    2014-08-01

    The purpose of this study was to examine the effects of computer-based graphic organizers, using Kidspiration 3© software, to solve one-step word problems. Participants included three students with mild intellectual disability enrolled in a functional academic skills curriculum in a self-contained classroom. A multiple probe single-subject research design (Horner & Baer, 1978) was used to evaluate the effectiveness of computer-based graphic organizers for solving mathematical one-step word problems. During the baseline phase, the students completed a teacher-generated worksheet that consisted of nine functional word problems in a traditional format using a pencil, paper, and a calculator. In the intervention and maintenance phases, the students were instructed to complete the word problems using a computer-based graphic organizer. Results indicated that all three of the students improved in their ability to solve the one-step word problems using computer-based graphic organizers compared to traditional instructional practices. Limitations of the study and recommendations for future research directions are discussed.

  10. Oceanic pipeline computations

    SciTech Connect

    Marks, A.

    1980-01-01

    Technical and economic feasibility, design, and construction of oil, gas, and two-phase oceanic pipeline systems are discussed. In addition, formulae, references, examples, and programmable calculator software (Hewlett-Packard-67) are given. The contents include: preliminary pipeline sizing; fluid characteristics; preliminary hydraulics; oceanographics; preliminary corridor selection; route selection; final pipeline design; hydraulic design; wall thickness selection; oceanographic design computations; stress analysis; and construction parameters. (JMT)

  11. Computer-enhanced interventions for drug use and HIV risk in the emergency room: preliminary results on psychological precursors of behavior change.

    PubMed

    Bonar, Erin E; Walton, Maureen A; Cunningham, Rebecca M; Chermack, Stephen T; Bohnert, Amy S B; Barry, Kristen L; Booth, Brenda M; Blow, Frederic C

    2014-01-01

    This article describes process data from a randomized controlled trial among 781 adults recruited in the emergency department who reported recent drug use and were randomized to: intervener-delivered brief intervention (IBI) assisted by computer, computerized BI (CBI), or enhanced usual care (EUC). Analyses examined differences between baseline and post-intervention on psychological constructs theoretically related to changes in drug use and HIV risk: importance, readiness, intention, help-seeking, and confidence. Compared to EUC, participants receiving the IBI significantly increased in confidence and intentions; CBI patients increased importance, readiness, confidence, and help-seeking. Both groups increased relative to the EUC in likelihood of condom use with regular partners. Examining BI components suggested that benefits of change and tools for change were associated with changes in psychological constructs. Delivering BIs targeting drug use and HIV risk using computers appears promising for implementation in healthcare settings. This trial is ongoing and future work will report behavioral outcomes.

  12. An Overview of Preliminary Computational and Experimental Results for the Semi-Span Super-Sonic Transport (S4T) Wind-Tunnel Model

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Perry, Boyd, III; Florance, James R.; Sanetrik, Mark D.; Wieseman, Carol D.; Stevens, William L.; Funk, Christie J.; Hur, Jiyoung; Christhilf, David M.; Coulson, David A.

    2011-01-01

    A summary of computational and experimental aeroelastic and aeroservoelastic (ASE) results for the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model is presented. A broad range of analyses and multiple ASE wind-tunnel tests of the S4T have been performed in support of the ASE element in the Supersonics Program, part of NASA's Fundamental Aeronautics Program. The computational results to be presented include linear aeroelastic and ASE analyses, nonlinear aeroelastic analyses using an aeroelastic CFD code, and rapid aeroelastic analyses using CFD-based reduced-order models (ROMs). Experimental results from two closed-loop wind-tunnel tests performed at NASA Langley's Transonic Dynamics Tunnel (TDT) will be presented as well.

  13. A web-based remote radiation treatment planning system using the remote desktop function of a computer operating system: a preliminary report.

    PubMed

    Suzuki, Keishiro; Hirasawa, Yukinori; Yaegashi, Yuji; Miyamoto, Hideki; Shirato, Hiroki

    2009-01-01

    We developed a web-based, remote radiation treatment planning system which allowed staff at an affiliated hospital to obtain support from a fully staffed central institution. Network security was based on a firewall and a virtual private network (VPN). Client computers were installed at a cancer centre, at a university hospital and at a staff home. We remotely operated the treatment planning computer using the Remote Desktop function built in to the Windows operating system. Except for the initial setup of the VPN router, no special knowledge was needed to operate the remote radiation treatment planning system. There was a time lag that seemed to depend on the volume of data traffic on the Internet, but it did not affect smooth operation. The initial cost and running cost of the system were reasonable.

  14. Can a numerically stable subgrid-scale model for turbulent flow computation be ideally accurate?: a preliminary theoretical study for the Gaussian filtered Navier-Stokes equations.

    PubMed

    Ida, Masato; Taniguchi, Nobuyuki

    2003-09-01

    This paper introduces a candidate for the origin of the numerical instabilities in large eddy simulation repeatedly observed in academic and practical industrial flow computations. Without resorting to any subgrid-scale modeling, but based on a simple assumption regarding the streamwise component of flow velocity, it is shown theoretically that in a channel-flow computation, the application of the Gaussian filtering to the incompressible Navier-Stokes equations yields a numerically unstable term, a cross-derivative term, which is similar to one appearing in the Gaussian filtered Vlasov equation derived by Klimas [J. Comput. Phys. 68, 202 (1987)] and also to one derived recently by Kobayashi and Shimomura [Phys. Fluids 15, L29 (2003)] from the tensor-diffusivity subgrid-scale term in a dynamic mixed model. The present result predicts that not only the numerical methods and the subgrid-scale models employed but also only the applied filtering process can be a seed of this numerical instability. An investigation concerning the relationship between the turbulent energy scattering and the unstable term shows that the instability of the term does not necessarily represent the backscatter of kinetic energy which has been considered a possible origin of numerical instabilities in large eddy simulation. The present findings raise the question whether a numerically stable subgrid-scale model can be ideally accurate.

  15. Computer simulations of 10-km-diameter asteroid impacts into oceanic and continental sites: Preliminary results on atmospheric passage, cratering and ejecta dynamics

    NASA Technical Reports Server (NTRS)

    Roddy, D. J.; Schuster, S. H.; Rosenblatt, Martin; Grant, L. B.; Hassig, P. J.; Kreyenhagen, K. N.

    1987-01-01

    A series of analytical calculations of large scale cratering events for both oceanic and continental sites was made in order to examine their effects on the target media and atmosphere. The first analytical studies completed consist of computer simulations of the dynamics of: (1) the passage of a 10 km diameter asteroid moving at 20 km/sec through the Earth's atmosphere, and (2) the impact cratering events in both oceanic and continental environments. Calculation of the dynamics associated with the passage of the asteroid through the atmosphere showed strong effects on the surrounding air mass. The calculations of the impact cratering events showed equally dramatic effects on the oceanic and continental environments. These effects are briefly discussed.

  16. Computed Tomography Versus Magnetic Resonance Imaging-Based Contouring in Cervical Cancer Brachytherapy: Results of a Prospective Trial and Preliminary Guidelines for Standardized Contours

    SciTech Connect

    Viswanathan, Akila N.; Dimopoulos, Johannes (E-mail: Johannes.dimopoulos@akhwien.at); Kirisits, Christian; Berger, Daniel; Poetter, Richard

    2007-06-01

    Purpose: To compare the contours and dose-volume histograms (DVH) of the tumor and organs at risk (OAR) with computed tomography (CT) vs. magnetic resonance imaging (MRI) in cervical cancer brachytherapy. Methods and Materials: Ten patients underwent both MRI and CT after applicator insertion. The dose received by at least 90% of the volume (D90), the minimal target dose (D100), the volume treated to the prescription dose or greater for the high-risk (HR) and intermediate-risk (IR) clinical target volume (CTV), and the dose to 0.1 cm³, 1 cm³, and 2 cm³ for the OARs were evaluated. A standardized approach to contouring on CT (CTStd) was developed, implemented (HR-CTV CTStd and IR-CTV CTStd), and compared with the MRI contours. Results: Tumor height, thickness, and total volume measurements, as determined by either CT or CTStd, were not significantly different compared with the MRI volumes. In contrast, the width measurements differed in HR-CTV CTStd (p = 0.05) and IR-CTV CTStd (p = 0.01). For the HR-CTV CTStd, this resulted in statistically significant differences in the volume treated to the prescription dose or greater (MRI, 96% vs. CTStd, 86%, p = 0.01), D100 (MRI, 5.4 vs. CTStd, 3.4, p < 0.01), and D90 (MRI, 8.7 vs. CTStd, 6.7, p < 0.01). Correspondingly, the IR-CTV DVH values on MRI vs. CTStd differed in the D100 (MRI, 3.0 vs. CTStd, 2.2, p = 0.01) and D90 (MRI, 5.6 vs. CTStd, 4.6, p = 0.02). The MRI and CT DVH values of the dose to 0.1 cm³, 1 cm³, and 2 cm³ for the OARs were similar. Conclusion: Computed tomography-based or MRI-based scans at brachytherapy are adequate for OAR DVH analysis. However, CT tumor contours can significantly overestimate the tumor width, resulting in significant differences in the D90, D100, and volume treated to the prescription dose or greater for the HR-CTV compared with that using
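    For readers unfamiliar with the DVH quantities compared above (D90, D100, and the volume treated to the prescription dose or greater), the sketch below computes them from a flat array of per-voxel doses. It assumes equal voxel volumes and only illustrates the definitions; it is not the planning-system implementation used in the study.

    ```python
    import numpy as np

    def d_percent(voxel_doses_gy, percent):
        """Minimum dose received by the hottest `percent` of the volume,
        e.g. percent=90 gives D90 and percent=100 gives D100 (minimum dose)."""
        doses = np.sort(np.asarray(voxel_doses_gy, dtype=float))[::-1]
        n_voxels = max(1, int(round(doses.size * percent / 100.0)))
        return float(doses[n_voxels - 1])

    def volume_at_prescription(voxel_doses_gy, prescription_gy):
        """Fraction of the volume receiving at least the prescription dose."""
        doses = np.asarray(voxel_doses_gy, dtype=float)
        return float((doses >= prescription_gy).mean())

    # Illustrative use with made-up doses (Gy) for a small target volume:
    doses = np.array([4.8, 5.2, 6.1, 7.0, 7.4, 8.3, 9.0, 10.2])
    print(d_percent(doses, 90), d_percent(doses, 100), volume_at_prescription(doses, 7.0))
    ```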

  17. Comparison of Contrast-Enhanced Ultrasound and Computed Tomography in Classifying Endoleaks After Endovascular Treatment of Abdominal Aorta Aneurysms: Preliminary Experience

    SciTech Connect

    Carrafiello, Gianpaolo; Lagana, Domenico; Recaldini, Chiara; Mangini, Monica; Bertolotti, Elena; Caronno, Roberto; Tozzi, Matteo; Piffaretti, Gabriele; Annibale Genovese, Eugenio; Fugazzola, Carlo

    2006-12-15

    The purpose of the study was to assess the effectiveness of contrast-enhanced ultrasonography (CEUS) in endoleak classification after endovascular treatment of an abdominal aortic aneurysm compared to computed tomography angiography (CTA). From May 2001 to April 2003, 10 patients with endoleaks already detected by CTA underwent CEUS with Sonovue® to confirm the CTA classification or to reclassify the endoleak. In three conflicting cases, the patients were also studied with conventional angiography. CEUS confirmed the CTA classification in seven cases (type II endoleaks). Two CTA type III endoleaks were classified as type II using CEUS and one CTA type II endoleak was classified as type I by CEUS. Regarding the cases with discordant classification, conventional angiography confirmed the ultrasound classification. Additionally, CEUS documented the origin of type II endoleaks in all cases. After CEUS reclassification of endoleaks, a significant change in patient management occurred in three cases. CEUS allows a better attribution of the origin of the endoleak, as it shows the flow in real time. CEUS is more specific than CTA in endoleak classification and gives more accurate information in therapeutic planning.

  18. Optimizing Hybrid Occlusion in Face-Jaw-Teeth Transplantation: A Preliminary Assessment of Real-Time Cephalometry as Part of the Computer-Assisted Planning and Execution Workstation for Craniomaxillofacial Surgery

    PubMed Central

    Murphy, Ryan J.; Basafa, Ehsan; Hashemi, Sepehr; Grant, Gerald T.; Liacouras, Peter; Susarla, Srinivas M.; Otake, Yoshito; Santiago, Gabriel; Armand, Mehran; Gordon, Chad R.

    2016-01-01

    Background The aesthetic and functional outcomes surrounding Le Fort–based, face-jaw-teeth transplantation have been suboptimal, often leading to posttransplant class II/III skeletal profiles, palatal defects, and “hybrid malocclusion.” Therefore, a novel technology—real-time cephalometry—was developed to provide the surgical team instantaneous, intraoperative knowledge of three-dimensional dentoskeletal parameters. Methods Mock face-jaw-teeth transplantation operations were performed on plastic and cadaveric human donor/recipient pairs (n = 2). Preoperatively, cephalometric landmarks were identified on donor/recipient skeletons using segmented computed tomographic scans. The computer-assisted planning and execution workstation tracked the position of the donor face-jaw-teeth segment in real time during the placement/inset onto recipient, reporting pertinent hybrid cephalometric parameters from any movement of donor tissue. The intraoperative data measured through real-time cephalometry were compared to posttransplant measurements for accuracy assessment. In addition, posttransplant cephalometric relationships were compared to planned outcomes to determine face-jaw-teeth transplantation success. Results Compared with postoperative data, the real-time cephalometry–calculated intraoperative measurement errors were 1.37 ± 1.11 mm and 0.45 ± 0.28 degrees for the plastic skull and 2.99 ± 2.24 mm and 2.63 ± 1.33 degrees for the human cadaver experiments. These results were comparable to the posttransplant relations to planned outcome (human cadaver experiment, 1.39 ± 1.81 mm and 2.18 ± 1.88 degrees; plastic skull experiment, 1.06 ± 0.63 mm and 0.53 ± 0.39 degrees). Conclusion Based on this preliminary testing, real-time cephalometry may be a valuable adjunct for adjusting and measuring “hybrid occlusion” in face-jaw-teeth transplantation and other orthognathic surgical procedures. PMID:26218382

  19. A database of body-only computer-generated pictures of women for body-image studies: Development and preliminary validation.

    PubMed

    Moussally, Joanna M; Rochat, Lucien; Posada, Andrés; Van der Linden, Martial

    2017-02-01

    The body-shape-related stimuli used in most body-image studies have several limitations (e.g., a lack of pilot validation procedures and the use of non-body-shape-related control/neutral stimuli). We therefore developed a database of 61 computer-generated body-only pictures of women, wherein bodies were methodically manipulated in terms of fatness versus thinness. Eighty-two young women assessed the pictures' attractiveness, beauty, harmony (valence ratings), and body shape (assessed on a thinness/fatness axis), providing normative data for valence and body shape ratings. First, stimuli manipulated for fatness versus thinness conveyed comparable emotional intensities regarding the valence and body shape ratings. Second, different subcategories of stimuli were obtained on the basis of variations in body shape and valence judgments. Fat and thin bodies were distributed into several subcategories depending on their valence ratings, and a subcategory containing stimuli that were neutral in terms of valence and body shape was identified. Interestingly, at a descriptive level, the thinness/fatness manipulations of the bodies were in a curvilinear relationship with the valence ratings: Thin bodies were not only judged as positive, but also as negative when their estimated body mass indexes (BMIs) decreased too much. Finally, convergent validity was assessed by exploring the impacts of body-image-related variables (BMI, thin-ideal internalization, and body dissatisfaction) on participants' judgments of the bodies. Valence judgments, but not body shape judgments, were influenced by the participants' levels of thin-ideal internalization and body dissatisfaction. Participants' BMIs did not significantly influence their judgments. Given these findings, this database contains relevant material that can be used in various fields, primarily for studies of body-image disturbance or eating disorders.

  20. Mosaic tile model to compute gravitational field for infinitely thin non-axisymmetric objects and its application to preliminary analysis of gravitational field of M74

    NASA Astrophysics Data System (ADS)

    Fukushima, Toshio

    2016-07-01

    Using the analytical expressions of the Newtonian gravitational potential and the associated acceleration vector for an infinitely thin uniform rectangular plate, we developed a method to compute the gravitational field of a general infinitely thin object without assuming its axial symmetry when its surface mass density is known at evenly spaced rectangular grid points. We utilized the method in evaluating the gravitational field of the H I gas, dust, red stars, and blue stars components of M74 from its THINGS, 2MASS, PDSS1, and GALEX data. The non-axisymmetric feature of M74 including an asymmetric spiral structure is seen from (i) the contour maps of the determined gravitational potential, (ii) the vector maps of the associated acceleration vector, and (iii) the cross-section views of the gravitational field and the surface mass density along different directions. An x-mark pattern in the gravitational field is detected at the core of M74 from the analysis of its dust and red stars components. Meanwhile, along the east-west direction in the central region of the angular size of 1 arcmin, the rotation curve derived from the radial component of the acceleration vector caused by the red stars component matches well with that observed by the VENGA project. Thus the method will be useful in studying the dynamics of particles and fluids near and inside spiral galaxies with known photometry data. Electronically available are the table of the determined gravitational fields of M74 on its galactic plane as well as the FORTRAN 90 programs to produce them.
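    The paper evaluates the field from analytical expressions for infinitely thin uniform rectangular plates. As a much cruder stand-in that only shows the structure of a grid-based summation, the sketch below treats each grid cell as a point mass; the cell size, the softening of the self-cell, and all numerical values are assumptions for illustration only.

    ```python
    import numpy as np

    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def potential_on_plane(x, y, grid_x, grid_y, surface_density, cell_size):
        """Newtonian potential at (x, y) in the plane of an infinitely thin sheet
        sampled on an evenly spaced grid. Each cell is approximated by a point
        mass at its centre (the paper instead uses exact rectangular-plate
        expressions)."""
        dx = grid_x - x
        dy = grid_y - y
        r = np.hypot(dx, dy)
        r = np.maximum(r, 0.5 * cell_size)        # soften the self-cell singularity
        cell_mass = surface_density * cell_size**2
        return float(-G * np.sum(cell_mass / r))

    # Illustrative grid of ~1 kpc cells with a made-up uniform surface density (SI units).
    cell = 3.086e19                                # ~1 kpc in metres
    xs, ys = np.meshgrid(np.arange(-10, 11) * cell, np.arange(-10, 11) * cell)
    sigma = np.full_like(xs, 1.0e-3)               # kg/m^2, illustrative value only
    print(potential_on_plane(0.0, 0.0, xs, ys, sigma, cell))
    ```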

  1. Quantification of left coronary bifurcation angles and plaques by coronary computed tomography angiography for prediction of significant coronary stenosis: A preliminary study with dual-source CT

    PubMed Central

    Cui, Yue; Zeng, Wenjuan; Yu, Jie; Lu, Jing; Hu, Yuannan; Diao, Nan; Liang, Bo; Han, Ping; Shi, Heshui

    2017-01-01

    Purpose To evaluate the diagnostic performance of left coronary bifurcation angles and plaque characteristics for prediction of coronary stenosis by dual-source CT. Methods 106 patients suspected of coronary artery disease undergoing both coronary computed tomography angiography (CCTA) and invasive coronary angiography (CAG) within three months were included. Left coronary bifurcation angles including the angles between the left anterior descending artery and left circumflex artery (LAD-LCx), left main coronary artery and left anterior descending artery (LM-LAD), left main coronary artery and left circumflex artery (LM-LCx) were measured on CT images. CCTA plaque parameters were calculated by plaque analysis software. Coronary stenosis ≥ 50% by CAG was defined as significant. Results 106 patients with 318 left coronary bifurcation angles and 126 vessels were analyzed. The bifurcation angle of LAD-LCx was significantly larger in left coronary stenosis ≥ 50% than stenosis < 50%, and significantly wider in the non-calcified plaque group than calcified. Multivariable analyses showed the bifurcation angle of LAD-LCx was an independent predictor for significant left coronary stenosis (OR = 1.423, P = 0.002). In ROC curve analysis, LAD-LCx predicted significant left coronary stenosis with a sensitivity of 66.7%, specificity of 78.4%, positive predictive value of 85.2% and negative predictive value of 55.8%. The lipid plaque volume improved the diagnostic performance of CCTA diameter stenosis (AUC: 0.854 vs. 0.900, P = 0.045) in significant coronary stenosis. Conclusions The bifurcation angle of LAD-LCx could predict significant left coronary stenosis. Wider LAD-LCx is related to non-calcified lesions. Lipid plaque volume could improve the diagnostic performance of CCTA for coronary stenosis prediction. PMID:28346530
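    The diagnostic-performance figures quoted above (sensitivity, specificity, positive and negative predictive value) follow directly from a 2x2 confusion table. A minimal sketch of those definitions is given below; the counts in the example are made up and are not the study's data.

    ```python
    def diagnostic_metrics(tp, fp, tn, fn):
        """Sensitivity, specificity, positive and negative predictive value
        from true/false positive/negative counts."""
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
        }

    # Illustrative counts only (not the study's data):
    print(diagnostic_metrics(tp=46, fp=8, tn=29, fn=23))
    ```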

  2. Preliminary experience on the implementation of computed tomography (CT)-based image guided brachytherapy (IGBT) of cervical cancer using high-dose-rate (HDR) Cobalt-60 source in University of Malaya Medical Centre (UMMC)

    NASA Astrophysics Data System (ADS)

    Jamalludin, Z.; Min, U. N.; Ishak, W. Z. Wan; Malik, R. Abdul

    2016-03-01

    This study presents our preliminary work on the implementation of computed tomography (CT) image guided brachytherapy (IGBT) for cervical cancer patients. We developed a protocol in which patients undergo two Magnetic Resonance Imaging (MRI) examinations, (a) prior to external beam radiotherapy (EBRT) and (b) prior to intra-cavitary brachytherapy, for tumour identification and delineation during IGBT planning and dosimetry. For each fraction, patients were simulated using a CT simulator and images were transferred to the treatment planning system. The HR-CTV, IR-CTV, bladder and rectum were delineated using CT-based contouring for cervical cancer. Plans were optimised to achieve HR-CTV and IR-CTV doses (D90) of total EQD2 80 Gy and 60 Gy respectively, while limiting the minimum dose to the most irradiated 2 cm³ volume (D2cc) of the bladder and rectum to total EQD2 90 Gy and 75 Gy respectively. Data from seven insertions were analysed by comparing the volume-based with traditional point-based doses. Based on our data, there were differences between volume and point doses for the HR-CTV, bladder and rectum. As the number of patients receiving CT-based IGBT in our centre increases, treatment and dosimetry accuracy are expected to improve with this implementation.
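    The EQD2 targets quoted above come from the standard linear-quadratic conversion, EQD2 = D x (d + α/β) / (2 + α/β), where D is the total physical dose, d the dose per fraction, and α/β the tissue-specific parameter (commonly 10 Gy for tumour and 3 Gy for late-responding organs at risk). The sketch below applies that formula; the fractionation numbers in the example are illustrative, not the centre's protocol.

    ```python
    def eqd2(total_dose_gy, dose_per_fraction_gy, alpha_beta_gy):
        """Equieffective dose in 2 Gy fractions (linear-quadratic model)."""
        return total_dose_gy * (dose_per_fraction_gy + alpha_beta_gy) / (2.0 + alpha_beta_gy)

    # Illustrative example: 45 Gy EBRT in 1.8 Gy fractions plus 4 x 7 Gy HDR
    # brachytherapy, evaluated for tumour with alpha/beta = 10 Gy.
    ebrt = eqd2(45.0, 1.8, 10.0)      # ~44.3 Gy
    brachy = eqd2(28.0, 7.0, 10.0)    # ~39.7 Gy
    print(ebrt + brachy)              # ~84 Gy total EQD2 to the HR-CTV
    ```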

  3. Preliminary Report: Results of Computed Tracer Concentrations over Eastern China, South Korea, and Japan for 01 March to 30 May 2007 Daily Simulated Releases from Taiyuan, China

    SciTech Connect

    Vogt, P

    2007-08-07

    In order to prepare for a proposed long range tracer experiment in China for the spring of 2008 time period, NARAC computed hypothetical PMCH concentrations over Eastern China, South Korea and Japan for simulated releases from Taiyuan, China. Normalized 1 kg PMCH source-strength releases were made twice a day, with wind input from a global forecast weather model. We used 6-hour analysis fields valid at the start of each model run, resulting in four wind fields per day. The selected domain encompassed the region of interest over eastern Asia and the Western Pacific. Screening runs were made for each day at 0000 and 1200 UTC from 01 April, 2007 through 29 May, 2007 for a total of 90 days and 180 cases. 24-hour average air concentrations were evaluated at 22 sample cities in the three regions of interest for each case. 15 sample cities were selected to help quantify modeling results for experiment objectives. Any case that resulted in model-predicted air concentrations exceeding 2.0E-02 fL/L at a sample city in all three regions was then selected for a detailed model run, with source times six hours before and after evaluated in addition to the case time. The detailed runs used the same wind fields and model domain, but 6-hour average air concentrations were generated and analyzed for the 15 sample cities. Each of the 180 cases was ranked subjectively, based on whether or not the model prediction indicated the possibility that a release on that date and time might achieve the long range experiment objectives. The ranks used were High, Good, Low, Poor, and Bad. Of the 180 cases run, NARAC dispersion models predicted 6 instances of High possibility, 8 cases of Good, 32 of Low, 74 of Poor, and 60 cases of Bad probability. Detailed model runs were made for all 14 High or Good probability cases, a total of only 7.8% of all cases analyzed. Based on the results of this study we have identified a few dates on which a release of a reasonable amount of PMCH tracer (on the order of 500 kg

  4. Evaluation of pulmonary function using single-breath-hold dual-energy computed tomography with xenon: Results of a preliminary study.

    PubMed

    Kyoyama, Hiroyuki; Hirata, Yusuke; Kikuchi, Satoshi; Sakai, Kosuke; Saito, Yuriko; Mikami, Shintaro; Moriyama, Gaku; Yanagita, Hisami; Watanabe, Wataru; Otani, Katharina; Honda, Norinari; Uematsu, Kazutsugu

    2017-01-01

    Xenon-enhanced dual-energy computed tomography (xenon-enhanced CT) can provide lung ventilation maps that may be useful for assessing structural and functional abnormalities of the lung. Xenon-enhanced CT has been performed using a multiple-breath-hold technique during xenon washout. We recently developed xenon-enhanced CT using a single-breath-hold technique to assess ventilation. We sought to evaluate whether xenon-enhanced CT using a single-breath-hold technique correlates with pulmonary function testing (PFT) results. Twenty-six patients, including 11 chronic obstructive pulmonary disease (COPD) patients, underwent xenon-enhanced CT and PFT. Three of the COPD patients underwent xenon-enhanced CT before and after bronchodilator treatment. Xenon CT images were obtained by dual-source CT during a breath-hold after a single vital-capacity inspiration of a xenon-oxygen gas mixture. Image postprocessing by 3-material decomposition generated conventional CT and xenon-enhanced images. Low-attenuation areas on xenon images matched low-attenuation areas on conventional CT in 21 cases but matched normal-attenuation areas in 5 cases. Volumes of Hounsfield unit (HU) histograms of xenon images correlated moderately and highly with vital capacity (VC) and total lung capacity (TLC), respectively (r = 0.68 and 0.85). Means and modes of the histograms correlated weakly with VC (r = 0.39 and 0.38), moderately with forced expiratory volume in 1 second (FEV1) (r = 0.59 and 0.56), weakly with the ratio of FEV1 to forced vital capacity (FVC) (r = 0.46 and 0.42), and moderately with the ratio of FEV1 to its predicted value (r = 0.64 and 0.60). Mode and volume of the histograms increased in 2 COPD patients after the improvement of FEV1 with bronchodilators. Inhalation of xenon gas caused no adverse effects. Xenon-enhanced CT using a single-breath-hold technique depicted functional abnormalities not detectable on thin-slice CT. Mode, mean, and volume of HU histograms of xenon images reflected
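    The histogram statistics correlated with the PFT results above (mean, mode, and volume of the xenon-map HU histogram) can be computed as in the sketch below. The bin width, the voxel-volume handling, and the example arrays are assumptions for illustration only, not the study's processing pipeline.

    ```python
    import numpy as np

    def xenon_histogram_summary(hu_values, voxel_volume_ml, bin_width_hu=10.0):
        """Mean, mode (centre of the most populated bin), and total volume of the
        voxels contributing to a xenon-enhancement HU histogram."""
        hu = np.asarray(hu_values, dtype=float)
        edges = np.arange(hu.min(), hu.max() + bin_width_hu, bin_width_hu)
        counts, edges = np.histogram(hu, bins=edges)
        mode_hu = edges[int(np.argmax(counts))] + bin_width_hu / 2.0
        return {"mean_hu": float(hu.mean()),
                "mode_hu": float(mode_hu),
                "volume_ml": hu.size * voxel_volume_ml}

    def pearson_r(x, y):
        """Pearson correlation coefficient, as used for the r values quoted above."""
        return float(np.corrcoef(x, y)[0, 1])

    # Illustrative use with synthetic data:
    rng = np.random.default_rng(1)
    hu = rng.normal(loc=35.0, scale=12.0, size=5000)
    print(xenon_histogram_summary(hu, voxel_volume_ml=0.002))
    print(pearson_r([3.1, 4.2, 2.8, 5.0], [2.9, 4.0, 3.0, 4.8]))
    ```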

  5. A PRELIMINARY JUPITER MODEL

    SciTech Connect

    Hubbard, W. B.; Militzer, B.

    2016-03-20

    In anticipation of new observational results for Jupiter's axial moment of inertia and gravitational zonal harmonic coefficients from the forthcoming Juno orbiter, we present a number of preliminary Jupiter interior models. We combine results from ab initio computer simulations of hydrogen–helium mixtures, including immiscibility calculations, with a new nonperturbative calculation of Jupiter's zonal harmonic coefficients, to derive a self-consistent model for the planet's external gravity and moment of inertia. We assume helium rain modified the interior temperature and composition profiles. Our calculation predicts zonal harmonic values to which measurements can be compared. Although some models fit the observed (pre-Juno) second- and fourth-order zonal harmonics to within their error bars, our preferred reference model predicts a fourth-order zonal harmonic whose absolute value lies above the pre-Juno error bars. This model has a dense core of about 12 Earth masses and a hydrogen–helium-rich envelope with approximately three times solar metallicity.

  6. Multidisciplinary Optimization Methods for Preliminary Design

    NASA Technical Reports Server (NTRS)

    Korte, J. J.; Weston, R. P.; Zang, T. A.

    1997-01-01

    An overview of multidisciplinary optimization (MDO) methodology and two applications of this methodology to the preliminary design phase are presented. These applications are being undertaken to improve, develop, validate and demonstrate MDO methods. Each is presented to illustrate different aspects of this methodology. The first application is an MDO preliminary design problem for defining the geometry and structure of an aerospike nozzle of a linear aerospike rocket engine. The second application demonstrates the use of the Framework for Interdisciplinary Design Optimization (FIDO), which is a computational environment system, by solving a preliminary design problem for a High-Speed Civil Transport (HSCT). The two sample problems illustrate the advantages to performing preliminary design with an MDO process.

  7. Computational prediction of propellant reorientation

    NASA Technical Reports Server (NTRS)

    Hochstein, John I.

    1987-01-01

    Viewgraphs from a presentation on computational prediction of propellant reorientation are given. Information is given on code verification, test conditions, predictions for a one-quarter scale cryogenic tank, pulsed settling, and preliminary results.

  8. Preliminary Drill Sites

    DOE Data Explorer

    Lane, Michael

    2013-06-28

    Preliminary locations for intermediate depth temperature gradient holes and/or resource confirmation wells based on compilation of geological, geophysical and geochemical data prior to carrying out the DOE-funded reflection seismic survey.

  9. Computer-Graphical Simulation Of Robotic Welding

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken; Cook, George

    1988-01-01

    Computer program ROBOSIM, developed to simulate operations of robots, applied to preliminary design of robotic arc-welding operation. Limitations on equipment investigated in advance to prevent expensive mistakes. Computer makes drawing of robotic welder and workpiece on positioning table. Such numerical simulation used to perform rapid, safe experiments in computer-aided design or manufacturing.

  10. Notification: Audit of EPA's Cloud Computer Initiative

    EPA Pesticide Factsheets

    Project #OA-FY13-0095, December 17, 2012. The U.S. Environmental Protection Agency (EPA) Office of Inspector General plans to begin preliminary research on the audit of EPA’s cloud computer initiative.

  11. Preliminary Response Analysis of AUV

    NASA Astrophysics Data System (ADS)

    Hariri, Azian; Basharie, Siti Mariam; Ghani, Mohamad Hanifah Abd.

    2010-06-01

    Development of an Autonomous Unmanned Vehicle (AUV) requires a full understanding of its overall working principles, which demands time, experience, and a wide range of knowledge covering the relevant scientific facts. This study was carried out to acquire the fundamental knowledge needed to understand the stability and response of an AUV. The longitudinal response and stability of the AUV owing to deflection of the stern plane during trimmed equilibrium motion can be computed by solving the AUV equations of motion. In this study, the AUV equations of motion were rederived and the solution was computed with the aid of Matlab software. Starting from an existing AUV, new dimensions, weight, and speed were specified and used in the rederivation of the linearised AUV longitudinal equations of motion. From the analysis, the longitudinal response of the AUV shows that the stern plane and thrust have relatively steady longitudinal control power and quick response characteristics. The results successfully give a preliminary insight into the response and dynamic stability of the specified AUV.

  12. Preliminary Cruise Report - Iguana Expedition

    DTIC Science & Technology

    A preliminary cruise report of Expedition Iguana, 31 March 1972-11 May 1972, gives some preliminary results, a list of equipment and personnel, stations and data gathered, and track and topographic plots. (Author)

  13. Preliminary AirMSPI Datasets

    Atmospheric Science Data Center

    2016-12-06

    The data files available through this web page and FTP links are preliminary ... geometric corrections. Caution should be used for science analysis. At a later date, more qualified versions will be made public.

  14. ASCI 2010 appropriateness criteria for cardiac computed tomography: a report of the Asian Society of Cardiovascular Imaging Cardiac Computed Tomography and Cardiac Magnetic Resonance Imaging Guideline Working Group.

    PubMed

    Tsai, I-Chen; Choi, Byoung Wook; Chan, Carmen; Jinzaki, Masahiro; Kitagawa, Kakuya; Yong, Hwan Seok; Yu, Wei

    2010-02-01

    In Asia, the healthcare system, populations and patterns of disease differ from those of Western countries. The current reports on the criteria for cardiac CT scans, provided by Western professional societies, are not appropriate for Asian cultures. The Asian Society of Cardiovascular Imaging, the only society dedicated to cardiovascular imaging in Asia, formed a Working Group and invited 23 Technical Panel members representing a variety of Asian countries to rate the 51 indications for cardiac CT in clinical practice in Asia. The indications were rated as 'appropriate' (7-9), 'uncertain' (4-6), or 'inappropriate' (1-3) on a scale of 1-9. The median score was used for the final result if there was no disagreement. The final ratings for the indications were 33 appropriate, 14 uncertain and 4 inappropriate, and 20 of them showed high agreement (19 appropriate and 1 inappropriate). Specifically, the Asian representatives considered cardiac CT an appropriate modality for Kawasaki disease and congenital heart diseases in follow-up and in symptomatic patients. In addition, except for some specified conditions, cardiac CT was considered an appropriate modality for one-stop-shop ischemic heart disease evaluation due to its general appropriateness for coronary, structural and functional evaluation. This report is expected to have a significant impact on clinical practice, research and reimbursement policy in Asia.
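    The rating procedure described above reduces each indication to the median of the panel's 1-9 scores and maps it to a category. A minimal sketch of that mapping, under the 7-9 / 4-6 / 1-3 banding stated in the abstract, is shown below (it ignores the document's handling of panel disagreement).

    ```python
    import statistics

    def classify_indication(panel_scores):
        """Map a technical panel's 1-9 ratings to an appropriateness category
        using the median score."""
        median_score = statistics.median(panel_scores)
        if median_score >= 7:
            return "appropriate"
        if median_score >= 4:
            return "uncertain"
        return "inappropriate"

    print(classify_indication([8, 7, 9, 8, 7, 8, 6, 9]))  # appropriate
    print(classify_indication([3, 5, 4, 6, 4, 5, 2, 4]))  # uncertain
    ```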

  15. Cognitive remediation for adolescents with 22q11 deletion syndrome (22q11DS): A preliminary study examining effectiveness, feasibility, and fidelity of a hybrid strategy, remote and computer-based intervention

    PubMed Central

    Mariano, Margaret A.; Tang, Kerri; Kurtz, Matthew; Kates, Wendy R.

    2015-01-01

    Background 22q11DS is a multiple anomaly syndrome involving intellectual and behavioral deficits, and increased risk for schizophrenia. As cognitive remediation (CR) has recently been found to improve cognition in younger patients with schizophrenia, we investigated the efficacy, feasibility, and fidelity of a remote, hybrid strategy, computerized CR program in youth with 22q11DS. Methods A longitudinal design was implemented in which 21 participants served as their own controls. Following an eight month baseline period in which no interventions were provided, cognitive coaches met with participants remotely for CR via video conferencing three times a week over a targeted 8 month timeframe and facilitated their progress through the intervention, offering task-specific strategies. A subset of strategies were examined for fidelity. Outcomes were evaluated using a neurocognitive test battery at baseline, pre-treatment and post-treatment. Results All participants adhered to the intervention. The mean length of the treatment phase was 7.96 months. A moderately high correlation (intraclass correlation coefficient, 0.73) was found for amount and type of strategies offered by coaches. Participants exhibited significant improvements (ES = .36–.55, p ≤ .009) in working memory, shifting attention and cognitive flexibility. All significant models were driven by improvements in pre to post-treatment scores. Conclusions Based on our preliminary investigation, a remote, hybrid strategy, computerized CR program can be implemented with 22q11DS youth despite geographic location, health, and cognitive deficits. It appears effective in enhancing cognitive skills during the developmental period of adolescence, making this type of CR delivery useful for youth with 22q11DS transitioning into post-school environments. PMID:26044111

  16. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  17. On Preliminary Breakdown

    NASA Astrophysics Data System (ADS)

    Beasley, W. H.; Petersen, D.

    2013-12-01

    The preliminary breakdown phase of a negative cloud-to-ground lightning flash was observed in detail. Observations were made with a Photron SA1.1 high-speed video camera operating at 9,000 frames per second, fast optical sensors, a flat-plate electric field antenna covering the SLF to MF band, and VHF and UHF radio receivers with bandwidths of 20 MHz. Bright stepwise extensions of a negative leader were observed at an altitude of 8 km during the first few milliseconds of the flash, and were coincident with bipolar electric field pulses called 'characteristic pulses'. The 2-D step lengths of the preliminary processes were in excess of 100 meters, with some 2-D step lengths in excess of 200 meters. Smaller and shorter unipolar electric field pulses were superposed onto the bipolar electric field pulses, and were coincident with VHF and UHF radio pulses. After a few milliseconds, the emerging negative stepped leader system showed a marked decrease in luminosity, step length, and propagation velocity. Details of these events will be discussed, including the possibility that the preliminary breakdown phase consists not of a single developing lightning leader system, but of multiple smaller lightning leader systems that eventually join together into a single system.

  18. A preliminary weather model for optical communications through the atmosphere

    NASA Technical Reports Server (NTRS)

    Shaik, K. S.

    1988-01-01

    A preliminary weather model is presented for optical propagation through the atmosphere. It can be used to compute the attenuation loss due to the atmosphere for desired link availability statistics. The quantitative results that can be obtained from this model provide good estimates for the atmospheric link budget necessary for the design of an optical communication system. The result is extended to provide for the computation of joint attenuation probability for n sites with uncorrelated weather patterns.
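
    The closing sentence concerns extending the model to the joint attenuation probability for n sites with uncorrelated weather patterns. A minimal sketch of that independence calculation is given below; the per-site outage probabilities are invented numbers, not values produced by the model.

```python
# Site-diversity sketch: for independent (uncorrelated) weather, the probability that
# every site is blocked simultaneously is the product of the individual outage probabilities.
def joint_outage(outage_probs):
    """Probability that all sites are unavailable at the same time (independent weather)."""
    p = 1.0
    for q in outage_probs:
        p *= q
    return p

site_outages = [0.30, 0.25, 0.40]  # hypothetical per-site probabilities the optical link is blocked
print("joint outage probability:", joint_outage(site_outages))
print("link availability with site diversity:", 1.0 - joint_outage(site_outages))
```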

  19. Preliminary System Design of the SWRL Financial System.

    ERIC Educational Resources Information Center

    Ikeda, Masumi

    The preliminary system design of the computer-based Southwest Regional Laboratory's (SWRL) Financial System is outlined. The system is designed to produce various management and accounting reports needed to maintain control of SWRL operational and financial activities. Included in the document are descriptions of the various types of system…

  20. The ASTRO-1 preliminary design review coupled load analysis

    NASA Technical Reports Server (NTRS)

    Mcghee, D. S.

    1984-01-01

    Results of the ASTRO-1 preliminary design review coupled loads analysis are presented. The M6.0Y Generic Shuttle mathematical models were used. Internal accelerations, interface forces, relative displacements, and net c.g. accelerations were recovered for two ASTRO-1 payloads in a tandem configuration. Twenty-seven load cases were computed and summarized. Load exceedances were found and recommendations made.

  1. Automated CPX support system preliminary design phase

    NASA Technical Reports Server (NTRS)

    Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.

    1984-01-01

    The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercise was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.

  2. Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D

    NASA Technical Reports Server (NTRS)

    Carle, Alan; Fagan, Mike; Green, Lawrence L.

    1998-01-01

    This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.
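
    The appeal of adjoint (reverse-mode) differentiation for problems with hundreds to thousands of design variables is that the full gradient of a scalar objective costs roughly one extra sweep, independent of the number of variables. The toy Python sketch below illustrates that idea on a stand-in objective; it is not CFL3D, the adjoint code generator, or the authors' augmented solver, and the function and variable names are assumptions.

```python
# Toy illustration of a hand-written adjoint (reverse-mode) derivative versus a finite difference.
import numpy as np

def objective(x):
    # stand-in "flow solver + objective": J = (sum of sin(x_i))^2
    s = np.sum(np.sin(x))
    return s * s

def objective_adjoint(x):
    # reverse sweep for the same computation: dJ/dx_i = 2 * s * cos(x_i)
    s = np.sum(np.sin(x))      # forward pass
    dJ_ds = 2.0 * s            # reverse pass through the squaring
    return dJ_ds * np.cos(x)   # reverse pass through the sum of sines

x = np.linspace(0.0, 1.0, 1000)   # "hundreds to thousands" of design variables
grad = objective_adjoint(x)        # one sweep yields the entire gradient

# spot-check one component against a central finite difference
eps = 1e-6
e0 = np.zeros_like(x); e0[0] = eps
print(grad[0], (objective(x + e0) - objective(x - e0)) / (2 * eps))
```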

  3. Environmental Survey preliminary report

    SciTech Connect

    Not Available

    1988-04-01

    This report presents the preliminary findings from the first phase of the Environmental Survey of the United States Department of Energy (DOE) Sandia National Laboratories conducted August 17 through September 4, 1987. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with Sandia National Laboratories-Albuquerque (SNLA). The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at SNLA, and interviews with site personnel. 85 refs., 49 figs., 48 tabs.

  4. Ruiz Volcano: Preliminary report

    NASA Astrophysics Data System (ADS)

    Ruiz Volcano, Colombia (4.88°N, 75.32°W). All times are local (= GMT -5 hours).An explosive eruption on November 13, 1985, melted ice and snow in the summit area, generating lahars that flowed tens of kilometers down flank river valleys, killing more than 20,000 people. This is history's fourth largest single-eruption death toll, behind only Tambora in 1815 (92,000), Krakatau in 1883 (36,000), and Mount Pelée in May 1902 (28,000). The following briefly summarizes the very preliminary and inevitably conflicting information that had been received by press time.

  5. [Use of Computers in Introductory Physics Teaching.

    ERIC Educational Resources Information Center

    Merrill, John R.

    This paper presents some of the preliminary results of Project COEXIST at Dartmouth College, an NSF sponsored project to investigate ways to use computers in introductory physics and mathematics teaching. Students use the computer in a number of ways on homework, on individual projects, and in the laboratory. Students write their own programs,…

  6. Optical coherence tomography: a non-invasive technique applied to conservation of paintings

    NASA Astrophysics Data System (ADS)

    Liang, Haida; Gomez Cid, Marta; Cucu, Radu; Dobre, George; Kudimov, Boris; Pedro, Justin; Saunders, David; Cupitt, John; Podoleanu, Adrian

    2005-06-01

    It is current practice to take tiny samples from a painting to mount and examine in cross-section under a microscope. However, since conservation practice and ethics limit sampling to a minimum and to areas along cracks and edges of paintings, which are often unrepresentative of the whole painting, results from such analyses cannot be taken as representative of a painting as a whole. Recently in a preliminary study, we have demonstrated that near-infrared Optical Coherence Tomography (OCT) can be used directly on paintings to examine the cross-section of paint and varnish layers without contact and the need to take samples. OCT is an optical interferometric technique developed for in vivo imaging of the eye and biological tissues; it is essentially a scanning Michelson's interferometer with a "broad-band" source that has the spatial coherence of a laser. The low temporal coherence and high spatial concentration of the source are the keys to high depth resolution and high sensitivity 3D imaging. The technique is non-invasive and non-contact with a typical working distance of 2 cm. This non-invasive technique enables cross-sections to be examined anywhere on a painting. In this paper, we will report new results on applying near-infrared en-face OCT to paintings conservation and extend the application to the examination of underdrawings, drying processes, and quantitative measurements of optical properties of paint and varnish layers.

  7. Preliminary ISIS users manual

    NASA Technical Reports Server (NTRS)

    Grantham, C.

    1979-01-01

    The Interactive Software Invocation System (ISIS), an interactive data management system, was developed to act as a buffer between the user and the host computer system. ISIS provides the user with a powerful system for developing software or systems in the interactive environment. ISIS protects the user from the idiosyncrasies of the host computer system by providing such a complete range of capabilities that the user should have no need for direct access to the host computer. These capabilities are divided into four areas: desk top calculator, data editor, file manager, and tool invoker.

  8. Applied potential tomography: a new non-invasive technique for assessing gastric function.

    PubMed

    Mangnall, Y F; Baxter, A J; Avill, R; Bird, N C; Brown, B H; Barber, D C; Seagar, A D; Johnson, A G; Read, N W

    1987-01-01

    Applied potential tomography is a new, non-invasive technique that yields sequential images of the resistivity of gastric contents after subjects have ingested a liquid or semi-solid meal. This study validates the technique as a means of measuring gastric emptying. Experiments in vitro showed an excellent correlation between measurements of resistivity and either the square of the radius of a glass rod or the volume of water in a spherical balloon when both were placed in an oval tank containing saline. Altering the lateral position of the rod in the tank did not alter the values obtained. Images of abdominal resistivity were also directly correlated with the volume of air in a gastric balloon. Profiles of gastric emptying of liquid meals obtained using APT were very similar to those obtained using scintigraphy or dye dilution techniques provided that acid secretion was inhibited by cimetidine. Profiles of emptying of a mashed potato meal using APT were also very similar to those obtained by scintigraphy. Measurements of the emptying of a liquid meal from the stomach were reproducible if acid secretion was inhibited by cimetidine. Thus, APT is an accurate and reproducible method of measuring gastric emptying of liquids and particulate food. It is inexpensive, well tolerated, easy to use and ideally suited for multiple studies in patients, even those who are pregnant. A preliminary study is also presented that assesses the technique as a means of measuring gastric acid secretion. Comparison of resistivity changes with measured acid secretion following the injection of pentagastrin shows good correlations. APT might offer a non-invasive alternative to the use of a nasogastric tube and acid collection.

  9. Computers and Computer Cultures.

    ERIC Educational Resources Information Center

    Papert, Seymour

    1981-01-01

    Instruction using computers is viewed as different from most other approaches to education, by allowing more than right or wrong answers, by providing models for systematic procedures, by shifting the boundary between formal and concrete processes, and by influencing the development of thinking in many new ways. (MP)

  10. 2-D Fused Image Reconstruction approach for Microwave Tomography: a theoretical assessment using FDTD Model.

    PubMed

    Bindu, G; Semenov, S

    2013-01-01

    This paper describes an efficient two-dimensional fused image reconstruction approach for Microwave Tomography (MWT). Finite Difference Time Domain (FDTD) models were created for a viable MWT experimental system having the transceivers modelled using thin wire approximation with resistive voltage sources. Born Iterative and Distorted Born Iterative methods have been employed for image reconstruction with the extremity imaging being done using a differential imaging technique. The forward solver in the imaging algorithm employs the FDTD method of solving the time domain Maxwell's equations with the regularisation parameter computed using a stochastic approach. The algorithm is tested with 10% noise inclusion and successful image reconstruction has been shown implying its robustness.
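
    As a rough illustration of the iterative, regularized inversion loop that Born-type reconstruction methods share, the Python sketch below runs a few regularized update steps on a toy linear forward model. The fixed random matrix stands in for the linearized FDTD forward operator, and the grid size, noise level, and regularization weight are invented; none of this is the authors' implementation.

```python
# Toy sketch of an iterative, Tikhonov-regularized reconstruction loop of the kind used in
# Born-type microwave tomography. The forward model is a fixed random matrix standing in
# for the linearized FDTD operator; it is illustrative only.
import numpy as np

rng = np.random.default_rng(0)

n_pixels = 64    # unknown contrast values (flattened toy grid)
n_meas = 40      # transceiver measurements

true_contrast = np.zeros(n_pixels)
true_contrast[20:28] = 1.0  # a small "inclusion"

A = rng.standard_normal((n_meas, n_pixels))          # toy linearized forward operator
data = A @ true_contrast
data += 0.10 * np.linalg.norm(data) / np.sqrt(n_meas) * rng.standard_normal(n_meas)  # ~10% noise

def tikhonov_solve(A, y, alpha):
    """Regularized least squares: argmin ||A x - y||^2 + alpha ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

contrast = np.zeros(n_pixels)
for it in range(10):
    residual = data - A @ contrast                     # in a real (distorted) Born solver the
    contrast += tikhonov_solve(A, residual, alpha=1.0) # fields and A would be recomputed here

print("relative error:", np.linalg.norm(contrast - true_contrast) / np.linalg.norm(true_contrast))
```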

  11. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
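
    As an elementary example of the digital-signal-processing building blocks surveyed in such a chapter, the Python sketch below renders a short tone by additive synthesis and writes it to a WAV file using only the standard library; the partial frequencies, amplitudes, and output filename are arbitrary choices for illustration.

```python
# Additive synthesis sketch: sum a few sinusoidal partials and write 16-bit PCM audio.
import math, struct, wave

RATE = 44100
DURATION = 1.0
partials = [(440.0, 1.0), (880.0, 0.5), (1320.0, 0.25)]  # (frequency in Hz, relative amplitude)

samples = []
for n in range(int(RATE * DURATION)):
    t = n / RATE
    samples.append(sum(a * math.sin(2 * math.pi * f * t) for f, a in partials))

peak = max(abs(s) for s in samples)
with wave.open("tone.wav", "w") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)      # 16-bit PCM
    wav.setframerate(RATE)
    frames = b"".join(struct.pack("<h", int(32767 * s / peak)) for s in samples)
    wav.writeframes(frames)
```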

  12. Computer Applications Planning. A Guide to Planning and Implementing a District-Wide Computer Program.

    ERIC Educational Resources Information Center

    Mojkowski, Charles

    Designed to help school districts move from exploring the use of computers in the classroom to the comprehensive planning and development of computer education programs, this guide is organized around five steps essential to the process of developing a district program. Phase 1 includes the following preliminary activities involved in planning for…

  13. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  14. Optical computer motherboards

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz P.; Xu, Guoda; Bartha, John M.; Gruntman, Michael A.

    1997-09-01

    In this paper, we investigate the application of precision plastic optics into a communication/computer sub-system, such as a hybrid computer motherboard. We believe that using optical waveguides for next-generation computer motherboards can provide a high performance alternative for present multi-layer printed circuit motherboards. In response to this demand, we suggest our novel concept of a hybrid motherboard based on an internal-fiber-coupling (IFC) wavelength-division-multiplexing (WDM) optical backplane. The IFC/WDM backplane provides dedicated Tx/Rx connections, and applies low-cost, high-performance components, including CD LDs, GRIN plastic fibers, molding housing, and nonimaging optics connectors. Preliminary motherboard parameters are: speed 100 MHz/100 m, or 1 GHz/10 m; fiber loss approximately 0.01 dB/m; almost zero fan-out/fan-in optical power loss, and eight standard wavelength channels. The proposed hybrid computer motherboard, based on innovative optical backplane technology, should solve low-speed, low-parallelism bottlenecks in present electric computer motherboards.

  15. Reproducibility of Tear Meniscus Measurement by Fourier-Domain Optical Coherence Tomography: A Pilot Study

    PubMed Central

    Zhou, Sheng; Li, Yan; Lu, Ake Tzu-Hui; Liu, Pengfei; Tang, Maolong; Yiu, Samuel C.; Huang, David

    2009-01-01

    BACKGROUND AND OBJECTIVE To study the reproducibility of tear meniscus measurement with high-speed high-resolution Fourier-domain optical coherence tomography (FD-OCT). PATIENTS AND METHODS Twenty normal participants were enrolled in this prospective study. The lower tear meniscus in the right eye of each subject was imaged by vertical scans centered on the inferior cornea and the lower eyelid using an FD-OCT system (RTVue; Optovue, Inc., Fremont, CA) with a corneal adaptor. The system performs 26,000 axial scans per second and has a 5-micron axial resolution. Each subject was examined at two visits 30 to 60 days apart. Each eye was scanned twice on each visit. The scans were taken 2 seconds after a blink. The lower meniscus height, depth, and cornea-meniscus angle were measured with a computer caliper. The cross-sectional area was calculated using a two-triangle approximation. RESULTS The between-visits coefficient of variation was 17.5%, 18.0%, 35.5%, and 12.2% for meniscus height, depth, area, and angle, respectively. The intraclass correlations for these parameters were 0.605, 0.558, 0.567, and 0.367, respectively. CONCLUSION FD-OCT measures lower tear meniscus dimensions and area with higher between-visits reproducibility than previous OCT instruments. FD-OCT may be a useful way to measure dry eye severity and treatment effectiveness. PMID:19772266
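
    A between-visits coefficient of variation of the kind reported above can be computed as the within-subject standard deviation of the repeated measurements divided by the subject mean, averaged across subjects. The Python sketch below shows one such calculation for two visits; the meniscus-height values are invented for illustration, and this is only one of several common CV definitions, not necessarily the one used in the study.

```python
# Between-visits coefficient of variation for two repeated measurements per subject.
import numpy as np

def between_visit_cv(visit1, visit2):
    v1, v2 = np.asarray(visit1, float), np.asarray(visit2, float)
    pair_mean = (v1 + v2) / 2.0
    within_sd = np.abs(v1 - v2) / np.sqrt(2.0)   # SD of two repeats = |difference| / sqrt(2)
    return np.mean(within_sd / pair_mean)

height_visit1 = [0.25, 0.30, 0.22, 0.28]   # hypothetical tear meniscus heights (mm)
height_visit2 = [0.29, 0.26, 0.24, 0.33]
print(f"between-visits CV: {100 * between_visit_cv(height_visit1, height_visit2):.1f}%")
```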

  16. Tomography: A window on the role of sulfur in the structure of micrometeorites

    NASA Astrophysics Data System (ADS)

    Taylor, Susan; Jones, Keith W.; Herzog, Gregory F.; Hornig, Claire E.

    2011-10-01

    To determine the role played by sulfides in the formation of vesicles and FeNi metal beads, we mapped the locations and tabulated the numbers of sulfides, metal beads, and vesicles in 1583 sectioned micrometeorites (MMs) using conventional microscopy and in 190 whole MMs using synchrotron computed microtomography (SCMT). Both the section and the SCMT images show that sulfides melt, coalesce, and migrate to the MMs surface. The decomposition of sulfides may occur during all these stages. Given the sulfide morphologies and compositions that we see in section, we think the breakdown of Ni sulfides produces the FeNi beads. The SCMT images show that metal beads are common in melted MMs, >50% have them. Vesicles in porphyritic and scoriaceous MMs are also probably formed as sulfides decompose. Not only do sulfides abut the vesicles but also the temperatures at which sulfides decompose overlap those at which MM surfaces first melt and temporarily seal, suggesting that S gases could produce most of these vesicles. As the vesicle shapes and patterns of distribution differ among MM classes, tomography can be used to nondestructively screen for specific types of MMs. Tomography is a powerful tool for visualizing the three-dimensional distribution of metal beads, sulfides, mean densities, and vesicles in MMs.

  17. Preliminary reference Earth model

    NASA Astrophysics Data System (ADS)

    Dziewonski, Adam M.; Anderson, Don L.

    1981-06-01

    A large data set consisting of about 1000 normal mode periods, 500 summary travel time observations, 100 normal mode Q values, mass and moment of inertia have been inverted to obtain the radial distribution of elastic properties, Q values and density in the Earth's interior. The data set was supplemented with a special study of 12 years of ISC phase data which yielded an additional 1.75 × 10^6 travel time observations for P and S waves. In order to obtain satisfactory agreement with the entire data set we were required to take into account anelastic dispersion. The introduction of transverse isotropy into the outer 220 km of the mantle was required in order to satisfy the shorter period fundamental toroidal and spheroidal modes. This anisotropy also improved the fit of the larger data set. The horizontal and vertical velocities in the upper mantle differ by 2-4%, both for P and S waves. The mantle below 220 km is not required to be anisotropic. Mantle Rayleigh waves are surprisingly sensitive to compressional velocity in the upper mantle. High Sn velocities, low Pn velocities and a pronounced low-velocity zone are features of most global inversion models that are suppressed when anisotropy is allowed for in the inversion. The Preliminary Reference Earth Model, PREM, and auxiliary tables showing fits to the data are presented.

  18. Preliminary Analysis of Photoreading

    NASA Technical Reports Server (NTRS)

    McNamara, Danielle S.

    2000-01-01

    The purpose of this project was to provide a preliminary analysis of a reading strategy called PhotoReading. PhotoReading is a technique developed by Paul Scheele that claims to increase reading rate to 25,000 words per minute (Scheele, 1993). PhotoReading itself involves entering a "relaxed state" and looking at, but not reading, each page of a text for a brief moment (about 1 to 2 seconds). While this technique has received attention in the popular press, there had been no objective examinations of the technique's validity. To examine the effectiveness of PhotoReading, the principal investigator (i.e., trainee) participated in a PhotoReading workshop to learn the technique. Parallel versions of two standardized and three experimenter-created reading comprehension tests were administered to the trainee and an expert user of the PhotoReading technique to compare the use of normal reading strategies and the PhotoReading technique by both readers. The results for all measures yielded no benefits of using the PhotoReading technique. The extremely rapid reading rates claimed by PhotoReaders were not observed; indeed, the reading rates were generally comparable to those for normal reading. Moreover, the PhotoReading expert generally showed an increase in reading time when using the PhotoReading technique in comparison to when using normal reading strategies to process text. This increase in reading time while PhotoReading was accompanied by a decrease in text comprehension.

  19. Enhanced preliminary assessment

    SciTech Connect

    Not Available

    1992-02-01

    An Enhanced Preliminary Assessment was conducted at Fort Benjamin Harrison (FBH) Indiana, which is located approximately 12 miles from downtown Indianapolis in Lawrence Township, Marion County. FBH contains 2,501 acres, of which approximately 1,069 acres is covered by woodlands. Activities at FBH include administration, training, housing, and support. Sensitive environments at FBH include wetlands, habitat areas for the endangered Indiana bat, endangered plants, and historically and archeologically significant areas. FBH is a U.S. Army Soldier Support Center under the jurisdiction of the U.S. Army Training and Doctrine Command (TRADOC). Based on information obtained during and subsequent to a site visit (15 through 18 October 1991), 36 types of Areas Requiring Environmental Evaluation (AREEs) were identified and grouped by the following categories: Facility Operations; Maintenance/Fueling Operations; Water Treatment Operations; Training Areas; Hazardous Materials Storage/Waste Handling Areas; Sanitary Wastewater Treatment Plants; Storage Tanks; Landfills/Incinerators; Medical Facilities; Burn Pit Areas; Spill Areas; Ammunition Storage; Coal Storage; and Facility-wide AREEs. This report presents a summary of findings for each AREE and recommendations for further action.

  20. Cooling Computers.

    ERIC Educational Resources Information Center

    Birken, Marvin N.

    1967-01-01

    Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economics, physical, and esthetic characteristics, and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…

  1. Pygmalion's Computer.

    ERIC Educational Resources Information Center

    Peelle, Howard A.

    Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…

  2. Stardust Interstellar Preliminary Examination (ISPE)

    NASA Astrophysics Data System (ADS)

    Westphal, A. J.; Allen, C.; Bajt, S.; Basset, R.; Bastien, R.; Bechtel, H.; Bleuet, P.; Borg, J.; Brenker, F.; Bridges, J.; Brownlee, D. E.; Burchell, M.; Burghammer, M.; Butterworth, A. L.; Cloetens, P.; Cody, G.; Ferroir, T.; Floss, C.; Flynn, G. J.; Frank, D.; Gainsforth, Z.; Grün, E.; Hoppe, P.; Kearsley, A.; Lemelle, L.; Leroux, H.; Lettieri, R.; Marchant, W.; Mendez, B.; Nittler, L. R.; Ogliore, R.; Postberg, F.; Sandford, S. A.; Schmitz, S.; Silversmit, G.; Simionovici, A.; Srama, R.; Stadermann, F. J.; Stephan, T.; Stroud, R. M.; Susini, J.; Sutton, S.; Trieloff, M.; Tsou, P.; Tsuchiyama, A.; Tyliczszak, T.; Vekemans, B.; Vincze, L.; Warren, J.; Zolensky, M. E.

    2009-03-01

    The Stardust Interstellar Preliminary Examination (ISPE) is a three-year effort to characterize the Stardust interstellar dust collection and collector using non-destructive techniques. We summarize the status of the ISPE.

  3. 32 CFR 651.49 - Preliminary phase.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 4 2011-07-01 2011-07-01 false Preliminary phase. 651.49 Section 651.49... Preliminary phase. In the preliminary phase, the proponent agency or office identifies, as early as possible... tentative list of the affected parties to be notified. A key part of this preliminary identification is...

  4. Computer Assisted Diagnosis of Chest Pain. Preliminary Manual

    DTIC Science & Technology

    1984-04-27

    Chest pain has been reported to be one of the most frequent causes of medical evacuation from submarines, and some of its causes have potentially fatal outcomes. The Naval Submarine Medical... serious causes of acute chest pain. The 5 illnesses which are considered by the computer are MYOCARDIAL INFARCTION, ANGINA, NON-SPECIFIC CHEST PAIN

  5. A Computer Program for the Preliminary Design of Contrarotating Propellers

    DTIC Science & Technology

    1975-12-01


  6. Synthesis, Preliminary Bioevaluation and Computational Analysis of Caffeic Acid Analogues

    PubMed Central

    Liu, Zhiqian; Fu, Jianjun; Shan, Lei; Sun, Qingyan; Zhang, Weidong

    2014-01-01

    A series of caffeic acid amides were designed, synthesized and evaluated for anti-inflammatory activity. Most of them exhibited promising anti-inflammatory activity against nitric oxide (NO) generation in murine macrophage RAW264.7 cells. A 3D pharmacophore model was created based on the biological results for further structural optimization. Moreover, prediction of the potential targets was also carried out by the PharmMapper server. These amide analogues represent a promising class of anti-inflammatory scaffold for further exploration and target identification. PMID:24857914

  7. Preliminary heavy-light decay constants from the MILC Collaboration

    SciTech Connect

    Bernard, C.

    1994-12-01

    Preliminary results from the MILC Collaboration for f_B, f_Bs, f_D, f_Ds and their ratios are presented. We compute in the quenched approximation at β = 6.3, 6.0 and 5.7 with Wilson light quarks and static and Wilson heavy quarks. We attempt to quantify all systematic errors other than quenching.

  8. Bacterial Identification Using Light Scattering Measurements: a Preliminary Report

    NASA Technical Reports Server (NTRS)

    Wilkins, J. R.

    1971-01-01

    The light scattering properties of single bacterial cells were examined as a possible means of identification. Three species were studied, with Streptococcus faecalis exhibiting a unique pattern; the light-scattering traces for Staphylococcus aureus and Escherichia coli were quite similar although differences existed. Based on preliminary investigations, the light scattering approach appeared promising, with additional research needed to include a wide variety of bacterial species, computer capability to handle and analyze data, and expansion of light scattering theory to include bacterial cells.

  9. Computational dosimetry

    SciTech Connect

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  10. Computational Toxicology

    EPA Science Inventory

    'Computational toxicology' is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...

  11. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  12. Parallel Computing in SCALE

    SciTech Connect

    DeHart, Mark D; Williams, Mark L; Bowman, Stephen M

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  13. Computer Starters!

    ERIC Educational Resources Information Center

    Instructor, 1983

    1983-01-01

    Instructor's Computer-Using Teachers Board members give practical tips on how to get a classroom ready for a new computer, introduce students to the machine, and help them learn about programing and computer literacy. Safety, scheduling, and supervision requirements are noted. (PP)

  14. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  15. Distributed Computing.

    ERIC Educational Resources Information Center

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  16. Portable Computer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    SPOC, a navigation monitoring computer used by NASA in a 1983 mission, was a modification of a commercial computer called GRiD Compass, produced by GRiD Systems Corporation. SPOC was chosen because of its small size, large storage capacity, and high processing speed. The principal modification required was a fan to cool the computer. SPOC automatically computes position, orbital paths, communication locations, etc. Some of the modifications were adapted for commercial applications. The computer is presently used in offices for conferences, for on-site development, and by the army as part of a field communications system.

  17. Stardust Interstellar Preliminary Examination

    NASA Astrophysics Data System (ADS)

    Westphal, A.; Stardust Interstellar Preliminary Examination Team: http://www.ssl.berkeley.edu/~westphal/ISPE/

    2011-12-01

    A. J. Westphal, C. Allen, A. Ansari, S. Bajt, R. S. Bastien, H. A. Bechtel, J. Borg, F. E. Brenker, J. Bridges, D. E. Brownlee, M. Burchell, M. Burghammer, A. L. Butterworth, A. M. Davis, P. Cloetens, C. Floss, G. Flynn, D. Frank, Z. Gainsforth, E. Grün, P. R. Heck, J. K. Hillier, P. Hoppe, G. Huss, J. Huth, B. Hvide, A. Kearsley, A. J. King, B. Lai, J. Leitner, L. Lemelle, H. Leroux, R. Lettieri, W. Marchant, L. R. Nittler, R. Ogliore, F. Postberg, M. C. Price, S. A. Sandford, J.-A. Sans Tresseras, T. Schoonjans, S. Schmitz, G. Silversmit, A. Simionovici, V. A. Solé, R. Srama, T. Stephan, V. Sterken, J. Stodolna, R. M. Stroud, S. Sutton, M. Trieloff, P. Tsou, A. Tsuchiyama, T. Tyliszczak, B. Vekemans, L. Vincze, D. Zevin, M. E. Zolensky, >29,000 Stardust@home dusters. ISPE author affiliations are at http://www.ssl.berkeley.edu/~westphal/ISPE/. In 2000 and 2002, a ~0.1 m2 array of aerogel tiles and aluminum foils onboard the Stardust spacecraft was exposed to the interstellar dust (ISD) stream for an integrated time of 200 days. The exposure took place in interplanetary space, beyond the orbit of Mars, and thus was free of the ubiquitous orbital debris in low-earth orbit that precludes effective searches for interstellar dust there. Despite the long exposure of the Stardust collector, <<100 ISD particles are expected to have been captured. The particles are thought to be ~1μm or less in size, and the total ISD collection is probably <10^-6 by mass of the collection of cometary dust particles captured in the Stardust cometary dust collector from the coma of the Jupiter-family comet Wild 2. Thus, although the first solid sample from the local interstellar medium is clearly of high interest, the diminutive size of the particles and the low numbers of particles present daunting challenges. Nevertheless, six recent developments have made a Preliminary Examination (PE) of this sample practical: (1) rapid automated digital optical scanning microscopy for three

  18. PRELIMINARY DESIGN ANALYSIS OF AXIAL FLOW TURBINES

    NASA Technical Reports Server (NTRS)

    Glassman, A. J.

    1994-01-01

    A computer program has been developed for the preliminary design analysis of axial-flow turbines. Rapid approximate generalized procedures requiring minimum input are used to provide turbine overall geometry and performance adequate for screening studies. The computations are based on mean-diameter flow properties and a stage-average velocity diagram. Gas properties are assumed constant throughout the turbine. For any given turbine, all stages, except the first, are specified to have the same shape velocity diagram. The first stage differs only in the value of inlet flow angle. The velocity diagram shape depends upon the stage work factor value and the specified type of velocity diagram. Velocity diagrams can be specified as symmetrical, zero exit swirl, or impulse; or by inputting stage swirl split. Exit turning vanes can be included in the design. The 1991 update includes a generalized velocity diagram, a more flexible meanline path, a reheat model, a radial component of velocity, and a computation of free-vortex hub and tip velocity diagrams. Also, a loss-coefficient calibration was performed to provide recommended values for airbreathing engine turbines. Input design requirements include power or pressure ratio, mass flow rate, inlet temperature and pressure, and rotative speed. The design variables include inlet and exit diameters, stator angle or exit radius ratio, and number of stages. Gas properties are input as gas constant, specific heat ratio, and viscosity. The program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, flow angles, blading angles, and last stage absolute and relative Mach numbers. This program is written in FORTRAN 77 and can be ported to any computer with a standard FORTRAN compiler which supports NAMELIST. It was originally developed on an IBM 7000 series computer running VM and has been implemented on IBM PC computers and compatibles running MS-DOS under Lahey FORTRAN, and
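
    The mean-diameter, stage-average approach described above rests on a small set of velocity-diagram relations, most centrally Euler's turbine equation and the stage work factor. The Python sketch below evaluates those two relations for an illustrative mean-line diagram; the blade speed, swirl velocities, and diameter are invented numbers, not defaults or outputs of the program.

```python
# Mean-line sketch: Euler's turbine equation and the stage work factor (loading coefficient).
import math

def stage_specific_work(U, c_theta_in, c_theta_out):
    """Euler work: w = U * (c_theta_in - c_theta_out), with mean-diameter swirl velocities [m/s]."""
    return U * (c_theta_in - c_theta_out)

def work_factor(specific_work, U):
    """Stage work factor: psi = w / U^2."""
    return specific_work / U**2

mean_diameter = 0.60                                # m (illustrative)
speed_rpm = 12000.0
U = math.pi * mean_diameter * speed_rpm / 60.0      # mean blade speed

w = stage_specific_work(U, c_theta_in=520.0, c_theta_out=-40.0)  # illustrative swirl split
print(f"U = {U:.0f} m/s, specific work = {w/1000:.0f} kJ/kg, work factor = {work_factor(w, U):.2f}")
```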

  19. Computer Jet-Engine-Monitoring System

    NASA Technical Reports Server (NTRS)

    Disbrow, James D.; Duke, Eugene L.; Ray, Ronald J.

    1992-01-01

    "Intelligent Computer Assistant for Engine Monitoring" (ICAEM), computer-based monitoring system intended to distill and display data on conditions of operation of two turbofan engines of F-18, is in preliminary state of development. System reduces burden on propulsion engineer by providing single display of summary information on statuses of engines and alerting engineer to anomalous conditions. Effective use of prior engine-monitoring system requires continuous attention to multiple displays.

  20. Assessment of systolic thickening with thallium-201 ECG-gated single-photon emission computed tomography: A parameter for local left ventricular function

    SciTech Connect

    Mochizuki, T.; Murase, K.; Fujiwara, Y.; Tanada, S.; Hamamoto, K.; Tauxe, W.N. )

    1991-08-01

    The authors measured left ventricular (LV) systolic thickening expressed as a systolic thickening ratio in 28 patients, using 201Tl ECG-gated SPECT. Five normals, 15 patients with prior myocardial infarction, 5 with hypertrophic cardiomyopathy, and 3 with dilated cardiomyopathy were studied. The systolic thickening ratio was calculated as ((end-systolic minus end-diastolic pixel counts) divided by end-diastolic pixel counts), using the circumferential profile technique of both end-diastolic and end-systolic short axial images. Functional images of the systolic thickening ratio were also displayed with the bull's-eye method. The mean systolic thickening ratios thus calculated were as follows: normals, 0.53 ± 0.05 (mean ± 1 s.d.); non-transmural prior myocardial infarction, 0.33 ± 0.09; transmural prior myocardial infarction, 0.14 ± 0.05; hypertrophic cardiomyopathy in relatively nonhypertrophied areas, 0.56 ± 0.11; hypertrophic cardiomyopathy in hypertrophied areas, 0.23 ± 0.07; and dilated cardiomyopathy, 0.19 ± 0.02. The systolic thickening ratio analysis by gated thallium SPECT offers a unique approach for assessing LV function.
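
    The systolic thickening ratio defined above is a simple per-segment count ratio. A minimal Python sketch of that calculation is shown below; the segmental count values are invented for illustration.

```python
# Systolic thickening ratio per segment: (end-systolic counts - end-diastolic counts) / end-diastolic counts.
import numpy as np

def systolic_thickening_ratio(ed_counts, es_counts):
    ed = np.asarray(ed_counts, dtype=float)
    es = np.asarray(es_counts, dtype=float)
    return (es - ed) / ed

end_diastolic = [100, 95, 110, 105]   # hypothetical segmental counts from the circumferential profile
end_systolic  = [152, 140, 168, 120]
print(systolic_thickening_ratio(end_diastolic, end_systolic))
```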

  1. Carbon dioxide reactivity of tumor blood flow as measured by dynamic contrast-enhanced computed tomography: a new treatment protocol for laser thermal therapy

    NASA Astrophysics Data System (ADS)

    Purdie, Thomas G.; Sherar, Michael D.; Fenster, Aaron; Lee, Ting-Yim

    2001-05-01

    The purpose of the current study is to measure the carbon dioxide reactivity of blood flow in VX2 tumor in the rabbit thigh. The carbon dioxide reactivity of the functional parameters was investigated in eight rabbits by changing the ventilation rate in order to manipulate the arterial carbon dioxide tension (PaCO2). In each experiment, functional maps were generated at four PaCO2 levels: normocapnia (PaCO2 = 40.7 ± 1.4 mm Hg), hypocapnia (27.1 ± 2.5 and 33.7 ± 2.2 mm Hg) and hypercapnia (53.8 ± 5.2 mm Hg). The carbon dioxide reactivity of tumor blood flow showed significant differences between normocapnia and the two levels of hypocapnia, but not between normocapnia and hypercapnia. The average fractional changes of blood flow from normocapnia for the two hypocapnic levels were -0.41 ± 0.06 and -0.29 ± 0.08, respectively. The ability to reduce blood flow through hypocapnia has significant implications in thermal therapy, as heat dissipation represents a major obstacle which limits the effectiveness of treatment.
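
    The fractional change of blood flow from normocapnia quoted above is simply the test-condition flow minus the normocapnic flow, divided by the normocapnic flow. The short Python sketch below evaluates it; the flow values are invented for illustration, not measurements from the study.

```python
# Fractional change of blood flow relative to normocapnia.
def fractional_change(flow_normocapnia, flow_test):
    return (flow_test - flow_normocapnia) / flow_normocapnia

flow_normo = 45.0   # hypothetical tumor blood flow at normocapnia (ml/min/100 g)
flow_hypo = 26.5    # hypothetical flow at a hypocapnic level
print(f"fractional change: {fractional_change(flow_normo, flow_hypo):+.2f}")
```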

  2. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  3. Computer Programming by Kindergarten Children Using LOGO.

    ERIC Educational Resources Information Center

    Munro-Mavrias, Sandra

    Conservation ability, spatial motor ability, age, and gender were used as predictive variables in a study of 26 kindergarten children's computer programming ability. A preliminary pilot study with first graders had suggested that programming success was related to the ability to reverse thought processes. In both studies, children were taught to…

  4. Synframe: a preliminary report.

    PubMed

    Aebi, M; Steffen, T

    2000-02-01

    Both endoscopic lumbar spinal surgery and the non-standardized and unstable retractor systems for the lumbar spine presently on the market have disadvantages and limitations in relation to the minimally invasive surgical concept, which have been gradually recognized in the last few years. In an attempt to resolve some of these issues, we have developed a highly versatile retractor system, which allows access to and surgery at the lumbar, thoracic and even cervical spine. This retractor system - Synframe - is based on a ring concept allowing 360 degrees access to a surgical opening in anterior as well as posterior surgery. The ring is concentrically laid over the surgical opening for the approach and is used as a carrier for retractor arms, which are instrumented with either different sizes or types of blades and/or different sizes of Hohmann hooks. In posterior surgery, nerve root retractors can also be installed. This ring also functions as a carrier for fiberoptic illumination devices and different sizes of endoscopes, used to transmit the surgical procedure out of the depth of the surgical exposure for both teaching purposes and for the surgical team when it has no longer direct visual access to the procedure. The ring is stable, being fixed onto the operating table, allowing precise minimally open approaches and surgical procedures under direct vision with optimal illumination. This ring system also opens perspectives for an integrated minimally open surgical concept, where the ring may be used as a reference platform in computer-navigated surgery.

  5. Heterogeneous concurrent computing with exportable services

    NASA Technical Reports Server (NTRS)

    Sunderam, Vaidy

    1995-01-01

    Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experiences have demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.
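
    As a loose, language-level illustration of the thread-based, data-driven service idea described above (not TPVM or PVM itself), the Python sketch below registers a few callable "services" and dispatches requests to them on a lightweight thread pool from the standard library; the service names and request list are assumptions made up for the example.

```python
# Loose illustration of thread-based service dispatch using the standard library only.
from concurrent.futures import ThreadPoolExecutor

# "Exported" services: named callables that clients could invoke by name.
SERVICES = {
    "square": lambda x: x * x,
    "negate": lambda x: -x,
}

def invoke(service_name, argument):
    """Look up a service by name and run it (stand-in for a remote invocation)."""
    return SERVICES[service_name](argument)

requests = [("square", 3), ("negate", 7), ("square", 12)]

# Data-driven dispatch: each request is handed to a worker thread as it arrives.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(invoke, name, arg) for name, arg in requests]
    print([f.result() for f in futures])
```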

  6. Concentrating solar collector subsystem: Preliminary design package

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Preliminary design data are presented for a concentrating solar collector including an attitude controller. Provided are schedules, technical status, all documents required for preliminary design, and other program activities.

  7. Measuring Organizational Learning: A Preliminary Progress Report

    DTIC Science & Technology

    2010-08-01

    Contractor Report 2010-01: Measuring Organizational Learning: A Preliminary Progress Report, by Chris Winkler and Charles T... (Contract or Grant Number W91WAW-07-C-0074).

  8. 18 CFR 806.11 - Preliminary consultations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... COMMISSION REVIEW AND APPROVAL OF PROJECTS Application Procedure § 806.11 Preliminary consultations. (a) Any... Commission staff for an informal discussion of preliminary plans for the proposed project. To facilitate preliminary consultations, it is suggested that the project sponsor provide a general description of...

  9. 18 CFR 806.11 - Preliminary consultations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... COMMISSION REVIEW AND APPROVAL OF PROJECTS Application Procedure § 806.11 Preliminary consultations. (a) Any... Commission staff for an informal discussion of preliminary plans for the proposed project. To facilitate preliminary consultations, it is suggested that the project sponsor provide a general description of...

  10. Communication and Computability: The Case of Alan Mathison Turing.

    ERIC Educational Resources Information Center

    Chesebro, James W.

    1993-01-01

    Provides a preliminary examination of the relationships which exist between the disciplines of communication and computer science. Isolates the original principles which determined the development of computer science. Suggests the influence these early formation principles had and continue to have on the study of communication. Focuses on the seminal role…

  11. Preliminary microfluidic simulations for immersion lithography

    NASA Astrophysics Data System (ADS)

    Wei, Alexander C.; Nellis, Greg F.; Abdo, Amr Y.; Engelstad, Roxann L.; Chen, Cheng-Fu; Switkes, Michael; Rothschild, Mordechai

    2003-06-01

    The premise behind immersion lithography is to improve the resolution for optical lithography technology by increasing the index of refraction in the space between the final projection lens of an exposure system and the device wafer. This is accomplished through the insertion of a high index liquid in place of the low index air that currently fills the gap. The fluid management system must reliably fill the lens-wafer gap with liquid, maintain the fill under the lens throughout the entire wafer exposure process, and ensure that no bubbles are entrained during filling or scanning. This paper presents a preliminary analysis of the fluid flow characteristics of a liquid between the lens and the wafer in immersion lithography. The objective of this feasibility study was to identify liquid candidates that meet both optical and specific fluid mechanical requirements. The mechanics of the filling process was analyzed to simplify the problem and identify those fluid properties and system parameters that affect the process. Two-dimensional computational fluid dynamics (CFD) models of the fluid between the lens and the wafer were developed for simulating the process. The CFD simulations were used to investigate two methods of liquid deposition. In the first, a liquid is dispensed onto the wafer as a "puddle" and then the wafer and liquid move under the lens. This is referred to as passive filling. The second method involves the use of liquid jets in close proximity to the edge of the lens and is referred to as active filling. Numerical simulations of passive filling included a parametric study of the key dimensionless group influencing the filling process and an investigation of the effects of the fluid/wafer and fluid/lens contact angles and wafer direction. The model results are compared with experimental measurements. For active filling, preliminary simulation results characterized the influence of the jets on fluid flow.

  12. Developing Computation

    ERIC Educational Resources Information Center

    McIntosh, Alistair

    2004-01-01

    In this article, the author presents the results of a state project that focused on the effect of developing informal written computation processes through Years 2-4. The "developing computation" project was conducted in Tasmania over the two years 2002-2003 and involved nine schools: five government schools, two Catholic schools, and…

  13. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  14. Computer News

    ERIC Educational Resources Information Center

    Science Activities: Classroom Projects and Curriculum Ideas, 2007

    2007-01-01

    This article presents several news stories about computers and technology. (1) Applied Science Associates of Narragansett, Rhode Island is providing computer modeling technology to help locate the remains to the USS Bonhomme Richard, which sank in 1779 after claiming a Revolutionary War victory. (2) Whyville, the leading edu-tainment virtual world…

  15. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  16. Computational Pathology

    PubMed Central

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  17. Computer Insecurity.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1994-01-01

    College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)

  18. Computational astrophysics

    NASA Technical Reports Server (NTRS)

    Miller, Richard H.

    1987-01-01

    Astronomy is an area of applied physics in which unusually beautiful objects challenge the imagination to explain observed phenomena in terms of known laws of physics. It is a field that has stimulated the development of physical laws and of mathematical and computational methods. Current computational applications are discussed in terms of stellar and galactic evolution, galactic dynamics, and particle motions.

  19. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Discussed are three examples of computer graphics including biomorphs, Truchet tilings, and fractal popcorn. The graphics are shown and the basic algorithm using multiple iteration of a particular function or mathematical operation is described. An illustration of a snail shell created by computer graphics is presented. (YP)

  20. Computer Graphics.

    ERIC Educational Resources Information Center

    Halpern, Jeanne W.

    1970-01-01

    Computer graphics have been called the most exciting development in computer technology. At the University of Michigan, three kinds of graphics output equipment are now being used: symbolic printers, line plotters or drafting devices, and cathode-ray tubes (CRT). Six examples are given that demonstrate the range of graphics use at the University.…

  1. I, Computer

    ERIC Educational Resources Information Center

    Barack, Lauren

    2005-01-01

    What child hasn't chatted with friends through a computer? But chatting with a computer? Some Danish scientists have literally put a face on their latest software program, bringing to virtual life storyteller Hans Christian Andersen, who engages users in actual conversations. The digitized Andersen resides at the Hans Christian Andersen Museum in…

  2. Preliminary design studies for the DESCARTES and CIDER codes

    SciTech Connect

    Eslinger, P.W.; Miley, T.B.; Ouderkirk, S.J.; Nichols, W.E.

    1992-12-01

    The Hanford Environmental Dose Reconstruction (HEDR) project is developing several computer codes to model the release and transport of radionuclides into the environment. This preliminary design addresses two of these codes: Dynamic Estimates of Concentrations and Radionuclides in Terrestrial Environments (DESCARTES) and Calculation of Individual Doses from Environmental Radionuclides (CIDER). The DESCARTES code will be used to estimate the concentration of radionuclides in environmental pathways, given the output of the air transport code HATCHET. The CIDER code will use information provided by DESCARTES to estimate the dose received by an individual. This document reports on preliminary design work performed by the code development team to determine if the requirements could be met for DESCARTES and CIDER. The document contains three major sections: (i) a data flow diagram and discussion for DESCARTES, (ii) a data flow diagram and discussion for CIDER, and (iii) a series of brief statements regarding the design approach required to address each code requirement.
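
    The two-stage structure described above (environmental concentrations feeding an individual-dose calculation) can be illustrated with a minimal Python sketch; the pathway, transfer factor, intake, and dose coefficient below are hypothetical placeholders, not values from the HEDR codes.

      # Illustrative two-stage pipeline in the spirit of DESCARTES -> CIDER.
      # All numbers are hypothetical placeholders, not HEDR project data.

      def pathway_concentration(deposition_bq_m2, transfer_factor):
          """DESCARTES-like step: deposition (Bq/m^2) times a soil-to-food
          transfer factor gives a foodstuff concentration (Bq/kg)."""
          return deposition_bq_m2 * transfer_factor

      def individual_dose_sv(concentration_bq_kg, intake_kg, dose_coeff_sv_per_bq):
          """CIDER-like step: dose = concentration * annual intake * dose coefficient."""
          return concentration_bq_kg * intake_kg * dose_coeff_sv_per_bq

      conc = pathway_concentration(deposition_bq_m2=1.0e3, transfer_factor=0.1)
      dose = individual_dose_sv(conc, intake_kg=50.0, dose_coeff_sv_per_bq=1.0e-8)
      print(f"hypothetical annual dose: {dose:.2e} Sv")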

  3. Intelligent redundant actuation system requirements and preliminary system design

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Geiger, L. J.; Harris, J.

    1985-01-01

    Several redundant actuation system configurations were designed and demonstrated to satisfy the stringent operational requirements of advanced flight control systems. However, this has been accomplished largely through brute force hardware redundancy, resulting in significantly increased computational requirements on the flight control computers which perform the failure analysis and reconfiguration management. Modern technology now provides powerful, low-cost microprocessors which are effective in performing failure isolation and configuration management at the local actuator level. One such concept, called an Intelligent Redundant Actuation System (IRAS), significantly reduces the flight control computer requirements and performs the local tasks more comprehensively than previously feasible. The requirements and preliminary design of an experimental laboratory system capable of demonstrating the concept and sufficiently flexible to explore a variety of configurations are discussed.

  4. Mobile Computing for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Swietek, Gregory E. (Technical Monitor)

    1994-01-01

    The use of commercial computer technology in specific aerospace mission applications can reduce the cost and project cycle time required for the development of special-purpose computer systems. Additionally, the pace of technological innovation in the commercial market has made new computer capabilities available for demonstrations and flight tests. Three areas of research and development being explored by the Portable Computer Technology Project at NASA Ames Research Center are the application of commercial client/server network computing solutions to crew support and payload operations, the analysis of requirements for portable computing devices, and testing of wireless data communication links as extensions to the wired network. This paper will present computer architectural solutions to portable workstation design including the use of standard interfaces, advanced flat-panel displays and network configurations incorporating both wired and wireless transmission media. It will describe the design tradeoffs used in selecting high-performance processors and memories, interfaces for communication and peripheral control, and high resolution displays. The packaging issues for safe and reliable operation aboard spacecraft and aircraft are presented. The current status of wireless data links for portable computers is discussed from a system design perspective. An end-to-end data flow model for payload science operations from the experiment flight rack to the principal investigator is analyzed using capabilities provided by the new generation of computer products. A future flight experiment on-board the Russian MIR space station will be described in detail including system configuration and function, the characteristics of the spacecraft operating environment, the flight qualification measures needed for safety review, and the specifications of the computing devices to be used in the experiment. The software architecture chosen shall be presented. An analysis of the

  5. Radioactive waste shredding: Preliminary evaluation

    SciTech Connect

    Soelberg, N.R.; Reimann, G.A.

    1994-07-01

    The critical constraints for sizing solid radioactive and mixed wastes for subsequent thermal treatment were identified via a literature review and a survey of shredding equipment vendors. The types and amounts of DOE radioactive wastes that will require treatment to reduce the waste volume, destroy hazardous organics, or immobilize radionuclides and/or hazardous metals were considered. The preliminary steps of waste receipt, inspection, and separation were included because many potential waste treatment technologies have limits on feedstream chemical content, physical composition, and particle size. Most treatment processes and shredding operations require at least some degree of feed material characterization. Preliminary cost estimates show that pretreatment costs per unit of waste can be high and can vary significantly, depending on the processing rate and desired output particle size.

  6. Lunar Excavator Preliminary Test Video

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This video shows a preliminary test of the boom and bucket wheel assembly of the lunar excavator prototype developed by the Center for Commercial Applications of Combustion at the Colorado School of Mines in Golden. According to Michael Duke, director for the center, the wheel on the end of the boom can dig up 45.36 kilograms (100 pounds) of dirt each hour which is several times the weight of the entire device.

  7. The MUNU experiment : preliminary results

    NASA Astrophysics Data System (ADS)

    Busto, J.; MUNU Collaboration

    2000-06-01

    The MUNU collaboration has built a detector to study ν̄e-e scattering at low energy. From the results we expect to increase the sensitivity to the neutrino magnetic moment. The detector used, a 1 m³ T.P.C. surrounded by an anti-Compton scintillator, is running at the Bugey nuclear plant. Some preliminary results will be presented in the following.

  8. Cloud Computing

    DTIC Science & Technology

    2009-11-12

    Eucalyptus Systems provides an open-source application that can be used to implement a cloud computing environment on a datacenter. Summary: Cloud computing is in essence an economic model; it is a different way to acquire and manage IT resources. There are multiple cloud providers, including Amazon Elastic Compute Cloud (EC2): http://aws.amazon.com/ec2/; Amazon Simple Storage Solution (S3): http://aws.amazon.com/s3/; and Eucalyptus.

  9. Optical computing.

    NASA Technical Reports Server (NTRS)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
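
    As a digital counterpart to the optical image deblurring mentioned above, the sketch below shows a generic frequency-domain Wiener-style deconvolution; the centered point-spread function and the fixed noise-to-signal constant are assumptions for illustration, not the coherent optical processing described in the record.

      import numpy as np

      def wiener_deblur(blurred, psf, noise_to_signal=1e-2):
          """Frequency-domain Wiener deconvolution of a blurred image.

          blurred and psf are 2-D arrays of the same shape, with the psf
          centered in its array; noise_to_signal is an assumed constant
          noise-to-signal power ratio used to regularize the inverse filter.
          """
          H = np.fft.fft2(np.fft.ifftshift(psf))
          B = np.fft.fft2(blurred)
          W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)  # Wiener filter
          return np.real(np.fft.ifft2(W * B))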

  10. Computational gestalts and perception thresholds.

    PubMed

    Desolneux, Agnès; Moisan, Lionel; Morel, Jean-Michel

    2003-01-01

    In 1923, Max Wertheimer proposed a research programme and method in visual perception. He conjectured the existence of a small set of geometric grouping laws governing the perceptual synthesis of phenomenal objects, or "gestalt" from the atomic retina input. In this paper, we review this set of geometric grouping laws, using the works of Metzger, Kanizsa and their schools. In continuation, we explain why the Gestalt theory research programme can be translated into a Computer Vision programme. This translation is not straightforward, since Gestalt theory never addressed two fundamental matters: image sampling and image information measurements. Using these advances, we shall show that gestalt grouping laws can be translated into quantitative laws allowing the automatic computation of gestalts in digital images. From the psychophysical viewpoint, a main issue is raised: the computer vision gestalt detection methods deliver predictable perception thresholds. Thus, we are set in a position where we can build artificial images and check whether some kind of agreement can be found between the computationally predicted thresholds and the psychophysical ones. We describe and discuss two preliminary sets of experiments, where we compared the gestalt detection performance of several subjects with the predictable detection curve. In our opinion, the results of this experimental comparison support the idea of a much more systematic interaction between computational predictions in Computer Vision and psychophysical experiments.
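
    The thresholding idea behind computational gestalts can be sketched with the number-of-false-alarms (NFA) computation used in the a contrario framework of Desolneux, Moisan and Morel; the alignment-detection setting below (an N x N image, l samples along a candidate segment, k agreeing gradient directions, angular precision p) follows that framework in spirit, and the example numbers are purely illustrative.

      from scipy.stats import binom

      def nfa_alignment(N, l, k, p):
          """Number of false alarms for an alignment event, a contrario style.

          N: image side in pixels (N**4 approximates the number of tested segments),
          l: points sampled along the candidate segment,
          k: points whose gradient direction agrees with the segment,
          p: probability of such agreement under the background noise model.
          """
          tail = binom.sf(k - 1, l, p)  # P[X >= k] for X ~ Binomial(l, p)
          return (N ** 4) * tail

      def is_meaningful(N, l, k, p, eps=1.0):
          """An event is eps-meaningful when its expected number of false alarms <= eps."""
          return nfa_alignment(N, l, k, p) <= eps

      # Example: 512 x 512 image, 50 samples, 35 agreeing directions at precision 1/16.
      print(is_meaningful(N=512, l=50, k=35, p=1.0 / 16.0))  # True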

  11. Computer Stimulation

    ERIC Educational Resources Information Center

    Moore, John W.; Moore, Elizabeth

    1977-01-01

    Discusses computer simulation approach of Limits to Growth, in which interactions of five variables (population, pollution, resources, food per capita, and industrial output per capita) indicate status of the world. Reviews other books that predict future of the world. (CS)

  12. Computer Poker

    ERIC Educational Resources Information Center

    Findler, Nicholas V.

    1978-01-01

    This familiar card game has interested mathematicians, economists, and psychologists as a model of decision-making in the real world. It is now serving as a vehicle for investigations in computer science. (Author/MA)

  13. Evolutionary Computing

    SciTech Connect

    Patton, Robert M; Cui, Xiaohui; Jiao, Yu; Potok, Thomas E

    2008-01-01

    The rate at which information overwhelms humans is significantly more than the rate at which humans have learned to process, analyze, and leverage this information. To overcome this challenge, new methods of computing must be formulated, and scientists and engineers have looked to nature for inspiration in developing these new methods. Consequently, evolutionary computing has emerged as a new paradigm for computing, and has rapidly demonstrated its ability to solve real-world problems where traditional techniques have failed. This field of work has now become quite broad and encompasses areas ranging from artificial life to neural networks. This chapter focuses specifically on two sub-areas of nature-inspired computing: Evolutionary Algorithms and Swarm Intelligence.

  14. Computer Calculus.

    ERIC Educational Resources Information Center

    Steen, Lynn Arthur

    1981-01-01

    The development of symbolic computer algebra designed to manipulate abstract mathematical expressions is discussed. The ability of this software to mimic the standard patterns of human problem solving represents a major advance toward "true" artificial intelligence. (MP)

  15. Personal Computers.

    ERIC Educational Resources Information Center

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  16. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  17. Quantum Computing

    DTIC Science & Technology

    1998-04-01

    Current information representation and processing technology, although faster than the wheels and gears of the Charles Babbage computation machine, is still in ... the same computational complexity class as the Babbage machine, with bits of information represented by entities which obey classical (non-quantum ... nuclear double resonances. Charles M. Bowden and Jonathan P. Dowling, Weapons Sciences Directorate, AMSMI-RD-WS-ST, Missile Research, Development, and ...

  18. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  19. Orbit Determination with Very Short Arcs: Preliminary Orbits and Identifications

    NASA Astrophysics Data System (ADS)

    Milani, A.; Gronchi, G. F.; Knezevic, Z.; Sansaturio, M. E.

    2004-05-01

    When the observations of a new asteroid are not enough to compute an orbit, we can represent them with an attributable (two angles and their time derivatives). The undetermined range and range rate span an admissible region of solar system orbits, which can be represented by a set of Virtual Asteroids (VAs) selected by an optimal triangulation (see the presentation by G. Gronchi). The four coordinates of the attributable are the result of a fit and have a covariance matrix. Thus the predictions of future observations have a quasi-product structure (admissible region times confidence ellipsoid), approximated by a triangulation with a confidence ellipsoid for each node. If we have more than two observations we can also estimate the geodetic curvature and the acceleration of the observed path on the celestial sphere. If both are significantly measured, they constrain the range and the range rate and may allow us to reduce the size of the admissible region. To compute a preliminary orbit starting from two attributables, for each VA (selected in the admissible region of the first arc) we consider the prediction at the time of the second and its covariance matrix, and we compare them with the attributable of the second arc with its covariance. By using the identification penalty (as in the algorithms for orbit identification) we can select as a preliminary orbit the VAs that fit both arcs together in the 8-dimensional space. Two attributables may not be enough to compute an orbit with convergent differential corrections. The preliminary orbit is used in a constrained differential correction, providing solutions along the Line Of Variations, to be used as second-generation VAs to predict the observations at the time of a third arc. In general the identification with a third arc ensures a well-determined orbit.
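
    A minimal sketch of the covariance-weighted comparison that underlies the identification penalty is given below; it is a simplified Mahalanobis-style discrepancy between a predicted and an observed attributable, not the exact penalty used by the authors, and the numerical values are hypothetical.

      import numpy as np

      def identification_penalty(att_pred, cov_pred, att_obs, cov_obs):
          """Covariance-weighted discrepancy between a predicted and an observed
          attributable (alpha, delta, alpha_dot, delta_dot).  Small values mean
          the two arcs are compatible with a single object."""
          d = np.asarray(att_obs, float) - np.asarray(att_pred, float)
          combined = np.asarray(cov_pred, float) + np.asarray(cov_obs, float)
          return float(d @ np.linalg.solve(combined, d))

      # Hypothetical 4-vectors (angles in rad, rates in rad/day) and covariances.
      pred = [1.2000, 0.3000, 0.010, -0.002]
      obs = [1.2002, 0.2999, 0.011, -0.002]
      cov = np.diag([1e-8, 1e-8, 1e-6, 1e-6])
      print(identification_penalty(pred, cov, obs, cov))  # ~3.0 here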

  20. Computational Electromagnetic Modeling of SansEC(Trade Mark) Sensors

    NASA Technical Reports Server (NTRS)

    Smith, Laura J.; Dudley, Kenneth L.; Szatkowski, George N.

    2011-01-01

    This paper describes the preliminary effort to apply computational design tools to aid in the development of an electromagnetic SansEC resonant sensor composite materials damage detection system. The computational methods and models employed on this research problem will evolve in complexity over time and will lead to the development of new computational methods and experimental sensor systems that demonstrate the capability to detect, diagnose, and monitor the damage of composite materials and structures on aerospace vehicles.

  1. Preliminary Results of Autotuning GEMM Kernels for the NVIDIA Kepler Architecture- GeForce GTX 680

    SciTech Connect

    Kurzak, Jakub; Luszczek, Pitor; Tomov, Stanimire; Dongarra, Jack

    2012-04-01

    Kepler is the newest GPU architecture from NVIDIA, and the GTX 680 is the first commercially available graphics card based on that architecture. Matrix multiplication is a canonical computational kernel, and often the main target of initial optimization efforts for a new chip. This article presents preliminary results of automatically tuning matrix multiplication kernels for the Kepler architecture using the GTX 680 card.
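
    The autotuning loop itself (enumerate candidate kernel configurations, benchmark each, keep the fastest) is generic; the sketch below illustrates it with a toy NumPy blocked multiply standing in for real Kepler CUDA kernel variants, so the tile sizes and the timing harness are placeholders rather than the actual tuner.

      import time
      import numpy as np

      def blocked_gemm(A, B, tile):
          """Toy blocked matrix multiply standing in for a tunable GPU kernel."""
          n = A.shape[0]
          C = np.zeros_like(A)
          for i in range(0, n, tile):
              for j in range(0, n, tile):
                  for k in range(0, n, tile):
                      C[i:i+tile, j:j+tile] += A[i:i+tile, k:k+tile] @ B[k:k+tile, j:j+tile]
          return C

      def autotune(n=512, candidate_tiles=(32, 64, 128, 256)):
          """Benchmark each candidate configuration and return the fastest."""
          A, B = np.random.rand(n, n), np.random.rand(n, n)
          timings = {}
          for tile in candidate_tiles:
              t0 = time.perf_counter()
              blocked_gemm(A, B, tile)
              timings[tile] = time.perf_counter() - t0
          return min(timings, key=timings.get), timings

      best_tile, timings = autotune()
      print("fastest tile size:", best_tile)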

  2. CALAS: Carpathian laser strainmeter: a project and preliminary results

    NASA Astrophysics Data System (ADS)

    Garoi, F.; Apostol, D.; Damian, V.; Logofătu, P. C.; Ioniţă, B. F.; Lazar, J.; Molesini, G.; Papadopoulos, T.; Ionescu, C.; Ţugui, Andreea

    2008-06-01

    A laser strainmeter for in-situ monitoring of an important seismically active area of Europe, namely the Vrancea region in Romania, is proposed. Six groups from four different countries (Romania, Czech Republic, Italy and Greece) with various areas of expertise (e.g. geophysics, lasers, optics, interferometry, and mechanics) are involved in order to sustain the complexity of the project. This paper presents some preliminary laboratory experiments related to measuring relative displacements with a stable interferometer. Displacements of the order of tens to hundreds of nanometers (80 to 285 nm) were measured with an uncertainty of ±1 nm. A computer algorithm was used to process the interferograms.
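
    The phase-to-displacement conversion behind such interferometric measurements is compact; the sketch below assumes a double-pass Michelson-type geometry (one full fringe per half wavelength of mirror motion) and a He-Ne wavelength, either of which may differ from the actual CALAS optics.

      import numpy as np

      WAVELENGTH_NM = 632.8  # assumed He-Ne source; the CALAS laser may differ

      def displacement_nm(delta_phase_rad, wavelength_nm=WAVELENGTH_NM):
          """Mirror displacement for a double-pass interferometer.

          A full 2*pi fringe corresponds to lambda/2 of mirror motion,
          so d = delta_phi * lambda / (4 * pi).
          """
          return delta_phase_rad * wavelength_nm / (4.0 * np.pi)

      print(f"{displacement_nm(np.pi / 2):.1f} nm")  # a quarter fringe, about 79 nm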

  3. Preliminary Study for a Tetrahedron Formation: Quality Factors and Visualization

    NASA Technical Reports Server (NTRS)

    Guzman, Jose J.; Schiff, Conrad; Bauer, Frank (Technical Monitor)

    2002-01-01

    Spacecraft flying in tetrahedron formations are excellent for electromagnetic and plasma studies. The quality of the science recorded is strongly affected by the tetrahedron evolution. This paper is a preliminary study on the computation of quality factors and visualization for a formation of four or five satellites. Four of the satellites are arranged geometrically in a tetrahedron shape. If a fifth satellite is present, it is arbitrarily initialized at the geometric center of the tetrahedron. The fifth satellite could act as a collector or as a spare spacecraft. Tetrahedron natural coordinates are employed for the initialization. The natural orbit evolution is visualized in geocentric equatorial inertial and in geocentric solar magnetospheric coordinates.
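
    One commonly used geometric quality factor compares the actual tetrahedron volume with that of a regular tetrahedron whose side equals the mean inter-spacecraft distance; the sketch below implements that particular choice, which may differ from the exact factors adopted in the study.

      import numpy as np
      from itertools import combinations

      def tetrahedron_quality(positions):
          """Volume-based quality factor for a four-spacecraft formation:
          1.0 for a regular tetrahedron, 0.0 when the formation degenerates
          into a plane or a line."""
          r = [np.asarray(p, float) for p in positions]
          volume = abs(np.dot(r[1] - r[0], np.cross(r[2] - r[0], r[3] - r[0]))) / 6.0
          mean_side = np.mean([np.linalg.norm(a - b) for a, b in combinations(r, 2)])
          ideal_volume = mean_side ** 3 / (6.0 * np.sqrt(2.0))
          return volume / ideal_volume

      # Regular unit tetrahedron -> quality factor close to 1.
      regular = [(0, 0, 0), (1, 0, 0), (0.5, np.sqrt(3) / 2, 0),
                 (0.5, np.sqrt(3) / 6, np.sqrt(2.0 / 3.0))]
      print(round(tetrahedron_quality(regular), 3))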

  4. Optical scattering (TAOS) by tire debris particles: preliminary results

    NASA Astrophysics Data System (ADS)

    Crosta, Giovanni F.; Camatini, Marina C.; Zomer, Simeone; Holler, Stephen; Pan, Yongle; Bhaskara, Praveena; Muangchareon, Pongphisanu; Sung, Changmo; Cencetti, Simone; Regazzoni, Claudia

    2001-03-01

    Tire debris particles from low severity laboratory wear tests have been investigated by the TAOS optical scattering facility at Yale University. The incident wavelength is 532 nm. After the TAOS event some particle samples have been imaged by a scanning electron microscope and microanalyzed. The TAOS intensity patterns recorded within a solid angle in the backward sector have been processed by cluster analysis and compared with the patterns computed by a T-matrix code. Preliminary agreement has been found between TAOS data and the particle models (size, shape, refractive index). The purpose of the investigation is to obtain signatures of the material, based on its TAOS pattern.

  5. Optical scattering (TAOS) by tire debris particles: preliminary results.

    PubMed

    Crosta, G; Camatini, M; Zomer, S; Holler, S; Pan, Y; Bhaskara, P; Muangchareon, P; Sung, C; Cencetti, S; Regazzoni, C

    2001-03-12

    Tire debris particles from low severity laboratory wear tests have been investigated by the TAOS optical scattering facility at Yale University. The incident wavelength is 532 nm. After the TAOS event some particle samples have been imaged by a scanning electron microscope and microanalyzed. The TAOS intensity patterns recorded within a solid angle in the backward sector have been processed by cluster analysis and compared with the patterns computed by a T-matrix code. Preliminary agreement has been found between TAOS data and the particle models (size, shape, refractive index). The purpose of the investigation is to obtain signatures of the material, based on its TAOS pattern.

  6. A preliminary experiment definition for video landmark acquisition and tracking

    NASA Technical Reports Server (NTRS)

    Schappell, R. T.; Tietz, J. C.; Hulstrom, R. L.; Cunningham, R. A.; Reel, G. M.

    1976-01-01

    Six scientific objectives/experiments were derived which consisted of agriculture/forestry/range resources, land use, geology/mineral resources, water resources, marine resources and environmental surveys. Computer calculations were then made of the spectral radiance signature of each of 25 candidate targets as seen by a satellite sensor system. An imaging system capable of recognizing, acquiring and tracking specific generic type surface features was defined. A preliminary experiment definition and design of a video Landmark Acquisition and Tracking system is given. This device will search a 10-mile swath while orbiting the earth, looking for land/water interfaces such as coastlines and rivers.

  7. GRIMD: distributed computing for chemists and biologists

    PubMed Central

    Piotto, Stefano; Biasi, Luigi Di; Concilio, Simona; Castiglione, Aniello; Cattaneo, Giuseppe

    2014-01-01

    Motivation: Biologists and chemists are facing problems of high computational complexity that require the use of several computers organized in clusters or in specialized grids. Examples of such problems can be found in molecular dynamics (MD), in silico screening, and genome analysis. Grid Computing and Cloud Computing are becoming prevalent mainly because of their competitive performance/cost ratio. Regrettably, the diffusion of Grid Computing is strongly limited by two main factors: it is confined to scientists with a strong Computer Science background, and the analysis of the large amount of data produced can be cumbersome. We have developed a package named GRIMD to provide an easy and flexible implementation of distributed computing for the Bioinformatics community. GRIMD is very easy to install and maintain, and it does not require any specific Computer Science skill. Moreover, it permits preliminary analysis on the distributed machines to reduce the amount of data to transfer. GRIMD is very flexible because it shields the typical computational biologist from the need to write specific code for tasks such as molecular dynamics or docking calculations. Furthermore, it permits efficient use of GPU cards whenever possible. GRIMD calculations scale almost linearly and therefore exploit each machine in the network efficiently. Here, we provide a few examples of grid computing in computational biology (MD and docking) and bioinformatics (proteome analysis). Availability: GRIMD is available for free for noncommercial research at www.yadamp.unisa.it/grimd Supplementary information: www.yadamp.unisa.it/grimd/howto.aspx PMID:24516326
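
    The master/worker distribution pattern that such a package implements can be illustrated generically with Python's standard multiprocessing module; the sketch below shows only the pattern, it is in no way GRIMD's actual interface, and the per-task "docking score" is a dummy stand-in.

      from multiprocessing import Pool

      def score_ligand(ligand_id):
          """Stand-in for one expensive task (e.g. a single docking run)."""
          return ligand_id, sum(ord(c) for c in ligand_id)  # dummy "score"

      if __name__ == "__main__":
          tasks = [f"ligand_{i:04d}" for i in range(100)]
          with Pool(processes=8) as pool:  # workers ~ available cores or machines
              results = dict(pool.map(score_ligand, tasks))
          best = max(results, key=results.get)
          print("best candidate:", best, results[best])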

  8. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  9. The Effects of Integrating Service Learning into Computer Science: An Inter-Institutional Longitudinal Study

    ERIC Educational Resources Information Center

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-01-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…

  10. Quantum computers.

    PubMed

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  11. Qubus computation

    NASA Astrophysics Data System (ADS)

    Munro, W. J.; Nemoto, Kae; Spiller, T. P.; van Loock, P.; Braunstein, Samuel L.; Milburn, G. J.

    2006-08-01

    Processing information quantum mechanically is known to enable new communication and computational scenarios that cannot be accessed with conventional information technology (IT). We present here a new approach to scalable quantum computing---a "qubus computer"---which realizes qubit measurement and quantum gates through interacting qubits with a quantum communication bus mode. The qubits could be "static" matter qubits or "flying" optical qubits, but the scheme we focus on here is particularly suited to matter qubits. Universal two-qubit quantum gates may be effected by schemes which involve measurement of the bus mode, or by schemes where the bus disentangles automatically and no measurement is needed. This approach enables a parity gate between qubits, mediated by a bus, enabling near-deterministic Bell state measurement and entangling gates. Our approach is therefore the basis for very efficient, scalable QIP, and provides a natural method for distributing such processing, combining it with quantum communication.

  12. Computational Psychiatry

    PubMed Central

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  13. Preliminary A&PCT multiple detector design

    SciTech Connect

    Roberson, G. P.; Martz, H. E.; Camp, D. C.; Decman, D. J.; Johansson, E. M.

    1997-06-30

    The next-generation, multi-detector active and passive computed tomography (A&PCT) scanner will be optimized for speed and accuracy. At the Lawrence Livermore National Lab (LLNL) we have demonstrated the trade-offs between different A&PCT design parameters that affect the speed and quality of the assay results. These fundamental parameters govern the optimum system design. Although the multi-detector scanner design places priority on speed to increase waste drum throughput, higher speed should not compromise assay accuracy. One way to increase the speed of the A&PCT technology is to use multiple detectors. This yields a linear speedup by a factor approximately equal to the number of detectors used, without a compromise in system accuracy. There are many different design scenarios that can be developed using multiple detectors. Here we describe four different scenarios and discuss the trade-offs between them. Also, some considerations are given in this design description for the implementation of a multiple detector technology in a field-deployable mobile trailer system.

  14. Preliminary ECLSS waste water model

    NASA Technical Reports Server (NTRS)

    Carter, Donald L.; Holder, Donald W., Jr.; Alexander, Kevin; Shaw, R. G.; Hayase, John K.

    1991-01-01

    A preliminary waste water model for input to the Space Station Freedom (SSF) Environmental Control and Life Support System (ECLSS) Water Processor (WP) has been generated for design purposes. Data have been compiled from various ECLSS tests and flight sample analyses. A discussion of the characterization of the waste streams comprising the model is presented, along with a discussion of the waste water model and the rationale for the inclusion of contaminants in their respective concentrations. The major objective is to establish a methodology for the development of a waste water model and to present the current state of that model.

  15. Preliminary considerations concerning actinide solubilities

    SciTech Connect

    Newton, T.W.; Bayhurst, B.P.; Daniels, W.R.; Erdal, B.R.; Ogard, A.E.

    1980-01-01

    Work at the Los Alamos Scientific Laboratory on the fundamental solution chemistry of the actinides has thus far been confined to preliminary considerations of the problems involved in developing an understanding of the precipitation and dissolution behavior of actinide compounds under environmental conditions. Attempts have been made to calculate solubility as a function of Eh and pH using the appropriate thermodynamic data; results have been presented in terms of contour maps showing lines of constant solubility as a function of Eh and pH. Possible methods of control of the redox potential of rock-groundwater systems by the use of Eh buffers (redox couples) is presented.
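
    The mass-action arithmetic behind such Eh-pH solubility contours can be illustrated with a deliberately hypothetical solid and equilibrium constant; the sketch below only shows how a log-solubility line follows from pH and says nothing about real actinide data.

      LOG_K = -8.0  # hypothetical log10 equilibrium constant, not a measured value

      def log_solubility(pH, log_k=LOG_K):
          """log10 molar solubility of a hypothetical solid M(OH)4 versus pH.

          For the dissolution M(OH)4(s) + 4 H+ = M(4+) + 4 H2O with constant K,
          [M4+] = K * [H+]**4, hence log10 [M4+] = log10 K - 4 * pH.  An Eh
          dependence enters the same way once redox couples between oxidation
          states of the element are included.
          """
          return log_k - 4.0 * float(pH)

      for ph in (4.0, 6.0, 8.0):
          print(f"pH {ph}: log10 solubility = {log_solubility(ph):.1f}")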

  16. Dielectric cure monitoring: Preliminary studies

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Semmel, M. L.

    1984-01-01

    Preliminary studies have been conducted on two types of dielectric cure monitoring systems employing both epoxy resins and phenolic composites. An Audrey System was used for 23 cure monitoring runs with very limited success. Nine complete cure monitoring runs have been investigated using a Micromet System. Two additional measurements were performed to investigate the Micromet's sensitivity to water absorption in a post-cure carbon-phenolic material. While further work is needed to determine data significance, the Micromet system appears to show promise as a feedback control device during processing.

  17. Computational mechanics

    SciTech Connect

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable in driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  18. Payload/orbiter contamination control requirement study: Computer interface

    NASA Technical Reports Server (NTRS)

    Bareiss, L. E.; Hooper, V. W.; Ress, E. B.; Strange, D. A.

    1976-01-01

    A preliminary assessment of the computer interface requirements of the Spacelab configuration contamination computer model was conducted to determine the compatibility of the program, as presently formatted, with the computer facilities at MSFC. The necessary Spacelab model modifications are pointed out. The MSFC computer facilities and their future plans are described, and characteristics of the various computers as to availability and suitability for processing the contamination program are discussed. A listing of the CDC 6000 series and UNIVAC 1108 characteristics is presented so that programming requirements can be compared directly and differences noted.

  19. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  20. LHC Computing

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  1. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  2. Topaz II preliminary safety assessment

    NASA Astrophysics Data System (ADS)

    Marshall, Albert C.; Standley, Vaughn; Voss, Susan S.; Haskin, Eric

    1993-01-01

    The Strategic Defense Initiative Organization (SDIO) decided to investigate the possibility of launching a Russian Topaz II space nuclear power system. A preliminary safety assessment was conducted to determine whether or not a space mission could be conducted safely and within budget constraints. As part of this assessment, a safety policy and safety functional requirements were developed to guide both the safety assessment and future Topaz II activities. A review of the Russian flight safety program was conducted and documented. Our preliminary safety assessment included a top level event tree, neutronic analysis of normal and accident configurations, an evaluation of temperature coefficients of reactivity, a reentry and disposal analysis, and analysis of postulated launch abort impact accidents, and an analysis of postulated propellant fire and explosion accidents. Based on the assessment, it appears that it will be possible to safely launch the Topaz II system in the U.S. with some possible system modifications. The principal system modifications will probably include design changes to preclude water flooded criticality and to assure intact reentry.

  3. Preliminary Investigation of a Paraglider

    NASA Technical Reports Server (NTRS)

    Rogallo, Francis M.; Lowry, John G.; Croom, Delwin R.; Taylor, Robert T.

    1960-01-01

    A preliminary investigation of the aerodynamic and control characteristics of a flexible glider similar to a parachute in construction has been made at the Langley Research Center to evaluate its capabilities as a reentry glider. Preliminary weight estimates of the proposed vehicle indicate that such a structure can be made with extremely low wing loading. Maximum temperatures during the reentry maneuver might be held as low as about 1,500° F. The results of wind-tunnel and free-glide tests show that the glider when constructed of nonporous material performed extremely well at subsonic speeds and could be flown at angles of attack from about 20° to 90°. At supersonic speeds the wing showed none of the unfavorable tendencies exhibited by conventional parachutes at these speeds, such as squidding and breathing. Several methods of packing and deploying the glider have been successfully demonstrated. The results of this study indicate that this flexible-lifting-surface concept may provide a lightweight controllable paraglider for manned space vehicles.

  4. Descent Advisor Preliminary Field Test

    NASA Technical Reports Server (NTRS)

    Green, Steven M.; Vivona, Robert A.; Sanford, Beverly

    1995-01-01

    A field test of the Descent Advisor (DA) automation tool was conducted at the Denver Air Route Traffic Control Center in September 1994. DA is being developed to assist Center controllers in the efficient management and control of arrival traffic. DA generates advisories, based on trajectory predictions, to achieve accurate meter-fix arrival times in a fuel efficient manner while assisting the controller with the prediction and resolution of potential conflicts. The test objectives were: (1) to evaluate the accuracy of DA trajectory predictions for conventional and flight-management system equipped jet transports, (2) to identify significant sources of trajectory prediction error, and (3) to investigate procedural and training issues (both air and ground) associated with DA operations. Various commercial aircraft (97 flights total) and a Boeing 737-100 research aircraft participated in the test. Preliminary results from the primary test set of 24 commercial flights indicate a mean DA arrival time prediction error of 2.4 seconds late with a standard deviation of 13.1 seconds. This paper describes the field test and presents preliminary results for the commercial flights.

  5. Topaz II preliminary safety assessment

    SciTech Connect

    Marshall, A.C. ); Standley, V. ); Voss, S.S. ); Haskin, E. . Dept. of Chemical and Nuclear Engineering)

    1992-01-01

    The Strategic Defense Initiative Organization (SDIO) decided to investigate the possibility of launching a Russian Topaz II space nuclear power system. A preliminary safety assessment was conducted to determine whether or not a space mission could be conducted safely and within budget constraints. As part of this assessment, a safety policy and safety functional requirements were developed to guide both the safety assessment and future Topaz II activities. A review of the Russian flight safety program was conducted and documented. Our preliminary safety assessment included a top level event tree, neutronic analysis of normal and accident configurations, an evaluation of temperature coefficients of reactivity, a reentry and disposal analysis, an analysis of postulated launch abort impact accidents, and an analysis of postulated propellant fire and explosion accidents. Based on the assessment, it appears that it will be possible to safely launch the Topaz II system in the US with some possible system modifications. The principal system modifications will probably include design changes to preclude water flooded criticality and to assure intact reentry.

  6. Topaz II preliminary safety assessment

    SciTech Connect

    Marshall, A.C. ); Standley, V. ); Voss, S.S. ); Haskin, E. )

    1993-01-10

    The Strategic Defense Initiative Organization (SDIO) decided to investigate the possibility of launching a Russian Topaz II space nuclear power system. A preliminary safety assessment was conducted to determine whether or not a space mission could be conducted safely and within budget constraints. As part of this assessment, a safety policy and safety functional requirements were developed to guide both the safety assessment and future Topaz II activities. A review of the Russian flight safety program was conducted and documented. Our preliminary safety assessment included a top level event tree, neutronic analysis of normal and accident configurations, an evaluation of temperature coefficients of reactivity, a reentry and disposal analysis, and analysis of postulated launch abort impact accidents, and an analysis of postulated propellant fire and explosion accidents. Based on the assessment, it appears that it will be possible to safely launch the Topaz II system in the U.S. with some possible system modifications. The principal system modifications will probably include design changes to preclude water flooded criticality and to assure intact reentry.

  7. Computational Hearing

    DTIC Science & Technology

    1998-11-01

    ... ranging from the anatomy and physiology of the auditory pathway to the perception of speech and music under both ideal and not-so-ideal (but more ... physiology of various parts of the auditory pathway, to auditory prostheses, speech and audio coding, computational models of pitch and timbre, the role of ...

  8. Library Computing.

    ERIC Educational Resources Information Center

    Dayall, Susan A.; And Others

    1987-01-01

    Six articles on computers in libraries discuss training librarians and staff to use new software; appropriate technology; system upgrades of the Research Libraries Group's information system; pre-IBM PC microcomputers; multiuser systems for small to medium-sized libraries; and a library user's view of the traditional card catalog. (EM)

  9. Computational trigonometry

    SciTech Connect

    Gustafson, K.

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
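
    For a symmetric positive definite matrix the first antieigenvalue has a closed form in the extreme eigenvalues, and its companion sine equals the classical per-step error-reduction bound for steepest descent; the sketch below computes only these two scalars, not the full antieigenvector machinery.

      import numpy as np

      def matrix_angle(A):
          """First antieigenvalue cos(phi) and companion sin(phi) of an SPD matrix:
          cos(phi) = 2*sqrt(lam_min*lam_max) / (lam_min + lam_max),
          sin(phi) = (lam_max - lam_min) / (lam_max + lam_min),
          the latter being the classical steepest-descent error-reduction bound."""
          lam = np.linalg.eigvalsh(A)
          lo, hi = lam[0], lam[-1]
          return 2.0 * np.sqrt(lo * hi) / (lo + hi), (hi - lo) / (hi + lo)

      A = np.diag([1.0, 4.0, 9.0])
      print(matrix_angle(A))  # (0.6, 0.8) for extreme eigenvalues 1 and 9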

  10. Computational Estimation

    ERIC Educational Resources Information Center

    Fung, Maria G.; Latulippe, Christine L.

    2010-01-01

    Elementary school teachers are responsible for constructing the foundation of number sense in youngsters, and so it is recommended that teacher-training programs include an emphasis on number sense to ensure the development of dynamic, productive computation and estimation skills in students. To better prepare preservice elementary school teachers…

  11. Business Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    A brief definition of some fundamentals of microcomputers and of the ways they may be used in small businesses can help potential buyers make informed purchases. Hardware (the mechanical devices from which computers are made) described here are the video display, keyboard, central processing unit, "random access" and "read only" memories, cassette…

  12. Computer Guerrillas.

    ERIC Educational Resources Information Center

    Immel, A. Richard

    1983-01-01

    Describes several cases in which microcomputers were used to prevent large organizations (e.g., utility companies, U.S. Government Forestry Commission) from carrying out actions considered not to be in the public's best interests. The use of the computers by social activists in their efforts to halt environmental destruction is discussed. (EAO)

  13. Computer Corner.

    ERIC Educational Resources Information Center

    Mason, Margie

    1985-01-01

    This article: describes how to prevent pins on game paddles from breaking; suggests using needlepoint books for ideas to design computer graphics; lists a BASIC program to create a Christmas tree, with extension activities; suggests a LOGO Christmas activity; and describes a book on the development of microcomputers. (JN)

  14. Computational Physics.

    ERIC Educational Resources Information Center

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the…

  15. Computational Musicology.

    ERIC Educational Resources Information Center

    Bel, Bernard; Vecchione, Bernard

    1993-01-01

    Asserts that a revolution has been occurring in musicology since the 1970s. Contends that music has changed from being only a source of emotion to appearing more open to science and techniques based on computer technology. Describes recent research and other writings about the topic and provides an extensive bibliography. (CFR)

  16. Computer Corner.

    ERIC Educational Resources Information Center

    Smith, David A.; And Others

    1986-01-01

    APL was invented specifically as a mathematical teaching tool, and is an excellent vehicle for teaching mathematical concepts using computers. This article illustrates the use of APL in teaching many different topics in mathematics, including logic, set theory, functions, statistics, linear algebra, and matrices. (MNS)

  17. Networking computers.

    PubMed

    McBride, D C

    1997-03-01

    This decade the role of the personal computer has shifted dramatically from a desktop device designed to increase individual productivity and efficiency to an instrument of communication linking people and machines in different places with one another. A computer in one city can communicate with another that may be thousands of miles away. Networking is how this is accomplished. Just like the voice network used by the telephone, computer networks transmit data and other information via modems over these same telephone lines. A network can be created over both short and long distances. Networks can be established within a hospital or medical building or over many hospitals or buildings covering many geographic areas. Those confined to one location are called LANs, local area networks. Those that link computers in one building to those at other locations are known as WANs, or wide area networks. The ultimate wide area network is the one we've all been hearing so much about these days--the Internet, and its World Wide Web. Setting up a network is a process that requires careful planning and commitment. To avoid potential pitfalls and to make certain the network you establish meets your needs today and several years down the road, several steps need to be followed. This article reviews the initial steps involved in getting ready to network.

  18. COMPUTATIONAL SOCIOLINGUISTICS.

    ERIC Educational Resources Information Center

    SEDELOW, WALTER A., JR.

    THE USE OF THE COMPUTER MAY BE ONE OF THE WAYS IN WHICH VARIED LINGUISTIC INTERESTS (SOCIOLINGUISTICS, PSYCHOLINGUISTICS) COME TO BE RENDERED INTERRELATED AND EVEN INTELLECTUALLY COHERENT. (THE CRITERION OF COHERENCE IS SET HERE AT MONISM AS TO MODELS.) ONE OF THE AUTHOR'S MAJOR INTERESTS IS A SYSTEMATIC APPROACH TO SCIENTIFIC CREATIVITY,…

  19. Computational Mathematics

    DTIC Science & Technology

    2012-03-06

    (Marsha Berger, NYU) Inclusion of the Adaptation/Adjoint module, Embedded Boundary Methods in the software package Cart3D; transition to NASA ... ONR, DOE, AFRL, DIA. Cart3D used for computing formation flight to reduce drag and improve energy efficiency; application to Explosively Formed ...

  20. Aerodynamic preliminary analysis system 2. Part 2: User's manual

    NASA Technical Reports Server (NTRS)

    Sova, G.; Divan, P.; Spacht, L.

    1991-01-01

    An aerodynamic analysis system based on potential theory at subsonic and/or supersonic speeds and impact type finite element solutions at hypersonic conditions is described. Three dimensional configurations having multiple nonplanar surfaces of arbitrary planforms and bodies of noncircular contour may be analyzed. Static, rotary, and control longitudinal and lateral-directional characteristics may be generated. The analysis was implemented on a time sharing system in conjunction with an input tablet digitizer and an interactive graphics input/output display and editing terminal to maximize its responsiveness to the preliminary analysis. Computation times on an IBM 3081 are typically less than one minute of CPU time per Mach number at subsonic, supersonic, or hypersonic speeds. This is the user's manual for the computer program.
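
    The impact-type solutions used at hypersonic conditions are typically Newtonian surface-inclination rules; the sketch below evaluates the classical Cp = Cp_max * sin^2(theta) rule on a single panel as an illustration of that family of methods, not of APAS itself.

      import numpy as np

      def newtonian_cp(panel_normal, freestream_dir, cp_max=2.0):
          """Impact-type (Newtonian) pressure coefficient on a flat panel.

          Cp = cp_max * sin^2(theta) on windward panels, where theta is the
          local inclination of the surface to the free stream; Cp = 0 on
          shadowed panels.  cp_max = 2.0 is classical Newtonian flow; modified
          Newtonian would use the stagnation-point value instead.
          """
          n = np.asarray(panel_normal, float)
          v = np.asarray(freestream_dir, float)
          n, v = n / np.linalg.norm(n), v / np.linalg.norm(v)
          sin_theta = -np.dot(n, v)  # positive only when the panel faces the flow
          return cp_max * sin_theta ** 2 if sin_theta > 0.0 else 0.0

      # Flat plate pitched 10 degrees nose-up into a freestream along +x;
      # its windward-side outward normal points slightly upstream and downward.
      alpha = np.radians(10.0)
      print(round(newtonian_cp((-np.sin(alpha), 0.0, -np.cos(alpha)), (1.0, 0.0, 0.0)), 4))  # ~0.0603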

  1. Geometric Computational Mechanics and Optimal Control

    DTIC Science & Technology

    2011-12-02

    ... methods. Further methods that depend on global optimization problems are in development and preliminary versions of these results, many of which ... de la Sociedad Española de Matemática Aplicada (SeMA), 50, 2010, pp. 61-81. K. Flaßkamp, S. Ober-Blöbaum, M. Kobilarov, Solving optimal control ... continuous setting. Consequently, globally optimal methods for computing optimal trajectories for vehicles with complex dynamics were developed.

  2. Orbital transfer rocket engine technology 7.5K-LB thrust rocket engine preliminary design

    NASA Technical Reports Server (NTRS)

    Harmon, T. J.; Roschak, E.

    1993-01-01

    A preliminary design of an advanced LOX/LH2 expander cycle rocket engine producing 7,500 lbf thrust for Orbital Transfer Vehicle missions was completed. Engine system, component, and turbomachinery analyses at both on-design and off-design conditions were completed. The preliminary design analysis results showed that engine requirements and performance goals were met. Computer models are described and model outputs are presented. Engine system assembly layouts, component layouts, and valve and control system analysis are presented. Major design technologies were identified and remaining issues and concerns were listed.

  3. 37 CFR 1.115 - Preliminary amendments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... amendment unduly interferes with the preparation of a first Office action in an application. Factors that... first Office action as of the date of receipt (§ 1.6) of the preliminary amendment by the Office; and... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Preliminary amendments....

  4. 37 CFR 1.115 - Preliminary amendments.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... amendment unduly interferes with the preparation of a first Office action in an application. Factors that... first Office action as of the date of receipt (§ 1.6) of the preliminary amendment by the Office; and... 37 Patents, Trademarks, and Copyrights 1 2014-07-01 2014-07-01 false Preliminary amendments....

  5. 23 CFR 645.109 - Preliminary engineering.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Preliminary engineering. 645.109 Section 645.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS UTILITIES Utility Relocations, Adjustments, and Reimbursement § 645.109 Preliminary engineering. (a)...

  6. 23 CFR 645.109 - Preliminary engineering.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Preliminary engineering. 645.109 Section 645.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS UTILITIES Utility Relocations, Adjustments, and Reimbursement § 645.109 Preliminary engineering. (a)...

  7. 23 CFR 645.109 - Preliminary engineering.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Preliminary engineering. 645.109 Section 645.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS UTILITIES Utility Relocations, Adjustments, and Reimbursement § 645.109 Preliminary engineering. (a)...

  8. 23 CFR 645.109 - Preliminary engineering.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 23 Highways 1 2011-04-01 2011-04-01 false Preliminary engineering. 645.109 Section 645.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS UTILITIES Utility Relocations, Adjustments, and Reimbursement § 645.109 Preliminary engineering. (a)...

  9. 23 CFR 645.109 - Preliminary engineering.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Preliminary engineering. 645.109 Section 645.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS UTILITIES Utility Relocations, Adjustments, and Reimbursement § 645.109 Preliminary engineering. (a)...

  10. 18 CFR 1b.6 - Preliminary investigations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Preliminary investigations. 1b.6 Section 1b.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.6 Preliminary investigations....

  11. 18 CFR 1b.6 - Preliminary investigations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Preliminary investigations. 1b.6 Section 1b.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.6 Preliminary investigations....

  12. 32 CFR 1801.11 - Preliminary information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Preliminary information. 1801.11 Section 1801.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC RIGHTS UNDER THE PRIVACY ACT OF 1974 Filing Of Privacy Act Requests § 1801.11 Preliminary...

  13. 32 CFR 1801.11 - Preliminary information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Preliminary information. 1801.11 Section 1801.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC RIGHTS UNDER THE PRIVACY ACT OF 1974 Filing Of Privacy Act Requests § 1801.11 Preliminary...

  14. 32 CFR 1801.11 - Preliminary information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Preliminary information. 1801.11 Section 1801.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC RIGHTS UNDER THE PRIVACY ACT OF 1974 Filing Of Privacy Act Requests § 1801.11 Preliminary...

  15. 32 CFR 1801.11 - Preliminary information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 6 2012-07-01 2012-07-01 false Preliminary information. 1801.11 Section 1801.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC RIGHTS UNDER THE PRIVACY ACT OF 1974 Filing Of Privacy Act Requests § 1801.11 Preliminary...

  16. 32 CFR 1801.11 - Preliminary information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 6 2014-07-01 2014-07-01 false Preliminary information. 1801.11 Section 1801.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC RIGHTS UNDER THE PRIVACY ACT OF 1974 Filing Of Privacy Act Requests § 1801.11 Preliminary...

  17. 18 CFR 1b.6 - Preliminary investigations.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Preliminary investigations. 1b.6 Section 1b.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.6 Preliminary investigations....

  18. 18 CFR 1b.6 - Preliminary investigations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Preliminary investigations. 1b.6 Section 1b.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.6 Preliminary investigations....

  19. 18 CFR 1b.6 - Preliminary investigations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Preliminary investigations. 1b.6 Section 1b.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.6 Preliminary investigations....

  20. 18 CFR 806.11 - Preliminary consultations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... any proposed structures, anticipated water needs, and the environmental impacts. (b) Preliminary... 18 Conservation of Power and Water Resources 2 2013-04-01 2012-04-01 true Preliminary consultations. 806.11 Section 806.11 Conservation of Power and Water Resources SUSQUEHANNA RIVER...

  1. 18 CFR 806.11 - Preliminary consultations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... any proposed structures, anticipated water needs, and the environmental impacts. (b) Preliminary... 18 Conservation of Power and Water Resources 2 2014-04-01 2014-04-01 false Preliminary consultations. 806.11 Section 806.11 Conservation of Power and Water Resources SUSQUEHANNA RIVER...

  2. 18 CFR 806.11 - Preliminary consultations.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... any proposed structures, anticipated water needs, and the environmental impacts. (b) Preliminary... 18 Conservation of Power and Water Resources 2 2012-04-01 2012-04-01 false Preliminary consultations. 806.11 Section 806.11 Conservation of Power and Water Resources SUSQUEHANNA RIVER...

  3. 40 CFR 158.345 - Preliminary analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Preliminary analysis. 158.345 Section 158.345 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.345 Preliminary analysis. (a) If the product is produced...

  4. 40 CFR 158.345 - Preliminary analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Preliminary analysis. 158.345 Section 158.345 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.345 Preliminary analysis. (a) If the product is produced...

  5. 40 CFR 158.345 - Preliminary analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Preliminary analysis. 158.345 Section 158.345 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.345 Preliminary analysis. (a) If the product is produced...

  6. 40 CFR 158.345 - Preliminary analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Preliminary analysis. 158.345 Section 158.345 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.345 Preliminary analysis. (a) If the product is produced...

  7. 32 CFR 1901.11 - Preliminary information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Preliminary information. 1901.11 Section 1901.11 National Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PUBLIC RIGHTS UNDER THE PRIVACY ACT OF 1974 Filing of Privacy Act Requests § 1901.11 Preliminary information....

  8. 32 CFR 1901.11 - Preliminary information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Preliminary information. 1901.11 Section 1901.11 National Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PUBLIC RIGHTS UNDER THE PRIVACY ACT OF 1974 Filing of Privacy Act Requests § 1901.11 Preliminary information....

  9. Plutonium Immobilization Can Loading Preliminary Specifications

    SciTech Connect

    Kriikku, E.

    1998-11-25

    This report discusses the Plutonium Immobilization can loading preliminary equipment specifications and includes a process block diagram, process description, equipment list, preliminary equipment specifications, plan and elevation sketches, and some commercial catalogs. This report identifies loading pucks into cans and backfilling cans with helium as the top priority can loading development areas.

  10. RATIO COMPUTER

    DOEpatents

    Post, R.F.

    1958-11-11

    An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals, each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of the input signals, depending upon the relation of the input to the fixed signal in the first-mentioned channel.
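
    Purely as an illustration of the principle (not the patented circuit), the sketch below servos a shared channel gain until the comparison channel's output matches a fixed reference voltage; the same gain applied to the other channel then yields an output proportional to the quotient of the two inputs. The function name, loop gain, and voltages are hypothetical.

      def ratio_computer(v_signal, v_reference, v_fixed=1.0, loop_gain=0.5, steps=200):
          """Drive the shared gain g until g * v_reference matches v_fixed; the other
          channel then outputs g * v_signal = v_fixed * (v_signal / v_reference).
          The loop converges when 0 < loop_gain * v_reference < 2."""
          g = 1.0
          for _ in range(steps):
              error = v_fixed - g * v_reference  # difference signal from the comparison channel
              g += loop_gain * error             # feedback adjusts both channels' gain together
          return g * v_signal                    # proportional to v_signal / v_reference

      print(ratio_computer(6.0, 2.0))  # ~3.0, i.e. (6.0 / 2.0) scaled by v_fixed = 1.0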

  11. Computational Combustion

    SciTech Connect

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel and homogeneous charge, compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  12. Computational Physics

    NASA Astrophysics Data System (ADS)

    Thijssen, Jos

    2013-10-01

    1. Introduction; 2. Quantum scattering with a spherically symmetric potential; 3. The variational method for the Schrödinger equation; 4. The Hartree-Fock method; 5. Density functional theory; 6. Solving the Schrödinger equation in periodic solids; 7. Classical equilibrium statistical mechanics; 8. Molecular dynamics simulations; 9. Quantum molecular dynamics; 10. The Monte Carlo method; 11. Transfer matrix and diagonalisation of spin chains; 12. Quantum Monte Carlo methods; 13. The finite element method for partial differential equations; 14. The lattice Boltzmann method for fluid dynamics; 15. Computational methods for lattice field theories; 16. High performance computing and parallelism; Appendix A. Numerical methods; Appendix B. Random number generators; References; Index.

  13. Singularity computations

    NASA Technical Reports Server (NTRS)

    Swedlow, J. L.

    1976-01-01

    An approach is described for singularity computations based on a numerical method for elastoplastic flow to delineate radial and angular distribution of field quantities and measure the intensity of the singularity. The method is applicable to problems in solid mechanics and lends itself to certain types of heat flow and fluid motion studies. Its use is not limited to linear, elastic, small strain, or two-dimensional situations.

  14. Spatial Computation

    DTIC Science & Technology

    2003-12-01

    particular program, synthesized under compiler control from the application source code. The translation is illustrated in Figure 1.4. From now on, when we use...very efficient method of exploring the design of complex application-specific system-on-a-chip devices using only the application source code. • New...computation gates. This frees, but also complicates, the compilation process. In order to handle the great semantic gap between the source code and the

  15. Computational enzymology.

    PubMed

    Lonsdale, Richard; Ranaghan, Kara E; Mulholland, Adrian J

    2010-04-14

    Molecular simulations and modelling are changing the science of enzymology. Calculations can provide detailed, atomic-level insight into the fundamental mechanisms of biological catalysts. Computational enzymology is a rapidly developing area, and is testing theories of catalysis, challenging 'textbook' mechanisms, and identifying novel catalytic mechanisms. Increasingly, modelling is contributing directly to experimental studies of enzyme-catalysed reactions. Potential practical applications include interpretation of experimental data, catalyst design and drug development.

  16. Computational Electromagnetics

    DTIC Science & Technology

    2011-02-20

    a collaboration between Caltech’s postdoctoral associate N. Albin and OB) have shown that, for a variety of reasons, the first-order...KZK approximation", Nathan Albin , Oscar P. Bruno, Theresa Y. Cheung and Robin O. Cleveland, preprint, (2011) "A Spectral FC Solver for the Compressible...Navier-Stokes Equations in General Domains I: Explicit time-stepping" Nathan Albin and Oscar P. Bruno, To appear in Journal of Computational Physics

  17. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 2; Preliminary Results

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.

  18. Surveyor 3 Preliminary Science Results

    NASA Technical Reports Server (NTRS)

    1967-01-01

    Surveyor III soft-landed on the Moon at 00:04 GMT on April 20, 1967. Data obtained have significantly increased our knowledge of the Moon. The Surveyor III spacecraft was similar to Surveyor I; the only major change in scientific instrumentation was the addition of a soil mechanics surface sampler. Surveyor III results at this preliminary evaluation of data give valuable information about the relation between the surface skin of under-dense material responsible for the photometric properties and the deeper layers of material whose properties resemble those of ordinary terrestrial soils. In addition, they provide new insight into the relation between the general lunar surface as seen by Surveyor I and the interior of a large subdued crater. The new results have also contributed to our understanding of the mechanism of downhill transport. Many critical questions cannot, however, be answered until final reduction of experimental data.

  19. Psychohistory and Slavery: Preliminary Issues.

    PubMed

    Adams, Kenneth Alan

    2015-01-01

    "Psychohistory and Slavery: Preliminary Issues," begins an examination of slavery in the antebellum South. The paper suggests that how slavery and the group-fantasy of white male supremacy were perpetuated among slaveholders is a question of fundamental importance for psychohistorians. The family and childrearing are the focus of attention. Given the ferocity of slavery, it is argued that the psychological and emotional consequences of this barbarism were not limited to the slaves themselves, but had significant impact on the slaveholders as well-their parenting, their children, and their children's parenting of the next generation. In each generation the trauma of slavery was injected into slaveholder children and became a fundamental component of elite Southern personality.

  20. Genesis Preliminary Examination: Ellipsometry Overview

    NASA Technical Reports Server (NTRS)

    Stansbery, E. K.; McNamara, K. M.

    2005-01-01

    The Genesis spacecraft returned to Earth on September 8, 2004, experiencing a non-nominal reentry in which both the drogue and main parachutes failed to deploy causing the capsule to impact the surface of the UTTR desert at a speed of approximately 310 kph (193 mph). The impact caused severe damage to the capsule and a breach of the science canister in the field. The science canister was recovered and transported to the cleanroom at UTTR within approximately 8 hours of reentry. Although the ground water table did not rise to canister level before removal, damp soil and debris from the heat shield and other spacecraft components did enter the canister and contaminate some collector surfaces. The objective of preliminary examination of the Genesis collectors is to provide the science community with the information necessary to request the most useful samples for their analysis.

  1. Preliminary results of UCN τ

    NASA Astrophysics Data System (ADS)

    Pattie, Robert; UCNtau Collaboration

    2017-01-01

    There is currently a 4σ discrepancy between measurements of the neutron lifetime performed using cold neutron beams and those performed with ultracold neutron (UCN) storage vessels. The UCN τ experiment uses an asymmetric magneto-gravitational UCN trap with in situ counting of surviving neutrons to measure the neutron lifetime. This design eliminates a major systematic of previous bottle experiments related to the loss of UCN on material trap walls and to the unloading of neutrons from the storage vessel. A new in situ detection system used in the 2015-2016 run was able to measure the population of surviving UCN at different heights in the trap, providing important information on spectral evolution. Understanding the behavior of quasi-bound UCN in a bottle experiment is essential to achieving a subsecond precision measurement of τn. We will present the preliminary results from the 2015-2016 data set and an update on the UCN τ experiment.

  2. Monsoon '90 - Preliminary SAR results

    NASA Technical Reports Server (NTRS)

    Dubois, Pascale C.; Van Zyl, Jakob J.; Guerra, Abel G.

    1992-01-01

    Multifrequency polarimetric synthetic aperture radar (SAR) images of the Walnut Gulch watershed near Tombstone, Arizona were acquired on 28 Mar. 1990 and on 1 Aug. 1990. Trihedral corner reflectors were deployed prior to both overflights to allow calibration of the two SAR data sets. During both overflights, gravimetric soil moisture and dielectric constant measurements were made. Detailed vegetation height, density, and water content measurements were made as part of the Monsoon 1990 Experiment. Preliminary results based on analysis of the multitemporal polarimetric SAR data are presented. Only the C-band data (5.7-cm wavelength) radar images show significant difference between Mar. and Aug., with the strongest difference observed in the HV images. Based on the radar data analysis and the in situ measurements, we conclude that these differences are mainly due to changes in the vegetation and not due to the soil moisture changes.

  3. Monsoon 1990: Preliminary SAR results

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob J.; Dubois, Pascale; Guerra, Abel

    1991-01-01

    Multifrequency polarimetric synthetic aperture radar (SAR) images of the Walnut Gulch watershed near Tombstone, Arizona were acquired on 28 Mar. 1990 and on 1 Aug. 1990. Trihedral corner reflectors were deployed prior to both overflights to allow calibration of the two SAR data sets. During both overflights, gravimetric soil moisture and dielectric constant measurements were made. Detailed vegetation height, density, and water content measurements were made as part of the Monsoon 1990 Experiment. Preliminary results based on analysis of the multitemporal polarimetric SAR data are presented. Only the C-band data (5.7-cm wavelength) radar images show significant difference between Mar. and Aug., with the strongest difference observed in the HV images. Based on the radar data analysis and the in situ measurements, we conclude that these differences are mainly due to changes in the vegetation and not due to the soil moisture changes.

  4. Automatic system for computer program documentation

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.; Elliott, R. W.; Arseven, S.; Colunga, D.

    1972-01-01

    Work was done on a project to design an automatic system for computer program documentation aids, including a study to determine what existing programs could be used effectively to document computer programs. Results of the study are included in the form of an extensive bibliography and working papers on appropriate operating systems, text editors, program editors, data structures, standards, decision tables, flowchart systems, and proprietary documentation aids. The preliminary design for an automated documentation system is also included. An actual program has been documented in detail to demonstrate the types of output that can be produced by the proposed system.

  5. Classical problems in computational aero-acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

    In relation to the expected problems in the development of computational aeroacoustics (CAA), the preliminary applications were to classical problems where the known analytical solutions could be used to validate the numerical results. Such comparisons were used to overcome the numerical problems inherent in these calculations. Comparisons were made between the various numerical approaches to the problems such as direct simulations, acoustic analogies and acoustic/viscous splitting techniques. The aim was to demonstrate the applicability of CAA as a tool in the same class as computational fluid dynamics. The scattering problems that occur are considered and simple sources are discussed.

  6. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  7. Stardust interstellar preliminary examination (ISPE).

    SciTech Connect

    Westphal, A.J.; Allen, C.; Bajt, S.; Basset, R.; Flynn, G.L.; Sutton, S.

    2009-03-23

    The Stardust Interstellar Preliminary Examination (ISPE) is a three-year effort to characterize the Stardust interstellar dust collection and collector using non-destructive techniques. We summarize the status of the ISPE. In January 2006 the Stardust sample return capsule returned to Earth bearing the first solid samples from a primitive solar system body, Comet 81P/Wild2, and a collector dedicated to the capture and return of contemporary interstellar dust. Both collectors were approximately 0.1 m² in area and were composed of aerogel tiles (85% of the collecting area) and aluminum foils. The Stardust Interstellar Dust Collector (SIDC) was exposed to the interstellar dust stream for a total exposure factor of 20 m²-day during two periods before the cometary encounter. The Stardust Interstellar Preliminary Examination (ISPE) is a three-year effort to characterize the collection using nondestructive techniques. The goals and restrictions of the ISPE are described in Westphal et al. The ISPE consists of six interdependent projects: (1) Candidate identification through automated digital microscopy and a massively distributed, calibrated search; (2) Candidate extraction and photodocumentation; (3) Characterization of candidates through synchrotron-based Fourier-Transform Infrared Spectroscopy (FTIR), Scanning X-Ray Fluorescence Microscopy (SXRF), and Scanning Transmission X-ray Microscopy (STXM); (4) Search for and analysis of craters in foils through FESEM scanning, Auger Spectroscopy, and synchrotron-based Photoemission Electron Microscopy (PEEM); (5) Modeling of interstellar dust transport in the solar system; and (6) Laboratory simulations of hypervelocity dust impacts into the collecting media.

  8. Preliminary Investigation of an Active PLZT Lens

    NASA Technical Reports Server (NTRS)

    Lightsey, W. D.; Peters, B. R.; Reardon, P. J.; Wong, J. K.

    2001-01-01

    The design, analysis and preliminary testing of a prototype Adjustable Focus Optical Correction Lens (AFOCL) is described. The AFOCL is an active optical component composed of solid state lead lanthanum-modified zirconate titanate (PLZT) ferroelectric ceramic with patterned indium tin oxide (ITO) transparent surface electrodes that modulate the refractive index of the PLZT to function as an electro-optic lens. The AFOCL was developed to perform optical re-alignment and wavefront correction to enhance the performance of Ultra-Lightweight Structures and Space Observatories (ULSSO). The AFOCL has potential application as an active optical component within a larger optical system. As such, information from a wavefront sensor would be processed to provide input to the AFOCL to drive the sensed wavefront to the desired shape and location. While offering variable and rapid focussing capability (controlled wavefront manipulation) similar to liquid crystal based spatial light modulators (SLM), the AFOCL offers some potential advantages because it is a solid-state, stationary, low-mass, rugged, and thin optical element that can produce wavefront quality comparable to the solid refractive lens it replaces. The AFOCL acts as a positive or negative lens by producing a parabolic phase-shift in the PLZT material through the application of a controlled voltage potential across the ITO electrodes. To demonstrate the technology, a 4 mm diameter lens was fabricated to produce 5-waves of optical power operating at 2.051 micrometer wavelength. Optical metrology was performed on the device to measure focal length, optical quality, and efficiency for a variety of test configurations. The data was analyzed and compared to theoretical data available from computer-based models of the AFOCL.

  9. Preliminary results of radiation measurements on EURECA

    SciTech Connect

    Benton, E.V.; Frank, A.L.

    1995-03-01

    The eleven-month duration of the EURECA mission allows long-term radiation effects to be studied similarly to those of the Long Duration Exposure Facility (LDEF). Basic data can be generated for projections to crew doses and electronic and computer reliability on spacecraft missions. A radiation experiment has been designed for EURECA which uses passive integrating detectors to measure average radiation levels. The components include a Trackoscope, which employs fourteen plastic nuclear track detector (PNTD) stacks to measure the angular dependence of high-LET (≥ 6 keV/μm) radiation. Also included are TLDs for total absorbed doses, thermal/resonance neutron detectors (TRNDs) for low energy neutron fluences, and a thick PNTD stack for depth dependence measurements. LET spectra are derived from the PNTD measurements. Preliminary TLD results from seven levels within the detector array show that integrated dose inside the flight canister varied from 18.8 ± 0.6 cGy to 38.9 ± 1.2 cGy. The TLDs oriented toward the least shielded direction averaged 53% higher in dose than those oriented away from the least shielded direction (minimum shielding toward the least shielded direction varied from 1.13 to 7.9 g/cm², Al equivalent). The maximum dose rate on EURECA (1.16 mGy/day) was 37% of the maximum measured on LDEF, and dose rates at all depths were less than measured on LDEF. The shielding external to the flight canister covered a greater solid angle about the canister than the LDEF experiments.
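
    As a quick arithmetic cross-check of the figures above (taking the roughly eleven-month exposure to be about 335 days, an assumption made here only for illustration), the maximum integrated dose is consistent with the quoted 1.16 mGy/day maximum rate:

      max_dose_mgy = 38.9 * 10.0      # maximum integrated TLD dose: 38.9 cGy = 389 mGy
      exposure_days = 335             # assumed ~11-month EURECA exposure
      print(round(max_dose_mgy / exposure_days, 2))  # ~1.16 mGy/day, matching the quoted maximum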

  10. Quantum Computers

    DTIC Science & Technology

    2010-03-04

    1227–1230 (2009). 31. Olmschenk, S. et al. Quantum teleportation between distant matter qubits. Science 323, 486–489 (2009). 32. Dür, W., Briegel, H...REVIEWS Quantum computers T. D. Ladd, F. Jelezko, R. Laflamme, Y. Nakamura, C. Monroe & J. L. O'Brien Over the past several decades... quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing

  11. Computer grants

    NASA Astrophysics Data System (ADS)

    The Computer and Information Science and Engineering Directorate of the National Science Foundation will offer educational supplements to CISE grants in Fiscal Year 1990. The purpose of the supplements is to establish closer links between CISE-supported research and undergraduate education and to accelerate transfer into the classroom of research results from work done under existing research grants. Any principal investigator with an active NSF research award from a program in the CISE Directorate can apply for an educational supplement. Proposals should be for creative activities to improve education, not for research.

  12. Computer vision

    NASA Technical Reports Server (NTRS)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  13. Computational crystallization.

    PubMed

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  14. To get the most out of high resolution X-ray tomography: A review of the post-reconstruction analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yijin; Kiss, Andrew M.; Larsson, Daniel H.; Yang, Feifei; Pianetta, Piero

    2016-03-01

    X-ray microscopy has been well-recognized as one of the most important techniques for research in a wide range of scientific disciplines including materials science, geoscience, and bio-medical science. Advances in X-ray sources, optics, detectors, and imaging methodologies have made significant improvements to non-destructive reconstructions of the three dimensional (3D) structure of specimens over a wide range of length scales with different contrast mechanisms. A strength of 3D imaging is a "seeing is believing" way of reporting and analyzing data to better understand the structure/function characteristics of a sample. In addition to the excellent visualization capability, X-ray computed tomography has a lot more to offer. In this article, we review some of the experimental and analytical methods that enrich and extract scientifically relevant information from tomographic data. Several scientific cases are discussed along with how they enhance the tomographic dataset.

  15. Iterative Noise Elimination Preliminary Report.

    DTIC Science & Technology

    1983-01-01

    there is no very good way to remove them. The purpose of the present report is to describe the procedure and to show the results of a series of tests with data from a computed tomography x-ray scan of a defective battery.

  16. Preliminary design of JEM ECLSS and TCS

    NASA Astrophysics Data System (ADS)

    Suzuki, T.; Otsuki, F.; Suzuki, K.; Shibutani, S.; Hattori, A.; Sasayama, H.; Inoue, M.; Sugai, W.

    1991-12-01

    Preliminary design of the Japanese Experiment Module (JEM), which will be attached to the U.S. Space Station Freedom (SSF), began in early 1991. The target of the design activities is the Preliminary Design Review (PDR) planned in the beginning of 1992. The pressurized module of the JEM is composed of several subsystems including the Environmental Control and Life Support System (ECLSS), the Thermal Control System (TCS), mechanical equipment (airlock), the data management system, the communications and tracking system, the electrical power system, and the experiment support system. The preliminary design of the ECLSS and TCS, including the baseline configurations and the design requirements, is described.

  17. Getting Started: Planning and Implementing Computer Instruction in Schools.

    ERIC Educational Resources Information Center

    Mojkowski, Charles

    Intended for use by local school district staff, this manual provides step-by-step guidelines for planning and implementing the incorporation of computer-based instruction in the curriculum. Procedures involved in five phases of planning and implementation are outlined, covering: (1) preliminary planning, where schools should establish planning…

  18. Working Together: Computers and People with Mobility Impairments.

    ERIC Educational Resources Information Center

    Washington Univ., Seattle.

    This brief paper describes several computing tools that have been effectively used by individuals with mobility impairments. Emphasis is on tasks to be completed and how the individual's abilities (not disabilities), with possible assistance from technology, can be used to accomplish them. Preliminary information addresses the importance of…

  19. Computational introspection

    SciTech Connect

    Batali, J.

    1983-02-01

    Introspection is the process of thinking about one's own thoughts and feelings. In this paper, the author discusses recent attempts to make computational systems that exhibit introspective behavior. Each presents a system capable of manipulating representations of its own program and current context. He argues that introspective ability is crucial for intelligent systems--without it an agent cannot represent certain problems that it must be able to solve. A theory of intelligent action would describe how and why certain actions intelligently achieve an agent's goals. The agent would both embody and represent this theory: it would be implemented as the program for the agent; and the importance of introspection suggests that the agent represent its theory of action to itself.

  20. Computer vision

    SciTech Connect

    Not Available

    1982-01-01

    This paper discusses material from areas such as artificial intelligence, psychology, computer graphics, and image processing. The intent is to assemble a selection of this material in a form that will serve both as a senior/graduate-level academic text and as a useful reference to those building vision systems. This book has a strong artificial intelligence flavour, emphasising the belief that both the intrinsic image information and the internal model of the world are important in successful vision systems. The book is organised into four parts, based on descriptions of objects at four different levels of abstraction. These are: generalised images-images and image-like entities; segmented images-images organised into subimages that are likely to correspond to interesting objects; geometric structures-quantitative models of image and world structures; relational structures-complex symbolic descriptions of image and world structures. The book contains author and subject indexes.

  1. Computational micromechanics

    NASA Astrophysics Data System (ADS)

    Ortiz, M.

    1996-09-01

    Selected issues in computational micromechanics are reviewed, with particular emphasis on multiple-scale problems and micromechanical models of material behavior. Examples considered include: the bridging of atomistic and continuum scales, with application to nanoindentation and the brittle-to-ductile transition; the development of dislocation-based constitutive relations for pure metallic crystals and intermetallic compounds, with applications to fracture of single crystals and bicrystals; the simulation of non-planar three-dimensional crack growth at the microscale, with application to mixed-mode I/III effective behavior and crack trapping and bridging in fiber-reinforced composites; and the direct micromechanical simulation of fragmentation of brittle solids and subsequent flow of the comminuted phase.

  2. Space station preliminary design report

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The results of a 3 month preliminary design and analysis effort are presented. The configuration that emerged consists of a very stiff deployable truss structure with an overall triangular cross section having universal modules attached at the apexes. Sufficient analysis was performed to show feasibility of the configuration. An evaluation of the structure shows that desirable attributes of the configuration are: (1) the solar cells, radiators, and antennas will be mounted to stiff structure to minimize control problems during orbit maintenance and correction, docking, and attitude control; (2) large flat areas are available for mounting and servicing of equipment; (3) large mass items can be mounted near the center of gravity of the system to minimize gravity gradient torques; (4) the trusses are lightweight structures and can be transported into orbit in one Shuttle flight; (5) the trusses are expandable and will require a minimum of EVA; and (6) the modules are anticipated to be structurally identical except for internal equipment to minimize cost.

  3. Advanced space engine preliminary design

    NASA Technical Reports Server (NTRS)

    Cuffe, J. P. B.; Bradie, R. E.

    1973-01-01

    A preliminary design was completed for an O2/H2, 89 kN (20,000 lb) thrust staged combustion rocket engine that has a single-bell nozzle with an overall expansion ratio of 400:1. The engine has a best estimate vacuum specific impulse of 4623.8 N-s/kg (471.5 sec) at full thrust and mixture ratio = 6.0. The engine employs gear-driven, low pressure pumps to provide low NPSH capability while individual turbine-driven, high-speed main pumps provide the system pressures required for high-chamber pressure operation. The engine design dry weight for the fixed-nozzle configuration is 206.9 kg (456.3 lb). Engine overall length is 234 cm (92.1 in.). The extendible nozzle version has a stowed length of 141.5 cm (55.7 in.). Critical technology items in the development of the engine were defined. Development program plans and their costs for development, production, operation, and flight support of the ASE were established for minimum cost and minimum time programs.
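
    The two specific-impulse figures quoted above are the same quantity in different units; dividing the mass-specific impulse (numerically equal to the effective exhaust velocity in m/s) by standard gravity recovers the value in seconds, as the short check below shows.

      isp_mass_specific = 4623.8     # vacuum specific impulse, N-s/kg (= effective exhaust velocity, m/s)
      g0 = 9.80665                   # standard gravity, m/s^2
      print(round(isp_mass_specific / g0, 1))  # ~471.5 s, matching the quoted value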

  4. Stardust Interstellar Preliminary Examination (ISPE)

    NASA Technical Reports Server (NTRS)

    Westphal, A. J.; Allen, C.; Bajt, S.; Basset, R.; Bastien, R.; Bechtel, H.; Bleuet, P.; Borg, J.; Brenker F.; Bridges, J.

    2009-01-01

    In January 2006 the Stardust sample return capsule returned to Earth bearing the first solid samples from a primitive solar system body, Comet 81P/Wild2, and a collector dedicated to the capture and return of contemporary interstellar dust. Both collectors were approximately 0.1 m² in area and were composed of aerogel tiles (85% of the collecting area) and aluminum foils. The Stardust Interstellar Dust Collector (SIDC) was exposed to the interstellar dust stream for a total exposure factor of 20 m²-day during two periods before the cometary encounter. The Stardust Interstellar Preliminary Examination (ISPE) is a three-year effort to characterize the collection using nondestructive techniques. The ISPE consists of six interdependent projects: (1) Candidate identification through automated digital microscopy and a massively distributed, calibrated search; (2) Candidate extraction and photodocumentation; (3) Characterization of candidates through synchrotron-based Fourier-Transform Infrared Spectroscopy (FTIR), Scanning X-Ray Fluorescence Microscopy (SXRF), and Scanning Transmission X-ray Microscopy (STXM); (4) Search for and analysis of craters in foils through FESEM scanning, Auger Spectroscopy, and synchrotron-based Photoemission Electron Microscopy (PEEM); (5) Modeling of interstellar dust transport in the solar system; and (6) Laboratory simulations of hypervelocity dust impacts into the collecting media.

  5. EUPORIAS: plans and preliminary results

    NASA Astrophysics Data System (ADS)

    Buontempo, C.

    2013-12-01

    Recent advances in our understanding and ability to forecast climate variability have meant that skilful predictions are beginning to be routinely made on seasonal to decadal (s2d) timescales. Such forecasts have the potential to be of great value to a wide range of decision-making, where outcomes are strongly influenced by variations in the climate. In 2012 the European Commission funded EUPORIAS, a four year long project to develop prototype end-to-end climate impact prediction services operating on a seasonal to decadal timescale, and assess their value in informing decision-making. EUPORIAS commenced on 1 November 2012, coordinated by the UK Met Office leading a consortium of 24 organisations representing world-class European climate research and climate service centres, expertise in impacts assessments and seasonal predictions, two United Nations agencies, specialists in new media, and commercial companies in climate-vulnerable sectors such as energy, water and tourism. The poster describes the setup of the project, its main outcome and some of the very preliminary results.

  6. Preliminary Iron Distribution on Vesta

    NASA Technical Reports Server (NTRS)

    Mittlefehldt, David W.; Mittlefehldt, David W.

    2013-01-01

    The distribution of iron on the surface of the asteroid Vesta was investigated using Dawn's Gamma Ray and Neutron Detector (GRaND) [1,2]. Iron varies predictably with rock type for the howardite, eucrite, and diogenite (HED) meteorites, thought to be representative of Vesta. The abundance of Fe in howardites ranges from about 12 to 15 wt.%. Basaltic eucrites have the highest abundance, whereas lower crustal and upper mantle materials (cumulate eucrites and diogenites) have the lowest, and howardites are intermediate [3]. We have completed a mapping study of 7.6 MeV gamma rays produced by neutron capture by Fe as measured by the bismuth germanate (BGO) detector of GRaND [1]. The procedures to determine Fe counting rates are presented in detail here, along with a preliminary distribution map, constituting the necessary initial step to quantification of Fe abundances. We find that the global distribution of Fe counting rates is generally consistent with independent mineralogical and compositional inferences obtained by other instruments on Dawn, such as measurements of pyroxene absorption bands by the Visual and Infrared Spectrometer (VIR) [4] and Framing Camera (FC) [5] and neutron absorption measurements by GRaND [6].

  7. Preliminary results of ANAIS-25

    NASA Astrophysics Data System (ADS)

    Amaré, J.; Cebrián, S.; Cuesta, C.; García, E.; Ginestra, C.; Martínez, M.; Oliván, M. A.; Ortigoza, Y.; Ortiz de Solórzano, A.; Pobes, C.; Puimedón, J.; Sarsa, M. L.; Villar, J. A.; Villar, P.

    2014-04-01

    The ANAIS (Annual Modulation with NaI(Tl) Scintillators) experiment aims at the confirmation of the DAMA/LIBRA signal using the same target and technique at the Canfranc Underground Laboratory. 250 kg of ultrapure NaI(Tl) crystals will be used as a target, divided into 20 modules, each coupled to two photomultipliers. Two NaI(Tl) crystals of 12.5 kg each, grown by Alpha Spectra from a powder having a potassium level under the limit of our analytical techniques, form the ANAIS-25 set-up. The background contributions are being carefully studied and preliminary results are presented: their natural potassium content in the bulk has been quantified, as well as the uranium and thorium radioactive chains presence in the bulk through the discrimination of the corresponding alpha events by PSA, and due to the fast commissioning, the contribution from cosmogenic activated isotopes is clearly identified and their decay observed along the first months of data taking. Following the procedures established with ANAIS-0 and previous prototypes, bulk NaI(Tl) scintillation events selection and light collection efficiency have been also studied in ANAIS-25.

  8. San Mateo Creek Basin Preliminary Assessment

    EPA Pesticide Factsheets

    The objective of this Preliminary Assessment is to evaluate the site using the Hazard Ranking System and the Superfund Chemical Data Matrix to determine if a threat to human health and the environment exists such that further action is warranted.

  9. 40 CFR 158.345 - Preliminary analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.345 Preliminary analysis. (a) If the product is produced by... point in the production process after which no further chemical reactions designed to produce or...

  10. 29 CFR 1955.31 - Preliminary conference.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) PROCEDURES FOR WITHDRAWAL OF APPROVAL OF STATE PLANS Preliminary Conference and Discovery § 1955... those not disposed of by admissions or agreements, and control the subsequent course of the...

  11. 43 CFR 3425.1-7 - Preliminary data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Preliminary data. 3425.1-7 Section 3425.1... Preliminary data. (a) Any application for a lease shall contain preliminary data to assist the authorized... preliminary data shall include: (1) A map, or maps, showing the topography, physical features and...

  12. Podcasting: A Preliminary Classroom Study

    ERIC Educational Resources Information Center

    Aristizabal, Alexander

    2009-01-01

    Podcasting is a term introduced through the use of Apple Computer, Inc.'s iPod, a term which denotes how a portable audio player can be used to download audio files, mostly MP3s, and be heard at the user's convenience. Initially such an operation was intended for entertainment; however, it has proven itself to be an important tool in the field of…

  13. Preliminary LISA Telescope Spacer Design

    NASA Technical Reports Server (NTRS)

    Livas, J.; Arsenovic, P.; Catellucci, K.; Generie, J.; Howard, J.; Stebbins, R. T.

    2010-01-01

    The Laser Interferometric Space Antenna (LISA) mission observes gravitational waves by measuring the separations between freely floating proof masses located 5 million kilometers apart with an accuracy of approximately 10 picometers. The separations are measured interferometrically. The telescope is an afocal Cassegrain-style design with a magnification of 80x. The entrance pupil has a 40 cm diameter and will either be centered on-axis or de-centered off-axis to avoid obscurations. Its two main purposes are to transform the small diameter beam used on the optical bench to a diffraction limited collimated beam to efficiently transfer the metrology laser between spacecraft, and to receive the incoming light from the far spacecraft. It transmits and receives simultaneously. The basic optical design and requirements are well understood for a conventional telescope design for imaging applications, but the LISA design is complicated by the additional requirement that the total optical path through the telescope must remain stable at the picometer level over the measurement band during the mission to meet the measurement accuracy. This poster describes the requirements for the telescope and the preliminary work that has been done to understand the materials and mechanical issues associated with the design of a passive metering structure to support the telescope and to maintain the spacing between the primary and secondary mirrors in the LISA on-orbit environment. This includes the requirements flowdown from the science goals, thermal modeling of the spacecraft and telescope to determine the expected temperature distribution, layout options for the telescope including an on-axis and an off-axis design, and plans for fabrication and testing.
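
    A small numerical consequence of the figures above: an afocal telescope with 80x magnification and a 40 cm entrance pupil compresses the received beam to roughly 5 mm on the optical bench (a simple check of the quoted numbers, not a statement of the bench-side beam size actually chosen for LISA).

      entrance_pupil_m = 0.40   # 40 cm entrance pupil quoted for the design
      magnification = 80.0      # afocal magnification of the Cassegrain-style telescope
      print(entrance_pupil_m / magnification * 1e3)  # 5.0 mm collimated beam on the bench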

  14. Preliminary design of a redundant strapped down inertial navigation unit using two-degree-of-freedom tuned-gimbal gyroscopes

    NASA Technical Reports Server (NTRS)

    1976-01-01

    This redundant strapdown INS preliminary design study demonstrates the practicality of a skewed sensor system configuration by means of: (1) devising a practical system mechanization utilizing proven strapdown instruments, (2) thoroughly analyzing the skewed sensor redundancy management concept to determine optimum geometry, data processing requirements, and realistic reliability estimates, and (3) implementing the redundant computers into a low-cost, maintainable configuration.

  15. Effect of Aging on the Mechanical Properties of Li-Ion Cell Components - A Preliminary Look

    SciTech Connect

    Cao, Lei; Zhang, Chao; Santhanagopalan, Shriram; Pesaran, Ahmad

    2016-05-03

    DOE/VTO/ES initiated the Computer Aided Engineering for Batteries (CAEBAT) in 2010. CAEBAT had a strong focus on building electrochemical-thermal models that simulate the performance of lithium-ion batteries. Since the start of CAEBAT-2 projects in FY14, our emphasis has been on safety aspects -- mechanical deformation in particular. This presentation gives a preliminary look at the effect of aging on the mechanical properties of lithium-ion cell components.

  16. Preliminary Analysis of STS-3 Entry Heat-Transfer Data for the Orbiter Windward Centerline

    NASA Technical Reports Server (NTRS)

    Throckmorton, D. A.; Hamilton, H. H., II; Zoby, E. V.

    1982-01-01

    A preliminary analysis of heat transfer data on the space shuttle orbiter windward centerline for the STS-3 mission entry is presented. Temperature-time history plots for each measurement location and tabulated wall temperature and convective heating rate data at 21 selected trajectory points are included. The STS-3 flight data are also compared with predictions by two approximation methods for computing convective heat transfer rates in equilibrium air.
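
    Engineering approximations of stagnation-region convective heating (for example, correlations of the Sutton-Graves or Fay-Riddell type; the two methods actually used in the report are not identified here) commonly reduce to the scaling q ~ sqrt(rho / R_n) * V^3. The sketch below uses only that proportionality to compare two hypothetical trajectory points, so no dimensional constant is needed.

      import math

      def heating_ratio(rho1, v1, rho2, v2, nose_radius_m=1.0):
          """Ratio of stagnation-point convective heating rates at two trajectory points
          under the common scaling q ~ sqrt(rho / R_n) * V^3 (same effective nose radius)."""
          q1 = math.sqrt(rho1 / nose_radius_m) * v1 ** 3
          q2 = math.sqrt(rho2 / nose_radius_m) * v2 ** 3
          return q2 / q1

      # Hypothetical points: higher-altitude/faster entry point vs. lower-altitude/slower point
      print(round(heating_ratio(rho1=1e-4, v1=7000.0, rho2=4e-4, v2=5000.0), 2))  # ~0.73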

  17. Muon-catalyzed fusion experiment target and detector system. Preliminary design report

    SciTech Connect

    Jones, S.E.; Watts, K.D.; Caffrey, A.J.; Walter, J.B.

    1982-03-01

    We present detailed plans for the target and particle detector systems for the muon-catalyzed fusion experiment. Requirements imposed on the target vessel by experimental conditions and safety considerations are delineated. Preliminary designs for the target vessel capsule and secondary containment vessel have been developed which meet these requirements. In addition, the particle detection system is outlined, including associated fast electronics and on-line data acquisition. Computer programs developed to study the target and detector system designs are described.

  18. Near-term hybrid vehicle program, phase 1. Appendix C: Preliminary design data package

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The design methodology, the design decision rationale, the vehicle preliminary design summary, and the advanced technology developments are presented. Also presented are the detailed vehicle design; the vehicle ride, handling, and front structural crashworthiness analysis; the microcomputer control of the propulsion system; the design study of the battery switching circuit, the field chopper, and the battery charger; and the recent program refinements and computer results.

  19. Operations analysis (study 2.6). Volume 4: Computer specification; logistics of orbiting vehicle servicing (LOVES)

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The computer specification for the logistics of orbiting vehicle servicing (LOVES) was developed, and a number of alternatives to improve utilization of the space shuttle and the tug were investigated. Preliminary results indicate that space servicing offers a potential for reducing future operational and program costs relative to ground refurbishment of satellites. A computer code which could be developed to simulate space servicing is presented.

  20. A preliminary design of the collinear dielectric wakefield accelerator

    NASA Astrophysics Data System (ADS)

    Zholents, A.; Gai, W.; Doran, S.; Lindberg, R.; Power, J. G.; Strelnikov, N.; Sun, Y.; Trakhtenberg, E.; Vasserman, I.; Jing, C.; Kanareykin, A.; Li, Y.; Gao, Q.; Shchegolkov, D. Y.; Simakov, E. I.

    2016-09-01

    A preliminary design of the multi-meter-long collinear dielectric wakefield accelerator that achieves a highly efficient transfer of the drive bunch energy to the wakefields and to the witness bunch is considered. It is made from 0.5 m long accelerator modules containing a vacuum chamber with dielectric-lined walls, a quadrupole wiggler, an rf coupler, and a BPM assembly. The single-bunch breakup instability is a major limiting factor for accelerator efficiency, and BNS damping is applied to obtain stable multi-meter-long propagation of the drive bunch. Numerical simulations using a 6D particle tracking computer code are performed and tolerances to various errors are defined.

  1. Phoebe: A preliminary control network and rotational elements

    NASA Technical Reports Server (NTRS)

    Colvin, Tim R.; Davies, Merton E.; Rogers, Patricia G.; Heller, Jeanne (Editor)

    1989-01-01

    A preliminary control network for the Saturnian satellite Phoebe was determined based upon 6 distinct albedo features mapped on 16 Voyager 2 images. Using an existing map and an analytical triangulation program which minimized the measurement error, the north pole of Phoebe was calculated to be alpha_0 = 355.0° ± 9.6°, delta_0 = 68.7° ± 7.9°, where alpha_0 and delta_0 are standard equatorial coordinates with equinox J2000 at epoch J2000. The prime meridian of Phoebe was computed to be W = 304.7° + 930.833872° d, where d is the interval in days from JD 2451545.0 TDB.
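
    As a small illustration (not part of the report), the quoted rotational elements can be evaluated directly; the day count below is an arbitrary example.

        def phoebe_prime_meridian_deg(days_since_j2000):
            """W = 304.7 + 930.833872 * d (degrees), with d in days from JD 2451545.0 TDB."""
            return (304.7 + 930.833872 * days_since_j2000) % 360.0

        # Arbitrary example: orientation of the prime meridian 100 days after the J2000 epoch.
        print(round(phoebe_prime_meridian_deg(100.0), 3))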

  2. Preliminary Numerical and Experimental Analysis of the Spallation Phenomenon

    NASA Technical Reports Server (NTRS)

    Martin, Alexandre; Bailey, Sean C. C.; Panerai, Francesco; Davuluri, Raghava S. C.; Vazsonyi, Alexander R.; Zhang, Huaibao; Lippay, Zachary S.; Mansour, Nagi N.; Inman, Jennifer A.; Bathel, Brett F.; Splinter, Scott C.; Danehy, Paul M.

    2015-01-01

    The spallation phenomenon was studied through numerical analysis using a coupled Lagrangian particle tracking code and a hypersonic aerothermodynamics computational fluid dynamics solver. The results show that carbon emission from spalled particles results in a significant modification of the gas composition of the post-shock layer. Preliminary results from a test campaign at the NASA Langley HYMETS facility are presented. Using automated image processing of high-speed images, two-dimensional velocity vectors of the spalled particles were calculated. In a 30 second test at 100 W/cm2 of cold-wall heat flux, more than 1300 particles were detected, with an average velocity of 102 m/s and a most frequently observed velocity of 60 m/s.

  3. Demo III processing architecture trades and preliminary design

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.; Cory, Phil; Peterman, Pete

    1999-01-01

    This paper summarizes the methodology, metrics, analysis, and trade study efforts for the preliminary design of the Vetronics Processing Architecture (PA) system, based on the Demo III Experimental Unmanned Ground Vehicle (XUV) program requirements. We document and describe both the provided and the analytically derived system requirements expressed in the proposal. Our experience with previous mobility and Reconnaissance, Surveillance, Targeting, and Acquisition systems designed and implemented for the Demo II Semi-Autonomous Surrogate Vehicle and the Mobile Detection, Assessment and Response System is used to describe lessons learned as applied to the XUV, covering the PA architecture, single board computers, card cage buses, real-time and non-real-time processors, card-cage-to-card-cage communications, and the selection of imaging and radar pre-processors. We have selected an initial architecture methodology.

  4. Aerodynamic preliminary analysis system. Part 1: Theory. [linearized potential theory

    NASA Technical Reports Server (NTRS)

    Bonner, E.; Clever, W.; Dunn, K.

    1978-01-01

    A comprehensive aerodynamic analysis program based on linearized potential theory is described. The solution treats thickness and attitude problems at subsonic and supersonic speeds. Three dimensional configurations with or without jet flaps having multiple non-planar surfaces of arbitrary planform and open or closed slender bodies of non-circular contour may be analyzed. Longitudinal and lateral-directional static and rotary derivative solutions may be generated. The analysis was implemented on a time sharing system in conjunction with an input tablet digitizer and an interactive graphics input/output display and editing terminal to maximize its responsiveness to the preliminary analysis problem. Nominal case computation time of 45 CPU seconds on the CDC 175 for a 200 panel simulation indicates the program provides an efficient analysis for systematically performing various aerodynamic configuration tradeoff and evaluation studies.

  5. Synthetic Light Curves for Born Again Events: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Miller Bertolami, M. M.; Rohrmann, R. D.

    2013-01-01

    The development of surveys able to cover a large region of the sky several times per year will allow the detection of large numbers of transient events taking place on timescales of years. In addition, the projected full digitization of the Harvard plate collection will open a new window on the identification of slow transients taking place on timescales of centuries. In particular, these projects will allow the detection of stars undergoing slow eruptions, such as those expected during late helium flashes in post-AGB evolution. In order to identify the transients that correspond to late helium flashes, the development of synthetic light curves of these events is mandatory. In this connection we present preliminary results of a project aimed at computing grids of theoretical light curves of born again stars.
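
    As a toy illustration of the kind of quantity such light-curve grids are built from (not a model result from the project), the sketch below converts an assumed photospheric radius and effective temperature into a bolometric magnitude via L = 4*pi*R^2*sigma*T_eff^4; both input values are illustrative assumptions.

        import math

        SIGMA_SB = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
        L_SUN = 3.828e26      # solar luminosity, W
        R_SUN = 6.957e8       # solar radius, m
        MBOL_SUN = 4.74       # solar bolometric magnitude

        def bolometric_magnitude(radius_rsun, teff_k):
            """Blackbody luminosity L = 4*pi*R^2*sigma*T^4, converted to a bolometric magnitude."""
            lum = 4.0 * math.pi * (radius_rsun * R_SUN) ** 2 * SIGMA_SB * teff_k ** 4
            return MBOL_SUN - 2.5 * math.log10(lum / L_SUN)

        # Illustrative photospheric values only, not model output from the project.
        print(f"M_bol ~ {bolometric_magnitude(10.0, 6000.0):.2f}")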

  6. ERIS: preliminary design phase overview

    NASA Astrophysics Data System (ADS)

    Kuntschner, Harald; Jochum, Lieselotte; Amico, Paola; Dekker, Johannes K.; Kerber, Florian; Marchetti, Enrico; Accardo, Matteo; Brast, Roland; Brinkmann, Martin; Conzelmann, Ralf D.; Delabre, Bernard A.; Duchateau, Michel; Fedrigo, Enrico; Finger, Gert; Frank, Christoph; Rodriguez, Fernando G.; Klein, Barbara; Knudstrup, Jens; Le Louarn, Miska; Lundin, Lars; Modigliani, Andrea; Müller, Michael; Neeser, Mark; Tordo, Sebastien; Valenti, Elena; Eisenhauer, Frank; Sturm, Eckhard; Feuchtgruber, Helmut; George, Elisabeth M.; Hartl, Michael; Hofmann, Reiner; Huber, Heinrich; Plattner, Markus P.; Schubert, Josef; Tarantik, Karl; Wiezorrek, Erich; Meyer, Michael R.; Quanz, Sascha P.; Glauser, Adrian M.; Weisz, Harald; Esposito, Simone; Xompero, Marco; Agapito, Guido; Antichi, Jacopo; Biliotti, Valdemaro; Bonaglia, Marco; Briguglio, Runa; Carbonaro, Luca; Cresci, Giovanni; Fini, Luca; Pinna, Enrico; Puglisi, Alfio T.; Quirós-Pacheco, Fernando; Riccardi, Armando; Di Rico, Gianluca; Arcidiacono, Carmelo; Dolci, Mauro

    2014-07-01

    The Enhanced Resolution Imager and Spectrograph (ERIS) is the next-generation adaptive optics near-IR imager and spectrograph for the Cassegrain focus of the Very Large Telescope (VLT) Unit Telescope 4, which will soon make full use of the Adaptive Optics Facility (AOF). It is a high-Strehl AO-assisted instrument that will use the Deformable Secondary Mirror (DSM) and the new Laser Guide Star Facility (4LGSF). The project has been approved for construction and has entered its preliminary design phase. ERIS will be constructed in a collaboration including the Max-Planck-Institut für Extraterrestrische Physik, the Eidgenössische Technische Hochschule Zürich, and the Osservatorio Astrofisico di Arcetri, and will offer 1 - 5 μm imaging and 1 - 2.5 μm integral field spectroscopic capabilities with a high Strehl performance. Wavefront sensing can be carried out with an optical high-order NGS Pyramid wavefront sensor, or with a single laser in either an optical low-order NGS mode, or with a near-IR low-order mode sensor. Due to its highly sensitive visible wavefront sensor, and separate near-IR low-order mode, ERIS provides a large sky coverage with its 1' patrol field radius that can even include AO stars embedded in dust-enshrouded environments. As such it will replace, with a much improved single conjugated AO correction, the most scientifically important imaging modes offered by NACO (diffraction limited imaging in the J to M bands, Sparse Aperture Masking and Apodizing Phase Plate (APP) coronagraphy) and the integral field spectroscopy modes of SINFONI, whose instrumental module, SPIFFI, will be upgraded and re-used in ERIS. As part of the SPIFFI upgrade a new higher resolution grating and a science detector replacement are envisaged, as well as PLC driven motors. To accommodate ERIS at the Cassegrain focus, an extension of the telescope back focal length is required, with modifications of the guider arm assembly. In this paper we report on the status of the preliminary design.

  7. Very preliminary reference Moon model

    NASA Astrophysics Data System (ADS)

    Garcia, Raphaël F.; Gagnepain-Beyneix, Jeannine; Chevrot, Sébastien; Lognonné, Philippe

    2011-09-01

    The deep structure of the Moon is a missing piece in understanding the formation and evolution of the Earth-Moon system. Despite the great amount of information brought by the Apollo passive seismic experiment (ALSEP), the lunar structure below deep moonquakes, which occur around 900 km depth, remains largely unknown. We construct a reference Moon model which incorporates physical constraints, and fits both geodetic (lunar mass and polar moment of inertia, and Love numbers) and seismological (body wave arrivals measured by the Apollo network) data. In this model, the core radius is constrained by the detection of S waves reflected from the core. In a first step, for each core radius, a radial model of the lunar interior, including P and S wave velocities and density, is inverted from seismic and geodetic data. In a second step, the core radius is determined from the detection of shear waves reflected on the lunar core by waveform stacking of deep moonquake Apollo records. This detection has been made possible by careful data selection and processing, including a correction of the gain of horizontal sensors based on the principle of energy equipartition inside the coda of lunar seismic records, and a precise alignment of SH waveforms by a non-linear inversion method. The Very Preliminary REference MOON model (VPREMOON) obtained here has a core radius of 380 ± 40 km and an average core mass density of 5200 ± 1000 kg/m^3. The large error bars on these estimates are due to the poorly constrained S-wave velocity profile at the base of the mantle and to mislocation errors of deep moonquakes. The detection of horizontally polarized S waves reflected from the core and the absence of detection of vertically polarized S waves favour a liquid state in the outermost part of the core. All these results are consistent, within their error bars, with previous estimates based on lunar rotation dissipation (Williams et al., 2001) and on lunar induced magnetic moment (Hood et al., 1999).
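
    A back-of-envelope check (not a calculation from the paper) of what the quoted core radius and density imply for the core mass; the total lunar mass used for the ratio is a standard value assumed here.

        import math

        def sphere_mass(radius_m, density_kg_m3):
            """Mass of a homogeneous sphere, used here as a crude stand-in for the lunar core."""
            return (4.0 / 3.0) * math.pi * radius_m ** 3 * density_kg_m3

        core_mass = sphere_mass(380e3, 5200.0)   # R = 380 km, rho = 5200 kg/m^3 (quoted above)
        moon_mass = 7.342e22                     # total lunar mass in kg (standard value, assumed)
        print(f"core mass ~ {core_mass:.2e} kg, ~{100 * core_mass / moon_mass:.1f}% of the Moon")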

  8. Preliminary structural sizing of a Mach 3.0 high-speed civil transport model

    NASA Technical Reports Server (NTRS)

    Blackburn, Charles L.

    1992-01-01

    An analysis has been performed pertaining to the structural resizing of a candidate Mach 3.0 High Speed Civil Transport (HSCT) conceptual design using a computer program called EZDESIT. EZDESIT is a computer program which integrates the PATRAN finite element modeling program with the COMET finite element analysis program for the purpose of calculating element sizes or cross-sectional dimensions. The purpose of the present report is to document the procedure used in accomplishing the preliminary structural sizing and to present the corresponding results.

  9. The Computer Aided Aircraft-design Package (CAAP)

    NASA Technical Reports Server (NTRS)

    Yalif, Guy U.

    1994-01-01

    The preliminary design of an aircraft is a complex, labor-intensive, and creative process. Since the 1970s, many computer programs have been written to help automate preliminary airplane design. Time and resource analyses have identified 'a substantial decrease in project duration with the introduction of an automated design capability'. Proof-of-concept studies have been completed which establish 'a foundation for a computer-based airframe design capability'. Unfortunately, today's design codes exist in many different languages on many, often expensive, hardware platforms. Through the use of a module-based system architecture, the Computer aided Aircraft-design Package (CAAP) will eventually bring together many of the most useful features of existing programs. Through the use of an expert system, it will add an additional feature that could be described as indispensable to entry-level engineers and students: the incorporation of 'expert' knowledge into the automated design process.

  10. The engineering design integration (EDIN) system. [digital computer program complex

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Reiners, S. J.

    1974-01-01

    A digital computer program complex for the evaluation of aerospace vehicle preliminary designs is described. The system consists of a Univac 1100 series computer and peripherals using the Exec 8 operating system, a set of demand access terminals of the alphanumeric and graphics types, and a library of independent computer programs. Modification of the partial run streams, data base maintenance and construction, and control of program sequencing are provided by a data manipulation program called the DLG processor. The executive control of library program execution is performed by the Univac Exec 8 operating system through a user established run stream. A combination of demand and batch operations is employed in the evaluation of preliminary designs. Applications accomplished with the EDIN system are described.

  11. Specialized computer architectures for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general purpose computers, a cost that is high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  12. Visualizing ultrasound through computational modeling

    NASA Technical Reports Server (NTRS)

    Guo, Theresa W.

    2004-01-01

    The Doppler Ultrasound Hematocrit Project (DHP) hopes to find non-invasive methods of determining a person's blood characteristics. Because of the limits of microgravity and the space travel environment, it is important to find non-invasive methods of evaluating the health of persons in space. Presently, there is no well-developed method of determining blood composition non-invasively. This project hopes to use ultrasound and Doppler signals to evaluate hematocrit, the percentage by volume of red blood cells within whole blood. These non-invasive techniques may also be developed for use on Earth for trauma patients where invasive measures might be detrimental. Computational modeling is a useful tool for collecting preliminary information and predictions for the laboratory research. We hope to find and develop a computer program that will be able to simulate the ultrasound signals the project will work with. Simulated models of test conditions will more easily show what might be expected from laboratory results and thus help the research group make informed decisions before and during experimentation. There are several existing Matlab-based computer programs available, designed to interpret and simulate ultrasound signals. These programs will be evaluated to find which is best suited for the project needs. The criteria of evaluation are: (1) the program must be able to specify transducer properties and specify transmitting and receiving signals; (2) the program must be able to simulate ultrasound signals through different attenuating mediums; (3) the program must be able to process moving targets in order to simulate the Doppler effects that are associated with blood flow; (4) the program should be user friendly and adaptable to various models. After a computer program is chosen, two simulation models will be constructed. These models will simulate and interpret an RF data signal and a Doppler signal.
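
    As a minimal sketch (not part of the DHP work itself) of the relation any such simulator must reproduce, the code below evaluates the classical ultrasound Doppler-shift formula; the carrier frequency, beam angle, and flow speed are illustrative assumptions.

        import math

        def doppler_shift(f0_hz, v_mps, theta_deg, c_mps=1540.0):
            """Classical ultrasound Doppler shift for a scatterer moving at v_mps.

            f0_hz     : transmitted carrier frequency
            v_mps     : blood (scatterer) speed along the flow direction
            theta_deg : angle between the ultrasound beam and the flow
            c_mps     : speed of sound in tissue (~1540 m/s is a common assumption)
            """
            return 2.0 * f0_hz * v_mps * math.cos(math.radians(theta_deg)) / c_mps

        # Illustrative values only: a 5 MHz transducer, 0.5 m/s flow, 60 degree beam angle.
        print(f"{doppler_shift(5e6, 0.5, 60.0):.0f} Hz")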

  13. Heliogyro Preliminary Design, Phase 2

    NASA Technical Reports Server (NTRS)

    1978-01-01

    There are 12 blades in the Heliogyro design, and each blade is envisioned to be 8 meters in width and 7,500 meters in length. The blades are expected to be composed primarily of a thin membrane constructed of material such as Kapton film with an aluminum reflective coating on one side and an infrared emissive coating on the other. The present Phase 2 Final Report covers work done on the following six topics: (1) Design and analysis of a stowable circular lattice batten for the Heliogyro blade. (2) Design and analysis of a biaxially tensioned blade panel. (3) Definition of a research program for micrometeoroid damage to tendons. (4) A conceptual design for a flight test model of the Heliogyro. (5) Definition of modifications to the NASTRAN computer program required to provide improved analysis of the Heliogyro. (6) A User's Manual covering applications of NASTRAN to the Heliogyro.

  14. COMSAC: Computational Methods for Stability and Control. Part 1

    NASA Technical Reports Server (NTRS)

    Fremaux, C. Michael (Compiler); Hall, Robert M. (Compiler)

    2004-01-01

    Work on stability and control included the following reports: Introductory Remarks; Introduction to Computational Methods for Stability and Control (COMSAC); Stability & Control Challenges for COMSAC: A NASA Langley Perspective; Emerging CFD Capabilities and Outlook: A NASA Langley Perspective; The Role for Computational Fluid Dynamics for Stability and Control: Is it Time?; Northrop Grumman Perspective on COMSAC; Boeing Integrated Defense Systems Perspective on COMSAC; Computational Methods in Stability and Control: WPAFB Perspective; Perspective: Raytheon Aircraft Company; A Greybeard's View of the State of Aerodynamic Prediction; Computational Methods for Stability and Control: A Perspective; Boeing TacAir Stability and Control Issues for Computational Fluid Dynamics; NAVAIR S&C Issues for CFD; An S&C Perspective on CFD; Issues, Challenges & Payoffs: A Boeing User's Perspective on CFD for S&C; and Stability and Control in Computational Simulations for Conceptual and Preliminary Design: the Past, Today, and Future?

  15. Program Facilitates Distributed Computing

    NASA Technical Reports Server (NTRS)

    Hui, Joseph

    1993-01-01

    KNET computer program facilitates distribution of computing between UNIX-compatible local host computer and remote host computer, which may or may not be UNIX-compatible. Capable of automatic remote log-in. User communicates interactively with remote host computer. Data output from remote host computer directed to local screen, to local file, and/or to local process. Conversely, data input from keyboard, local file, or local process directed to remote host computer. Written in ANSI standard C language.

  16. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warrant additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  17. CAA: Computer Assisted Athletics.

    ERIC Educational Resources Information Center

    Hall, John H.

    Computers have been used in a variety of applications for athletics since the late 1950's. These have ranged from computer-controlled electric scoreboards to computer-designed pole vaulting poles. Described in this paper are a computer-based athletic injury reporting system and a computer-assisted football scouting system. The injury reporting…

  18. The assumptions of computing

    SciTech Connect

    Huggins, J.K.

    1994-12-31

    The use of computers, like any technological activity, is not content-neutral. Users of computers constantly interact with assumptions regarding worthwhile activity which are embedded in any computing system. Directly questioning these assumptions in the context of computing allows us to develop an understanding of responsible computing.

  19. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  20. The Old Computers' Home.

    ERIC Educational Resources Information Center

    Angier, Natalie

    1983-01-01

    The Computer Museum in Marlborough, Massachusetts houses old and not-so-old calculators, famous old computers and parts of computers, photographs and assorted memorabilia, computer-generated murals, and even a computer made of Tinkertoys that plays tick-tack-toe. The development of the museum and selected exhibits is described. (Author/JN)

  1. Computing the Profession.

    ERIC Educational Resources Information Center

    Denning, Peter J.

    1998-01-01

    Discussion of computing as a science and profession examines the chasm between computer scientists and users, barriers to the use and growth of computing, experimental computer science, computational science, software engineering, professional identity, professional practices, applications of technology, innovation, field boundaries, and…

  2. Tying into Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    Topics in this paper include: sources of computer programs, public domain software, copyright violations, purposes of computers in classrooms (drill/practice and interactive learning), computer assisted instruction, flow charts, and computer clubs (such as App-le-kations in Charlotte, North Carolina). A complete listing of two computer programs…

  3. Nondestructive assay of TRU waste using gamma-ray active and passive computed tomography

    SciTech Connect

    Roberson, G.P.; Decman, D.; Martz, H.; Keto, E.R.; Johansson, E.M.

    1995-10-04

    The authors have developed an active and passive computed tomography (A and PCT) scanner for assaying radioactive waste drums. Here they describe the hardware components of their system and the software used for data acquisition, gamma-ray spectroscopy analysis, and image reconstruction. They have measured the performance of the system using "mock" waste drums and calibrated radioactive sources. They also describe the results of measurements using this system to assay a real TRU waste drum with relatively low Pu content. The results are compared with X-ray NDE studies of the same TRU waste drum as well as assay results from segmented gamma scanner (SGS) measurements.

  4. Endodontic Management of a Maxillary First Molar with Seven Root Canals Using Spiral Computed Tomography

    PubMed Central

    Yadav, Hemant Kumar; Saini, Gaurav Kumar; Chhabra, Harpreet Singh; Panwar, Pratyaksha Singh

    2017-01-01

    The main objective of this case report is to present a rare root canal configuration of a maxillary molar with seven root canals: three mesiobuccal, two palatal, and two distobuccal canals, diagnosed during the treatment procedure and confirmed by spiral computed tomography. A thorough knowledge of root canal morphology, proper clinical and radiographic examination, and use of dental operating microscopes are necessary for successful clinical outcomes. This article highlights the variations in the morphology of the maxillary first molar and the use of the latest techniques in successful diagnosis and negotiation of the additional canals. PMID:28293666

  5. Computational thinking and thinking about computing

    PubMed Central

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing. PMID:18672462

  6. Preliminary melter performance assessment report

    SciTech Connect

    Elliott, M.L.; Eyler, L.L.; Mahoney, L.A.; Cooper, M.F.; Whitney, L.D.; Shafer, P.J.

    1994-08-01

    The Melter Performance Assessment activity, a component of the Pacific Northwest Laboratory's (PNL) Vitrification Technology Development (PVTD) effort, was designed to determine the impact of noble metals on the operational life of the reference Hanford Waste Vitrification Plant (HWVP) melter. The melter performance assessment consisted of several activities, including a literature review of all work done with noble metals in glass, gradient furnace testing to study the behavior of noble metals during the melting process, research-scale and engineering-scale melter testing to evaluate effects of noble metals on melter operation, and computer modeling that used the experimental data to predict effects of noble metals on the full-scale melter. Feed used in these tests simulated neutralized current acid waste (NCAW) feed. This report summarizes the results of the melter performance assessment and predicts the lifetime of the HWVP melter. It should be noted that this work was conducted before the recent Tri-Party Agreement changes, so the reference melter referred to here is the Defense Waste Processing Facility (DWPF) melter design.

  7. Computerized detection of lung nodules by CT for radiologic technologists in preliminary screening.

    PubMed

    Lee, Yongbum; Tsai, Du-Yih; Hokari, Hiroshi; Minagawa, Yasuko; Tsurumaki, Masaki; Hara, Takeshi; Fujita, Hiroshi

    2012-07-01

    In Japan, radiologists and radiologic technologists are endeavoring to improve the quality of lung CT screening. In particular, preliminary screening by radiologic technologists is expected to decrease radiologists' burden and improve the accuracy of CT screening. We considered that an application of computer-aided detection (CAD) would also be as useful in preliminary screening as in the radiologist's regular reading. Our purpose in this study was to investigate the potential of the application of CAD to preliminary screening. CAD software that we developed was applied to 17 lung CT scans that radiologic technologists had pre-interpreted. A radiologist recognized 29 lung nodules from the CT images, whereas radiologic technologists did not recognize 11 of the 29 nodules at their pre-reading. Our CAD software detected lung nodules at an accuracy of 100% (29/29), with 4.1 false positives per case. The 11 nodules that radiologic technologists did not recognize were included in the CAD-detected nodules. This result suggests that the application of CAD may aid radiologic technologists in their preliminary screening.
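
    As an illustration only (not the authors' evaluation software), the sketch below shows how the reported figures of merit follow from raw detection counts; the helper function and the total false-positive count are assumptions derived from the quoted per-case rate.

        def cad_summary(true_nodules, cad_true_hits, false_positives, n_cases):
            """Per-case sensitivity and false-positive rate from raw detection counts."""
            return cad_true_hits / true_nodules, false_positives / n_cases

        # Figures quoted above: 29 of 29 nodules detected and about 4.1 false positives
        # per case over 17 scans (so roughly 70 false positives in total).
        sens, fp = cad_summary(29, 29, 70, 17)
        print(f"sensitivity = {sens:.0%}, false positives per case = {fp:.1f}")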

  8. Susitna Hydroelectric Project: terrestrial environmental workshop and preliminary simulation model

    USGS Publications Warehouse

    Everitt, Robert R.; Sonntag, Nicholas C.; Auble, Gregory T.; Roelle, James E.; Gazey, William

    1982-01-01

    The technical feasibility, economic viability, and environmental impacts of a hydroelectric development project in the Susitna River Basin are being studied by Acres American, Inc. on behalf of the Alaska Power Authority. As part of these studies, Acres American recently contracted LGL Alaska Research Associates, Inc. to coordinate the terrestrial environmental studies being performed by the Alaska Department of Fish and Game and, as subcontractors to LGL, several University of Alaska research groups. LGL is responsible for further quantifying the potential impacts of the project on terrestrial wildlife and vegetation, and for developing a plan to mitigate adverse impacts on the terrestrial environment. The impact assessment and mitigation plan will be included as part of a license application to the Federal Energy Regulatory Commission (FERC) scheduled for the first quarter of 1983. The quantification of impacts, mitigation planning, and design of future research are being organized using a computer simulation modelling approach. Through a series of workshops attended by researchers, resource managers, and policy-makers, a computer model is being developed and refined for use in the quantification of impacts on terrestrial wildlife and vegetation, and for evaluating different mitigation measures such as habitat enhancement and the designation of replacement lands to be managed for wildlife habitat. This report describes the preliminary model developed at the first workshop held August 23-27, 1982, in Anchorage.

  9. Preliminary designs: passive solar manufactured housing. Technical status report

    SciTech Connect

    Not Available

    1980-05-12

    The criteria established to guide the development of the preliminary designs are listed. Three preliminary designs incorporating direct gain and/or sunspace are presented. Costs, drawings, and supporting calculations are included. (MHR)

  10. 37 CFR 1.484 - Conduct of international preliminary examination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... international preliminary examination will be conducted on inventions not previously searched by an International Searching Authority. (d) The International Preliminary Examining Authority will establish a... reply. (e) The written opinion established by the International Searching Authority under PCT Rule...

  11. 37 CFR 1.484 - Conduct of international preliminary examination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... international preliminary examination will be conducted on inventions not previously searched by an International Searching Authority. (d) The International Preliminary Examining Authority will establish a... reply. (e) The written opinion established by the International Searching Authority under PCT Rule...

  12. Seismic Hazard Maps for the Maltese Archipelago: Preliminary Results

    NASA Astrophysics Data System (ADS)

    D'Amico, S.; Panzera, F.; Galea, P. M.

    2013-12-01

    The Maltese islands form an archipelago of three major islands lying in the Sicily channel at about 140 km south of Sicily and 300 km north of Libya. So far very few investigations have been carried out on seismicity around the Maltese islands and no maps of seismic hazard for the archipelago are available. Assessing the seismic hazard for the region is currently of prime interest for the near-future development of industrial and touristic facilities as well as for urban expansion. A culture of seismic risk awareness has never really been developed in the country, and the public perception is that the islands are relatively safe, and that any earthquake phenomena are mild and infrequent. However, the Archipelago has been struck by several moderate/large events. Although recent constructions of a certain structural and strategic importance have been built according to high engineering standards, the same probably cannot be said for all residential buildings, many higher than 3 storeys, which have mushroomed rapidly in recent years. Such buildings are mostly of unreinforced masonry, with heavy concrete floor slabs, which are known to be highly vulnerable to even moderate ground shaking. We can surely state that in this context planning and design should be based on available national hazard maps. Unfortunately, these kinds of maps are not available for the Maltese islands. In this paper we attempt to compute a first and preliminary probabilistic seismic hazard assessment of the Maltese islands in terms of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at different periods. Seismic hazard has been computed using the Esteva-Cornell (1968) approach which is the most widely utilized probabilistic method. It is a zone-dependent approach: seismotectonic and geological data are used coupled with earthquake catalogues to identify seismogenic zones within which earthquakes occur at certain rates. Therefore the earthquake catalogues can be reduced to the
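
    As a sketch of the Esteva-Cornell style calculation named above (not the hazard model actually built for the Maltese study), the code below integrates a truncated Gutenberg-Richter magnitude distribution against a toy ground-motion relation to obtain an annual exceedance rate and a 50-year Poisson exceedance probability; every numerical input is a placeholder assumption.

        import math

        # Placeholder inputs (illustrative only): one seismogenic zone at a fixed distance,
        # a truncated Gutenberg-Richter recurrence law, and a toy ground-motion relation.
        NU = 0.05            # annual rate of events with M >= MMIN in the zone (assumed)
        BETA = 2.0           # Gutenberg-Richter beta = b * ln(10) (assumed)
        MMIN, MMAX = 4.5, 7.0
        DIST_KM = 100.0      # source-to-site distance (assumed)
        SIGMA = 0.6          # aleatory standard deviation of ln(PGA) (assumed)

        def ln_pga_median(m, r_km):
            """Toy ground-motion relation ln PGA[g] = c0 + c1*M - c2*ln(R); assumed coefficients."""
            return -3.5 + 0.9 * m - 1.2 * math.log(r_km)

        def normal_sf(x):
            """Survival function of the standard normal distribution."""
            return 0.5 * math.erfc(x / math.sqrt(2.0))

        def annual_exceedance_rate(pga_g, n_bins=50):
            """Cornell-style rate of exceeding pga_g, integrated over the magnitude distribution."""
            dm = (MMAX - MMIN) / n_bins
            rate = 0.0
            for i in range(n_bins):
                m = MMIN + (i + 0.5) * dm
                # Truncated exponential magnitude density.
                pdf = BETA * math.exp(-BETA * (m - MMIN)) / (1.0 - math.exp(-BETA * (MMAX - MMIN)))
                p_exceed = normal_sf((math.log(pga_g) - ln_pga_median(m, DIST_KM)) / SIGMA)
                rate += NU * pdf * p_exceed * dm
            return rate

        lam = annual_exceedance_rate(0.1)   # annual rate of exceeding a PGA of 0.1 g
        print(f"P(exceed 0.1 g in 50 yr) = {1 - math.exp(-50 * lam):.3f}")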

  13. Avoiding Computer Viruses.

    ERIC Educational Resources Information Center

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  14. Computer Viruses: An Overview.

    ERIC Educational Resources Information Center

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  15. Computer Literacy Revisited.

    ERIC Educational Resources Information Center

    Klassen, Daniel

    1983-01-01

    This examination of important trends in the field of computing and education identifies and discusses four key computer literacy goals for educators as well as obstacles facing educators concerned with computer literacy. Nine references are listed. (Author/MBR)

  16. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  17. Computers and Employment.

    ERIC Educational Resources Information Center

    McConnell, Sheila; And Others

    1996-01-01

    Includes "Role of Computers in Reshaping the Work Force" (McConnell); "Semiconductors" (Moris); "Computer Manufacturing" (Warnke); "Commercial Banking Transformed by Computer Technology" (Morisi); "Software, Engineering Industries: Threatened by Technological Change?" (Goodman); "Job Creation…

  18. Computers: Instruments of Change.

    ERIC Educational Resources Information Center

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  19. Environmentalists and the Computer.

    ERIC Educational Resources Information Center

    Baron, Robert C.

    1982-01-01

    Review characteristics, applications, and limitations of computers, including word processing, data/record keeping, scientific and industrial, and educational applications. Discusses misuse of computers and role of computers in environmental management. (JN)

  20. Polarized electrons in ELSA (preliminary results)

    NASA Astrophysics Data System (ADS)

    Nakamura, S.; von Drachenfels, W.; Durek, D.; Frommberger, F.; Hoffmann, M.; Husmann, D.; Kiel, B.; Klein, F. J.; Menze, D.; Michel, T.; Nakanishi, T.; Naumann, J.; Reichelt, T.; Steier, C.; Toyama, T.; Voigt, S.; Westermann, M.

    1998-01-01

    Polarized electrons have been accelerated in the electron stretcher accelerator ELSA for the first time. Up to 2.1 GeV the polarization of the electron beam supplied by the 120 keV polarized electron source has been measured with a Møller polarimeter. Preliminary results of polarization measurements at high energies and the performance of the source are presented.

  1. 32 CFR 1900.11 - Preliminary Information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TO CIA RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of Foia Requests § 1900.11 Preliminary Information. Members of the public shall address all communications to the CIA Coordinator as... Information Act and this regulation. CIA employees receiving a communication in the nature of a FOIA...

  2. 28 CFR 36.604 - Preliminary determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Preliminary determination. 36.604 Section 36.604 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION ON THE BASIS OF DISABILITY BY PUBLIC ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building...

  3. Preliminary Findings on Rural Homelessness in Ohio.

    ERIC Educational Resources Information Center

    First, Richard J.; And Others

    This report is designed to present preliminary findings from the first comprehensive study of rural homelessness in the United States. The study was conducted during the first 6 months of 1990, and data were collected from interviews with 921 homeless adults in 21 randomly selected rural counties in Ohio. The sample counties represent 26% of the…

  4. Secondary School Mathematics. Preliminary Version. Sample Chapters.

    ERIC Educational Resources Information Center

    Bell, Max S.; And Others

    This volume contains preliminary versions of five of the chapters prepared by the SMSG curriculum project for use in grades 7 and 8. The first four chapters and the tenth chapter in the sequence are presented. The sample chapters in this volume illustrate a number of aspects of the curriculum project: (1) association of ideas of number and space…

  5. National Mathematics Advisory Panel Preliminary Report

    ERIC Educational Resources Information Center

    US Department of Education, 2007

    2007-01-01

    The National Mathematics Advisory Panel was established within the Department of Education as part of the President's "American Competitiveness Initiative" through Executive Order 13398, April 18, 2006. This document fulfills the obligation of that order to issue a Preliminary Report no later than January 31, 2007. The Panel chose to divide into…

  6. A Preliminary "Basic Math Program" Proposal.

    ERIC Educational Resources Information Center

    Luhring, Richard

    This report proposes a preliminary basic math program that may help mitigate the student attrition problem at Contra Costa College (California). For the purposes of this proposal, Math 101, 115, and 118 are identified as courses to be included in the program. The essential features of this proposed curriculum are summarized as follows: (1) it…

  7. 32 CFR 1700.4 - Preliminary information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... receive a FOIA request shall expeditiously forward the request to the Director, Information Management... 32 National Defense 6 2013-07-01 2013-07-01 false Preliminary information. 1700.4 Section 1700.4... INTELLIGENCE PROCEDURES FOR DISCLOSURE OF RECORDS PURSUANT TO THE FREEDOM OF INFORMATION ACT §...

  8. 32 CFR 1700.4 - Preliminary information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... receive a FOIA request shall expeditiously forward the request to the Director, Information Management... 32 National Defense 6 2014-07-01 2014-07-01 false Preliminary information. 1700.4 Section 1700.4... INTELLIGENCE PROCEDURES FOR DISCLOSURE OF RECORDS PURSUANT TO THE FREEDOM OF INFORMATION ACT §...

  9. 32 CFR 1700.4 - Preliminary information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... receive a FOIA request shall expeditiously forward the request to the Director, Information Management... 32 National Defense 6 2010-07-01 2010-07-01 false Preliminary information. 1700.4 Section 1700.4... INTELLIGENCE PROCEDURES FOR DISCLOSURE OF RECORDS PURSUANT TO THE FREEDOM OF INFORMATION ACT §...

  10. 32 CFR 1700.4 - Preliminary information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... receive a FOIA request shall expeditiously forward the request to the Director, Information Management... 32 National Defense 6 2012-07-01 2012-07-01 false Preliminary information. 1700.4 Section 1700.4... INTELLIGENCE PROCEDURES FOR DISCLOSURE OF RECORDS PURSUANT TO THE FREEDOM OF INFORMATION ACT §...

  11. 32 CFR 1700.4 - Preliminary information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... receive a FOIA request shall expeditiously forward the request to the Director, Information Management... 32 National Defense 6 2011-07-01 2011-07-01 false Preliminary information. 1700.4 Section 1700.4... INTELLIGENCE PROCEDURES FOR DISCLOSURE OF RECORDS PURSUANT TO THE FREEDOM OF INFORMATION ACT §...

  12. 40 CFR 161.170 - Preliminary analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Preliminary analysis. 161.170 Section 161.170 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements §...

  13. 32 CFR 1800.11 - Preliminary information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Preliminary information. 1800.11 Section 1800.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC ACCESS TO NACIC RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of FOIA Requests §...

  14. 32 CFR 1803.11 - Preliminary information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Preliminary information. 1803.11 Section 1803.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC REQUESTS FOR MANDATORY DECLASSIFICATION REVIEW OF CLASSIFIED INFORMATION PURSUANT TO SECTION 3.6...

  15. 32 CFR 1800.11 - Preliminary information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 6 2012-07-01 2012-07-01 false Preliminary information. 1800.11 Section 1800.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC ACCESS TO NACIC RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of FOIA Requests §...

  16. 32 CFR 1803.11 - Preliminary information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Preliminary information. 1803.11 Section 1803.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC REQUESTS FOR MANDATORY DECLASSIFICATION REVIEW OF CLASSIFIED INFORMATION PURSUANT TO SECTION 3.6...

  17. 32 CFR 1800.11 - Preliminary information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 6 2014-07-01 2014-07-01 false Preliminary information. 1800.11 Section 1800.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC ACCESS TO NACIC RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of FOIA Requests §...

  18. 32 CFR 1800.11 - Preliminary information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Preliminary information. 1800.11 Section 1800.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC ACCESS TO NACIC RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of FOIA Requests §...

  19. 32 CFR 1800.11 - Preliminary information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Preliminary information. 1800.11 Section 1800.11 National Defense Other Regulations Relating to National Defense NATIONAL COUNTERINTELLIGENCE CENTER PUBLIC ACCESS TO NACIC RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of FOIA Requests §...

  20. A DESCRIPTIVE INDONESIAN GRAMMAR--PRELIMINARY EDITION.

    ERIC Educational Resources Information Center

    DYEN, ISIDORE

    THIS PRELIMINARY EDITION COMPRISES A DESCRIPTIVE GRAMMAR OF INDONESIAN (BAHASA INDONESIA), THE OFFICIAL LANGUAGE OF THE REPUBLIC OF INDONESIA. THE THREE SECTIONS--PHONOLOGY, SYNTAX, AND MORPHOLOGY--PRESENT A COMPREHENSIVE LINGUISTIC ANALYSIS OF INDONESIAN, WITH OCCASIONAL CONTRASTIVE REFERENCE TO MALAY, JAVANESE, SUNDANESE, AND SUMATRAN. THIS…

  1. 40 CFR 161.170 - Preliminary analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Preliminary analysis. 161.170 Section 161.170 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements §...

  2. Preliminary radiation shielding design for BOOMERANG

    SciTech Connect

    Donahue, Richard J.

    2002-10-23

    Preliminary radiation shielding specifications are presented here for the 3 GeV BOOMERANG Australian synchrotron light source project. At this time the bulk shield walls for the storage ring and injection system (100 MeV Linac and 3 GeV Booster) are considered for siting purposes.

  3. 25 CFR 11.1005 - Preliminary inquiry.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... detention or shelter care, the children's court shall conduct a preliminary inquiry within 24 hours for the... delinquent act; and (2) Whether continued detention or shelter care is necessary pending further proceedings...; and (2) The need for detention or shelter care. (f) If the children's court finds that probable...

  4. 25 CFR 11.1105 - Preliminary inquiry.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... shelter care, the children's court shall conduct a preliminary inquiry with 24 hours for the purpose of...) Whether continued shelter care is necessary pending further proceedings. (b) If a minor has been released... the complaint or the taking of the minor into custody; and (2) The need for shelter care. (f) If...

  5. Preliminary aerothermodynamic design method for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Harloff, G. J.; Petrie, S. L.

    1987-01-01

    Preliminary design methods are presented for vehicle aerothermodynamics. Predictions are made for the Shuttle orbiter, a Mach 6 transport vehicle, and a high-speed missile configuration. Rapid and accurate methods are discussed for obtaining aerodynamic coefficients and heat transfer rates for laminar and turbulent flows for vehicles at high angles of attack and hypersonic Mach numbers.

  6. 10 CFR 71.85 - Preliminary determinations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Preliminary determinations. 71.85 Section 71.85 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PACKAGING AND TRANSPORTATION OF RADIOACTIVE MATERIAL Operating... shipment of licensed material— (a) The licensee shall ascertain that there are no cracks,...

  7. 19 CFR 207.108 - Preliminary conference.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 3 2013-04-01 2013-04-01 false Preliminary conference. 207.108 Section 207.108 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION NONADJUDICATIVE INVESTIGATIONS INVESTIGATIONS OF WHETHER INJURY TO DOMESTIC INDUSTRIES RESULTS FROM IMPORTS SOLD AT LESS THAN FAIR VALUE OR...

  8. 19 CFR 207.108 - Preliminary conference.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Preliminary conference. 207.108 Section 207.108 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION NONADJUDICATIVE INVESTIGATIONS INVESTIGATIONS OF WHETHER INJURY TO DOMESTIC INDUSTRIES RESULTS FROM IMPORTS SOLD AT LESS THAN FAIR VALUE OR...

  9. 19 CFR 207.108 - Preliminary conference.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 3 2014-04-01 2014-04-01 false Preliminary conference. 207.108 Section 207.108 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION NONADJUDICATIVE INVESTIGATIONS INVESTIGATIONS OF WHETHER INJURY TO DOMESTIC INDUSTRIES RESULTS FROM IMPORTS SOLD AT LESS THAN FAIR VALUE OR...

  10. 19 CFR 207.108 - Preliminary conference.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 3 2011-04-01 2011-04-01 false Preliminary conference. 207.108 Section 207.108 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION NONADJUDICATIVE INVESTIGATIONS INVESTIGATIONS OF WHETHER INJURY TO DOMESTIC INDUSTRIES RESULTS FROM IMPORTS SOLD AT LESS THAN FAIR VALUE OR...

  11. 19 CFR 207.108 - Preliminary conference.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 3 2012-04-01 2012-04-01 false Preliminary conference. 207.108 Section 207.108 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION NONADJUDICATIVE INVESTIGATIONS INVESTIGATIONS OF WHETHER INJURY TO DOMESTIC INDUSTRIES RESULTS FROM IMPORTS SOLD AT LESS THAN FAIR VALUE OR...

  12. 40 CFR 161.170 - Preliminary analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Preliminary analysis. 161.170 Section 161.170 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements §...

  13. 40 CFR 161.170 - Preliminary analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Preliminary analysis. 161.170 Section 161.170 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements §...

  14. Preliminary tests of the electrostatic plasma accelerator

    NASA Technical Reports Server (NTRS)

    Aston, G.; Acker, T.

    1990-01-01

    This report describes the results of a program to verify an electrostatic plasma acceleration concept and to identify those parameters most important in optimizing an Electrostatic Plasma Accelerator (EPA) thruster based upon this thrust mechanism. Preliminary performance measurements of thrust, specific impulse and efficiency were obtained using a unique plasma exhaust momentum probe. Reliable EPA thruster operation was achieved using one power supply.

  15. Preliminary Safety Analysis for the IRIS Reactor

    SciTech Connect

    Ricotti, M.E.; Cammi, A.; Cioncolini, A.; Lombardi, C.; Cipollaro, A.; Orioto, F.; Conway, L.E.; Barroso, A.C.

    2002-07-01

    A deterministic analysis of the IRIS safety features has been carried out by means of the best-estimate code RELAP (ver. RELAP5 mod3.2). First, the main system components were modeled and tested separately, namely: the Reactor Pressure Vessel (RPV), the modular helical-coil Steam Generators (SG), and the Passive (natural circulation) Emergency Heat Removal System (PEHRS). Then, a preliminary set of accident transients for the whole primary and safety systems was investigated. Since the project was in a conceptual phase, the reported analyses must be considered preliminary. In fact, neither the reactor components, nor the safety systems and the reactor signal logic, were completely defined at that time. Three 'conventional' design basis accidents have been preliminarily evaluated: a Loss Of primary Flow Accident, a Loss Of Coolant Accident, and a Loss Of Feed Water accident. The results show the effectiveness of the safety systems also in LOCA conditions; the core remains covered for the required grace period. This provides the basis to move forward to the preliminary design. (authors)

  16. 32 CFR 651.49 - Preliminary phase.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 4 2014-07-01 2013-07-01 true Preliminary phase. 651.49 Section 651.49 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) ENVIRONMENTAL QUALITY ENVIRONMENTAL ANALYSIS OF ARMY ACTIONS (AR 200-2) Public Involvement and the Scoping Process §...

  17. 32 CFR 651.49 - Preliminary phase.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Preliminary phase. 651.49 Section 651.49 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) ENVIRONMENTAL QUALITY ENVIRONMENTAL ANALYSIS OF ARMY ACTIONS (AR 200-2) Public Involvement and the Scoping Process §...

  18. 32 CFR 651.49 - Preliminary phase.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 4 2013-07-01 2013-07-01 false Preliminary phase. 651.49 Section 651.49 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) ENVIRONMENTAL QUALITY ENVIRONMENTAL ANALYSIS OF ARMY ACTIONS (AR 200-2) Public Involvement and the Scoping Process §...

  19. 32 CFR 651.49 - Preliminary phase.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 4 2012-07-01 2011-07-01 true Preliminary phase. 651.49 Section 651.49 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) ENVIRONMENTAL QUALITY ENVIRONMENTAL ANALYSIS OF ARMY ACTIONS (AR 200-2) Public Involvement and the Scoping Process §...

  20. 18 CFR 154.105 - Preliminary statement.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Preliminary statement. 154.105 Section 154.105 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER NATURAL GAS ACT RATE SCHEDULES AND TARIFFS Form and Composition...

  1. Engineering Technology Programs. Preliminary Curriculum Planning Guide.

    ERIC Educational Resources Information Center

    Center for Occupational Research and Development, Inc., Waco, TX.

    Developed as a resource to assist in a major revision underway in Georgia area technical schools to change curricula for preparing engineering technicians, this preliminary program-planning guide describes curriculum structures for specialized programs in three major areas--electronics, electromechanics, and mechanics. The handbook, which is…

  2. 32 CFR 1900.11 - Preliminary Information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... TO CIA RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of Foia Requests § 1900.11 Preliminary Information. Members of the public shall address all communications to the CIA Coordinator as... Information Act and this regulation. CIA employees receiving a communication in the nature of a FOIA...

  3. 32 CFR 1900.11 - Preliminary Information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... TO CIA RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of Foia Requests § 1900.11 Preliminary Information. Members of the public shall address all communications to the CIA Coordinator as... Information Act and this regulation. CIA employees receiving a communication in the nature of a FOIA...

  4. 32 CFR 1900.11 - Preliminary Information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... TO CIA RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of Foia Requests § 1900.11 Preliminary Information. Members of the public shall address all communications to the CIA Coordinator as... Information Act and this regulation. CIA employees receiving a communication in the nature of a FOIA...

  5. 32 CFR 1900.11 - Preliminary Information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... TO CIA RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of Foia Requests § 1900.11 Preliminary Information. Members of the public shall address all communications to the CIA Coordinator as... Information Act and this regulation. CIA employees receiving a communication in the nature of a FOIA...

  6. Computed tomography of human joints and radioactive waste drums

    SciTech Connect

    Martz, Harry E.; Roberson, G. Patrick; Hollerbach, Karin; Logan, Clinton M.; Ashby, Elaine; Bernardi, Richard

    1999-12-02

    X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life-cycle of a product. Two diverse examples of CT are discussed: (1) Our computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) We are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A and PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity.
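
    The record does not say how the joint CT data are segmented. A common first pass for separating bone from soft tissue is Hounsfield-unit thresholding followed by connected-component labeling, sketched below; the threshold and minimum-region values are typical assumed figures, not parameters from this work.

```python
# Illustrative sketch only -- the record does not specify the segmentation
# method. A common first pass for separating bone from soft tissue in CT is
# Hounsfield-unit (HU) thresholding followed by connected-component labeling.
# The threshold values below are typical assumed ranges, not from the paper.
import numpy as np
from scipy import ndimage

def segment_bone(ct_volume_hu: np.ndarray, bone_threshold_hu: float = 300.0):
    """Return a labeled volume of candidate bone regions.

    ct_volume_hu: 3-D array of CT values in Hounsfield units.
    bone_threshold_hu: voxels above this are treated as bone (assumed value).
    """
    bone_mask = ct_volume_hu > bone_threshold_hu
    labels, n_regions = ndimage.label(bone_mask)              # 3-D connected components
    sizes = ndimage.sum(bone_mask, labels, range(1, n_regions + 1))
    keep = np.isin(labels, 1 + np.flatnonzero(sizes > 500))   # drop tiny speckles
    return np.where(keep, labels, 0), n_regions

if __name__ == "__main__":
    # Synthetic 3-D "scan": soft tissue near 40 HU plus a bright bone-like block.
    vol = np.full((64, 64, 64), 40.0) + np.random.normal(0, 10, (64, 64, 64))
    vol[20:40, 20:40, 20:40] = 1200.0
    labeled, n = segment_bone(vol)
    print(f"{n} regions above threshold; voxels kept: {(labeled > 0).sum()}")
```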

  7. Computed tomography of human joints and radioactive waste drums

    SciTech Connect

    Ashby, E; Bernardi, R; Hollerbach, K; Logan, C; Martz, H; Roberson, G P

    1999-06-01

    X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life-cycle of a product. Two diverse examples of CT are discussed. (1) The computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) They are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A and PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity.

  8. Computing technology in the 1980's. [computers

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, new advances appear possible, with improved devices likely to be introduced at a rate at least equal to historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  9. Optics in neural computation

    NASA Astrophysics Data System (ADS)

    Levene, Michael John

    In all attempts to emulate the considerable powers of the brain, one is struck by its immense size, parallelism, and complexity. While the fields of neural networks, artificial intelligence, and neuromorphic engineering have all attempted to simplify this considerable complexity, all three can benefit from the inherent scalability and parallelism of optics. This thesis looks at specific aspects of three modes in which optics, and particularly volume holography, can play a part in neural computation. First, holography serves as the basis of highly-parallel correlators, which are the foundation of optical neural networks. The huge input capability of optical neural networks makes them most useful for image processing, image recognition, and tracking. These tasks benefit from the shift invariance of optical correlators. In this thesis, I analyze the capacity of correlators, and then present several techniques for controlling the amount of shift invariance. Of particular interest is the Fresnel correlator, in which the hologram is displaced from the Fourier plane. In this case, the amount of shift invariance is limited not just by the thickness of the hologram, but by the distance of the hologram from the Fourier plane. Second, volume holography can provide the huge storage capacity and high-speed, parallel read-out necessary to support large artificial intelligence systems. However, previous methods for storing data in volume holograms have relied on awkward beam-steering or on as-yet nonexistent cheap, wide-bandwidth, tunable laser sources. This thesis presents a new technique, shift multiplexing, which is capable of very high densities, but which has the advantage of a very simple implementation. In shift multiplexing, the reference wave consists of a focused spot a few millimeters in front of the hologram. Multiplexing is achieved by simply translating the hologram a few tens of microns or less. This thesis describes the theory for how shift
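
    The shift invariance the abstract refers to is the standard property of Fourier-plane correlation: translating the input only translates the correlation peak. The sketch below demonstrates that property numerically with an FFT-based correlator; it illustrates the principle, not the optical hardware or the thesis's Fresnel-plane analysis.

```python
# Numerical illustration of the shift invariance of a Fourier-plane correlator:
# translating the input only translates the correlation peak, leaving its height
# unchanged. This models the mathematical principle, not the optical setup.
import numpy as np

def correlate_fft(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Circular cross-correlation via the FFT (what a 4-f correlator computes)."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(template))))

rng = np.random.default_rng(0)
template = rng.random((128, 128))

scene = np.roll(template, shift=(17, 42), axis=(0, 1))   # shifted copy of the template
corr = correlate_fft(scene, template)
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(f"correlation peak at {peak} (expected near (17, 42))")
```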

  10. Computer Lab Configuration.

    ERIC Educational Resources Information Center

    Wodarz, Nan

    2003-01-01

    Describes the layout and elements of an effective school computer lab. Includes configuration, storage spaces, cabling and electrical requirements, lighting, furniture, and computer hardware and peripherals. (PKP)

  11. Computer hardware fault administration

    DOEpatents

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration is carried out in a parallel computer that includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, each of which includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying the location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
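
    The abstract describes the general idea: detect a defective link in one network and route around it over a second, independent network. A minimal graph-based sketch of that fallback routing is below; the topology, node names, and BFS router are illustrative assumptions, not the patented mechanism.

```python
# Minimal sketch of the fallback idea in the abstract: if the path between two
# compute nodes over the primary network crosses a defective link, route the
# traffic over the independent secondary network instead. Graph layout, node
# names, and the BFS router are illustrative, not the patented implementation.
from collections import deque

def bfs_path(adjacency, src, dst, bad_links=frozenset()):
    """Shortest-hop path avoiding any link listed (in either direction) in bad_links."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in adjacency.get(node, ()):
            if (node, nxt) in bad_links or (nxt, node) in bad_links or nxt in seen:
                continue
            seen.add(nxt)
            queue.append(path + [nxt])
    return None

# Two independent networks over the same four compute nodes (hypothetical topology).
primary   = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
secondary = {0: [2], 2: [0, 3], 3: [2, 1], 1: [3]}

defective = {(1, 2)}  # fault administration has identified link 1-2 as defective
route = bfs_path(primary, 0, 3, bad_links=defective) or bfs_path(secondary, 0, 3)
print("route from node 0 to node 3:", route)   # falls back to the secondary network
```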

  12. 36 CFR 72.16 - Preliminary Action Program requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 1 2011-07-01 2011-07-01 false Preliminary Action Program... INTERIOR URBAN PARK AND RECREATION RECOVERY ACT OF 1978 Local Recovery Action Programs § 72.16 Preliminary.... Included should be a brief discussion of the relationship of the Preliminary Action Program to...

  13. 36 CFR 72.16 - Preliminary Action Program requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Preliminary Action Program... INTERIOR URBAN PARK AND RECREATION RECOVERY ACT OF 1978 Local Recovery Action Programs § 72.16 Preliminary.... Included should be a brief discussion of the relationship of the Preliminary Action Program to...

  14. 32 CFR 644.30 - Preliminary real estate work.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Preliminary real estate work. 644.30 Section 644... PROPERTY REAL ESTATE HANDBOOK Project Planning Military (army and Air Force) and Other Federal Agencies § 644.30 Preliminary real estate work. (a) Preliminary real estate work is defined as that action...

  15. 32 CFR 644.30 - Preliminary real estate work.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 4 2014-07-01 2013-07-01 true Preliminary real estate work. 644.30 Section 644... PROPERTY REAL ESTATE HANDBOOK Project Planning Military (army and Air Force) and Other Federal Agencies § 644.30 Preliminary real estate work. (a) Preliminary real estate work is defined as that action...

  16. 32 CFR 644.30 - Preliminary real estate work.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 4 2012-07-01 2011-07-01 true Preliminary real estate work. 644.30 Section 644... PROPERTY REAL ESTATE HANDBOOK Project Planning Military (army and Air Force) and Other Federal Agencies § 644.30 Preliminary real estate work. (a) Preliminary real estate work is defined as that action...

  17. 32 CFR 644.30 - Preliminary real estate work.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 4 2013-07-01 2013-07-01 false Preliminary real estate work. 644.30 Section 644... PROPERTY REAL ESTATE HANDBOOK Project Planning Military (army and Air Force) and Other Federal Agencies § 644.30 Preliminary real estate work. (a) Preliminary real estate work is defined as that action...

  18. 10 CFR 830.206 - Preliminary documented safety analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Preliminary documented safety analysis. 830.206 Section... Preliminary documented safety analysis. If construction begins after December 11, 2000, the contractor... category 1, 2, or 3 DOE nuclear facility must: (a) Prepare a preliminary documented safety analysis for...

  19. 10 CFR 830.206 - Preliminary documented safety analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Preliminary documented safety analysis. 830.206 Section... Preliminary documented safety analysis. If construction begins after December 11, 2000, the contractor... category 1, 2, or 3 DOE nuclear facility must: (a) Prepare a preliminary documented safety analysis for...

  20. 10 CFR 830.206 - Preliminary documented safety analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Preliminary documented safety analysis. 830.206 Section... Preliminary documented safety analysis. If construction begins after December 11, 2000, the contractor... category 1, 2, or 3 DOE nuclear facility must: (a) Prepare a preliminary documented safety analysis for...