Science.gov

Sample records for computed tomography: a preliminary

  1. Preliminary Experimental Results on Controlled Cardiac Computed Tomography: A Phantom Study

    PubMed Central

    Lu, Yang; Cai, Zhijun; Wang, Ge; Zhao, Jun; Bai, Er-Wei

    2010-01-01

    In this paper, we present preliminary experimental results on controlled cardiac computed tomography (CT), which aims to reduce motion artifacts by controlling the x-ray source rotation speed. An innovative cardiac phantom enables us to perform this experiment without modifying the scanner. This is the first cardiac CT experiment with a speed-controlled x-ray source. Experimental results demonstrate that the proposed method successfully separates the phantom images at different phases (improving the temporal resolution) by controlling the x-ray speed. PMID:19696470

  2. 99mTc-IgG-Lung Scintigraphy in the Assessment of Pulmonary Involvement in Interstitial Lung Disease and Its Comparison With Pulmonary Function Tests and High-Resolution Computed Tomography: A Preliminary Study

    PubMed Central

    Bahtouee, Mehrzad; Saberifard, Jamshid; Javadi, Hamid; Nabipour, Iraj; Malakizadeh, Hasan; Monavvarsadegh, Gholamhossein; Ilkhani Pak, Hoda; Sadeghi, Azadeh; Assadi, Majid

    2015-01-01

    Background: The discrimination of inactive inflammatory processes from the active form of the disease is of great importance in the management of interstitial lung disease (ILD). Objectives: The aim of this study was to determine the efficacy of the 99mTc-IgG scan for detecting the severity of disease compared to high-resolution computed tomography (HRCT) and pulmonary function tests (PFTs). Patients and Methods: Eight known cases of ILD, including four cases of mustard gas (MG) intoxication and four patients with ILD of unknown cause, were included in this study. A population of six patients without lung disease was considered as the control group. The patients underwent PFT and HRCT, followed by a 99mTc-IgG scan, and were followed up for one year. Assessment of 99mTc-IgG uptake was accomplished both qualitatively (subjectively) and semiquantitatively. Results: All eight ILD patients demonstrated a strong increase in 99mTc-IgG uptake in the lungs compared to the control patients. The 99mTc-IgG scan scores were higher in the patient group (0.64 [95% confidence interval (CI) = 0.61-0.69]) than in the control group (0.35 [95% CI = 0.28-0.40]) (P < 0.05). In patients, a statistically significant positive correlation was detected between the 99mTc-IgG scan and HRCT scores (Spearman's correlation coefficient = 0.92, P < 0.008). The 99mTc-human immunoglobulin (HIG) scores were not significantly correlated with PFT findings (including FVC, FEV1, and FEV1/FVC), O2 saturation, or age (P > 0.05). There were no significant correlations between the 99mTc-IgG score and HRCT patterns, including ground glass opacity, reticular fibrosis, and honeycombing (P > 0.05). Conclusion: The present results confirmed that the 99mTc-IgG scan could be applied to detect the severity of pulmonary involvement, which correlated well with HRCT findings. These data also showed that the 99mTc-IgG scan might be used as a complement to HRCT in the functional evaluation

  3. 99mTc-MIBI Lung Scintigraphy in the Assessment of Pulmonary Involvement in Interstitial Lung Disease and Its Comparison With Pulmonary Function Tests and High-Resolution Computed Tomography: A Preliminary Study.

    PubMed

    Bahtouee, Mehrzad; Saberifard, Jamshid; Javadi, Hamid; Nabipour, Iraj; Raeisi, Alireza; Assadi, Majid; Eftekhari, Mohammad

    2015-11-01

    The differentiation of active inflammatory processes from the inactive form of the disease is of great value in the management of interstitial lung disease (ILD). The aim of this investigation was to assess the efficacy of 99mTc-methoxy-isobutyl-isonitrile (99mTc-MIBI) scans in distinguishing the severity of the disease compared to radiological and clinical parameters. In total, 19 known cases of ILD were included in this study and were followed up for 1 year. Five patients without lung disease were considered as the control group. The patients underwent pulmonary function tests (PFTs) and high-resolution computed tomography scans, followed by 99mTc-MIBI scanning. The 99mTc-MIBI scans were analyzed both qualitatively (subjectively) and semiquantitatively. All 19 ILD patients demonstrated a strong increase in 99mTc-MIBI uptake in the lungs compared to the control group. The 99mTc-MIBI scan scores were higher in the patient group in both the early phase (0.24 [0.19-0.31] vs 0.11 [0.10-0.15], P < 0.05) and the delayed phase (0.15 [0.09-0.27] vs 0.04 [0.01-0.09], P < 0.05) compared with the control group. A positive correlation was detected between the 99mTc-MIBI scan and the high-resolution computed tomography (HRCT) scores (Spearman's correlation coefficient = 0.65, P < 0.02) in the early phase but not in the delayed phase (P > 0.14). The 99mTc-MIBI scan scores were not significantly correlated with the PFT findings (P > 0.05). In total, 5 patients died and 14 patients were still alive over the 1-year follow-up period. There was also a significant difference in 99mTc-MIBI uptake intensity by outcome in both the early phase (dead: 0.32 [0.29-0.43] vs alive: 0.21 [0.18-0.24], P < 0.05) and the delayed phase (dead: 0.27 [0.22-0.28] vs alive: 0.10 [0.07-0.19], P < 0.05). The washout rate, measured over the ~40 min from 20 to 60 min post-injection, differed significantly between the two study groups (ILD: 46.61 [15.61-50.39] vs NL: 70.91 [27.09-116.36], P = 0.04).
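The semiquantitative washout rate used in MIBI lung studies is typically the percentage loss of tracer between the early- and delayed-phase images. A minimal sketch, assuming background-corrected mean lung-ROI counts as inputs (the abstract does not specify the exact correction used):

```python
def washout_rate(early_uptake, delayed_uptake):
    """Percentage loss of tracer between early- and delayed-phase images.

    Inputs are assumed to be background-corrected mean lung-ROI counts;
    the exact ROI and correction used in the study are not given here.
    """
    return 100.0 * (early_uptake - delayed_uptake) / early_uptake

# With the patient-group medians quoted above (early 0.24, delayed 0.15),
# the washout is about 37.5%.
rate = washout_rate(0.24, 0.15)
```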

  4. Nanoparticle Contrast Agents for Computed Tomography: A Focus on Micelles

    PubMed Central

    Cormode, David P.; Naha, Pratap C.; Fayad, Zahi A.

    2014-01-01

    Computed tomography (CT) is an X-ray based whole body imaging technique that is widely used in medicine. Clinically approved contrast agents for CT are iodinated small molecules or barium suspensions. Over the past seven years there has been a great increase in the development of nanoparticles as CT contrast agents. Nanoparticles have several advantages over small molecule CT contrast agents, such as long blood-pool residence times, and the potential for cell tracking and targeted imaging applications. Furthermore, there is a need for novel CT contrast agents, due to the growing population of renally impaired patients and patients hypersensitive to iodinated contrast. Micelles and lipoproteins, a micelle-related class of nanoparticle, have notably been adapted as CT contrast agents. In this review we discuss the principles of CT image formation and the generation of CT contrast. We discuss the progress in developing non-targeted, targeted and cell tracking nanoparticle CT contrast agents. We feature agents based on micelles and used in conjunction with spectral CT. The large contrast agent doses needed will necessitate careful toxicology studies prior to clinical translation. However, the field has seen tremendous advances in the past decade and we expect many more advances to come in the next decade. PMID:24470293

  5. Dose spread functions in computed tomography: A Monte Carlo study

    PubMed Central

    Boone, John M.

    2009-01-01

    Purpose: Current CT dosimetry employing the CTDI methodology has come under fire in recent years, partially in response to the increasing width of collimated x-ray fields in modern CT scanners. This study was conducted to provide a better understanding of the radiation dose distributions in CT. Methods: Monte Carlo simulations were used to evaluate radiation dose distributions along the z axis arising from CT imaging in cylindrical phantoms. Mathematical cylinders were simulated with compositions of water, polymethyl methacrylate (PMMA), and polyethylene. Cylinder diameters from 10 to 50 cm were studied. X-ray spectra typical of several CT manufacturers (80, 100, 120, and 140 kVp) were used. In addition to no bow tie filter, the head and body bow tie filters from modern General Electric and Siemens CT scanners were evaluated. Each cylinder was divided into three concentric regions of equal volume such that the energy deposited is proportional to dose for each region. Two additional dose assessment regions, central and edge locations 10 mm in diameter, were included for comparisons to CTDI100 measurements. Dose spread functions (DSFs) were computed for a wide range of imaging parameters. Results: DSFs generally exhibit a biexponential falloff from the z=0 position. For a very narrow primary beam input (≪1 mm), DSFs demonstrated significant low-amplitude, long-range scatter dose tails. For body imaging conditions (30 cm diameter in water), the DSF at the center had a full width at tenth maximum (FWTM) of ∼160 mm, while at the edge the FWTM was ∼80 mm. Polyethylene phantoms exhibited wider DSFs than PMMA or water, as did higher tube voltages in any material. The FWTMs were 80, 180, and 250 mm for 10, 30, and 50 cm phantom diameters, respectively, at the center in water at 120 kVp with a typical body bow tie filter. Scatter-to-primary dose ratios (SPRs) increased with phantom diameter from 4 at the center (1 cm diameter) for a 16 cm diameter cylinder to ∼12.5 for a
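The biexponential falloff reported above can be modeled as a fast primary component plus a slow scatter tail. A minimal curve-fitting sketch; the parameter values below are illustrative placeholders, not the study's measured DSFs:

```python
import numpy as np
from scipy.optimize import curve_fit

def dsf(z, a1, k1, a2, k2):
    """Biexponential dose spread function along z (mm):
    a short-range primary term plus a long-range scatter term."""
    return a1 * np.exp(-k1 * np.abs(z)) + a2 * np.exp(-k2 * np.abs(z))

# Synthetic example: generate a noiseless DSF from assumed parameters,
# then recover them by nonlinear least squares.
z = np.linspace(-150.0, 150.0, 301)
true_params = (0.8, 0.12, 0.2, 0.015)  # illustrative values only
y = dsf(z, *true_params)
popt, _ = curve_fit(dsf, z, y, p0=(1.0, 0.1, 0.1, 0.01))
```

Quantities such as the FWTM can then be read off the fitted curve as the width where the model falls to one tenth of its z = 0 maximum.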

  6. Upper crustal structure beneath East Java from ambient noise tomography: A preliminary result

    SciTech Connect

    Martha, Agustya Adi; Widiyantoro, Sri; Cummins, Phil; Saygin, Erdinc; Masturyono

    2015-04-24

    East Java has a fairly complex geological structure. Physiographically, East Java can be divided into three zones: the Southern Mountains zone in the southern part, the Kendeng zone in the middle part, and the Rembang zone in the northern part. Most of the seismic hazards in this region are due to processes in the upper crust. In this study, the Ambient Noise Tomography (ANT) method is used to image the upper crustal structure beneath East Java. We have used seismic waveform data recorded by 8 Meteorological, Climatological and Geophysical Agency (BMKG) stationary seismographic stations and 16 portable seismographs installed for 2 to 8 weeks. The data were processed to obtain waveforms from noise cross-correlation between pairs of seismographic stations. Our preliminary results indicate that the Kendeng zone, an area of low gravity anomaly, is associated with a low velocity zone. On the other hand, the southern mountain range, which has a high gravity anomaly, is related to a high velocity anomaly, as shown by our tomographic images.
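The key step in ANT is that the long-time cross-correlation of ambient noise recorded at two stations approximates the inter-station Green's function, from which surface-wave travel times are picked. A minimal sketch on synthetic data; real processing also applies bandpass filtering and temporal/spectral normalization, omitted here:

```python
import numpy as np

def noise_cross_correlation(trace_a, trace_b, max_lag):
    """Cross-correlate two noise records and return (lags, correlation).

    After long-time stacking, the coherent part of this correlation
    approximates the inter-station Green's function.
    """
    n = len(trace_a)
    full = np.correlate(trace_a, trace_b, mode="full")  # lags -(n-1)..(n-1)
    lags = np.arange(-(n - 1), n)
    keep = np.abs(lags) <= max_lag
    return lags[keep], full[keep]

# Synthetic usage: station B records the same wavefield 50 samples later.
rng = np.random.default_rng(0)
a = rng.standard_normal(1000)
b = np.roll(a, 50)
lags, cc = noise_cross_correlation(a, b, max_lag=100)
# With numpy's correlation convention, the peak appears at lag -50,
# i.e. the imposed inter-station delay.
```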

  7. Preliminary blade design using integrated computer codes

    NASA Astrophysics Data System (ADS)

    Ryan, Arve

    1988-12-01

    Loads on the root of a horizontal axis wind turbine (HAWT) rotor blade were analyzed, and a design solution for the root area is presented. The loads on the blades are given by different load cases that are specified. To get a clear picture of the influence of different parameters, the whole blade is designed from scratch. This is only a preliminary design study, and the blade should not be taken as a construction reference. Computer programs are used extensively for the design and optimization. After the external geometry is set and the aerodynamic loads are calculated, parameters such as design stresses and laminate thicknesses are run through the available programs, and a blade design optimized on the basis of the facts and estimates used is shown.

  8. Preoperative localization of parathyroid adenomas using 4-dimensional computed tomography: a pictorial essay.

    PubMed

    Ellika, Shehanaz; Patel, Suresh; Aho, Todd; Marin, Horia

    2013-08-01

    Accurate preoperative localization is the key to successful parathyroid surgery in the era of minimally invasive parathyroid surgery. This article presents and discusses the embryologic basis of parathyroid gland development and ectopic locations, as well as the different imaging modalities helpful in diagnosing and localizing parathyroid adenomas and/or hyperplasia. We also aim to review the current surgical concepts in the treatment of parathyroid adenomas and/or hyperplasia, the utility of 4-dimensional computed tomography for accurate preoperative localization of hyperfunctioning parathyroid glands, and the imaging classification of adenomas and/or hyperplasia, and, finally, to present some of the limitations of 4-dimensional computed tomography.

  9. Comparison of Swedish and Norwegian Use of Cone-Beam Computed Tomography: a Questionnaire Study

    PubMed Central

    Strindberg, Jerker Edén; Hol, Caroline; Torgersen, Gerald; Møystad, Anne; Nilsson, Mats; Hellén-Halme, Kristina

    2015-01-01

    ABSTRACT Objectives In some countries, cone-beam computed tomography in dentistry may be used by dentists other than specialists in radiology. Because the number of clinics acquiring cone-beam computed tomography to examine patients is growing rapidly, knowledge of how to use it is very important. The aim was to compare the outcome of an investigation on the use of cone-beam computed tomography in Sweden with a previous Norwegian study, specifically regarding technical aspects. Material and Methods The questionnaire contained 45 questions, including 35 comparable to those sent to Norwegian clinics one year earlier. Results were based on an inter-comparison of the outcomes of the two questionnaire studies. Results The response rate was 71% in Sweden. In Sweden, most cone-beam computed tomography (CBCT) examinations were performed by dental nurses, while in Norway most were performed by specialists. More than two-thirds of the CBCT units had a scout image function, which was regularly used in both Sweden (79%) and Norway (75%). In Sweden 4% and in Norway 41% of the respondents did not wait for the report from the radiographic specialist before initiating treatment. Conclusions The bilateral comparison showed an overall similarity between the two countries. The survey gave explicit and important knowledge of the need for education and training of the whole team, since the radiation dose to the patient can vary considerably for the same kind of radiographic examination. It is essential to establish quality assurance protocols with defined responsibilities in the team in order to maintain high diagnostic accuracy for all examinations when using cone-beam computed tomography for patient examinations. PMID:26904179

  10. Stimulated dual-band infrared computed tomography: A tool to inspect the aging infrastructure

    SciTech Connect

    Del Grande, N.K.; Durbin, P.F.

    1995-06-27

    The authors have developed stimulated dual-band infrared (IR) computed tomography as a tool to inspect the aging infrastructure. The system has the potential to locate and quantify structural damage within airframes and bridge decks. Typically, dual-band IR detection methods improve the signal-to-noise ratio by a factor of ten, compared to single-band IR detection methods. They conducted a demonstration at Boeing using a uniform pulsed-heat source to stimulate IR images of hidden defects in the 727 fuselage. The dual-band IR camera and image processing system produced temperature, thermal inertia, and cooling-rate maps. In combination, these maps characterized the defect site, size, depth, thickness and type. The authors quantified the percent metal loss from corrosion above a threshold of 5%, with overall uncertainties of 3%. Also, they conducted a feasibility study of dual-band IR thermal imaging for bridge deck inspections. They determined the sites and relative concrete displacement of 2-in. and 4-in. deep delaminations from thin styrofoam implants in asphalt-covered concrete slabs. They demonstrated the value of dual-band IR computed tomography to quantify structural damage within flash-heated airframes and naturally-heated bridge decks.

  11. Stimulated dual-band infrared computed tomography: a tool to inspect the aging infrastructure

    NASA Astrophysics Data System (ADS)

    DelGrande, Nancy; Durbin, Philip F.

    1995-09-01

    We have developed stimulated dual-band infrared (IR) computed tomography as a tool to inspect the aging infrastructure. Our system has the potential to locate and quantify structural damage within airframes and bridge decks. Typically, dual-band IR detection methods improve the signal-to-noise ratio by a factor of ten compared to single-band IR detection methods. We conducted a demonstration at Boeing using a uniform pulsed-heat source to stimulate IR images of hidden defects in the 727 fuselage. Our dual-band IR camera and image processing system produced temperature, thermal inertia, and cooling-rate maps. In combination, these maps characterized the defect site, size, depth, thickness, and type. We quantified the percent metal loss from corrosion above a threshold of 5%, with overall uncertainties of 3%. Also, we conducted a feasibility study of dual-band IR thermal imaging for bridge deck inspections. We determined the sites and relative concrete displacement of 2-in. and 4-in. deep delaminations from thin styrofoam implants in asphalt-covered concrete slabs. We demonstrated the value of dual-band IR computed tomography to quantify structural damage within flash-heated airframes and naturally heated bridge decks.
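Part of the SNR advantage of dual-band IR detection comes from ratioing the two bands, which cancels emissivity variations common to both and suppresses surface clutter. A toy sketch of that idea; the numbers are invented for illustration, not from the study:

```python
import numpy as np

def dual_band_ratio(radiance_band1, radiance_band2):
    """Pixelwise ratio of two IR bands. Emissivity factors common to
    both bands cancel, suppressing surface clutter that would mask a
    defect in either single-band image."""
    return radiance_band1 / radiance_band2

# Toy example: a band-1 defect signal at pixel 3 sits under emissivity
# clutter; the ratio image removes the clutter and leaves the defect.
emissivity = np.array([0.90, 0.95, 0.70, 0.92])  # hypothetical surface clutter
signal1 = np.array([1.0, 1.0, 1.0, 1.3])         # band-1 defect at pixel 3
signal2 = np.ones(4)
ratio = dual_band_ratio(emissivity * signal1, emissivity * signal2)
```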

  12. Radiation doses in cone-beam breast computed tomography: A Monte Carlo simulation study

    SciTech Connect

    Yi Ying; Lai, Chao-Jen; Han Tao; Zhong Yuncheng; Shen Youtao; Liu Xinming; Ge Shuaiping; You Zhicheng; Wang Tianpeng; Shaw, Chris C.

    2011-02-15

    Purpose: In this article, we describe a method to estimate the spatial dose variation, average dose and mean glandular dose (MGD) for a real breast using Monte Carlo simulation based on cone beam breast computed tomography (CBBCT) images. We present and discuss the dose estimation results for 19 mastectomy breast specimens, 4 homogeneous breast models, 6 ellipsoidal phantoms, and 6 cylindrical phantoms. Methods: To validate the Monte Carlo method for dose estimation in CBBCT, we compared the Monte Carlo dose estimates with the thermoluminescent dosimeter measurements at various radial positions in two polycarbonate cylinders (11- and 15-cm in diameter). Cone-beam computed tomography (CBCT) images of 19 mastectomy breast specimens, obtained with a bench-top experimental scanner, were segmented and used to construct 19 structured breast models. Monte Carlo simulation of CBBCT with these models was performed and used to estimate the point doses, average doses, and mean glandular doses for unit open air exposure at the iso-center. Mass based glandularity values were computed and used to investigate their effects on the average doses as well as the mean glandular doses. Average doses for 4 homogeneous breast models were estimated and compared to those of the corresponding structured breast models to investigate the effect of tissue structures. Average doses for ellipsoidal and cylindrical digital phantoms of identical diameter and height were also estimated for various glandularity values and compared with those for the structured breast models. Results: The absorbed dose maps for structured breast models show that doses in the glandular tissue were higher than those in the nearby adipose tissue. Estimated average doses for the homogeneous breast models were almost identical to those for the structured breast models (p=1). Normalized average doses estimated for the ellipsoidal phantoms were similar to those for the structured breast models (root mean square (rms

  13. Rare appearance of an odontogenic myxoma in cone-beam computed tomography: a case report.

    PubMed

    Dabbaghi, Arash; Nikkerdar, Nafiseh; Bayati, Soheyla; Golshah, Amin

    2016-01-01

    Odontogenic myxoma (OM) is an infiltrative benign bone tumor that occurs almost exclusively in the facial skeleton. The radiographic characteristics of odontogenic myxoma may produce several patterns, making diagnosis difficult. Cone-beam computed tomography (CBCT) may prove extremely useful in clarifying the intraosseous extent of the tumor and its effects on surrounding structures. Here, we report a case of odontogenic myxoma of the mandible in a 27-year-old female. The patient exhibited a slight swelling in the left mandible. Surgical resection was performed, and no recurrence was noted. In the CBCT sections, we observed perforation of the cortical plate and a radiopaque line extending from the periosteum, resembling a "sunray" appearance (a rare feature of OM), which could not be assessed by panoramic radiography. PMID:27092217

  14. Rare appearance of an odontogenic myxoma in cone-beam computed tomography: a case report

    PubMed Central

    Dabbaghi, Arash; Nikkerdar, Nafiseh; Bayati, Soheyla; Golshah, Amin

    2016-01-01

    Odontogenic myxoma (OM) is an infiltrative benign bone tumor that occurs almost exclusively in the facial skeleton. The radiographic characteristics of odontogenic myxoma may produce several patterns, making diagnosis difficult. Cone-beam computed tomography (CBCT) may prove extremely useful in clarifying the intraosseous extent of the tumor and its effects on surrounding structures. Here, we report a case of odontogenic myxoma of the mandible in a 27-year-old female. The patient exhibited a slight swelling in the left mandible. Surgical resection was performed, and no recurrence was noted. In the CBCT sections, we observed perforation of the cortical plate and a radiopaque line extending from the periosteum, resembling a "sunray" appearance (a rare feature of OM), which could not be assessed by panoramic radiography. PMID:27092217

  15. Cardiac findings on non-gated chest computed tomography: A clinical and pictorial review.

    PubMed

    Kanza, Rene Epunza; Allard, Christian; Berube, Michel

    2016-02-01

    The use of chest computed tomography (CT) as an imaging test for the evaluation of thoracic pathology has increased significantly during the last four decades. Although cardiopulmonary diseases often overlap in their clinical manifestations, radiologists tend to overlook the heart while interpreting routine chest CT. Recent advances in CT technology have led to a significant reduction of heart motion artefacts and now allow for the identification of several cardiac findings on chest CT even without electrocardiogram (ECG) gating. These findings range from simple curiosities to benign, malignant, and even life-threatening discoveries. Here we present a clinical and radiologic review of common and less common cardiac findings discovered on non-gated chest CT in order to draw the attention of radiologists and referring physicians to these possibilities. PMID:26781150

  18. Measurement of breast tissue composition with dual energy cone-beam computed tomography: A postmortem study

    SciTech Connect

    Ding Huanjun; Ducote, Justin L.; Molloi, Sabee

    2013-06-15

    Purpose: To investigate the feasibility of a three-material compositional measurement of water, lipid, and protein content of breast tissue with dual kVp cone-beam computed tomography (CT) for diagnostic purposes. Methods: Simulations were performed on a flat panel-based computed tomography system with a dual kVp technique in order to guide the selection of experimental acquisition parameters. The expected errors induced by using the proposed calibration materials were also estimated by simulation. Twenty pairs of postmortem breast samples were imaged with a flat-panel based dual kVp cone-beam CT system, followed by image-based material decomposition using calibration data obtained from a three-material phantom consisting of water, vegetable oil, and polyoxymethylene plastic. The tissue samples were then chemically decomposed into their respective water, lipid, and protein contents after imaging to allow direct comparison with data from dual energy decomposition. Results: Guided by results from simulation, the beam energies for the dual kVp cone-beam CT system were selected to be 50 and 120 kVp with the mean glandular dose divided equally between each exposure. The simulation also suggested that the use of polyoxymethylene as the calibration material for the measurement of pure protein may introduce an error of -11.0%. However, the tissue decomposition experiments, which employed a calibration phantom made out of water, oil, and polyoxymethylene, exhibited strong correlation with data from the chemical analysis. The average root-mean-square percentage error for water, lipid, and protein contents was 3.58% as compared with chemical analysis. Conclusions: The results of this study suggest that the water, lipid, and protein contents can be accurately measured using dual kVp cone-beam CT. The tissue compositional information may improve the sensitivity and specificity for breast cancer diagnosis.
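Per-voxel image-based three-material decomposition of the kind described can be posed as a 3×3 linear system: the low- and high-kVp attenuation measurements plus the constraint that the volume fractions sum to one. A hedged sketch with made-up calibration coefficients (the study's actual calibration values are not given in the abstract):

```python
import numpy as np

# Rows: low-kVp attenuation, high-kVp attenuation, volume conservation.
# Columns: water, lipid, protein. All coefficients below are
# hypothetical placeholders, not the study's measured calibration.
A = np.array([
    [0.227, 0.205, 0.250],  # mu at 50 kVp (1/cm), hypothetical
    [0.161, 0.148, 0.172],  # mu at 120 kVp (1/cm), hypothetical
    [1.0,   1.0,   1.0],    # volume fractions sum to 1
])

def decompose(mu_low, mu_high):
    """Return (water, lipid, protein) volume fractions for one voxel."""
    b = np.array([mu_low, mu_high, 1.0])
    return np.linalg.solve(A, b)

# Usage: a synthetic voxel that is 60% water, 30% lipid, 10% protein.
mu_low, mu_high, _ = A @ np.array([0.6, 0.3, 0.1])
fractions = decompose(mu_low, mu_high)
```

In practice the two attenuation rows come from imaging a calibration phantom (here, the water/oil/polyoxymethylene phantom the study describes), and the conditioning of this small system drives the decomposition noise.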

  19. Measurement of breast tissue composition with dual energy cone-beam computed tomography: A postmortem study

    PubMed Central

    Ding, Huanjun; Ducote, Justin L.; Molloi, Sabee

    2013-01-01

    Purpose: To investigate the feasibility of a three-material compositional measurement of water, lipid, and protein content of breast tissue with dual kVp cone-beam computed tomography (CT) for diagnostic purposes. Methods: Simulations were performed on a flat panel-based computed tomography system with a dual kVp technique in order to guide the selection of experimental acquisition parameters. The expected errors induced by using the proposed calibration materials were also estimated by simulation. Twenty pairs of postmortem breast samples were imaged with a flat-panel based dual kVp cone-beam CT system, followed by image-based material decomposition using calibration data obtained from a three-material phantom consisting of water, vegetable oil, and polyoxymethylene plastic. The tissue samples were then chemically decomposed into their respective water, lipid, and protein contents after imaging to allow direct comparison with data from dual energy decomposition. Results: Guided by results from simulation, the beam energies for the dual kVp cone-beam CT system were selected to be 50 and 120 kVp with the mean glandular dose divided equally between each exposure. The simulation also suggested that the use of polyoxymethylene as the calibration material for the measurement of pure protein may introduce an error of −11.0%. However, the tissue decomposition experiments, which employed a calibration phantom made out of water, oil, and polyoxymethylene, exhibited strong correlation with data from the chemical analysis. The average root-mean-square percentage error for water, lipid, and protein contents was 3.58% as compared with chemical analysis. Conclusions: The results of this study suggest that the water, lipid, and protein contents can be accurately measured using dual kVp cone-beam CT. The tissue compositional information may improve the sensitivity and specificity for breast cancer diagnosis. PMID:23718593

  20. Computed tomography: a powerful imaging technique in the fields of dimensional metrology and quality control

    NASA Astrophysics Data System (ADS)

    Probst, Gabriel; Boeckmans, Bart; Dewulf, Wim; Kruth, Jean-Pierre

    2016-05-01

    X-ray computed tomography (CT) is slowly conquering its space in the manufacturing industry for dimensional metrology and quality control purposes. Its main advantage is its non-invasive and non-destructive character. Currently, CT is the only measurement technique that allows full 3D visualization of both inner and outer features of an object through a contactless probing system. Using hundreds of radiographs acquired while rotating the object, a 3D representation is generated and dimensions can be verified. In this research, this non-contact technique was used for the inspection of assembled components: a dental cast model with 8 implants connected by a screw-retained titanium bar. The bar includes a mating interface connection that should ensure a perfect fit, without residual stresses, when the connection is fixed with screws. CT was used to inspect the mating interfaces between these two components. Gaps at the connections can lead to bacterial growth and potential inconvenience for the patient, who would have to face a new surgery to replace his or her prosthesis. With the aid of CT, flaws in the design or manufacturing process that could lead to gaps at the connections could be assessed.

  1. Image-Guided Drug Delivery with Single-Photon Emission Computed Tomography: A Review of Literature

    PubMed Central

    Chakravarty, Rubel; Hong, Hao; Cai, Weibo

    2014-01-01

    Tremendous resources are being invested all over the world in the prevention, diagnosis, and treatment of various types of cancer. Successful cancer management depends on accurate diagnosis of the disease along with a precise therapeutic protocol. Conventional systemic drug delivery approaches generally cannot completely remove the competent cancer cells without surpassing the toxicity limits for normal tissues. Therefore, the development of efficient drug delivery systems holds prime importance in medicine and healthcare. Also, molecular imaging can play an increasingly important and revolutionizing role in disease management. The synergistic use of molecular imaging and targeted drug delivery approaches provides unique opportunities in a relatively new area called 'image-guided drug delivery' (IGDD). Single-photon emission computed tomography (SPECT) is the most widely used nuclear imaging modality in the clinical context and is increasingly being used to guide targeted therapeutics. Innovations in material science have fueled the development of efficient drug carriers based on polymers, liposomes, micelles, dendrimers, microparticles, nanoparticles, etc. Efficient utilization of these drug carriers along with SPECT imaging technology has the potential to transform patient care by personalizing therapy to the individual patient, lessening the invasiveness of conventional treatment procedures, and rapidly monitoring therapeutic efficacy. SPECT-IGDD is not only effective for the treatment of cancer but might also find utility in the management of several other diseases. Herein, we provide a concise overview of the latest advances in SPECT-IGDD procedures and discuss the challenges and opportunities for advancement of the field. PMID:25182469

  2. Cone beam x-ray luminescence computed tomography: A feasibility study

    SciTech Connect

    Chen Dongmei; Zhu Shouping; Yi Huangjian; Zhang Xianghan; Chen Duofang; Liang Jimin; Tian Jie

    2013-03-15

    Purpose: The appearance of x-ray luminescence computed tomography (XLCT) opens new possibilities to perform molecular imaging by x ray. In the previous XLCT system, the sample was irradiated by a sequence of narrow x-ray beams and the x-ray luminescence was measured by a highly sensitive charge-coupled device (CCD) camera. This resulted in a relatively long sampling time and relatively low utilization of the x-ray beam. In this paper, a novel cone beam x-ray luminescence computed tomography strategy is proposed, which can fully utilize the x-ray dose and shorten the scanning time. The imaging model and reconstruction method are described, and the validity of the imaging strategy is studied. Methods: In the cone beam XLCT system, a cone beam x ray was adopted to illuminate the sample and a highly sensitive CCD camera was utilized to acquire luminescent photons emitted from the sample. Photon scattering in biological tissues makes reconstructing the 3D distribution of the x-ray luminescent sample in cone beam XLCT an ill-posed problem. To overcome this issue, the authors used the diffusion approximation model to describe photon propagation in tissues and employed the sparse regularization method for reconstruction. An incomplete variables truncated conjugate gradient method and a permissible region strategy were used for reconstruction. Meanwhile, traditional x-ray CT imaging could also be performed in this system. The x-ray attenuation effect has been considered in the imaging model, which is helpful in improving the reconstruction accuracy. Results: First, simulation experiments with cylinder phantoms were carried out to illustrate the validity of the proposed compensated method. The experimental results showed that the location error of the compensated algorithm was smaller than that of the uncompensated method. The permissible region strategy was applied and reduced the reconstruction error to less than 2 mm. The robustness

  3. The 100 most-cited original articles in cardiac computed tomography: A bibliometric analysis.

    PubMed

    O'Keeffe, Michael E; Hanna, Tarek N; Holmes, Davis; Marais, Olivia; Mohammed, Mohammed F; Clark, Sheldon; McLaughlin, Patrick; Nicolaou, Savvas; Khosa, Faisal

    2016-01-01

    Bibliometric analysis is the application of statistical methods to quantitative data about scientific publications. It can evaluate research performance, author productivity, and manuscript impact. To the best of our knowledge, no bibliometric analysis has focused on cardiac computed tomography (CT). The purpose of this paper was to compile a list of the 100 most-cited articles in the cardiac CT literature using Scopus and Web of Science (WOS). A list of the 100 most-cited articles was compiled in order of citation frequency, as well as a list of the top 10 most-cited guideline and review articles and the 20 most-cited articles of the years 2014-2015. The database of the 100 most-cited articles was analyzed to identify characteristics of highly cited publications. For each manuscript, the number of authors, study design, size of patient cohort, and departmental affiliations were cataloged. The 100 most-cited articles were published from 1990 to 2012, with the majority (53) published between 2005 and 2009. The total number of citations varied from 196 to 3354, and the number of citations per year varied from 9.5 to 129.0, with a median and mean of 30.9 and 38.7, respectively. The majority of publications had a sample size of 200 patients or fewer. The USA and Germany were the nations with the highest number of frequently cited publications. This bibliometric analysis provides insight into the most-cited articles published on the subject of cardiac CT and calcium volume, thus helping to characterize the field and guide future research.
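
    The per-article statistics reported above (citations per year, with their median and mean) can be sketched in a few lines. The records below are invented for illustration and are not the study's data:

```python
# Hypothetical sketch of the citation-rate statistics reported in this kind
# of bibliometric analysis. The records below are invented examples.
from statistics import mean, median

ANALYSIS_YEAR = 2016  # year the citation counts were collected

articles = [
    {"title": "A", "year": 1998, "citations": 3354},
    {"title": "B", "year": 2005, "citations": 1200},
    {"title": "C", "year": 2009, "citations": 640},
    {"title": "D", "year": 2012, "citations": 196},
]

# Citations per year = total citations / years since publication.
rates = [a["citations"] / (ANALYSIS_YEAR - a["year"]) for a in articles]

print(f"median: {median(rates):.1f}, mean: {mean(rates):.1f}")
```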

  4. Puffed-cheek computed tomography: a dynamic maneuver for imaging oral cavity tumors.

    PubMed

    Erdogan, Nezahat; Bulbul, Erdogan; Songu, Murat; Uluc, Engin; Onal, Kazim; Apaydin, Melda; Katilmis, Huseyin

    2012-09-01

    We conducted a prospective study to compare the effectiveness of conventional computed tomography (CT) and puffed-cheek CT in detecting the presence and extension of oral cavity malignant tumors. We enrolled 11 patients--5 men and 6 women, aged 32 to 85 years--who had a primary squamous cell carcinoma of the oral cavity. These tumors were located in the floor of the mouth in 4 patients, in the buccal mucosa in 4, in both the buccal mucosa and retromolar trigone in 2, and in the retromolar trigone only in 1. First, conventional contrast-enhanced axial CT was obtained through the oral cavity and neck in each patient. Next, axial imaging was obtained through the oral cavity while patients inflated their cheeks, pursed their lips, and held their breath. We found that the puffed-cheek CTs provided more information regarding the size and extent of the squamous cell carcinomas than did the conventional CTs. For example, in 8 patients, conventional CT could not differentiate the tumor from the normal mucosal surface, but puffed-cheek images clearly showed the surface of the tumor as distinct from the normal mucosa. More disconcerting was the fact that in the other 3 patients, conventional CTs were evaluated as normal, even though puffed-cheek imaging clearly showed the mass in each case. We conclude that puffed-cheek CT is superior to conventional CT for evaluating the mucosal surfaces of the oral cavity. It provides a clearer and more detailed picture with no downside. PMID:22996710

  5. Respiratory triggered 4D cone-beam computed tomography: A novel method to reduce imaging dose

    PubMed Central

    Cooper, Benjamin J.; O’Brien, Ricky T.; Balik, Salim; Hugo, Geoffrey D.; Keall, Paul J.

    2013-01-01

    Purpose: A novel method called respiratory triggered 4D cone-beam computed tomography (RT 4D CBCT) is described whereby imaging dose can be reduced without degrading image quality. RT 4D CBCT utilizes a respiratory signal to trigger projections such that only a single projection is assigned to a given respiratory bin for each breathing cycle. In contrast, commercial 4D CBCT does not actively use the respiratory signal to minimize imaging dose. Methods: To compare RT 4D CBCT with conventional 4D CBCT, 3600 CBCT projections of a thorax phantom were gathered and reconstructed to generate a ground truth CBCT dataset. Simulation pairs of conventional 4D CBCT acquisitions and RT 4D CBCT acquisitions were developed assuming a sinusoidal respiratory signal that governs the selection of projections from the pool of 3600 original projections. The RT 4D CBCT acquisition triggers a single projection when the respiratory signal enters a desired acquisition bin; the conventional acquisition does not use a respiratory trigger, and projections are acquired at a constant frequency. Acquisition parameters studied were breathing period, acquisition time, and imager frequency. The performance of RT 4D CBCT using phase-based and displacement-based sorting was also studied. Image quality was quantified by calculating difference images between the test dataset and the ground truth dataset. Imaging dose was calculated by counting projections. Results: Using phase-based sorting, RT 4D CBCT results in 47% less imaging dose on average compared to conventional 4D CBCT, with image quality differences of less than 4% at worst. Using displacement-based sorting, RT 4D CBCT results in 57% less imaging dose on average than conventional 4D CBCT methods; however, image quality was 26% worse with RT 4D CBCT. Conclusions: Simulation studies have shown that RT 4D CBCT reduces imaging dose while maintaining comparable image quality for phase-based 4D CBCT; image quality is degraded for displacement-based RT 4D
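
    The dose argument above reduces to counting projections. A minimal sketch of that comparison, under assumed acquisition parameters (the study varied breathing period, acquisition time, and imager frequency):

```python
# Illustrative dose-counting comparison (not the authors' code): a conventional
# 4D CBCT scan keeps every projection acquired at a fixed imager frequency,
# while RT 4D CBCT triggers only one projection per respiratory bin per
# breathing cycle. All parameter values are assumed.
PERIOD_S = 4.0       # breathing period
SCAN_TIME_S = 240.0  # total acquisition time
IMAGER_HZ = 5.0      # conventional projection frequency
N_BINS = 10          # respiratory bins

conventional = int(SCAN_TIME_S * IMAGER_HZ)    # every frame contributes dose

cycles = int(SCAN_TIME_S / PERIOD_S)
rt_4d = cycles * N_BINS                        # one trigger per bin per cycle

saving = 1.0 - rt_4d / conventional
print(conventional, rt_4d, f"{saving:.0%}")
```

    With these assumed settings, triggering halves the projection count, the same order of saving as the 47% and 57% reported above.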

  6. Computed tomography--a possible aid in the diagnosis of smoke inhalation injury?

    PubMed

    Reske, A; Bak, Z; Samuelsson, A; Morales, O; Seiwerts, M; Sjöberg, F

    2005-02-01

    Inhalation injury is an important contributor to morbidity and mortality in burn victims and can trigger acute lung injury and acute respiratory distress syndrome (ARDS) (1-3). Early diagnosis and treatment of inhalation injury are important, but a major problem in planning treatment and evaluating the prognosis has been the lack of consensus about diagnostic criteria (4). Chest radiographs on admission are often non-specific (5, 6), but indicators include indoor fires, facial burns, bronchoscopic findings of soot in the airways, and detection of carbon monoxide or cyanide in the blood (7). Changes in the lungs may be detected by bronchoscopy with biopsy, xenon imaging, or measurement of pulmonary extracellular fluid (4, 5, 8). These methods have, however, been associated with low sensitivity and specificity, as exemplified by the 50% predictive value in the study of Masanes et al. (8). Computed tomography (CT) is better than plain chest radiography at detecting other pulmonary lesions such as pulmonary contusion (9, 10). The importance of CT scans in patients with ARDS has been reviewed recently (9), but unfortunately there is no reported experience with CT in patients with smoke inhalation injury. To our knowledge, there are only two animal studies reporting that smoke inhalation injury can be detected by CT (4, 11); specific changes in human CT scans have not yet been described. Therefore, confronted with a patient with severe respiratory failure after a burn whose history and physical examination showed the classic risk factors for inhalation injury, we decided to request a CT. PMID:15715631

  7. Segmentation and quantification of materials with energy discriminating computed tomography: A phantom study

    PubMed Central

    Le, Huy Q.; Molloi, Sabee

    2011-01-01

    Purpose: To experimentally investigate whether a computed tomography (CT) system based on CdZnTe (CZT) detectors in conjunction with a least-squares parameter estimation technique can be used to decompose four different materials. Methods: The material decomposition process was divided into a segmentation task and a quantification task. A least-squares minimization algorithm was used to decompose materials with five measurements of the energy dependent linear attenuation coefficients. A small field-of-view energy discriminating CT system was built. The CT system consisted of an x-ray tube, a rotational stage, and an array of CZT detectors. The CZT array was composed of 64 pixels, each of which is 0.8×0.8×3 mm. Images were acquired at 80 kVp in fluoroscopic mode at 50 ms per frame. The detector resolved the x-ray spectrum into energy bins of 22–32, 33–39, 40–46, 47–56, and 57–80 keV. Four phantoms were constructed from polymethylmethacrylate (PMMA), polyethylene, polyoxymethylene, hydroxyapatite, and iodine. Three phantoms were composed of three materials with embedded hydroxyapatite (50, 150, 250, and 350 mg∕ml) and iodine (4, 8, 12, and 16 mg∕ml) contrast elements. One phantom was composed of four materials with embedded hydroxyapatite (150 and 350 mg∕ml) and iodine (8 and 16 mg∕ml). Calibrations consisted of PMMA phantoms with either hydroxyapatite (100, 200, 300, 400, and 500 mg∕ml) or iodine (5, 15, 25, 35, and 45 mg∕ml) embedded. Filtered backprojection and a ramp filter were used to reconstruct images from each energy bin. Material segmentation and quantification were performed and compared between different phantoms. Results: All phantoms were decomposed accurately, but some voxels in the base material regions were incorrectly identified. Average quantification errors of hydroxyapatite∕iodine were 9.26∕7.13%, 7.73∕5.58%, and 12.93∕8.23% for the three-material PMMA, polyethylene, and polyoxymethylene phantoms, respectively. The
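
    The least-squares step described in the Methods can be sketched as follows; the basis-attenuation numbers are invented for illustration and are not the paper's calibration data:

```python
# Minimal sketch of the least-squares decomposition step: attenuation measured
# in five energy bins is expressed as a linear combination of known
# basis-material attenuation values. The coefficients below are invented.
import numpy as np

# Rows: 5 energy bins; columns: basis materials (e.g., PMMA, hydroxyapatite, iodine).
A = np.array([
    [0.23, 1.80, 9.00],
    [0.21, 1.10, 5.50],
    [0.20, 0.75, 3.60],
    [0.19, 0.50, 2.30],
    [0.18, 0.30, 1.20],
])

# Simulated voxel: mostly PMMA with small hydroxyapatite and iodine fractions.
true_x = np.array([1.0, 0.15, 0.02])
measured = A @ true_x

# Overdetermined system (5 equations, 3 unknowns): least-squares solution.
est, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(np.round(est, 3))
```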

  8. Preliminary Phase Field Computational Model Development

    SciTech Connect

    Li, Yulan; Hu, Shenyang Y.; Xu, Ke; Suter, Jonathan D.; McCloy, John S.; Johnson, Bradley R.; Ramuhalli, Pradeep

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using monocrystalline Fe (i.e., ferrite) films as a model system to develop and validate initial models, followed by polycrystalline Fe films, and then by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of systems large enough that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof of concept to investigate major-loop effects of single versus polycrystalline bulk iron and the effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single-crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in
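
    As a hedged illustration of the modeling basis named above, here is a single-macrospin integration of the Landau-Lifshitz-Gilbert equation in reduced units with assumed parameters; the report's actual models solve this equation over full microstructures:

```python
# Illustrative single-macrospin integration of the Landau-Lifshitz-Gilbert
# equation, in Landau-Lifshitz form:
#   dm/dt = -gamma * m x H - alpha * gamma * m x (m x H).
# Explicit Euler with renormalization; reduced units, parameters assumed.
import numpy as np

gamma = 1.0                       # gyromagnetic ratio (reduced units)
alpha = 0.1                       # damping constant
H = np.array([0.0, 0.0, 1.0])     # applied field along z

m = np.array([1.0, 0.0, 0.0])     # initial magnetization along x
dt = 0.01

for _ in range(20000):
    m_x_H = np.cross(m, H)
    dm = -gamma * m_x_H - alpha * gamma * np.cross(m, m_x_H)
    m = m + dt * dm
    m = m / np.linalg.norm(m)     # keep |m| = 1

# Damping relaxes the moment toward the applied field direction.
print(np.round(m, 3))
```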

  9. Modeling the complete Otto cycle: Preliminary version. [computer programming

    NASA Technical Reports Server (NTRS)

    Zeleznik, F. J.; Mcbride, B. J.

    1977-01-01

    A description is given of the equations and the computer program being developed to model the complete Otto cycle. The program incorporates such important features as: (1) heat transfer, (2) finite combustion rates, (3) complete chemical kinetics in the burned gas, (4) exhaust gas recirculation, and (5) manifold vacuum or supercharging. Changes in thermodynamic, kinetic and transport data as well as model parameters can be made without reprogramming. Preliminary calculations indicate that: (1) chemistry and heat transfer significantly affect composition and performance, (2) there seems to be a strong interaction among model parameters, and (3) a number of cycles must be calculated in order to obtain steady-state conditions.
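
    For orientation, the ideal air-standard Otto cycle that such a complete model generalizes has a closed-form efficiency; a minimal sketch with standard textbook values (not parameters from this program):

```python
# The program described above models the full cycle with kinetics and heat
# transfer; as a baseline, the ideal air-standard Otto cycle has a closed-form
# thermal efficiency, eta = 1 - r**(1 - gamma), which a complete simulation
# should recover in the ideal limit.
GAMMA = 1.4  # ratio of specific heats for air

def otto_efficiency(r: float, gamma: float = GAMMA) -> float:
    """Ideal air-standard Otto-cycle efficiency for compression ratio r."""
    return 1.0 - r ** (1.0 - gamma)

print(f"{otto_efficiency(8.0):.3f}")  # a compression ratio of 8 gives ~0.565
```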

  10. Emission Computed Tomography: A New Technique for the Quantitative Physiologic Study of Brain and Heart in Vivo

    DOE R&D Accomplishments Database

    Phelps, M. E.; Hoffman, E. J.; Huang, S. C.; Schelbert, H. R.; Kuhl, D. E.

    1978-01-01

    Emission computed tomography can provide a quantitative in vivo measurement of regional tissue radionuclide tracer concentrations. This capability, when combined with physiologic models and radioactively labeled physiologic tracers that behave in a predictable manner, allows measurement of a wide variety of physiologic variables. This integrated technique has been referred to as Physiologic Tomography (PT). PT requires labeled compounds which trace physiologic processes in a known and predictable manner, and physiologic models which are appropriately formulated and validated to derive physiologic variables from ECT data. In order to effectively achieve this goal, PT requires an ECT system that is capable of performing truly quantitative or analytical measurements of tissue tracer concentrations and which has been well characterized in terms of spatial resolution, sensitivity, and signal-to-noise ratios in the tomographic image. This paper illustrates the capabilities of emission computed tomography and provides examples of physiologic tomography for the regional measurement of cerebral and myocardial metabolic rate for glucose, regional measurement of cerebral blood volume, gated cardiac blood pools, and capillary perfusion in brain and heart. Studies on patients with stroke and myocardial ischemia are also presented.

  11. Measurement of the Interantral Bone in Implant Dentistry Using Panoramic Radiography and Cone Beam Computed Tomography: A Human Radiographic Study

    PubMed Central

    Kopecka, D; Simunek, A; Streblov, J; Slezak, R; Capek, L

    2014-01-01

    ABSTRACT Objective: To analyse the dimensions of interantral bone available for dental implant placement in the fully edentulous maxilla. Methods: Interantral bone height (IBH) was measured using panoramic radiography and computed tomography (CT). Interantral bone width (IBW) was measured by means of CT. Results: The difference between the two imaging methods in IBH assessment was highly statistically significant (p < 0.001) in the canine area, whereas in the other areas it was not significant. Measured on the CT scans, bone is significantly higher in the canine area than in the area of the central and lateral incisors (p < 0.001). Significant variations in IBW were found in all three locations: bone in the central incisor area is the widest, and bone in the lateral incisor area is the narrowest (p < 0.001). Conclusion: Panoramic radiography is a sufficiently accurate method for IBH imaging in the incisor area, but not in the canine area. PMID:25781290

  12. Clival lesion incidentally discovered on cone-beam computed tomography: A case report and review of the literature

    PubMed Central

    Tadinada, Aditya; Rengasamy, Kandasamy; Fellows, Douglas; Lurie, Alan G.

    2014-01-01

    An osteolytic lesion with a small central area of mineralization and sclerotic borders was discovered incidentally in the clivus on the cone-beam computed tomography (CBCT) of a 27-year-old male patient. This benign appearance indicated a primary differential diagnosis of non-aggressive lesions such as fibro-osseous lesions and arrested pneumatization. Further, on magnetic resonance imaging (MRI), the lesion showed a homogeneously low T1 signal intensity with mild internal enhancement on post-gadolinium images and a heterogeneous T2 signal intensity. These signal characteristics might be attributed to the fibrous tissues, chondroid matrix, calcific material, or cystic component of the lesion; thus, chondroblastoma and chondromyxoid fibroma were added to the differential diagnosis. Although this report was limited by the lack of a final diagnosis and the patient being lost to follow-up, this incidental skull base finding underscores the importance of having the entire volume of a CBCT scan interpreted by a qualified oral and maxillofacial radiologist. PMID:24944968

  13. Clival lesion incidentally discovered on cone-beam computed tomography: A case report and review of the literature.

    PubMed

    Jadhav, Aniket B; Tadinada, Aditya; Rengasamy, Kandasamy; Fellows, Douglas; Lurie, Alan G

    2014-06-01

    An osteolytic lesion with a small central area of mineralization and sclerotic borders was discovered incidentally in the clivus on the cone-beam computed tomography (CBCT) of a 27-year-old male patient. This benign appearance indicated a primary differential diagnosis of non-aggressive lesions such as fibro-osseous lesions and arrested pneumatization. Further, on magnetic resonance imaging (MRI), the lesion showed a homogeneously low T1 signal intensity with mild internal enhancement on post-gadolinium images and a heterogeneous T2 signal intensity. These signal characteristics might be attributed to the fibrous tissues, chondroid matrix, calcific material, or cystic component of the lesion; thus, chondroblastoma and chondromyxoid fibroma were added to the differential diagnosis. Although this report was limited by the lack of a final diagnosis and the patient being lost to follow-up, this incidental skull base finding underscores the importance of having the entire volume of a CBCT scan interpreted by a qualified oral and maxillofacial radiologist. PMID:24944968

  14. Radiological patterns of primary graft dysfunction after lung transplantation evaluated by 64-multi-slice computed tomography: a descriptive study.

    PubMed

    Belmaati, Esther Okeke; Steffensen, Ida; Jensen, Claus; Kofoed, Klaus F; Mortensen, Jann; Nielsen, Michael B; Iversen, Martin

    2012-06-01

    We evaluated the diagnostic value of high-resolution computed tomography (HRCT) images generated from 64-detector multi-slice CT scanners (HRCT(64-MSCT) imaging) in relation to primary graft dysfunction (PGD) after lung transplantation (LUTX) in a pilot study. PGD has mortality rates ranging from 17 to 50% over a 90-day period. Detailed HRCT lung images, reconstructed using 64-MSCT, may aid diagnostic and therapeutic efforts in PGD. Thirty-two patients were scanned four times within a year post-LUTX, in a single-centre prospective study. HRCT lung images were reviewed, evaluated, and scored by two observers for ground-glass (GG) opacities, consolidation, septal thickening (ST), and pulmonary embolism. Image and PGD scores were compared in each patient. GG and consolidation changes were largely present up until 2 weeks post-LUTX and were markedly reduced by the 12th week. ST was predominantly found in patients with PGD. There were no vascular changes found on CT angiography. The most severe cases of GG opacities and consolidation were found in patients with PGD. ST seems to be an important indicator of PGD. HRCT(64-MSCT) imaging may be a useful tool for the identification of pathological features of PGD not detected by classical evaluation in patients undergoing LUTX. PMID:22378316

  15. Evaluation of stability after pre-orthodontic orthognathic surgery using cone-beam computed tomography: A comparison with conventional treatment

    PubMed Central

    Ann, Hye-Rim; Jung, Young-Soo; Lee, Kee-Joon

    2016-01-01

    Objective The aim of this study was to evaluate the skeletal and dental changes after intraoral vertical ramus osteotomy (IVRO) with and without presurgical orthodontics by using cone-beam computed tomography (CBCT). Methods This retrospective cohort study included 24 patients (mean age, 22.1 years) with skeletal Class III malocclusion who underwent bimaxillary surgery with IVRO. The patients were divided into the preorthodontic orthognathic surgery (POGS) group (n = 12) and conventional surgery (CS) group (n = 12). CBCT images acquired preoperatively, 1 month after surgery, and 1 year after surgery were analyzed to compare the intergroup differences in postoperative three-dimensional movements of the maxillary and mandibular landmarks and the changes in lateral cephalometric variables. Results Baseline demographics (sex and age) were similar between the two groups (6 men and 6 women in each group). During the postsurgical period, the POGS group showed more significant upward movement of the mandible (p < 0.05) than did the CS group. Neither group showed significant transverse movement of any of the skeletal landmarks. Moreover, none of the dental and skeletal variables showed significant intergroup differences 1 year after surgery. Conclusions Compared with CS, POGS with IVRO resulted in significantly different postsurgical skeletal movement in the mandible. Although both groups showed similar skeletal and dental outcomes at 1 year after surgery, upward movement of the mandible during the postsurgical period should be considered to ensure a more reliable outcome after POGS. PMID:27668193

  16. Evaluation of stability after pre-orthodontic orthognathic surgery using cone-beam computed tomography: A comparison with conventional treatment

    PubMed Central

    Ann, Hye-Rim; Jung, Young-Soo; Lee, Kee-Joon

    2016-01-01

    Objective The aim of this study was to evaluate the skeletal and dental changes after intraoral vertical ramus osteotomy (IVRO) with and without presurgical orthodontics by using cone-beam computed tomography (CBCT). Methods This retrospective cohort study included 24 patients (mean age, 22.1 years) with skeletal Class III malocclusion who underwent bimaxillary surgery with IVRO. The patients were divided into the preorthodontic orthognathic surgery (POGS) group (n = 12) and conventional surgery (CS) group (n = 12). CBCT images acquired preoperatively, 1 month after surgery, and 1 year after surgery were analyzed to compare the intergroup differences in postoperative three-dimensional movements of the maxillary and mandibular landmarks and the changes in lateral cephalometric variables. Results Baseline demographics (sex and age) were similar between the two groups (6 men and 6 women in each group). During the postsurgical period, the POGS group showed more significant upward movement of the mandible (p < 0.05) than did the CS group. Neither group showed significant transverse movement of any of the skeletal landmarks. Moreover, none of the dental and skeletal variables showed significant intergroup differences 1 year after surgery. Conclusions Compared with CS, POGS with IVRO resulted in significantly different postsurgical skeletal movement in the mandible. Although both groups showed similar skeletal and dental outcomes at 1 year after surgery, upward movement of the mandible during the postsurgical period should be considered to ensure a more reliable outcome after POGS.

  17. Risk to fragmented DNA in dry, wet, and frozen states from computed tomography: a comparative theoretical study.

    PubMed

    Wanek, Johann; Rühli, Frank Jakobus

    2016-05-01

    Computed tomography represents the gold standard in forensic and palaeopathological diagnosis. However, the X-rays used may affect DNA quality through fragmentation and loss of genetic information. Previous work showed that the effects of ionizing radiation on dry DNA are non-significant with P < 10^-8, which cannot be detected by means of polymerase chain reaction methods. In the present paper, a complete analytical model that characterizes radiation effects on fragmented DNA in dry, wet, and frozen states is described. Simulation of radiation tracks in water phantom cells was performed using the Geant4-DNA toolkit. Cell hits by electrons with energies between 5 and 20 keV were simulated, and the formation of radiolytic products was assessed at a temperature of 298 K. The diffusion coefficient and the mean square displacement of reactive species were calculated by Stokes-Einstein-Smoluchowski relations at 273 K. Finally, DNA fragment damage was estimated using the density distribution of fragments calculated from atomic force microscopy images. The lowest probability of radiation-induced DNA damage was observed for the dry state, with a range from 2.5 × 10^-9 to 7.8 × 10^-12 at 298 K, followed by that for the frozen state, with a range from 0.9 to 4 × 10^-7 at 273 K. The highest probability of radiation-induced DNA damage was demonstrated for fragmented DNA in the wet state, with a range from 2 to 9 × 10^-7 at 298 K. These results significantly improve the interpretation of CT imaging in future studies in forensic and palaeopathological science.
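
    The Stokes-Einstein step mentioned above can be illustrated directly; the radical radius and water viscosity below are typical literature values, not values taken from the paper:

```python
# Hedged sketch of the Stokes-Einstein relation used to obtain diffusion
# coefficients of reactive species: D = k_B * T / (6 * pi * eta * r), with the
# 3D mean square displacement <x^2> = 6 * D * t. Radius and viscosity are
# assumed typical values, not the paper's inputs.
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 298.0            # temperature, K
ETA = 8.9e-4         # viscosity of water at 298 K, Pa*s
R = 0.22e-9          # assumed hydrodynamic radius of an OH radical, m

D = K_B * T / (6.0 * math.pi * ETA * R)   # diffusion coefficient, m^2/s
msd_1ns = 6.0 * D * 1e-9                  # mean square displacement over 1 ns

print(f"D = {D:.2e} m^2/s, rms displacement over 1 ns = "
      f"{math.sqrt(msd_1ns) * 1e9:.2f} nm")
```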

  18. Little impact of tsunami-stricken nuclear accident on awareness of radiation dose of cardiac computed tomography: A questionnaire study

    PubMed Central

    2013-01-01

    Background With the increased use of cardiac computed tomography (CT), radiation dose remains a major issue, although physicians are trying to reduce the substantial risks associated with use of this diagnostic tool. This study was performed to investigate recognition of the level of radiation exposure from cardiac CT and the differences in the level of awareness of radiation before and after the Fukushima nuclear plant accident. Methods We asked 30 physicians who were undergoing training in internal medicine to determine the equivalent doses of radiation for common radiological examinations when a normal chest X-ray is accepted as one unit; questions about the absolute radiation dose of cardiac CT data were also asked. Results According to the results, 86.6% of respondents believed the exposure to be 1 mSv at most, and 93.3% thought that the exposure was less than that of 100 chest X-rays. This finding indicates that their perceptions were far lower than the actual amounts. Even after the occurrence of such a large nuclear disaster in Fukushima, there were no significant differences in the same subjects’ overall awareness of radiation amounts. Conclusions Even after such a major social issue as the Fukushima nuclear accident, the level of awareness of the accurate radiation amount used in 64-channel multidetector CT (MDCT) by clinical physicians who order this test was not satisfactory. Thus, there is a need for the development of effective continuing education programs to improve awareness of radiation from ionizing radiation devices, including cardiac CT, and emphasis on risk-benefit evaluation based on accurate knowledge during medical training. PMID:23631688

  19. Sensitivity analysis for liver iron measurement through neutron stimulated emission computed tomography: a Monte Carlo study in GEANT4.

    PubMed

    Agasthya, G A; Harrawood, B C; Shah, J P; Kapadia, A J

    2012-01-01

    Neutron stimulated emission computed tomography (NSECT) is being developed as a non-invasive imaging modality to detect and quantify iron overload in the human liver. NSECT uses gamma photons emitted by the inelastic interaction between monochromatic fast neutrons and iron nuclei in the body to detect and quantify the disease. Previous simulated and physical experiments with phantoms have shown that NSECT has the potential to accurately diagnose iron overload with reasonable levels of radiation dose. In this work, we describe the results of a simulation study conducted to determine the sensitivity of the NSECT system for hepatic iron quantification in patients of different sizes. A GEANT4 simulation of the NSECT system was developed with a human liver and two torso sizes corresponding to small and large patients. The iron concentration in the liver ranged between 0.5 and 20 mg g^-1, corresponding to clinically reported iron levels in iron-overloaded patients. High-purity germanium gamma detectors were simulated to detect the emitted gamma spectra, which were background corrected using suitable water phantoms and analyzed to determine the minimum detectable level (MDL) of iron and the sensitivity of the NSECT system. These analyses indicate that for a small patient (torso major axis = 30 cm) the MDL is 0.5 mg g^-1 and the sensitivity is ∼13 ± 2 Fe counts/mg/mSv, and for a large patient (torso major axis = 40 cm) the values are 1 mg g^-1 and ∼5 ± 1 Fe counts/mg/mSv, respectively. The results demonstrate that the MDL for both patient sizes lies within the clinically significant range for human iron overload.

  20. Sensitivity analysis for liver iron measurement through neutron stimulated emission computed tomography: a Monte Carlo study in GEANT4

    NASA Astrophysics Data System (ADS)

    Agasthya, G. A.; Harrawood, B. C.; Shah, J. P.; Kapadia, A. J.

    2012-01-01

    Neutron stimulated emission computed tomography (NSECT) is being developed as a non-invasive imaging modality to detect and quantify iron overload in the human liver. NSECT uses gamma photons emitted by the inelastic interaction between monochromatic fast neutrons and iron nuclei in the body to detect and quantify the disease. Previous simulated and physical experiments with phantoms have shown that NSECT has the potential to accurately diagnose iron overload with reasonable levels of radiation dose. In this work, we describe the results of a simulation study conducted to determine the sensitivity of the NSECT system for hepatic iron quantification in patients of different sizes. A GEANT4 simulation of the NSECT system was developed with a human liver and two torso sizes corresponding to small and large patients. The iron concentration in the liver ranged between 0.5 and 20 mg g^-1 (in this paper, all iron concentrations with units mg g^-1 refer to wet weight concentrations), corresponding to clinically reported iron levels in iron-overloaded patients. High-purity germanium gamma detectors were simulated to detect the emitted gamma spectra, which were background corrected using suitable water phantoms and analyzed to determine the minimum detectable level (MDL) of iron and the sensitivity of the NSECT system. These analyses indicate that for a small patient (torso major axis = 30 cm) the MDL is 0.5 mg g^-1 and the sensitivity is ∼13 ± 2 Fe counts/mg/mSv, and for a large patient (torso major axis = 40 cm) the values are 1 mg g^-1 and ∼5 ± 1 Fe counts/mg/mSv, respectively. The results demonstrate that the MDL for both patient sizes lies within the clinically significant range for human iron overload.

  1. Estimation of effective doses to adult and pediatric patients from multislice computed tomography: A method based on energy imparted

    SciTech Connect

    Theocharopoulos, Nicholas; Damilakis, John; Perisinakis, Kostas; Tzedakis, Antonis; Karantanas, Apostolos; Gourtsoyiannis, Nicholas

    2006-10-15

    The purpose of this study is to provide a method and required data for the estimation of effective dose (E) values to adult and pediatric patients from computed tomography (CT) scans of the head, chest, abdomen, and pelvis, performed on multi-slice scanners. Mean section radiation doses (d_m) to cylindrical water phantoms of varying radius, normalized over the CT dose index free-in-air (CTDI_F), were calculated for the head and body scanning modes of a multislice scanner with use of Monte Carlo techniques. Patients were modeled as equivalent water phantoms and the energy imparted (ε) to simulated pediatric and adult patients was calculated on the basis of measured CTDI_F values. Body-region-specific energy imparted to effective dose conversion coefficients (E/ε) for adult male and female patients were generated from previous data. Effective doses to patients aged newborn to adult were derived for all available helical and axial beam collimations, taking into account age-specific patient mass and scanning length. Depending on high voltage, body region, and patient sex, E/ε values ranged from 0.008 mSv/mJ for head scans to 0.024 mSv/mJ for chest scans. When scanned with the same technique factors as the adults, pediatric patients absorb as little as 5% of the energy imparted to adults, but corresponding effective dose values are up to a factor of 1.6 higher. On average, pediatric patients absorb 44% less energy per examination but have a 24% higher effective dose, compared with adults. In clinical practice, effective dose values to pediatric patients are 2.5 to 10 times lower than in adults due to the adaptation of tube current. A method is provided for the calculation of effective dose to adult and pediatric patients on the basis of individual patient characteristics such as sex, mass, dimensions, and density of imaged anatomy, and the technical features of modern multislice scanners. It allows the optimum selection of scanning
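Once the energy imparted ε and a conversion coefficient are known, the dose estimate itself is a single multiplication, E = (E/ε) × ε. A sketch using the two coefficient endpoints quoted above (the 250 mJ figure is a made-up example, not from the paper; coefficients for other regions, ages, and sexes would come from the paper's tables):

```python
# Conversion coefficients quoted in the abstract (mSv per mJ of energy
# imparted); only the two range endpoints are given there.
E_PER_EPSILON_MSV_PER_MJ = {"head": 0.008, "chest": 0.024}

def effective_dose_msv(region: str, energy_imparted_mj: float) -> float:
    """Effective dose E = (E/epsilon) * epsilon for a given body region."""
    return E_PER_EPSILON_MSV_PER_MJ[region] * energy_imparted_mj

# A hypothetical chest scan imparting 250 mJ: 0.024 * 250 gives about 6 mSv.
print(round(effective_dose_msv("chest", 250.0), 3))
```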

  2. A Preliminary Study of the Burgers Equation with Symbolic Computation

    NASA Astrophysics Data System (ADS)

    Derickson, Russell G.; Pielke, Roger A.

    2000-07-01

    A novel approach based on recursive symbolic computation is introduced for the approximate analytic solution of the Burgers equation. Once obtained, appropriate numerical values can be inserted into the symbolic solution to explore parametric variations. The solution is valid for both inviscid and viscous cases, covering the range of Reynolds number from 500 to infinity, whereas current direct numerical simulation (DNS) methods are limited to Reynolds numbers no greater than 4000. What further distinguishes the symbolic approach from numerical and traditional analytic techniques is the ability to reveal and examine direct nonlinear interactions between waves, including the interplay between inertia and viscosity. Thus, preliminary efforts suggest that symbolic computation may be quite effective in unveiling the “anatomy” of the myriad interactions that underlie turbulent behavior. However, due to the tendency of nonlinear symbolic operations to produce combinatorial explosion, future efforts will require the development of improved filtering processes to select and eliminate computations leading to negligible high order terms. Indeed, the initial symbolic computations present the character of turbulence as a problem in combinatorics. At present, results are limited in time evolution, but reveal the beginnings of the well-known “saw tooth” waveform that occurs in the inviscid case (i.e., Re=∞). Future efforts will explore more fully developed 1-D flows and investigate the potential to extend symbolic computations to 2-D and 3-D. Potential applications include the development of improved subgrid scale (SGS) parameterizations for large eddy simulation (LES) models, and studies that complement DNS in exploring fundamental aspects of turbulent flow behavior.
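The paper's recursive symbolic scheme is not reproduced here, but the kind of symbolic manipulation involved can be illustrated by verifying with sympy that the classical tanh travelling wave satisfies the viscous Burgers equation u_t + u u_x = ν u_xx exactly:

```python
import sympy as sp

x, t, nu, a, c = sp.symbols("x t nu a c", positive=True)

# Classical travelling-wave (shock-profile) solution of the viscous
# Burgers equation: amplitude a, propagation speed c.
u = c - a * sp.tanh(a * (x - c * t) / (2 * nu))

# Residual of u_t + u*u_x - nu*u_xx; it simplifies to zero identically.
residual = sp.diff(u, t) + u * sp.diff(u, x) - nu * sp.diff(u, x, 2)
print(sp.simplify(residual))
```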

  3. Digital characterization and preliminary computer modeling of hydrocarbon bearing sandstone

    NASA Astrophysics Data System (ADS)

    Latief, Fourier Dzar Eljabbar; Haq, Tedy Muslim

    2014-03-01

    With the advancement of three-dimensional imaging technologies, especially μCT scanning systems, we have been able to obtain three-dimensional digital representations of porous rocks at the micrometer scale. Characterization has then also been possible using computational approaches. Hydrocarbon bearing sandstone has become an object of considerable analytical interest in the last decade. In this research, we performed digital characterization of hydrocarbon bearing sandstone reservoir rock from Sumatra. The sample was digitized using a μCT scanner (Skyscan 1173), which produced a series of reconstructed images with a spatial resolution of 15 μm. Using computational approaches, i.e., image processing, image analysis, and simulation of fluid flow inside the rock using the Lattice Boltzmann Method, we obtained the porosity of the sandstone, which is 23.89%, and the permeability, which is 9382 mD. Based on visual inspection, the porosity value, and the calculated specific surface area, we produced a preliminary computer model of the rock using a grain-based method. This method employs a reconstruction of grains using a non-spherical model and a purely random deposition of the grains in a virtual three-dimensional cube of size 300 × 300 × 300. The model has a porosity of 23.96%, and its permeability is 7215 mD. While the porosity error is very small (only 0.3%), the permeability error of around 23% relative to the real sample is considered very significant. This suggests that modeling based on porosity and specific surface area alone is not sufficient to produce a representative model. However, this work is a good example of how characterization and modeling of porous rock can be conducted using a non-destructive computational approach.
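The porosity figure reported above is, at its core, a voxel count on the segmented volume. A minimal sketch on a synthetic binary volume (the 0.2389 pore fraction mirrors the paper's 23.89%; the volume itself is random noise, not the scanned rock, and is smaller than the paper's 300³ cube to keep the demo light):

```python
import numpy as np

def porosity(volume: np.ndarray) -> float:
    """Porosity of a segmented micro-CT volume: fraction of pore voxels.
    `volume` is a boolean array in which True marks pore space."""
    return float(volume.mean())

# Synthetic stand-in for a segmented scan: ~23.89% of voxels flagged as pore.
rng = np.random.default_rng(0)
vol = rng.random((100, 100, 100)) < 0.2389
print(f"porosity = {porosity(vol):.4f}")
```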

  4. Perforated duodenal ulcer presenting with a subphrenic abscess revealed by plain abdominal X-ray films and confirmed by multi-detector computed tomography: a case report

    PubMed Central

    2013-01-01

    Introduction Peptic ulcer disease is still the major cause of gastrointestinal perforation despite major improvements in both diagnostic and therapeutic strategies. While the diagnosis of a perforated ulcer is straightforward in typical cases, its clinical onset may be subtle because of comorbidities and/or concurrent therapies. Case presentation We report the case of a 53-year-old Caucasian man with a history of chronic myeloid leukemia on maintenance therapy (100 mg/day) with imatinib who was found to have a subphrenic abscess resulting from a perforated duodenal ulcer that had been clinically overlooked. Our patient was febrile (38.5°C) with abdominal tenderness and hypoactive bowel sounds. On the abdominal plain X-ray films, a right subphrenic abscess could be seen. On contrast-enhanced multi-detector computed tomography, a huge air-fluid collection extending from the subphrenic to the subhepatic anterior space was observed. After oral administration of 500 cm3 of 3 percent diluted diatrizoate meglumine, an extraluminal leakage of the water-soluble iodinated contrast media could then be appreciated as a result of a perforated duodenal ulcer. During surgery, the abscess was drained and extensive adhesiolysis had to be performed to expose the duodenal bulb, where the ulcer was first identified by methylene blue administration and then sutured. Conclusions While subphrenic abscesses are well known complications of perforated gastric or duodenal ulcers, they have nowadays become rare thanks to advances in both diagnostic and therapeutic strategies for peptic ulcer disease. However, when peptic ulcer disease is not clinically suspected, the contribution of imaging may be substantial. PMID:24215711

  5. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... computed on preliminary term basis. 1.818-4 Section 1.818-4 Internal Revenue INTERNAL REVENUE SERVICE... Election with respect to life insurance reserves computed on preliminary term basis. (a) In general... reserves are computed on one of the recognized preliminary term bases to elect to revalue such reserves...

  6. An integrated computer system for preliminary design of advanced aircraft.

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Sobieszczanski, J.; Landrum, E. J.

    1972-01-01

    A progress report is given on the first phase of a research project to develop a system of Integrated Programs for Aerospace-Vehicle Design (IPAD) which is intended to automate to the largest extent possible the preliminary and detailed design of advanced aircraft. The approach used is to build a pilot system and simultaneously to carry out two major contractual studies to define a practical IPAD system preparatory to programming. The paper summarizes the specifications and goals of the IPAD system, the progress to date, and conclusions reached regarding its feasibility and scope. Sample calculations obtained with the pilot system are given for aircraft preliminary designs optimized with respect to discipline parameters, such as weight or L/D, and these results are compared with designs optimized with respect to overall performance parameters, such as range or payload.

  7. Computer-Generated Geometry Instruction: A Preliminary Study

    ERIC Educational Resources Information Center

    Kang, Helen W.; Zentall, Sydney S.

    2011-01-01

    This study hypothesized that increased intensity of graphic information, presented in computer-generated instruction, could be differentially beneficial for students with hyperactivity and inattention by improving their ability to sustain attention and hold information in-mind. To this purpose, 18 2nd-4th grade students, recruited from general…

  8. Development of X-TOOLSS: Preliminary Design of Space Systems Using Evolutionary Computation

    NASA Technical Reports Server (NTRS)

    Schnell, Andrew R.; Hull, Patrick V.; Turner, Mike L.; Dozier, Gerry; Alverson, Lauren; Garrett, Aaron; Reneau, Jarred

    2008-01-01

    Evolutionary computational (EC) techniques such as genetic algorithms (GA) have been identified as promising methods to explore the design space of mechanical and electrical systems at the earliest stages of design. In this paper the authors summarize their research in the use of evolutionary computation to develop preliminary designs for various space systems. An evolutionary computational solver developed over the course of the research, X-TOOLSS (Exploration Toolset for the Optimization of Launch and Space Systems) is discussed. With the success of early, low-fidelity example problems, an outline of work involving more computationally complex models is discussed.
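X-TOOLSS itself is not shown here, but the GA loop it builds on (selection, crossover, and mutation driven by a design objective) can be sketched on a toy objective. All parameters and the objective below are illustrative, not taken from the paper:

```python
import random

def fitness(x):
    """Toy design objective (hypothetical): maximize the negative of the
    squared deviation of each design variable from a target of 0.5."""
    return -sum((xi - 0.5) ** 2 for xi in x)

def evolve(pop_size=40, n_genes=5, generations=60, seed=1):
    """Minimal generational GA: truncation selection, one-point crossover,
    single-gene Gaussian mutation, variables clamped to [0, 1]."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_genes)     # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_genes)          # mutate one gene
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print([round(g, 3) for g in best])
```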

  9. Analyzing high energy physics data using database computing: Preliminary report

    NASA Technical Reports Server (NTRS)

    Baden, Andrew; Day, Chris; Grossman, Robert; Lifka, Dave; Lusk, Ewing; May, Edward; Price, Larry

    1991-01-01

    A proof of concept system is described for analyzing high energy physics (HEP) data using database computing. The system is designed to scale up to the size required for HEP experiments at the Superconducting SuperCollider (SSC) lab. These experiments will require collecting and analyzing approximately 10 to 100 million 'events' per year during proton colliding beam collisions. Each 'event' consists of a set of vectors with a total length of approx. one megabyte. This represents an increase of approx. 2 to 3 orders of magnitude in the amount of data accumulated by present HEP experiments. The system is called the HEPDBC System (High Energy Physics Database Computing System). At present, the Mark 0 HEPDBC System is completed, and can produce analysis of HEP experimental data approx. an order of magnitude faster than current production software on data sets of approx. 1 GB. The Mark 1 HEPDBC System is currently undergoing testing and is designed to analyze data sets 10 to 100 times larger.

  10. Developing ontological model of computational linear algebra - preliminary considerations

    NASA Astrophysics Data System (ADS)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Lirkov, I.

    2013-10-01

    The aim of this paper is to propose a method for application of ontologically represented domain knowledge to support Grid users. The work is presented in the context provided by the Agents in Grid system, which aims at development of an agent-semantic infrastructure for efficient resource management in the Grid. Decision support within the system should provide functionality beyond the existing Grid middleware; specifically, it should help the user to choose the optimal algorithm and/or resource to solve a problem from a given domain. The system assists the user in at least two situations. First, for users without in-depth knowledge about the domain, it should help them to select the method and the resource that (together) would best fit the problem to be solved (and match the available resources). Second, if the user explicitly indicates the method and the resource configuration, it should "verify" if her choice is consistent with the expert recommendations (encapsulated in the knowledge base). Furthermore, one of the goals is to simplify the use of the selected resource to execute the job; i.e., provide a user-friendly method of submitting jobs, without requiring technical knowledge about the Grid middleware. To achieve the mentioned goals, an adaptable method of expert knowledge representation for the decision support system has to be implemented. The selected approach is to utilize ontologies and semantic data processing, supported by multicriterial decision making. As a starting point, the area of computational linear algebra was selected to be modeled; however, the paper presents a general approach that should be easily extendable to other domains.

  11. Evaluation of a tuberculous abscess on the right side of the diaphragm with contrast-enhanced computed tomography: A case report

    PubMed Central

    DONG, PENG; CHEN, JING-JING; WANG, XI-ZHEN

    2016-01-01

    We herein investigate the case of a patient with a tuberculous diaphragmatic abscess confirmed by pathology. The patient underwent plain computed tomography (CT) examination of the chest and contrast-enhanced abdominal CT examination. The abscess appeared as a hypodense mass with a thick, irregular wall, which was enhanced on the contrast-enhanced CT images. The shape of the mass resembled an irregular double convex lens. No enlarged lymph nodes were detected on the CT images. The presence of a tuberculous diaphragmatic abscess should be suspected in patients with a diaphragmatic hypodense mass with enhanced thick walls, even in the absence of enlarged lymph nodes on the CT images. PMID:27330800

  12. Micro-computed tomography: a method for the non-destructive evaluation of the three-dimensional structure of biological specimens.

    PubMed

    Stauber, Martin; Müller, Ralph

    2008-01-01

    The large increase in interest in micro-computed tomography (micro-CT) over the last decade reflects the need for a method able to non-destructively visualize the internal three-dimensional structure of an object. The real beauty of computed tomography lies in the fact that it is available for a large range of nominal resolutions, which allows hierarchical imaging from whole bodies down to the tissue level. Although micro-CT is currently mainly used for imaging of hard tissue (i.e., bone and tooth), future developments might also allow high soft tissue contrast, either using appropriate contrast agents or x-ray contrast mechanisms. This chapter aims to review the steps necessary for a successful micro-CT measurement. Although the actual measurement is often machine dependent, the chapter does not describe a specific system but rather lists all steps that eventually have to be considered to set up a measurement, run the measurement, process the image data, and get morphometric indices as a result. The chapter provides an easily understandable manual that should allow newcomers to perform successful measurements and hence to best profit from this powerful technique.

  13. Preliminary Computational Analysis of the (HIRENASD) Configuration in Preparation for the Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Chwalowski, Pawel; Florance, Jennifer P.; Heeg, Jennifer; Wieseman, Carol D.; Perry, Boyd P.

    2011-01-01

    This paper presents preliminary computational aeroelastic analysis results generated in preparation for the first Aeroelastic Prediction Workshop (AePW). These results were produced using FUN3D software developed at NASA Langley and are compared against the experimental data generated during the HIgh REynolds Number Aero-Structural Dynamics (HIRENASD) Project. The HIRENASD wind-tunnel model was tested in the European Transonic Windtunnel in 2006 by Aachen University's Department of Mechanics with funding from the German Research Foundation. The computational effort discussed here was performed (1) to obtain a preliminary assessment of the ability of the FUN3D code to accurately compute physical quantities experimentally measured on the HIRENASD model and (2) to translate the lessons learned from the FUN3D analysis of HIRENASD into a set of initial guidelines for the first AePW, which includes test cases for the HIRENASD model and its experimental data set. This paper compares the computational and experimental results obtained at Mach 0.8 for a Reynolds number of 7 million based on chord, corresponding to the HIRENASD test conditions No. 132 and No. 159. Aerodynamic loads and static aeroelastic displacements are compared at two levels of grid resolution. Harmonic perturbation numerical results are compared with the experimental data using the magnitude and phase relationship between pressure coefficients and displacement. A dynamic aeroelastic numerical calculation is presented at one wind-tunnel condition in the form of the time history of the generalized displacements. Additional FUN3D validation results are also presented for the AGARD 445.6 wing data set. This wing was tested in the Transonic Dynamics Tunnel and is commonly used in the preliminary benchmarking of computational aeroelastic software.

  14. Comparative evaluation of soft and hard tissue dimensions in the anterior maxilla using radiovisiography and cone beam computed tomography: A pilot study

    PubMed Central

    Mallikarjun, Savita; Babu, Harsha Mysore; Das, Sreedevi; Neelakanti, Abhilash; Dawra, Charu; Shinde, Sachin Vaijnathrao

    2016-01-01

    Aims: To assess and compare the thickness of gingiva in the anterior maxilla using radiovisiography (RVG) and cone beam computed tomography (CBCT) and its correlation with the thickness of underlying alveolar bone. Settings and Design: This cross-sectional study included 10 male subjects in the age group of 20–45 years. Materials and Methods: After analyzing the width of keratinized gingiva of the maxillary right central incisor, the radiographic assessment was done using a modified technique for RVG and CBCT, to measure the thickness of both the labial gingiva and labial plate of alveolar bone at 4 predetermined locations along the length of the root in each case. Statistical Analysis Used: Statistical analysis was performed using Student's t-test and Pearson's correlation test, with the help of statistical software (SPSS V13). Results: No statistically significant differences were obtained in the measurement made using RVG and CBCT. The results of the present study also failed to reveal any significant correlation between the width of gingiva and the alveolar bone in the maxillary anterior region. Conclusions: Within the limitations of this study, it can be concluded that both CBCT and RVG can be used as valuable tools in the assessment of the soft and hard tissue dimensions. PMID:27143830

  15. Earthquake-related Crush Injury versus Non-Earthquake Injury in Abdominal Trauma Patients on Emergency Multidetector Computed Tomography: A Comparative Study

    PubMed Central

    Chen, Tian-wu; Dong, Zhi-hui; Chu, Zhi-gang; Tang, Si-shi; Deng, Wen

    2011-01-01

    The aim of this study was to investigate features of abdominal earthquake-related crush traumas in comparison with non-earthquake injury. A cross-sectional survey was conducted with 51 survivors with abdominal crush injury in the 2008 Sichuan earthquake and 41 patients with abdominal non-earthquake injury, all undergoing non-enhanced computed tomography (CT) scans, serving as the earthquake trauma and control groups, respectively. Data were analyzed between groups focusing on CT appearance. We found that injury of abdominal-wall soft tissue and fractures of lumbar vertebrae were more common in the earthquake trauma group than in the control group (28 vs 13 victims, and 24 vs 9, respectively; all P < 0.05); fractures predominantly involved the transverse processes of one or two of the L1-3 vertebrae. Retroperitoneal injury of the kidney occurred more frequently in the earthquake trauma group than in the control group (29 vs 14 victims, P < 0.05). Abdominal injury in combination with thoracic and pelvic injury also occurred more frequently in the earthquake trauma group than in the control group (43 vs 29 victims, P < 0.05). In conclusion, abdominal earthquake-related crush injury may be characterized by a high incidence of abdominal-wall soft-tissue injury, transverse-process fractures of one or two of the L1-3 vertebrae, retroperitoneal injury of the kidney, and combined injury of the thorax and pelvis. PMID:21394315

  16. Micro- and nano-X-ray computed-tomography: A step forward in the characterization of the pore network of a leached cement paste

    SciTech Connect

    Bossa, Nathan; Chaurand, Perrine; Vicente, Jérôme; Borschneck, Daniel; Levard, Clément; Aguerre-Chariol, Olivier; Rose, Jérôme

    2015-01-15

    Pore structure of leached cement pastes (w/c = 0.5) was studied for the first time from the micro-scale down to the nano-scale by combining micro- and nano-X-ray computed tomography (micro- and nano-CT). This allowed assessing the 3D heterogeneity of the pore network along the cement profile (from the core to the altered layer) over almost the entire range of cement pore sizes, i.e. from capillary to gel pores. We successfully quantified an increase of porosity in the altered layer at both resolutions. Porosity increases from 1.8 to 6.1% and from 18 to 58% at the micro- (voxel = 1.81 μm) and nano-scale (voxel = 63.5 nm), respectively. The combination of both CT modalities allowed us to circumvent weaknesses inherent in each investigation scale. In addition, the connectivity and the channel size of the pore network were also evaluated to obtain a complete 3D pore network characterization at both scales.

  17. Non-invasive Assessment of Lower Limb Geometry and Strength Using Hip Structural Analysis and Peripheral Quantitative Computed Tomography: A Population-Based Comparison.

    PubMed

    Litwic, A E; Clynes, M; Denison, H J; Jameson, K A; Edwards, M H; Sayer, A A; Taylor, P; Cooper, C; Dennison, E M

    2016-02-01

    Hip fracture is the most significant complication of osteoporosis in terms of mortality, long-term disability and decreased quality of life. In recent years, different techniques have been developed to assess lower limb strength and ultimately fracture risk. Here we examine relationships between two measures of lower limb bone geometry and strength: proximal femoral geometry and tibial peripheral quantitative computed tomography. We studied a sample of 431 women and 488 men aged 59-71 years. The hip structural analysis (HSA) programme was employed to measure the structural geometry of the left hip for each DXA scan obtained using a Hologic QDR 4500 instrument, while pQCT measurements of the tibia were obtained using a Stratec 2000 instrument in the same population. We observed strong sex differences in proximal femoral geometry at the narrow neck, intertrochanteric and femoral shaft regions. There were significant (p < 0.001) associations between pQCT-derived measures of bone geometry (tibial width; endocortical diameter and cortical thickness) and bone strength (strength-strain index) with each corresponding HSA variable (all p < 0.001) in both men and women. These results demonstrate strong correlations between two different methods of assessment of lower limb bone strength: HSA and pQCT. Validation in prospective cohorts to study associations of each with incident fracture is now indicated.

  18. Preliminary design methods for fiber reinforced composite structures employing a personal computer

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1986-01-01

    The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
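The core of the subroutine described above, classical lamination theory reducing a ply stack to an effective in-plane modulus, can be sketched as follows. The ply properties are hypothetical carbon/epoxy values, not from the report; a unidirectional [0]4 laminate recovers the fiber-direction modulus E1 exactly:

```python
import numpy as np

def q_matrix(E1, E2, G12, nu12):
    """Reduced plane-stress stiffness matrix Q of a unidirectional ply."""
    nu21 = nu12 * E2 / E1
    d = 1.0 - nu12 * nu21
    return np.array([[E1 / d, nu12 * E2 / d, 0.0],
                     [nu12 * E2 / d, E2 / d, 0.0],
                     [0.0, 0.0, G12]])

def qbar(Q, theta_deg):
    """Transform Q to laminate axes for a ply oriented at theta degrees."""
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    Q11, Q12, Q22, Q66 = Q[0, 0], Q[0, 1], Q[1, 1], Q[2, 2]
    Qb = np.empty((3, 3))
    Qb[0, 0] = Q11*c**4 + 2*(Q12 + 2*Q66)*s**2*c**2 + Q22*s**4
    Qb[1, 1] = Q11*s**4 + 2*(Q12 + 2*Q66)*s**2*c**2 + Q22*c**4
    Qb[0, 1] = Qb[1, 0] = (Q11 + Q22 - 4*Q66)*s**2*c**2 + Q12*(s**4 + c**4)
    Qb[0, 2] = Qb[2, 0] = (Q11 - Q12 - 2*Q66)*s*c**3 + (Q12 - Q22 + 2*Q66)*s**3*c
    Qb[1, 2] = Qb[2, 1] = (Q11 - Q12 - 2*Q66)*s**3*c + (Q12 - Q22 + 2*Q66)*s*c**3
    Qb[2, 2] = (Q11 + Q22 - 2*Q12 - 2*Q66)*s**2*c**2 + Q66*(s**4 + c**4)
    return Qb

def effective_Ex(plies, E1, E2, G12, nu12, ply_t=0.125):
    """Effective in-plane modulus Ex of a laminate from its A matrix."""
    Q = q_matrix(E1, E2, G12, nu12)
    A = sum(qbar(Q, th) * ply_t for th in plies)
    h = ply_t * len(plies)
    a = np.linalg.inv(A)
    return 1.0 / (h * a[0, 0])

# Hypothetical carbon/epoxy ply (GPa): E1=140, E2=10, G12=5, nu12=0.3.
print(round(effective_Ex([0, 0, 0, 0], 140.0, 10.0, 5.0, 0.3), 1))
```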

  19. OPDOT: A computer program for the optimum preliminary design of a transport airplane

    NASA Technical Reports Server (NTRS)

    Sliwa, S. M.; Arbuckle, P. D.

    1980-01-01

    A description of a computer program, OPDOT, for the optimal preliminary design of transport aircraft is given. OPDOT utilizes constrained parameter optimization to minimize a performance index (e.g., direct operating cost per block hour) while satisfying operating constraints. The approach in OPDOT uses geometric descriptors as independent design variables. The independent design variables are systematically iterated to find the optimum design. The technical development of the program is provided and a program listing with sample input and output are utilized to illustrate its use in preliminary design. It is not meant to be a user's guide, but rather a description of a useful design tool developed for studying the application of new technologies to transport airplanes.
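OPDOT's models are far richer, but the pattern it implements, minimizing a performance index over geometric descriptors subject to operating constraints, can be sketched with toy cost and range models. All formulas and numbers below are invented for illustration and do not come from the report:

```python
import itertools
import math

def direct_operating_cost(S, AR):
    """Hypothetical DOC surrogate: area and aspect ratio drive structural
    and fuel cost; an induced-drag-like term falls with both."""
    return 0.02 * S + 1.5 * AR + 400.0 / math.sqrt(S * AR)

def range_nm(S, AR):
    """Hypothetical range model: grows with wing area and aspect ratio."""
    return 55.0 * math.sqrt(S) + 120.0 * AR

def optimize(min_range=2000.0):
    """Exhaustive search over a grid of geometric descriptors, keeping the
    cheapest design that satisfies the operating constraint."""
    best = None
    for S, AR in itertools.product(range(100, 401, 5), range(6, 13)):
        if range_nm(S, AR) < min_range:
            continue  # operating constraint violated
        doc = direct_operating_cost(S, AR)
        if best is None or doc < best[0]:
            best = (doc, S, AR)
    return best

doc, S, AR = optimize()
print(f"S={S} m^2, AR={AR}, DOC={doc:.2f}")
```

A real preliminary-design tool would replace the grid with constrained parameter optimization, but the feasibility filter plus performance index structure is the same.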

  20. A preliminary transient-fault experiment on the SIFT computer system

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Elks, Carl R.

    1987-01-01

    This paper presents the results of a preliminary experiment to study the effectiveness of a fault-tolerant system's ability to handle transient faults. The primary goal of the experiment was to develop the techniques to measure the parameters needed for a reliability analysis of the SIFT computer system which includes the effects of transient faults. A key aspect of such an analysis is the determination of the effectiveness of the operating system's ability to discriminate between transient and permanent faults. A detailed description of the preliminary transient fault experiment is given, along with the results from 297 transient fault injections. Although not enough data was obtained to draw statistically significant conclusions, the foundation has been laid for a large-scale transient fault experiment.

  1. Computing the dissipative part of the gravitational self force: II. Numerical implementation and preliminary results

    NASA Astrophysics Data System (ADS)

    Hughes, Scott; Flanagan, Eanna; Hinderer, Tanja; Ruangsri, Uchupol

    2015-04-01

    We describe how we have modified a frequency-domain Teukolsky-equation solver, previously used for computing orbit-averaged dissipation, in order to compute the dissipative piece of the gravitational self force on orbits of Kerr black holes. This calculation involves summing over a large number of harmonics. Each harmonic is independent of all others, so it is well suited to parallel computation. We show preliminary results for equatorial eccentric orbits and circular inclined orbits, demonstrating convergence of the harmonic expansion, as well as interesting phenomenology of the self force's behavior in the strong field. We conclude by discussing plans for using this force to study generic orbits, with a focus on the behavior of orbital resonances.
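The mode-by-mode structure described above can be sketched as a convergence-driven sum over independent harmonics. The power-law per-mode flux below is a toy stand-in for per-mode Teukolsky solutions, used only to show the accumulation-until-convergence pattern:

```python
def mode_flux(k: int) -> float:
    """Stand-in for the flux carried by harmonic k (a real run would solve
    the Teukolsky equation per mode; here a toy power-law decay is used)."""
    return 1.0 / k**4

def summed_flux(tol=1e-10, k_max=10_000):
    """Accumulate independent mode contributions until the latest term is a
    negligible fraction of the running sum. Because each term depends only
    on k, the mode_flux calls could be farmed out to parallel workers."""
    total = 0.0
    for k in range(1, k_max + 1):
        term = mode_flux(k)
        total += term
        if term < tol * total:
            return total, k
    return total, k_max

flux, n_modes = summed_flux()
print(n_modes, flux)
```

For this toy spectrum the sum converges to the zeta value pi^4/90 ≈ 1.0823 after a few hundred modes.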

  2. On the computation of preliminary orbits for Earth satellites with radar observations

    NASA Astrophysics Data System (ADS)

    Gronchi, G. F.; Dimare, L.; Bracali Cioci, D.; Ma, H.

    2015-08-01

    We introduce a new method to perform preliminary orbit determination for satellites on low Earth orbits (LEO). This method works with tracks of radar observations: each track is composed of n ≥ 4 topocentric position vectors per pass of the satellite, taken at very short time intervals. We assume very accurate values for the range ρ, while the angular positions (i.e. the line of sight, given by the pointing of the antenna) are less accurate. We wish to correct the errors in the angular positions already in the computation of a preliminary orbit. With the information contained in a pair of radar tracks, using the laws of two-body dynamics, we can write eight equations in eight unknowns. The unknowns are the components of the topocentric velocity orthogonal to the line of sight at the two mean epochs of the tracks, and the corrections Δ to be applied to the angular positions. We take advantage of the fact that the components of Δ are typically small. We show the results of some tests, performed with simulated observations, and compare this method with Gibbs' and the Keplerian integrals methods.
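Gibbs' method, cited above as a comparison baseline, recovers the velocity at the middle of three coplanar position vectors on the same Keplerian orbit. A minimal sketch of the standard textbook formulation (Vallado's), checked on a circular orbit, where the recovered speed must equal sqrt(mu/r):

```python
import numpy as np

MU = 398600.4418  # Earth gravitational parameter, km^3/s^2

def gibbs(r1, r2, r3):
    """Gibbs' method: velocity (km/s) at the middle of three coplanar
    position vectors (km) taken on the same Keplerian orbit."""
    r1, r2, r3 = map(np.asarray, (r1, r2, r3))
    n1, n2, n3 = map(np.linalg.norm, (r1, r2, r3))
    Z12, Z23, Z31 = np.cross(r1, r2), np.cross(r2, r3), np.cross(r3, r1)
    N = n1 * Z23 + n2 * Z31 + n3 * Z12
    D = Z12 + Z23 + Z31
    S = r1 * (n2 - n3) + r2 * (n3 - n1) + r3 * (n1 - n2)
    B = np.cross(D, r2)
    L = np.sqrt(MU / (np.linalg.norm(N) * np.linalg.norm(D)))
    return L * (B / n2 + S)

# Three points on a 7000 km circular equatorial orbit, 30 degrees apart.
r = 7000.0
pts = [r * np.array([np.cos(np.radians(th)), np.sin(np.radians(th)), 0.0])
       for th in (0.0, 30.0, 60.0)]
v2 = gibbs(*pts)
print(np.linalg.norm(v2))  # circular-orbit speed sqrt(MU/r)
```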

  3. Computer-assisted intraosseous anaesthesia for molar and incisor hypomineralisation teeth. A preliminary study.

    PubMed

    Cabasse, C; Marie-Cousin, A; Huet, A; Sixou, J L

    2015-03-01

    Anesthetizing MIH (Molar and Incisor Hypomineralisation) teeth is one of the major challenges in paediatric dentistry. Computer-assisted IO injection (CAIO) of 4% articaine with 1:200,000 epinephrine (Alphacaine, Septodont) has been shown to be an efficient way to anesthetize teeth in children. The aim of this study was to assess the efficacy of this method with MIH teeth. This preliminary study was performed using the Quick Sleeper system (Dental Hi Tec, Cholet, France), which allows computer-controlled rotation of the needle to penetrate the bone and computer-controlled injection of the anaesthetic solution. Thirty-nine patients of the department of Paediatric Dentistry were included, allowing 46 sessions (including 32 mandibular first permanent molars) to be assessed. CAIO showed efficacy in 93.5% (43/46) of cases. The three failures were due to the impossibility of reaching the spongy bone (one case) or of achieving anaesthesia (two cases). This prospective study confirms that CAIO anaesthesia is a promising method to anesthetize teeth with MIH that could therefore be routinely used by trained practitioners.

  4. Development of an online automatic computed radiography dose data mining program: a preliminary study.

    PubMed

    Ng, Curtise K C; Sun, Zhonghua

    2010-01-01

    Recent studies have reported the computed radiography (CR) dose creep problem and therefore the need to have monitoring processes in place in clinical departments. The objective of this study is to provide a better technological solution for implementing a regular CR dose monitoring process. An online automatic CR dose data mining program that can be applied to different systems was developed based on freeware and existing software in the Picture Archiving and Communication System (PACS) server. The program was tested with 69 CR images. This preliminary study shows that the program addresses the major weaknesses of some existing approaches, including the involvement of manual procedures in the monitoring process and applicability to only a single manufacturer's CR images. The proposed method provides an efficient and effective way to run a CR dose monitoring program regularly in busy clinical departments to curb the dose creep problem and reinforce the 'As Low As Reasonably Achievable' (ALARA) principle. PMID:19640604

  5. Feasibility Study for a Remote Terminal Central Computing Facility Serving School and College Institutions. Volume II, Preliminary Specifications.

    ERIC Educational Resources Information Center

    International Business Machines Corp., White Plains, NY.

    Preliminary specifications of major equipment and programing systems characteristics for a remote terminal central computing facility serving 25-75 secondary schools are presented. Estimation techniques developed in a previous feasibility study were used to delineate workload demands for four model regions with different numbers of institutions…

  6. Wolter X-Ray Microscope Computed Tomography Ray-Trace Model with Preliminary Simulation Results

    SciTech Connect

    Jackson, J A

    2006-02-27

    It is proposed to build a Wolter X-ray Microscope Computed Tomography System in order to characterize objects to sub-micrometer resolution. Wolter Optics Systems use hyperbolic, elliptical, and/or parabolic mirrors to reflect x-rays in order to focus or magnify an image. Wolter Optics have been used as telescopes and as microscopes. As microscopes they have been used for a number of purposes, such as measuring emission x-rays and x-ray fluorescence of thin biological samples. Standard Computed Tomography (CT) Systems use 2D radiographic images, from a series of rotational angles, acquired by passing x-rays through an object to reconstruct a 3D image of the object. The x-ray paths in a Wolter X-ray Microscope will be considerably different from those of a standard CT system. There is little information about the 2D radiographic images that can be expected from such a system. There are questions about the quality, resolution and focusing range of an image created with such a system. It is not known whether characterization information can be obtained from these images and whether these 2D images can be reconstructed into 3D images of the object. A code has been developed to model the 2D radiographic image created by an object in a Wolter X-ray Microscope. This code simply follows each x-ray through the object and optics. There is no modeling at this point of other effects, such as scattering, reflection losses etc. Any object, of appropriate size, can be used in the model code. A series of simulations using a number of different objects was run to study the effects of the optics. The next step will be to use this model to reconstruct an object from the simulated data. Funding for the project ended before this goal could be accomplished. The following documentation includes: (1) background information on current X-ray imaging systems, (2) background on Wolter Optics, (3) description of the Wolter System being used, (4) purpose, limitations and development of the modeling
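
At the core of any such mirror ray-trace code is specular reflection of a ray direction about the local surface normal, d' = d - 2(d·n)n. A minimal sketch of that one operation (our illustration, not the LLNL code itself):

```python
import numpy as np

def reflect(d, n):
    """Specular reflection of ray direction d off a surface with normal n:
    d' = d - 2 (d . n) n, the basic operation in a mirror ray tracer.
    The normal is normalized internally; the ray length is preserved."""
    d = np.asarray(d, dtype=float)
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

# a grazing-incidence x-ray hitting a horizontal mirror from above
d_in = np.array([1.0, -0.02, 0.0])   # shallow downward direction
d_out = reflect(d_in, [0.0, 1.0, 0.0])
# the downward component flips sign; grazing angle is preserved
```

A Wolter trace applies this reflection twice, once at each conic mirror surface, with the normal evaluated at the intersection point.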

  7. Preliminary validation of a new methodology for estimating dose reduction protocols in neonatal chest computed radiographs

    NASA Astrophysics Data System (ADS)

    Don, Steven; Whiting, Bruce R.; Hildebolt, Charles F.; Sehnert, W. James; Ellinwood, Jacquelyn S.; Töpfer, Karin; Masoumzadeh, Parinaz; Kraus, Richard A.; Kronemer, Keith A.; Herman, Thomas; McAlister, William H.

    2006-03-01

    The risk of radiation exposure is greatest for pediatric patients and, thus, there is a great incentive to reduce the radiation dose used in diagnostic procedures for children to "as low as reasonably achievable" (ALARA). Testing of low-dose protocols presents a dilemma, as it is unethical to repeatedly expose patients to ionizing radiation in order to determine optimum protocols. To overcome this problem, we have developed a computed-radiography (CR) dose-reduction simulation tool that takes existing images and adds synthetic noise to create realistic images that correspond to images generated with lower doses. The objective of our study was to determine the extent to which simulated low-dose images corresponded with original (non-simulated) low-dose images. To make this determination, we created pneumothoraces of known volumes in five neonate cadavers and obtained images of the neonates at 10 mR, 1 mR and 0.1 mR (as measured at the cassette plate). The 10-mR exposures were considered "relatively-noise-free" images. We used these 10-mR images and our simulation tool to create simulated 0.1- and 1-mR images. For the simulated and original images, we identified regions of interest (ROI) of the entire chest, free-in-air region, and liver. We compared the means and standard deviations of the ROI grey-scale values of the simulated and original images with paired t tests. We also had observers rate simulated and original images for image quality and for the presence or absence of pneumothoraces. There was no statistically significant difference in grey-scale-value means or standard deviations between the simulated and original entire-chest ROIs. The observer performance suggests that an exposure >=0.2 mR is required to detect the presence or absence of pneumothoraces. These preliminary results indicate that the use of the simulation tool is promising for achieving ALARA exposures in children.
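
The noise-injection idea can be illustrated with a standard photon-statistics trick (our sketch, not the authors' tool): if detector data are approximately Poisson counts, an exposure reduced by a factor f can be simulated by binomial thinning, since keeping each detected photon independently with probability f turns a Poisson image at one mean into a Poisson image at the scaled mean.

```python
import numpy as np

def simulate_low_dose(counts, dose_fraction, rng=None):
    """Simulate a lower-dose acquisition from a high-dose count image.
    Each detected photon survives independently with probability
    dose_fraction, so Poisson statistics are preserved at the new mean."""
    rng = np.random.default_rng(0) if rng is None else rng
    return rng.binomial(np.asarray(counts, dtype=np.int64), dose_fraction)

rng = np.random.default_rng(42)
high_dose = rng.poisson(10000.0, size=(64, 64))     # near-noise-free image
low_dose = simulate_low_dose(high_dose, 0.01, rng)  # simulate 1% of the dose
# low_dose has mean ~100 and, being Poisson, variance ~100 as well
```

Real CR data would first need conversion from the logarithmic pixel scale back to a linear counts domain before thinning; that step is omitted here.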

  8. The Development of a Computer Model for Projecting Statewide College Enrollments: A Preliminary Study.

    ERIC Educational Resources Information Center

    Rensselaer Research Corp., Troy, NY.

    The purpose of this study was to develop the schema and methodology for the construction of a computerized mathematical model designed to project college and university enrollments in New York State and to meet the future increased demands of higher education planners. This preliminary report describes the main structure of the proposed computer…

  9. Reducing Foreign Language Communication Apprehension with Computer-Mediated Communication: A Preliminary Study

    ERIC Educational Resources Information Center

    Arnold, Nike

    2007-01-01

    Many studies (e.g., [Beauvois, M.H., 1998. "E-talk: Computer-assisted classroom discussion--attitudes and motivation." In: Swaffar, J., Romano, S., Markley, P., Arens, K. (Eds.), "Language learning online: Theory and practice in the ESL and L2 computer classroom." Labyrinth Publications, Austin, TX, pp. 99-120; Bump, J., 1990. "Radical changes in…

  10. Applying Computer Technology to Substance Abuse Prevention Science Results of a Preliminary Examination

    ERIC Educational Resources Information Center

    Marsch, Lisa A.; Bickel, Warren K.; Badger, Gary J.

    2007-01-01

    This manuscript reports on the development and evaluation of a computer-based substance abuse prevention program for middle school-aged adolescents, called "HeadOn: Substance Abuse Prevention for Grades 6-8TM". This self-guided program was designed to deliver effective drug abuse prevention science to youth via computer-based educational…

  11. Preliminary Computational Fluid Dynamics (CFD) Simulation of EIIB Push Barge in Shallow Water

    NASA Astrophysics Data System (ADS)

    Beneš, Petr; Kollárik, Róbert

    2011-12-01

    This study presents a preliminary CFD simulation of an EIIb push barge in inland conditions using the CFD software Ansys Fluent. RANSE (Reynolds-Averaged Navier-Stokes Equation) methods are used for the viscous solution of the turbulent flow around the ship hull. Different RANSE methods are compared on ship resistance calculations in order to select the appropriate methods and discard the inappropriate ones. The study also describes the creation of the geometrical model, which respects the exact water-depth-to-draught ratio in shallow-water conditions, the grid generation, the setup of the mathematical model in Fluent, and the evaluation of the simulation results.

  12. Preliminary study of the use of the STAR-100 computer for transonic flow calculations

    NASA Technical Reports Server (NTRS)

    Keller, J. D.; Jameson, A.

    1977-01-01

    An explicit method for solving the transonic small-disturbance potential equation is presented. This algorithm, which is suitable for the new vector-processor computers such as the CDC STAR-100, is compared to successive line over-relaxation (SLOR) on a simple test problem. The convergence rate of the explicit scheme is slower than that of SLOR; however, the efficiency of the explicit scheme on the STAR-100 computer is sufficient to overcome the slower convergence rate and yield an overall speedup compared to SLOR on the CYBER 175 computer.
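
The trade-off described, a fully explicit sweep that vectorizes trivially but converges more slowly than an implicit relaxation, can be reproduced on a toy 1-D Laplace problem (our stand-in for the transonic small-disturbance equation; iteration counts are illustrative only):

```python
import numpy as np

def iterations_to_converge(update, u0, tol=1e-6, max_iter=100000):
    """Iterate until successive sweeps differ by less than tol."""
    u = u0.copy()
    for k in range(1, max_iter + 1):
        u_new = update(u)
        if np.max(np.abs(u_new - u)) < tol:
            return k
        u = u_new
    return max_iter

n = 50
u0 = np.zeros(n + 2)
u0[-1] = 1.0                 # fixed boundary values 0 and 1

def jacobi(u):
    """Explicit Jacobi sweep: uses only old values, so the whole
    update is one vector operation (STAR-100-friendly)."""
    v = u.copy()
    v[1:-1] = 0.5 * (u[:-2] + u[2:])
    return v

def sor(u, omega=1.8):
    """Successive over-relaxation: inherently sequential sweep,
    but converges far faster per iteration."""
    v = u.copy()
    for i in range(1, n + 1):
        v[i] = (1.0 - omega) * v[i] + omega * 0.5 * (v[i - 1] + v[i + 1])
    return v

k_jacobi = iterations_to_converge(jacobi, u0)
k_sor = iterations_to_converge(sor, u0)
# SOR needs far fewer iterations; the explicit scheme wins only if
# each vectorized sweep is cheap enough, as on the STAR-100
```

This captures the paper's point: the hardware-dependent cost per sweep, not the iteration count alone, decides the overall speedup.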

  13. Investigating the role of combined acoustic-visual feedback in one-dimensional synchronous brain computer interfaces, a preliminary study

    PubMed Central

    Gargiulo, Gaetano D; Mohamed, Armin; McEwan, Alistair L; Bifulco, Paolo; Cesarelli, Mario; Jin, Craig T; Ruffo, Mariano; Tapson, Jonathan; van Schaik, André

    2012-01-01

    Feedback plays an important role when learning to use a brain computer interface (BCI), particularly in the case of synchronous feedback that relies on the subject's interaction. In this preliminary study, we investigate the role of combined auditory-visual feedback during synchronous μ rhythm-based BCI sessions in helping the subject remain focused on the selected imagery task. This new combined feedback, now integrated within the general purpose BCI2000 software, has been tested on eight untrained and three trained subjects during a one-dimensional left-right control task. In order to reduce the setup burden and maximize subject comfort, an electroencephalographic device suitable for dry electrodes that required no skin preparation was used. Improvement was evaluated both qualitatively, with a personal self-assessment questionnaire from each subject, and quantitatively, from subject performance data. Results of this preliminary study show that the combined feedback was well tolerated by the subjects and improved performance in 75% of the naïve subjects compared with visual feedback alone. PMID:23152713

  14. Investigating the role of combined acoustic-visual feedback in one-dimensional synchronous brain computer interfaces, a preliminary study.

    PubMed

    Gargiulo, Gaetano D; Mohamed, Armin; McEwan, Alistair L; Bifulco, Paolo; Cesarelli, Mario; Jin, Craig T; Ruffo, Mariano; Tapson, Jonathan; van Schaik, André

    2012-01-01

    Feedback plays an important role when learning to use a brain computer interface (BCI), particularly in the case of synchronous feedback that relies on the subject's interaction. In this preliminary study, we investigate the role of combined auditory-visual feedback during synchronous μ rhythm-based BCI sessions in helping the subject remain focused on the selected imagery task. This new combined feedback, now integrated within the general purpose BCI2000 software, has been tested on eight untrained and three trained subjects during a one-dimensional left-right control task. In order to reduce the setup burden and maximize subject comfort, an electroencephalographic device suitable for dry electrodes that required no skin preparation was used. Improvement was evaluated both qualitatively, with a personal self-assessment questionnaire from each subject, and quantitatively, from subject performance data. Results of this preliminary study show that the combined feedback was well tolerated by the subjects and improved performance in 75% of the naïve subjects compared with visual feedback alone.

  15. Comparison of different methods to compute a preliminary orbit of Space Debris using radar observations

    NASA Astrophysics Data System (ADS)

    Ma, Hélène; Gronchi, Giovanni F.

    2014-07-01

    We advertise a new method of preliminary orbit determination for space debris using radar observations, which we call Infang. We can perform a linkage of two sets of four observations collected at close times. The context is characterized by the accuracy of the range ρ, whereas the right ascension α and the declination δ are much more inaccurate due to observational errors. This method can correct α, δ, assuming exact knowledge of the range ρ. Considering no perturbations from the J2 effect, but including errors in the observations, we can compare the new method, the classical method of Gibbs, and the more recent Keplerian integrals method. The development of Infang is still ongoing and will be further improved and tested.

  16. Psychological underpinnings of intrafamilial computer-mediated communication: a preliminary exploration of CMC uptake with parents and siblings.

    PubMed

    Goby, Valerie Priscilla

    2011-06-01

    This preliminary study investigates the uptake of computer-mediated communication (CMC) with parents and siblings, an area on which no research appears to have been conducted. Given the lack of relevant literature, grounded theory methodology was used and online focus group discussions were conducted in an attempt to generate suitable hypotheses for further empirical studies. Codification of the discussion data revealed various categories of meaning, namely: a perceived inappropriateness of CMC with members of family of origin; issues relating to the family generational gap; the nature of the offline sibling/parent relationship; the non-viability of online affordances such as planned self-disclosure, deception, identity construction; and disinhibition in interactions with family-of-origin members. These themes could be molded into hypotheses to assess the psychosocial limitations of CMC and to determine if it can indeed become a ubiquitous alternative to traditional communication modes as some scholars have claimed.

  17. Preliminary Computational Study for Future Tests in the NASA Ames 9-Foot x 7-Foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Pearl, Jason M.; Carter, Melissa B.; Elmiligui, Alaa A.; WInski, Courtney S.; Nayani, Sudheer N.

    2016-01-01

    The NASA Advanced Air Vehicles Program, Commercial Supersonics Technology Project seeks to advance tools and techniques to make over-land supersonic flight feasible. In this study, preliminary computational results are presented for future tests in the NASA Ames 9-foot x 7-foot supersonic wind tunnel to be conducted in early 2016. Shock-plume interactions and their effect on pressure signature are examined for six model geometries. Near-field pressure signatures are assessed using the CFD code USM3D to model the proposed test geometries in free air. Additionally, results obtained using the commercial grid generation software Pointwise (registered trademark) are compared to results using VGRID, the NASA Langley Research Center in-house mesh generation program.

  18. SIFT - A preliminary evaluation. [Software Implemented Fault Tolerant computer for aircraft control

    NASA Technical Reports Server (NTRS)

    Palumbo, D. L.; Butler, R. W.

    1983-01-01

    This paper presents the results of a performance evaluation of the SIFT computer system conducted in the NASA AIRLAB facility. The essential system functions are described and compared to both earlier design proposals and subsequent design improvements. The functions supporting fault tolerance are found to consume significant computing resources. With SIFT's specimen task load, scheduled at a 30-Hz rate, the executive tasks such as reconfiguration, clock synchronization and interactive consistency, require 55 percent of the available task slots. Other system overhead (e.g., voting and scheduling) use an average of 50 percent of each remaining task slot.

  19. Cognitive Styles among Computer Systems Students: Preliminary Findings from New Arrivals to University.

    ERIC Educational Resources Information Center

    Moore, Sarah; O'Maidin, Donncha; McElligot, Annette

    2003-01-01

    Explored the relationship among cognitive style, performance, gender, and communication among computer students. Found a significant relationship between performance and cognitive style; students whose cognitive style scores indicated a preference for analytical thinking had better performance scores when entering college than those with a…

  20. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    SciTech Connect

    Helton, J.C.; Anderson, D.R.; Baker, B.L.

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs.
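
The closing remark, that each compliance probability amounts to a large numerical integration, can be made concrete with a Monte Carlo sketch. Everything below is illustrative: the two uncertain inputs, the toy "release" performance model, and the limit value are our inventions, not the WIPP SPM's actual models.

```python
import numpy as np

rng = np.random.default_rng(1)

def compliance_probability(release_limit, n_samples=100000):
    """Estimate P(release <= limit) by sampling uncertain inputs.
    Toy performance model (hypothetical): release is a lognormal
    permeability factor times a uniform inventory factor."""
    log_permeability = rng.normal(-2.0, 0.5, n_samples)   # uncertain input 1
    inventory = rng.uniform(0.5, 1.5, n_samples)          # uncertain input 2
    release = 10.0 ** log_permeability * inventory
    # the compliance probability is the integral of the joint input
    # density over the region where the release stays below the limit
    return float(np.mean(release <= release_limit))

p = compliance_probability(0.01)
```

Repeating such an estimate for every candidate activity set is what drives the computational cost the abstract emphasizes.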

  1. Keewatin Region Educational Authority Pilot Education Project: Computer-Assisted Learning. Preliminary Final Report.

    ERIC Educational Resources Information Center

    Wolter, Heidi; And Others

    A project was conducted to improve and expand academic upgrading, job readiness, and special skill training for adults in the Keewatin Region through the implementation of computer-assisted learning (CAL). It was intended as a response to the special needs of unemployed Inuit who were not reached in the past by traditional training programs and…

  2. Computer-Animated Instruction and Students' Conceptual Change in Electrochemistry: Preliminary Qualitative Analysis

    ERIC Educational Resources Information Center

    Talib, Othman; Matthews, Robert; Secombe, Margaret

    2005-01-01

    This paper discusses the potential of applying computer-animated instruction (CAnI) as an effective conceptual change strategy in teaching electrochemistry in comparison to conventional lecture-based instruction (CLI). The core assumption in this study is that conceptual change in learners is an active, constructive process that is enhanced by the…

  3. Integrating Computer Algebra Systems in Post-Secondary Mathematics Education: Preliminary Results of a Literature Review

    ERIC Educational Resources Information Center

    Buteau, Chantal; Marshall, Neil; Jarvis, Daniel; Lavicza, Zsolt

    2010-01-01

    We present results of a literature review pilot study (326 papers) regarding the use of Computer Algebra Systems (CAS) in tertiary mathematics education. Several themes that have emerged from the review are discussed: diverse uses of CAS, benefits to student learning, issues of integration and mathematics learning, common and innovative usage of…

  4. Monitor Tone Generates Stress in Computer and VDT Operators: A Preliminary Study.

    ERIC Educational Resources Information Center

    Dow, Caroline; Covert, Douglas C.

    A near-ultrasonic pure tone of 15,570 Herz generated by flyback transformers in computer and video display terminal (VDT) monitors may cause severe non-specific irritation or stress disease in operators. Women hear higher frequency sounds than men and are twice as sensitive to "too loud" noise. Pure tones at high frequencies are more annoying than…

  5. Manipulation Pad--Preliminary Report on a Student-Computer Interface Device for Programmed Instruction.

    ERIC Educational Resources Information Center

    Ball, J. R.; And Others

    An investigation was made of some entirely new input-output devices designed to solve some functional and economic problems that are present in existing computer-controlled instruction equipment. Three forms of manipulation pad were studied. Information about specially prepared objects placed on the flat surface of the pad was supplied to a…

  6. Using computational simulation to aid in the prediction of socket fit: a preliminary study.

    PubMed

    Lee, Winson C C; Zhang, Ming

    2007-10-01

    This study illustrates the use of computational analysis to predict prosthetic socket fit. A simple indentation test is performed by applying force to the residual limb of a trans-tibial amputee through an indenter until the subject perceives the onset of pain. Computational finite element (FE) analysis is then applied to evaluate the magnitude of pressure underlying the indenter that initiates pain (pain threshold pressure), and the pressure at the prosthetic socket-residual limb interface. The assessment of socket fit is examined by studying whether or not the socket-limb interface pressure exceeds the pain threshold pressure of the limb. Based on the computer-aided assessment, a new prosthetic socket is then fabricated and fitted to the amputee subject. Successful socket fit is achieved at the end of this process. The approach of using computational analysis to aid in assessing socket fit allows a more efficient evaluation and re-design of the socket even before the actual fabrication and fitting of the prosthetic socket. However, more thorough investigations are required before this approach can be widely used. A subsequent part of this paper discusses the limitations and suggests future research directions in this area.

  7. A preliminary sensitivity analysis of the Generalized Escape System Simulation (GESS) computer program

    SciTech Connect

    Holdeman, J.T.; Liepins, G.E.; Murphy, B.D.; Ohr, S.Y.; Sworski, T.J.; Warner, G.E.

    1989-06-01

    The Generalized Escape System Simulation (GESS) program is a computerized mathematical model for dynamically simulating the performance of existing or developmental aircraft ejection seat systems. The program generates trajectory predictions with 6 degrees of freedom for the aircraft, seat/occupant, occupant alone, and seat alone systems by calculating the forces and torques imposed on these elements by seat catapults, rails, rockets, stabilization and recovery systems included in most escape system configurations. User options are provided to simulate the performance of all conventional escape system designs under most environmental conditions and aircraft attitudes or trajectories. The concept of sensitivity analysis is discussed, as is the usefulness of GESS for retrospective studies, whereby one attempts to determine the aircraft configuration at ejection from the ejection outcome. A very limited and preliminary sensitivity analysis has been done with GESS to study the way the performance of the ejection system changes with certain user-specified options or parameters. A more complete analysis would study correlations, where simultaneous correlated variations of several parameters might affect performance to an extent not predictable from the individual sensitivities. Uncertainty analysis is discussed. Even with this limited analysis, a difficulty with some simulations involving a rolling aircraft has been discovered; the code produces inconsistent trajectories. One explanation is that the integration routine is not able to deal with the stiff differential equations involved. Another possible explanation is that the coding of the coordinate transformations is faulty when large angles are involved. 7 refs., 7 tabs.
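
The integrator difficulty hypothesized above, stiff equations defeating an explicit scheme, can be illustrated on a standard linear stiff test problem (our example, unrelated to GESS internals): an implicit method stays stable at a step size where a naive explicit method blows up.

```python
import numpy as np

LAM = -1000.0  # fast decay rate; this is what makes the problem stiff

def f(t, y):
    # y' = LAM*(y - cos t) - sin t ; exact solution y(t) = cos t for y(0) = 1
    return LAM * (y - np.cos(t)) - np.sin(t)

def explicit_euler(h, t_end=1.0):
    """Forward Euler: stable only if |1 + h*LAM| <= 1, i.e. h <= 0.002."""
    t, y = 0.0, 1.0
    while t < t_end - 1e-12:
        y += h * f(t, y)
        t += h
    return y

def implicit_euler(h, t_end=1.0):
    """Backward Euler: unconditionally stable; the linear implicit
    equation at each step solves in closed form."""
    t, y = 0.0, 1.0
    while t < t_end - 1e-12:
        t_new = t + h
        y = (y + h * (-LAM * np.cos(t_new) - np.sin(t_new))) / (1.0 - h * LAM)
        t = t_new
    return y

h = 0.01                    # far beyond the explicit stability limit
y_exp = explicit_euler(h)   # amplification factor |1 + h*LAM| = 9: diverges
y_imp = implicit_euler(h)   # stays close to the exact value cos(1)
```

A rolling-aircraft ejection couples fast aerodynamic torques with slow trajectory dynamics in much the same way, which is consistent with the stiffness explanation offered in the abstract.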

  8. Single-photon emission computed tomography in human immunodeficiency virus encephalopathy: A preliminary report

    SciTech Connect

    Masdeu, J.C.; Yudd, A.; Van Heertum, R.L.; Grundman, M.; Hriso, E.; O'Connell, R.A.; Luck, D.; Camli, U.; King, L.N. )

    1991-08-01

    Depression or psychosis in a previously asymptomatic individual infected with the human immunodeficiency virus (HIV) may be psychogenic, related to brain involvement by the HIV or both. Although prognosis and treatment differ depending on etiology, computed tomography (CT) and magnetic resonance imaging (MRI) are usually unrevealing in early HIV encephalopathy and therefore cannot differentiate it from psychogenic conditions. Thirty of 32 patients (94%) with HIV encephalopathy had single-photon emission computed tomography (SPECT) findings that differed from the findings in 15 patients with non-HIV psychoses and 6 controls. SPECT showed multifocal cortical and subcortical areas of hypoperfusion. In 4 cases, cognitive improvement after 6-8 weeks of zidovudine (AZT) therapy was reflected in amelioration of SPECT findings. CT remained unchanged. SPECT may be a useful technique for the evaluation of HIV encephalopathy.

  9. High-accuracy computation of Delta V magnitude probability densities - Preliminary remarks

    NASA Technical Reports Server (NTRS)

    Chadwick, C.

    1986-01-01

    This paper describes an algorithm for the high accuracy computation of some statistical quantities of the magnitude of a random trajectory correction maneuver (TCM). The trajectory correction velocity increment Delta V is assumed to be a three-component random vector with each component being a normally distributed random scalar having a possibly nonzero mean. Knowledge of the statistical properties of the magnitude of a random TCM is important in the planning and execution of maneuver strategies for deep-space missions such as Galileo. The current algorithm involves the numerical integration of a set of differential equations. This approach allows the computation of density functions for specific Delta V magnitude distributions to high accuracy without first having to generate large numbers of random samples. Possible applications of the algorithm to maneuver planning, planetary quarantine evaluation, and guidance success probability calculations are described.
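
For the zero-mean isotropic special case, the magnitude of such a Gaussian vector follows the Maxwell distribution, with mean σ·sqrt(8/π); that closed form gives a handy check for any numerical scheme. A Monte Carlo sketch of the kind of sampling the paper's integration algorithm avoids (the σ value is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
sigma = 2.0           # per-axis standard deviation of Delta V, m/s (illustrative)
mean = np.zeros(3)    # zero-mean isotropic case has a closed-form answer

# sample the three normally distributed components and take magnitudes
samples = rng.normal(mean, sigma, size=(200000, 3))
dv_mag = np.linalg.norm(samples, axis=1)

# Maxwell distribution: E|Delta V| = sigma * sqrt(8/pi)
analytic_mean = sigma * np.sqrt(8.0 / np.pi)
mc_mean = dv_mag.mean()
```

With nonzero component means, as in the paper, no such simple closed form exists, which is precisely why a high-accuracy quadrature of the density is valuable.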

  10. Preliminary assessment of Tongue Drive System in medium term usage for computer access and wheelchair control.

    PubMed

    Yousefi, Behnaz; Huo, Xueliang; Ghovanloo, Maysam

    2011-01-01

    Tongue Drive System (TDS) is a wireless, wearable assistive technology that enables individuals with severe motor impairments to access computers, drive wheelchairs, and control their environments using tongue motion. In this paper, we have evaluated the TDS performance as a computer input device using ISO 9241-9 standard tasks for pointing and selecting, based on the well-known Fitts' law, and as a powered wheelchair controller through an obstacle course navigation task. Nine able-bodied subjects who already had tongue piercings participated in this trial over 5 sessions during 5 weeks, allowing us to study the TDS learning process and its current limiting factors. Subjects wore tongue rings made of titanium in the form of a barbell, with a small rare-earth magnetic tracer hermetically sealed inside the upper ball. Comparing the results between the 1st and 5th sessions showed that subjects' performance improved in all measures across the 5 sessions, demonstrating the effects of learning. PMID:22255650
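
The ISO 9241-9 evaluation mentioned here rests on Fitts' law. The standard (Shannon) index of difficulty and the throughput measure can be sketched as follows (the distance, width, and time values are illustrative, not the study's data):

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D/W + 1) for movement distance D onto a target of width W."""
    return math.log2(distance / width + 1.0)

def throughput(distance, width, movement_time_s):
    """Throughput in bits/s, the summary figure used in ISO 9241-9
    pointing-device evaluations."""
    return index_of_difficulty(distance, width) / movement_time_s

# e.g. a 300-px movement onto a 20-px target completed in 1.5 s
ID = index_of_difficulty(300.0, 20.0)   # log2(16) = 4 bits
TP = throughput(300.0, 20.0, 1.5)       # bits per second
```

Tracking throughput session by session is how a study like this one quantifies the learning effect it reports.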

  11. An integrated computer-program-system for the preliminary design of advanced hypersonic aircraft (PrADO-Hy)

    NASA Astrophysics Data System (ADS)

    Kossira, H.; Bardenhagen, A.; Heinze, W.

    The design program system PrADO-Hy (Preliminary Aircraft Design and Optimization - Hypersonic) for computer-aided conceptual hypersonic aircraft design, developed by the Institute of Aircraft Design and Structural Mechanics (IFL, TU Braunschweig), is introduced. Controlled by a data management system, the modular program simulates in its kernel the design process with the interactions between the different disciplines (aerodynamics, propulsion, structure, flight mechanics, etc.). The design process is overlaid with a multivariable optimization loop. This paper describes the organization of the PrADO system and the data management technique, and, as an example from the program library, the weight-and-balance module for the estimation of structural mass. The practical application and capabilities of the program system are demonstrated by a design study of a TSTO (two-stage-to-orbit) vehicle intended to transfer a 3.3-ton space payload to a low Earth orbit (80 km/450 km). The computational results of some investigations are presented.

  12. A Comparison between the Occurrence of Pauses, Repetitions and Recasts under Conditions of Face-to-Face and Computer-Mediated Communication: A Preliminary Study

    ERIC Educational Resources Information Center

    Cabaroglu, Nese; Basaran, Suleyman; Roberts, Jon

    2010-01-01

    This study compares pauses, repetitions and recasts in matched task interactions under face-to-face and computer-mediated conditions. Six first-year English undergraduates at a Turkish University took part in Skype-based voice chat with a native speaker and face-to-face with their instructor. Preliminary quantitative analysis of transcripts showed…

  13. X-ray phase computed tomography for nanoparticulated imaging probes and therapeutics: preliminary feasibility study

    NASA Astrophysics Data System (ADS)

    Tang, Xiangyang; Yang, Yi; Tang, Shaojie

    2011-03-01

    With the scientific progress in cancer biology, pharmacology and biomedical engineering, the nano-biotechnology based imaging probes and therapeutical agents (namely probes/agents) - a form of theranostics - are among the strategic solutions bearing the hope for the cure of cancer. The key feature distinguishing the nanoparticulated probes/agents from their conventional counterparts is their targeting capability. A large surface-to-volume ratio in nanoparticulated probes/agents enables the accommodation of multiple targeting, imaging and therapeutic components to cope with the intra- and inter-tumor heterogeneity. Most nanoparticulated probes/agents are synthesized with low atomic number materials and thus their x-ray attenuation are very similar to biological tissues. However, their microscopic structures are very different, which may result in significant differences in their refractive properties. Recently, the investigation in the x-ray grating-based differential phase contrast (DPC) CT has demonstrated its advantages in differentiating low-atomic materials over the conventional attenuation-based CT. We believe that a synergy of x-ray grating-based DPC CT and nanoparticulated imaging probes and therapeutic agents may play a significant role in extensive preclinical and clinical applications, or even become a modality for molecular imaging. Hence, we propose to image the refractive property of nanoparticulated imaging probes and therapeutical agents using x-ray grating-based DPC CT. In this work, we conduct a preliminary feasibility study with a focus to characterize the contrast-to-noise ratio (CNR) and contrast-detail behavior of the x-ray grating-based DPC CT. 
The data obtained may inform the architecture design and performance optimization of x-ray grating-based DPC CT for imaging biomarker-targeted probes and therapeutic agents, and may even inform the translation of preclinical theranostics research into clinical applications.

  14. Synopsis of some preliminary computational studies related to unsaturated zone transport at Area G

    SciTech Connect

    Vold, E.

    1998-03-01

    Computational transport models are described, with applications in three problem areas related to unsaturated zone moisture movement beneath Area G. These studies may be used to support the ongoing maintenance of the site Performance Assessment. The three areas are: a 1-D transient analysis with average tuff hydraulic properties in the near-surface region, with computed results compared to field data; the influence on near-surface transient moisture percolation of realistic distributions in hydraulic properties, derived statistically from the observed variance in the field data; and west-to-east moisture flow in a 2-D steady geometry approximating the Pajarito Plateau. Results indicate that a simple transient model for transport of moisture volume fraction fits the field data well for a moisture pulse observed in the active disposal unit, pit 37. Using realistic infiltration boundary conditions for summer showers and for spring snowmelt conditions, the computed moisture pulses show significant propagation to depths of less than 10 ft. Next, the hydraulic properties were varied on a 2-D grid using statistical distributions based on the field-data means and variances for the hydraulic parameters. Near-surface transient percolation under these conditions shows qualitatively realistic percolation, with a spatially variable wave front moving into the tuff; however, the flow does not channel into preferred paths, suggesting no formation of fast paths that could enhance transport of contaminants. Finally, moisture transport is modeled through an unsaturated 2-D slice representing the upper stratigraphic layers beneath Area G along a west-to-east cut of several miles, to examine possible lateral movement from the west, where percolation is assumed to be greater than at Area G. Results show some west-to-east moisture flux consistent with the assumed profile for the percolation boundary conditions.
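The 1-D transient analysis can be illustrated with a minimal explicit finite-difference sketch. A constant diffusivity stands in for the actual tuff hydraulic properties (the real model would use measured retention and conductivity curves), so the function name and every value below are illustrative:

```python
def moisture_pulse(theta0, D, dx, dt, steps, surface_theta):
    """March d(theta)/dt = D * d2(theta)/dx2 forward in time (explicit FTCS)
    with a fixed surface moisture content and a zero-gradient bottom boundary."""
    r = D * dt / dx ** 2
    assert r <= 0.5, "violates the explicit-scheme stability limit"
    theta = list(theta0)
    for _ in range(steps):
        new = theta[:]
        new[0] = surface_theta                      # infiltration boundary
        for i in range(1, len(theta) - 1):
            new[i] = theta[i] + r * (theta[i + 1] - 2 * theta[i] + theta[i - 1])
        new[-1] = new[-2]                           # zero-gradient bottom
        theta = new
    return theta

# 2 m column at 0.1 m spacing, initially uniform dry tuff, wetted surface
profile = moisture_pulse([0.05] * 20, D=1e-6, dx=0.1, dt=1000.0, steps=50,
                         surface_theta=0.3)
```

The stability constraint r = D dt / dx^2 <= 1/2 is what limits the time step of such explicit schemes.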

  15. In-tank fluid sloshing effects during earthquakes: A preliminary computational simulation

    SciTech Connect

    Park, J.E.; Rezvani, M.A.

    1995-04-01

    Hundreds of underground radioactive waste storage tanks are located at Department of Energy (DOE) sites. At present, no technique for evaluating the pressure loads due to the impact of earthquake generated waves on the side walls and dome of the tanks is known if the wave breaks back on itself. This paper presents the results of two-dimensional Computational Fluid Dynamics (CFD) calculations of the motion of waves in a generic rectangular tank as the result of accelerations recorded during an earthquake. The advantages and limitations of this technique and methods for avoiding the limitations will be discussed.

  16. Applications of computer assisted surgery and medical robotics at the ISSSTE, México: preliminary results.

    PubMed

    Mosso, José Luis; Pohl, Mauricio; Jimenez, Juan Ramon; Valdes, Raquel; Yañez, Oscar; Medina, Veronica; Arambula, Fernando; Padilla, Miguel Angel; Marquez, Jorge; Gastelum, Alfonso; Mosso, Alejo; Frausto, Juan

    2007-01-01

    We present the first results of four projects in the second phase of the Mexican project on Computer Assisted Surgery and Medical Robotics, supported by the Mexican Science and Technology National Council (Consejo Nacional de Ciencia y Tecnología) under grant SALUD-2002-C01-8181. The projects are being developed by three universities (UNAM, UAM, ITESM), and the goal is to establish a laboratory in a hospital of the ISSSTE serving endoscopic surgeons, urologists, gastrointestinal endoscopists, and neurosurgeons.

  17. Group training with healthy computing practices to prevent repetitive strain injury (RSI): a preliminary study.

    PubMed

    Peper, Erik; Gibney, Katherine H; Wilson, Vietta E

    2004-12-01

    This pilot study investigated whether group training, in which participants become role models and coaches, would reduce discomfort compared to a non-treatment control group. Sixteen experimental participants attended six weekly 2-hr group sessions of a Healthy Computing program, whereas 12 control participants received no training. None of the participants had reported symptoms to their supervisors or was receiving medical treatment for repetitive strain injury prior to the program. The program included training in ergonomic principles, psychophysiological awareness and control, sEMG practice at the workstation, and coaching coworkers. Using two-tailed t tests to analyze the data, the Experimental Group reported (1) a significant overall reduction in most body symptoms compared to the Control Group and (2) a significant increase in positive work-style habits, such as taking breaks at the computer, compared to the Control Group. This study suggests that employees could improve health and work-style patterns through a holistic training program delivered in a group format and followed by individual practice.
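The two-tailed t tests used in the group comparison can be sketched as follows. This computes Welch's t statistic and approximate degrees of freedom; the study's exact test variant and data are not given here, so the sample scores are hypothetical:

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical symptom-reduction scores: experimental vs. control group
t_stat, dof = welch_t([4, 5, 6, 7], [1, 2, 2, 3])
```

A two-tailed p-value would then be obtained from the t distribution with `dof` degrees of freedom.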

  18. Preliminary experiment of fluorescent X-ray computed tomography to detect dual agents for biological study.

    PubMed

    Yu, Q; Takeda, T; Yuasa, T; Hasegawa, Y; Wu, J; Thet-Thet-Lwin; Hyodo, K; Dilmanian, F A; Itai, Y; Akatsuka, T

    2001-05-01

    The simultaneous observation of various information, such as blood flow, tissue metabolism and distribution of receptors, is quite important in order to understand the functional state of biomedical objects. The simultaneous detectability of contrast agents by fluorescent X-ray computed tomography (FXCT) with synchrotron radiation is examined in this study. The system consisted of a silicon (111) double-crystal monochromator, an X-ray slit system, a scanning table, a PIN diode, a highly purified germanium detector and an X-ray charge-coupled device (CCD) camera. The monochromatic X-ray beam energy was adjusted to 37.0 keV and collimated into a pencil beam of 1 x 1 mm. The fluorescent spectra of the K alpha lines for iodine and xenon were detected simultaneously. FXCT could image the distribution of both iodine and xenon agents in a phantom clearly and the contrast ratio was significantly better than that of transmission X-ray computed tomography images. PMID:11486409

  19. Is early response by 18F-2-fluoro-2-deoxy-D-glucose positron emission tomography-computed tomography a predictor of long-term outcome in patients with metastatic colorectal cancer?

    PubMed Central

    Fanelli, Marcello Ferretti; Dettino, Aldo Lourenço Abadde; Nicolau, Ulisses Ribaldo; Cavicchioli, Marcelo; Lima, Eduardo Nóbrega Pereira; de Mello, Celso Abdon Lopes

    2016-01-01

    Background Identifying in advance which patients with metastatic colorectal cancer (CRC) will respond to chemotherapy would allow prompt interruption of ineffective therapies in non-responders. Hence, predictive markers are sought in numerous trials to detect responder patients, including tumor shrinkage measured by imaging methods. Usually, the Response Evaluation Criteria in Solid Tumors (RECIST) are used to evaluate tumor response in metastatic CRC, but these criteria are questionable when biological agents are associated with chemotherapy. Our aim was to correlate early metabolic response by 18F-2-fluoro-2-deoxy-D-glucose positron emission tomography-computed tomography (18FDG-PET-CT) with long-term outcome in metastatic CRC in first-line therapy. Methods We prospectively evaluated 36 patients with metastatic CRC in first-line treatment with 5-fluorouracil, leucovorin (folinic acid), oxaliplatin (FOLFOX) or 5-fluorouracil, leucovorin (folinic acid), irinotecan (FOLFIRI) associated with cetuximab or bevacizumab. 18FDG-PET-CT was performed at baseline and after two cycles of chemotherapy. The early metabolic response [standardized uptake value (SUV)] was measured to identify responder and non-responder patients and correlated with overall survival (OS) and progression-free survival (PFS). Results Median age was 58.5 years (range, 41–74 years). PFS was 15.5 months for responders and 13.3 months for non-responders (P=0.42); OS was 55.7 months for responders and not reached for non-responders. There was no correlation between delta-SUV and the clinical and pathological variables analyzed. In the subgroup of patients who did not undergo resection of metastasis (45%), PFS was higher for responders (15.3 vs. 6.8 months, P=0.02). Conclusions According to our findings, early response by 18FDG-PET-CT was not a predictor of long-term outcome for patients with metastatic CRC treated with first-line chemotherapy plus a monoclonal antibody. PMID:27284468
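The delta-SUV underlying the responder/non-responder split is a simple percent change between the baseline and post-cycle-2 scans. A sketch with an illustrative 25% cutoff (an EORTC-style value; the paper's actual threshold is not stated here, and the function names are hypothetical):

```python
def delta_suv_percent(suv_baseline, suv_followup):
    """Percent fall in SUV from the baseline scan to the follow-up scan."""
    return 100.0 * (suv_baseline - suv_followup) / suv_baseline

def is_metabolic_responder(suv_baseline, suv_followup, threshold=25.0):
    """Responder when SUV falls by at least `threshold` percent (illustrative cutoff)."""
    return delta_suv_percent(suv_baseline, suv_followup) >= threshold
```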

  20. Reproducibility of coronary atherosclerotic plaque characteristics in populations with low, intermediate, and high prevalence of coronary artery disease by multidetector computed tomography: a guide to reliable visual coronary plaque assessments.

    PubMed

    de Knegt, Martina C; Linde, Jesper J; Fuchs, Andreas; Nordestgaard, Børge G; Køber, Lars V; Hove, Jens D; Kofoed, Klaus F

    2016-10-01

    To evaluate the interobserver agreement of visual coronary plaque characteristics by 320-slice multidetector computed tomography (MDCT) in three populations with low, intermediate and high CAD prevalence and to identify determinants for the reproducible assessment of these plaque characteristics. 150 patients, 50 asymptomatic subjects from the general population (low CAD prevalence), 50 symptomatic non-acute coronary syndrome (non-ACS) patients (intermediate CAD prevalence), and 50 ACS patients (high CAD prevalence), matched according to age and gender, were retrospectively enrolled. All coronary segments were evaluated for overall image quality, evaluability, presence of CAD, coronary stenosis, plaque composition, plaque focality, and spotty calcification by four readers. Interobserver agreement was assessed using Fleiss' Kappa (κ) and intra-class correlation (ICC). Widely used clinical parameters (overall scan quality, presence of CAD, and determination of coronary stenosis) showed good agreement among the four readers (ICC = 0.66, κ = 0.73, ICC = 0.74, respectively). When accounting for heart rate, body mass index, plaque location, and coronary stenosis above/below 50 %, interobserver agreement for plaque composition, presence of CAD, and coronary stenosis improved to either good or excellent (κ = 0.61, κ = 0.81, ICC = 0.78, respectively). Spotty calcification was the least reproducible parameter investigated (κ = 0.33). Across subpopulations, reproducibility of coronary plaque characteristics generally decreased with increasing CAD prevalence, except for plaque composition (limits of agreement: ±2.03, ±1.96, ±1.79 for low, intermediate and high CAD prevalence, respectively). 320-slice MDCT can be used to assess coronary plaque characteristics, except for spotty calcification. Reproducibility estimates are influenced by heart rate, body size, plaque location, and degree of luminal stenosis.
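Fleiss' kappa, used above for multi-reader agreement on categorical plaque characteristics, is computed from a subjects-by-categories table of rater counts. A minimal pure-Python sketch with illustrative data (not the study's ratings):

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for ratings[subject][category] = number of raters who
    assigned that category; every row must sum to the same rater count n."""
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])
    n_cats = len(ratings[0])
    # overall proportion of assignments falling in each category
    p = [sum(row[j] for row in ratings) / (n_subjects * n_raters)
         for j in range(n_cats)]
    # per-subject observed agreement among rater pairs
    P = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
         for row in ratings]
    P_bar = sum(P) / n_subjects
    P_e = sum(pj * pj for pj in p)          # chance agreement
    return (P_bar - P_e) / (1 - P_e)

# Three hypothetical readers rating two segments into two categories
kappa = fleiss_kappa([[3, 0], [0, 3]])      # perfect agreement -> 1.0
```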

  1. Development of a Computer Program for Analyzing Preliminary Aircraft Configurations in Relationship to Emerging Agility Metrics

    NASA Technical Reports Server (NTRS)

    Bauer, Brent

    1993-01-01

    This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. This paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.

  2. Fluid-dynamic computations on a connection machine - Preliminary timings and complex boundary conditions

    NASA Astrophysics Data System (ADS)

    Oran, Elaine S.; Boris, Jay P.; Brown, Eugene F.

    1990-01-01

    This paper describes the conversion and application of the explicit, time-dependent, fourth-order, phase-accurate, variable-grid flux-corrected transport module, LCPFCT, to the Connection Machine, a fine-grained SIMD parallel processor. Models developed are as similar to the production Cray codes as possible and include a variety of different realistic boundary conditions. Timing comparisons show that a 16K-processor Connection Machine allows computations at speeds up to a factor of seven faster than obtained on a Cray YMP for a functionally equivalent optimized, three-dimensional code. Test calculations of a two-dimensional exploding shock and a three-dimensional helically perturbed jet are described and discussed briefly.

  3. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors, and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data is always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
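The distribution fits reported above have closed-form maximum-likelihood estimators: the log-normal parameters are the mean and standard deviation of the log-transformed data, and the exponential rate is the reciprocal of the sample mean. A sketch with illustrative samples (not the paper's vulnerability data):

```python
import math

def fit_lognormal(data):
    """MLE of (mu, sigma): mean and population std of the log-transformed data."""
    logs = [math.log(x) for x in data]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
    return mu, sigma

def fit_exponential(data):
    """MLE of the exponential rate parameter: n / sum(data)."""
    return len(data) / sum(data)

# Hypothetical daily vulnerability arrival counts and deletion intervals
mu_hat, sigma_hat = fit_lognormal([3.0, 5.0, 8.0, 12.0])
rate_hat = fit_exponential([1.5, 2.5, 4.0])
```

Fitted parameters like these can then serve as the prior probabilities the authors mention for later Bayesian analysis.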

  4. Using Gender Schema Theory to Examine Gender Equity in Computing: a Preliminary Study

    NASA Astrophysics Data System (ADS)

    Agosto, Denise E.

    Women continue to constitute a minority of computer science majors in the United States and Canada. One possible contributing factor is that most Web sites, CD-ROMs, and other digital resources do not reflect girls' design and content preferences. This article describes a pilot study that considered whether gender schema theory can serve as a framework for investigating girls' Web site design and content preferences. Eleven 14- and 15-year-old girls participated in the study. The methodology included the administration of the Children's Sex-Role Inventory (CSRI), Web-surfing sessions, interviews, and data analysis using iterative pattern coding. On the basis of their CSRI scores, the participants were divided into feminine-high (FH) and masculine-high (MH) groups. Data analysis uncovered significant differences in the criteria the groups used to evaluate Web sites. The FH group favored evaluation criteria relating to graphic and multimedia design, whereas the MH group favored evaluation criteria relating to subject content. Models of the two groups' evaluation criteria are presented, and the implications of the findings are discussed.

  5. The preliminary development of computer-assisted assessment of Chinese handwriting performance.

    PubMed

    Chang, Shao-Hsia; Yu, Nan-Ying; Shie, Jung-Jiun

    2009-06-01

    This paper describes a pilot study investigating an assessment of Chinese handwriting performance. In an attempt to computerize the existing Tseng Handwriting Problem Checklist (Tseng Checklist), this study employed MATLAB to develop a computer program, the Chinese Handwriting Assessment Program (CHAP), for the evaluation of handwriting performance. Through a template-matching approach, the program processed each character by using size-adjustable standard models to calculate the two-dimensional cross-correlation coefficient of a template and a superimposed handwritten character. The program measured size control, spacing, alignment, and the average resemblance between standard models and handwritten characters. The CHAP's test-retest reliability showed high correlation coefficients (from .81 to .94) that were statistically significant. Correlations between each CHAP and Tseng Checklist item were also statistically significant. Because handwriting performance assessment requires both quantitative and qualitative measures, the integration of the two tools is a promising means of accomplishing such an assessment.
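The two-dimensional cross-correlation coefficient that CHAP computes between a template and a superimposed handwritten character is Pearson's r taken over all pixels. A minimal sketch; the list-of-rows image representation and sample patterns are illustrative:

```python
import math

def cross_correlation_2d(a, b):
    """Normalized 2-D cross-correlation coefficient of two equal-size
    grayscale images (lists of rows): Pearson's r over the pixel pairs."""
    xs = [p for row in a for p in row]
    ys = [p for row in b for p in row]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

template = [[0, 1], [1, 0]]          # toy "stroke" pattern
written = [[0, 1], [1, 0]]           # identical handwriting -> coefficient 1
```

Values near 1 indicate close resemblance to the standard model; CHAP averages such coefficients across characters.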

  6. Knowledge-based computer-aided detection of masses on digitized mammograms: a preliminary assessment.

    PubMed

    Chang, Y H; Hardesty, L A; Hakim, C M; Chang, T S; Zheng, B; Good, W F; Gur, D

    2001-04-01

    The purpose of this work was to develop and evaluate a computer-aided detection (CAD) scheme for the improvement of mass identification on digitized mammograms using a knowledge-based approach. Three hundred pathologically verified masses and 300 negative, but suspicious, regions, as initially identified by a rule-based CAD scheme, were randomly selected from a large clinical database for development purposes. In addition, 500 different positive and 500 negative regions were used to test the scheme. This suspicious region pruning scheme includes a learning process to establish a knowledge base that is then used to determine whether a previously identified suspicious region is likely to depict a true mass. This is accomplished by quantitatively characterizing the set of known masses, measuring "similarity" between a suspicious region and a "known" mass, then deriving a composite "likelihood" measure based on all "known" masses to determine the state of the suspicious region. To assess the performance of this method, receiver-operating characteristic (ROC) analyses were employed. Using a leave-one-out validation method with the development set of 600 regions, the knowledge-based CAD scheme achieved an area under the ROC curve of 0.83. Fifty-one percent of the previously identified false-positive regions were eliminated, while maintaining 90% sensitivity. During testing of the 1,000 independent regions, an area under the ROC curve as high as 0.80 was achieved. Knowledge-based approaches can yield a significant reduction in false-positive detections while maintaining reasonable sensitivity. This approach has the potential of improving the performance of other rule-based CAD schemes.
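The similarity-then-composite-likelihood idea above can be sketched abstractly. The Gaussian kernel and the normalization below are illustrative stand-ins, not the paper's actual measures, and all names and data are hypothetical:

```python
import math

def similarity(x, y):
    """Gaussian kernel on squared Euclidean distance between feature vectors
    (one of many possible similarity measures)."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2)

def mass_likelihood(region, known_masses, known_negatives):
    """Composite likelihood that a suspicious region depicts a true mass:
    summed similarity to known masses, normalized by similarity to all
    known examples in the knowledge base."""
    s_mass = sum(similarity(region, m) for m in known_masses)
    s_all = s_mass + sum(similarity(region, n) for n in known_negatives)
    return s_mass / s_all
```

Sweeping a decision threshold over this likelihood is what generates the ROC curves used to evaluate the scheme.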

  7. A comparative study of database reduction methods for case-based computer-aided detection systems: preliminary results

    NASA Astrophysics Data System (ADS)

    Mazurowski, Maciej A.; Malof, Jordan M.; Zurada, Jacek M.; Tourassi, Georgia D.

    2009-02-01

    In case-based computer-aided decision systems (CB-CAD), a query case is compared to known examples stored in the system's case base (also called a reference library). These systems offer competitive classification performance and are easy to expand. However, they also require efficient management of the case base. As CB-CAD systems become more popular, the problem of case base optimization has attracted interest among CAD researchers. In this paper we present preliminary results of a study comparing several case base reduction techniques. We implemented six techniques previously proposed in the machine learning literature and applied them to the classification problem of distinguishing masses and normal tissue in mammographic regions of interest. The results show that the random mutation hill climbing technique offers a drastic reduction of the number of case base examples while providing a significant improvement in classification performance. Random selection allowed for reduction of the case base to 30% without notable decrease in performance. The remaining techniques (i.e., condensed nearest neighbor, reduced nearest neighbor, edited nearest neighbor, and All k-NN) resulted in moderate reduction (to 50-70% of the original size) at the cost of a decrease in CB-CAD performance.
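Random mutation hill climbing over the case base, the best-performing reduction technique above, can be sketched as tuning a binary inclusion mask against a validation set: flip one random bit, keep the flip if validation accuracy does not drop. The 1-NN classifier and all data below are illustrative, not the study's CAD features:

```python
import random

def knn1_accuracy(train, validation):
    """Accuracy of a 1-nearest-neighbour classifier; examples are (features, label)."""
    if not train:
        return 0.0
    correct = 0
    for feats, label in validation:
        nearest = min(train,
                      key=lambda ex: sum((a - b) ** 2 for a, b in zip(ex[0], feats)))
        correct += (nearest[1] == label)
    return correct / len(validation)

def rmhc_reduce(case_base, validation, iterations=200, seed=0):
    """Random mutation hill climbing over a binary inclusion mask:
    flip one random bit, keep the flip if validation accuracy does not drop."""
    rng = random.Random(seed)
    mask = [rng.random() < 0.5 for _ in case_base]

    def score(m):
        return knn1_accuracy([c for c, keep in zip(case_base, m) if keep],
                             validation)

    best = score(mask)
    for _ in range(iterations):
        i = rng.randrange(len(mask))
        mask[i] = not mask[i]
        s = score(mask)
        if s >= best:
            best = s
        else:
            mask[i] = not mask[i]       # revert the mutation
    return mask, best

cases = [((0.0,), 0), ((0.1,), 0), ((5.0,), 1), ((5.1,), 1)]
held_out = [((0.05,), 0), ((5.05,), 1)]
mask, acc = rmhc_reduce(cases, held_out)
```

Accepting neutral flips (`s >= best`) lets the mask drift toward smaller case bases without losing accuracy.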

  8. Optical computed tomography utilizing a rotating mirror and Fresnel lenses: operating principles and preliminary results

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Wuu, Cheng-Shie

    2013-02-01

    The performance of a fast optical computed tomography (CT) scanner based on a point laser source, a small area photodiode detector, and two optical-grade Fresnel lenses is evaluated. The OCTOPUS™-10× optical CT scanner (MGS Research Inc., Madison, CT) is an upgrade of the OCTOPUS™ research scanner with improved design for faster motion of the laser beam and faster data acquisition process. The motion of the laser beam in the new configuration is driven by the rotational motion of a scanning mirror. The center of the scanning mirror and the center of the photodiode detector are adjusted to be on the focal points of two coaxial Fresnel lenses. A glass water tank is placed between the two Fresnel lenses to house gel phantoms and matching liquids. The laser beam scans over the water tank in parallel beam geometry for projection data as the scanning mirror rotates faster than 0.1 s per circle. Signal sampling is performed independently of the motion of the scanning mirror, to reduce the processing time for the synchronization of the stepper motors and the data acquisition board. An in-house developed reference image normalization mechanism is added to the image reconstruction program to correct the non-uniform light transmitting property of the Fresnel lenses. Technical issues with regard to the new design of the scanner are addressed, including projection data extraction from raw data samples, non-uniform pixel averaging and reference image normalization. To evaluate the dosimetric accuracy of the scanner, the reconstructed images from a 16 MeV, 6 cm × 6 cm electron field irradiation were compared with those from the Eclipse treatment planning system (Varian Corporation, Palo Alto, CA). The spatial resolution of the scanner is demonstrated to be of sub-millimeter accuracy. The effectiveness of the reference normalization method for correcting the non-uniform light transmitting property of the Fresnel lenses is analyzed. A sub-millimeter accuracy of…

  9. Optical computed tomography utilizing a rotating mirror and Fresnel lenses: operating principles and preliminary results.

    PubMed

    Xu, Y; Wuu, Cheng-Shie

    2013-02-01

    The performance of a fast optical computed tomography (CT) scanner based on a point laser source, a small area photodiode detector, and two optical-grade Fresnel lenses is evaluated. The OCTOPUS™-10× optical CT scanner (MGS Research Inc., Madison, CT) is an upgrade of the OCTOPUS™ research scanner with improved design for faster motion of the laser beam and faster data acquisition process. The motion of the laser beam in the new configuration is driven by the rotational motion of a scanning mirror. The center of the scanning mirror and the center of the photodiode detector are adjusted to be on the focal points of two coaxial Fresnel lenses. A glass water tank is placed between the two Fresnel lenses to house gel phantoms and matching liquids. The laser beam scans over the water tank in parallel beam geometry for projection data as the scanning mirror rotates faster than 0.1 s per circle. Signal sampling is performed independently of the motion of the scanning mirror, to reduce the processing time for the synchronization of the stepper motors and the data acquisition board. An in-house developed reference image normalization mechanism is added to the image reconstruction program to correct the non-uniform light transmitting property of the Fresnel lenses. Technical issues with regard to the new design of the scanner are addressed, including projection data extraction from raw data samples, non-uniform pixel averaging and reference image normalization. To evaluate the dosimetric accuracy of the scanner, the reconstructed images from a 16 MeV, 6 cm × 6 cm electron field irradiation were compared with those from the Eclipse treatment planning system (Varian Corporation, Palo Alto, CA). The spatial resolution of the scanner is demonstrated to be of sub-millimeter accuracy. The effectiveness of the reference normalization method for correcting the non-uniform light transmitting property of the Fresnel lenses is analyzed. A sub…
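The reference image normalization described above amounts to dividing each projection sample by the corresponding reference (empty-tank) sample before taking the logarithm, which cancels the lenses' non-uniform transmission from the attenuation line integral. A minimal sketch with illustrative values (the actual scanner processing is more involved):

```python
import math

def normalized_projection(i_sample, i_reference):
    """Per-detector attenuation line integral p = ln(I_ref / I_sample);
    the division cancels any fixed non-uniform transmission of the optics."""
    return [math.log(r / s) for s, r in zip(i_sample, i_reference)]

# Illustrative intensities: half the light absorbed at one sample position
projection = normalized_projection([0.5, 1.0], [1.0, 1.0])
```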

  10. Functional Analysis and Preliminary Specifications for a Single Integrated Central Computer System for Secondary Schools and Junior Colleges. A Feasibility and Preliminary Design Study. Interim Report.

    ERIC Educational Resources Information Center

    Computation Planning, Inc., Bethesda, MD.

    A feasibility analysis of a single integrated central computer system for secondary schools and junior colleges finds that a central computing facility capable of serving 50 schools with a total enrollment of 100,000 students is feasible at a cost of $18 per student per year. The recommended system is a multiprogrammed-batch operation. Preliminary…

  11. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... mortality or morbidity assumptions and interest rates for both the preliminary term basis and the net level... forth sufficient information as to mortality and morbidity assumptions; interest rates; the valuation... section 818(c) in the manner provided in paragraph (e) of this section, the amount to be taken...

  12. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... mortality or morbidity assumptions and interest rates for both the preliminary term basis and the net level... forth sufficient information as to mortality and morbidity assumptions; interest rates; the valuation... section 818(c) in the manner provided in paragraph (e) of this section, the amount to be taken...

  13. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... mortality or morbidity assumptions and interest rates for both the preliminary term basis and the net level... forth sufficient information as to mortality and morbidity assumptions; interest rates; the valuation... section 818(c) in the manner provided in paragraph (e) of this section, the amount to be taken...

  14. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... mortality or morbidity assumptions and interest rates for both the preliminary term basis and the net level... forth sufficient information as to mortality and morbidity assumptions; interest rates; the valuation... section 818(c) in the manner provided in paragraph (e) of this section, the amount to be taken...

  15. Preliminary Analysis of a Randomized Trial of Computer Attention Training in Children with Attention-Deficit/Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Steiner, N.; Sidhu, T. K.; Frenette, E. C.; Mitchell, K.; Perrin, E. C.

    2011-01-01

    Clinically significant attention problems among children present a significant obstacle to increasing student achievement. Computer-based attention training holds great promise as a way for schools to address this problem. The aim of this project is to evaluate the efficacy of two computer-based attention training systems in schools. One program…

  16. Computer-Assisted Learning in Anatomy at the International Medical School in Debrecen, Hungary: A Preliminary Report

    ERIC Educational Resources Information Center

    Kish, Gary; Cook, Samuel A.; Kis, Greta

    2013-01-01

    The University of Debrecen's Faculty of Medicine has an international, multilingual student population with anatomy courses taught in English to all but Hungarian students. An elective computer-assisted gross anatomy course, the Computer Human Anatomy (CHA), has been taught in English at the Anatomy Department since 2008. This course focuses on an…

  17. Addressing Cultural Context in the Development of Performance-based Assessments and Computer-adaptive Testing: Preliminary Validity Considerations.

    ERIC Educational Resources Information Center

    Boodoo, Gwyneth M.

    1998-01-01

    Discusses the research and steps needed to develop performance-based and computer-adaptive assessments that are culturally responsive. Supports the development of a new conceptual framework and more explicit guidelines for designing culturally responsive assessments. (SLD)

  18. Monitoring the Microgravity Environment Quality On-board the International Space Station Using Soft Computing Techniques. Part 2; Preliminary System Performance Results

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Lin, Paul P.; Weiss, Daniel S.

    2002-01-01

    This paper presents the preliminary performance results of the artificial intelligence monitoring system in full operational mode, using near-real-time acceleration data downlinked from the International Space Station. A preliminary microgravity environment characterization analysis result for the International Space Station (Increment-2), obtained using the monitoring system, is presented. Also presented is a comparison between the system's predicted performance, based on ground test data for the US laboratory module "Destiny", and its actual on-orbit performance, using measured acceleration data from the US laboratory module of the International Space Station. Finally, preliminary on-orbit disturbance magnitude levels are presented for the Experiment of Physics of Colloids in Space and compared with ground test data. The ground test data for the Experiment of Physics of Colloids in Space were acquired from the Microgravity Emission Laboratory, located at the NASA Glenn Research Center, Cleveland, Ohio. The monitoring system was developed by the NASA Glenn Principal Investigator Microgravity Services Project to help principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment in time, on board the International Space Station and that might affect the microgravity environment their experiments are exposed to. From the Principal Investigator Microgravity Services web site, principal investigator teams can monitor in near real time, via a dynamic graphical display implemented in Java, which events are active, such as crew activities, pumps, fans, centrifuges, compressors, crew exercise, and structural modes, and decide whether or not to run their experiments, whenever that is an option, based on the acceleration magnitude and frequency sensitivity associated with each experiment. This monitoring system detects primarily vibratory disturbance sources. The system has built-in capability to detect both known…

  19. Computer-designed selective laser sintering surgical guide and immediate loading dental implants with definitive prosthesis in edentulous patient: A preliminary method

    PubMed Central

    Giacomo, Giovanni Di; Silva, Jorge; Martines, Rodrigo; Ajzen, Sergio

    2014-01-01

    Objective: The aim of this study was to analyze a preliminary method of immediately loading dental implants and a definitive prosthesis based on computer-aided design/computer-aided manufacturing (CAD/CAM) systems, after 2 years of clinical follow-up. Materials and Methods: The study comprised one patient in good general health with an edentulous maxilla. Cone beam computed tomography (CBCT) was performed using a radiographic template. The surgical plan was made using the digital imaging and communications in medicine protocol with ImplantViewer (version 1.9, Anne Solutions, Sao Paulo, SP, Brazil), the surgical planning software. These data were used to produce a selective laser sintering surgical template. A maxilla prototype was used to guide the prosthesis technician in producing the prosthesis. Eight dental implants and a definitive prosthesis were installed on the same day. A post-operative CBCT image was fused with the image of the surgical planning to calculate the deviation between the planned and placed implant positions. The patient was followed for 2 years. Results: On average, the angular deviation between the planned and placed implants was 6.0 ± 3.4°, and the coronal deviation was 0.7 ± 0.3 mm. At the end of the follow-up, none of the implants or the prosthesis was lost. Conclusions: Considering the limited sample size, it was possible to install the dental implants and a definitive prosthesis on the same day with success. PMID:24966755
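    The reported angular and coronal deviations can be computed directly from the fused planning and post-operative geometries. A minimal sketch in Python (a hypothetical helper for illustration, not part of ImplantViewer): the angular deviation is the angle between the planned and placed implant axes, and the coronal deviation is the distance between the planned and placed platform points.

```python
import math

def implant_deviation(planned_axis, placed_axis, planned_coronal, placed_coronal):
    """Angular deviation (degrees) between implant axes and linear
    deviation (mm) between coronal platform points; illustrative only."""
    dot = sum(a * b for a, b in zip(planned_axis, placed_axis))
    norm = math.hypot(*planned_axis) * math.hypot(*placed_axis)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    coronal = math.dist(planned_coronal, placed_coronal)
    return angle, coronal
```

    Applying this per implant and averaging over the eight implants yields summary figures of the kind reported above.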

  20. Modeling resident error-making patterns in detection of mammographic masses using computer-extracted image features: preliminary experiments

    NASA Astrophysics Data System (ADS)

    Mazurowski, Maciej A.; Zhang, Jing; Lo, Joseph Y.; Kuzmiak, Cherie M.; Ghate, Sujata V.; Yoon, Sora

    2014-03-01

    Providing high quality mammography education to radiology trainees is essential, as good interpretation skills potentially ensure the highest benefit of screening mammography for patients. We have previously proposed a computer-aided education system that utilizes trainee models, which relate human-assessed image characteristics to interpretation error. We proposed that these models be used to identify the most difficult and therefore the most educationally useful cases for each trainee. In this study, as a next step in our research, we propose to build trainee models that utilize features that are automatically extracted from images using computer vision algorithms. To predict error, we used a logistic regression which accepts imaging features as input and returns error as output. Reader data from 3 experts and 3 trainees were used. Receiver operating characteristic analysis was applied to evaluate the proposed trainee models. Our experiments showed that, for three trainees, our models were able to predict error better than chance. This is an important step in the development of adaptive computer-aided education systems since computer-extracted features will allow for faster and more extensive search of imaging databases in order to identify the most educationally beneficial cases.
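    The modeling and evaluation steps described above can be sketched in a few lines of pure Python. This is a toy stand-in, assuming a binary "error / no error" label per case and a small numeric feature vector; the paper's actual computer-extracted features and fitting procedure are not reproduced here.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=500):
    """Plain gradient-descent logistic regression: image features in,
    probability of interpretation error out."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            g = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def roc_auc(scores, labels):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    positive case outscores a negative one (0.5 = chance)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    "Better than chance" in the abstract corresponds to an AUC significantly above 0.5 under this kind of evaluation.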

  1. Computed tomography of the brain in children with minimal brain damage: a preliminary study of 46 children.

    PubMed

    Bergström, K; Bille, B

    1978-11-01

    Forty-six children aged 4 to 15 years with minimal brain damage (MBD) underwent computed cranial tomography (CT). The criteria used for a diagnosis of MDB was the presence of clinical features of a developmental disturbance of the central nervous system causing incoordination. CT revealed abnormalities in 15 cases (32.6%), consisting in cerebral atrophy, asymmetry or an anomaly.

  2. The Opinions of the Kindergarten Teachers in Relation to the Introduction of Computers to Nursery Schools: Preliminary Approach

    ERIC Educational Resources Information Center

    Sivropoulou, Irene; Tsapakidou, Aggeliki; Kiridis, Argyris

    2009-01-01

    Computers were introduced in Greek kindergartens of our country with the new curricula for kindergarten, "Inter-disciplinary Integrated Framework of Study Programs," ("Official Journal of the Hellenic Republic," 376 t.B/18-10-2001, article 6), in order to contribute to the spherical growth of children and to extend their…

  3. A Framework for Measuring Student Learning Gains and Engagement in an Introductory Computing Course: A Preliminary Report of Findings

    ERIC Educational Resources Information Center

    Lim, Billy; Hosack, Bryan; Vogt, Paul

    2012-01-01

    This paper describes a framework for measuring student learning gains and engagement in a Computer Science 1 (CS 1) / Information Systems 1 (IS 1) course. The framework is designed for a CS1/IS1 course as it has been traditionally taught over the years as well as when it is taught using a new pedagogical approach with Web services. It enables the…

  4. Positron Emission Tomography: A Basic Analysis

    NASA Astrophysics Data System (ADS)

    Kerbacher, M. E.; Deaton, J. W.; Phinney, L. C.; Mitchell, L. J.; Duggan, J. L.

    2007-10-01

    Positron Emission Tomography is useful in detecting biological abnormalities. The technique involves attaching radiotracers to a material used inside the body, in many cases glucose. Glucose is absorbed most readily in areas of unusual cell growth or uptake of nutrients, so through natural processes the treated glucose highlights regions of tumors and other degenerative disorders such as Alzheimer's disease. The higher the concentration of isotopes, the more dynamic the area. Isotopes commonly used as tracers are 11C, 18F, 13N, and 15O due to their easy production and short half-lives. Once the tracers have saturated an area of tissue, they are detected using coincidence detectors collinear with individual isotopes. As the isotope decays it emits a positron which, upon annihilation with an electron, produces two oppositely directed gamma rays. The PET machine consists of several pairs of detectors, each 180 degrees from its partner detector. When the oppositely positioned detectors are collinear with the area of the isotope, a computer registers the location of the isotope and can compile an image of the activity of the highlighted area based on the position and strength of the isotopes.
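    The coincidence geometry can be illustrated with a toy simulation: back-to-back gammas emitted from an annihilation point strike two detectors on a ring, and the chord between them (the line of response) passes through the source. The ring radius and detector count below are assumptions for illustration, not any particular scanner.

```python
import math

RING_RADIUS = 40.0   # cm, assumed ring radius
N_DET = 360          # assumed number of detectors on the ring

def detector_pair(src, theta):
    """Indices of the two ring detectors hit by back-to-back gammas
    emitted from `src` along angle `theta` (line-circle intersection)."""
    dx, dy = math.cos(theta), math.sin(theta)
    hits = []
    for sign in (1.0, -1.0):
        b = sign * (src[0] * dx + src[1] * dy)
        c = src[0] ** 2 + src[1] ** 2 - RING_RADIUS ** 2
        t = -b + math.sqrt(b * b - c)   # positive root: travel distance
        x, y = src[0] + sign * t * dx, src[1] + sign * t * dy
        hits.append(round(math.atan2(y, x) / (2 * math.pi) * N_DET) % N_DET)
    return tuple(hits)

def lor_distance(src, pair):
    """Perpendicular distance from `src` to the line of response (chord)."""
    (x1, y1), (x2, y2) = (
        (RING_RADIUS * math.cos(2 * math.pi * i / N_DET),
         RING_RADIUS * math.sin(2 * math.pi * i / N_DET)) for i in pair)
    num = abs((x2 - x1) * (y1 - src[1]) - (x1 - src[0]) * (y2 - y1))
    return num / math.hypot(x2 - x1, y2 - y1)
```

    Every recorded line of response passes through (or, after detector quantization, very near) the source; accumulating and backprojecting many of them is the basis of the image compilation step described above.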

  5. Preliminary Development of a Workstation for Craniomaxillofacial Surgical Procedures: Introducing a Computer-Assisted Planning and Execution System

    PubMed Central

    Gordon, Chad R.; Murphy, Ryan J.; Coon, Devin; Basafa, Ehsan; Otake, Yoshito; Al Rakan, Mohammed; Rada, Erin; Susarla, Sriniras; Swanson, Edward; Fishman, Elliot; Santiago, Gabriel; Brandacher, Gerald; Liacouras, Peter; Grant, Gerald; Armand, Mehran

    2014-01-01

    Introduction Facial transplantation represents one of the most complicated scenarios in craniofacial surgery because of skeletal, aesthetic, and dental discrepancies between donor and recipient. However, standard off-the-shelf vendor computer-assisted surgery systems may not provide custom features to mitigate the increased complexity of this particular procedure. We propose to develop a computer-assisted surgery solution customized for preoperative planning, intraoperative navigation including cutting guides, and dynamic, instantaneous feedback of cephalometric measurements/angles as needed for facial transplantation. Methods We developed the Computer-Assisted Planning and Execution (CAPE) workstation to assist with planning and execution of facial transplantation. Preoperative maxillofacial computed tomography (CT) scans were obtained on 4 size-mismatched miniature swine encompassing 2 live face-jaw-teeth transplants. The system was tested in a laboratory setting using plastic models of mismatched swine, after which the system was used in 2 live swine transplants. Postoperative CT imaging was obtained and compared with the preoperative plan and intraoperative measures from the CAPE workstation for both transplants. Results Plastic model tests familiarized the team with the CAPE workstation and identified several defects in the workflow. Live swine surgeries demonstrated utility of the CAPE system in the operating room, showing submillimeter registration error of 0.6 ± 0.24 mm and promising qualitative comparisons between intraoperative data and postoperative CT imaging. Conclusions The initial development of the CAPE workstation demonstrated that the integration of computer planning and intraoperative navigation for facial transplantation is possible with submillimeter accuracy. This approach can potentially improve preoperative planning, allowing ideal donor-recipient matching despite significant size mismatch, and accurate surgical execution. PMID:24406592
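    A registration error reported as mean ± standard deviation is a summary over per-landmark distances between planned and achieved positions. A sketch of that summary computation (with hypothetical landmark coordinates, not the CAPE software's registration pipeline):

```python
import math
import statistics

def registration_error(planned, achieved):
    """Mean and standard deviation (same units as input) of per-landmark
    Euclidean distances between planned and postoperative positions."""
    dists = [math.dist(p, a) for p, a in zip(planned, achieved)]
    return statistics.mean(dists), statistics.stdev(dists)
```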

  6. An analysis of Space Shuttle countdown activities: Preliminaries to a computational model of the NASA Test Director

    NASA Technical Reports Server (NTRS)

    John, Bonnie E.; Remington, Roger W.; Steier, David M.

    1991-01-01

    Before all systems are go just prior to the launch of a space shuttle, thousands of operations and tests have been performed to ensure that all shuttle and support subsystems are operational and ready for launch. These steps, which range from activating the orbiter's flight computers to removing the launch pad from the itinerary of the NASA tour buses, are carried out by launch team members at various locations and with highly specialized fields of expertise. The responsibility for coordinating these diverse activities rests with the NASA Test Director (NTD) at NASA-Kennedy. The behavior of the NTD is being studied with the goal of building a detailed computational model of that behavior; the results of the analysis to date are given. The NTD's performance is described in detail, as a team member who must coordinate a complex task through efficient audio communication, as well as an individual taking notes and consulting manuals. A model of the routine cognitive skill used by the NTD to follow the launch countdown procedure manual was implemented using the Soar cognitive architecture. Several examples are given of how such a model could aid in evaluating proposed computer support systems.

  7. Design of a Tablet Computer App for Facilitation of a Molecular Blood Culture Test in Clinical Microbiology and Preliminary Usability Evaluation

    PubMed Central

    Pape-Haugaard, Louise; Meltzer, Michelle C; Fuchs, Martin; Schønheyder, Henrik C; Hejlesen, Ole

    2016-01-01

    Background User mobility is an important aspect of the development of clinical information systems for health care professionals. Mobile phones and tablet computers have obtained widespread use by health care professionals, offering an opportunity for supporting the access to patient information through specialized applications (apps) while supporting the mobility of the users. The use of apps for mobile phones and tablet computers may support workflow of complex tasks, for example, molecular-based diagnostic tests in clinical microbiology. Multiplex Blood Culture Test (MuxBCT) is a molecular-based diagnostic test used for rapid identification of pathogens in positive blood cultures. To facilitate the workflow of the MuxBCT, a specialized tablet computer app was developed as an accessory to the diagnostic test. The app aims to reduce the complexity of the test by step-by-step guidance of microscopy and to assist users in reaching an exact bacterial or fungal diagnosis based on blood specimen observations and controls. Additionally, the app allows for entry of test results, and communication thereof to the laboratory information system (LIS). Objective The objective of the study was to describe the design considerations of the MuxBCT app and the results of a preliminary usability evaluation. Methods The MuxBCT tablet app was developed and set up for use in a clinical microbiology laboratory. A near-live simulation study was conducted in the clinical microbiology laboratory to evaluate the usability of the MuxBCT app. The study was designed to achieve a high degree of realism as participants carried out a scenario representing the context of use for the MuxBCT app. As the MuxBCT was under development, the scenario involved the use of molecular blood culture tests similar to the MuxBCT for identification of microorganisms from positive blood culture samples. The study participants were observed, and their interactions with the app were recorded. 
After the study, the

  8. Design and preliminary evaluation of the FINGER rehabilitation robot: controlling challenge and quantifying finger individuation during musical computer game play

    PubMed Central

    2014-01-01

    Background This paper describes the design and preliminary testing of FINGER (Finger Individuating Grasp Exercise Robot), a device for assisting in finger rehabilitation after neurologic injury. We developed FINGER to assist stroke patients in moving their fingers individually in a naturalistic curling motion while playing a game similar to Guitar Hero®. The goal was to make FINGER capable of assisting with motions where precise timing is important. Methods FINGER consists of a pair of stacked single degree-of-freedom 8-bar mechanisms, one for the index and one for the middle finger. Each 8-bar mechanism was designed to control the angle and position of the proximal phalanx and the position of the middle phalanx. Target positions for the mechanism optimization were determined from trajectory data collected from 7 healthy subjects using color-based motion capture. The resulting robotic device was built to accommodate multiple finger sizes and finger-to-finger widths. For initial evaluation, we asked individuals with a stroke (n = 16) and without impairment (n = 4) to play a game similar to Guitar Hero® while connected to FINGER. Results Precision design, low friction bearings, and separate high speed linear actuators allowed FINGER to individually actuate the fingers with a high bandwidth of control (−3 dB at approximately 8 Hz). During the tests, we were able to modulate the subjects' success rate at the game by automatically adjusting the controller gains of FINGER. We also used FINGER to measure subjects' effort and finger individuation while playing the game. Conclusions Test results demonstrate the ability of FINGER to motivate subjects with an engaging game environment that challenges individuated control of the fingers, automatically control assistance levels, and quantify finger individuation after stroke. PMID:24495432
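    The automatic adjustment of controller gains to modulate success rate can be illustrated with a simple proportional adaptation rule. This is an assumed controller sketched for illustration; the paper does not specify this exact law.

```python
def update_assist_gain(gain, success_rate, target=0.7, step=0.05,
                       lo=0.0, hi=1.0):
    """Raise robotic assistance when the player falls below the target
    success rate, lower it when above, clamped to [lo, hi]."""
    return min(hi, max(lo, gain + step * (target - success_rate)))
```

    Called once per song or per block of notes, a rule like this keeps each subject near a chosen challenge level regardless of impairment severity.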

  9. Applying computational nanomaterials design to the reactive ion etching of NiO thin films—a preliminary investigation

    NASA Astrophysics Data System (ADS)

    David, M.; Muhida, R.; Roman, T.; Kunikata, S.; Diño, W. A.; Nakanishi, H.; Kasai, H.; Takano, F.; Shima, H.; Akinaga, H.

    2007-09-01

    We have developed and proposed a model for reactive ion etching (RIE) process design of nickel oxide thin films using computational materials design based on ab initio calculations. On etching NiO, we found that it was necessary to have hydrogen-based reactive gases (e.g. NH3, CH4) in the initial state in order to enhance RIE. We strongly suggest the use of CH4 or any H-based gas source other than CHF3 to enhance the RIE process.

  10. Preliminary validation of computational model for neutron flux prediction of Thai Research Reactor (TRR-1/M1)

    NASA Astrophysics Data System (ADS)

    Sabaibang, S.; Lekchaum, S.; Tipayakul, C.

    2015-05-01

    This study is a part of an on-going work to develop a computational model of Thai Research Reactor (TRR-1/M1) which is capable of accurately predicting the neutron flux level and spectrum. The computational model was created by MCNPX program and the CT (Central Thimble) in-core irradiation facility was selected as the location for validation. The comparison was performed with the typical flux measurement method routinely practiced at TRR-1/M1, that is, the foil activation technique. In this technique, gold foil is irradiated for a certain period of time and the activity of the irradiated target is measured to derive the thermal neutron flux. Additionally, the flux measurement with SPND (self-powered neutron detector) was also performed for comparison. The thermal neutron flux from the MCNPX simulation was found to be 1.79 × 10¹³ neutron/cm²·s while that from the foil activation measurement was 4.68 × 10¹³ neutron/cm²·s. On the other hand, the thermal neutron flux from the measurement using SPND was 2.47 × 10¹³ neutron/cm²·s. An assessment of the differences among the three methods was done. The difference of the MCNPX with the foil activation technique was found to be 67.8% and the difference of the MCNPX with the SPND was found to be 27.8%.

  11. Computer simulations of comet- and asteroidlike bodies passing through the Venusian atmosphere: Preliminary results on atmospheric and ground shock effects

    NASA Technical Reports Server (NTRS)

    Roddy, D.; Hatfield, D.; Hassig, P.; Rosenblatt, M.; Soderblom, L.; Dejong, E.

    1992-01-01

    We have completed computer simulations that model shock effects in the venusian atmosphere caused during the passage of two cometlike bodies 100 m and 1000 m in diameter and an asteroidlike body 10 km in diameter. Our objective is to examine hypervelocity-generated shock effects in the venusian atmosphere for bodies of different types and sizes in order to understand the following: (1) their deceleration and depth of penetration through the atmosphere; and (2) the onset of possible ground-surface shock effects such as splotches, craters, and ejecta formations. The three bodies were chosen to include both a range of general conditions applicable to Venus as well as three specific cases of current interest. These calculations use a new multiphase computer code (DICE-MAZ) designed by California Research & Technology for shock-dynamics simulations in complex environments. The code was tested and calibrated in large-scale explosion, cratering, and ejecta research. It treats a wide range of different multiphase conditions, including material types (vapor, melt, solid), particle-size distributions, and shock-induced dynamic changes in velocities, pressures, temperatures (internal energies), densities, and other related parameters, all of which were recorded in our calculations.

  12. A preliminary study of a cloud-computing model for chronic illness self-care support in an underdeveloped country

    PubMed Central

    Piette, John D.; Mendoza-Avelares, Milton O.; Ganser, Martha; Mohamed, Muhima; Marinec, Nicolle; Krishnan, Sheila

    2013-01-01

    Background Although interactive voice response (IVR) calls can be an effective tool for chronic disease management, many regions of the world lack the infrastructure to provide these services. Objective This study evaluated the feasibility and potential impact of an IVR program using a cloud-computing model to improve diabetes management in Honduras. Methods A single group, pre-post study was conducted between June and August 2010. The telecommunications infrastructure was maintained on a U.S. server, and calls were directed to patients’ cell phones using VoIP. Eighty-five diabetes patients in Honduras received weekly IVR disease management calls for six weeks, with automated follow-up emails to clinicians, and voicemail reports to family caregivers. Patients completed interviews at enrollment and a six week follow-up. Other measures included patients’ glycemic control (A1c) and data from the IVR calling system. Results 55% of participants completed the majority of their IVR calls and 33% completed 80% or more. Higher baseline blood pressures, greater diabetes burden, greater distance from the clinic, and better adherence were related to higher call completion rates. Nearly all participants (98%) reported that because of the program, they improved in aspects of diabetes management such as glycemic control (56%) or foot care (89%). Mean A1c’s decreased from 10.0% at baseline to 8.9% at follow-up (p<.01). Most participants (92%) said that if the service were available in their clinic they would use it again. Conclusions Cloud computing is a feasible strategy for providing IVR services globally. IVR self-care support may improve self-care and glycemic control for patients in under-developed countries. PMID:21565655

  13. Preliminary evaluation of the dosimetric accuracy of cone-beam computed tomography for cases with respiratory motion

    NASA Astrophysics Data System (ADS)

    Kim, Dong Wook; Bae, Sunhyun; Chung, Weon Kuu; Lee, Yoonhee

    2014-04-01

    Cone-beam computed tomography (CBCT) images are currently used for patient positioning and adaptive dose calculation; however, the degree of CBCT uncertainty in cases of respiratory motion remains an interesting issue. This study evaluated the uncertainty of CBCT-based dose calculations for a moving target. Using a phantom, we estimated differences in the geometries and the Hounsfield units (HU) between CT and CBCT. The calculated dose distributions based on CT and CBCT images were also compared using a radiation treatment planning system, and the comparison included cases with respiratory motion. The geometrical uncertainties of the CT and the CBCT images were less than 0.15 cm. The HU differences between CT and CBCT images for standard-dose-head, high-quality-head, normal-pelvis, and low-dose-thorax modes were 31, 36, 23, and 33 HU, respectively. The gamma (3%, 0.3 cm) criterion was satisfied in 99% of the area when comparing the CT- and CBCT-based dose distributions, and likewise in 99% of the area when respiratory motion was present. The uncertainty of the CBCT-based dose calculation was evaluated for cases with respiratory motion. In conclusion, image distortion due to motion did not significantly influence dosimetric parameters.
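    The gamma (3%, 0.3 cm) comparison combines a dose-difference criterion with a distance-to-agreement criterion. A 1D sketch, assuming a global dose tolerance of 3% of the reference maximum and a 0.3 cm distance tolerance (the clinical computation is done in 2D or 3D):

```python
import math

def gamma_1d(ref, ev, dx, dose_tol=0.03, dist_tol=0.3):
    """1D global gamma index: for each reference point, the minimum
    combined dose-difference / distance metric over the evaluated
    profile. `dx` is the grid spacing in cm; gamma <= 1 is agreement."""
    dmax = max(ref)
    out = []
    for i, dr in enumerate(ref):
        best = min(
            math.hypot((de - dr) / (dose_tol * dmax),
                       (j - i) * dx / dist_tol)
            for j, de in enumerate(ev))
        out.append(best)
    return out

def pass_rate(gammas):
    """Fraction of points with gamma <= 1."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)
```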

  14. The use of non-contrast computed tomography and color Doppler ultrasound in the characterization of urinary stones - preliminary results

    PubMed Central

    Bulakçı, Mesut; Tefik, Tzevat; Akbulut, Fatih; Örmeci, Mehmet Tolgahan; Beşe, Caner; Şanlı, Öner; Oktar, Tayfun; Salmaslıoğlu, Artür

    2015-01-01

    Objective To investigate the role of density value in computed tomography (CT) and twinkling artifact observed in color Doppler analysis for the prediction of the mineral composition of urinary stones. Material and methods A total of 42 patients who were operated via percutaneous or endoscopic means and had undergone abdominal non-contrast CT and color Doppler ultrasonography examinations were included in the study. X-ray diffraction method was utilized to analyze a total of 86 stones, and the correlations between calculated density values and twinkling intensities with stone types were investigated for each stone. Results Analyses of extracted stones revealed the presence of 40 calcium oxalate monohydrate, 12 calcium oxalate dihydrate, 9 uric acid, 11 calcium phosphate, and 14 cystine stones. The density values were calculated as 1499±269 Hounsfield Units (HU) for calcium oxalate monohydrate, 1505±221 HU for calcium oxalate dihydrate, 348±67 HU for uric acid, 1106±219 HU for calcium phosphate, and 563±115 HU for cystine stones. The artifact intensities were determined as grade 0 in 15, grade 1 in 32, grade 2 in 24, and grade 3 in 15 stones. Conclusion If the density value of a stone measures below 780 HU and grade 3 artifact intensity is observed, its mineral composition is likely to be cystine. PMID:26623143
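    The concluding rule can be stated directly in code. The 780 HU threshold and grade 3 artifact intensity are taken from the abstract; the function is an illustrative sketch, not a validated classifier.

```python
def suggests_cystine(density_hu, twinkling_grade):
    """Study's heuristic: density below 780 HU combined with grade 3
    twinkling artifact points toward a cystine stone."""
    return density_hu < 780 and twinkling_grade == 3
```

    Note that uric acid stones (348 ± 67 HU) also fall below the density threshold, which is why the twinkling grade is needed as a second criterion.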

  15. A new computer-aided detection scheme based on assessment of local bilateral mammographic feature asymmetry - a preliminary evaluation.

    PubMed

    Kelder, Adam; Zigel, Yaniv; Lederman, Dror; Zheng, Bin

    2015-01-01

    Accurate segmentation of breast lesions depicted on two-dimensional projection mammograms has been proven very difficult and unreliable. In this study we investigated a new approach of a computer-aided detection (CAD) scheme of mammograms without lesion segmentation. Our scheme was developed based on the detection and analysis of region-of-interest (ROI)-based bilateral mammographic tissue or feature asymmetry. A bilateral image registration, image feature selection process, and naïve Bayes linear classifier were implemented in the CAD scheme. CAD performance in predicting the likelihood of either an ROI or a subject (case) being abnormal was evaluated using 161 subjects from the mini-MIAS database and a leave-one-out testing method. The results showed that areas under receiver operating characteristic (ROC) curves were 0.87 and 0.72 on the ROI-based and case-based evaluation, respectively. The study demonstrated that using ROI-based bilateral mammographic tissue asymmetry can provide supplementary information with high discriminatory power in order to improve CAD performance.
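    The classification protocol can be sketched with a Gaussian naïve Bayes classifier evaluated by leave-one-out testing. The toy feature vectors in the usage below are assumptions for illustration; the paper's registered bilateral asymmetry features and exact classifier details are not reproduced here.

```python
import math
import statistics

def gnb_fit(X, y):
    """Gaussian naive Bayes: per-class prior plus per-feature mean/std."""
    model = {}
    for c in set(y):
        rows = [x for x, yi in zip(X, y) if yi == c]
        cols = list(zip(*rows))
        model[c] = (len(rows) / len(y),
                    [(statistics.mean(col), statistics.pstdev(col) or 1e-6)
                     for col in cols])
    return model

def gnb_log_posterior(model, x, c):
    """Unnormalized log-posterior of class c for feature vector x."""
    prior, params = model[c]
    logp = math.log(prior)
    for xi, (mu, sd) in zip(x, params):
        logp += -0.5 * ((xi - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))
    return logp

def loo_accuracy(X, y):
    """Leave-one-out: refit on all-but-one, classify the held-out case."""
    correct = 0
    for i in range(len(X)):
        model = gnb_fit(X[:i] + X[i + 1:], y[:i] + y[i + 1:])
        pred = max(model, key=lambda c: gnb_log_posterior(model, X[i], c))
        correct += pred == y[i]
    return correct / len(X)
```

    Leave-one-out testing is a natural choice here because it uses every one of the 161 cases for both training and (held-out) evaluation.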

  16. On-line computer system to minimize laser injuries during surgery: preliminary system layout and proposal of the key features.

    PubMed

    Canestri, F

    1999-01-01

    The aim of this paper is to investigate some new user interface ideas and related application packages intended to improve the degree of safety in an operating room during surgical operations in which an invasive laser beam is deployed. The overall value of the proposal is that it provides a means of ensuring the successful completion of the surgical case while minimizing the risk of thermal and mechanical injuries to healthy tissues adjacent to the surgical field. According to surgeons operating with a variety of CO2 lasers available at both the National Cancer Institute in Milan, Italy, and the Sackler School of Medicine, Tel Aviv University, Israel, each laser device presents different cutting and coagulation properties. In order to identify which 'ideal' procedure might corroborate the subjective impression of each surgeon and also to provide one common tool to ensure procedures with a high level of safety, the author has worked for several months with surgeons and technicians of both institutions to define the general design of a new on-line surgical operation planning and design system to be used during the pre-operative briefing activities and also as a consultation tool during operation. This software package will be developed and tested on both 'C' and FORTRAN compilers running on a commercially available PC which is driving a continuous wave (CW) CO2 laser device via its Instrument Bus interface. The present proposal describes the details of a software package called LCA (Laser-beam Controller and Adviser) which performs several controls in parallel on the key output parameters of a laser beam device during its utilization in delicate surgical operations. The required performances of this device during a given surgical operation are pre-simulated and compared against the well-known safety limits, which are stored in the computer's mass storage. If the surgeon's decisions about the laser device set-up are considered to be too close to the

  17. Computer simulations of large asteroid impacts into oceanic and continental sites--preliminary results on atmospheric, cratering and ejecta dynamics

    USGS Publications Warehouse

    Roddy, D.J.; Schuster, S.H.; Rosenblatt, M.; Grant, L.B.; Hassig, P.J.; Kreyenhagen, K.N.

    1987-01-01

    Computer simulations have been completed that describe passage of a 10-km-diameter asteroid through the Earth's atmosphere and the subsequent cratering and ejecta dynamics caused by impact of the asteroid into both oceanic and continental sites. The asteroid was modeled as a spherical body moving vertically at 20 km/s with a kinetic energy of 2.6 × 10³⁰ ergs (6.2 × 10⁷ Mt). Detailed material modeling of the asteroid, ocean, crustal units, sedimentary unit, and mantle included effects of strength and fracturing, generic asteroid and rock properties, porosity, saturation, lithostatic stresses, and geothermal contributions, each selected to simulate impact and geologic conditions that were as realistic as possible. Calculation of the passage of the asteroid through a U.S. Standard Atmosphere showed development of a strong bow shock wave followed by a highly shock compressed and heated air mass. Rapid expansion of this shocked air created a large low-density region that also expanded away from the impact area. Shock temperatures in air reached ~20,000 K near the surface of the uplifting crater rim and were as high as ~2000 K at more than 30 km range and 10 km altitude. Calculations to 30 s showed that the shock fronts in the air and in most of the expanding shocked air mass preceded the formation of the crater, ejecta, and rim uplift and did not interact with them. As cratering developed, uplifted rim and target material were ejected into the very low density, shock-heated air immediately above the forming crater, and complex interactions could be expected. Calculations of the impact events showed equally dramatic effects on the oceanic and continental targets through an interval of 120 s. Despite geologic differences in the targets, both cratering events developed comparable dynamic flow fields and by ~29 s had formed similar-sized transient craters ~39 km deep and ~62 km across. Transient-rim uplift of ocean and crust reached a maximum altitude of nearly

  18. SUPER-RESOLUTION ULTRASOUND TOMOGRAPHY: A PRELIMINARY STUDY WITH A RING ARRAY

    SciTech Connect

    HUANG, LIANJIE; SIMONETTI, FRANCESCO; DURIC, NEBOJSA; RAMA, OLSI

    2007-01-18

    Ultrasound tomography attempts to retrieve the structure of an object by exploiting the interaction of acoustic waves with the object. A fundamental limit of ultrasound tomography is that features cannot be resolved if they are spaced less than λ/2 apart, where λ is the wavelength of the probing wave, regardless of the degree of accuracy of the measurements. Therefore, since the attenuation of the probing wave with propagation distance increases as λ decreases, resolution has to be traded against imaging depth. Recently, it has been shown that the λ/2 limit is a consequence of the Born approximation (implicit in the imaging algorithms currently employed) which neglects the distortion of the probing wavefield as it travels through the medium to be imaged. On the other hand, such a distortion, which is due to the multiple scattering phenomenon, can encode unlimited resolution in the radiating component of the scattered field. Previously, a resolution better than λ/3 has been reported in these proceedings [F. Simonetti, pp. 126 (2006)] in the case of elastic wave probing. In this paper, the authors demonstrate experimentally a resolution better than λ/4 for objects immersed in a water bath probed by means of a ring array which excites and detects pressure waves in a full view configuration.

  19. Development and preliminary user testing of the DCIDA (Dynamic computer interactive decision application) for ‘nudging’ patients towards high quality decisions

    PubMed Central

    2014-01-01

    Background Patient decision aids (PtDA) are developed to facilitate informed, value-based decisions about health. Research suggests that even when informed with necessary evidence and information, cognitive errors can prevent patients from choosing the option that is most congruent with their own values. We sought to utilize principles of behavioural economics to develop a computer application that presents information from conventional decision aids in a way that reduces these errors, subsequently promoting higher quality decisions. Method The Dynamic Computer Interactive Decision Application (DCIDA) was developed to target four common errors that can impede quality decision making with PtDAs: unstable values, order effects, overweighting of rare events, and information overload. Healthy volunteers were recruited to an interview to use three PtDAs converted to the DCIDA on a computer equipped with an eye tracker. Participants first used a conventional PtDA and then the DCIDA version. User testing assessed whether respondents found the software both usable: evaluated using a) eye-tracking, b) the system usability scale, and c) user verbal responses from a 'think aloud' protocol; and useful: evaluated using a) eye-tracking, b) whether preferences for options were changed, and c) the decisional conflict scale. Results Of the 20 participants recruited to the study, 11 were male (55%), the mean age was 35, 18 had at least a high school education (90%), and 8 (40%) had a college or university degree. Eye-tracking results, alongside a mean system usability scale score of 73 (range 68–85), indicated a reasonable degree of usability for the DCIDA. The think aloud study suggested areas for further improvement. The DCIDA also appeared to be useful to participants, who focused more on the features of the decision that were most important to them (21% increase in time spent focusing on the most important feature

  20. Prospective computer-assisted voice analysis for patients with early stage glottic cancer: a preliminary report of the functional result of laryngeal irradiation.

    PubMed

    Harrison, L B; Solomon, B; Miller, S; Fass, D E; Armstrong, J; Sessions, R B

    1990-07-01

In January 1987 we began a prospective study aimed at evaluating objective parameters of vocal function for all patients treated with RT for early glottic cancer. All patients underwent vocal analysis using a voice analyzer interfaced with a computer. This allowed for the determination of percent voicing (%V) (normal = presence of phonation = 90-100%V). Other parameters such as breathiness (air turbulence or hoarseness) and strain (vocal cord tension) were also measured. Patients were recorded before RT, weekly during RT, and at set intervals after RT. Twenty-five (25) patients have been studied. Eighteen (18) are evaluable at 9 months after treatment. All patients were male and ranged from 45-84 years old. Fourteen (14) had T1 lesions and received 66 Gy/33 fractions to the larynx, and 4 had T2 tumors and received 66-70 Gy/33-35 fractions. To date, all patients are locally controlled. Three distinct patterns of %V changes have been encountered. However, all patients demonstrated a normal phonation pattern by 3 months after RT, and this is sustained at 9 months of follow-up. In addition, 94% of patients have had a significant decrease in breathiness after RT, which objectively documents diminished hoarseness. In 83%, breathiness is normal after RT. Most patients have had increased strain after RT, which documents increased vocal cord tension. However, strain remained within normal limits in 89%. Our preliminary analysis suggests that the majority of patients irradiated for early glottic cancer demonstrate a decrease in breathiness and an increase in strain after RT, and enjoy a resultant voice with normal phonation maintained at 9 months after RT. Our data also demonstrate three distinct phonation patterns. Further follow-up will allow us to determine the prognostic significance, if any, of these patterns, and to continue to follow objective vocal parameters in larger numbers of patients. PMID:2380077

  1. SU-E-I-74: Image-Matching Technique of Computed Tomography Images for Personal Identification: A Preliminary Study Using Anthropomorphic Chest Phantoms

    SciTech Connect

    Matsunobu, Y; Shiotsuki, K; Morishita, J

    2015-06-15

Purpose: Fingerprints, dental impressions, and DNA are used to identify unidentified bodies in forensic medicine. Cranial computed tomography (CT) images and/or dental radiographs are also used for identification. Radiological identification is important, particularly in the absence of comparative fingerprints, dental impressions, and DNA samples. The development of an automated radiological identification system for unidentified bodies is desirable. We investigated the potential usefulness of bone structure for matching chest CT images. Methods: CT images of three anthropomorphic chest phantoms were obtained on different days in various settings. One of the phantoms was assumed to be an unidentified body. The bone image and the bone image with soft tissue (BST image) were extracted from the CT images. To examine the usefulness of the bone image and/or the BST image, the similarities between the two-dimensional (2D) or three-dimensional (3D) images of the same and different phantoms were evaluated in terms of the normalized cross-correlation value (NCC). Results: For the 2D and 3D BST images, the NCCs obtained from the same phantom assumed to be an unidentified body (2D, 0.99; 3D, 0.93) were higher than those for the different phantoms (2D, 0.95 and 0.91; 3D, 0.89 and 0.80). For the bone image, the NCCs for the same phantom (2D, 0.95; 3D, 0.88) were greater than those of the different phantoms (2D, 0.61 and 0.25; 3D, 0.23 and 0.10). The difference in the NCCs between the same and different phantoms tended to be larger for the bone images than for the BST images. These findings suggest that the image-matching technique is more useful when utilizing the bone image than when utilizing the BST image to identify different people. Conclusion: This preliminary study indicated that evaluating the similarity of bone structure in 2D and 3D images is potentially useful for identifying an unidentified body.
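The normalized cross-correlation used above as the similarity score can be sketched in a few lines of NumPy; this is a minimal illustration, and the 64x64 test arrays and noise level are invented for demonstration, not taken from the study:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally shaped images."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom)

# Invented test images: a reference and a noise-degraded copy.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
noisy = ref + 0.1 * rng.random((64, 64))
print(ncc(ref, ref))    # exactly 1.0 up to rounding
print(ncc(ref, noisy))  # high, but below 1.0
```

The same score generalizes to 3D volumes unchanged, since the sums run over all voxels regardless of array rank.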

  2. ACCF/SCCT/ACR/AHA/ASE/ASNC/NASCI/SCAI/SCMR 2010 Appropriate Use Criteria for Cardiac Computed Tomography. A Report of the American College of Cardiology Foundation Appropriate Use Criteria Task Force, the Society of Cardiovascular Computed Tomography, the American College of Radiology, the American Heart Association, the American Society of Echocardiography, the American Society of Nuclear Cardiology, the North American Society for Cardiovascular Imaging, the Society for Cardiovascular Angiography and Interventions, and the Society for Cardiovascular Magnetic Resonance.

    PubMed

    Taylor, Allen J; Cerqueira, Manuel; Hodgson, John McB; Mark, Daniel; Min, James; O'Gara, Patrick; Rubin, Geoffrey D

    2010-11-23

    The American College of Cardiology Foundation, along with key specialty and subspecialty societies, conducted an appropriate use review of common clinical scenarios where cardiac computed tomography (CCT) is frequently considered. The present document is an update to the original CCT/cardiac magnetic resonance appropriateness criteria published in 2006, written to reflect changes in test utilization, to incorporate new clinical data, and to clarify CCT use where omissions or lack of clarity existed in the original criteria. The indications for this review were drawn from common applications or anticipated uses, as well as from current clinical practice guidelines. Ninety-three clinical scenarios were developed by a writing group and scored by a separate technical panel on a scale of 1 to 9 to designate appropriate use, inappropriate use, or uncertain use. In general, use of CCT angiography for diagnosis and risk assessment in patients with low or intermediate risk or pretest probability for coronary artery disease was viewed favorably, whereas testing in high-risk patients, routine repeat testing, and general screening in certain clinical scenarios were viewed less favorably. Use of noncontrast computed tomography for calcium scoring was rated as appropriate within intermediate- and selected low-risk patients. Appropriate applications of CCT are also within the category of cardiac structural and functional evaluation. It is anticipated that these results will have an impact on physician decision making, performance, and reimbursement policy, and that they will help guide future research. PMID:20975004
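As a concrete illustration of the 1-to-9 scoring described above, panel median scores are conventionally binned into the three use categories. A minimal sketch; the 7-9/4-6/1-3 cut-points follow the usual ACCF appropriate use methodology and are not stated in this abstract:

```python
def rate_indication(median_score):
    """Map a panel's 1-9 median score to an appropriateness category,
    using the conventional ACCF appropriate-use cut-points."""
    if not 1 <= median_score <= 9:
        raise ValueError("panel scores run from 1 to 9")
    if median_score >= 7:
        return "appropriate"
    if median_score >= 4:
        return "uncertain"
    return "inappropriate"

print(rate_indication(8))  # appropriate
print(rate_indication(3))  # inappropriate
```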

  3. ACCF/SCCT/ACR/AHA/ASE/ASNC/NASCI/SCAI/SCMR 2010 appropriate use criteria for cardiac computed tomography. A report of the American College of Cardiology Foundation Appropriate Use Criteria Task Force, the Society of Cardiovascular Computed Tomography, the American College of Radiology, the American Heart Association, the American Society of Echocardiography, the American Society of Nuclear Cardiology, the North American Society for Cardiovascular Imaging, the Society for Cardiovascular Angiography and Interventions, and the Society for Cardiovascular Magnetic Resonance.

    PubMed

    Taylor, Allen J; Cerqueira, Manuel; Hodgson, John McB; Mark, Daniel; Min, James; O'Gara, Patrick; Rubin, Geoffrey D; Kramer, Christopher M; Berman, Daniel; Brown, Alan; Chaudhry, Farooq A; Cury, Ricardo C; Desai, Milind Y; Einstein, Andrew J; Gomes, Antoinette S; Harrington, Robert; Hoffmann, Udo; Khare, Rahul; Lesser, John; McGann, Christopher; Rosenberg, Alan; Schwartz, Robert; Shelton, Marc; Smetana, Gerald W; Smith, Sidney C

    2010-11-23

    The American College of Cardiology Foundation (ACCF), along with key specialty and subspecialty societies, conducted an appropriate use review of common clinical scenarios where cardiac computed tomography (CCT) is frequently considered. The present document is an update to the original CCT/cardiac magnetic resonance (CMR) appropriateness criteria published in 2006, written to reflect changes in test utilization, to incorporate new clinical data, and to clarify CCT use where omissions or lack of clarity existed in the original criteria (1). The indications for this review were drawn from common applications or anticipated uses, as well as from current clinical practice guidelines. Ninety-three clinical scenarios were developed by a writing group and scored by a separate technical panel on a scale of 1 to 9 to designate appropriate use, inappropriate use, or uncertain use. In general, use of CCT angiography for diagnosis and risk assessment in patients with low or intermediate risk or pretest probability for coronary artery disease (CAD) was viewed favorably, whereas testing in high-risk patients, routine repeat testing, and general screening in certain clinical scenarios were viewed less favorably. Use of noncontrast computed tomography (CT) for calcium scoring was rated as appropriate within intermediate- and selected low-risk patients. Appropriate applications of CCT are also within the category of cardiac structural and functional evaluation. It is anticipated that these results will have an impact on physician decision making, performance, and reimbursement policy, and that they will help guide future research. PMID:21087721

  5. Measurement of Mandibular Growth Using Cone-Beam Computed Tomography: A Miniature Pig Model Study

    PubMed Central

    Li, Jia-Da; Lu, Tung-Wu; Chang, Hau-Hung; Hu, Chih-Chung

    2014-01-01

The purpose of this study was to measure the long-term growth of the mandible in miniature pigs using 3D cone-beam computed tomography (CBCT). The mandibles of the pigs were scanned monthly over 12 months using CBCT and the 3D mandibular models were reconstructed from the data. Seventeen anatomical landmarks were identified and classified into four groups of line segments, namely anteroposterior, superoinferior, mediolateral and anteroinferior. The inter-marker distances, inter-segmental angles, volume, monthly distance changes and percentage of changes were calculated to describe mandibular growth. The total changes of inter-marker distances were normalized to the initial values. All inter-marker distances increased over time, with the greatest mean normalized total changes in the superoinferior and anteroposterior groups (p<0.05). Monthly distance changes were greatest during the first four months and then reduced over time. Percentages of inter-marker distance changes were similar among the groups, reaching half of the overall growth around the 4th month. The mandibular volume growth increased non-linearly with time, accelerating during the first five months and slowing during the remaining months. The growth of the mandible was found to be anisotropic and non-homogeneous within the bone and non-linear over time, with faster growth in the ramus than in the body. These growth patterns appeared to be related to the development of the dentition, providing necessary space for the teeth to grow upward for occlusion and for the posterior teeth to erupt. PMID:24801528
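The inter-marker distance and its normalized total change described above are straightforward to compute once landmark coordinates are available. A minimal sketch with invented landmark coordinates, not the study's data:

```python
import numpy as np

def inter_marker_distance(p, q):
    """Euclidean distance between two 3D landmark coordinates."""
    return float(np.linalg.norm(np.asarray(p, float) - np.asarray(q, float)))

def normalized_total_change(d_first, d_last):
    """Total change of a distance, normalized to its initial value."""
    return (d_last - d_first) / d_first

# Invented landmark pair at month 0 and month 12 (coordinates in mm).
d0 = inter_marker_distance((0, 0, 0), (50, 0, 0))
d12 = inter_marker_distance((0, 0, 0), (65, 0, 0))
print(normalized_total_change(d0, d12))  # 0.3, i.e. 30% total growth
```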

  6. Prognostic Value of Epicardial Fat Volume Measurements by Computed Tomography: A Systematic Review of the Literature

    PubMed Central

    Spearman, James V.; Renker, Matthias; Schoepf, U. Joseph; Krazinski, Aleksander W.; Herbert, Teri L.; De Cecco, Carlo N.; Nietert, Paul J.; Meinel, Felix G.

    2015-01-01

Objectives To perform a systematic review of the growing body of literature evaluating the prognostic value of epicardial fat volume (EFV) quantified by cross-sectional imaging for adverse clinical outcomes. Methods Two independent reviewers performed systematic searches on both PubMed and Scopus using search terms developed with a medical librarian. Peer-reviewed articles were selected based on the inclusion of outcome data, utilization of epicardial fat volume and sufficient reporting for analysis. Results A total of 411 studies were evaluated with 9 studies meeting the inclusion criteria. In all, the studies evaluated 10,252 patients. All 9 studies were based on CT measurements. Seven studies evaluated the prognostic value of EFV unadjusted for calcium score, and 6 of these studies found a significant association between EFV and clinical outcomes. Seven studies evaluated the incremental value of EFV beyond calcium scoring, and 6 of these studies found a significant association. Conclusions The majority of studies suggest that EFV quantification is significantly associated with clinical outcomes and provides incremental prognostic value over coronary artery calcium scoring. Future research should use a binary cut-off of 125 mL for evaluation of EFV to provide consistency with other research. PMID:25925354

  7. Determination of the recipient vessels in the head and neck using multislice spiral computed tomography angiography before free flap surgery: a preliminary study.

    PubMed

    Tan, Onder; Kantarci, Mecit; Parmaksizoglu, Duygu; Uyanik, Ummugulsum; Durur, Irmak

    2007-11-01

Preoperative assessment of the recipient vessels in free flap surgery directly affects the success rate of the operation by determining the flap type, pedicle length, orientation to the recipient site, and need for a vein graft. For this purpose, conventional angiographic methods are still being used, with some disadvantages. The aims of this study were to evaluate the potential success of multislice computed tomography angiography in the assessment of the recipient vessels before free flap surgery and to reveal whether it may be an alternative to conventional angiography. The study was carried out bilaterally in 33 outpatients using a 16-detector spiral computed tomography scanner. In images of multiplanar reconstructions, maximum intensity projections, and three-dimensional volume renderings, the external carotid artery and its main branches were evaluated in terms of availability; patency, stenosis, or occlusion; maximal and minimal external diameters along their courses; variations involving ramification from another main vessel; and abnormal course. The superior thyroid artery was absent bilaterally in two patients (6.06%). The external carotid artery was stenotic on one side in two patients (6.06%) and on each side in one (3.03%). All the remaining vessels appeared without stenosis, occlusion, or variation. We think that multislice computed tomography angiography can provide detailed information about the vascular and surrounding anatomic structures and their relationships with the recipient vessels. Therefore, multislice computed tomography angiography, as a less invasive vascular imaging method, can be a useful tool in planning free flap surgery. PMID:17993870

  8. Effects of computer-based graphic organizers to solve one-step word problems for middle school students with mild intellectual disability: A preliminary study.

    PubMed

    Sheriff, Kelli A; Boon, Richard T

    2014-08-01

The purpose of this study was to examine the effects of computer-based graphic organizers, using Kidspiration 3© software, to solve one-step word problems. Participants included three students with mild intellectual disability enrolled in a functional academic skills curriculum in a self-contained classroom. A multiple probe single-subject research design (Horner & Baer, 1978) was used to evaluate the effectiveness of computer-based graphic organizers in solving mathematical one-step word problems. During the baseline phase, the students completed a teacher-generated worksheet that consisted of nine functional word problems in a traditional format using a pencil, paper, and a calculator. In the intervention and maintenance phases, the students were instructed to complete the word problems using a computer-based graphic organizer. Results indicated that all three of the students improved in their ability to solve the one-step word problems using computer-based graphic organizers compared to traditional instructional practices. Limitations of the study and recommendations for future research directions are discussed.

  9. Preliminary Nuclear Electric Propulsion (NEP) reliability study

    NASA Technical Reports Server (NTRS)

    Hsieh, T. M.; Nakashima, A. M.; Mondt, J. F.

    1973-01-01

    A preliminary failure mode, failure effect, and criticality analysis of the major subsystems of nuclear electric propulsion is presented. Simplified reliability block diagrams are also given. A computer program was used to calculate the reliability of the heat rejection subsystem.
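Reliability block diagrams of the kind mentioned above reduce, for independent components, to products over series blocks and complements over parallel (redundant) blocks. A minimal sketch; the heat-rejection topology and component reliabilities below are hypothetical, not taken from the report:

```python
def series(*r):
    """All blocks in series must survive: R = r1 * r2 * ..."""
    p = 1.0
    for ri in r:
        p *= ri
    return p

def parallel(*r):
    """Redundant blocks in parallel: at least one must survive."""
    q = 1.0
    for ri in r:
        q *= 1.0 - ri
    return 1.0 - q

# Hypothetical heat-rejection topology: two redundant coolant loops
# feeding a single radiator panel.
r_subsystem = series(parallel(0.90, 0.90), 0.99)
print(round(r_subsystem, 4))  # 0.9801
```

Nesting these two operators is enough to evaluate any series-parallel block diagram, which is presumably what the cited computer program automated for the heat rejection subsystem.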

  10. Oceanic pipeline computations

    SciTech Connect

    Marks, A.

    1980-01-01

Technical and economic feasibility, design, and construction of oil, gas, and two-phase oceanic pipeline systems are discussed. In addition, formulae, references, examples, and programmable calculator software (Hewlett-Packard-67) are given. The contents include: preliminary pipeline sizing; fluid characteristics; preliminary hydraulics; oceanographics; preliminary corridor selection; route selection; final pipeline design; hydraulic design; wall thickness selection; oceanographic design computations; stress analysis; and construction parameters. (JMT)

  12. Computer processing of Landsat and SPOT images for the morpho-structural analysis of the Wei He Graben (Shaanxi-China) - Preliminary results

    NASA Astrophysics Data System (ADS)

    Binelli-Chahine, M.; Vergely, P.; Masson, Ph.

    1990-10-01

The Cenozoic Wei He Graben is the southernmost of the northern China rift system. It is a half-graben whose southern limit is formed by the ESE-WNW main boundary faults of the Qin Ling Shan mountains. These correspond to the main eastward-trending faults of the northern border of Tibet. A first geological, structural and morphological approach was conducted, based on Landsat multi-spectral data (MSS, 1:500,000 color composite images). A geological photo-interpretation and the plotting of major (plurikilometric) faults show the geometry and the general structural pattern of the rift. The morphological study accounts for vertical movements of large areas, but the major fault kinematics remains unknown. In order to specify the fault kinematics, a more accurate morpho-structural analysis was conducted, based on SPOT image processing and interpretation. The preliminary results show that from the Recent Quaternary up to the Present, no important strike-slip fault movements (more than 20 m), such as those suggested by the pull-apart graben model, exist along the Wei He Graben faults. By contrast, the vertical component movements (subsidence) prevail, and this agrees with field data.

  14. A web-based remote radiation treatment planning system using the remote desktop function of a computer operating system: a preliminary report.

    PubMed

    Suzuki, Keishiro; Hirasawa, Yukinori; Yaegashi, Yuji; Miyamoto, Hideki; Shirato, Hiroki

    2009-01-01

We developed a web-based, remote radiation treatment planning system which allowed staff at an affiliated hospital to obtain support from a fully staffed central institution. Network security was based on a firewall and a virtual private network (VPN). Client computers were installed at a cancer centre, at a university hospital and at a staff home. We remotely operated the treatment planning computer using the Remote Desktop function built into the Windows operating system. Except for the initial setup of the VPN router, no special knowledge was needed to operate the remote radiation treatment planning system. There was a time lag that seemed to depend on the volume of data traffic on the Internet, but it did not affect smooth operation. The initial cost and running cost of the system were reasonable. PMID:19948709

  15. An Overview of Preliminary Computational and Experimental Results for the Semi-Span Super-Sonic Transport (S4T) Wind-Tunnel Model

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Perry, Boyd, III; Florance, James R.; Sanetrik, Mark D.; Wieseman, Carol D.; Stevens, William L.; Funk, Christie J.; Hur, Jiyoung; Christhilf, David M.; Coulson, David A.

    2011-01-01

    A summary of computational and experimental aeroelastic and aeroservoelastic (ASE) results for the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model is presented. A broad range of analyses and multiple ASE wind-tunnel tests of the S4T have been performed in support of the ASE element in the Supersonics Program, part of NASA's Fundamental Aeronautics Program. The computational results to be presented include linear aeroelastic and ASE analyses, nonlinear aeroelastic analyses using an aeroelastic CFD code, and rapid aeroelastic analyses using CFD-based reduced-order models (ROMs). Experimental results from two closed-loop wind-tunnel tests performed at NASA Langley's Transonic Dynamics Tunnel (TDT) will be presented as well.

  16. Effects of brain-computer interface-based functional electrical stimulation on balance and gait function in patients with stroke: preliminary results

    PubMed Central

    Chung, EunJung; Park, Sang-In; Jang, Yun-Yung; Lee, Byoung-Hee

    2015-01-01

    [Purpose] The purpose of this study was to determine the effects of brain-computer interface (BCI)-based functional electrical stimulation (FES) on balance and gait function in patients with stroke. [Subjects] Subjects were randomly allocated to a BCI-FES group (n=5) and a FES group (n=5). [Methods] The BCI-FES group received ankle dorsiflexion training with FES according to a BCI-based program for 30 minutes per day for 5 days. The FES group received ankle dorsiflexion training with FES for the same duration. [Results] Following the intervention, the BCI-FES group showed significant differences in Timed Up and Go test value, cadence, and step length on the affected side. The FES group showed no significant differences after the intervention. However, there were no significant differences between the 2 groups after the intervention. [Conclusion] The results of this study suggest that BCI-based FES training is a more effective exercise for balance and gait function than FES training alone in patients with stroke. PMID:25729205

  17. Analysis of vector models in quantification of artifacts produced by standard prosthetic inlays in Cone-Beam Computed Tomography (CBCT)--a preliminary study.

    PubMed

    Różyło-Kalinowska, Ingrid; Miechowicz, Sławomir; Sarna-Boś, Katarzyna; Borowicz, Janusz; Kalinowski, Paweł

    2014-11-17

Cone-beam computed tomography (CBCT) is a relatively new but highly efficient imaging method, first applied in dentistry in 1998. However, the quality of the obtained slices depends, among other things, on artifacts generated by dental restorations as well as orthodontic and prosthetic appliances. The aim of the study was to quantify the artifacts produced by standard prosthetic inlays in CBCT images. The material consisted of 17 standard prosthetic inlays mounted in dental roots embedded in resin. The samples were examined by means of a large field of view CBCT unit, Galileos (Sirona, Germany), at 85 kV and 14 mAs. The analysis was performed using Able 3DDoctor software for data in the CT raster space, as well as by means of Materialise Magics software for generated vector models (STL). The masks generated in the raster space included the area of the inlays together with image artifacts. The region of interest (ROI) of the raster space is the set of voxels from a selected range of Hounsfield units (109-3071). The ceramic inlay with zirconium dioxide (Cera Post) and the epoxy resin inlay including silica fibers enriched with zirconium (Easy Post) produced the most intense artifacts. The smallest image distortions were created by the titanium inlays, both passive (Harald Nordin) and active (Flexi Flange). Inlays containing zirconium generated the strongest artifacts, thus leading to the greatest distortions in the CBCT images. The carbon fiber inlay did not considerably affect the image quality.

  18. Analysis of vector models in quantification of artifacts produced by standard prosthetic inlays in Cone-Beam Computed Tomography (CBCT)--a preliminary study.

    PubMed

    Różyło-Kalinowska, Ingrid; Miechowicz, Sławomir; Sarna-Boś, Katarzyna; Borowicz, Janusz; Kalinowski, Paweł

    2014-01-01

Cone-beam computed tomography (CBCT) is a relatively new but highly efficient imaging method, first applied in dentistry in 1998. However, the quality of the obtained slices depends, among other things, on artifacts generated by dental restorations as well as orthodontic and prosthetic appliances. The aim of the study was to quantify the artifacts produced by standard prosthetic inlays in CBCT images. The material consisted of 17 standard prosthetic inlays mounted in dental roots embedded in resin. The samples were examined by means of a large field of view CBCT unit, Galileos (Sirona, Germany), at 85 kV and 14 mAs. The analysis was performed using Able 3DDoctor software for data in the CT raster space, as well as by means of Materialise Magics software for generated vector models (STL). The masks generated in the raster space included the area of the inlays together with image artifacts. The region of interest (ROI) of the raster space is the set of voxels from a selected range of Hounsfield units (109-3071). The ceramic inlay with zirconium dioxide (Cera Post) and the epoxy resin inlay including silica fibers enriched with zirconium (Easy Post) produced the most intense artifacts. The smallest image distortions were created by the titanium inlays, both passive (Harald Nordin) and active (Flexi Flange). Inlays containing zirconium generated the strongest artifacts, thus leading to the greatest distortions in the CBCT images. The carbon fiber inlay did not considerably affect the image quality. PMID:25404623
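Selecting the raster-space ROI as the set of voxels inside the stated Hounsfield window (109-3071 HU) can be sketched in NumPy; the toy 2x2 volume is invented for illustration:

```python
import numpy as np

def roi_mask(volume_hu, lo=109, hi=3071):
    """Boolean mask of voxels inside the stated Hounsfield window."""
    return (volume_hu >= lo) & (volume_hu <= hi)

# Toy volume: air, soft tissue, bone-like, and inlay-like voxels (HU).
vol = np.array([[-1000, 40], [500, 3000]])
mask = roi_mask(vol)
print(int(mask.sum()))  # 2 voxels fall inside the 109-3071 HU window
```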

  19. Comparison of Contrast-Enhanced Ultrasound and Computed Tomography in Classifying Endoleaks After Endovascular Treatment of Abdominal Aorta Aneurysms: Preliminary Experience

    SciTech Connect

Carrafiello, Gianpaolo; Lagana, Domenico; Recaldini, Chiara; Mangini, Monica; Bertolotti, Elena; Caronno, Roberto; Tozzi, Matteo; Piffaretti, Gabriele; Annibale Genovese, Eugenio; Fugazzola, Carlo

    2006-12-15

The purpose of the study was to assess the effectiveness of contrast-enhanced ultrasonography (CEUS) in endoleak classification after endovascular treatment of an abdominal aortic aneurysm compared to computed tomography angiography (CTA). From May 2001 to April 2003, 10 patients with endoleaks already detected by CTA underwent CEUS with SonoVue® to confirm the CTA classification or to reclassify the endoleak. In three conflicting cases, the patients were also studied with conventional angiography. CEUS confirmed the CTA classification in seven cases (type II endoleaks). Two CTA type III endoleaks were classified as type II using CEUS and one CTA type II endoleak was classified as type I by CEUS. Regarding the cases with discordant classification, conventional angiography confirmed the ultrasound classification. Additionally, CEUS documented the origin of type II endoleaks in all cases. After CEUS reclassification of endoleaks, a significant change in patient management occurred in three cases. CEUS allows a better attribution of the origin of the endoleak, as it shows the flow in real time. CEUS is more specific than CTA in endoleak classification and gives more accurate information in therapeutic planning.

  20. Numerical computation of gravitational field of infinitely thin axisymmetric disc with arbitrary surface mass density profile and its application to preliminary study of rotation curve of M33

    NASA Astrophysics Data System (ADS)

    Fukushima, Toshio

    2016-03-01

    We developed a numerical method to compute the gravitational field of an infinitely thin axisymmetric disc with an arbitrary surface mass density profile. We evaluate the gravitational potential by a split quadrature using the double exponential rule and obtain the acceleration vector by numerically differentiating the potential with Ridder's algorithm. The new method is accurate to around 12 digits and sufficiently fast because it requires only one-dimensional integration. Using the new method, we show the rotation curves of some non-trivial discs: (i) truncated power-law discs, (ii) discs with a non-negligible centre hole, (iii) truncated Mestel discs with edge softening, (iv) double power-law discs, (v) exponentially damped power-law discs, and (vi) an exponential disc with a sinusoidal modulation of the density profile. We also present two model fits to the observed rotation curve of M33: (i) the standard deconvolution assuming a spherical distribution of the dark matter and (ii) a direct fit of the infinitely thin disc mass with a double power-law distribution of the surface mass density. Although the number of free parameters is a little larger, the latter model provides a significantly better fit. The FORTRAN 90 programs of the new method are electronically available.
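The approach can be illustrated with a simplified Python sketch (not the authors' FORTRAN 90 code): SciPy's adaptive quad stands in for the double exponential rule, and a plain central difference stands in for Ridder's algorithm. The azimuthal integral of the thin-disc potential reduces to the complete elliptic integral K, leaving a single radial quadrature.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import ellipk

G = 1.0  # gravitational constant in code units

def potential(R, z, sigma, r_out):
    """Potential of an infinitely thin axisymmetric disc at (R, z):
    Phi = -G * int_0^r_out sigma(a) * 4 a K(m) / sqrt((R+a)^2 + z^2) da,
    with parameter m = k^2 = 4 R a / ((R+a)^2 + z^2)."""
    def integrand(a):
        denom = np.sqrt((R + a) ** 2 + z ** 2)
        m = 4.0 * R * a / denom ** 2
        return sigma(a) * 4.0 * a * ellipk(m) / denom
    pts = [R] if 0.0 < R < r_out else None   # integrable log singularity at a = R
    val, _ = quad(integrand, 0.0, r_out, points=pts, limit=200)
    return -G * val

def accel_R(R, z, sigma, r_out, h=1e-5):
    """Radial acceleration a_R = -dPhi/dR by central difference
    (the paper uses Ridder's algorithm instead)."""
    return -(potential(R + h, z, sigma, r_out)
             - potential(R - h, z, sigma, r_out)) / (2.0 * h)

# Check against a uniform disc of unit surface density and radius 1:
# on the axis, Phi(0, z) = -2*pi*G*(sqrt(r_out^2 + z^2) - z) analytically.
sigma = lambda a: 1.0
print(potential(0.0, 1.0, sigma, 1.0))   # ~ -2*pi*(sqrt(2) - 1)
```

The rotation curve then follows from v_c(R) = sqrt(R * |a_R(R, 0)|) evaluated in the disc plane.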

  1. Mosaic tile model to compute gravitational field for infinitely thin non-axisymmetric objects and its application to preliminary analysis of gravitational field of M74

    NASA Astrophysics Data System (ADS)

    Fukushima, Toshio

    2016-07-01

    Using the analytical expressions of the Newtonian gravitational potential and the associated acceleration vector for an infinitely thin uniform rectangular plate, we developed a method to compute the gravitational field of a general infinitely thin object without assuming its axial symmetry when its surface mass density is known at evenly spaced rectangular grid points. We utilized the method in evaluating the gravitational field of the H I gas, dust, red stars, and blue stars components of M74 from its THINGS, 2MASS, PDSS1, and GALEX data. The non-axisymmetric features of M74, including an asymmetric spiral structure, are seen in (i) the contour maps of the determined gravitational potential, (ii) the vector maps of the associated acceleration vector, and (iii) the cross-section views of the gravitational field and the surface mass density along different directions. An x-mark pattern in the gravitational field is detected at the core of M74 from the analysis of its dust and red stars components. Meanwhile, along the east-west direction in the central region of angular size 1 arcmin, the rotation curve derived from the radial component of the acceleration vector caused by the red stars component matches well with that observed by the VENGA project. Thus the method will be useful in studying the dynamics of particles and fluids near and inside spiral galaxies with known photometry data. The table of the determined gravitational field of M74 on its galactic plane, as well as the FORTRAN 90 programs used to produce it, are available electronically.
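The tile summation can be sketched as follows, with one simplification flagged loudly: each rectangular tile is approximated here by a point mass at its centre, whereas the paper uses the exact analytic field of a uniform rectangular plate, which remains well behaved close to the tiles.

```python
import numpy as np

G = 1.0

def grid_accel(field_pt, xs, ys, sigma, cell_area):
    """Acceleration at field_pt = (x, y, z) from a gridded surface density
    map, approximating every rectangular tile by a point mass at its centre.
    (The paper instead sums the exact analytic plate fields.)"""
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    m = sigma * cell_area                       # tile masses
    dx = X - field_pt[0]
    dy = Y - field_pt[1]
    dz = -field_pt[2]
    r3 = (dx ** 2 + dy ** 2 + dz ** 2) ** 1.5
    return np.array([G * np.sum(m * dx / r3),
                     G * np.sum(m * dy / r3),
                     G * np.sum(m * dz / r3)])

# Symmetric uniform square map: above the centre the in-plane pull cancels,
# leaving only a vertical component directed toward the plane.
xs = ys = np.linspace(-1.0, 1.0, 21)
sigma = np.ones((21, 21))
a = grid_accel((0.0, 0.0, 0.5), xs, ys, sigma, cell_area=0.01)
print(a)  # ax, ay ~ 0; az < 0
```

With a real surface density map (e.g. from photometry), `sigma` would simply be the calibrated pixel array.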

  2. Optimizing Hybrid Occlusion in Face-Jaw-Teeth Transplantation: A Preliminary Assessment of Real-Time Cephalometry as Part of the Computer-Assisted Planning and Execution Workstation for Craniomaxillofacial Surgery

    PubMed Central

    Murphy, Ryan J.; Basafa, Ehsan; Hashemi, Sepehr; Grant, Gerald T.; Liacouras, Peter; Susarla, Srinivas M.; Otake, Yoshito; Santiago, Gabriel; Armand, Mehran; Gordon, Chad R.

    2016-01-01

    Background The aesthetic and functional outcomes surrounding Le Fort–based, face-jaw-teeth transplantation have been suboptimal, often leading to posttransplant class II/III skeletal profiles, palatal defects, and “hybrid malocclusion.” Therefore, a novel technology—real-time cephalometry—was developed to provide the surgical team instantaneous, intraoperative knowledge of three-dimensional dentoskeletal parameters. Methods Mock face-jaw-teeth transplantation operations were performed on plastic and cadaveric human donor/recipient pairs (n = 2). Preoperatively, cephalometric landmarks were identified on donor/recipient skeletons using segmented computed tomographic scans. The computer-assisted planning and execution workstation tracked the position of the donor face-jaw-teeth segment in real time during the placement/inset onto recipient, reporting pertinent hybrid cephalometric parameters from any movement of donor tissue. The intraoperative data measured through real-time cephalometry were compared to posttransplant measurements for accuracy assessment. In addition, posttransplant cephalometric relationships were compared to planned outcomes to determine face-jaw-teeth transplantation success. Results Compared with postoperative data, the real-time cephalometry–calculated intraoperative measurement errors were 1.37 ± 1.11 mm and 0.45 ± 0.28 degrees for the plastic skull and 2.99 ± 2.24 mm and 2.63 ± 1.33 degrees for the human cadaver experiments. These results were comparable to the posttransplant relations to planned outcome (human cadaver experiment, 1.39 ± 1.81 mm and 2.18 ± 1.88 degrees; plastic skull experiment, 1.06 ± 0.63 mm and 0.53 ± 0.39 degrees). Conclusion Based on this preliminary testing, real-time cephalometry may be a valuable adjunct for adjusting and measuring “hybrid occlusion” in face-jaw-teeth transplantation and other orthognathic surgical procedures. PMID:26218382
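Cephalometric parameters of the kind reported above reduce to distances and angles between tracked 3-D landmark coordinates. A minimal, generic sketch (the landmark coordinates are hypothetical, not data from these experiments):

```python
import numpy as np

def angle_deg(a, b, c):
    """Angle (degrees) at landmark b formed by landmarks a and c,
    from 3-D coordinates such as segmented-CT cephalometric points."""
    a, b, c = map(np.asarray, (a, b, c))
    u, v = a - b, c - b
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

def distance_mm(a, b):
    """Euclidean distance between two landmarks (same units as input)."""
    return float(np.linalg.norm(np.asarray(a) - np.asarray(b)))

# Hypothetical landmark coordinates in mm:
print(angle_deg((1, 0, 0), (0, 0, 0), (0, 1, 0)))  # ~90.0 degrees
print(distance_mm((0, 0, 0), (3, 4, 0)))           # 5.0 mm
```

Recomputing such quantities continuously from the tracked donor-segment pose is, in essence, what makes the cephalometry "real-time."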

  3. Preliminary experience on the implementation of computed tomography (CT)-based image guided brachytherapy (IGBT) of cervical cancer using high-dose-rate (HDR) Cobalt-60 source in University of Malaya Medical Centre (UMMC)

    NASA Astrophysics Data System (ADS)

    Jamalludin, Z.; Min, U. N.; Ishak, W. Z. Wan; Malik, R. Abdul

    2016-03-01

    This study presents our preliminary work on the implementation of computed tomography (CT) image guided brachytherapy (IGBT) in cervical cancer patients. We developed a protocol in which patients undergo two Magnetic Resonance Imaging (MRI) examinations, a) prior to external beam radiotherapy (EBRT) and b) prior to intra-cavitary brachytherapy, for tumour identification and delineation during IGBT planning and dosimetry. For each fraction, patients were simulated using a CT simulator and the images were transferred to the treatment planning system. The HR-CTV, IR-CTV, bladder and rectum were delineated following CT-based contouring for cervical cancer. Plans were optimised to achieve HR-CTV and IR-CTV doses (D90) of total EQD2 80 Gy and 60 Gy respectively, while limiting the minimum dose to the most irradiated 2 cm³ volume (D2cc) of the bladder and rectum to total EQD2 90 Gy and 75 Gy respectively. Data from seven insertions were analysed by comparing the volume-based doses with the traditional point-based doses. Based on our data, there were differences between the volume and point doses for the HR-CTV, bladder and rectum. As the number of patients receiving CT-based IGBT in our centre increases from day to day, it is expected that treatment and dosimetry accuracy will improve with the implementation.
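The EQD2 totals above come from the standard linear-quadratic conversion, EQD2 = n·d·(d + α/β)/(2 + α/β), summed over the EBRT and brachytherapy contributions. A sketch with illustrative fractionation numbers (not this study's prescriptions; α/β = 10 Gy assumed for tumour):

```python
def eqd2(n_fractions, dose_per_fraction, alpha_beta):
    """Equieffective dose in 2 Gy fractions (linear-quadratic model):
    EQD2 = n * d * (d + a/b) / (2 + a/b)."""
    d = dose_per_fraction
    return n_fractions * d * (d + alpha_beta) / (2.0 + alpha_beta)

# Illustrative schedule: 45 Gy EBRT in 25 fractions plus 4 HDR insertions
# of 7 Gy each, with alpha/beta = 10 Gy for tumour.
total = eqd2(25, 1.8, 10.0) + eqd2(4, 7.0, 10.0)
print(round(total, 2))  # EBRT 44.25 + HDR ~39.67 = ~83.92 Gy EQD2
```

Organ-at-risk constraints (D2cc) are accumulated the same way, conventionally with α/β = 3 Gy for late-responding normal tissue.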

  4. Preliminary Report: Results of Computed Tracer Concentrations over Eastern China, South Korea, and Japan for 01 March to 30 May 2007 Daily Simulated Releases from Taiyuan, China

    SciTech Connect

    Vogt, P

    2007-08-07

    In order to prepare for a proposed long-range tracer experiment in China in the spring 2008 time period, NARAC computed hypothetical PMCH concentrations over Eastern China, South Korea and Japan for simulated releases from Taiyuan, China. Normalized releases of 1 kg PMCH source strength were made twice a day, with wind input from a global forecast weather model. We used 6-hour analysis fields valid at the start of each model run, resulting in four wind fields per day. The selected domain encompassed the region of interest over eastern Asia and the Western Pacific. Screening runs were made for each day at 0000 and 1200 UTC from 01 March 2007 through 29 May 2007, for a total of 90 days and 180 cases. 24-hour average air concentrations were evaluated at 22 sample cities in the three regions of interest for each case. 15 sample cities were selected to help quantify modeling results against the experiment objectives. Any case that resulted in model-predicted air concentrations exceeding 2.0E-02 fL/L at a sample city in all three regions was then selected for a detailed model run, with source times six hours before and after evaluated in addition to the case time. The detailed runs used the same wind fields and model domain, but 6-hour average air concentrations were generated and analyzed for the 15 sample cities. Each of the 180 cases was ranked subjectively, based on whether or not the model prediction indicated the possibility that a release on that date and time might achieve the long-range experiment objectives. The ranks used are High, Good, Low, Poor, and Bad. Of the 180 cases run, NARAC dispersion models predicted 6 instances of High possibility, 8 cases of Good, 32 of Low, 74 of Poor, and 60 cases of Bad probability. Detailed model runs were made for all 14 High or Good probability cases, a total of only 7.8% of all cases analyzed.
Based on the results of this study we have identified a few dates on which a release of a reasonable amount of PMCH tracer (on the order of 500 kg
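The screening rule described above, selecting a case for a detailed run only when the predicted 24-hour average exceeds 2.0E-02 fL/L at a sample city in every region, can be sketched as follows (toy concentration values and hypothetical city groupings):

```python
THRESHOLD = 2.0e-2  # fL/L, 24-hour average air concentration

def select_for_detailed_run(case):
    """A case qualifies when the model-predicted concentration exceeds the
    threshold at at least one sample city in each of the three regions."""
    return all(
        any(conc > THRESHOLD for conc in region_concs)
        for region_concs in case.values()
    )

# Toy cases: region -> predicted concentrations at that region's sample cities.
case_good = {"E. China": [0.5, 0.01], "S. Korea": [0.03], "Japan": [0.021]}
case_poor = {"E. China": [0.5, 0.01], "S. Korea": [0.03], "Japan": [0.004]}
print(select_for_detailed_run(case_good), select_for_detailed_run(case_poor))  # True False
```

The subjective High/Good/Low/Poor/Bad ranking sits on top of this hard threshold and is not reproduced here.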

  5. A Preliminary Jupiter Model

    NASA Astrophysics Data System (ADS)

    Hubbard, W. B.; Militzer, B.

    2016-03-01

    In anticipation of new observational results for Jupiter's axial moment of inertia and gravitational zonal harmonic coefficients from the forthcoming Juno orbiter, we present a number of preliminary Jupiter interior models. We combine results from ab initio computer simulations of hydrogen-helium mixtures, including immiscibility calculations, with a new nonperturbative calculation of Jupiter's zonal harmonic coefficients, to derive a self-consistent model for the planet's external gravity and moment of inertia. We assume helium rain modified the interior temperature and composition profiles. Our calculation predicts zonal harmonic values to which measurements can be compared. Although some models fit the observed (pre-Juno) second- and fourth-order zonal harmonics to within their error bars, our preferred reference model predicts a fourth-order zonal harmonic whose absolute value lies above the pre-Juno error bars. This model has a dense core of about 12 Earth masses and a hydrogen-helium-rich envelope with approximately three times solar metallicity.
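The zonal-harmonic description of the external gravity field can be sketched generically; the J2 and J4 values below are illustrative order-of-magnitude stand-ins, not the paper's model output, and the expansion shown is the standard perturbative series rather than the authors' nonperturbative calculation.

```python
import numpy as np
from scipy.special import eval_legendre

def potential(r, theta, GM, a_eq, J):
    """External gravitational potential of an oblate planet,
    U = -(GM/r) * [1 - sum_n J_n (a_eq/r)^n P_n(cos theta)],
    with J a dict {n: J_n} and theta the colatitude."""
    x = np.cos(theta)
    s = sum(Jn * (a_eq / r) ** n * eval_legendre(n, x) for n, Jn in J.items())
    return -(GM / r) * (1.0 - s)

# Jupiter-like zonal harmonics of roughly the right magnitude (illustrative):
J = {2: 1.47e-2, 4: -5.9e-4}
GM, a_eq = 1.0, 1.0
U_eq = potential(2.0, np.pi / 2, GM, a_eq, J)    # equator
U_pole = potential(2.0, 0.0, GM, a_eq, J)        # pole
print(U_eq, U_pole)  # the potential well is slightly deeper over the equator
```

Fitting measured J2, J4, ... against such a model (plus the moment of inertia) is what constrains the core mass and envelope metallicity.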

  6. Preliminary DIAL model

    SciTech Connect

    Gentry, S.; Taylor, J.; Stephenson, D.

    1994-06-01

    A unique end-to-end LIDAR sensor model has been developed supporting the concept development stage of the CALIOPE UV DIAL and UV laser-induced-fluorescence (LIF) efforts. The model focuses on preserving the temporal and spectral nature of signals as they pass through the atmosphere, are collected by the optics, detected by the sensor, and processed by the sensor electronics and algorithms. This is done by developing accurate component sub-models with realistic inputs and outputs, as well as internal noise sources and operating parameters. These sub-models are then configured using data-flow diagrams to operate together so as to reflect the performance of the entire DIAL system. This modeling philosophy gives the developer a realistic indication of the nature of signals throughout the system and allows components and processing to be designed in a realistic environment. Current component models include atmospheric absorption and scattering losses, plume absorption and scattering losses, background, telescope and optical filter models, a PMT (photomultiplier tube) with realistic noise sources, amplifier operation and noise, A/D converter operation, noise and distortion, pulse averaging, and the DIAL computation. Preliminary results of the model will be presented, indicating the expected model operation for the October field test at the NTS spill test facility. Near-term upgrades to the model will also be outlined.
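The "DIAL computation" sub-model is, at its core, the standard two-wavelength ratio retrieval. A minimal sketch, checked against a synthetic, purely absorptive atmosphere (all numbers illustrative, not CALIOPE parameters):

```python
import numpy as np

def dial_concentration(P_on, P_off, R1, R2, d_sigma):
    """Path-averaged absorber concentration between ranges R1 and R2 from
    the standard DIAL ratio of on-line and off-line returns:
    N = ln[(P_on(R1) P_off(R2)) / (P_on(R2) P_off(R1))] / (2 d_sigma (R2 - R1)).
    P_on / P_off are callables returning received power at a given range."""
    ratio = (P_on(R1) * P_off(R2)) / (P_on(R2) * P_off(R1))
    return np.log(ratio) / (2.0 * d_sigma * (R2 - R1))

# Synthetic check: uniform concentration N_true with purely absorptive
# attenuation, P(R) = exp(-2 sigma N R); the retrieval must recover N_true.
sigma_on, sigma_off, N_true = 3.0e-3, 1.0e-3, 5.0
P_on = lambda R: np.exp(-2.0 * sigma_on * N_true * R)
P_off = lambda R: np.exp(-2.0 * sigma_off * N_true * R)
print(dial_concentration(P_on, P_off, 1.0, 2.0, sigma_on - sigma_off))  # ~5.0
```

In the end-to-end model, the same computation would be fed the noisy, averaged PMT/A/D outputs rather than ideal powers.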

  7. Absorbed Radiation Dose in Radiosensitive Organs Using 64- and 320-Row Multidetector Computed Tomography: A Comparative Study

    PubMed Central

    Khan, Atif N.; Nikolic, Boris; Khan, Mohammad K.; Kang, Jian; Khosa, Faisal

    2014-01-01

    Aim. To determine absorbed radiation dose (ARD) in radiosensitive organs during prospective and full-phase dose-modulated acquisition using an ECG-gated MDCTA scanner under 64- and 320-row detector modes. Methods. A female phantom was used to measure organ radiation dose. Five DP-3 radiation detectors were used to measure ARD to the lungs, breast, and thyroid using the Aquilion ONE scanner in 64- and 320-row modes, with both prospective gating and dose modulation in full-phase acquisition. Five measurements were made using three tube voltages (100, 120, and 135 kVp at 400 mA) at heart rates (HR) of 60 and 75 bpm for each protocol. Mean ARD per acquisition was recorded in milligrays (mGy). Results. Mean ARD was less for the 320-row than the 64-row mode for each imaging protocol. The prospective ECG-gated imaging protocol resulted in a statistically lower ARD using 320-row versus 64-row modes for midbreast (6.728 versus 19.687 mGy, P < 0.001), lung (6.102 versus 21.841 mGy, P < 0.001), and thyroid gland (0.208 versus 0.913 mGy; P < 0.001). Retrospective imaging using 320- versus 64-row modes showed lower ARD for midbreast (10.839 versus 43.169 mGy, P < 0.001), lung (8.848 versus 47.877 mGy, P < 0.001), and thyroid gland (0.057 versus 2.091 mGy; P < 0.001). Further ARD reduction was observed at lower kVp and heart rate. Conclusions. Dose reduction to radiosensitive organs is achieved using the 320-row compared to the 64-row mode for both prospective and retrospective gating, whereas the 64-row mode is equivalent to the same-model 64-row MDCT scanner. PMID:25170427

  8. Dedicated dental volumetric and total body multislice computed tomography: a comparison of image quality and radiation dose

    NASA Astrophysics Data System (ADS)

    Strocchi, Sabina; Colli, Vittoria; Novario, Raffaele; Carrafiello, Gianpaolo; Giorgianni, Andrea; Macchi, Aldo; Fugazzola, Carlo; Conte, Leopoldo

    2007-03-01

    The aim of this work is to compare the performance of a Xoran Technologies i-CAT cone beam CT for dental applications with that of a standard total body multislice CT (Toshiba Aquilion 64) used for dental examinations. Image quality and patient doses were compared for the three main i-CAT protocols, the Toshiba standard protocol, and a modified Toshiba protocol. Images of two phantoms were acquired: a standard CT quality control phantom and an Alderson Rando® anthropomorphic phantom. Image noise, signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR) and geometric accuracy were considered, and clinical image quality was assessed. The effective dose and the doses to the main head and neck organs were evaluated by means of thermoluminescent dosimeters (TLD-100) placed in the anthropomorphic phantom. A Quality Index (QI), defined as the ratio of the squared CNR to the effective dose, was evaluated. The evaluated effective doses range from 0.06 mSv (i-CAT 10 s protocol) to 2.37 mSv (Toshiba standard protocol). The modified Toshiba protocol (halved tube current, higher pitch) imparts a lower effective dose (0.99 mSv). The conventional CT device provides lower image noise and better SNR, but clinical effectiveness similar to that of the dedicated dental CT (comparable CNR and clinical judgment); consequently, QI values are much higher for the dedicated dental scanner. No geometric distortion was observed with either device. In conclusion, dental volumetric CT supplies image quality adequate for clinical purposes, at doses considerably lower than those imparted by a conventional CT device.
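The Quality Index used above is simply QI = CNR² / effective dose; a one-line sketch with illustrative (not measured) values shows why a lower-dose protocol can win despite a lower CNR:

```python
def quality_index(cnr, effective_dose_mSv):
    """Quality Index as defined in the study: QI = CNR^2 / effective dose."""
    return cnr ** 2 / effective_dose_mSv

# Illustrative numbers, not the study's measurements: a dedicated dental
# protocol with lower CNR but far lower dose still scores a higher QI.
qi_dental = quality_index(cnr=5.0, effective_dose_mSv=0.06)
qi_msct = quality_index(cnr=12.0, effective_dose_mSv=2.37)
print(qi_dental, qi_msct)  # ~416.7 vs ~60.8
```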

  9. Preliminary Drill Sites

    DOE Data Explorer

    Lane, Michael

    2013-06-28

    Preliminary locations for intermediate depth temperature gradient holes and/or resource confirmation wells based on compilation of geological, geophysical and geochemical data prior to carrying out the DOE-funded reflection seismic survey.

  10. Preliminary Response Analysis of AUV

    NASA Astrophysics Data System (ADS)

    Hariri, Azian; Basharie, Siti Mariam; Ghani, Mohamad Hanifah Abd.

    2010-06-01

    Developing an Autonomous Underwater Vehicle (AUV) requires a full understanding of its overall working principles, which demands time, experience, and a wide range of scientific knowledge. This study was undertaken to acquire the fundamental knowledge needed to understand the stability and response of an AUV. The longitudinal response and stability of the AUV owing to deflection of the stern plane during trimmed equilibrium motion can be computed by solving the AUV equations of motion. In this study, the AUV equations of motion were re-derived and the solution computed with the aid of Matlab software. Starting from an existing AUV, new dimensions, weight, and speed were specified and used in the re-derivation of the linearised AUV longitudinal equations of motion. The analysis shows that the stern plane and thrust have relatively steady longitudinal control power and quick response characteristics. The results give a preliminary insight into the specified AUV's response and dynamic stability.
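A step-response computation of the kind described can be sketched with a generic linear state-space model x' = Ax + Bu; the matrices below are hypothetical placeholders for illustration, not the study's derived AUV coefficients.

```python
import numpy as np

# Hypothetical linearised longitudinal dynamics: x could collect perturbations
# such as heave velocity and pitch rate, u the stern-plane deflection.
A = np.array([[-0.5, 1.0],
              [-1.0, -0.5]])
B = np.array([[0.0],
              [1.0]])

def step_response(A, B, t_end=30.0, dt=1e-3):
    """Forward-Euler simulation of the response to a unit step deflection."""
    x = np.zeros((A.shape[0], 1))
    for _ in range(int(t_end / dt)):
        x = x + dt * (A @ x + B)      # u = 1 (unit step input)
    return x.ravel()

x_ss = step_response(A, B)
print(x_ss)                              # converges to the steady state
print(-np.linalg.solve(A, B).ravel())    # analytic steady state -A^{-1} B
```

For a stable A (all eigenvalues in the left half-plane, as here), the simulated state settles to -A⁻¹B, the steady trim change for a held deflection.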

  11. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  12. Cognitive remediation for adolescents with 22q11 deletion syndrome (22q11DS): A preliminary study examining effectiveness, feasibility, and fidelity of a hybrid strategy, remote and computer-based intervention

    PubMed Central

    Mariano, Margaret A.; Tang, Kerri; Kurtz, Matthew; Kates, Wendy R.

    2015-01-01

    Background 22q11DS is a multiple anomaly syndrome involving intellectual and behavioral deficits, and increased risk for schizophrenia. As cognitive remediation (CR) has recently been found to improve cognition in younger patients with schizophrenia, we investigated the efficacy, feasibility, and fidelity of a remote, hybrid strategy, computerized CR program in youth with 22q11DS. Methods A longitudinal design was implemented in which 21 participants served as their own controls. Following an eight-month baseline period in which no interventions were provided, cognitive coaches met with participants remotely for CR via video conferencing three times a week over a targeted 8-month timeframe and facilitated their progress through the intervention, offering task-specific strategies. A subset of strategies was examined for fidelity. Outcomes were evaluated using a neurocognitive test battery at baseline, pre-treatment and post-treatment. Results All participants adhered to the intervention. The mean length of the treatment phase was 7.96 months. A moderately high correlation (intraclass correlation coefficient, 0.73) was found for the amount and type of strategies offered by coaches. Participants exhibited significant improvements (ES = .36–.55, p ≤ .009) in working memory, shifting attention and cognitive flexibility. All significant models were driven by improvements from pre- to post-treatment scores. Conclusions Based on our preliminary investigation, a remote, hybrid strategy, computerized CR program can be implemented with 22q11DS youth regardless of geographic location and despite health and cognitive deficits. It appears effective in enhancing cognitive skills during the developmental period of adolescence, making this type of CR delivery useful for youth with 22q11DS transitioning into post-school environments. PMID:26044111

  13. Catechol-O-Methyltransferase gene Val158Met polymorphism as a potential predictor of response to computer-assisted delivery of cognitive-behavioral therapy among cocaine-dependent individuals: Preliminary findings from a randomized controlled trial

    PubMed Central

    Carroll, Kathleen M.; Herman, Aryeh; DeVito, Elise E.; Frankforter, Tami L.; Potenza, Marc N; Sofuoglu, Mehmet

    2015-01-01

    Background Findings from uncontrolled studies suggest that the COMT Val108/158Met polymorphism may affect response to cognitive behavioral therapy (CBT) in some populations. Using data from a randomized controlled trial evaluating computerized CBT (CBT4CBT), we evaluated treatment response by COMT genotype, with the a priori hypothesis that Val carriers would have improved response to computerized delivery of CBT. Methods 101 cocaine-dependent individuals, of whom 81 contributed analyzable genetic samples, were randomized to standard methadone maintenance treatment plus CBT4CBT or standard treatment alone in an 8-week trial. Results There was a significant genotype by time effect on frequency of cocaine use from baseline to the end of the 6-month follow-up, suggesting greater reductions over time for Val carriers relative to individuals with the Met/Met genotype. There was a significant treatment condition by genotype interaction for rates of participants attaining 21 or more days of continuous abstinence, as well as for self-reported percent days of abstinence, suggesting less cocaine use among Val carriers when assigned to CBT compared to standard treatment. Exploration of possible mechanisms using measures of attentional bias also pointed to greater change over time in these measures among the Val carriers assigned to CBT. Conclusion These are the first data from a randomized controlled trial indicating significant interactions of COMT polymorphism and behavioral therapy condition on treatment outcome, with Val carriers appearing to respond particularly well to computerized CBT. These preliminary data point to a potential biomarker of response to CBT linked to its putative mechanism of action, enhanced cognitive control. PMID:25930952

  14. Genesis Preliminary Examination Plans

    NASA Technical Reports Server (NTRS)

    McNamara, K. M.; Stansbery, E. K.

    2004-01-01

    The purpose of preliminary examination of the Genesis sample collectors is to provide information on the condition and availability of collector materials to the science community as a basis for allocation requests. Similarly, the information will be used by the Genesis Sample Allocation sub-committee of CAPTEM to determine the optimum allocation scheme, and by the Genesis Curator to determine the processing sequence for allocation production. The plan includes a decision process and detailed examination and documentation protocol for whole arrays and individual collectors (wafers, concentrator targets, bulk metallic glass, gold foil, and polished aluminum). It also includes a plan for communicating the information obtained to the scientific community. The plan does not include a detailed plan for preliminary examination of the SRC lid foil collectors, the process for removal of individual collectors from their frames, or for the subsequent subdivision of collector materials for allocation.

  15. Preliminary decommissioning study reports

    SciTech Connect

    Peretz, F.J.

    1984-09-01

    The Molten Salt Reactor Experiment (MSRE) is one of approximately 76 facilities currently managed by the ORNL Surplus Facilities Management Program (SFMP). This program, as part of the DOE national SFMP, is responsible for the maintenance and surveillance and the final decommissioning of radioactively-contaminated surplus ORNL facilities. A long range planning effort is being conducted that will outline the scope and objectives of the ORNL program and establish decommissioning priorities based on health and safety concerns, budget constraints, and other programmatic constraints. In support of this SFMP planning activity, preliminary engineering assessments are being conducted for each of the ORNL surplus facilities currently managed under the program. These efforts, in general, are designed to: (1) provide an initial assessment of the potential decommissioning alternatives; (2) choose a preferred alternative and provide a justification for that choice, and (3) provide a preliminary description of the decommissioning plan, including cost and schedule estimates. Because of several issues which cannot be evaluated quantitatively at this time, this report on the MSRE does not select a "most probable decommissioning mode" but rather discusses the issues and representative alternatives for disposal of the MSRE fuel salts and decommissioning of the facility. A budget and schedule representative of the types of activities likely to be required is also suggested for preliminary use in the SFMP Long Range Plan.

  16. On Preliminary Breakdown

    NASA Astrophysics Data System (ADS)

    Beasley, W. H.; Petersen, D.

    2013-12-01

    The preliminary breakdown phase of a negative cloud-to-ground lightning flash was observed in detail. Observations were made with a Photron SA1.1 high-speed video camera operating at 9,000 frames per second, fast optical sensors, a flat-plate electric field antenna covering the SLF to MF band, and VHF and UHF radio receivers with bandwidths of 20 MHz. Bright stepwise extensions of a negative leader were observed at an altitude of 8 km during the first few milliseconds of the flash, and were coincident with bipolar electric field pulses called 'characteristic pulses'. The 2-D step lengths of the preliminary processes were in excess of 100 meters, with some 2-D step lengths in excess of 200 meters. Smaller and shorter unipolar electric field pulses were superposed onto the bipolar electric field pulses, and were coincident with VHF and UHF radio pulses. After a few milliseconds, the emerging negative stepped leader system showed a marked decrease in luminosity, step length, and propagation velocity. Details of these events will be discussed, including the possibility that the preliminary breakdown phase consists not of a single developing lightning leader system, but of multiple smaller lightning leader systems that eventually join together into a single system.

  17. Computer-Based Education at Cedarville College: A White Paper.

    ERIC Educational Resources Information Center

    Rogers, Rex M.

    In order to provide a framework for addressing computer literacy and program development considerations, this document discusses issues in computer implementation on higher education campuses and outlines a preliminary plan for Cedarville College to meet computer-related needs of faculty and students. The growth of the computer industry and the…

  18. A preliminary weather model for optical communications through the atmosphere

    NASA Technical Reports Server (NTRS)

    Shaik, K. S.

    1988-01-01

    A preliminary weather model is presented for optical propagation through the atmosphere. It can be used to compute the attenuation loss due to the atmosphere for desired link availability statistics. The quantitative results that can be obtained from this model provide good estimates for the atmospheric link budget necessary for the design of an optical communication system. The result is extended to provide for the computation of joint attenuation probability for n sites with uncorrelated weather patterns.
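The joint statistic for n sites with uncorrelated weather follows directly from independence: the probability that every site is attenuated below threshold simultaneously is the product of the single-site outage probabilities. A sketch with hypothetical outage figures:

```python
def joint_outage_probability(site_outage_probs):
    """Probability that all n sites are simultaneously unavailable,
    assuming uncorrelated weather patterns: the product of the
    individual outage probabilities."""
    p = 1.0
    for pi in site_outage_probs:
        p *= pi
    return p

# Three hypothetical sites, each weather-unavailable 30% of the time:
p_all_down = joint_outage_probability([0.3, 0.3, 0.3])
print(p_all_down, 1.0 - p_all_down)  # ~0.027 joint outage, ~97.3% availability
```

This is why site diversity is so effective in the link budget: a few modestly reliable sites combine into a highly available network.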

  19. The ASTRO-1 preliminary design review coupled load analysis

    NASA Technical Reports Server (NTRS)

    Mcghee, D. S.

    1984-01-01

    Results of the ASTRO-1 preliminary design review coupled loads analysis are presented. The M6.0Y Generic Shuttle mathematical models were used. Internal accelerations, interface forces, relative displacements, and net c.g. accelerations were recovered for two ASTRO-1 payloads in a tandem configuration. Twenty-seven load cases were computed and summarized. Load exceedances were found and recommendations made.

  20. Automated CPX support system preliminary design phase

    NASA Technical Reports Server (NTRS)

    Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.

    1984-01-01

    The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercises was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.

  1. Modeling the complete Otto cycle - Preliminary version

    NASA Technical Reports Server (NTRS)

    Zeleznik, F. J.; Mcbride, B. J.

    1977-01-01

    A description is given of the equations and the computer program being developed to model the complete Otto cycle. The program incorporates such important features as: (1) heat transfer, (2) finite combustion rates, (3) complete chemical kinetics in the burned gas, (4) exhaust gas recirculation, and (5) manifold vacuum or supercharging. Changes in thermodynamic, kinetic and transport data as well as model parameters can be made without reprogramming. Preliminary calculations indicate that: (1) chemistry and heat transfer significantly affect composition and performance, (2) there seems to be a strong interaction among model parameters, and (3) a number of cycles must be calculated in order to obtain steady-state conditions.
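For orientation, the air-standard textbook limit of the cycle being modeled is the ideal Otto efficiency η = 1 − r^(1−γ); the heat transfer, finite combustion rates, and kinetics in the model above all push real performance away from this bound.

```python
def ideal_otto_efficiency(compression_ratio, gamma=1.4):
    """Air-standard Otto-cycle thermal efficiency, eta = 1 - r**(1 - gamma).
    This is only the textbook limit; the model described above additionally
    accounts for heat transfer, finite combustion rates, kinetics, EGR,
    and manifold conditions."""
    return 1.0 - compression_ratio ** (1.0 - gamma)

print(ideal_otto_efficiency(8.0))   # ~0.565 for r = 8, gamma = 1.4
```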

  2. [Use of Computers in Introductory Physics Teaching.

    ERIC Educational Resources Information Center

    Merrill, John R.

    This paper presents some of the preliminary results of Project COEXIST at Dartmouth College, an NSF sponsored project to investigate ways to use computers in introductory physics and mathematics teaching. Students use the computer in a number of ways on homework, on individual projects, and in the laboratory. Students write their own programs,…

  3. Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D

    NASA Technical Reports Server (NTRS)

    Carle, Alan; Fagan, Mike; Green, Lawrence L.

    1998-01-01

    This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.
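What an adjoint code generator emits can be illustrated on a toy function: a forward sweep that stores intermediates, then a reverse sweep that propagates a seed derivative back to every input at once, verified here against central differences. (This is a generic illustration, not output of the actual generator applied to CFL3D.)

```python
import math

def f(x1, x2):
    """Toy 'flow solver': y = sin(x1 * x2) + x2**2."""
    return math.sin(x1 * x2) + x2 ** 2

def f_adjoint(x1, x2):
    """Hand-written adjoint (reverse) sweep for f: returns (df/dx1, df/dx2).
    An automated adjoint generator emits code of this shape from the primal
    source, at a cost independent of the number of design variables."""
    t = x1 * x2                 # forward sweep, storing intermediates
    y_bar = 1.0                 # reverse sweep, seeded with dy/dy = 1
    t_bar = y_bar * math.cos(t)
    return t_bar * x2, t_bar * x1 + y_bar * 2.0 * x2

x1, x2 = 0.7, 1.3
g1, g2 = f_adjoint(x1, x2)
h = 1e-6
fd1 = (f(x1 + h, x2) - f(x1 - h, x2)) / (2 * h)
fd2 = (f(x1, x2 + h) - f(x1, x2 - h)) / (2 * h)
print(g1 - fd1, g2 - fd2)   # both ~0: adjoint matches finite differences
```

The cost advantage is the point: one reverse sweep yields the whole gradient, which is what makes hundreds to thousands of shape design variables tractable.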

  4. Ruiz Volcano: Preliminary report

    NASA Astrophysics Data System (ADS)

    Ruiz Volcano, Colombia (4.88°N, 75.32°W). All times are local (= GMT -5 hours).An explosive eruption on November 13, 1985, melted ice and snow in the summit area, generating lahars that flowed tens of kilometers down flank river valleys, killing more than 20,000 people. This is history's fourth largest single-eruption death toll, behind only Tambora in 1815 (92,000), Krakatau in 1883 (36,000), and Mount Pelée in May 1902 (28,000). The following briefly summarizes the very preliminary and inevitably conflicting information that had been received by press time.

  5. Environmental Survey preliminary report

    SciTech Connect

    Not Available

    1988-04-01

    This report presents the preliminary findings from the first phase of the Environmental Survey of the United States Department of Energy (DOE) Sandia National Laboratories conducted August 17 through September 4, 1987. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with Sandia National Laboratories-Albuquerque (SNLA). The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at SNLA, and interviews with site personnel. 85 refs., 49 figs., 48 tabs.

  6. Computers and Computer Cultures.

    ERIC Educational Resources Information Center

    Papert, Seymour

    1981-01-01

    Instruction using computers is viewed as different from most other approaches to education, by allowing more than right or wrong answers, by providing models for systematic procedures, by shifting the boundary between formal and concrete processes, and by influencing the development of thinking in many new ways. (MP)

  7. Speckle-variance optical coherence tomography: a novel approach to skin cancer characterization using vascular patterns.

    PubMed

    Markowitz, Orit; Schwartz, Michelle; Minhas, Sumeet; Siegel, Daniel M

    2016-01-01

    Non-invasive imaging devices are currently being utilized in research and clinical settings to help visualize, characterize, and diagnose cancers of the skin. Speckle-variance optical coherence tomography (svOCT) is one such technology that offers considerable promise for non-invasive, real-time detection of skin cancers, given its added ability to show changes in microvasculature. We present four early lesions of the face, namely sebaceous hyperplasia, basal cell carcinoma, pigmented actinic keratosis, and malignant melanoma in situ, each displaying different important identification markers on svOCT. Until now, svOCT has mainly been evaluated for lesion diagnosis using transversal (vertical) sections. Our preliminary svOCT findings use dynamic en face (horizontal) visualization to differentiate lesions based on their specific vascular organizations. These observed patterns further elucidate the potential of this imaging device to become a powerful tool in patient disease assessment. PMID:27617454

  8. Preliminary ISIS users manual

    NASA Technical Reports Server (NTRS)

    Grantham, C.

    1979-01-01

    The Interactive Software Invocation System (ISIS), an interactive data management system, was developed to act as a buffer between the user and the host computer system. ISIS provides the user with a powerful system for developing software or systems in the interactive environment, and it protects the user from the idiosyncrasies of the host computer system by providing such a complete range of capabilities that the user should have no need for direct access to the host computer. These capabilities are divided into four areas: desk top calculator, data editor, file manager, and tool invoker.

  9. 2-D Fused Image Reconstruction approach for Microwave Tomography: a theoretical assessment using FDTD Model.

    PubMed

    Bindu, G; Semenov, S

    2013-01-01

    This paper describes an efficient two-dimensional fused image reconstruction approach for Microwave Tomography (MWT). Finite-Difference Time-Domain (FDTD) models were created for a viable MWT experimental system, with the transceivers modelled using a thin-wire approximation with resistive voltage sources. Born Iterative and Distorted Born Iterative methods were employed for image reconstruction, with extremity imaging done using a differential imaging technique. The forward solver in the imaging algorithm uses the FDTD method to solve the time-domain Maxwell's equations, with the regularisation parameter computed using a stochastic approach. The algorithm was tested with 10% added noise; successful image reconstruction demonstrates its robustness.
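
    The FDTD forward solver mentioned above rests on the standard Yee leapfrog update; a minimal one-dimensional free-space version is sketched below (illustrative only — the paper's solver is two-dimensional with material models and transceiver sources):

```python
import numpy as np

def fdtd_1d(n_cells=200, n_steps=300):
    """Bare-bones 1D FDTD (Yee leapfrog) for Ez/Hy in free space,
    normalized units, Courant number 1, PEC ends; a soft Gaussian
    source is injected at the center cell."""
    ez = np.zeros(n_cells)
    hy = np.zeros(n_cells - 1)
    for step in range(n_steps):
        hy += ez[1:] - ez[:-1]          # update H from the curl of E
        ez[1:-1] += hy[1:] - hy[:-1]    # update E from the curl of H
        ez[n_cells // 2] += np.exp(-((step - 30) / 10.0) ** 2)
    return ez

ez = fdtd_1d()
```

    An inverse scheme such as the Born iterative method repeatedly runs a forward solver like this against measured fields, updating the material map until the simulated and measured data agree.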

  10. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  11. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  12. In Situ Cryo-Electron Tomography: A Post-Reductionist Approach to Structural Biology.

    PubMed

    Asano, Shoh; Engel, Benjamin D; Baumeister, Wolfgang

    2016-01-29

    Cryo-electron tomography is a powerful technique that can faithfully image the native cellular environment at nanometer resolution. Unlike many other imaging approaches, cryo-electron tomography provides a label-free method of detecting biological structures, relying on the intrinsic contrast of frozen cellular material for direct identification of macromolecules. Recent advances in sample preparation, detector technology, and phase plate imaging have enabled the structural characterization of protein complexes within intact cells. Here, we review these technical developments and outline a detailed computational workflow for in situ structural analysis. Two recent studies are described to illustrate how this workflow can be adapted to examine both known and unknown cellular complexes. The stage is now set to realize the promise of visual proteomics--a complete structural description of the cell's native molecular landscape. PMID:26456135

  13. In Situ Cryo-Electron Tomography: A Post-Reductionist Approach to Structural Biology.

    PubMed

    Asano, Shoh; Engel, Benjamin D; Baumeister, Wolfgang

    2016-01-29

    Cryo-electron tomography is a powerful technique that can faithfully image the native cellular environment at nanometer resolution. Unlike many other imaging approaches, cryo-electron tomography provides a label-free method of detecting biological structures, relying on the intrinsic contrast of frozen cellular material for direct identification of macromolecules. Recent advances in sample preparation, detector technology, and phase plate imaging have enabled the structural characterization of protein complexes within intact cells. Here, we review these technical developments and outline a detailed computational workflow for in situ structural analysis. Two recent studies are described to illustrate how this workflow can be adapted to examine both known and unknown cellular complexes. The stage is now set to realize the promise of visual proteomics--a complete structural description of the cell's native molecular landscape.

  14. Optical computer motherboards

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz P.; Xu, Guoda; Bartha, John M.; Gruntman, Michael A.

    1997-09-01

    In this paper, we investigate the application of precision plastic optics into a communication/computer sub-system, such as a hybrid computer motherboard. We believe that using optical waveguides for next-generation computer motherboards can provide a high performance alternative for present multi-layer printed circuit motherboards. In response to this demand, we suggest our novel concept of a hybrid motherboard based on an internal-fiber-coupling (IFC) wavelength-division-multiplexing (WDM) optical backplane. The IFC/WDM backplane provides dedicated Tx/Rx connections, and applies low-cost, high-performance components, including CD LDs, GRIN plastic fibers, molded housings, and nonimaging optics connectors. Preliminary motherboard parameters are: speed 100 MHz/100 m, or 1 GHz/10 m; fiber loss approximately 0.01 dB/m; almost zero fan-out/fan-in optical power loss, and eight standard wavelength channels. The proposed hybrid computer motherboard, based on innovative optical backplane technology, should solve low-speed, low-parallelism bottlenecks in present electric computer motherboards.

  15. We Teach the Children: Computer Literacy as a Feminist Issue.

    ERIC Educational Resources Information Center

    Rampy, Leah Moran

    Preliminary research on the distribution and use of computers in public schools indicates the existence of socioeconomic, regional, and sex inequalities. Inequity is found in the ownership of computers by non-Southern schools and by schools serving students from middle and high socioeconomic levels. Within schools owning computers, inequity is…

  16. Cooling Computers.

    ERIC Educational Resources Information Center

    Birken, Marvin N.

    1967-01-01

    Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economic, physical, and esthetic characteristics and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…

  17. Pygmalion's Computer.

    ERIC Educational Resources Information Center

    Peelle, Howard A.

    Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…

  18. Reproducibility of Tear Meniscus Measurement by Fourier-Domain Optical Coherence Tomography: A Pilot Study

    PubMed Central

    Zhou, Sheng; Li, Yan; Lu, Ake Tzu-Hui; Liu, Pengfei; Tang, Maolong; Yiu, Samuel C.; Huang, David

    2009-01-01

    BACKGROUND AND OBJECTIVE To study the reproducibility of tear meniscus measurement with high-speed high-resolution Fourier-domain optical coherence tomography (FD-OCT). PATIENTS AND METHODS Twenty normal participants were enrolled in this prospective study. The lower tear meniscus in the right eye of each subject was imaged by vertical scans centered on the inferior cornea and the lower eyelid using an FD-OCT system (RTVue; Optovue, Inc., Fremont, CA) with a corneal adaptor. The system performs 26,000 axial scans per second and has a 5-micron axial resolution. Each subject was examined at two visits 30 to 60 days apart. Each eye was scanned twice on each visit. The scans were taken 2 seconds after a blink. The lower meniscus height, depth, and cornea-meniscus angle were measured with a computer caliper. The cross-sectional area was calculated using a two-triangle approximation. RESULTS The between-visits coefficient of variation was 17.5%, 18.0%, 35.5%, and 12.2% for meniscus height, depth, area, and angle, respectively. The intraclass correlations for these parameters were 0.605, 0.558, 0.567, and 0.367, respectively. CONCLUSION FD-OCT measures lower tear meniscus dimensions and area with higher between-visits reproducibility than previous OCT instruments. FD-OCT may be a useful way to measure dry eye severity and treatment effectiveness. PMID:19772266
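
    The between-visits coefficient of variation reported above can be illustrated as follows (a sketch of one common estimator, the mean within-subject sd/mean; the paper does not spell out its exact formula, and the data below are hypothetical):

```python
import statistics

def between_visits_cov(per_subject_visits):
    """Mean within-subject coefficient of variation (sd / mean) across
    repeated visits; per_subject_visits is a list of per-subject lists."""
    covs = [statistics.stdev(v) / statistics.mean(v)
            for v in per_subject_visits]
    return statistics.mean(covs)

# hypothetical lower-meniscus heights (mm), three subjects x two visits
heights = [[0.25, 0.31], [0.30, 0.24], [0.28, 0.29]]
cov = between_visits_cov(heights)
```

    A smaller value indicates better repeatability between visits, which is the sense in which the FD-OCT measurements above outperform earlier OCT instruments.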

  19. Tomography: A window on the role of sulfur in the structure of micrometeorites

    NASA Astrophysics Data System (ADS)

    Taylor, Susan; Jones, Keith W.; Herzog, Gregory F.; Hornig, Claire E.

    2011-10-01

    To determine the role played by sulfides in the formation of vesicles and FeNi metal beads, we mapped the locations and tabulated the numbers of sulfides, metal beads, and vesicles in 1583 sectioned micrometeorites (MMs) using conventional microscopy and in 190 whole MMs using synchrotron computed microtomography (SCMT). Both the section and the SCMT images show that sulfides melt, coalesce, and migrate to the MMs surface. The decomposition of sulfides may occur during all these stages. Given the sulfide morphologies and compositions that we see in section, we think the breakdown of Ni sulfides produces the FeNi beads. The SCMT images show that metal beads are common in melted MMs, >50% have them. Vesicles in porphyritic and scoriaceous MMs are also probably formed as sulfides decompose. Not only do sulfides abut the vesicles but also the temperatures at which sulfides decompose overlap those at which MM surfaces first melt and temporarily seal, suggesting that S gases could produce most of these vesicles. As the vesicle shapes and patterns of distribution differ among MM classes, tomography can be used to nondestructively screen for specific types of MMs. Tomography is a powerful tool for visualizing the three-dimensional distribution of metal beads, sulfides, mean densities, and vesicles in MMs.

  20. Enhanced preliminary assessment

    SciTech Connect

    Not Available

    1992-02-01

    An Enhanced Preliminary Assessment was conducted at Fort Benjamin Harrison (FBH) Indiana, which is located approximately 12 miles from downtown Indianapolis in Lawrence Township, Marion County. FBH contains 2,501 acres, of which approximately 1,069 acres is covered by woodlands. Activities at FBH include administration, training, housing, and support. Sensitive environments at FBH include wetlands, habitat areas for the endangered Indiana bat, endangered plants, and historically and archeologically significant areas. FBH is a U.S. Army Soldier Support Center under the jurisdiction of the U.S. Army Training and Doctrine Command (TRADOC). Based on information obtained during and subsequent to a site visit (15 through 18 October 1991), 36 types of Areas Requiring Environmental Evaluation (AREEs) were identified and grouped by the following categories: Facility Operations; Maintenance/Fueling Operations; Water Treatment Operations; Training Areas; Hazardous Materials Storage/Waste Handling Areas; Sanitary Wastewater Treatment Plants; Storage Tanks; Landfills/Incinerators; Medical Facilities; Burn Pit Areas; Spill Areas; Ammunition Storage; Coal Storage; and Facility-wide AREEs. This report presents a summary of findings for each AREE and recommendations for further action.

  1. Preliminary Analysis of Photoreading

    NASA Technical Reports Server (NTRS)

    McNamara, Danielle S.

    2000-01-01

    The purpose of this project was to provide a preliminary analysis of a reading strategy called PhotoReading. PhotoReading is a technique developed by Paul Scheele that claims to increase reading rate to 25,000 words per minute (Scheele, 1993). PhotoReading itself involves entering a "relaxed state" and looking at, but not reading, each page of a text for a brief moment (about 1 to 2 seconds). While this technique has received attention in the popular press, there had been no objective examinations of the technique's validity. To examine the effectiveness of PhotoReading, the principal investigator (i.e., trainee) participated in a PhotoReading workshop to learn the technique. Parallel versions of two standardized and three experimenter-created reading comprehension tests were administered to the trainee and an expert user of the PhotoReading technique to compare the use of normal reading strategies and the PhotoReading technique by both readers. The results for all measures yielded no benefits of using the PhotoReading technique. The extremely rapid reading rates claimed by PhotoReaders were not observed; indeed, the reading rates were generally comparable to those for normal reading. Moreover, the PhotoReading expert generally took longer to read a text when using the PhotoReading technique than when using normal reading strategies, and this increase in reading time was accompanied by a decrease in text comprehension.

  2. Synthesis, Preliminary Bioevaluation and Computational Analysis of Caffeic Acid Analogues

    PubMed Central

    Liu, Zhiqian; Fu, Jianjun; Shan, Lei; Sun, Qingyan; Zhang, Weidong

    2014-01-01

    A series of caffeic acid amides were designed, synthesized and evaluated for anti-inflammatory activity. Most of them exhibited promising anti-inflammatory activity against nitric oxide (NO) generation in murine macrophage RAW264.7 cells. A 3D pharmacophore model was created based on the biological results for further structural optimization. Moreover, prediction of the potential targets was also carried out by the PharmMapper server. These amide analogues represent a promising class of anti-inflammatory scaffold for further exploration and target identification. PMID:24857914

  3. Spectral-element global waveform tomography: A second-generation upper-mantle model

    NASA Astrophysics Data System (ADS)

    French, S. W.; Lekic, V.; Romanowicz, B. A.

    2012-12-01

    The SEMum model of Lekic and Romanowicz (2011a) was the first global upper-mantle VS model obtained using whole-waveform inversion with spectral element (SEM: Komatitsch and Vilotte, 1998) forward modeling of time domain three component waveforms. SEMum exhibits stronger amplitudes of heterogeneity in the upper 200km of the mantle compared to previous global models - particularly with respect to low-velocity anomalies. To make SEM-based waveform inversion tractable at global scales, SEMum was developed using: (1) a version of SEM coupled to 1D mode computation in the earth's core (C-SEM, Capdeville et al., 2003); (2) asymptotic normal-mode sensitivity kernels, incorporating multiple forward scattering and finite-frequency effects in the great-circle plane (NACT: Li and Romanowicz, 1995); and (3) a smooth anisotropic crustal layer of uniform 60km thickness, designed to match global surface-wave dispersion while reducing the cost of time integration in the SEM. The use of asymptotic kernels reduced the number of SEM computations considerably (≥ 3x) relative to purely numerical approaches (e.g. Tarantola, 1984), while remaining sufficiently accurate at the periods of interest (down to 60s). However, while the choice of a 60km crustal-layer thickness is justifiable in the continents, it can complicate interpretation of shallow oceanic upper-mantle structure. We here present an update to the SEMum model, designed primarily to address these concerns. The resulting model, SEMum2, was derived using a crustal layer that again fits global surface-wave dispersion, but with a more geologically consistent laterally varying thickness: approximately honoring Crust2.0 (Bassin, et al., 2000) Moho depth in the continents, while saturating at 30km in the oceans. We demonstrate that this approach does not bias our upper mantle model, which is constrained not only by fundamental mode surface waves, but also by overtone waveforms. We have also improved our data-selection and

  4. Computational dosimetry

    SciTech Connect

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry," which is interpreted as the sub-discipline of computational physics devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, and another very important application is the support it can give to the design, optimization, and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper, emphasis is given to the discussion of potential pitfalls in the application of computational dosimetry, and recommendations are given for their avoidance. The need to compare calculated and experimental data whenever possible is strongly stressed.

  5. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Reviews the performance of computer programs for writing poetry and prose, including MARK V. SHANEY, MELL, POETRY GENERATOR, THUNDER THOUGHT, and ORPHEUS. Discusses the writing principles of the programs. Provides additional information on computer magnification techniques. (YP)

  6. Computational Toxicology

    EPA Science Inventory

    'Computational toxicology' is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...

  7. Female Computer

    NASA Technical Reports Server (NTRS)

    1964-01-01

    Melba Roy heads the group of NASA mathematicians, known as 'computers,' who track the Echo satellites. Roy's computations help produce the orbital element timetables by which millions can view the satellite from Earth as it passes overhead.

  8. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  9. Quantum computing

    PubMed Central

    Li, Shu-Shen; Long, Gui-Lu; Bai, Feng-Shan; Feng, Song-Lin; Zheng, Hou-Zhi

    2001-01-01

    Quantum computing is a quickly growing research field. This article introduces the basic concepts of quantum computing, recent developments in quantum searching, and decoherence in a possible quantum dot realization. PMID:11562459

  10. Computer Algebra.

    ERIC Educational Resources Information Center

    Pavelle, Richard; And Others

    1981-01-01

    Describes the nature and use of computer algebra and its applications to various physical sciences. Includes diagrams illustrating, among others, a computer algebra system and flow chart of operation of the Euclidean algorithm. (SK)

  11. Modeling aspects and computational methods for some recent problems of tomographic imaging

    NASA Astrophysics Data System (ADS)

    Allmaras, Moritz

    In this dissertation, two recent problems from tomographic imaging are studied, and results from numerical simulations with synthetic data are presented. The first part deals with ultrasound modulated optical tomography, a method for imaging interior optical properties of partially translucent media that combines optical contrast with ultrasound resolution. The primary application is the optical imaging of soft tissue, for which scattering and absorption rates contain important functional and structural information about the physiological state of tissue cells. We developed a mathematical model based on the diffusion approximation for photon propagation in highly scattering media. Simple reconstruction schemes for recovering optical absorption rates from boundary measurements with focused ultrasound are presented. We show numerical reconstructions from synthetic data generated for mathematical absorption phantoms. The results indicate that high resolution imaging with quantitatively correct values of absorption is possible. Synthetic focusing techniques are suggested that allow reconstruction from measurements with certain types of non-focused ultrasound signals. A preliminary stability analysis for a linearized model is given that provides an initial explanation for the observed stability of reconstruction. In the second part, backprojection schemes are proposed for the detection of small amounts of highly enriched nuclear material inside 3D volumes. These schemes rely on the geometrically singular structure that small radioactive sources represent, compared to natural background radiation. The details of the detection problem are explained, and two types of measurements, collimated and Compton-type measurements, are discussed. Computationally, we implemented backprojection by counting the number of particle trajectories intersecting each voxel of a regular rectangular grid covering the domain of detection. For collimated measurements, we derived confidence
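
    The voxel-counting backprojection described above can be sketched as follows (dense sampling along hypothetical rays stands in for an exact voxel traversal; all geometry is illustrative):

```python
import numpy as np

def backproject(rays, grid_shape, bounds, n_samples=400):
    """Backprojection by voxel counting: for each measured particle
    trajectory, mark every voxel the ray passes through, then sum the
    per-ray masks. A concentrated count flags a candidate point source.
    (Sketch with dense sampling; an exact traversal such as
    Amanatides-Woo would replace the inner loop.)"""
    lo = np.asarray(bounds[0], float)
    hi = np.asarray(bounds[1], float)
    shape = np.asarray(grid_shape)
    counts = np.zeros(grid_shape, dtype=int)
    for origin, direction in rays:
        o = np.asarray(origin, float)
        d = np.asarray(direction, float)
        d /= np.linalg.norm(d)
        hit = np.zeros(grid_shape, dtype=bool)
        # march far enough to cross the whole domain from any start point
        t_max = 2.0 * np.linalg.norm(hi - lo)
        for t in np.linspace(0.0, t_max, n_samples):
            p = o + t * d
            if np.all(p >= lo) and np.all(p < hi):
                idx = tuple(((p - lo) / (hi - lo) * shape).astype(int))
                hit[idx] = True
        counts += hit
    return counts

# three hypothetical collimated rays converging on the cube center
rays = [((-1.0, 0.5, 0.5), (1.0, 0.0, 0.0)),
        ((0.5, -1.0, 0.5), (0.0, 1.0, 0.0)),
        ((0.5, 0.5, -1.0), (0.0, 0.0, 1.0))]
counts = backproject(rays, (10, 10, 10), ((0, 0, 0), (1, 1, 1)))
```

    The voxel where the most trajectories intersect is the best candidate for a compact source, which is the geometric singularity the detection scheme exploits.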

  12. Preliminary result of Indonesian strain map based on geodetic measurements

    NASA Astrophysics Data System (ADS)

    Susilo, Meilano, Irwan; Abidin, Hasanuddin Z.; Sapiie, Benyamin; Efendi, Joni; Wijanarto, Antonius B.

    2016-05-01

    GPS measurements across the Indonesian region from 1993 to 2014 provide long time series at 2-3 millimetre-level precision, from which surface velocity estimates are derived. In this study, we use this GPS velocity field to construct a crustal strain rate map; a physical model is not yet included. In this preliminary result, we compute only the magnitude of the strain rate. The strain map is useful for constructing a deformation model of Indonesia and for supporting the Indonesian datum.
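
    A strain rate magnitude of the kind described can be computed from a gridded velocity field as sketched below (one common formulation, the magnitude of the symmetric velocity-gradient tensor; the study's exact definition is not stated, and the data are hypothetical):

```python
import numpy as np

def strain_rate_magnitude(ve, vn, dx, dy):
    """Magnitude of the 2D strain-rate tensor from a gridded horizontal
    velocity field (ve east, vn north), arrays indexed [row = y, col = x]."""
    dve_dy, dve_dx = np.gradient(ve, dy, dx)  # gradients along y then x
    dvn_dy, dvn_dx = np.gradient(vn, dy, dx)
    exx = dve_dx                      # east-west normal strain rate
    eyy = dvn_dy                      # north-south normal strain rate
    exy = 0.5 * (dve_dy + dvn_dx)     # shear strain rate
    return np.sqrt(exx**2 + eyy**2 + 2.0 * exy**2)

# hypothetical uniform shear: eastward velocity increasing northward
_, Y = np.meshgrid(np.arange(5.0), np.arange(5.0))
ve = 2.0 * Y
vn = np.zeros_like(ve)
mag = strain_rate_magnitude(ve, vn, dx=1.0, dy=1.0)
```

    For this uniform-shear field the magnitude is constant, as expected; on real GPS velocities the same computation highlights zones of concentrated deformation.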

  13. Bacterial Identification Using Light Scattering Measurements: a Preliminary Report

    NASA Technical Reports Server (NTRS)

    Wilkins, J. R.

    1971-01-01

    The light scattering properties of single bacterial cells were examined as a possible means of identification. Three species were studied, with Streptococcus faecalis exhibiting a unique pattern; the light-scattering traces for Staphylococcus aureus and Escherichia coli were quite similar, although differences existed. Based on preliminary investigations, the light scattering approach appeared promising, with additional research needed to include a wide variety of bacterial species, computer capability to handle and analyze data, and expansion of light scattering theory to include bacterial cells.

  14. Preliminary heavy-light decay constants from the MILC Collaboration

    SciTech Connect

    Bernard, C.

    1994-12-01

    Preliminary results from the MILC Collaboration for f_B, f_{Bs}, f_D, f_{Ds} and their ratios are presented. We compute in the quenched approximation at β = 6.3, 6.0 and 5.7 with Wilson light quarks and static and Wilson heavy quarks. We attempt to quantify all systematic errors other than quenching.

  15. Computer Ease.

    ERIC Educational Resources Information Center

    Drenning, Susan; Getz, Lou

    1992-01-01

    Computer Ease is an intergenerational program designed to put an Ohio elementary school's computer lab, software library, staff, and students at the disposal of older adults desiring to become computer literate. Three 90-minute instructional sessions allow seniors to experience 1-to-1 high-tech instruction by enthusiastic, nonthreatening…

  16. Parallel computers

    SciTech Connect

    Treveaven, P.

    1989-01-01

    This book presents an introduction to object-oriented, functional, and logic parallel computing on which the fifth generation of computer systems will be based. Coverage includes concepts for parallel computing languages, a parallel object-oriented system (DOOM) and its language (POOL), an object-oriented multilevel VLSI simulator using POOL, and implementation of lazy functional languages on parallel architectures.

  17. Computer Manual.

    ERIC Educational Resources Information Center

    Illinois State Office of Education, Springfield.

    This manual, designed to provide the teacher with methods of understanding the computer and its potential in the classroom, includes four units with exercises and an answer sheet. Unit 1 covers computer fundamentals, the minicomputer, programming languages, an introduction to BASIC, and control instructions. Variable names and constants described…

  18. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  19. Parallel Computing in SCALE

    SciTech Connect

    DeHart, Mark D; Williams, Mark L; Bowman, Stephen M

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer-term SCALE enhancement is also described.

  20. Photovoltaic stand-alone systems: Preliminary engineering design handbook

    NASA Astrophysics Data System (ADS)

    Macomber, H. L.; Ruzek, J. B.; Costello, F. A.

    1981-08-01

    Component design and engineering information is presented, including estimation and reduction strategies, PV array characteristics, and material on batteries, power-handling equipment, and backup systems. The data needed to begin the design process and preliminary system design considerations are discussed; these considerations include analysis of insolation and siting, system sizing, feasibility assessment, and reliability engineering approaches. Information on system design procedures and applicable codes and standards is presented, and emphasis is given to system installation, operation, and maintenance issues; personnel and facility safety requirements; and various means of calculating insolation, including computer software and statistical computations.
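
    A first-pass system-sizing step of the kind the handbook covers can be sketched as a rule of thumb (illustrative only; the handbook's full procedure also accounts for battery autonomy, siting, and reliability margins, and all numbers below are hypothetical):

```python
def pv_array_size_kw(daily_load_kwh, peak_sun_hours, derate=0.7):
    """Rule-of-thumb stand-alone PV array sizing: the rated array power
    (kW) needed so that, after a system derate factor for wiring,
    temperature, and conversion losses, the array meets the average
    daily load at the site's insolation."""
    return daily_load_kwh / (peak_sun_hours * derate)

# e.g. a 10 kWh/day load at 5 peak sun hours with a 0.7 derate factor
size = pv_array_size_kw(10.0, 5.0)
```

    The peak-sun-hours figure comes from the insolation analysis the handbook describes; the derate factor is where reliability engineering choices enter the sizing.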

  1. [Pneumothorax revealed by postoperative computed tomography].

    PubMed

    Ikeda, Shizuka; Katori, Kiyoshi; Fujimoto, Minoru; Nitahara, Keiichi; Higa, Kazuo

    2005-11-01

    We report a case of pneumothorax revealed by postoperative computed tomography. A 39-year-old obese woman (height 153 cm, weight 70 kg) with fractures of the radius, ulna, clavicle, and femur sustained in a traffic accident was scheduled for osteosynthesis. Anesthesia was induced with thiopental and maintained with 50% nitrous oxide in oxygen and sevoflurane. The SpO2 decreased from 99% to 94% during the surgery. Bilateral chest sounds were symmetrical. The SpO2 increased to 100% after discontinuation of nitrous oxide. Pneumothorax was not evident on a postoperative chest X-ray, but computed tomography of the chest demonstrated right-sided pneumothorax. An ECG electrode had overlapped the fractured rib on the preoperative chest X-ray.

  2. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  3. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  4. Computer Literacy: Teaching Computer Ethics.

    ERIC Educational Resources Information Center

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  5. Computer Jet-Engine-Monitoring System

    NASA Technical Reports Server (NTRS)

    Disbrow, James D.; Duke, Eugene L.; Ray, Ronald J.

    1992-01-01

    "Intelligent Computer Assistant for Engine Monitoring" (ICAEM), computer-based monitoring system intended to distill and display data on conditions of operation of two turbofan engines of F-18, is in preliminary state of development. System reduces burden on propulsion engineer by providing single display of summary information on statuses of engines and alerting engineer to anomalous conditions. Effective use of prior engine-monitoring system requires continuous attention to multiple displays.

  6. C-shaped mandibular primary first molar diagnosed with cone beam computed tomography: A novel case report and literature review of primary molars' root canal systems.

    PubMed

    Ozcan, Gozde; Sekerci, Ahmet Ercan; Kocoglu, Fatma

    2016-01-01

    Knowledge of the different anatomical variations in the root canal systems of the deciduous dentition will improve the practice of pediatric dentists. Teeth with C-shaped root canal configurations pose a definite problem in endodontic treatment. Endodontic specialists must have adequate knowledge of the various root canal morphologies of primary teeth, which show a tendency for rapid progression of dental caries, to achieve a technically satisfactory outcome. This report presents an extraordinary case of unusual tooth morphology involving a mandibular first primary molar with a C-shaped configuration, which has not previously been reported. PMID:27681406

  7. Patient understanding of radiation risk from medical computed tomography-A comparison of Hispanic vs. non-Hispanic emergency department populations.

    PubMed

    McNierney-Moore, Afton; Smith, Cynthia; Guardiola, Jose; Xu, K Tom; Richman, Peter B

    2015-01-01

    Background. Cultural differences and language barriers may adversely impact patients with respect to understanding the risks/benefits of medical testing. Objective. We hypothesized that there would be no difference in Hispanic vs. non-Hispanic patients' knowledge of radiation risk that results from CT of the abdomen/pelvis (CTAP). Methods. We enrolled a convenience sample of adults at an inner-city emergency department (ED). Patients provided written answers to rate agreement on a 10-point scale for two correct statements comparing radiation exposure equality between: CTAP and 5 years of background radiation (question 1); CTAP and 200 chest x-rays (question 3). Patients also rated their agreement that multiple CT scans increase the lifetime cancer risk (question 2). Scores of >8 were considered good knowledge. Multivariate logistic regression analyses were performed to estimate the independent effect of the Hispanic variable. Results. There were 600 patients in the study group; 63% Hispanic, mean age 39.2 ± 13.9 years. Hispanics and non-Hispanic whites were similar with respect to good knowledge-level answers to question 1 (17.3 vs. 15.1%; OR = 1.2; 95% CI [0.74-2.0]), question 2 (31.2 vs. 39.3%; OR = 0.76; 95% CI [0.54-1.1]), and question 3 (15.2 vs. 16.5%; OR = 1.1; 95% CI [0.66-1.8]). Compared to patients who earned <$20,000, patients with incomes >$40,000 were more likely to answer question 2 with good knowledge (OR = 1.96; 95% CI [1.2-3.1]). Conclusion. The study group's overall knowledge of radiation risk was poor, but we did not find significant differences between Hispanic vs. non-Hispanic patients.
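    The odds ratios with 95% confidence intervals reported above come from standard 2x2-table arithmetic. A minimal sketch of the odds ratio with a Wald interval is shown below; the cell counts are hypothetical and are not the study's data.

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio and Wald 95% CI from a 2x2 table:
        a = group 1 with outcome, b = group 1 without,
        c = group 2 with outcome, d = group 2 without."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts, chosen so the odds ratio is exactly 1.0
    or_, lo, hi = odds_ratio_ci(60, 340, 30, 170)
    ```

    The study's multivariate estimates additionally adjust for covariates via logistic regression, which this unadjusted sketch does not do.
    
    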

  8. Assessment of systolic thickening with thallium-201 ECG-gated single-photon emission computed tomography: A parameter for local left ventricular function

    SciTech Connect

    Mochizuki, T.; Murase, K.; Fujiwara, Y.; Tanada, S.; Hamamoto, K.; Tauxe, W.N. )

    1991-08-01

    The authors measured left ventricular (LV) systolic thickening, expressed as a systolic thickening ratio, in 28 patients using 201Tl ECG-gated SPECT. Five normals, 15 patients with prior myocardial infarction, 5 with hypertrophic cardiomyopathy, and 3 with dilated cardiomyopathy were studied. The systolic thickening ratio was calculated as (end-systolic minus end-diastolic pixel counts) divided by end-diastolic pixel counts, using the circumferential profile technique on both end-diastolic and end-systolic short-axis images. Functional images of the systolic thickening ratio were also displayed with the bull's-eye method. The mean systolic thickening ratios thus calculated were as follows: normals, 0.53 ± 0.05 (mean ± 1 s.d.); non-transmural prior myocardial infarction, 0.33 ± 0.09; transmural prior myocardial infarction, 0.14 ± 0.05; hypertrophic cardiomyopathy in relatively nonhypertrophied areas, 0.56 ± 0.11; hypertrophic cardiomyopathy in hypertrophied areas, 0.23 ± 0.07; and dilated cardiomyopathy, 0.19 ± 0.02. The systolic thickening ratio analysis by gated thallium SPECT offers a unique approach for assessing LV function.
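    The ratio defined above is simple per-segment arithmetic on the circumferential profile counts. A minimal sketch, with hypothetical count values (not the study's data):

    ```python
    def thickening_ratio(ed_counts, es_counts):
        """Systolic thickening ratio per circumferential segment:
        (end-systolic - end-diastolic counts) / end-diastolic counts,
        following the definition given in the abstract."""
        return [(es - ed) / ed for ed, es in zip(ed_counts, es_counts)]

    ed = [100, 110, 95, 105]   # hypothetical end-diastolic profile counts
    es = [150, 160, 120, 158]  # hypothetical end-systolic profile counts
    print([round(r, 2) for r in thickening_ratio(ed, es)])
    # → [0.5, 0.45, 0.26, 0.5]
    ```

    In the study these per-segment values are then mapped onto a bull's-eye display rather than listed numerically.
    
    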

  9. Stardust Interstellar Preliminary Examination

    NASA Astrophysics Data System (ADS)

    Westphal, A.; Stardust Interstellar Preliminary Examination Team: http://www. ssl. berkeley. edu/~westphal/ISPE/

    2011-12-01

    A. J. Westphal, C. Allen, A. Ansari, S. Bajt, R. S. Bastien, H. A. Bechtel, J. Borg, F. E. Brenker, J. Bridges, D. E. Brownlee, M. Burchell, M. Burghammer, A. L. Butterworth, A. M. Davis, P. Cloetens, C. Floss, G. Flynn, D. Frank, Z. Gainsforth, E. Grün, P. R. Heck, J. K. Hillier, P. Hoppe, G. Huss, J. Huth, B. Hvide, A. Kearsley, A. J. King, B. Lai, J. Leitner, L. Lemelle, H. Leroux, R. Lettieri, W. Marchant, L. R. Nittler, R. Ogliore, F. Postberg, M. C. Price, S. A. Sandford, J.-A. Sans Tresseras, T. Schoonjans, S. Schmitz, G. Silversmit, A. Simionovici, V. A. Solé, R. Srama, T. Stephan, V. Sterken, J. Stodolna, R. M. Stroud, S. Sutton, M. Trieloff, P. Tsou, A. Tsuchiyama, T. Tyliszczak, B. Vekemans, L. Vincze, D. Zevin, M. E. Zolensky, >29,000 Stardust@home dusters ISPE author affiliations are at http://www.ssl.berkeley.edu/~westphal/ISPE/. In 2000 and 2002, a ~0.1 m2 array of aerogel tiles and aluminum foils onboard the Stardust spacecraft was exposed to the interstellar dust (ISD) stream for an integrated time of 200 days. The exposure took place in interplanetary space, beyond the orbit of Mars, and thus was free of the ubiquitous orbital debris in low-earth orbit that precludes effective searches for interstellar dust there. Despite the long exposure of the Stardust collector, <<100 ISD particles are expected to have been captured. The particles are thought to be ~1μm or less in size, and the total ISD collection is probably <10-6 by mass of the collection of cometary dust particles captured in the Stardust cometary dust collector from the coma of the Jupiter-family comet Wild 2. Thus, although the first solid sample from the local interstellar medium is clearly of high interest, the diminutive size of the particles and the low numbers of particles present daunting challenges. Nevertheless, six recent developments have made a Preliminary Examination (PE) of this sample practical: (1) rapid automated digital optical scanning microscopy for three

  10. A Computer-Based Dietary Counseling System.

    ERIC Educational Resources Information Center

    Slack, Warner V.; And Others

    1976-01-01

    The preliminary trial of a program in which principles of patient-computer dialogue have been applied to dietary counseling is described. The program was designed to obtain historical information from overweight patients and to provide instruction and guidance regarding dietary behavior. Beginning with a teaching sequence, 25 non-overweight…

  11. Reducing the influence of spatial resolution to improve quantitative accuracy in emission tomography: A comparison of potential strategies

    NASA Astrophysics Data System (ADS)

    Hutton, B. F.; Olsson, A.; Som, S.; Erlandsson, K.; Braun, M.

    2006-12-01

    The goal of this paper is to compare strategies for reducing partial volume effects by either minimizing the cause (i.e. improving resolution) or correcting the effect. Correction for resolution loss can be achieved either by modelling the resolution for use in iterative reconstruction or by imposing constraints based on knowledge of the underlying anatomy. Approaches to partial volume correction largely rely on knowledge of the underlying anatomy, based on well-registered high-resolution anatomical imaging modalities (CT or MRI). Corrections can be applied by considering the signal loss that results by smoothing the high-resolution modality to the same resolution as obtained in emission tomography. A physical phantom representing the central brain structures was used to evaluate the quantitative accuracy of the various strategies for either improving resolution or correcting for partial volume effects. Inclusion of resolution in the reconstruction model improved the measured contrast for the central brain structures but still underestimated the true object contrast (~0.70). Use of information on the boundaries of the structures in conjunction with a smoothing prior using maximum entropy reconstruction achieved some degree of contrast enhancement and improved the noise properties of the resulting images. Partial volume correction based on segmentation of registered anatomical images and knowledge of the reconstructed resolution permitted more accurate quantification of the target to background ratio for individual brain structures.
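    The correction idea described above, estimating the signal loss by smoothing the high-resolution anatomy to the emission system's resolution, can be sketched in one dimension. Everything here (object width, FWHM, measured contrast) is a hypothetical toy example, not the paper's phantom data.

    ```python
    import math

    def gaussian_kernel(fwhm, n=15):
        """Discrete 1-D Gaussian kernel with the scanner's point-spread
        FWHM in pixels (FWHM = 2.355 * sigma), normalized to unit sum."""
        sigma = fwhm / 2.355
        half = n // 2
        k = [math.exp(-0.5 * ((i - half) / sigma) ** 2) for i in range(n)]
        s = sum(k)
        return [v / s for v in k]

    def smooth(signal, kernel):
        """Same-length convolution with zero padding at the edges."""
        half = len(kernel) // 2
        out = []
        for i in range(len(signal)):
            acc = 0.0
            for j, w in enumerate(kernel):
                idx = i + j - half
                if 0 <= idx < len(signal):
                    acc += w * signal[idx]
            out.append(acc)
        return out

    # Hypothetical 1-D "anatomy": a unit-intensity structure 6 pixels wide,
    # smoothed to an assumed emission resolution of 8 pixels FWHM.
    mask = [0] * 12 + [1] * 6 + [0] * 12
    blurred = smooth(mask, gaussian_kernel(fwhm=8))
    recovery = max(blurred)           # recovery coefficient at structure centre
    measured = 0.7 * max(mask)        # e.g. apparent contrast from the phantom
    corrected = measured / recovery   # partial-volume-corrected estimate
    ```

    Dividing the measured value by the recovery coefficient undoes the count spill-out for a structure of that size; the paper applies the same principle in 3-D using segmented MRI/CT.
    
    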

  12. PRELIMINARY DESIGN ANALYSIS OF AXIAL FLOW TURBINES

    NASA Technical Reports Server (NTRS)

    Glassman, A. J.

    1994-01-01

    A computer program has been developed for the preliminary design analysis of axial-flow turbines. Rapid approximate generalized procedures requiring minimum input are used to provide turbine overall geometry and performance adequate for screening studies. The computations are based on mean-diameter flow properties and a stage-average velocity diagram. Gas properties are assumed constant throughout the turbine. For any given turbine, all stages, except the first, are specified to have the same shape velocity diagram. The first stage differs only in the value of inlet flow angle. The velocity diagram shape depends upon the stage work factor value and the specified type of velocity diagram. Velocity diagrams can be specified as symmetrical, zero exit swirl, or impulse; or by inputting stage swirl split. Exit turning vanes can be included in the design. The 1991 update includes a generalized velocity diagram, a more flexible meanline path, a reheat model, a radial component of velocity, and a computation of free-vortex hub and tip velocity diagrams. Also, a loss-coefficient calibration was performed to provide recommended values for airbreathing engine turbines. Input design requirements include power or pressure ratio, mass flow rate, inlet temperature and pressure, and rotative speed. The design variables include inlet and exit diameters, stator angle or exit radius ratio, and number of stages. Gas properties are input as gas constant, specific heat ratio, and viscosity. The program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, flow angles, blading angles, and last stage absolute and relative Mach numbers. This program is written in FORTRAN 77 and can be ported to any computer with a standard FORTRAN compiler which supports NAMELIST. 
It was originally developed on an IBM 7000 series computer running VM and has been implemented on IBM PC computers and compatibles running MS-DOS under Lahey FORTRAN, and
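    The mean-diameter approach described above ties stage work to the work factor and blade speed. A minimal sketch of that relationship (specific work per stage w = psi * U^2, then the whole number of equal-work stages) is given below; it is an illustration of the mean-line concept, not the program's actual routines, and the duty numbers are hypothetical.

    ```python
    import math

    def stage_specific_work(work_factor, blade_speed):
        """Mean-diameter stage specific work, w = psi * U^2 (J/kg),
        where psi is the stage work factor and U the mean blade speed."""
        return work_factor * blade_speed ** 2

    def stages_required(total_specific_work, work_factor, blade_speed):
        """Smallest whole number of equal-work stages meeting the duty."""
        return math.ceil(total_specific_work /
                         stage_specific_work(work_factor, blade_speed))

    # Hypothetical duty: 300 kJ/kg total work, psi = 1.5, U = 340 m/s.
    n = stages_required(300e3, 1.5, 340.0)
    print(n)  # → 2
    ```

    The program then sizes the velocity diagram (symmetrical, zero exit swirl, or impulse) for the chosen stage count, which this sketch does not attempt.
    
    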

  13. Computational psychiatry.

    PubMed

    Montague, P Read; Dolan, Raymond J; Friston, Karl J; Dayan, Peter

    2012-01-01

    Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects.
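    The reinforcement-learning frameworks mentioned above typically rest on a prediction-error (delta-rule) value update. A minimal sketch of that update follows; the learning rate and reward sequence are arbitrary illustrative choices, not a model from the paper.

    ```python
    def rescorla_wagner(rewards, alpha=0.1, v0=0.0):
        """Delta-rule value learning, V <- V + alpha * (r - V): the basic
        prediction-error update underlying many reinforcement-learning
        models of decision-making. Returns the value estimate after each
        trial (illustrative sketch)."""
        v, trace = v0, []
        for r in rewards:
            v += alpha * (r - v)   # alpha * reward prediction error
            trace.append(v)
        return trace

    # The value estimate converges toward the delivered reward of 1.0.
    values = rescorla_wagner([1.0] * 20)
    ```

    In computational phenotyping, parameters such as the learning rate alpha are fitted per subject and compared between health and disease.
    
    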

  14. Heterogeneous concurrent computing with exportable services

    NASA Technical Reports Server (NTRS)

    Sunderam, Vaidy

    1995-01-01

    Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data-driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experiences have demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.
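    The service/thread paradigm described above, where a worker exports a function and callers invoke it concurrently on data, can be illustrated with a toy analogue. This is not TPVM's API (TPVM is built on PVM message passing in C); it only sketches the thread-based, data-driven idea, and the service function and inputs are hypothetical.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def exported_service(chunk):
        """A hypothetical exportable service: process one data chunk."""
        return sum(x * x for x in chunk)

    # Data-driven invocation: each chunk's arrival drives one thread-based
    # service call, and results are gathered as they complete.
    chunks = [[1, 2, 3], [4, 5], [6]]
    with ThreadPoolExecutor(max_workers=3) as pool:
        results = list(pool.map(exported_service, chunks))
    print(results)  # → [14, 41, 36]
    ```

    In TPVM the analogous calls cross machine boundaries over the network rather than running in one process.
    
    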

  15. Computed Tomography

    NASA Astrophysics Data System (ADS)

    Castellano, Isabel; Geleijns, Jacob

    After its clinical introduction in 1973, computed tomography developed from an x-ray modality for axial imaging in neuroradiology into a versatile three-dimensional imaging modality for a wide range of applications in, for example, oncology, vascular radiology, cardiology, traumatology and even interventional radiology. Computed tomography is applied for diagnosis, follow-up studies and screening of healthy subpopulations with specific risk factors. This chapter provides a general introduction to computed tomography, covering a short history of computed tomography, technology, image quality, dosimetry, room shielding, quality control and quality criteria.

  16. Computer Software.

    ERIC Educational Resources Information Center

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  17. Concentrating solar collector subsystem: Preliminary design package

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Preliminary design data are presented for a concentrating solar collector including an attitude controller. Provided are schedules, technical status, all documents required for preliminary design, and other program activities.

  18. Computer News

    ERIC Educational Resources Information Center

    Science Activities: Classroom Projects and Curriculum Ideas, 2007

    2007-01-01

    This article presents several news stories about computers and technology. (1) Applied Science Associates of Narragansett, Rhode Island is providing computer modeling technology to help locate the remains of the USS Bonhomme Richard, which sank in 1779 after claiming a Revolutionary War victory. (2) Whyville, the leading edu-tainment virtual world…

  19. Cafeteria Computers.

    ERIC Educational Resources Information Center

    Dervarics, Charles

    1992-01-01

    By relying on new computer hardware and software, school food service departments can keep better records of daily food consumption, free and reduced-price meals, inventory, production, and other essentials. The most commonly used systems fall into two basic categories: point-of-sale computers and behind-the-counter systems. State funding efforts…

  20. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Discussed are three examples of computer graphics including biomorphs, Truchet tilings, and fractal popcorn. The graphics are shown and the basic algorithm using multiple iteration of a particular function or mathematical operation is described. An illustration of a snail shell created by computer graphics is presented. (YP)

  1. Computer Insecurity.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1994-01-01

    College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)

  2. Computer Graphics.

    ERIC Educational Resources Information Center

    Halpern, Jeanne W.

    1970-01-01

    Computer graphics have been called the most exciting development in computer technology. At the University of Michigan, three kinds of graphics output equipment are now being used: symbolic printers, line plotters or drafting devices, and cathode-ray tubes (CRT). Six examples are given that demonstrate the range of graphics use at the University.…

  3. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  4. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  5. Computational astrophysics

    NASA Technical Reports Server (NTRS)

    Miller, Richard H.

    1987-01-01

    Astronomy is an area of applied physics in which unusually beautiful objects challenge the imagination to explain observed phenomena in terms of known laws of physics. It is a field that has stimulated the development of physical laws and of mathematical and computational methods. Current computational applications are discussed in terms of stellar and galactic evolution, galactic dynamics, and particle motions.

  6. Computational Pathology

    PubMed Central

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  7. I, Computer

    ERIC Educational Resources Information Center

    Barack, Lauren

    2005-01-01

    What child hasn't chatted with friends through a computer? But chatting with a computer? Some Danish scientists have literally put a face on their latest software program, bringing to virtual life storyteller Hans Christian Andersen, who engages users in actual conversations. The digitized Andersen resides at the Hans Christian Andersen Museum in…

  8. Communication and Computability: The Case of Alan Mathison Turing.

    ERIC Educational Resources Information Center

    Chesebro, James W.

    1993-01-01

    Provides a preliminary examination of the relationships which exist between the disciplines of communication and computer science. Isolates the original principles which determined the development of computer science. Suggests the impact these early formative principles had, and continue to have, on the study of communication. Focuses on the seminal role…

  9. 40 CFR 158.345 - Preliminary analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Preliminary analysis. 158.345 Section... REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.345 Preliminary analysis. (a) If the product is produced by an integrated system, the applicant must provide a preliminary analysis of each technical grade...

  10. 40 CFR 161.170 - Preliminary analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Preliminary analysis. 161.170 Section... Preliminary analysis. (a) If the product is produced by an integrated system, the applicant must provide a preliminary analysis of each technical grade of active ingredient contained in the product to identify...

  11. 40 CFR 161.170 - Preliminary analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Preliminary analysis. 161.170 Section... Preliminary analysis. (a) If the product is produced by an integrated system, the applicant must provide a preliminary analysis of each technical grade of active ingredient contained in the product to identify...

  12. Preliminary evaluation of an automated metaphase finder

    SciTech Connect

    McFee, A.F.; Littlefield, L.G.

    1994-12-31

    Computer-driven microscope units are available for the automated analysis of cytogenetic preparations. Their greatest benefit is the speed with which they can scan slides, locate metaphases, and display them for operator evaluation. We have performed a preliminary evaluation of the AKS-500 automated metaphase finding system (Imagenetics, Inc., Framingham, MA) to determine if it significantly improves the speed and/or efficiency with which metaphase figures can be located and evaluated. This unit accepts 8 microscope slides on a motorized stage and, following a 15-20 minute loading process, proceeds unattended to systematically scan all or a set portion of the slides, identify metaphase figures, and store their coordinate locations in memory. On command, metaphases are recalled and presented in the microscope field for evaluation, either in the order in which they were encountered during the search, or after a quality sort by the computer. Aberrations were scored in bone marrow metaphases from DMBA-treated mice which had been located by the machine, and compared to manual scoring of the same slides. Aberration rates did not differ when metaphases were scored in search order, but quality evaluation by the computer discriminated against heavily damaged metaphases. The time required to score 50 metaphases/slide was ~30 min, compared with ~3 hr for manual scoring of the same slides. On slides from human lymphocyte cultures, the machine located only about half as many metaphases as were identified in systematic manual searches of the slides, but the samples contained a higher proportion of metaphases of scorable quality. A procedure has also been developed whereby Giemsa-stained metaphases can be located, destained, and rapidly relocated after chromosome painting by FISH techniques.

  13. Finite element analyses of CCAT preliminary design

    NASA Astrophysics Data System (ADS)

    Sarawit, Andrew T.; Kan, Frank W.

    2014-07-01

    This paper describes the development of the CCAT telescope finite element model (FEM) and the analyses performed to support the preliminary design work. CCAT will be a 25 m diameter telescope operating in the 0.2 to 2 mm wavelength range. It will be located at an elevation of 5600 m on Cerro Chajnantor in Northern Chile, near ALMA. The telescope will be equipped with wide-field cameras and spectrometers mounted at the two Nasmyth foci. The telescope will be inside an enclosure to protect it from wind buffeting, direct solar heating, and bad weather. The main structures of the telescope include a steel Mount and a carbon-fiber-reinforced-plastic (CFRP) primary truss. The finite element model developed in this study was used to perform modal, frequency response, seismic response spectrum, stress, and deflection analyses of the telescope. Modal analyses of the telescope were performed to compute the structure's natural frequencies and mode shapes and to obtain reduced-order modal output at selected locations in the telescope structure to support the design of the Mount control system. Modal frequency response analyses were also performed to compute transfer functions at these selected locations. Seismic response spectrum analyses of the telescope subject to the Maximum Likely Earthquake were performed to compute peak accelerations and seismic demand stresses. Stress analyses were performed for gravity load to obtain gravity demand stresses. Deflection analyses for gravity load, thermal load, and differential elevation drive torque were performed so that the CCAT Observatory can verify that the structures meet the stringent telescope surface and pointing error requirements.
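    At their core, the modal analyses described above solve the generalized eigenvalue problem det(K - w^2 M) = 0 for the structure's stiffness and mass matrices. A toy two-degree-of-freedom version of that problem, solvable in closed form, is sketched below; the mass and stiffness values are hypothetical and bear no relation to CCAT's.

    ```python
    import math

    def modal_frequencies_2dof(m1, m2, k1, k2):
        """Natural frequencies (Hz) of a 2-DOF spring-mass chain
        (ground-k1-m1-k2-m2) from det(K - w^2 M) = 0: a tiny stand-in for
        the eigenvalue problem a telescope FEM solves at vastly larger scale."""
        # Characteristic polynomial a*w^4 + b*w^2 + c = 0 for lumped masses
        a = m1 * m2
        b = -(m1 * k2 + m2 * (k1 + k2))
        c = k1 * k2
        disc = math.sqrt(b * b - 4 * a * c)
        w2 = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
        return [math.sqrt(w) / (2 * math.pi) for w in w2]

    # Hypothetical values: masses in kg, stiffnesses in N/m.
    f1, f2 = modal_frequencies_2dof(m1=1000.0, m2=500.0, k1=4e6, k2=2e6)
    ```

    A real FEM assembles K and M with millions of degrees of freedom and extracts only the lowest modes numerically, but the underlying eigenvalue statement is the same.
    
    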

  14. Computer science concept inventories: past and future

    NASA Astrophysics Data System (ADS)

    Taylor, C.; Zingaro, D.; Porter, L.; Webb, K. C.; Lee, C. B.; Clancy, M.

    2014-10-01

    Concept Inventories (CIs) are assessments designed to measure student learning of core concepts. CIs have become well known for their major impact on pedagogical techniques in other sciences, especially physics. Presently, there are no widely used, validated CIs for computer science. However, considerable groundwork has been performed in the form of identifying core concepts, analyzing student misconceptions, and developing CI assessment questions. Although much of the work has been focused on CS1 and a CI has been developed for digital logic, some preliminary work on CIs is underway for other courses. This literature review examines CI work in other STEM disciplines, discusses the preliminary development of CIs in computer science, and outlines related research in computer science education that contributes to CI development.

  15. Mobile Computing for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Swietek, Gregory E. (Technical Monitor)

    1994-01-01

    The use of commercial computer technology in specific aerospace mission applications can reduce the cost and project cycle time required for the development of special-purpose computer systems. Additionally, the pace of technological innovation in the commercial market has made new computer capabilities available for demonstrations and flight tests. Three areas of research and development being explored by the Portable Computer Technology Project at NASA Ames Research Center are the application of commercial client/server network computing solutions to crew support and payload operations, the analysis of requirements for portable computing devices, and testing of wireless data communication links as extensions to the wired network. This paper will present computer architectural solutions to portable workstation design including the use of standard interfaces, advanced flat-panel displays and network configurations incorporating both wired and wireless transmission media. It will describe the design tradeoffs used in selecting high-performance processors and memories, interfaces for communication and peripheral control, and high resolution displays. The packaging issues for safe and reliable operation aboard spacecraft and aircraft are presented. The current status of wireless data links for portable computers is discussed from a system design perspective. An end-to-end data flow model for payload science operations from the experiment flight rack to the principal investigator is analyzed using capabilities provided by the new generation of computer products. A future flight experiment on-board the Russian MIR space station will be described in detail including system configuration and function, the characteristics of the spacecraft operating environment, the flight qualification measures needed for safety review, and the specifications of the computing devices to be used in the experiment. The software architecture chosen shall be presented. An analysis of the

  16. Preliminary design studies for the DESCARTES and CIDER codes

    SciTech Connect

    Eslinger, P.W.; Miley, T.B.; Ouderkirk, S.J.; Nichols, W.E.

    1992-12-01

    The Hanford Environmental Dose Reconstruction (HEDR) project is developing several computer codes to model the release and transport of radionuclides into the environment. This preliminary design addresses two of these codes: Dynamic Estimates of Concentrations and Radionuclides in Terrestrial Environments (DESCARTES) and Calculation of Individual Doses from Environmental Radionuclides (CIDER). The DESCARTES code will be used to estimate the concentration of radionuclides in environmental pathways, given the output of the air transport code HATCHET. The CIDER code will use information provided by DESCARTES to estimate the dose received by an individual. This document reports on preliminary design work performed by the code development team to determine if the requirements could be met for DESCARTES and CIDER. The document contains three major sections: (i) a data flow diagram and discussion for DESCARTES, (ii) a data flow diagram and discussion for CIDER, and (iii) a series of brief statements regarding the design approach required to address each code requirement.

  17. A preliminary design theory for polyphase impellers in unbounded flow

    NASA Astrophysics Data System (ADS)

    Yim, B.

    1982-01-01

    The main role of preliminary design for supercavitating propellers is to supply the basic data for the final design, such as: the hydrodynamic pitch angle, the radial load distributions, the approximate cavity length and the distribution of cavity source strengths which will help determine the three dimensional cavity source distribution. For this purpose, the effective use of supercavitating cascade theory with lifting line theory is discussed together with influences of neighboring cavities on cavity drag, the hydrodynamic pitch angle, inflow retardation and the optimum pitch distribution of the propeller. The computer program developed is applied to several existing propeller models. The results show that propeller efficiency is predicted well but pitch distribution is a little larger than for the model. The results are analyzed and compared with the results of a lifting surface design method which was developed for use with the preliminary design method.

  18. Intelligent redundant actuation system requirements and preliminary system design

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Geiger, L. J.; Harris, J.

    1985-01-01

    Several redundant actuation system configurations were designed and demonstrated to satisfy the stringent operational requirements of advanced flight control systems. However, this has been accomplished largely through brute force hardware redundancy, resulting in significantly increased computational requirements on the flight control computers which perform the failure analysis and reconfiguration management. Modern technology now provides powerful, low-cost microprocessors which are effective in performing failure isolation and configuration management at the local actuator level. One such concept, called an Intelligent Redundant Actuation System (IRAS), significantly reduces the flight control computer requirements and performs the local tasks more comprehensively than previously feasible. The requirements and preliminary design of an experimental laboratory system capable of demonstrating the concept and sufficiently flexible to explore a variety of configurations are discussed.

  19. Personal Computers.

    ERIC Educational Resources Information Center

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  20. Sort computation

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    1988-01-01

    Sorting has long been used to organize data in preparation for further computation, but sort computation allows some types of computation to be performed during the sort. Sort aggregation and sort distribution are the two basic forms of sort computation. Sort aggregation generates an accumulative or aggregate result for each group of records and places this result in one of the records. An aggregate operation can be any operation that is both associative and commutative, i.e., any operation whose result does not depend on the order of the operands or the order in which the operations are performed. Sort distribution copies the value from a field of a specific record in a group into that field in every record of that group.
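    Both forms of sort computation can be illustrated in a few lines. The sketch below is a sequential Python rendering of the idea (the record layout and field names are invented here), not the parallel formulation of the original work:

```python
from itertools import groupby
from operator import itemgetter

# Invented records: "key" defines the group, "val" is the field of interest.
records = [
    {"key": "b", "val": 3}, {"key": "a", "val": 1},
    {"key": "b", "val": 4}, {"key": "a", "val": 2},
]
records.sort(key=itemgetter("key"))

# Sort aggregation: fold an associative, commutative operation (addition here)
# over each group and store the aggregate in every record of the group.
for _, grp in groupby(records, key=itemgetter("key")):
    grp = list(grp)
    total = sum(r["val"] for r in grp)
    for r in grp:
        r["total"] = total

# Sort distribution: copy a field's value from one designated record of the
# group (the first, here) into that field of every record in the group.
for _, grp in groupby(records, key=itemgetter("key")):
    grp = list(grp)
    for r in grp:
        r["lead_val"] = grp[0]["val"]
```

    In the sort-computation setting these group-wise folds and broadcasts happen during the sort itself rather than as separate passes afterward.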

  1. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  2. Advanced computing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Advanced concepts in hardware, software and algorithms are being pursued for application in next generation space computers and for ground based analysis of space data. The research program focuses on massively parallel computation and neural networks, as well as optical processing and optical networking which are discussed under photonics. Also included are theoretical programs in neural and nonlinear science, and device development for magnetic and ferroelectric memories.

  3. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  4. Chromatin Computation

    PubMed Central

    Bryant, Barbara

    2012-01-01

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109
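    A toy rendering of such a read-write rule system: a rule reads the marks on a fixed-width window of adjacent nucleosomes and writes a mark at a position within that window. The specific rule below (a mark spreading one nucleosome rightward per pass) is an invented example for illustration, not one of the rules from the paper:

```python
def apply_rule(string, rule):
    """Apply one read-write rule to every width-sized window of adjacent
    nucleosomes; all reads happen before any writes are committed."""
    width = rule["width"]
    writes = []
    for i in range(len(string) - width + 1):
        window = string[i:i + width]
        if rule["reads"](window):
            writes.append(i + rule["write_at"])
    for pos in writes:
        string[pos].add(rule["mark"])
    return string

# Invented rule: spread mark "M" one nucleosome to the right per pass.
spread_right = {
    "width": 2,                                        # 2 adjacent nucleosomes
    "reads": lambda w: "M" in w[0] and "M" not in w[1],
    "write_at": 1,                                     # write to the right neighbor
    "mark": "M",
}

nucleosomes = [{"M"}, set(), set(), set()]             # a 4-nucleosome string
apply_rule(nucleosomes, spread_right)                  # mark moves one step right
```

    Repeated application propagates the mark along the string, a simple analogue of how chains of such rules can carry out computation on chromatin.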

  5. Space Elevators Preliminary Architectural View

    NASA Astrophysics Data System (ADS)

    Pullum, L.; Swan, P. A.

    Space Systems Architecture has been expanded into a process by the US Department of Defense for their large scale systems of systems development programs. This paper uses the steps in the process to establish a framework for Space Elevator systems to be developed and to provide a methodology to manage complexity. This new approach to developing a family of systems is based upon three architectural views: Operational View (OV), Systems View (SV), and Technical Standards View (TV). The top level view of the process establishes the stages for the development of the first Space Elevator and is called Architectural View - 1, Overview and Summary. This paper will show the guidelines and steps of the process while focusing upon components of the Space Elevator Preliminary Architecture View. This Preliminary Architecture View is presented as a draft starting point for the Space Elevator Project.

  6. Computational gestalts and perception thresholds.

    PubMed

    Desolneux, Agnès; Moisan, Lionel; Morel, Jean-Michel

    2003-01-01

    In 1923, Max Wertheimer proposed a research programme and method in visual perception. He conjectured the existence of a small set of geometric grouping laws governing the perceptual synthesis of phenomenal objects, or "gestalt" from the atomic retina input. In this paper, we review this set of geometric grouping laws, using the works of Metzger, Kanizsa and their schools. In continuation, we explain why the Gestalt theory research programme can be translated into a Computer Vision programme. This translation is not straightforward, since Gestalt theory never addressed two fundamental matters: image sampling and image information measurements. Using these advances, we shall show that gestalt grouping laws can be translated into quantitative laws allowing the automatic computation of gestalts in digital images. From the psychophysical viewpoint, a main issue is raised: the computer vision gestalt detection methods deliver predictable perception thresholds. Thus, we are set in a position where we can build artificial images and check whether some kind of agreement can be found between the computationally predicted thresholds and the psychophysical ones. We describe and discuss two preliminary sets of experiments, where we compared the gestalt detection performance of several subjects with the predictable detection curve. In our opinion, the results of this experimental comparison support the idea of a much more systematic interaction between computational predictions in Computer Vision and psychophysical experiments. PMID:14766147
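    The "predictable perception thresholds" in this line of work come from an a-contrario computation: a grouping is detected when its expected number of chance occurrences (number of tests times a binomial tail probability) falls below ε. A minimal sketch of that threshold computation (the parameter values in the test case are illustrative, not taken from the paper):

```python
from math import comb

def binomial_tail(n, k, p):
    """P[X >= k] for X ~ Binomial(n, p)."""
    return sum(comb(n, j) * p**j * (1.0 - p) ** (n - j) for j in range(k, n + 1))

def detection_threshold(n, p, n_tests, eps=1.0):
    """Smallest count k that is 'eps-meaningful', i.e. the expected number of
    chance occurrences n_tests * P[X >= k] does not exceed eps."""
    for k in range(n + 1):
        if n_tests * binomial_tail(n, k, p) <= eps:
            return k
    return n + 1
```

    For alignment detection, for instance, n would be the number of points on a candidate segment, p the angular precision, and n_tests the number of candidate segments; the returned count is the detection threshold the psychophysical experiments are compared against.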

  8. Preliminary Parallaxes for Cool Subdwarfs

    NASA Astrophysics Data System (ADS)

    Dahn, Conard C.; Harris, Hugh C.

    2015-01-01

    Preliminary USNO CCD parallaxes are employed to locate 13 subdwarfs or subdwarf candidates with M_{v} > 14.0 in the M_{v} vs V-I, M_{K_{s}} vs I-K_{s}, and/or M_{K_{s}} vs J-K_{s} absolute magnitude versus color diagrams. First parallax determinations are presented for the ultracool subdwarfs LEHPM2-59, LSR0822+17, LHS2100, and 2M1227-04.

  9. Computational structures for robotic computations

    NASA Technical Reports Server (NTRS)

    Lee, C. S. G.; Chang, P. R.

    1987-01-01

    The computational problem of inverse kinematics and inverse dynamics of robot manipulators by taking advantage of parallelism and pipelining architectures is discussed. For the computation of the inverse kinematic position solution, a maximum pipelined CORDIC architecture has been designed based on a functional decomposition of the closed-form joint equations. For the inverse dynamics computation, an efficient p-fold parallel algorithm to overcome the recurrence problem of the Newton-Euler equations of motion to achieve the time lower bound of O(log_2 n) has also been developed.

  10. Computational Electromagnetic Modeling of SansEC(Trade Mark) Sensors

    NASA Technical Reports Server (NTRS)

    Smith, Laura J.; Dudley, Kenneth L.; Szatkowski, George N.

    2011-01-01

    This paper describes the preliminary effort to apply computational design tools to aid in the development of an electromagnetic SansEC resonant sensor composite materials damage detection system. The computational methods and models employed on this research problem will evolve in complexity over time and will lead to the development of new computational methods and experimental sensor systems that demonstrate the capability to detect, diagnose, and monitor the damage of composite materials and structures on aerospace vehicles.

  11. [DNA computing].

    PubMed

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers can be an alternative to traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (imposed by the Heisenberg uncertainty principle) and by the growing flow of information between the central processing unit and the main memory (the von Neumann bottleneck). The idea of DNA computing came true for the first time in 1994, when Adleman solved the Hamiltonian path problem using short DNA oligomers and DNA ligase. In the early 2000s a series of biocomputer models was presented, including the seminal work of Shapiro and his colleagues, who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as software as well as input/output signals. DNA molecules also provided the energy for this machine. DNA computing can be exploited in many applications, from studies of gene expression patterns to the diagnosis and therapy of cancer. Research on DNA computing is still in progress both in vitro and in vivo, and the promising results of this research give hope for a breakthrough in computer science. PMID:21735816

  12. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  13. Orbit Determination with Very Short Arcs: Preliminary Orbits and Identifications

    NASA Astrophysics Data System (ADS)

    Milani, A.; Gronchi, G. F.; Knezevic, Z.; Sansaturio, M. E.

    2004-05-01

    When the observations of a new asteroid are not enough to compute an orbit, we can represent them with an attributable (two angles and their time derivatives). The undetermined range and range rate span an admissible region of solar system orbits, which can be represented by a set of Virtual Asteroids (VAs) selected by an optimal triangulation (see the presentation by G. Gronchi). The four coordinates of the attributable are the result of a fit and have a covariance matrix. Thus the predictions of future observations have a quasi-product structure (admissible region times confidence ellipsoid), approximated by a triangulation with a confidence ellipsoid for each node. If we have more than two observations, we can also estimate the geodetic curvature and the acceleration of the observed path on the celestial sphere. If both are significantly measured, they constrain the range and the range rate and may allow us to reduce the size of the admissible region. To compute a preliminary orbit starting from two attributables, for each VA (selected in the admissible region of the first arc) we consider the prediction at the time of the second and its covariance matrix, and we compare them with the attributable of the second arc with its covariance. By using the identification penalty (as in the algorithms for orbit identification) we can select as a preliminary orbit the VA which fits both arcs in the 8-dimensional space. Two attributables may not be enough to compute an orbit with convergent differential corrections. The preliminary orbit is used in a constrained differential correction, providing solutions along the Line Of Variations, to be used as second generation VAs to predict the observations at the time of a third arc. In general the identification with a third arc ensures a well determined orbit.
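    The identification-penalty comparison of a predicted and an observed attributable is, in essence, a covariance-weighted squared distance; a minimal sketch, with invented 4-vectors and covariances standing in for real astrometric fits:

```python
import numpy as np

def identification_penalty(pred, pred_cov, obs, obs_cov):
    """Covariance-weighted squared distance between a predicted attributable
    (from a virtual asteroid) and an observed one; lower means a better match."""
    diff = np.asarray(pred, float) - np.asarray(obs, float)
    combined = np.asarray(pred_cov, float) + np.asarray(obs_cov, float)
    return float(diff @ np.linalg.solve(combined, diff))

# Toy example: two virtual asteroids predicting the second arc's attributable
# (two angles and their time derivatives); all numbers are invented.
obs = [1.00, 0.50, 0.010, -0.005]
cov = np.eye(4) * 0.01
va_predictions = [[1.02, 0.50, 0.010, -0.005],   # close to the observation
                  [1.40, 0.30, 0.020,  0.000]]   # far from it
penalties = [identification_penalty(p, cov, obs, cov) for p in va_predictions]
best_va = int(np.argmin(penalties))              # VA selected as preliminary orbit
```

    Selecting the VA with the smallest penalty is the sketch's analogue of choosing the preliminary orbit that fits both arcs in the 8-dimensional space.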

  14. Computational Psychiatry

    PubMed Central

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  15. Computational mechanics

    SciTech Connect

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  16. GRIMD: distributed computing for chemists and biologists

    PubMed Central

    Piotto, Stefano; Biasi, Luigi Di; Concilio, Simona; Castiglione, Aniello; Cattaneo, Giuseppe

    2014-01-01

    Motivation: Biologists and chemists face problems of high computational complexity that require the use of several computers organized in clusters or in specialized grids. Examples of such problems can be found in molecular dynamics (MD), in silico screening, and genome analysis. Grid computing and cloud computing are becoming prevalent mainly because of their competitive performance/cost ratio. Regrettably, the diffusion of grid computing is strongly limited by two main factors: it is confined to scientists with a strong computer science background, and the analysis of the large amount of data produced can be cumbersome. We have developed a package named GRIMD to provide an easy and flexible implementation of distributed computing for the bioinformatics community. GRIMD is very easy to install and maintain, and it does not require any specific computer science skill. Moreover, it permits preliminary analysis on the distributed machines to reduce the amount of data to transfer. GRIMD is very flexible because it shields the typical computational biologist from the need to write specific code for tasks such as molecular dynamics or docking calculations. Furthermore, it permits efficient use of GPU cards whenever possible. GRIMD calculations scale almost linearly and therefore exploit each machine in the network efficiently. Here, we provide a few examples of grid computing in computational biology (MD and docking) and bioinformatics (proteome analysis). Availability: GRIMD is available for free for noncommercial research at www.yadamp.unisa.it/grimd Supplementary information: www.yadamp.unisa.it/grimd/howto.aspx PMID:24516326

  17. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  18. LHC Computing

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  19. Computational vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1981-01-01

    The range of fundamental computational principles underlying human vision that equally apply to artificial and natural systems is surveyed. There emerges from research a view of the structuring of vision systems as a sequence of levels of representation, with the initial levels being primarily iconic (edges, regions, gradients) and the highest symbolic (surfaces, objects, scenes). Intermediate levels are constrained by information made available by preceding levels and information required by subsequent levels. In particular, it appears that physical and three-dimensional surface characteristics provide a critical transition from iconic to symbolic representations. A plausible vision system design incorporating these principles is outlined, and its key computational processes are elaborated.

  20. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  1. Preliminary estimates of operating costs for lighter than air transports

    NASA Technical Reports Server (NTRS)

    Smith, C. L.; Ardema, M. D.

    1975-01-01

    A preliminary set of operating cost relationships are presented for airship transports. The starting point for the development of the relationships is the direct operating cost formulae and the indirect operating cost categories commonly used for estimating costs of heavier than air commercial transports. Modifications are made to the relationships to account for the unique features of airships. To illustrate the cost estimating method, the operating costs of selected airship cargo transports are computed. Conventional fully buoyant and hybrid semi-buoyant systems are investigated for a variety of speeds, payloads, ranges, and altitudes. Comparisons are made with aircraft transports for a range of cargo densities.

  2. Preliminary estimates of operating costs for lighter than air transports

    NASA Technical Reports Server (NTRS)

    Smith, C. L.; Ardema, M. D.

    1975-01-01

    Presented is a preliminary set of operating cost relationships for airship transports. The starting point for the development of the relationships is the direct operating cost formulae and the indirect operating cost categories commonly used for estimating costs of heavier than air commercial transports. Modifications are made to the relationships to account for the unique features of airships. To illustrate the cost estimating method, the operating costs of selected airship cargo transports are computed. Conventional fully buoyant and hybrid semi-buoyant systems are investigated for a variety of speeds, payloads, ranges, and altitudes. Comparisons are made with aircraft transports for a range of cargo densities.

  3. A preliminary experiment definition for video landmark acquisition and tracking

    NASA Technical Reports Server (NTRS)

    Schappell, R. T.; Tietz, J. C.; Hulstrom, R. L.; Cunningham, R. A.; Reel, G. M.

    1976-01-01

    Six scientific objectives/experiments were derived which consisted of agriculture/forestry/range resources, land use, geology/mineral resources, water resources, marine resources and environmental surveys. Computer calculations were then made of the spectral radiance signature of each of 25 candidate targets as seen by a satellite sensor system. An imaging system capable of recognizing, acquiring and tracking specific generic type surface features was defined. A preliminary experiment definition and design of a video Landmark Acquisition and Tracking system is given. This device will search a 10-mile swath while orbiting the earth, looking for land/water interfaces such as coastlines and rivers.

  4. Prostate elastography: preliminary in vivo results

    NASA Astrophysics Data System (ADS)

    Alam, S. K.; Feleppa, E. J.; Kalisz, A.; Ramchandran, S.; Ennis, R. D.; Lizzi, Frederick L.; Wuu, C.-S.; Ketterling, Jeffrey A.

    2005-04-01

    We report preliminary results from our investigation of in vivo prostate elastography. Fewer than 50% of all prostate cancers are typically visible in current clinical imaging modalities. Elastography displays a map of strain that results when tissue is externally compressed. Thus, elastography is ideal for imaging prostate cancers because they are generally stiffer than the surrounding tissue and stiffer regions usually exhibit lower strain in elastograms. In our study, digital radio-frequency (RF) ultrasound echo data were acquired from prostate-cancer patients undergoing brachytherapy. Seed placement is guided by a transrectal ultrasound (TRUS) probe, which is held in a mechanical fixture. The probe can be moved in XYZ directions and tilted. The probe face, in contact with the rectal wall, is used to apply a compression force to the immediately adjacent prostate. We also used a water-filled (acoustic) coupling balloon to compress the prostate by increasing the water volume inside the balloon. In each scan plane (transverse), we acquired RF data from successive scans at the scanner frame rate as the deformation force on the rectal wall was continuously increased. We computed strain using 1D RF cross-correlation analysis. The compression method based on fixture displacement produced low-noise elastograms that beautifully displayed the prostate architecture and emphasized stiff areas. Balloon-based compression also produced low-noise elastograms. Initial results demonstrate that elastography may be useful in the detection and evaluation of prostate cancers, occult in conventional imaging modalities.
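
    The 1D RF cross-correlation strain estimation mentioned above can be sketched as follows. The window sizes, the integer-lag search, and the synthetic RF signal are illustrative simplifications, not the authors' implementation; displacement is estimated per window and strain is its slope with depth.

```python
import numpy as np

def axial_displacement(pre, post, win=64, step=32, max_lag=60):
    """Estimate depth-dependent axial displacement (in samples) between
    pre- and post-compression RF lines by windowed cross-correlation."""
    starts, lags = [], []
    for s in range(0, len(pre) - win - max_lag, step):
        ref = pre[s:s + win]
        best_lag, best_cc = 0, -np.inf
        for lag in range(max_lag + 1):          # integer-lag search only
            seg = post[s + lag:s + lag + win]
            cc = ref @ seg / (np.linalg.norm(ref) * np.linalg.norm(seg) + 1e-12)
            if cc > best_cc:
                best_cc, best_lag = cc, lag
        starts.append(s)
        lags.append(best_lag)
    return np.array(starts), np.array(lags)

# Synthetic check: a random scatterer signal under uniform 1% compression
rng = np.random.default_rng(0)
z = np.arange(4096, dtype=float)
rf_pre = rng.standard_normal(4096)
true_strain = 0.01
rf_post = np.interp(z * (1.0 - true_strain), z, rf_pre)  # scatterers shift toward the probe

starts, lags = axial_displacement(rf_pre, rf_post)
strain_est = np.polyfit(starts, lags, 1)[0]   # strain = slope of displacement vs depth
```

    A clinical implementation would additionally use sub-sample interpolation of the correlation peak and 2D tracking, but the principle — displacement from correlation, strain from its gradient — is the same.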

  5. Computer Routing.

    ERIC Educational Resources Information Center

    Malone, Roger

    1991-01-01

    Computerized bus-routing systems plot the most efficient routes, cut the time it takes to draw routes, and generate reports quickly and accurately. However, school districts often underestimate the amount of work necessary to get information into the computer database. (MLF)

  6. Computer Corner.

    ERIC Educational Resources Information Center

    Mason, Margie

    1985-01-01

    This article: describes how to prevent pins on game paddles from breaking; suggests using needlepoint books for ideas to design computer graphics; lists a BASIC program to create a Christmas tree, with extension activities; suggests a LOGO Christmas activity; and describes a book on the development of microcomputers. (JN)

  7. Business Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    A brief definition of some fundamentals of microcomputers and of the ways they may be used in small businesses can help potential buyers make informed purchases. Hardware (the mechanical devices from which computers are made) described here are the video display, keyboard, central processing unit, "random access" and "read only" memories, cassette…

  8. Computational trigonometry

    SciTech Connect

    Gustafson, K.

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
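
    For a symmetric positive definite matrix A, Gustafson's first antieigenvalue is the minimum of <Ax, x>/(||Ax|| ||x||) over nonzero x, i.e. the cosine of the largest angle through which A turns a vector, and it has the closed form 2*sqrt(l_min*l_max)/(l_min + l_max). A minimal numerical check (not code from the talk):

```python
import numpy as np

def first_antieigenvalue(A):
    """Closed-form first antieigenvalue of an SPD matrix A:
    mu_1 = 2*sqrt(l_min*l_max) / (l_min + l_max)."""
    w = np.linalg.eigvalsh(A)
    return 2.0 * np.sqrt(w[0] * w[-1]) / (w[0] + w[-1])

def turning_cosine(A, x):
    """Cosine of the angle between x and Ax: <Ax, x> / (||Ax|| ||x||)."""
    Ax = A @ x
    return float(x @ Ax / (np.linalg.norm(Ax) * np.linalg.norm(x)))

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5.0 * np.eye(5)                 # symmetric positive definite
w, V = np.linalg.eigh(A)

# The minimizing vector lies in the span of the extreme eigenvectors,
# so a 1-D search over that plane recovers the antieigenvalue numerically.
thetas = np.linspace(0.0, np.pi / 2.0, 4001)
numeric = min(turning_cosine(A, np.cos(t) * V[:, 0] + np.sin(t) * V[:, -1])
              for t in thetas)
```

    This "maximal turning angle" is exactly the trigonometric quantity that governs the convergence bounds for gradient-type iterations in the theory cited above.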

  9. Computational Physics.

    ERIC Educational Resources Information Center

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the future. (TW)

  10. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)
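
    The "critical event technique" mentioned above advances the simulation directly from one collision to the next instead of stepping time uniformly. A stripped-down sketch of the idea (one ball and a heavy free piston in 1D under gravity — not the BOUNCE program itself) is below; because the collisions are elastic, exact energy conservation makes a good consistency check.

```python
import math

def simulate(n_events, m=1.0, M=10.0, g=9.81, y=1.0, v=0.0, Y=3.0, V=0.0):
    """Event-driven 1D simulation: a ball (mass m) bounces between the
    floor and a free piston (mass M); both are in free fall between events."""
    for _ in range(n_events):
        # Time to hit the floor: y + v*t - g*t^2/2 = 0, positive root.
        t_floor = (v + math.sqrt(v * v + 2.0 * g * y)) / g
        # Ball and piston share acceleration -g, so their gap closes linearly.
        t_piston = (Y - y) / (v - V) if v > V else math.inf
        t = min(t_floor, t_piston)
        # Advance both bodies analytically to the event time.
        y += v * t - 0.5 * g * t * t
        Y += V * t - 0.5 * g * t * t
        v -= g * t
        V -= g * t
        if t_floor <= t_piston:
            v = -v                               # elastic floor bounce
        else:                                    # elastic ball-piston collision
            v, V = (((m - M) * v + 2.0 * M * V) / (m + M),
                    ((M - m) * V + 2.0 * m * v) / (m + M))
        y = max(y, 0.0)                          # guard tiny negative round-off
    return y, v, Y, V

def energy(m, M, g, y, v, Y, V):
    """Total mechanical energy, conserved across elastic events."""
    return 0.5 * m * v * v + m * g * y + 0.5 * M * V * V + M * g * Y
```

    With many balls instead of one, the piston settles around the height where its weight balances the mean momentum flux from impacts, which is the ideal-gas behavior the article describes.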

  11. Networking computers.

    PubMed

    McBride, D C

    1997-03-01

    This decade the role of the personal computer has shifted dramatically from a desktop device designed to increase individual productivity and efficiency to an instrument of communication linking people and machines in different places with one another. A computer in one city can communicate with another that may be thousands of miles away. Networking is how this is accomplished. Just like the voice network used by the telephone, computer networks transmit data and other information via modems over these same telephone lines. A network can be created over both short and long distances. Networks can be established within a hospital or medical building or over many hospitals or buildings covering many geographic areas. Those confined to one location are called LANs, local area networks. Those that link computers in one building to those at other locations are known as WANs, or wide area networks. The ultimate wide area network is the one we've all been hearing so much about these days--the Internet, and its World Wide Web. Setting up a network is a process that requires careful planning and commitment. To avoid potential pitfalls and to make certain the network you establish meets your needs today and several years down the road, several steps need to be followed. This article reviews the initial steps involved in getting ready to network.
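
    As a minimal illustration of two programs exchanging data over a network (here over the local loopback interface rather than a modem line), the standard sockets API can be used. This example is generic Python, not tied to the article.

```python
import socket
import threading

def echo_server(sock):
    """Accept one connection and echo whatever the client sends."""
    conn, _addr = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

# A listening socket on an ephemeral loopback port (a one-machine "LAN").
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

# The "remote" computer: connect, transmit, and read the reply.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, network")
    reply = client.recv(1024)
server.close()
```

    The same code works unchanged across a LAN or WAN once the loopback address is replaced by a reachable host name, which is the planning step the article emphasizes.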

  12. Computational Estimation

    ERIC Educational Resources Information Center

    Fung, Maria G.; Latulippe, Christine L.

    2010-01-01

    Elementary school teachers are responsible for constructing the foundation of number sense in youngsters, and so it is recommended that teacher-training programs include an emphasis on number sense to ensure the development of dynamic, productive computation and estimation skills in students. To better prepare preservice elementary school teachers…

  13. Computational Musicology.

    ERIC Educational Resources Information Center

    Bel, Bernard; Vecchione, Bernard

    1993-01-01

    Asserts that a revolution has been occurring in musicology since the 1970s. Contends that music has changed from being only a source of emotion to appearing more open to science and techniques based on computer technology. Describes recent research and other writings about the topic and provides an extensive bibliography. (CFR)

  14. The Effects of Integrating Service Learning into Computer Science: An Inter-Institutional Longitudinal Study

    ERIC Educational Resources Information Center

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-01-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…

  15. Preliminary ECLSS waste water model

    NASA Technical Reports Server (NTRS)

    Carter, Donald L.; Holder, Donald W., Jr.; Alexander, Kevin; Shaw, R. G.; Hayase, John K.

    1991-01-01

    A preliminary waste water model for input to the Space Station Freedom (SSF) Environmental Control and Life Support System (ECLSS) Water Processor (WP) has been generated for design purposes. Data have been compiled from various ECLSS tests and flight sample analyses. A discussion of the characterization of the waste streams comprising the model is presented, along with a discussion of the waste water model and the rationale for the inclusion of contaminants in their respective concentrations. The major objective is to establish a methodology for the development of a waste water model and to present the current state of that model.

  16. Dielectric cure monitoring: Preliminary studies

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Semmel, M. L.

    1984-01-01

    Preliminary studies have been conducted on two types of dielectric cure monitoring systems employing both epoxy resins and phenolic composites. An Audrey System was used for 23 cure monitoring runs with very limited success. Nine complete cure monitoring runs have been investigated using a Micromet System. Two additional measurements were performed to investigate the Micromet's sensitivity to water absorption in a post-cure carbon-phenolic material. While further work is needed to determine data significance, the Micromet system appears to show promise as a feedback control device during processing.

  17. Preliminary considerations concerning actinide solubilities

    SciTech Connect

    Newton, T.W.; Bayhurst, B.P.; Daniels, W.R.; Erdal, B.R.; Ogard, A.E.

    1980-01-01

    Work at the Los Alamos Scientific Laboratory on the fundamental solution chemistry of the actinides has thus far been confined to preliminary considerations of the problems involved in developing an understanding of the precipitation and dissolution behavior of actinide compounds under environmental conditions. Attempts have been made to calculate solubility as a function of Eh and pH using the appropriate thermodynamic data; results have been presented in terms of contour maps showing lines of constant solubility as a function of Eh and pH. Possible methods of controlling the redox potential of rock-groundwater systems by the use of Eh buffers (redox couples) are presented.

  18. A preliminary optical visibility model

    NASA Technical Reports Server (NTRS)

    Cowles, K.; Levine, B. M.

    1994-01-01

    A model is being created to describe the effect of weather on optical communications links between space and ground sites. This article describes the process by which the model is developed and gives preliminary results for two sites. The results indicate nighttime attenuation of optical transmission at five wavelengths. It is representative of a sampling of nights at Table Mountain Observatory from January to June and Mount Lemmon Observatory from May and June. The results are designed to predict attenuation probabilities for optical communications links.

  19. Computer-Assisted Instruction in Second-Language Learning: An Alberta Project

    ERIC Educational Resources Information Center

    McEwen, Nelly

    1977-01-01

    A computer-assisted instruction program in French is described. Preliminary analysis suggests the program was successful and has potential for use in a regular French course. Advantages to the student as a method of individualized instruction are noted. (CHK)

  20. 15 CFR 270.101 - Preliminary reconnaissance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NATIONAL CONSTRUCTION SAFETY TEAMS NATIONAL CONSTRUCTION SAFETY TEAMS Establishment and Deployment of Teams § 270.101 Preliminary reconnaissance. (a)...

  1. Amorphous Computing

    NASA Astrophysics Data System (ADS)

    Sussman, Gerald

    2002-03-01

    Digital computers have always been constructed to behave as precise arrangements of reliable parts, and our techniques for organizing computations depend upon this precision and reliability. Two emerging technologies, however, are beginning to undercut these assumptions about constructing and programming computers. These technologies -- microfabrication and bioengineering -- will make it possible to assemble systems composed of myriad information-processing units at almost no cost, provided: 1) that not all the units need to work correctly; and 2) that there is no need to manufacture precise geometrical arrangements or interconnection patterns among them. Microelectronic mechanical components are becoming so inexpensive to manufacture that we can anticipate combining logic circuits, microsensors, actuators, and communications devices integrated on the same chip to produce particles that could be mixed with bulk materials, such as paints, gels, and concrete. Imagine coating bridges or buildings with smart paint that can sense and report on traffic and wind loads and monitor structural integrity of the bridge. A smart paint coating on a wall could sense vibrations, monitor the premises for intruders, or cancel noise. Even more striking, there has been such astounding progress in understanding the biochemical mechanisms in individual cells that it appears we'll be able to harness these mechanisms to construct digital-logic circuits. Imagine a discipline of cellular engineering that could tailor-make biological cells that function as sensors and actuators, as programmable delivery vehicles for pharmaceuticals, as chemical factories for the assembly of nanoscale structures. Fabricating such systems seems to be within our reach, even if it is not yet within our grasp. Fabrication, however, is only part of the story. 
We can envision producing vast quantities of individual computing elements, whether microfabricated particles, engineered cells, or macromolecular computing

  2. Descent Advisor Preliminary Field Test

    NASA Technical Reports Server (NTRS)

    Green, Steven M.; Vivona, Robert A.; Sanford, Beverly

    1995-01-01

    A field test of the Descent Advisor (DA) automation tool was conducted at the Denver Air Route Traffic Control Center in September 1994. DA is being developed to assist Center controllers in the efficient management and control of arrival traffic. DA generates advisories, based on trajectory predictions, to achieve accurate meter-fix arrival times in a fuel efficient manner while assisting the controller with the prediction and resolution of potential conflicts. The test objectives were: (1) to evaluate the accuracy of DA trajectory predictions for conventional- and flight-management-system-equipped jet transports, (2) to identify significant sources of trajectory prediction error, and (3) to investigate procedural and training issues (both air and ground) associated with DA operations. Various commercial aircraft (97 flights total) and a Boeing 737-100 research aircraft participated in the test. Preliminary results from the primary test set of 24 commercial flights indicate a mean DA arrival time prediction error of 2.4 seconds late with a standard deviation of 13.1 seconds. This paper describes the field test and presents preliminary results for the commercial flights.

  3. Descent advisor preliminary field test

    NASA Technical Reports Server (NTRS)

    Green, Steven M.; Vivona, Robert A.; Sanford, Beverly

    1995-01-01

    A field test of the Descent Advisor (DA) automation tool was conducted at the Denver Air Route Traffic Control Center in September 1994. DA is being developed to assist Center controllers in the efficient management and control of arrival traffic. DA generates advisories, based on trajectory predictions, to achieve accurate meter-fix arrival times in a fuel efficient manner while assisting the controller with the prediction and resolution of potential conflicts. The test objectives were to evaluate the accuracy of DA trajectory predictions for conventional- and flight-management-system-equipped jet transports, to identify significant sources of trajectory prediction error, and to investigate procedural and training issues (both air and ground) associated with DA operations. Various commercial aircraft (97 flights total) and a Boeing 737-100 research aircraft participated in the test. Preliminary results from the primary test set of 24 commercial flights indicate a mean DA arrival time prediction error of 2.4 sec late with a standard deviation of 13.1 sec. This paper describes the field test and presents preliminary results for the commercial flights.

  4. Topaz II preliminary safety assessment

    SciTech Connect

    Marshall, A.C. ); Standley, V. ); Voss, S.S. ); Haskin, E. )

    1993-01-10

    The Strategic Defense Initiative Organization (SDIO) decided to investigate the possibility of launching a Russian Topaz II space nuclear power system. A preliminary safety assessment was conducted to determine whether or not a space mission could be conducted safely and within budget constraints. As part of this assessment, a safety policy and safety functional requirements were developed to guide both the safety assessment and future Topaz II activities. A review of the Russian flight safety program was conducted and documented. Our preliminary safety assessment included a top level event tree, neutronic analysis of normal and accident configurations, an evaluation of temperature coefficients of reactivity, a reentry and disposal analysis, an analysis of postulated launch abort impact accidents, and an analysis of postulated propellant fire and explosion accidents. Based on the assessment, it appears that it will be possible to safely launch the Topaz II system in the U.S. with some possible system modifications. The principal system modifications will probably include design changes to preclude water flooded criticality and to assure intact reentry.

  5. Topaz II preliminary safety assessment

    SciTech Connect

    Marshall, A.C. ); Standley, V. ); Voss, S.S. ); Haskin, E. . Dept. of Chemical and Nuclear Engineering)

    1992-01-01

    The Strategic Defense Initiative Organization (SDIO) decided to investigate the possibility of launching a Russian Topaz II space nuclear power system. A preliminary safety assessment was conducted to determine whether or not a space mission could be conducted safely and within budget constraints. As part of this assessment, a safety policy and safety functional requirements were developed to guide both the safety assessment and future Topaz II activities. A review of the Russian flight safety program was conducted and documented. Our preliminary safety assessment included a top level event tree, neutronic analysis of normal and accident configurations, an evaluation of temperature coefficients of reactivity, a reentry and disposal analysis, an analysis of postulated launch abort impact accidents, and an analysis of postulated propellant fire and explosion accidents. Based on the assessment, it appears that it will be possible to safely launch the Topaz II system in the US with some possible system modifications. The principal system modifications will probably include design changes to preclude water flooded criticality and to assure intact reentry.

  6. Bacteria as computers making computers

    PubMed Central

    Danchin, Antoine

    2009-01-01

    Various efforts to integrate biological knowledge into networks of interactions have produced a lively microbial systems biology. Putting molecular biology and computer sciences in perspective, we review another trend in systems biology, in which recursivity and information replace the usual concepts of differential equations, feedback and feedforward loops and the like. Noting that the processes of gene expression separate the genome from the cell machinery, we analyse the role of the separation between machine and program in computers. However, computers do not make computers. For cells to make cells requires a specific organization of the genetic program, which we investigate using available knowledge. Microbial genomes are organized into a paleome (the name emphasizes the role of the corresponding functions from the time of the origin of life), comprising a constructor and a replicator, and a cenome (emphasizing community-relevant genes), made up of genes that permit life in a particular context. The cell duplication process supposes rejuvenation of the machine and replication of the program. The paleome also possesses genes that enable information to accumulate in a ratchet-like process down the generations. Systems biology must include the dynamics of information creation in its future developments. PMID:19016882

  7. RATIO COMPUTER

    DOEpatents

    Post, R.F.

    1958-11-11

    An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals and each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of input signals depending upon the relation of input to fixed signals in the first mentioned channel.
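
    The patent describes an analog feedback arrangement; a loose discrete-time sketch of the same principle (an illustration, not the patented circuit) drives a shared gain until the reference channel's output matches the constant signal, after which the second channel outputs the scaled quotient:

```python
def ratio_computer(v_in, v_ref, c=1.0, k=0.5, steps=200):
    """Two matched channels share a feedback-controlled gain g.
    The reference channel's error (c - g*v_ref) drives g, so at
    convergence g = c/v_ref and the signal channel outputs
    g*v_in = c*v_in/v_ref. The loop converges for 0 < k*v_ref < 2."""
    g = 0.0
    for _ in range(steps):
        g += k * (c - g * v_ref)   # integrator closing the feedback loop
    return g * v_in
```

    Because both channels use the same gain, any drift in g cancels out of the ratio, which is the robustness the two-channel comparison scheme buys.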

  8. Computational Combustion

    SciTech Connect

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark-ignition, diesel, and homogeneous-charge compression-ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  9. Computer Game

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Using NASA studies of advanced lunar exploration and colonization, KDT Industries, Inc. and Wesson International have developed MOONBASE, a computer game. The player, or team commander, must build and operate a lunar base using NASA technology. He has 10 years to explore the surface, select a site and assemble structures brought from Earth into an efficient base. The game was introduced in 1991 by Texas Space Grant Consortium.

  10. Computer centers

    NASA Astrophysics Data System (ADS)

    The National Science Foundation has renewed grants to four of its five supercomputer centers. Average annual funding will rise from $10 million to $14 million so facilities can be upgraded and training and education expanded. As cooperative projects, the centers also receive money from states, universities, computer vendors and industry. The centers support research in fluid dynamics, atmospheric modeling, engineering geophysics and many other scientific disciplines.

  11. Singularity computations

    NASA Technical Reports Server (NTRS)

    Swedlow, J. L.

    1976-01-01

    An approach is described for singularity computations based on a numerical method for elastoplastic flow to delineate radial and angular distribution of field quantities and measure the intensity of the singularity. The method is applicable to problems in solid mechanics and lends itself to certain types of heat flow and fluid motion studies. Its use is not limited to linear, elastic, small strain, or two-dimensional situations.

  12. Aerodynamic preliminary analysis system 2. Part 2: User's manual

    NASA Technical Reports Server (NTRS)

    Sova, G.; Divan, P.; Spacht, L.

    1991-01-01

    An aerodynamic analysis system based on potential theory at subsonic and/or supersonic speeds and impact type finite element solutions at hypersonic conditions is described. Three dimensional configurations having multiple nonplanar surfaces of arbitrary planforms and bodies of noncircular contour may be analyzed. Static, rotary, and control longitudinal and lateral-directional characteristics may be generated. The analysis was implemented on a time sharing system in conjunction with an input tablet digitizer and an interactive graphics input/output display and editing terminal to maximize its responsiveness to the preliminary analysis. Computation times on an IBM 3081 are typically less than one minute of CPU time per Mach number at subsonic, supersonic, or hypersonic speeds. This is a user's manual for the computer program.

  13. 19 CFR 202.3 - Preliminary inquiry.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Preliminary inquiry. 202.3 Section 202.3 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION NONADJUDICATIVE INVESTIGATIONS INVESTIGATIONS OF COSTS OF PRODUCTION § 202.3 Preliminary inquiry. Upon the receipt of an application properly filed,...

  14. 19 CFR 202.3 - Preliminary inquiry.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 3 2011-04-01 2011-04-01 false Preliminary inquiry. 202.3 Section 202.3 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION NONADJUDICATIVE INVESTIGATIONS INVESTIGATIONS OF COSTS OF PRODUCTION § 202.3 Preliminary inquiry. Upon the receipt of an application properly filed,...

  15. Plutonium Immobilization Can Loading Preliminary Specifications

    SciTech Connect

    Kriikku, E.

    1998-11-25

    This report discusses the Plutonium Immobilization can loading preliminary equipment specifications and includes a process block diagram, process description, equipment list, preliminary equipment specifications, plan and elevation sketches, and some commercial catalogs. This report identifies loading pucks into cans and backfilling cans with helium as the top priority can loading development areas.

  16. 33 CFR 116.10 - Preliminary review.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Preliminary review. 116.10... ALTERATION OF UNREASONABLY OBSTRUCTIVE BRIDGES § 116.10 Preliminary review. (a) Upon receipt of a written complaint, the District Commander will review the complaint to determine if, in the District...

  17. 33 CFR 116.10 - Preliminary review.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Preliminary review. 116.10... ALTERATION OF UNREASONABLY OBSTRUCTIVE BRIDGES § 116.10 Preliminary review. (a) Upon receipt of a written complaint, the District Commander will review the complaint to determine if, in the District...

  18. 18 CFR 1b.6 - Preliminary investigations.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Preliminary investigations. 1b.6 Section 1b.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.6 Preliminary investigations....

  19. 18 CFR 1b.6 - Preliminary investigations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Preliminary investigations. 1b.6 Section 1b.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.6 Preliminary investigations....

  20. 18 CFR 1b.6 - Preliminary investigations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Preliminary investigations. 1b.6 Section 1b.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.6 Preliminary investigations....

  1. 18 CFR 1b.6 - Preliminary investigations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Preliminary investigations. 1b.6 Section 1b.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.6 Preliminary investigations....

  2. 40 CFR 158.345 - Preliminary analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Preliminary analysis. 158.345 Section 158.345 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.345 Preliminary analysis. (a) If the product is produced...

  3. 40 CFR 158.345 - Preliminary analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Preliminary analysis. 158.345 Section 158.345 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.345 Preliminary analysis. (a) If the product is produced...

  4. 40 CFR 158.345 - Preliminary analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Preliminary analysis. 158.345 Section 158.345 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.345 Preliminary analysis. (a) If the product is produced...

  5. 40 CFR 158.345 - Preliminary analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Preliminary analysis. 158.345 Section 158.345 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.345 Preliminary analysis. (a) If the product is produced...

  6. 18 CFR 1b.6 - Preliminary investigations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Preliminary investigations. 1b.6 Section 1b.6 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.6 Preliminary investigations....

  7. Automatic system for computer program documentation

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.; Elliott, R. W.; Arseven, S.; Colunga, D.

    1972-01-01

A project to design an automatic system of computer program documentation aids was undertaken to determine what existing programs could be used effectively to document computer programs. Results of the study are included in the form of an extensive bibliography and working papers on appropriate operating systems, text editors, program editors, data structures, standards, decision tables, flowchart systems, and proprietary documentation aids. The preliminary design for an automated documentation system is also included. An actual program has been documented in detail to demonstrate the types of output that can be produced by the proposed system.

  8. Classical problems in computational aero-acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

In anticipation of the problems expected in the development of computational aeroacoustics (CAA), preliminary applications were made to classical problems whose known analytical solutions could be used to validate the numerical results. Such comparisons were used to overcome the numerical problems inherent in these calculations. Comparisons were made between the various numerical approaches to the problems, such as direct simulations, acoustic analogies, and acoustic/viscous splitting techniques. The aim was to demonstrate the applicability of CAA as a tool in the same class as computational fluid dynamics. The scattering problems that occur are considered and simple sources are discussed.

  9. Orbital transfer rocket engine technology 7.5K-LB thrust rocket engine preliminary design

    NASA Technical Reports Server (NTRS)

    Harmon, T. J.; Roschak, E.

    1993-01-01

A preliminary design of an advanced LOX/LH2 expander cycle rocket engine producing 7,500 lbf thrust for Orbital Transfer vehicle missions was completed. Engine system, component, and turbomachinery analyses at both on-design and off-design conditions were completed. The preliminary design analysis results showed engine requirements and performance goals were met. Computer models are described and model outputs are presented. Engine system assembly layouts, component layouts and valve and control system analysis are presented. Major design technologies were identified and remaining issues and concerns were listed.

  10. New computer architectures

    SciTech Connect

    Tiberghien, J.

    1984-01-01

    This book presents papers on supercomputers. Topics considered include decentralized computer architecture, new programming languages, data flow computers, reduction computers, parallel prefix calculations, structural and behavioral descriptions of digital systems, instruction sets, software generation, personal computing, and computer architecture education.

  11. Computer vision

    NASA Technical Reports Server (NTRS)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  12. Psychohistory and Slavery: Preliminary Issues.

    PubMed

    Adams, Kenneth Alan

    2015-01-01

    "Psychohistory and Slavery: Preliminary Issues," begins an examination of slavery in the antebellum South. The paper suggests that how slavery and the group-fantasy of white male supremacy were perpetuated among slaveholders is a question of fundamental importance for psychohistorians. The family and childrearing are the focus of attention. Given the ferocity of slavery, it is argued that the psychological and emotional consequences of this barbarism were not limited to the slaves themselves, but had significant impact on the slaveholders as well-their parenting, their children, and their children's parenting of the next generation. In each generation the trauma of slavery was injected into slaveholder children and became a fundamental component of elite Southern personality.

  13. Monsoon '90 - Preliminary SAR results

    NASA Technical Reports Server (NTRS)

    Dubois, Pascale C.; Van Zyl, Jakob J.; Guerra, Abel G.

    1992-01-01

Multifrequency polarimetric synthetic aperture radar (SAR) images of the Walnut Gulch watershed near Tombstone, Arizona were acquired on 28 Mar. 1990 and on 1 Aug. 1990. Trihedral corner reflectors were deployed prior to both overflights to allow calibration of the two SAR data sets. During both overflights, gravimetric soil moisture and dielectric constant measurements were made. Detailed vegetation height, density, and water content measurements were made as part of the Monsoon 1990 Experiment. Preliminary results based on analysis of the multitemporal polarimetric SAR data are presented. Only the C-band (5.7-cm wavelength) radar images show a significant difference between Mar. and Aug., with the strongest difference observed in the HV images. Based on the radar data analysis and the in situ measurements, we conclude that these differences are mainly due to changes in the vegetation and not due to the soil moisture changes.

  14. Monsoon 1990: Preliminary SAR results

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob J.; Dubois, Pascale; Guerra, Abel

    1991-01-01

Multifrequency polarimetric synthetic aperture radar (SAR) images of the Walnut Gulch watershed near Tombstone, Arizona were acquired on 28 Mar. 1990 and on 1 Aug. 1990. Trihedral corner reflectors were deployed prior to both overflights to allow calibration of the two SAR data sets. During both overflights, gravimetric soil moisture and dielectric constant measurements were made. Detailed vegetation height, density, and water content measurements were made as part of the Monsoon 1990 Experiment. Preliminary results based on analysis of the multitemporal polarimetric SAR data are presented. Only the C-band (5.7-cm wavelength) radar images show a significant difference between Mar. and Aug., with the strongest difference observed in the HV images. Based on the radar data analysis and the in situ measurements, we conclude that these differences are mainly due to changes in the vegetation and not due to the soil moisture changes.

  15. Surveyor 3 Preliminary Science Results

    NASA Technical Reports Server (NTRS)

    1967-01-01

    Surveyor III soft-landed on the Moon at 00:04 GMT on April 20, 1967. Data obtained have significantly increased our knowledge of the Moon. The Surveyor III spacecraft was similar to Surveyor I; the only major change in scientific instrumentation was the addition of a soil mechanics surface sampler. Surveyor III results at this preliminary evaluation of data give valuable information about the relation between the surface skin of under-dense material responsible for the photometric properties and the deeper layers of material whose properties resemble those of ordinary terrestrial soils. In addition, they provide new insight into the relation between the general lunar surface as seen by Surveyor I and the interior of a large subdued crater. The new results have also contributed to our understanding of the mechanism of downhill transport. Many critical questions cannot, however, be answered until final reduction of experimental data.

  16. Genesis Preliminary Examination: Ellipsometry Overview

    NASA Technical Reports Server (NTRS)

    Stansbery, E. K.; McNamara, K. M.

    2005-01-01

    The Genesis spacecraft returned to Earth on September 8, 2004, experiencing a non-nominal reentry in which both the drogue and main parachutes failed to deploy causing the capsule to impact the surface of the UTTR desert at a speed of approximately 310 kph (193 mph). The impact caused severe damage to the capsule and a breach of the science canister in the field. The science canister was recovered and transported to the cleanroom at UTTR within approximately 8 hours of reentry. Although the ground water table did not rise to canister level before removal, damp soil and debris from the heat shield and other spacecraft components did enter the canister and contaminate some collector surfaces. The objective of preliminary examination of the Genesis collectors is to provide the science community with the information necessary to request the most useful samples for their analysis.

  18. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 2; Preliminary Results

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.

  19. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report, performed during the facility's construction and testing; it should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods, which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst-case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculations indicate no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  20. Stardust interstellar preliminary examination (ISPE).

    SciTech Connect

    Westphal, A.J.; Allen, C.; Bajt, S.; Basset, R.; Flynn, G.L.; Sutton, S.

    2009-03-23

The Stardust Interstellar Preliminary Examination (ISPE) is a three-year effort to characterize the Stardust interstellar dust collection and collector using non-destructive techniques. We summarize the status of the ISPE. In January 2006 the Stardust sample return capsule returned to Earth bearing the first solid samples from a primitive solar system body, Comet 81P/Wild2, and a collector dedicated to the capture and return of contemporary interstellar dust. Both collectors were approximately 0.1 m^2 in area and were composed of aerogel tiles (85% of the collecting area) and aluminum foils. The Stardust Interstellar Dust Collector (SIDC) was exposed to the interstellar dust stream for a total exposure factor of 20 m^2-day during two periods before the cometary encounter. The goals and restrictions of the ISPE are described in Westphal et al. The ISPE consists of six interdependent projects: (1) candidate identification through automated digital microscopy and a massively distributed, calibrated search; (2) candidate extraction and photodocumentation; (3) characterization of candidates through synchrotron-based Fourier-Transform Infrared Spectroscopy (FTIR), Scanning X-Ray Fluorescence Microscopy (SXRF), and Scanning Transmission X-Ray Microscopy (STXM); (4) search for and analysis of craters in foils through FESEM scanning, Auger Spectroscopy, and synchrotron-based Photoemission Electron Microscopy (PEEM); (5) modeling of interstellar dust transport in the solar system; and (6) laboratory simulations of hypervelocity dust impacts into the collecting media.

  1. Computational crystallization.

    PubMed

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  2. Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Astsatryan, H. V.

    2015-07-01

Present astronomical archives that contain billions of objects, both Galactic and extragalactic, and the vast amount of data on them allow new studies and discoveries. Astrophysical Virtual Observatories (VO) use available databases and current observing material as a collection of interoperating data archives and software tools to form a research environment in which complex research programs can be conducted. Most modern databases at present give VO access to the stored information, which also makes possible fast analysis and management of these data. Cross-correlations result in revealing new objects and new samples. Very often tens of thousands of sources hide a few very interesting ones that need to be discovered by comparison of various physical characteristics. VO is a prototype of Grid technologies that allows distributed data computation, analysis and imaging. Particularly important are data reduction and analysis systems: spectral analysis, SED building and fitting, modelling, variability studies, cross-correlations, etc. Computational astrophysics has become an indissoluble part of astronomy, and most modern research is done by means of it.
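As an illustration of the cross-correlation step mentioned in this abstract, the sketch below positionally matches two tiny hypothetical catalogs within a fixed radius. The catalog names, coordinates, and 2-arcsecond tolerance are invented for the example and are not taken from any real VO service.

```python
import math

# Hypothetical catalogs: (name, RA in degrees, Dec in degrees).
CATALOG_A = [("srcA1", 10.684, 41.269), ("srcA2", 83.822, -5.391)]
CATALOG_B = [("srcB1", 10.6841, 41.2691), ("srcB2", 201.365, -43.019)]

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (spherical law of cosines)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(d1) * math.sin(d2)
               + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    # Clamp against floating-point overshoot before acos.
    return math.degrees(math.acos(min(1.0, max(-1.0, cos_sep))))

def cross_match(cat_a, cat_b, radius_arcsec=2.0):
    """Return (name_a, name_b) pairs closer than the match radius."""
    radius_deg = radius_arcsec / 3600.0
    matches = []
    for name_a, ra_a, dec_a in cat_a:
        for name_b, ra_b, dec_b in cat_b:
            if angular_sep_deg(ra_a, dec_a, ra_b, dec_b) <= radius_deg:
                matches.append((name_a, name_b))
    return matches

pairs = cross_match(CATALOG_A, CATALOG_B)
```

Real VO cross-match services replace the brute-force double loop with spatial indexing, but the matching criterion is the same.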

  3. Computational Asteroseismology

    NASA Astrophysics Data System (ADS)

    Metcalfe, Travis Scott

    2001-10-01

White dwarf asteroseismology offers the opportunity to probe the structure and composition of stellar objects governed by relatively simple physical principles. The observational requirements of asteroseismology have been addressed by the development of the Whole Earth Telescope, but the analysis procedures still need to be refined before this technique can yield the complete physical insight that the data can provide. We have applied an optimization method utilizing a genetic algorithm to the problem of fitting white dwarf pulsation models to the observed frequencies of the most thoroughly characterized helium-atmosphere pulsator, GD 358. The free parameters in this initial study included the stellar mass, the effective temperature, the surface helium layer mass, the core composition, and the internal chemical profile. For many years, astronomers have promised that the study of pulsating white dwarfs would ultimately lead to useful information about the physics of matter under extreme conditions of temperature and pressure. The optimization approach developed in this dissertation has allowed us to finally make good on that promise by exploiting the sensitivity of our models to the core composition. We empirically determine that the central oxygen abundance in GD 358 is 84+/-3 percent. We use this value to place a preliminary constraint on the 12C(alpha,gamma)16O nuclear reaction cross-section of S300=295+/-15 keV barns. We find a thick helium-layer solution for GD 358 that provides a better match to the data than previous fits, and helps to resolve a problem with the evolutionary connection between PG 1159 stars and DBVs. We show that the pulsation modes of our best-fit model probe down to the inner few percent of the stellar mass. We demonstrate the feasibility of reconstructing the internal chemical profiles of white dwarfs from asteroseismological data, and we find an oxygen profile for GD 358 that is qualitatively similar to recent theoretical calculations.
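The genetic-algorithm model fitting this abstract describes can be sketched in miniature. The toy frequency model, parameter range, and convergence settings below are illustrative stand-ins, not the dissertation's actual pulsation code: a population of trial parameter values is evolved by selection, crossover, and mutation toward the value that best reproduces a set of "observed" frequencies.

```python
import random

random.seed(42)

# Hidden parameter the GA must recover (e.g. a composition fraction);
# real fits search several parameters with a far costlier stellar model.
TRUE_PARAM = 0.84

def model_frequencies(p):
    # Hypothetical mapping from parameter to five mode frequencies.
    return [100.0 * p + 10.0 * k for k in range(5)]

OBSERVED = model_frequencies(TRUE_PARAM)

def fitness(p):
    # Negative sum of squared residuals: higher is better.
    return -sum((m - o) ** 2 for m, o in zip(model_frequencies(p), OBSERVED))

def evolve(pop_size=40, generations=60):
    pop = [random.uniform(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # elitist selection: keep top half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)             # crossover: parental midpoint
            child += random.gauss(0.0, 0.02)  # mutation: small Gaussian kick
            children.append(min(1.0, max(0.0, child)))
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the toy fitness surface is smooth, the population converges quickly; the real appeal of the method is that it needs no gradients, which suits expensive, noisy model evaluations.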

  4. Preliminary results of radiation measurements on EURECA

    NASA Technical Reports Server (NTRS)

    Benton, E. V.; Frank, A. L.

    1995-01-01

The eleven-month duration of the EURECA mission allows long-term radiation effects to be studied similarly to those of the Long Duration Exposure Facility (LDEF). Basic data can be generated for projections of crew doses and electronic and computer reliability on spacecraft missions. A radiation experiment has been designed for EURECA which uses passive integrating detectors to measure average radiation levels. The components include a Trackoscope, which employs fourteen plastic nuclear track detector (PNTD) stacks to measure the angular dependence of high-LET (greater than or equal to 6 keV/micron) radiation. Also included are TLD's for total absorbed doses, thermal/resonance neutron detectors (TRND's) for low energy neutron fluences and a thick PNTD stack for depth dependence measurements. LET spectra are derived from the PNTD measurements. Preliminary TLD results from seven levels within the detector array show that integrated doses inside the flight canister varied from 18.8 plus or minus 0.6 cGy to 38.9 plus or minus 1.2 cGy. The TLD's oriented toward the least shielded direction averaged 53 percent higher in dose than those oriented away from the least shielded direction (minimum shielding toward the least shielded direction varied from 1.13 to 7.9 g/cm(exp 2), Al equivalent). The maximum dose rate on EURECA (1.16 mGy/day) was 37 percent of the maximum measured on LDEF and dose rates at all depths were less than measured on LDEF. The shielding external to the flight canister covered a greater solid angle about the canister than in the LDEF experiments.

  5. Preliminary results of radiation measurements on EURECA

    NASA Technical Reports Server (NTRS)

    Benton, E. V.; Frank, A. L.

    1995-01-01

The eleven-month duration of the EURECA mission allows long-term radiation effects to be studied similarly to those of the Long Duration Exposure Facility (LDEF). Basic data can be generated for projections of crew doses and electronic and computer reliability on spacecraft missions. A radiation experiment has been designed for EURECA which uses passive integrating detectors to measure average radiation levels. The components include a Trackoscope, which employs fourteen plastic nuclear track detector (PNTD) stacks to measure the angular dependence of high-LET (greater than or equal to 6 keV/micron) radiation. Also included are TLD's for total absorbed doses, thermal/resonance neutron detectors (TRND's) for low energy neutron fluences and a thick PNTD stack for depth dependence measurements. LET spectra are derived from the PNTD measurements. Preliminary TLD results from seven levels within the detector array show that integrated doses inside the flight canister varied from 18.8 +/- 0.6 cGy to 38.9 +/- 1.2 cGy. The TLD's oriented toward the least shielded direction averaged 53% higher in dose than those oriented away from the least shielded direction (minimum shielding toward the least shielded direction varied from 1.13 to 7.9 g/cm(exp 2), Al equivalent). The maximum dose rate on EURECA (1.16 mGy/day) was 37% of the maximum measured on LDEF and dose rates at all depths were less than measured on LDEF. The shielding external to the flight canister covered a greater solid angle about the canister than in the LDEF experiments.
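The dose-rate comparison quoted in this abstract implies a value for the LDEF maximum that is easy to back out. The short check below is only arithmetic on the stated numbers; the LDEF figure is inferred here, not reported by the experiment.

```python
# Values quoted in the abstract.
eureca_max = 1.16        # mGy/day, maximum dose rate on EURECA
fraction_of_ldef = 0.37  # EURECA maximum was 37% of the LDEF maximum

# Implied LDEF maximum dose rate (derived, not measured here).
ldef_max = eureca_max / fraction_of_ldef   # roughly 3.1 mGy/day
```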

  6. General aviation design synthesis utilizing interactive computer graphics

    NASA Technical Reports Server (NTRS)

    Galloway, T. L.; Smith, M. R.

    1976-01-01

    Interactive computer graphics is a fast growing area of computer application, due to such factors as substantial cost reductions in hardware, general availability of software, and expanded data communication networks. In addition to allowing faster and more meaningful input/output, computer graphics permits the use of data in graphic form to carry out parametric studies for configuration selection and for assessing the impact of advanced technologies on general aviation designs. The incorporation of interactive computer graphics into a NASA developed general aviation synthesis program is described, and the potential uses of the synthesis program in preliminary design are demonstrated.

  7. Computational introspection

    SciTech Connect

    Batali, J.

    1983-02-01

    Introspection is the process of thinking about one's own thoughts and feelings. In this paper, the author discusses recent attempts to make computational systems that exhibit introspective behavior. Each presents a system capable of manipulating representations of its own program and current context. He argues that introspective ability is crucial for intelligent systems--without it an agent cannot represent certain problems that it must be able to solve. A theory of intelligent action would describe how and why certain actions intelligently achieve an agent's goals. The agent would both embody and represent this theory: it would be implemented as the program for the agent; and the importance of introspection suggests that the agent represent its theory of action to itself.

  8. Computer vision

    SciTech Connect

    Not Available

    1982-01-01

    This paper discusses material from areas such as artificial intelligence, psychology, computer graphics, and image processing. The intent is to assemble a selection of this material in a form that will serve both as a senior/graduate-level academic text and as a useful reference to those building vision systems. This book has a strong artificial intelligence flavour, emphasising the belief that both the intrinsic image information and the internal model of the world are important in successful vision systems. The book is organised into four parts, based on descriptions of objects at four different levels of abstraction. These are: generalised images-images and image-like entities; segmented images-images organised into subimages that are likely to correspond to interesting objects; geometric structures-quantitative models of image and world structures; relational structures-complex symbolic descriptions of image and world structures. The book contains author and subject indexes.

  9. To get the most out of high resolution X-ray tomography: A review of the post-reconstruction analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yijin; Kiss, Andrew M.; Larsson, Daniel H.; Yang, Feifei; Pianetta, Piero

    2016-03-01

    X-ray microscopy has been well-recognized as one of the most important techniques for research in a wide range of scientific disciplines including materials science, geoscience, and bio-medical science. Advances in X-ray sources, optics, detectors, and imaging methodologies have made significant improvements to non-destructive reconstructions of the three dimensional (3D) structure of specimens over a wide range of length scales with different contrast mechanisms. A strength of 3D imaging is a "seeing is believing" way of reporting and analyzing data to better understand the structure/function characteristics of a sample. In addition to the excellent visualization capability, X-ray computed tomography has a lot more to offer. In this article, we review some of the experimental and analytical methods that enrich and extract scientifically relevant information from tomographic data. Several scientific cases are discussed along with how they enhance the tomographic dataset.

  10. Computer Use and Academic Development in Secondary Schools

    ERIC Educational Resources Information Center

    Lee, Sang Min; Brescia, William; Kissinger, Dan

    2009-01-01

    Several studies provide preliminary evidence that computer use is positively related to academic performance; however, no clear relationship has yet been established. Using a national database, we analyzed how students' school behavior (i.e., evaluated by English and math teachers) and standardized test scores (e.g., math and reading) are related…

  11. Working Together: Computers and People with Mobility Impairments.

    ERIC Educational Resources Information Center

    Washington Univ., Seattle.

This brief paper describes several computing tools that have been effectively used by individuals with mobility impairments. Emphasis is on tasks to be completed and how the individual's abilities (not disabilities), with possible assistance from technology, can be used to accomplish them. Preliminary information addresses the importance of…

  12. Philosophical and metamathematical considerations of quantum mechanical computers

    NASA Astrophysics Data System (ADS)

    Caulfield, H. John; Shamir, Joseph

    1990-07-01

We ask and give only very preliminary answers to two questions which must arise when we consider quantum mechanical computers with significant quantum indeterminacy. First, how does this impact our belief in Church's thesis? Second, how does this impact our belief in freedom of thought?

  13. Computed tomography of the temporo-mandibular joint.

    PubMed

    Avrahami, E; Horowitz, I; Cohn, D F

    1984-01-01

    Computed tomography (CT) of the temporo-mandibular joint (TMJ) has only been occasionally reported. As CT techniques improve, more detailed information is available. Though conventional radiography of the TMJ can supply sufficient diagnostic detail, it may well be replaced by CT as a preliminary examination, as it is able to offer more information with less radiation and minimal patient discomfort.

  14. Stardust Interstellar Preliminary Examination (ISPE)

    NASA Technical Reports Server (NTRS)

    Westphal, A. J.; Allen, C.; Bajt, S.; Basset, R.; Bastien, R.; Bechtel, H.; Bleuet, P.; Borg, J.; Brenker F.; Bridges, J.

    2009-01-01

In January 2006 the Stardust sample return capsule returned to Earth bearing the first solid samples from a primitive solar system body, Comet 81P/Wild2, and a collector dedicated to the capture and return of contemporary interstellar dust. Both collectors were approximately 0.1 m^2 in area and were composed of aerogel tiles (85% of the collecting area) and aluminum foils. The Stardust Interstellar Dust Collector (SIDC) was exposed to the interstellar dust stream for a total exposure factor of 20 m^2-day during two periods before the cometary encounter. The Stardust Interstellar Preliminary Examination (ISPE) is a three-year effort to characterize the collection using nondestructive techniques. The ISPE consists of six interdependent projects: (1) candidate identification through automated digital microscopy and a massively distributed, calibrated search; (2) candidate extraction and photodocumentation; (3) characterization of candidates through synchrotron-based Fourier-Transform Infrared Spectroscopy (FTIR), Scanning X-Ray Fluorescence Microscopy (SXRF), and Scanning Transmission X-Ray Microscopy (STXM); (4) search for and analysis of craters in foils through FESEM scanning, Auger Spectroscopy, and synchrotron-based Photoemission Electron Microscopy (PEEM); (5) modeling of interstellar dust transport in the solar system; and (6) laboratory simulations of hypervelocity dust impacts into the collecting media.

  15. Advanced space engine preliminary design

    NASA Technical Reports Server (NTRS)

    Cuffe, J. P. B.; Bradie, R. E.

    1973-01-01

    A preliminary design was completed for an O2/H2, 89 kN (20,000 lb) thrust staged combustion rocket engine that has a single-bell nozzle with an overall expansion ratio of 400:1. The engine has a best estimate vacuum specific impulse of 4623.8 N-s/kg (471.5 sec) at full thrust and mixture ratio = 6.0. The engine employs gear-driven, low pressure pumps to provide low NPSH capability while individual turbine-driven, high-speed main pumps provide the system pressures required for high-chamber pressure operation. The engine design dry weight for the fixed-nozzle configuration is 206.9 kg (456.3 lb). Engine overall length is 234 cm (92.1 in.). The extendible nozzle version has a stowed length of 141.5 cm (55.7 in.). Critical technology items in the development of the engine were defined. Development program plans and their costs for development, production, operation, and flight support of the ASE were established for minimum cost and minimum time programs.
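The thrust and specific-impulse figures quoted above can be cross-checked with standard unit conversions. The sketch below assumes only the conventional values of standard gravity and the pound-force-to-newton factor; it is a consistency check on the abstract's numbers, not part of the original report.

```python
# Standard conversion constants.
G0 = 9.80665              # standard gravity, m/s^2
LBF_TO_N = 4.4482216152605  # newtons per pound-force

# 20,000 lbf quoted thrust, converted to newtons (~89 kN as stated).
thrust_n = 20000 * LBF_TO_N

# Specific impulse: effective exhaust velocity (N-s/kg = m/s) divided
# by g0 gives seconds (~471.5 s as stated).
isp_s = 4623.8 / G0
```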

  16. EUPORIAS: plans and preliminary results

    NASA Astrophysics Data System (ADS)

    Buontempo, C.

    2013-12-01

    Recent advances in our understanding and ability to forecast climate variability have meant that skilful predictions are beginning to be routinely made on seasonal to decadal (s2d) timescales. Such forecasts have the potential to be of great value to a wide range of decision-making, where outcomes are strongly influenced by variations in the climate. In 2012 the European Commission funded EUPORIAS, a four-year project to develop prototype end-to-end climate impact prediction services operating on a seasonal to decadal timescale, and to assess their value in informing decision-making. EUPORIAS commenced on 1 November 2012, coordinated by the UK Met Office leading a consortium of 24 organisations representing world-class European climate research and climate service centres, expertise in impacts assessments and seasonal predictions, two United Nations agencies, specialists in new media, and commercial companies in climate-vulnerable sectors such as energy, water and tourism. The poster describes the setup of the project, its main outcomes and some very preliminary results.

  17. Space station preliminary design report

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The results of a 3-month preliminary design and analysis effort are presented. The configuration that emerged consists of a very stiff deployable truss structure with an overall triangular cross section, having universal modules attached at the apexes. Sufficient analysis was performed to show feasibility of the configuration. An evaluation of the structure shows that desirable attributes of the configuration are: (1) the solar cells, radiators, and antennas will be mounted to stiff structure to minimize control problems during orbit maintenance and correction, docking, and attitude control; (2) large flat areas are available for mounting and servicing of equipment; (3) large mass items can be mounted near the center of gravity of the system to minimize gravity gradient torques; (4) the trusses are lightweight structures and can be transported into orbit in one Shuttle flight; (5) the trusses are expandable and will require a minimum of EVA; and (6) the modules are anticipated to be structurally identical except for internal equipment to minimize cost.

  18. Electrohysterography during pregnancy: preliminary report.

    PubMed

    Gondry, J; Marque, C; Duchene, J; Cabrol, D

    1993-01-01

    The purpose of this study was to test the ability of uterine electrical activity recorded by electrohysterography (EHG) from abdominal electrodes during pregnancy to provide reliable information about uterine contractions. In this preliminary study, abdominal EHG was used to monitor the uterine contractions of eight women, three of whom were having spontaneous contractions related to preterm labor and five of whom were having medical abortions after intrauterine fetal death. The EHG signal consisting of one electrical burst (EB) correlated with a single episode of mechanical activity (MA) in more than 66% of the recorded contractions. When mechanical or electrical activity identified as artifactual was excluded, the temporal correlation of EBs with MA was found in 89% of the recorded contractions. Furthermore, the electrical bursts detected had temporal and spectral characteristics similar to those described previously. Reliable detection of mechanical activity during early pregnancy remains problematic. Nevertheless, abdominal EHG appears suitable for noninvasive monitoring of pregnancies at risk. Further studies are needed to elucidate the significance of the EHG signal in both normal and abnormal pregnancies. It may eventually be possible to use EHG as an ambulatory monitoring tool for the early diagnosis of preterm labor. PMID:8369867

  19. Preliminary results of ANAIS-25

    NASA Astrophysics Data System (ADS)

    Amaré, J.; Cebrián, S.; Cuesta, C.; García, E.; Ginestra, C.; Martínez, M.; Oliván, M. A.; Ortigoza, Y.; Ortiz de Solórzano, A.; Pobes, C.; Puimedón, J.; Sarsa, M. L.; Villar, J. A.; Villar, P.

    2014-04-01

    The ANAIS (Annual Modulation with NaI(Tl) Scintillators) experiment aims at the confirmation of the DAMA/LIBRA signal using the same target and technique at the Canfranc Underground Laboratory. 250 kg of ultrapure NaI(Tl) crystals will be used as a target, divided into 20 modules, each coupled to two photomultipliers. Two NaI(Tl) crystals of 12.5 kg each, grown by Alpha Spectra from a powder whose potassium level is below the limit of our analytical techniques, form the ANAIS-25 set-up. The background contributions are being carefully studied, and preliminary results are presented: the natural potassium content in the bulk has been quantified; the presence of the uranium and thorium radioactive chains in the bulk has been determined through the discrimination of the corresponding alpha events by PSA; and, thanks to the fast commissioning, the contribution from cosmogenically activated isotopes has been clearly identified and their decay observed over the first months of data taking. Following the procedures established with ANAIS-0 and previous prototypes, bulk NaI(Tl) scintillation event selection and light collection efficiency have also been studied in ANAIS-25.

  20. Preliminary Iron Distribution on Vesta

    NASA Technical Reports Server (NTRS)

    Mittlefehldt, David W.

    2013-01-01

    The distribution of iron on the surface of the asteroid Vesta was investigated using Dawn's Gamma Ray and Neutron Detector (GRaND) [1,2]. Iron varies predictably with rock type for the howardite, eucrite, and diogenite (HED) meteorites, thought to be representative of Vesta. The abundance of Fe in howardites ranges from about 12 to 15 wt.%. Basaltic eucrites have the highest abundance, lower crustal and upper mantle materials (cumulate eucrites and diogenites) have the lowest, and howardites are intermediate [3]. We have completed a mapping study of 7.6 MeV gamma rays produced by neutron capture on Fe as measured by the bismuth germanate (BGO) detector of GRaND [1]. The procedures used to determine Fe counting rates are presented in detail here, along with a preliminary distribution map, constituting the necessary initial step toward quantification of Fe abundances. We find that the global distribution of Fe counting rates is generally consistent with independent mineralogical and compositional inferences obtained by other instruments on Dawn, such as measurements of pyroxene absorption bands by the Visual and Infrared Spectrometer (VIR) [4] and Framing Camera (FC) [5] and neutron absorption measurements by GRaND [6].

  1. Renewable Energy Park - Preliminary Feasibility & Engineering Report

    SciTech Connect

    Ariwite, Roderick

    2015-07-31

    This "Renewable Energy Park - Preliminary Feasibility & Engineering Report" seeks to provide an overall assessment and review of renewable energy development opportunities on the Fallon Indian Reservation and Colony Lands.

  2. 7 CFR 1735.90 - Preliminary approvals.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... schedule for completion of the proposed action leaves insufficient time for RUS to prepare and process the... until the documentation is completed to RUS's satisfaction. (b) Consideration of preliminary...

  3. 42 CFR 457.925 - Preliminary investigation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES Program Integrity § 457.925 Preliminary investigation. If the State agency receives a complaint of fraud or...

  4. 42 CFR 457.925 - Preliminary investigation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES Program Integrity § 457.925 Preliminary investigation. If the State agency receives a complaint of fraud or...

  5. 42 CFR 457.925 - Preliminary investigation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES Program Integrity § 457.925 Preliminary investigation. If the State agency receives a complaint of fraud or...

  6. Podcasting: A Preliminary Classroom Study

    ERIC Educational Resources Information Center

    Aristizabal, Alexander

    2009-01-01

    Podcasting is a term introduced through the use of Apple Computer, Inc.'s iPod; it denotes how audio files, mostly MP3s, can be downloaded to a portable audio player and heard at the user's convenience. Initially such an operation was intended for entertainment; however, it has proven itself to be an important tool in the field of…

  7. Preliminary Design Reviews Project: SAPHIRE 8

    SciTech Connect

    Kurt G. Vedros; Curtis L. Smith

    2011-09-01

    The Preliminary Design Review (PDR) is intended to be performed at the conceptual phase of a design request. The design request is initiated with a Design Specification document which includes a problem statement, design details, a design checklist and supporting documentation and/or projected sample output. In addition to this, the design specification has a chapter devoted to the completion of the Preliminary Design Review. This document describes the process of documentation of the PDR in the Design Specification.

  8. 12 CFR 611.1250 - Preliminary exit fee estimate.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Preliminary exit fee estimate. 611.1250 Section... System Institution Status § 611.1250 Preliminary exit fee estimate. (a) Preliminary exit fee estimate—terminating association. You must provide a preliminary exit fee estimate to us when you submit the plan...

  9. Advanced Computing for Medicine.

    ERIC Educational Resources Information Center

    Rennels, Glenn D.; Shortliffe, Edward H.

    1987-01-01

    Discusses contributions that computers and computer networks are making to the field of medicine. Emphasizes the computer's speed in storing and retrieving data. Suggests that doctors may soon be able to use computers to advise on diagnosis and treatment. (TW)

  10. Computer security in DOE distributed computing systems

    SciTech Connect

    Hunteman, W.J.

    1990-01-01

    The modernization of DOE facilities amid limited funding is creating pressure on DOE facilities to find innovative approaches to their daily activities. Distributed computing systems are becoming cost-effective solutions to improved productivity. This paper defines and describes typical distributed computing systems in the DOE. The special computer security problems present in distributed computing systems are identified and compared with traditional computer systems. The existing DOE computer security policy supports only basic networks and traditional computer systems and does not address distributed computing systems. A review of the existing policy requirements is followed by an analysis of the policy as it applies to distributed computing systems. Suggested changes in the DOE computer security policy are identified and discussed. The long lead time in updating DOE policy will require guidelines for applying the existing policy to distributed systems. Some possible interim approaches are identified and discussed. 2 refs.

  11. Comparative Soot Diagnostics: Preliminary Results

    NASA Technical Reports Server (NTRS)

    Urban, David L.; Griffin, DeVon W.; Gard, Melissa Y.

    1997-01-01

    detected and suppressed. Prior to CSD, no combustion-generated particulate samples had been collected near the flame zone for well-developed microgravity flames. All of the extant data either came from drop tower tests and therefore only corresponded to the early stages of a fire or were collected far from the flame zone. The fuel sources in the drop tower tests were restricted to laminar gas-jet diffusion flames and very rapidly overheated wire insulation. The gas-jet tests indicated, through thermophoretic sampling, (2) that soot primaries and aggregates (groups of primary particles) in low gravity may be significantly larger than those in normal gravity (1-g). This raises new scientific questions about soot processes as well as practical issues for particulate size sensitivity and detection alarm threshold levels used in on-orbit smoke detectors. Preliminary tests in the 2.2-second drop tower suggest that particulate generated by overheated wire insulation may be larger in low-g than in 1-g. Transmission Electron Microscope (TEM) grids downstream of the fire region in the Wire Insulation Flammability experiment, as well as visual observation of long string-like aggregates, further confirm this suggestion. The combined impact of these limited results and theoretical predictions is that, as opposed to extrapolation from 1-g data, direct knowledge of low-g combustion particulate is needed for more confident design of smoke detectors for spacecraft. This paper describes the operation and preliminary results of the CSD, a project conceived and developed at NASA Lewis Research Center. The CSD flight experiment was conducted in the Middeck Glovebox Facility (MGBX) on USMP-3. The project is supported by NASA Headquarters Microgravity Science and Applications Division and Code Q. 
The results presented here are from the microgravity portion of the experiment, including the temporal response of the detectors and average sizes of the primary and aggregate particles captured on the

  12. Computers and occupational therapy.

    PubMed

    English, C B

    1975-01-01

    The benefits and applications of computer science for occupational therapy are explored and a basic, functional description of the computer and computer programming is presented. Potential problems and advantages of computer utilization are compared and examples of existing computer systems in health fields are cited. Methods for successfully introducing computers are discussed.

  13. Preliminary LISA Telescope Spacer Design

    NASA Technical Reports Server (NTRS)

    Livas, J.; Arsenovic, P.; Catellucci, K.; Generie, J.; Howard, J.; Stebbins, R. T.

    2010-01-01

    The Laser Interferometric Space Antenna (LISA) mission observes gravitational waves by measuring the separations between freely floating proof masses located 5 million kilometers apart with an accuracy of approximately 10 picometers. The separations are measured interferometrically. The telescope is an afocal Cassegrain-style design with a magnification of 80x. The entrance pupil has a 40 cm diameter and will either be centered on-axis or de-centered off-axis to avoid obscurations. Its two main purposes are to transform the small-diameter beam used on the optical bench into a diffraction-limited collimated beam to efficiently transfer the metrology laser between spacecraft, and to receive the incoming light from the far spacecraft. It transmits and receives simultaneously. The basic optical design and requirements are well understood for a conventional telescope design for imaging applications, but the LISA design is complicated by the additional requirement that the total optical path through the telescope must remain stable at the picometer level over the measurement band during the mission to meet the measurement accuracy. This poster describes the requirements for the telescope and the preliminary work that has been done to understand the materials and mechanical issues associated with the design of a passive metering structure to support the telescope and to maintain the spacing between the primary and secondary mirrors in the LISA on-orbit environment. This includes the requirements flowdown from the science goals, thermal modeling of the spacecraft and telescope to determine the expected temperature distribution, layout options for the telescope including an on- and off-axis design, and plans for fabrication and testing.
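In an afocal telescope, the collimated input and output beam diameters scale with the angular magnification, so the quoted 40 cm entrance pupil and 80x magnification imply a small bench-side beam. A minimal sketch (the 5 mm figure is inferred from these numbers, not stated in the abstract):

```python
# Afocal beam expander: collimated beam diameters scale with magnification.
# The bench-side diameter below is inferred from the quoted numbers,
# not stated in the abstract.
MAGNIFICATION = 80.0       # afocal magnification (80x)
ENTRANCE_PUPIL_MM = 400.0  # 40 cm entrance pupil

bench_beam_mm = ENTRANCE_PUPIL_MM / MAGNIFICATION
print(bench_beam_mm)  # -> 5.0
```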

  14. A Teacher's Exploration of Personal Computer Animation for the Mathematics Classroom.

    ERIC Educational Resources Information Center

    Kaljumagi, Eric A.

    1992-01-01

    Presents the results of a preliminary investigation into the feasibility of teachers constructing their own animations for the mathematics classroom using computer animation tools. Provides examples of animation constructed using "Ani ST" and "Mathematica" programs. Concludes that computer animation has potential to be a practical instructional…

  15. Operations analysis (study 2.6). Volume 4: Computer specification; logistics of orbiting vehicle servicing (LOVES)

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The computer specification for the logistics of orbiting vehicle servicing (LOVES) was developed, and a number of alternatives to improve utilization of the space shuttle and the tug were investigated. Preliminary results indicate that space servicing offers a potential for reducing future operational and program costs over ground refurbishment of satellites. A computer code which could be developed to simulate space servicing is presented.

  16. Preliminary design of a redundant strapped down inertial navigation unit using two-degree-of-freedom tuned-gimbal gyroscopes

    NASA Technical Reports Server (NTRS)

    1976-01-01

    This redundant strapdown INS preliminary design study demonstrates the practicality of a skewed sensor system configuration by means of: (1) devising a practical system mechanization utilizing proven strapdown instruments, (2) thoroughly analyzing the skewed sensor redundancy management concept to determine optimum geometry, data processing requirements, and realistic reliability estimates, and (3) implementing the redundant computers into a low-cost, maintainable configuration.

  17. The use of mixed observations from one station to determine the preliminary orbits

    NASA Astrophysics Data System (ADS)

    Baghos, B. B.

    The use of mixed observations from a laser tracking station whose geodetic coordinates are known to determine a preliminary orbit is examined. The method has the advantage of being able to determine an osculating orbit without using iterative processes or the truncated f and g series. Using mixed data obtained from the Ondrejov Observatory for the satellite Starlette, the numerical stability of the method is discussed. A computer routine in basic FORTRAN, which illustrates the procedure, is given.
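For orientation, an osculating orbit is fully determined by one position/velocity state through standard two-body relations. The sketch below shows that textbook computation only (vis-viva and the eccentricity vector); it is not the mixed-observation method of the paper, and mu and the state vector are illustrative values.

```python
import math

# Generic two-body relations for an osculating orbit from one state vector
# (standard textbook formulas, NOT the paper's mixed-observation method).
MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2


def osculating_a_e(r_vec, v_vec, mu=MU_EARTH):
    """Semi-major axis (vis-viva) and eccentricity from position/velocity."""
    r = math.dist((0.0, 0.0, 0.0), r_vec)
    v2 = sum(c * c for c in v_vec)
    a = 1.0 / (2.0 / r - v2 / mu)  # vis-viva: 1/a = 2/r - v^2/mu
    # Eccentricity vector: e = ((v^2 - mu/r) r_vec - (r.v) v_vec) / mu
    rdotv = sum(rc * vc for rc, vc in zip(r_vec, v_vec))
    e_vec = [((v2 - mu / r) * rc - rdotv * vc) / mu
             for rc, vc in zip(r_vec, v_vec)]
    return a, math.dist((0.0, 0.0, 0.0), e_vec)


# Sanity check with a circular orbit: r = 7000 km, tangential v = sqrt(mu/r).
a, e = osculating_a_e((7000.0, 0.0, 0.0),
                      (0.0, math.sqrt(MU_EARTH / 7000.0), 0.0))
print(round(a, 6), round(e, 9))  # a -> 7000.0, e -> ~0.0
```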

  18. Near-term hybrid vehicle program, phase 1. Appendix C: Preliminary design data package

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The design methodology, the design decision rationale, the vehicle preliminary design summary, and the advanced technology developments are presented. The detailed vehicle design, the vehicle ride and handling and front structural crashworthiness analysis, the microcomputer control of the propulsion system, the design study of the battery switching circuit, the field chopper, and the battery charger, and the recent program refinements and computer results are presented.

  19. TU-A-9A-11: Gold Nanoparticles Enhanced Diffuse Optical Tomography: A Proof of Concept Study

    SciTech Connect

    Yang, Y; Dogan, N

    2014-06-15

    Purpose: To investigate the feasibility and potential of gold nanoparticle (GNP) enhanced diffuse optical tomography (DOT) as a novel imaging strategy for tumor detection. Methods: Simulation was performed on a digital homogeneous cylindrical phantom of 30 mm × 30 mm. Gold nanorods (GNR) with aspect ratio 3.9 and effective radius 21.86 nm were used as the contrast agent. The peak light absorption for these GNR occurs at 842 nm, within the near-infrared region, with an absorption cross-section of 1.97×10{sup -14} m{sup 2} and a scatter cross-section of 1.07×10{sup -14} m{sup 2}. A 6 mm diameter sphere of GNR solution was positioned inside the tissue-simulating phantom. Simulations were performed at GNR concentration levels of 1 nM, 100 pM and 10 pM, respectively. The points representing laser sources and light detectors were placed around the phantom, 30° apart tangentially and 1 mm apart axially, in 9 rows. With one point acting as the source, all the other points within the current row and the nearby four rows become detectors. Hence there are 108 source points in total and 55 detector points corresponding to each source. Forward light transport at the 842 nm wavelength was run on a three-dimensional mesh of 33186 nodes (∼0.5 mm resolution) to acquire the light emission data. Reconstruction was performed on a coarse mesh of 19408 nodes (∼1 mm resolution) in ∼20 minutes on a 2.4 GHz CPU and 8 GB RAM computer. Results: The position of the GNR solution at 1 nM, 100 pM and 10 pM concentration was reconstructed, respectively, with <1 mm error. The GNR concentration was interpreted from the reconstructed absorption coefficient within the enhanced volume. The reconstructed maximum concentrations were 0.3 nM, 120.0 pM, and 5.3 pM, respectively. Conclusion: To the best of our knowledge, this is the first time GNP have been applied to enhance DOT. The simulation results showed the high sensitivity of GNP enhanced DOT, which is in the pM concentration range, compared to the μM level for MRI agents and the nM level for PET

  20. Phoebe: A preliminary control network and rotational elements

    NASA Technical Reports Server (NTRS)

    Colvin, Tim R.; Davies, Merton E.; Rogers, Patricia G.; Heller, Jeanne (Editor)

    1989-01-01

    A preliminary control network for the Saturnian satellite Phoebe was determined based upon 6 distinct albedo features mapped on 16 Voyager 2 images. Using an existing map and an analytical triangulation program which minimized the measurement error, the north pole of Phoebe was calculated to be alpha sub 0 = 355.0 deg + or - 9.6 deg, delta sub 0 = 68.7 deg + or - 7.9 deg, where alpha sub 0, delta sub 0 are standard equatorial coordinates with equinox J2000 at epoch J2000. The prime meridian of Phoebe was computed to be W = 304.7 deg + 930.833872d, where d is the interval in days from JD 2451545.0 TDB.
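The quoted rotation model can be evaluated directly; a minimal sketch of W = 304.7 deg + 930.833872 d, with d in days from JD 2451545.0 TDB and the angle reduced modulo 360:

```python
# Evaluating the quoted prime-meridian model for Phoebe:
# W = 304.7 deg + 930.833872 * d, d = days from the J2000 epoch (JD 2451545.0 TDB),
# reduced modulo 360 to express W as an angle.
def phoebe_prime_meridian_deg(d_days: float) -> float:
    return (304.7 + 930.833872 * d_days) % 360.0


print(phoebe_prime_meridian_deg(0.0))            # -> 304.7 at the epoch
print(round(phoebe_prime_meridian_deg(1.0), 6))  # -> 155.533872 one day later
```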

  1. A preliminary design of the collinear dielectric wakefield accelerator

    NASA Astrophysics Data System (ADS)

    Zholents, A.; Gai, W.; Doran, S.; Lindberg, R.; Power, J. G.; Strelnikov, N.; Sun, Y.; Trakhtenberg, E.; Vasserman, I.; Jing, C.; Kanareykin, A.; Li, Y.; Gao, Q.; Shchegolkov, D. Y.; Simakov, E. I.

    2016-09-01

    A preliminary design of the multi-meter long collinear dielectric wakefield accelerator that achieves a highly efficient transfer of the drive bunch energy to the wakefields and to the witness bunch is considered. It is made from ~0.5 m long accelerator modules containing a vacuum chamber with dielectric-lined walls, a quadrupole wiggler, an rf coupler, and a BPM assembly. The single bunch breakup instability is a major limiting factor for accelerator efficiency, and BNS damping is applied to obtain stable multi-meter long propagation of a drive bunch. Numerical simulations using a 6D particle tracking computer code are performed and tolerances to various errors are defined.

  2. Preliminary Numerical and Experimental Analysis of the Spallation Phenomenon

    NASA Technical Reports Server (NTRS)

    Martin, Alexandre; Bailey, Sean C. C.; Panerai, Francesco; Davuluri, Raghava S. C.; Vazsonyi, Alexander R.; Zhang, Huaibao; Lippay, Zachary S.; Mansour, Nagi N.; Inman, Jennifer A.; Bathel, Brett F.; Splinter, Scott C.; Danehy, Paul M.

    2015-01-01

    The spallation phenomenon was studied through numerical analysis using a coupled Lagrangian particle tracking code and a hypersonic aerothermodynamics computational fluid dynamics solver. The results show that carbon emission from spalled particles results in a significant modification of the gas composition of the post-shock layer. Preliminary results from a test campaign at the NASA Langley HYMETS facility are presented. Using automated image processing of high-speed images, two-dimensional velocity vectors of the spalled particles were calculated. In a 30 second test at 100 W/cm2 of cold-wall heat flux, more than 1300 particles were detected, with an average velocity of 102 m/s and a most frequently observed velocity of 60 m/s.

  3. Aerodynamic preliminary analysis system. Part 1: Theory. [linearized potential theory

    NASA Technical Reports Server (NTRS)

    Bonner, E.; Clever, W.; Dunn, K.

    1978-01-01

    A comprehensive aerodynamic analysis program based on linearized potential theory is described. The solution treats thickness and attitude problems at subsonic and supersonic speeds. Three dimensional configurations with or without jet flaps having multiple non-planar surfaces of arbitrary planform and open or closed slender bodies of non-circular contour may be analyzed. Longitudinal and lateral-directional static and rotary derivative solutions may be generated. The analysis was implemented on a time sharing system in conjunction with an input tablet digitizer and an interactive graphics input/output display and editing terminal to maximize its responsiveness to the preliminary analysis problem. Nominal case computation time of 45 CPU seconds on the CDC 175 for a 200 panel simulation indicates the program provides an efficient analysis for systematically performing various aerodynamic configuration tradeoff and evaluation studies.

  4. Helicopter rotor and engine sizing for preliminary performance estimation

    NASA Technical Reports Server (NTRS)

    Talbot, P. D.; Bowles, J. V.; Lee, H. C.

    1986-01-01

    Methods are presented for estimating some of the more fundamental design variables of single-rotor helicopters (tip speed, blade area, disk loading, and installed power) based on design requirements (speed, weight, fuselage drag, and design hover ceiling). The well-known constraints of advancing-blade compressibility and retreating-blade stall are incorporated into the estimation process, based on an empirical interpretation of rotor performance data from large-scale wind-tunnel tests. Engine performance data are presented and correlated with a simple model usable for preliminary design. When approximate results are required quickly, these methods may be more convenient to use and provide more insight than large digital computer programs.
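Sizing methods of this kind build on first-cut relations between disk loading and power; a common one is the momentum-theory ideal induced power in hover. The sketch below and its numbers (thrust, rotor radius) are illustrative assumptions, not taken from the cited report:

```python
import math

# Momentum-theory hover estimate: ideal induced power P = T * sqrt(T / (2 rho A)),
# relating disk loading T/A to power. Illustrative values only, not from the report.
RHO_SL = 1.225  # sea-level air density, kg/m^3


def ideal_hover_power_w(thrust_n: float, disk_area_m2: float,
                        rho: float = RHO_SL) -> float:
    """Ideal (no-loss) induced power in hover, in watts."""
    induced_velocity = math.sqrt(thrust_n / (2.0 * rho * disk_area_m2))
    return thrust_n * induced_velocity


# Example: 45 kN of thrust with a 10 m rotor radius (disk loading ~143 N/m^2).
power = ideal_hover_power_w(45_000.0, math.pi * 10.0 ** 2)
print(round(power / 1000.0, 1))  # ideal hover power in kW
```

Real installed power is higher once the rotor figure of merit, tip losses, and transmission losses are accounted for, which is where empirical corrections like those in the paper come in.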

  5. The engineering design integration (EDIN) system. [digital computer program complex

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Reiners, S. J.

    1974-01-01

    A digital computer program complex for the evaluation of aerospace vehicle preliminary designs is described. The system consists of a Univac 1100 series computer and peripherals using the Exec 8 operating system, a set of demand access terminals of the alphanumeric and graphics types, and a library of independent computer programs. Modification of the partial run streams, data base maintenance and construction, and control of program sequencing are provided by a data manipulation program called the DLG processor. The executive control of library program execution is performed by the Univac Exec 8 operating system through a user established run stream. A combination of demand and batch operations is employed in the evaluation of preliminary designs. Applications accomplished with the EDIN system are described.

  6. The Computer Aided Aircraft-design Package (CAAP)

    NASA Technical Reports Server (NTRS)

    Yalif, Guy U.

    1994-01-01

    The preliminary design of an aircraft is a complex, labor-intensive, and creative process. Since the 1970's, many computer programs have been written to help automate preliminary airplane design. Time and resource analyses have identified 'a substantial decrease in project duration with the introduction of an automated design capability'. Proof-of-concept studies have been completed which establish 'a foundation for a computer-based airframe design capability'. Unfortunately, today's design codes exist in many different languages on many, often expensive, hardware platforms. Through the use of a module-based system architecture, the Computer Aided Aircraft-design Package (CAAP) will eventually bring together many of the most useful features of existing programs. Through the use of an expert system, it will add an additional feature that could be described as indispensable to entry-level engineers and students: the incorporation of 'expert' knowledge into the automated design process.

  7. ERIS: preliminary design phase overview

    NASA Astrophysics Data System (ADS)

    Kuntschner, Harald; Jochum, Lieselotte; Amico, Paola; Dekker, Johannes K.; Kerber, Florian; Marchetti, Enrico; Accardo, Matteo; Brast, Roland; Brinkmann, Martin; Conzelmann, Ralf D.; Delabre, Bernard A.; Duchateau, Michel; Fedrigo, Enrico; Finger, Gert; Frank, Christoph; Rodriguez, Fernando G.; Klein, Barbara; Knudstrup, Jens; Le Louarn, Miska; Lundin, Lars; Modigliani, Andrea; Müller, Michael; Neeser, Mark; Tordo, Sebastien; Valenti, Elena; Eisenhauer, Frank; Sturm, Eckhard; Feuchtgruber, Helmut; George, Elisabeth M.; Hartl, Michael; Hofmann, Reiner; Huber, Heinrich; Plattner, Markus P.; Schubert, Josef; Tarantik, Karl; Wiezorrek, Erich; Meyer, Michael R.; Quanz, Sascha P.; Glauser, Adrian M.; Weisz, Harald; Esposito, Simone; Xompero, Marco; Agapito, Guido; Antichi, Jacopo; Biliotti, Valdemaro; Bonaglia, Marco; Briguglio, Runa; Carbonaro, Luca; Cresci, Giovanni; Fini, Luca; Pinna, Enrico; Puglisi, Alfio T.; Quirós-Pacheco, Fernando; Riccardi, Armando; Di Rico, Gianluca; Arcidiacono, Carmelo; Dolci, Mauro

    2014-07-01

    The Enhanced Resolution Imager and Spectrograph (ERIS) is the next-generation adaptive optics near-IR imager and spectrograph for the Cassegrain focus of the Very Large Telescope (VLT) Unit Telescope 4, which will soon make full use of the Adaptive Optics Facility (AOF). It is a high-Strehl AO-assisted instrument that will use the Deformable Secondary Mirror (DSM) and the new Laser Guide Star Facility (4LGSF). The project has been approved for construction and has entered its preliminary design phase. ERIS will be constructed in a collaboration including the Max-Planck-Institut für Extraterrestrische Physik, the Eidgenössische Technische Hochschule Zürich and the Osservatorio Astrofisico di Arcetri and will offer 1 - 5 μm imaging and 1 - 2.5 μm integral field spectroscopic capabilities with a high Strehl performance. Wavefront sensing can be carried out with an optical high-order NGS Pyramid wavefront sensor, or with a single laser in either an optical low-order NGS mode, or with a near-IR low-order mode sensor. Due to its highly sensitive visible wavefront sensor, and separate near-IR low-order mode, ERIS provides a large sky coverage with its 1' patrol field radius that can even include AO stars embedded in dust-enshrouded environments. As such it will replace, with a much improved single-conjugated AO correction, the most scientifically important imaging modes offered by NACO (diffraction-limited imaging in the J to M bands, Sparse Aperture Masking and Apodizing Phase Plate (APP) coronagraphy) and the integral field spectroscopy modes of SINFONI, whose instrumental module, SPIFFI, will be upgraded and re-used in ERIS. As part of the SPIFFI upgrade a new higher-resolution grating and a science detector replacement are envisaged, as well as PLC-driven motors. To accommodate ERIS at the Cassegrain focus, an extension of the telescope back focal length is required, with modifications of the guider arm assembly. In this paper we report on the status of the

  8. Visualizing ultrasound through computational modeling

    NASA Technical Reports Server (NTRS)

    Guo, Theresa W.

    2004-01-01

    The Doppler Ultrasound Hematocrit Project (DHP) hopes to find non-invasive methods of determining a person's blood characteristics. Because of the limits of microgravity and the space travel environment, it is important to find non-invasive methods of evaluating the health of persons in space. Presently, there is no well-developed method of determining blood composition non-invasively. This project hopes to use ultrasound and Doppler signals to evaluate the characteristic of hematocrit, the percentage by volume of red blood cells within whole blood. These non-invasive techniques may also be developed for use on Earth for trauma patients, where invasive measures might be detrimental. Computational modeling is a useful tool for collecting preliminary information and predictions for the laboratory research. We hope to find and develop a computer program that will be able to simulate the ultrasound signals the project will work with. Simulated models of test conditions will more easily show what might be expected from laboratory results and thus help the research group make informed decisions before and during experimentation. There are several existing Matlab-based computer programs available, designed to interpret and simulate ultrasound signals. These programs will be evaluated to find which is best suited for the project needs. The criteria of evaluation are: 1) the program must be able to specify transducer properties and specify transmitting and receiving signals; 2) the program must be able to simulate ultrasound signals through different attenuating mediums; 3) the program must be able to process moving targets in order to simulate the Doppler effects that are associated with blood flow; 4) the program should be user friendly and adaptable to various models. After a computer program is chosen, two simulation models will be constructed. These models will simulate and interpret an RF data signal and a Doppler signal.
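The Doppler effects mentioned in criterion 3 follow the classical ultrasound relation f_d = 2 v f0 cos(theta) / c. A minimal sketch of that formula, with illustrative parameter values (5 MHz probe, soft-tissue sound speed) that are assumptions, not figures from the project:

```python
import math

# Classical Doppler-ultrasound shift: f_d = 2 * v * f0 * cos(theta) / c.
# Parameter values below are illustrative assumptions, not from the project.
def doppler_shift_hz(f0_hz: float, v_mps: float, angle_deg: float,
                     c_mps: float = 1540.0) -> float:
    """Doppler shift for a scatterer moving at v, insonation angle in degrees."""
    return 2.0 * v_mps * f0_hz * math.cos(math.radians(angle_deg)) / c_mps


# Blood moving at 0.5 m/s, insonated at 60 degrees with a 5 MHz transducer:
print(round(doppler_shift_hz(5e6, 0.5, 60.0)))  # shift of roughly 1.6 kHz
```

Note the angle dependence: at 90 degrees the flow is perpendicular to the beam and the shift vanishes, which is why a moving-target simulator must track insonation geometry.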

  9. Preliminary structural sizing of a Mach 3.0 high-speed civil transport model

    NASA Technical Reports Server (NTRS)

    Blackburn, Charles L.

    1992-01-01

    An analysis has been performed pertaining to the structural resizing of a candidate Mach 3.0 High Speed Civil Transport (HSCT) conceptual design using a computer program called EZDESIT. EZDESIT is a computer program that integrates the PATRAN finite element modeling program with the COMET finite element analysis program for the purpose of calculating element sizes or cross sectional dimensions. The purpose of the present report is to document the procedure used in accomplishing the preliminary structural sizing and to present the corresponding results.

  10. COMSAC: Computational Methods for Stability and Control. Part 1

    NASA Technical Reports Server (NTRS)

    Fremaux, C. Michael (Compiler); Hall, Robert M. (Compiler)

    2004-01-01

    Work on stability and control included the following reports: Introductory Remarks; Introduction to Computational Methods for Stability and Control (COMSAC); Stability & Control Challenges for COMSAC: A NASA Langley Perspective; Emerging CFD Capabilities and Outlook: A NASA Langley Perspective; The Role for Computational Fluid Dynamics for Stability and Control: Is It Time?; Northrop Grumman Perspective on COMSAC; Boeing Integrated Defense Systems Perspective on COMSAC; Computational Methods in Stability and Control: WPAFB Perspective; Perspective: Raytheon Aircraft Company; A Greybeard's View of the State of Aerodynamic Prediction; Computational Methods for Stability and Control: A Perspective; Boeing TacAir Stability and Control Issues for Computational Fluid Dynamics; NAVAIR S&C Issues for CFD; An S&C Perspective on CFD; Issues, Challenges & Payoffs: A Boeing User's Perspective on CFD for S&C; and Stability and Control in Computational Simulations for Conceptual and Preliminary Design: The Past, Today, and Future?

  11. Eye-screen distance monitoring for computer use.

    PubMed

    Eastwood-Sutherland, Caillin; Gale, Timothy J

    2011-01-01

    The extended period many people now spend looking at computer screens is thought to affect eyesight over the long term. In this paper we are concerned with the development and initial evaluation of a wireless camera-based tracking system providing quantitative assessment of computer screen interaction. The system utilizes a stereo camera system and wireless XBee-based infrared markers and enables unobtrusive monitoring. Preliminary results indicate that the system is an excellent method of monitoring eye-screen distance. This type of system will enable future studies of eye-screen distance for computer users. PMID:22254767

  12. Heliogyro Preliminary Design, Phase 2

    NASA Technical Reports Server (NTRS)

    1978-01-01

    There are 12 blades in the Heliogyro design, and each blade is envisioned to be 8 meters in width and 7,500 meters in length. The blades are expected to be composed primarily of a thin membrane constructed of material such as Kapton film with an aluminum reflective coating on one side and an infrared emissive coating on the other. The present Phase 2 Final Report covers work done on the following six topics: (1) Design and analysis of a stowable circular lattice batten for the Heliogyro blade. (2) Design and analysis of a biaxially tensioned blade panel. (3) Definition of a research program for micrometeoroid damage to tendons. (4) A conceptual design for a flight test model of the Heliogyro. (5) Definition of modifications to the NASTRAN computer program required to provide improved analysis of the Heliogyro. (6) A User's Manual covering applications of NASTRAN to the Heliogyro.

  13. The assumptions of computing

    SciTech Connect

    Huggins, J.K.

    1994-12-31

    The use of computers, like any technological activity, is not content-neutral. Users of computers constantly interact with assumptions regarding worthwhile activity which are embedded in any computing system. Directly questioning these assumptions in the context of computing allows us to develop an understanding of responsible computing.

  14. Computers in Teaching English.

    ERIC Educational Resources Information Center

    Davis, James E., Ed.; Davis, Hazel K., Ed.

    1983-01-01

    The 26 articles in this journal issue discuss the use of the computer in the English classroom. Among the topics and applications discussed are (1) computer assisted invention, (2) word processing, (3) overcoming computer anxiety, (4) using computers in technical writing classes, (5) grading student essays by computer, (6) the experiences of an…

  15. The Old Computers' Home.

    ERIC Educational Resources Information Center

    Angier, Natalie

    1983-01-01

    The Computer Museum in Marlborough, Massachusetts houses old and not-so-old calculators, famous old computers and parts of computers, photographs and assorted memorabilia, computer-generated murals, and even a computer made of Tinkertoys that plays tick-tack-toe. The development of the museum and selected exhibits is described. (Author/JN)

  16. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  17. Computers for Everybody.

    ERIC Educational Resources Information Center

    Willis, Jerry; Miller, Merl

    This book explains how computers can be used in the home, office or school, and provides a consumer's guide to computer equipment for the novice user. The first sections of the book offer a brief sketch of computer history, a listing of entertaining and easily available computer programs, a step-by-step guide to buying a computer, and advice on…

  18. GeoComputation 2009

    SciTech Connect

    Xue, Yong; Hoffman, Forrest M; Liu, Dingsheng

    2009-01-01

    The tremendous computing requirements of today's algorithms and the high costs of high-performance supercomputers drive us to share computing resources. The emerging computational Grid technologies are expected to make feasible the creation of a computational environment handling many PetaBytes of distributed data, tens of thousands of heterogeneous computing resources, and thousands of simultaneous users from multiple research institutions.

  19. Tying into Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    Topics in this paper include: sources of computer programs, public domain software, copyright violations, purposes of computers in classrooms (drill/practice and interactive learning), computer assisted instruction, flow charts, and computer clubs (such as App-le-kations in Charlotte, North Carolina). A complete listing of two computer programs…

  20. Computational thinking and thinking about computing.

    PubMed

    Wing, Jeannette M

    2008-10-28

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing.

  1. Computational thinking and thinking about computing

    PubMed Central

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing. PMID:18672462

  2. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  3. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  4. PR Educators Stress Computers.

    ERIC Educational Resources Information Center

    Fleming, Charles A.

    1988-01-01

    Surveys the varied roles computers play in public relations education. Asserts that, because computers are used extensively in the public relations field, students should become acquainted with the varied capabilities of computers and their role in public relations practice. (MM)

  5. On Teaching Computer Programming.

    ERIC Educational Resources Information Center

    Er, M. C.

    1984-01-01

    Points out difficulties associated with teaching introductory computer programming courses, discusses the importance of computer programming, and explains activities associated with its use. Possible solutions to help teachers resolve problem areas in computer instruction are also offered. (ML)

  6. Computers: Instruments of Change.

    ERIC Educational Resources Information Center

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  7. Selecting Appropriate Computing Tools.

    ERIC Educational Resources Information Center

    Tetlow, William L.

    1990-01-01

    Selecting computer tools requires analyzing information requirements and audiences, assessing existing institutional research and computing capacities, creating or improving a planning database, using computer experts, determining software needs, obtaining sufficient resources for independent operations, acquiring quality, and insisting on…

  8. Avoiding Computer Viruses.

    ERIC Educational Resources Information Center

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  9. Computer Viruses: An Overview.

    ERIC Educational Resources Information Center

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  10. Computing technology in the 1980's. [computers

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  11. Preliminary melter performance assessment report

    SciTech Connect

    Elliott, M.L.; Eyler, L.L.; Mahoney, L.A.; Cooper, M.F.; Whitney, L.D.; Shafer, P.J.

    1994-08-01

    The Melter Performance Assessment activity, a component of the Pacific Northwest Laboratory's (PNL) Vitrification Technology Development (PVTD) effort, was designed to determine the impact of noble metals on the operational life of the reference Hanford Waste Vitrification Plant (HWVP) melter. The melter performance assessment consisted of several activities, including a literature review of all work done with noble metals in glass, gradient furnace testing to study the behavior of noble metals during the melting process, research-scale and engineering-scale melter testing to evaluate effects of noble metals on melter operation, and computer modeling that used the experimental data to predict effects of noble metals on the full-scale melter. Feed used in these tests simulated neutralized current acid waste (NCAW) feed. This report summarizes the results of the melter performance assessment and predicts the lifetime of the HWVP melter. It should be noted that this work was conducted before the recent Tri-Party Agreement changes, so the reference melter referred to here is the Defense Waste Processing Facility (DWPF) melter design.

  12. Symbolic and algebraic computation

    SciTech Connect

    Not Available

    1990-01-01

    This book contains subjects under the following headings: Foundations of symbolic computation; Computational logics; Systems; Algorithms on polynomials; Integral and differential equations; and Differential equations.

  13. Computer Lab Configuration.

    ERIC Educational Resources Information Center

    Wodarz, Nan

    2003-01-01

    Describes the layout and elements of an effective school computer lab. Includes configuration, storage spaces, cabling and electrical requirements, lighting, furniture, and computer hardware and peripherals. (PKP)

  14. Computer hardware fault administration

    DOEpatents

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
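
    The fallback routing described above can be sketched as a small graph search: attempt a route on the first network while skipping the defective link, then fall back to the second, independent network. This is a hypothetical illustration of the idea, not the patented implementation; the node topologies and the `route`/`bfs_path` helpers are invented for the example.

```python
from collections import deque

def bfs_path(adj, src, dst, blocked=frozenset()):
    """Breadth-first search for a path from src to dst in adjacency map
    `adj`, skipping any link (undirected edge) listed in `blocked`."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in adj.get(node, []):
            if nxt not in seen and frozenset((node, nxt)) not in blocked:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no route on this network

def route(primary, secondary, src, dst, defective_links):
    """Prefer the primary network; route around defective links by
    falling back to the secondary network when no primary path remains."""
    return (bfs_path(primary, src, dst, defective_links)
            or bfs_path(secondary, src, dst))

# Four compute nodes; the primary link 0-1 is defective.
primary   = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # a chain
secondary = {0: [2], 2: [0, 3], 3: [2, 1], 1: [3]}   # an independent network
print(route(primary, secondary, 0, 1, {frozenset((0, 1))}))  # → [0, 2, 3, 1]
```

    With no defective links, traffic stays on the primary network; only the failure of link 0-1 diverts the message through the secondary network.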

  15. Computational aerodynamics and supercomputers

    NASA Technical Reports Server (NTRS)

    Ballhaus, W. F., Jr.

    1984-01-01

    Some of the progress in computational aerodynamics over the last decade is reviewed. The Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans are described.

  16. Optics in neural computation

    NASA Astrophysics Data System (ADS)

    Levene, Michael John

    In all attempts to emulate the considerable powers of the brain, one is struck by its immense size, parallelism, and complexity. While the fields of neural networks, artificial intelligence, and neuromorphic engineering have all attempted to simplify this considerable complexity, all three can benefit from the inherent scalability and parallelism of optics. This thesis looks at specific aspects of three modes in which optics, and particularly volume holography, can play a part in neural computation. First, holography serves as the basis of highly parallel correlators, which are the foundation of optical neural networks. The huge input capability of optical neural networks makes them most useful for image processing and image recognition and tracking. These tasks benefit from the shift invariance of optical correlators. In this thesis, I analyze the capacity of correlators, and then present several techniques for controlling the amount of shift invariance. Of particular interest is the Fresnel correlator, in which the hologram is displaced from the Fourier plane. In this case, the amount of shift invariance is limited not just by the thickness of the hologram, but by the distance of the hologram from the Fourier plane. Second, volume holography can provide the huge storage capacity and high-speed, parallel read-out necessary to support large artificial intelligence systems. However, previous methods for storing data in volume holograms have relied on awkward beam-steering or on as-yet non-existent cheap, wide-bandwidth, tunable laser sources. This thesis presents a new technique, shift multiplexing, which is capable of very high densities, but which has the advantage of a very simple implementation. In shift multiplexing, the reference wave consists of a focused spot a few millimeters in front of the hologram. Multiplexing is achieved by simply translating the hologram a few tens of microns or less. This thesis describes the theory for how shift

  17. Computed tomography of human joints and radioactive waste drums

    SciTech Connect

    Ashby, E; Bernardi, R; Hollerbach, K; Logan, C; Martz, H; Roberson, G P

    1999-06-01

    X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life-cycle of a product. Two diverse examples of CT are discussed. (1) The computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) They are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A and PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity.

  18. Computed tomography of human joints and radioactive waste drums

    SciTech Connect

    Martz, Harry E.; Roberson, G. Patrick; Hollerbach, Karin; Logan, Clinton M.; Ashby, Elaine; Bernardi, Richard

    1999-12-02

    X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life-cycle of a product. Two diverse examples of CT are discussed, 1.) Our computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. 2.) We are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A and PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity.

  19. Susitna Hydroelectric Project: terrestrial environmental workshop and preliminary simulation model

    USGS Publications Warehouse

    Everitt, Robert R.; Sonntag, Nicholas C.; Auble, Gregory T.; Roelle, James E.; Gazey, William

    1982-01-01

    The technical feasibility, economic viability, and environmental impacts of a hydroelectric development project in the Susitna River Basin are being studied by Acres American, Inc. on behalf of the Alaska Power Authority. As part of these studies, Acres American recently contracted LGL Alaska Research Associates, Inc. to coordinate the terrestrial environmental studies being performed by the Alaska Department of Fish and Game and, as subcontractors to LGL, several University of Alaska research groups. LGL is responsible for further quantifying the potential impacts of the project on terrestrial wildlife and vegetation, and for developing a plan to mitigate adverse impacts on the terrestrial environment. The impact assessment and mitigation plan will be included as part of a license application to the Federal Energy Regulatory Commission (FERC) scheduled for the first quarter of 1983. The quantification of impacts, mitigation planning, and design of future research is being organized using a computer simulation modelling approach. Through a series of workshops attended by researchers, resource managers, and policy-makers, a computer model is being developed and refined for use in the quantification of impacts on terrestrial wildlife and vegetation, and for evaluating different mitigation measures such as habitat enhancement and the designation of replacement lands to be managed as wildlife habitat. This report describes the preliminary model developed at the first workshop held August 23-27, 1982 in Anchorage.

  20. Seismic Hazard Maps for the Maltese Archipelago: Preliminary Results

    NASA Astrophysics Data System (ADS)

    D'Amico, S.; Panzera, F.; Galea, P. M.

    2013-12-01

    The Maltese islands form an archipelago of three major islands lying in the Sicily channel, about 140 km south of Sicily and 300 km north of Libya. So far very few investigations have been carried out on seismicity around the Maltese islands, and no seismic hazard maps for the archipelago are available. Assessing the seismic hazard for the region is currently of prime interest for the near-future development of industrial and touristic facilities as well as for urban expansion. A culture of seismic risk awareness has never really been developed in the country, and the public perception is that the islands are relatively safe and that any earthquake phenomena are mild and infrequent. However, the archipelago has been struck by several moderate-to-large events. Although recent constructions of structural and strategic importance have been built according to high engineering standards, the same probably cannot be said for all residential buildings, many higher than three storeys, which have mushroomed rapidly in recent years. Such buildings are mostly of unreinforced masonry, with heavy concrete floor slabs, which are known to be highly vulnerable to even moderate ground shaking. In this context, planning and design should be based on national hazard maps; unfortunately, such maps are not available for the Maltese islands. In this paper we attempt to compute a first and preliminary probabilistic seismic hazard assessment of the Maltese islands in terms of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at different periods. Seismic hazard has been computed using the Esteva-Cornell (1968) approach, which is the most widely utilized probabilistic method. It is a zone-dependent approach: seismotectonic and geological data are coupled with earthquake catalogues to identify seismogenic zones within which earthquakes occur at certain rates. Therefore the earthquake catalogues can be reduced to the
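
    The Esteva-Cornell approach combines an earthquake recurrence law for each seismogenic zone with a ground-motion model to obtain annual rates of exceeding a given shaking level. The sketch below is a toy, single-source illustration of that idea; the Gutenberg-Richter constants and the attenuation form ln(PGA) = -3.5 + 0.9 M - 1.2 ln(R) are hypothetical placeholders, not values from the Maltese study.

```python
import math

def gutenberg_richter_rate(m, a=3.0, b=1.0, m_min=4.0):
    """Annual rate of events with magnitude >= m under a Gutenberg-Richter
    law, log10 N = a - b*m. Constants a, b, m_min are illustrative only."""
    return 10.0 ** (a - b * max(m, m_min))

def pga_exceedance_rate(pga_g, magnitudes, distance_km):
    """Toy Esteva-Cornell hazard sum over discrete magnitude bins at a
    single source-to-site distance: accumulate the rate of each bin whose
    median PGA (hypothetical attenuation model) exceeds the target pga_g."""
    rate = 0.0
    for i, m in enumerate(magnitudes):
        upper = magnitudes[i + 1] if i + 1 < len(magnitudes) else m + 0.5
        # incremental annual rate of events in this magnitude bin
        nu = gutenberg_richter_rate(m) - gutenberg_richter_rate(upper)
        median_pga = math.exp(-3.5 + 0.9 * m - 1.2 * math.log(distance_km))
        if median_pga >= pga_g:
            rate += nu
    return rate

# Lower PGA thresholds are exceeded more often than higher ones.
mags = [4.0, 5.0, 6.0]
print(pga_exceedance_rate(0.001, mags, 30.0) > pga_exceedance_rate(0.05, mags, 30.0))
```

    A full assessment would also integrate over distance and zone geometry and treat ground-motion variability probabilistically rather than with a median-value cutoff.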

  1. Preliminary design package for Sunair SEC-601 solar collector

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The preliminary design of the Owens-Illinois model Sunair SEC-601 tubular air solar collector is presented. Information in this package includes the subsystem design and development approaches, hazard analysis, and the detailed drawings available at the preliminary design review.

  2. Preliminary designs: passive solar manufactured housing. Technical status report

    SciTech Connect

    Not Available

    1980-05-12

    The criteria established to guide the development of the preliminary designs are listed. Three preliminary designs incorporating direct gain and/or sunspace are presented. Costs, drawings, and supporting calculations are included. (MHR)

  3. 40 CFR 1033.210 - Preliminary approval.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 1033.210 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM LOCOMOTIVES Certifying Engine Families § 1033.210 Preliminary approval. (a) If... determinations for questions related to engine family definitions, auxiliary emission-control...

  4. 40 CFR 1033.210 - Preliminary approval.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 1033.210 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM LOCOMOTIVES Certifying Engine Families § 1033.210 Preliminary approval. (a) If... determinations for questions related to engine family definitions, auxiliary emission-control...

  5. 32 CFR 1908.11 - Preliminary information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Preliminary information. 1908.11 Section 1908.11 National Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PUBLIC REQUESTS FOR MANDATORY DECLASSIFICATION REVIEW OF CLASSIFIED INFORMATION PURSUANT TO § 3.6 OF EXECUTIVE...

  6. 32 CFR 1908.11 - Preliminary information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Preliminary information. 1908.11 Section 1908.11 National Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PUBLIC REQUESTS FOR MANDATORY DECLASSIFICATION REVIEW OF CLASSIFIED INFORMATION PURSUANT TO § 3.6 OF EXECUTIVE...

  7. 29 CFR 1955.31 - Preliminary conference.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 9 2012-07-01 2012-07-01 false Preliminary conference. 1955.31 Section 1955.31 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR... those not disposed of by admissions or agreements, and control the subsequent course of the...

  8. 29 CFR 1955.31 - Preliminary conference.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Preliminary conference. 1955.31 Section 1955.31 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR... those not disposed of by admissions or agreements, and control the subsequent course of the...

  9. 29 CFR 1955.31 - Preliminary conference.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 9 2013-07-01 2013-07-01 false Preliminary conference. 1955.31 Section 1955.31 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR... those not disposed of by admissions or agreements, and control the subsequent course of the...

  10. 32 CFR 1700.4 - Preliminary information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... receive a FOIA request shall expeditiously forward the request to the Director, Information Management... 32 National Defense 6 2010-07-01 2010-07-01 false Preliminary information. 1700.4 Section 1700.4... INTELLIGENCE PROCEDURES FOR DISCLOSURE OF RECORDS PURSUANT TO THE FREEDOM OF INFORMATION ACT §...

  11. 32 CFR 1901.11 - Preliminary information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Preliminary information. 1901.11 Section 1901.11 National Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PUBLIC RIGHTS... outstanding fees for information services at this or other federal agencies will not be accepted and action...

  12. 32 CFR 1901.11 - Preliminary information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Preliminary information. 1901.11 Section 1901.11 National Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PUBLIC RIGHTS... outstanding fees for information services at this or other federal agencies will not be accepted and action...

  13. 32 CFR 1901.11 - Preliminary information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 6 2012-07-01 2012-07-01 false Preliminary information. 1901.11 Section 1901.11 National Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PUBLIC RIGHTS... outstanding fees for information services at this or other federal agencies will not be accepted and action...

  14. 32 CFR 1901.11 - Preliminary information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 6 2014-07-01 2014-07-01 false Preliminary information. 1901.11 Section 1901.11 National Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PUBLIC RIGHTS... outstanding fees for information services at this or other federal agencies will not be accepted and action...

  15. 32 CFR 1901.11 - Preliminary information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Preliminary information. 1901.11 Section 1901.11 National Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PUBLIC RIGHTS... outstanding fees for information services at this or other federal agencies will not be accepted and action...

  16. 15 CFR 270.101 - Preliminary reconnaissance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NATIONAL CONSTRUCTION SAFETY TEAMS NATIONAL CONSTRUCTION SAFETY TEAMS Establishment and Deployment of Teams § 270.101 Preliminary reconnaissance. (a) To... the site of a building failure. The Director may establish and deploy a Team to conduct...

  17. Two Gifted Rapid Readers--Preliminary Study.

    ERIC Educational Resources Information Center

    Schale, Florence C.

    The "page-at-a-glance" reading phenomenon in two gifted readers using only monocular vision was investigated. The specific questions to be answered in this preliminary study were (1) What is the average duration of fixations made by gifted readers while reading a somewhat familiar article? and (2) What degree of comprehension on materials of…

  18. 18 CFR 154.105 - Preliminary statement.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Preliminary statement. 154.105 Section 154.105 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER NATURAL GAS ACT RATE SCHEDULES AND TARIFFS Form and Composition...

  19. 18 CFR 154.105 - Preliminary statement.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Preliminary statement. 154.105 Section 154.105 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER NATURAL GAS ACT RATE SCHEDULES AND TARIFFS Form and Composition...

  20. 40 CFR 1042.210 - Preliminary approval.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... specific provisions that apply for deterioration factors. Decisions made under this section are considered... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Preliminary approval. 1042.210 Section 1042.210 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION...

  1. Preliminary LC Records for Monographs in OCLC.

    ERIC Educational Resources Information Center

    Preece, Barbara G.; Fox, Mary Anne

    1992-01-01

    Discusses the decision by Library of Congress (LC) to include preliminary cataloging records for monographs as part of its tape distribution service. The records' impact on work flow in a research library's cataloging department that uses OCLC is described, and a survey of OCLC/ARL (Association of Research Libraries) members is discussed. (eight…

  2. Preliminary Safety Analysis for the IRIS Reactor

    SciTech Connect

    Ricotti, M.E.; Cammi, A.; Cioncolini, A.; Lombardi, C.; Cipollaro, A.; Orioto, F.; Conway, L.E.; Barroso, A.C.

    2002-07-01

A deterministic analysis of the IRIS safety features has been carried out by means of the best-estimate code RELAP (ver. RELAP5 mod3.2). First, the main system components were modeled and tested separately, namely: the Reactor Pressure Vessel (RPV), the modular helical-coil Steam Generators (SG) and the Passive (natural circulation) Emergency Heat Removal System (PEHRS). Then, a preliminary set of accident transients for the whole primary and safety systems was investigated. Since the project was in a conceptual phase, the reported analyses must be considered preliminary. In fact, neither the reactor components, nor the safety systems and the reactor signal logics were completely defined at that time. Three 'conventional' design basis accidents have been preliminarily evaluated: a Loss Of primary Flow Accident, a Loss Of Coolant Accident and a Loss Of Feed Water accident. The results show the effectiveness of the safety systems even in LOCA conditions; the core remains covered for the required grace period. This provides the basis to move forward to the preliminary design. (authors)

  3. 5 CFR 838.1007 - Preliminary review.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... (CONTINUED) COURT ORDERS AFFECTING RETIREMENT BENEFITS Court Orders Affecting Civil Service Retirement Benefits § 838.1007 Preliminary review. (a)(1) Upon receipt of a court order and documentation required by... OPM has received the court order and documentation. The court order and documentation will be...

  4. 5 CFR 838.1007 - Preliminary review.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (CONTINUED) COURT ORDERS AFFECTING RETIREMENT BENEFITS Court Orders Affecting Civil Service Retirement Benefits § 838.1007 Preliminary review. (a)(1) Upon receipt of a court order and documentation required by... OPM has received the court order and documentation. The court order and documentation will be...

  5. 5 CFR 838.1007 - Preliminary review.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... (CONTINUED) COURT ORDERS AFFECTING RETIREMENT BENEFITS Court Orders Affecting Civil Service Retirement Benefits § 838.1007 Preliminary review. (a)(1) Upon receipt of a court order and documentation required by... OPM has received the court order and documentation. The court order and documentation will be...

  6. 23 CFR 645.109 - Preliminary engineering.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... regularly for the utility in its own work and that the costs are reasonable. (c) The procedures in 23 CFR... 23 Highways 1 2011-04-01 2011-04-01 false Preliminary engineering. 645.109 Section 645.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC...

  7. 23 CFR 645.109 - Preliminary engineering.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... regularly for the utility in its own work and that the costs are reasonable. (c) The procedures in 23 CFR... 23 Highways 1 2010-04-01 2010-04-01 false Preliminary engineering. 645.109 Section 645.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC...

  8. Preliminary tests of the electrostatic plasma accelerator

    NASA Technical Reports Server (NTRS)

    Aston, G.; Acker, T.

    1990-01-01

    This report describes the results of a program to verify an electrostatic plasma acceleration concept and to identify those parameters most important in optimizing an Electrostatic Plasma Accelerator (EPA) thruster based upon this thrust mechanism. Preliminary performance measurements of thrust, specific impulse and efficiency were obtained using a unique plasma exhaust momentum probe. Reliable EPA thruster operation was achieved using one power supply.

  9. 43 CFR 2450.2 - Preliminary determination.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Preliminary determination. 2450.2 Section 2450.2 Public Lands: Interior Regulations Relating to Public Lands (Continued) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) PETITION-APPLICATION...

  10. Preliminary Findings on Rural Homelessness in Ohio.

    ERIC Educational Resources Information Center

    First, Richard J.; And Others

    This report is designed to present preliminary findings from the first comprehensive study of rural homelessness in the United States. The study was conducted during the first 6 months of 1990, and data were collected from interviews with 921 homeless adults in 21 randomly selected rural counties in Ohio. The sample counties represent 26% of the…

  11. Plutonium Immobilization Rack and Magazine Preliminary Design

    SciTech Connect

    Stokes, M.W.

    1998-12-11

    The purpose of this report is to document our current preliminary design for the Can-in-Canister rack and magazine. Since this is a developmental project with testing still ongoing, these designs will probably change as we become more knowledgeable of the functions, reliability, and cost of these designs.

  12. Preliminary radiation shielding design for BOOMERANG

    SciTech Connect

    Donahue, Richard J.

    2002-10-23

    Preliminary radiation shielding specifications are presented here for the 3 GeV BOOMERANG Australian synchrotron light source project. At this time the bulk shield walls for the storage ring and injection system (100 MeV Linac and 3 GeV Booster) are considered for siting purposes.

  13. Polarized electrons in ELSA (preliminary results)

    NASA Astrophysics Data System (ADS)

    Nakamura, S.; von Drachenfels, W.; Durek, D.; Frommberger, F.; Hoffmann, M.; Husmann, D.; Kiel, B.; Klein, F. J.; Menze, D.; Michel, T.; Nakanishi, T.; Naumann, J.; Reichelt, T.; Steier, C.; Toyama, T.; Voigt, S.; Westermann, M.

    1998-01-01

Polarized electrons have been accelerated in the electron stretcher accelerator ELSA for the first time. Up to 2.1 GeV the polarization of the electron beam supplied by the 120 keV polarized electron source has been measured with a Møller polarimeter. Preliminary results of polarization measurements at high energies and the performance of the source are presented.

  14. 40 CFR 161.170 - Preliminary analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Preliminary analysis. 161.170 Section 161.170 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements §...

  15. 40 CFR 161.170 - Preliminary analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Preliminary analysis. 161.170 Section 161.170 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements §...

  16. 33 CFR 116.15 - Preliminary investigation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Investigation, to the Administrator, Bridge Administration Program. (b) The Preliminary Investigation Report will include a description of the nature and extent of the obstruction, the alterations to the bridge believed necessary to meet the reasonable needs of existing and future navigation, the type and volume...

  17. 33 CFR 116.15 - Preliminary investigation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Investigation, to the Administrator, Office of Bridge Programs. (b) The Preliminary Investigation Report will include a description of the nature and extent of the obstruction, the alterations to the bridge believed necessary to meet the reasonable needs of existing and future navigation, the type and volume of...

  18. 33 CFR 116.15 - Preliminary investigation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Investigation, to the Administrator, Office of Bridge Programs. (b) The Preliminary Investigation Report will include a description of the nature and extent of the obstruction, the alterations to the bridge believed necessary to meet the reasonable needs of existing and future navigation, the type and volume of...

  19. 33 CFR 116.15 - Preliminary investigation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Investigation, to the Administrator, Bridge Administration Program. (b) The Preliminary Investigation Report will include a description of the nature and extent of the obstruction, the alterations to the bridge believed necessary to meet the reasonable needs of existing and future navigation, the type and volume...

  20. 33 CFR 116.15 - Preliminary investigation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Investigation, to the Administrator, Office of Bridge Programs. (b) The Preliminary Investigation Report will include a description of the nature and extent of the obstruction, the alterations to the bridge believed necessary to meet the reasonable needs of existing and future navigation, the type and volume of...

  1. 32 CFR 1900.11 - Preliminary Information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TO CIA RECORDS UNDER THE FREEDOM OF INFORMATION ACT (FOIA) Filing of Foia Requests § 1900.11 Preliminary Information. Members of the public shall address all communications to the CIA Coordinator as specified at 32 CFR 1900.03 and clearly delineate the communication as a request under the Freedom...

  2. 29 CFR 18.104 - Preliminary questions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... OF ADMINISTRATIVE LAW JUDGES Rules of Evidence § 18.104 Preliminary questions. (a) Questions of... existence of a privilege, or the admissibility of evidence shall be determined by the judge, subject to the... of evidence except those with respect to privileges. (b) Relevance conditioned on fact. When...

  3. 23 CFR 645.109 - Preliminary engineering.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... regularly for the utility in its own work and that the costs are reasonable. (c) The procedures in 23 CFR... 23 Highways 1 2012-04-01 2012-04-01 false Preliminary engineering. 645.109 Section 645.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC...

  4. 23 CFR 645.109 - Preliminary engineering.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... regularly for the utility in its own work and that the costs are reasonable. (c) The procedures in 23 CFR... 23 Highways 1 2013-04-01 2013-04-01 false Preliminary engineering. 645.109 Section 645.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC...

  5. 23 CFR 645.109 - Preliminary engineering.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... regularly for the utility in its own work and that the costs are reasonable. (c) The procedures in 23 CFR... 23 Highways 1 2014-04-01 2014-04-01 false Preliminary engineering. 645.109 Section 645.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC...

  6. 32 CFR 1700.4 - Preliminary information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... receive a FOIA request shall expeditiously forward the request to the Director, Information Management... 32 National Defense 6 2013-07-01 2013-07-01 false Preliminary information. 1700.4 Section 1700.4... INTELLIGENCE PROCEDURES FOR DISCLOSURE OF RECORDS PURSUANT TO THE FREEDOM OF INFORMATION ACT §...

  7. 32 CFR 1700.4 - Preliminary information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... receive a FOIA request shall expeditiously forward the request to the Director, Information Management... 32 National Defense 6 2014-07-01 2014-07-01 false Preliminary information. 1700.4 Section 1700.4... INTELLIGENCE PROCEDURES FOR DISCLOSURE OF RECORDS PURSUANT TO THE FREEDOM OF INFORMATION ACT §...

  8. 32 CFR 1700.4 - Preliminary information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... receive a FOIA request shall expeditiously forward the request to the Director, Information Management... 32 National Defense 6 2012-07-01 2012-07-01 false Preliminary information. 1700.4 Section 1700.4... INTELLIGENCE PROCEDURES FOR DISCLOSURE OF RECORDS PURSUANT TO THE FREEDOM OF INFORMATION ACT §...

  9. Graphics Flutter Analysis Methods, an interactive computing system at Lockheed-California Company

    NASA Technical Reports Server (NTRS)

    Radovcich, N. A.

    1975-01-01

    An interactive computer graphics system, Graphics Flutter Analysis Methods (GFAM), was developed to complement FAMAS, a matrix-oriented batch computing system, and other computer programs in performing complex numerical calculations using a fully integrated data management system. GFAM has many of the matrix operation capabilities found in FAMAS, but on a smaller scale, and is utilized when the analysis requires a high degree of interaction between the engineer and computer, and schedule constraints exclude the use of batch entry programs. Applications of GFAM to a variety of preliminary design, development design, and project modification programs suggest that interactive flutter analysis using matrix representations is a feasible and cost effective computing tool.

  10. Computational physics with PetaFlops computers

    NASA Astrophysics Data System (ADS)

    Attig, Norbert

    2009-04-01

    Driven by technology, Scientific Computing is rapidly entering the PetaFlops era. The Jülich Supercomputing Centre (JSC), one of three German national supercomputing centres, is focusing on the IBM Blue Gene architecture to provide computer resources of this class to its users, the majority of whom are computational physicists. Details of the system will be discussed and applications will be described which significantly benefit from this new architecture.

  11. 10 CFR 830.206 - Preliminary documented safety analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Preliminary documented safety analysis. 830.206 Section... Preliminary documented safety analysis. If construction begins after December 11, 2000, the contractor... category 1, 2, or 3 DOE nuclear facility must: (a) Prepare a preliminary documented safety analysis for...

  12. 10 CFR 830.206 - Preliminary documented safety analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Preliminary documented safety analysis. 830.206 Section... Preliminary documented safety analysis. If construction begins after December 11, 2000, the contractor... category 1, 2, or 3 DOE nuclear facility must: (a) Prepare a preliminary documented safety analysis for...

  13. 37 CFR 1.482 - International preliminary examination fees.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false International preliminary... OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES International Processing Provisions International Preliminary Examination § 1.482 International preliminary examination fees. (a)...

  14. 37 CFR 1.482 - International preliminary examination fees.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false International preliminary... OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES International Processing Provisions International Preliminary Examination § 1.482 International preliminary examination fees. (a)...

  15. 37 CFR 1.482 - International preliminary examination fees.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2012-07-01 2012-07-01 false International preliminary... OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES International Processing Provisions International Preliminary Examination § 1.482 International preliminary examination fees. (a)...

  16. 32 CFR 644.30 - Preliminary real estate work.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 4 2014-07-01 2013-07-01 true Preliminary real estate work. 644.30 Section 644... § 644.30 Preliminary real estate work. (a) Preliminary real estate work is defined as that action taken... real estate work on Army projects will be conducted as soon as design has progressed to the point...

  17. 32 CFR 644.30 - Preliminary real estate work.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 4 2013-07-01 2013-07-01 false Preliminary real estate work. 644.30 Section 644... § 644.30 Preliminary real estate work. (a) Preliminary real estate work is defined as that action taken... real estate work on Army projects will be conducted as soon as design has progressed to the point...

  18. 32 CFR 644.30 - Preliminary real estate work.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 4 2012-07-01 2011-07-01 true Preliminary real estate work. 644.30 Section 644... § 644.30 Preliminary real estate work. (a) Preliminary real estate work is defined as that action taken... real estate work on Army projects will be conducted as soon as design has progressed to the point...

  19. Computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An overview of computational fluid dynamics (CFD) activities at the Langley Research Center is given. The role of supercomputers in CFD research, algorithm development, multigrid approaches to computational fluid flows, aerodynamics computer programs, computational grid generation, turbulence research, and studies of rarefied gas flows are among the topics that are briefly surveyed.

  20. Computer-Based Learning.

    ERIC Educational Resources Information Center

    Brown, Peggy, Ed.

    1981-01-01

    Three essays on the ways in which colleges and universities use the computer as a teaching tool are presented, along with descriptions of 10 school programs that reflect the diversity of computer applications across the United States. In "A Place for Computing in Liberal Education," Karl L. Zinn likens the computer to personal resource tools, such…

  1. Computers and Employment.

    ERIC Educational Resources Information Center

    McConnell, Sheila; And Others

    1996-01-01

    Includes "Role of Computers in Reshaping the Work Force" (McConnell); "Semiconductors" (Moris); "Computer Manufacturing" (Warnke); "Commercial Banking Transformed by Computer Technology" (Morisi); "Software, Engineering Industries: Threatened by Technological Change?" (Goodman); "Job Creation and the Emerging Home Computer Market" (Freeman); and…

  2. Computer Viruses. Technology Update.

    ERIC Educational Resources Information Center

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

This document provides general information on computer viruses, how to help protect a computer network from them, and measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  3. My Computer Romance

    ERIC Educational Resources Information Center

    Campbell, Gardner

    2007-01-01

    In this article, the author relates the big role of computers in his life as a writer. The author narrates that he has been using a computer for nearly twenty years now. He relates that computers has set his writing free. When he started writing, he was just using an electric typewriter. He also relates that his romance with computers is also a…

  4. Overview 1993: Computational applications

    NASA Technical Reports Server (NTRS)

    Benek, John A.

    1993-01-01

    Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.

  5. Computer Innovations in Education.

    ERIC Educational Resources Information Center

    Molnar, Andrew R.

    Computers in education are put in context by a brief review of current social and technological trends, a short history of the development of computers and the vast expansion of their use, and a brief description of computers and their use. Further chapters describe instructional applications, administrative uses, uses of computers for libraries…

  6. Elementary School Computer Literacy.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents lessons for computer literacy instruction in the elementary grades. The first section of the guide includes 22 lessons on hardware, covering such topics as how computers work, keyboarding, word processing, and computer peripherals. The 13 lessons in the second section cover social topics related to the computer,…

  7. The Computer Manpower Evolution

    ERIC Educational Resources Information Center

    Rooney, Joseph J.

    1975-01-01

    Advances and employment outlook in the field of computer science are discussed as well as the problems related to improving the quality of computer education. Specific computer jobs discussed include: data processing machine repairers, systems analysts, programmers, computer and peripheral equipment operators, and keypunch operators. (EA)

  8. The Glass Computer

    ERIC Educational Resources Information Center

    Paesler, M. A.

    2009-01-01

    Digital computers use different kinds of memory, each of which is either volatile or nonvolatile. On most computers only the hard drive memory is nonvolatile, i.e., it retains all information stored on it when the power is off. When a computer is turned on, an operating system stored on the hard drive is loaded into the computer's memory cache and…

  9. How Computer Graphics Work.

    ERIC Educational Resources Information Center

    Prosise, Jeff

    This document presents the principles behind modern computer graphics without straying into the arcane languages of mathematics and computer science. Illustrations accompany the clear, step-by-step explanations that describe how computers draw pictures. The 22 chapters of the book are organized into 5 sections. "Part 1: Computer Graphics in…

  10. Computers and Conceptual Change.

    ERIC Educational Resources Information Center

    Olson, John

    A systematic study was conducted with a group of 10- to 12-year-olds using computer assisted instruction in a unit on fire which included a computer simulation of combustion. Three research questions were addressed to learn more about how the computer experience challenged the students' preconceptions: what the students thought the computer knew,…

  11. French Computer Terminology.

    ERIC Educational Resources Information Center

    Gray, Eugene F.

    1985-01-01

    Characteristics, idiosyncrasies, borrowings, and other aspects of the French terminology for computers and computer-related matters are discussed and placed in the context of French computer use. A glossary provides French equivalent terms or translations of English computer terminology. (MSE)

  12. The Story of Computers.

    ERIC Educational Resources Information Center

    Meadow, Charles T.

    The aim of this book is to interest young people from the ages of ten to fourteen in computers, particularly to show them that computers are exciting machines, controlled by people, and to dispel myths that computers can do magic. This is not a detailed exposition of computers, nor is it a textbook. It is an attempt to impart flavor and general…

  13. COMPSIZE - PRELIMINARY DESIGN METHOD FOR FIBER REINFORCED COMPOSITE STRUCTURES

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1994-01-01

    The Composite Structure Preliminary Sizing program, COMPSIZE, is an analytical tool which structural designers can use when doing approximate stress analysis to select or verify preliminary sizing choices for composite structural members. It is useful in the beginning stages of design concept definition, when it is helpful to have quick and convenient approximate stress analysis tools available so that a wide variety of structural configurations can be sketched out and checked for feasibility. At this stage of the design process the stress/strain analysis does not need to be particularly accurate because any configurations tentatively defined as feasible will later be analyzed in detail by stress analysis specialists. The emphasis is on fast, user-friendly methods so that rough but technically sound evaluation of a broad variety of conceptual designs can be accomplished. Analysis equations used are, in most cases, widely known basic structural analysis methods. All the equations used in this program assume elastic deformation only. The default material selection is intermediate strength graphite/epoxy laid up in a quasi-isotropic laminate. A general flat laminate analysis subroutine is included for analyzing arbitrary laminates. However, COMPSIZE should be sufficient for most users to presume a quasi-isotropic layup and use the familiar basic structural analysis methods for isotropic materials, after estimating an appropriate elastic modulus. Homogeneous materials can be analyzed as simplified cases. The COMPSIZE program is written in IBM BASICA. The program format is interactive. It was designed on an IBM Personal Computer operating under DOS with a central memory requirement of approximately 128K. It has been implemented on an IBM compatible with GW-BASIC under DOS 3.2. COMPSIZE was developed in 1985.
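The approximate elastic sizing approach described in the COMPSIZE abstract (treat a quasi-isotropic laminate as an equivalent isotropic material, then apply basic isotropic structural formulas) can be sketched as follows. The function name, load, geometry, and material values here are illustrative assumptions for a simple uniaxial panel check, not values or code from COMPSIZE itself.

```python
# Toy elastic sizing check in the spirit of the abstract: model a
# quasi-isotropic composite panel as an equivalent isotropic material
# with an estimated modulus, then compare strain to an allowable.
# All numbers below are hypothetical placeholders.

def axial_margin(load_n, width_m, thickness_m, e_modulus_pa, allow_strain):
    """Return (strain, margin of safety) for a panel under uniaxial load,
    assuming purely elastic deformation and uniform stress."""
    area = width_m * thickness_m
    stress = load_n / area            # uniform axial stress
    strain = stress / e_modulus_pa    # Hooke's law, elastic only
    margin = allow_strain / strain - 1.0
    return strain, margin

# Example: 50 kN on a 200 mm x 4 mm panel, rough quasi-isotropic
# graphite/epoxy modulus of 55 GPa, 0.4% strain allowable.
strain, ms = axial_margin(
    load_n=50e3, width_m=0.2, thickness_m=0.004,
    e_modulus_pa=55e9, allow_strain=0.004)
```

A positive margin (`ms > 0`) marks the tentative sizing as feasible for later detailed analysis, matching the "quick feasibility check" role the abstract describes.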

  14. Smart isolation mount for army guns: I. Preliminary results

    NASA Astrophysics Data System (ADS)

    Allaei, Daryoush; Tarnowski, David J.; Mattice, Michael S.; Testa, Robert C.

    2000-06-01

The work reported in this paper is focused on an effective and efficient solution, namely Smart Isolation Mount for Army Guns (SIMAG), to the weapon stabilization and fire control issues facing US Army guns. SIMAG is composed of the optimum integration of two innovative technologies: Vibration Control by Confinement and smart sensor/actuator/active control systems. The combined approach may also be applied to a gun barrel to reduce its undesired vibratory motions excited by external and internal disturbances, such as gun firing action. SIMAG reconfigures the distribution and propagation of excess vibration energy and confines vibrations to certain non-critical regions or modes within a structure. Concentrated passive, active, or smart damping elements or cancellation techniques may be applied to more effectively dissipate or cancel the trapped vibrations and to prevent build-up in the assembly. As the active elements, an array of collocated, PZT-based sensor-actuator sets is recommended for incorporation in SIMAG. Part of the active elements is used for spatially managing excess vibration energy while the other part is utilized for energy dissipation and cancellation. The preliminary result of our feasibility work on the SIMAG concept is demonstrated via computer simulations. It is shown that the insertion of a preliminary version of SIMAG in a 30mm gun system onboard an attack helicopter reduces the fluctuating loads and deformations measured across the helicopter bottom shell by 40 to 50 percent. SIMAG makes significant progress towards solving the fire control problems with affordable weight and power penalties by compensating for all errors in one of the two places, the turret-aircraft interface or gun barrel. Even though the initial target application of SIMAG is airborne guns, a modified version can be incorporated into ground armor, such as tanks and Humvees.

  15. Parallel computing works

    SciTech Connect

    Not Available

    1991-10-23

An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  16. Advanced Free Flight Planner and Dispatcher's Workstation: Preliminary Design Specification

    NASA Technical Reports Server (NTRS)

    Wilson, J.; Wright, C.; Couluris, G. J.

    1997-01-01

The National Aeronautics and Space Administration (NASA) has implemented the Advanced Air Transportation Technology (AATT) program to investigate future improvements to the national and international air traffic management systems. This research, as part of the AATT program, developed preliminary design requirements for an advanced Airline Operations Control (AOC) dispatcher's workstation, with emphasis on flight planning. This design will support the implementation of an experimental workstation in NASA laboratories that would emulate AOC dispatch operations. The work developed an airline flight plan data base and specified requirements for: a computer tool for generation and evaluation of free flight, user preferred trajectories (UPT); the kernel of an advanced flight planning system to be incorporated into the UPT-generation tool; and an AOC workstation to house the UPT-generation tool and to provide a real-time testing environment. A prototype for the advanced flight plan optimization kernel was developed and demonstrated. The flight planner uses dynamic programming to search a four-dimensional wind and temperature grid to identify the optimal route, altitude and speed for successive segments of a flight. An iterative process is employed in which a series of trajectories are successively refined until the UPT is identified. The flight planner is designed to function in the current operational environment as well as in free flight. The free flight environment would enable greater flexibility in UPT selection based on alleviation of current procedural constraints. The prototype also takes advantage of advanced computer processing capabilities to implement more powerful optimization routines than would be possible with older computer systems.
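The dynamic-programming segment search described in this abstract can be illustrated with a toy version: choose an altitude for each successive flight segment so that the accumulated segment cost plus altitude-change penalties is minimized. The function name, the cost table, and the scalar climb penalty are hypothetical illustrations, not the planner's actual four-dimensional wind and temperature grid.

```python
# Minimal sketch of a dynamic-programming route search: for each
# segment, keep the cheapest plan ending at each candidate altitude,
# extending the previous segment's bests (a toy stand-in for the
# planner's 4-D wind/temperature grid search; names are hypothetical).

def plan_route(seg_cost, climb_cost):
    """seg_cost[s][a]: cost of flying segment s at altitude level a.
    climb_cost: penalty per level of altitude change between segments.
    Returns (total_cost, altitude_level_per_segment)."""
    n_seg, n_alt = len(seg_cost), len(seg_cost[0])
    # best[a] = (cheapest cost of a plan ending at altitude a, its path)
    best = [(seg_cost[0][a], [a]) for a in range(n_alt)]
    for s in range(1, n_seg):
        new_best = []
        for a in range(n_alt):
            # pick the predecessor altitude minimizing accumulated cost
            prev_cost, prev_path = min(
                (best[p][0] + climb_cost * abs(p - a), best[p][1])
                for p in range(n_alt))
            new_best.append((prev_cost + seg_cost[s][a], prev_path + [a]))
        best = new_best
    return min(best)

# Toy table: segment 1 is expensive at the low altitude (e.g. headwind),
# so the optimal plan climbs after the first segment.
costs = [[3, 4], [5, 2], [3, 3]]
total, alts = plan_route(costs, climb_cost=1)
```

Successive refinement, as in the abstract, would rerun this search on a finer grid around the current best trajectory until the user preferred trajectory converges.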

  17. (Computer vision and robotics)

    SciTech Connect

    Jones, J.P.

    1989-02-13

    The traveler attended the Fourth Aalborg International Symposium on Computer Vision at Aalborg University, Aalborg, Denmark. The traveler presented three invited lectures entitled "Concurrent Computer Vision on a Hypercube Multicomputer," "The Butterfly Accumulator and its Application in Concurrent Computer Vision on Hypercube Multicomputers," and "Concurrency in Mobile Robotics at ORNL," and a ten-minute editorial entitled "Is Concurrency an Issue in Computer Vision?" The traveler obtained information on current R&D efforts elsewhere in concurrent computer vision.

  18. Preliminary noise tradeoff study of a Mach 2.7 cruise aircraft

    NASA Technical Reports Server (NTRS)

    Mascitti, V. R.; Maglieri, D. J. (Editor); Raney, J. P. (Editor)

    1979-01-01

    NASA computer codes in the areas of preliminary sizing and enroute performance, takeoff and landing performance, aircraft noise prediction, and economics were used in a preliminary noise tradeoff study for a Mach 2.7 design supersonic cruise concept. Aerodynamic configuration data were based on wind-tunnel model tests and related analyses. Aircraft structural characteristics and weight were based on advanced structural design methodologies, assuming conventional titanium technology. The most advanced noise prediction techniques available were used, and aircraft operating costs were estimated using accepted industry methods. The four engine cycles included in the study were based on assumed 1985 technology levels. Propulsion data were provided by aircraft manufacturers. Additional empirical data are needed to define both noise reduction features and other operating characteristics of all engine cycles under study. Data on VCE design parameters, coannular nozzle inverted-flow noise reduction, and advanced mechanical suppressors are urgently needed to reduce the present uncertainties in studies of this type.

  19. Preliminary design studies for the DESCARTES and CIDER codes. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Eslinger, P.W.; Miley, T.B.; Ouderkirk, S.J.; Nichols, W.E.

    1992-12-01

    The Hanford Environmental Dose Reconstruction (HEDR) project is developing several computer codes to model the release and transport of radionuclides into the environment. This preliminary design addresses two of these codes: Dynamic Estimates of Concentrations and Radionuclides in Terrestrial Environments (DESCARTES) and Calculation of Individual Doses from Environmental Radionuclides (CIDER). The DESCARTES code will be used to estimate the concentration of radionuclides in environmental pathways, given the output of the air transport code HATCHET. The CIDER code will use information provided by DESCARTES to estimate the dose received by an individual. This document reports on preliminary design work performed by the code development team to determine if the requirements could be met for DESCARTES and CIDER. The document contains three major sections: (i) a data flow diagram and discussion for DESCARTES, (ii) a data flow diagram and discussion for CIDER, and (iii) a series of brief statements regarding the design approach required to address each code requirement.
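    The data flow described above (air transport output → pathway concentration → individual dose) can be sketched as a two-stage pipeline. Every number and coefficient below is an illustrative placeholder, not a HEDR project value, and the function names are hypothetical.

```python
# Hypothetical sketch of the DESCARTES -> CIDER data flow: an air-transport
# deposition estimate feeds a pathway concentration, which in turn feeds an
# individual dose estimate. All values and coefficients are illustrative.

def descartes_concentration(air_deposition, transfer_factor):
    """Estimate a pathway concentration (e.g. Bq/kg) from deposition."""
    return air_deposition * transfer_factor

def cider_dose(concentration, intake_rate, dose_coefficient):
    """Estimate an individual dose (Sv) from a pathway concentration."""
    return concentration * intake_rate * dose_coefficient

conc = descartes_concentration(air_deposition=200.0, transfer_factor=0.05)
dose = cider_dose(conc, intake_rate=0.5, dose_coefficient=1e-8)
```

    The separation mirrors the two codes: environmental transport is resolved once, and individual dose scenarios are then evaluated against the resulting concentrations.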

  20. Preliminary Assessment of a Neptune Aerocapture Mission Using an Integrated Design Tool

    NASA Technical Reports Server (NTRS)

    Gage, Peter J.; Wercinski, Paul F.

    1998-01-01

    Aerocapture is an efficient orbit insertion technique that uses the planet's atmosphere to decelerate an arriving spacecraft. With current technology and for vehicles of reasonable mass, it is the only technique that might deliver the high delta-Vs required for insertion into orbits around the outer planets. Preliminary design studies for outer planet orbital missions must evaluate aerocapture strategies, and must therefore consider the coupling between vehicle geometry, aerodynamics, aerocapture trajectory, heating, and thermal protection system mass. These analyses have been linked into an integrated design environment, with the critical parameters grouped in a global database. The designer is free to use single-point evaluations, parametric variation, and numerical optimization to evaluate a range of vehicle shapes and insertion trajectories. The application of this design tool to a preliminary study for Neptune aerocapture has implications for the use of such computational environments for any atmospheric entry mission.
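    The coupled evaluation described above (geometry → aerodynamics → heating → thermal protection mass, driven by a shared parameter set) can be sketched as a simple parametric sweep. The formulas, thresholds, and parameter names here are illustrative stand-ins, not the NASA tool's models.

```python
# Hypothetical sketch of a parametric design sweep for aerocapture: a shared
# parameter dictionary couples crude stand-in models for drag, heating, and
# thermal-protection (TPS) mass. All formulas and limits are made up.

def evaluate(params):
    drag = params["nose_radius"] * params["entry_speed"] ** 2 * 1e-6
    heat = params["entry_speed"] ** 3 / params["nose_radius"] * 1e-9
    tps_mass = 50 + 2.0 * heat          # heavier shielding for more heating
    return {"drag": drag, "heat": heat, "tps_mass": tps_mass}

# Parametric variation: sweep nose radius and keep the lightest TPS among
# designs that still generate enough drag for capture (threshold made up).
candidates = []
for radius in (0.5, 1.0, 1.5, 2.0):
    design = {"nose_radius": radius, "entry_speed": 29000.0}
    result = evaluate(design)
    if result["drag"] >= 500.0:
        candidates.append((result["tps_mass"], radius))

best_mass, best_radius = min(candidates)
```

    Grouping the parameters in one dictionary plays the role of the global database: each discipline model reads the same inputs, so single-point evaluation, sweeps, and an outer optimizer can all drive the same evaluation function.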