Pulsed laser ablation of IC packages for device failure analyses
NASA Astrophysics Data System (ADS)
Hong, Ming Hui; Mai, ZhiHong; Chen, G. X.; Thiam, Thomas; Song, Wen D.; Lu, Yongfeng; Soh, Chye E.; Chong, Tow Chong
2002-06-01
Pulsed laser ablation of mold compounds for IC packaging, in air and with steam assistance, is investigated. It is applied to decap IC packages and expose computer CPU dies for device failure analyses. Compared with chemical decapping, laser ablation has the advantages of high speed, non-contact operation, and dry processing. Laser ablation with steam assistance yields a higher ablation rate and a wider ablated crater with much smoother surface morphology, implying that steam-assisted laser ablation can achieve faster, better-quality processing. Audible acoustic wave and plasma optical signal diagnostics are also carried out to better understand the underlying mechanisms. Light wavelength and laser fluence are two important decapping parameters. The 532 nm Nd:YAG laser decapping at low laser fluence can achieve a large decapping area with a fine ablation profile. IC packages decapped by laser ablation show good quality for device failure analyses.
LDEF: Dosimetric measurement results (AO 138-7 experiment)
NASA Technical Reports Server (NTRS)
Bourrieau, J.
1993-01-01
One of the objectives of the AO 138-7 experiment on board the Long Duration Exposure Facility (LDEF) was a total dose measurement with Thermo-Luminescent Detectors (TLD 100). Two identical packages, each including five TLDs inside various aluminum shields, were exposed to the space environment in order to obtain the absorbed dose profile. The radiation fluence received over the total mission length was computed, taking into account the trapped particles (AE8 and AP8 models during solar maximum and minimum periods) and cosmic rays; due to magnetospheric shielding, the solar proton fluences are negligible on the LDEF orbit. The total dose induced by these radiations inside a semi-infinite plane aluminum shield was computed with the radiation transport codes available at DERTS. The dose profile obtained is in good agreement with the evaluation by E. V. Benton. TLD readings were performed after flight; because of the increased mission duration, a post-flight calibration was necessary to cover the range of the in-flight induced dose. The results obtained, similar (plus or minus 30 percent) for both packages, are compared with the dose profile computation. For thick shields the measurements appear to exceed the forecast (by about 40 percent). This may be due to cosmic ray and trapped proton contributions entering from the backside (assumed perfectly shielded by the LDEF structure in the computation), or to an underestimate of the proton or cosmic ray fluences. A fine structural shielding analysis would be necessary to determine the origin of this slight discrepancy between forecast and in-flight measurements. For the less shielded dosimeters, mainly exposed to the trapped electron flux, a slight overestimation of the dose (less than 40 percent) appears; owing to the dispersion of the TLD response, this cannot be confirmed.
In practice, these results obtained on board LDEF, with less than a factor of 1.4 between measurements and forecast, reinforce the validity of the computation methods and models used for the long-term evaluation of the radiation levels (flux and dose) encountered in space on low-inclination, low-altitude Earth orbits.
LDEF: Dosimetric measurement results (AO 138-7 experiment)
NASA Astrophysics Data System (ADS)
Bourrieau, J.
1993-04-01
One of the objectives of the AO 138-7 experiment on board the Long Duration Exposure Facility (LDEF) was a total dose measurement with Thermo-Luminescent Detectors (TLD 100). Two identical packages, each including five TLDs inside various aluminum shields, were exposed to the space environment in order to obtain the absorbed dose profile. The radiation fluence received over the total mission length was computed, taking into account the trapped particles (AE8 and AP8 models during solar maximum and minimum periods) and cosmic rays; due to magnetospheric shielding, the solar proton fluences are negligible on the LDEF orbit. The total dose induced by these radiations inside a semi-infinite plane aluminum shield was computed with the radiation transport codes available at DERTS. The dose profile obtained is in good agreement with the evaluation by E. V. Benton. TLD readings were performed after flight; because of the increased mission duration, a post-flight calibration was necessary to cover the range of the in-flight induced dose. The results obtained, similar (plus or minus 30 percent) for both packages, are compared with the dose profile computation. For thick shields the measurements appear to exceed the forecast (by about 40 percent). This may be due to cosmic ray and trapped proton contributions entering from the backside (assumed perfectly shielded by the LDEF structure in the computation), or to an underestimate of the proton or cosmic ray fluences. A fine structural shielding analysis would be necessary to determine the origin of this slight discrepancy between forecast and in-flight measurements. For the less shielded dosimeters, mainly exposed to the trapped electron flux, a slight overestimation of the dose (less than 40 percent) appears; owing to the dispersion of the TLD response, this cannot be confirmed.
In practice, these results obtained on board LDEF, with less than a factor of 1.4 between measurements and forecast, reinforce the validity of the computation methods and models used for the long-term evaluation of the radiation levels (flux and dose) encountered in space on low-inclination, low-altitude Earth orbits.
Excore Modeling with VERAShift
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandya, Tara M.; Evans, Thomas M.
It is important to be able to accurately predict the neutron flux outside the immediate reactor core for a variety of safety and material analyses. Monte Carlo radiation transport calculations are required to produce the high fidelity excore responses. Under this milestone, VERA (specifically the VERAShift package) has been extended to perform excore calculations by running radiation transport calculations with Shift. This package couples VERA-CS with Shift to perform excore tallies for multiple state points concurrently, with each component capable of parallel execution on independent domains. Specifically, this package performs fluence calculations in the core barrel and vessel, or performs the requested tallies in any user-defined excore regions. VERAShift takes advantage of the general geometry package in Shift. This gives VERAShift the flexibility to explicitly model features outside the core barrel, including detailed vessel models, detectors, and power plant details. A very limited set of experimental and numerical benchmarks is available for excore simulation comparison. The Consortium for Advanced Simulation of Light Water Reactors (CASL) has developed a set of excore benchmark problems to include as part of the VERA-CS verification and validation (V&V) problems. The excore capability in VERAShift has been tested on small representative assembly problems, multiassembly problems, and quarter-core problems. VERAView has also been extended to visualize the vessel fluence results from VERAShift. Preliminary vessel fluence results for quarter-core multistate calculations look very promising. Further development is needed to determine the details relevant to excore simulations. Validation of VERA for fluence and excore detectors still needs to be performed against experimental and numerical results.
Measurement of the main and critical parameters for optimal laser treatment of heart disease
NASA Astrophysics Data System (ADS)
Kabeya, F. B.; Abrahamse, H.; Karsten, A. E.
2017-10-01
Laser light is frequently used in the diagnosis and treatment of patients. As with traditional treatments such as medication, bypass surgery, and minimally invasive procedures, laser treatment can fail and can present serious side effects. The true reason for laser treatment failure, or for its side effects, remains unknown. From the literature review conducted and the experimental results generated, we conclude that an optimal laser treatment for coronary artery disease (hereafter, heart disease) can be obtained if certain critical parameters are correctly measured and understood. These parameters include the laser power, the laser beam profile, the fluence rate, the treatment time, and the absorption and scattering coefficients of the target tissue. This paper therefore proposes accurate methods for measuring these critical parameters to determine the optimal laser treatment of heart disease with a minimal risk of side effects. The results of the absorption and scattering measurements can be used in a computer simulation package to predict the fluence rate. The computing technique is a Monte Carlo program that uses random sampling and probability statistics to track the propagation of photons through biological tissue.
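The Monte Carlo fluence prediction described above can be illustrated with a minimal photon-packet random walk. This sketch assumes a homogeneous slab, isotropic scattering, and arbitrary coefficient values, far simpler than the layered tissue models and phase functions a real simulation package would use:

```python
import math
import random

def simulate_photons(mu_a, mu_s, thickness_cm, n_photons=100_000, seed=1):
    """Track photon packets through a homogeneous slab (1D simplification).

    Each packet takes exponentially distributed steps governed by the total
    attenuation coefficient mu_t = mu_a + mu_s [1/cm]; at each interaction a
    fraction mu_a/mu_t of its weight is absorbed and the remainder continues
    in a new direction (isotropic re-direction here, a simplification of
    realistic anisotropic phase functions).

    Returns (absorbed fraction, escaped fraction) of the launched weight.
    """
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    absorbed = 0.0
    escaped = 0.0
    for _ in range(n_photons):
        z, cos_theta, w = 0.0, 1.0, 1.0
        while w > 1e-4:                               # terminate tiny packets
            step = -math.log(1.0 - rng.random()) / mu_t
            z += step * cos_theta
            if z < 0.0 or z > thickness_cm:           # packet leaves the slab
                escaped += w
                break
            absorbed += w * (mu_a / mu_t)             # partial absorption
            w *= mu_s / mu_t
            cos_theta = rng.uniform(-1.0, 1.0)        # isotropic re-direction
    return absorbed / n_photons, escaped / n_photons
```

Because each packet's weight is split between absorption and escape, the two returned fractions sum to nearly one, which is a convenient internal consistency check for this kind of simulation.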
Sensing circuits for multiwire proportional chambers
NASA Technical Reports Server (NTRS)
Peterson, H. T.; Worley, E. R.
1977-01-01
Integrated sensing circuits were designed, fabricated, and packaged for use in determining the direction and fluence of ionizing radiation passing through a multiwire proportional chamber. CMOS on sapphire was selected because of its high speed and low power capabilities. The design of the proposed circuits is described and the results of computer simulations are presented. The fabrication processes for the CMOS on sapphire sensing circuits and hybrid substrates are outlined. Several design options are described and the cost implications of each discussed. To be most effective, each chip should handle not more than 32 inputs, and should be mounted on its own hybrid substrate.
SU-F-T-258: Efficacy of Exit Fluence-Based Dose Calculation for Prostate Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siebers, J; Gardner, J; Neal, B
Purpose: To investigate the efficacy of exit-fluence-based dose computation for prostate radiotherapy by determining if it estimates the true dose more accurately than the original planning dose. Methods: Virtual exit-fluence-based dose computation was performed for 19 patients, each with 9-12 repeat CT images. For each patient, a 78 Gy treatment plan was created utilizing 5 mm CTV-to-PTV and OAR-to-PRV margins. A Monte Carlo framework was used to compute dose and exit-fluence images for the planning image and for each repeat CT image based on bony-anatomy-aligned and prostate-centroid-aligned CTs. Identical source particles were used for the MC dose computations on the planning and repeat CTs to maximize correlation. The exit-fluence-based dose and image were computed by multiplying source particle weights by FC(x,y) = FP(x,y)/FT(x,y), where (x,y) are the source particle coordinates projected to the exit-fluence plane and the dose/fluence pairs are denoted (DP,FP) for the plan, (DT,FT) for the repeat CT, and (DFC,FFC) for the exit-fluence computation. DFC mimics exit-fluence backprojection through the planning image since FT = FFC. Dose estimates were intercompared to judge the efficacy of exit-fluence-based dose computation. Results: Bony- and prostate-centroid-aligned results are combined as there is no statistical difference between them, yielding 420 dose comparisons per dose-volume metric. DFC is more accurate than DP for 46%, 33%, and 44% of cases in estimating CTV D98, D50, and D2, respectively. DFC improved rectum D50 and D2 estimates in 54% and 49% of cases, respectively, and bladder D50 and D2 in 47% and 49%, respectively. While, averaged over all patients and images, DFC and DP were within 3.1% of DT, they differed from DT by as much as 22% for GTV D98, 71% for bladder D50, 17% for bladder D2, and 19% for rectum D2. Conclusion: Exit-fluence-based dose computations infrequently improve CTV or OAR dose estimates and should be used with caution.
Research supported in part by Varian Medical Systems.
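The reweighting step FC(x,y) = FP(x,y)/FT(x,y) described above amounts to looking up a per-pixel fluence ratio for each source particle. The array names, the binning scheme, and the zero-fluence guard below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def exit_fluence_corrected_weights(weights, xs, ys, f_plan, f_treat, edges):
    """Reweight MC source particles by the ratio of planned to treatment
    exit fluence, FC(x, y) = FP(x, y) / FT(x, y).

    weights      -- original source-particle weights, shape (n,)
    xs, ys       -- particle coordinates projected onto the exit plane
    f_plan       -- 2D exit-fluence image from the planning CT (FP)
    f_treat      -- 2D exit-fluence image from the repeat CT (FT)
    edges        -- (x_edges, y_edges) bin edges of the fluence images
    All names are illustrative; the abstract does not publish code.
    """
    x_edges, y_edges = edges
    # Map each particle to the fluence-image pixel containing it.
    ix = np.clip(np.digitize(xs, x_edges) - 1, 0, f_plan.shape[0] - 1)
    iy = np.clip(np.digitize(ys, y_edges) - 1, 0, f_plan.shape[1] - 1)
    # Guard against empty pixels: leave the weight unchanged there.
    ratio = np.divide(f_plan, f_treat, out=np.ones_like(f_plan),
                      where=f_treat > 0)
    return weights * ratio[ix, iy]
```

Multiplying weights this way forces the simulated exit fluence toward FP while the particles still traverse the repeat-CT anatomy, which is exactly why DFC mimics backprojection through the planning image.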
NASA Technical Reports Server (NTRS)
Benton, E. V.; Henke, R. P.
1973-01-01
The high energy multicharged cosmic-ray-particle exposure of the Microbial Ecology Evaluation Device package on board the Apollo 16 spacecraft was monitored using cellulose nitrate, Lexan polycarbonate, nuclear emulsion, and silver chloride crystal nuclear-track detectors. The results of the analysis of these detectors include the measured particle fluences, the linear energy transfer spectra, and the integral atomic number spectrum of stopping particle density. The linear energy transfer spectrum is used to compute the fractional cell loss in human kidney (T1) cells caused by heavy particles. Because the Microbial Ecology Evaluation Device was better shielded, the high-energy multicharged particle exposure was less than that measured on the crew passive dosimeters.
Svatos, M.; Zankowski, C.; Bednarz, B.
2016-01-01
Purpose: The future of radiation therapy will require advanced inverse planning solutions to support single-arc, multiple-arc, and “4π” delivery modes, which present unique challenges in finding an optimal treatment plan over a vast search space, while still preserving dosimetric accuracy. The successful clinical implementation of such methods would benefit from Monte Carlo (MC) based dose calculation methods, which can offer improvements in dosimetric accuracy when compared to deterministic methods. The standard method for MC based treatment planning optimization leverages the accuracy of the MC dose calculation and the efficiency of well-developed optimization methods by precalculating the fluence-to-dose relationship within a patient with MC methods and subsequently optimizing the fluence weights. However, the sequential nature of this implementation is computationally time consuming and memory intensive. Methods to reduce the overhead of the MC precalculation have been explored in the past, demonstrating promising reductions of computational time overhead, but with limited impact on the memory overhead due to the sequential nature of the dose calculation and fluence optimization. The authors propose an entirely new form of “concurrent” Monte Carlo treatment plan optimization: a platform which optimizes the fluence during the dose calculation, reduces wasted computation time spent on beamlets that weakly contribute to the final dose distribution, and requires only a low memory footprint to function. In this initial investigation, the authors explore the key theoretical and practical considerations of optimizing fluence in such a manner. Methods: The authors present a novel derivation and implementation of a gradient descent algorithm that allows for optimization during MC particle transport, based on highly stochastic information generated through particle transport of very few histories.
A gradient rescaling and renormalization algorithm, and the concept of momentum from stochastic gradient descent were used to address obstacles unique to performing gradient descent fluence optimization during MC particle transport. The authors have applied their method to two simple geometrical phantoms, and one clinical patient geometry to examine the capability of this platform to generate conformal plans as well as assess its computational scaling and efficiency, respectively. Results: The authors obtain a reduction of at least 50% in total histories transported in their investigation compared to a theoretical unweighted beamlet calculation and subsequent fluence optimization method, and observe a roughly fixed optimization time overhead consisting of ∼10% of the total computation time in all cases. Finally, the authors demonstrate a negligible increase in memory overhead of ∼7–8 MB to allow for optimization of a clinical patient geometry surrounded by 36 beams using their platform. Conclusions: This study demonstrates a fluence optimization approach, which could significantly improve the development of next generation radiation therapy solutions while incurring minimal additional computational overhead. PMID:27277051
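The gradient rescaling and momentum ingredients named above can be sketched as a generic update rule on the beamlet fluence weights. The learning rate, the unit-max rescaling, the nonnegativity clipping, and the toy quadratic objective in the usage note are assumptions for illustration, not the authors' exact algorithm:

```python
import numpy as np

def update_fluence(weights, grad, velocity, lr=0.05, momentum=0.9):
    """One gradient-descent step with momentum on beamlet fluence weights.

    The gradient is rescaled to unit maximum magnitude to tame the very
    noisy estimates that only a few MC histories provide, and negative
    fluence is clipped to zero (fluence weights are physically nonnegative).
    A generic sketch of the ideas the abstract names, not the authors' code.
    """
    g = grad / max(np.max(np.abs(grad)), 1e-12)    # gradient rescaling
    velocity = momentum * velocity - lr * g        # momentum accumulation
    weights = np.maximum(weights + velocity, 0.0)  # clip to nonnegative
    return weights, velocity
```

On a toy quadratic objective (drive `w` toward a target dose vector `d` with gradient `w - d`), repeated calls steadily shrink the error even though each rescaled step has fixed magnitude, which is the property that makes such updates tolerant of noisy per-history gradient estimates.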
Characterization of gamma rays existing in the NMIJ standard neutron field.
Harano, H; Matsumoto, T; Ito, Y; Uritani, A; Kudo, K
2004-01-01
Our laboratory provides national standards of fast neutron fluence. Neutron fields are always accompanied by gamma rays produced in neutron sources and their surroundings. We have characterised these gamma rays in the 5.0 MeV standard neutron field. Gamma ray measurement was performed using an NE213 liquid scintillator. Pulse shape discrimination was incorporated to separate the events induced by gamma rays from those induced by neutrons. The measured gamma ray spectra were unfolded with the HEPRO program package to obtain the spectral fluences, using the response matrix prepared with the EGS4 code. Corrections were made for the gamma rays produced by neutrons in the detector assembly using the MCNP4C code. The effective dose equivalents were estimated to be of the order of 25 μSv at a neutron fluence of 10^7 neutrons cm^-2.
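Spectrum unfolding with a response matrix, the role HEPRO plays above, solves M = R·φ for the spectral fluence φ given a measured pulse-height spectrum M. The sketch below uses a generic multiplicative (MLEM-style) update, which is not HEPRO's algorithm, and the 2-bin response matrix in the usage example is invented:

```python
import numpy as np

def unfold_spectrum(measured, response, n_iter=500):
    """Iteratively unfold a measured pulse-height spectrum into a spectral
    fluence, given a detector response matrix (rows: measured channels,
    columns: true-energy bins).

    Uses the standard multiplicative MLEM update
        phi_j <- phi_j * [R^T (m / (R phi))]_j / [sum_i R_ij],
    which preserves nonnegativity; a generic sketch, not HEPRO's method.
    """
    measured = np.asarray(measured, dtype=float)
    phi = np.ones(response.shape[1])
    col_sums = response.sum(axis=0)
    for _ in range(n_iter):
        est = response @ phi
        ratio = np.divide(measured, est,
                          out=np.zeros_like(measured), where=est > 0)
        phi *= (response.T @ ratio) / col_sums
    return phi
```

For well-conditioned, noise-free data the iteration recovers the true spectrum; with real measured data the number of iterations acts as implicit regularization.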
SU-C-BRC-06: OpenCL-Based Cross-Platform Monte Carlo Simulation Package for Carbon Ion Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qin, N; Tian, Z; Pompos, A
2016-06-15
Purpose: Monte Carlo (MC) simulation is considered to be the most accurate method for calculation of absorbed dose and of the fundamental physical quantities related to biological effects in carbon ion therapy. Its long computation time impedes clinical and research applications. We have developed an MC package, goCMC, on parallel processing platforms, aiming at achieving accurate and efficient simulations for carbon therapy. Methods: goCMC was developed under the OpenCL framework. It supports transport simulation in voxelized geometry with kinetic energy up to 450 MeV/u. A Class II condensed history algorithm was employed for charged particle transport, with stopping power computed via the Bethe-Bloch equation. Secondary electrons were not transported; their energy was deposited locally. Energy straggling and multiple scattering were modeled. Production of secondary charged particles from nuclear interactions was implemented based on cross section and yield data from Geant4. They were transported via the condensed history scheme. goCMC supports scoring various quantities of interest, e.g., physical dose, particle fluence, spectrum, linear energy transfer, and positron-emitting nuclei. Results: goCMC has been benchmarked against Geant4 with different phantoms and beam energies. For 100 MeV/u, 250 MeV/u and 400 MeV/u beams impinging on a water phantom, the range difference was 0.03 mm, 0.20 mm and 0.53 mm, and the mean dose difference was 0.47%, 0.72% and 0.79%, respectively. goCMC can run on various computing devices. Depending on the beam energy and voxel size, it took 20-100 seconds to simulate 10^7 carbons on an AMD Radeon GPU card. The corresponding CPU time for Geant4 with the same setup was 60-100 hours. Conclusion: We have developed an OpenCL-based cross-platform carbon MC simulation package, goCMC. Its accuracy, efficiency and portability make goCMC attractive for research and clinical applications in carbon therapy.
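The Bethe-Bloch stopping power that drives the condensed-history transport can be written down directly. The sketch below omits the density-effect and shell corrections a production code like goCMC would include, and the water parameters (Z/A ≈ 0.5551, I ≈ 75 eV) are standard reference values:

```python
import math

ME_C2 = 0.51099895   # electron rest energy [MeV]
AMU_C2 = 931.49410   # atomic mass unit [MeV]
K = 0.307075         # 4*pi*N_A*r_e^2*m_e*c^2 [MeV cm^2 / mol]

def bethe_bloch(ke_per_u, z=6, a_u=12.0, z_over_a=0.5551, i_ev=75.0):
    """Mean electronic stopping power -dE/dx [MeV cm^2 / g] of a bare ion
    (default: carbon, z = 6) in water, from the first-order Bethe-Bloch
    formula without density-effect or shell corrections.

    ke_per_u -- kinetic energy per nucleon [MeV/u]
    """
    gamma = 1.0 + ke_per_u / AMU_C2          # T/(M c^2) = ke_per_u/AMU_C2
    beta2 = 1.0 - 1.0 / gamma ** 2
    m_ion = a_u * AMU_C2
    ratio = ME_C2 / m_ion
    # Maximum energy transferable to a free electron in one collision.
    t_max = (2.0 * ME_C2 * beta2 * gamma ** 2
             / (1.0 + 2.0 * gamma * ratio + ratio ** 2))
    i_mev = i_ev * 1.0e-6
    log_term = 0.5 * math.log(
        2.0 * ME_C2 * beta2 * gamma ** 2 * t_max / i_mev ** 2)
    return K * z ** 2 * z_over_a / beta2 * (log_term - beta2)
```

As a sanity check, for a ~100 MeV proton this reproduces the well-known ≈7.3 MeV cm²/g value for water, and the carbon stopping power falls with energy over the 100-400 MeV/u therapeutic range, consistent with the Bragg-curve behavior carbon therapy exploits.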
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y. M., E-mail: ymingy@gmail.com; Bednarz, B.; Svatos, M.
Purpose: The future of radiation therapy will require advanced inverse planning solutions to support single-arc, multiple-arc, and “4π” delivery modes, which present unique challenges in finding an optimal treatment plan over a vast search space, while still preserving dosimetric accuracy. The successful clinical implementation of such methods would benefit from Monte Carlo (MC) based dose calculation methods, which can offer improvements in dosimetric accuracy when compared to deterministic methods. The standard method for MC based treatment planning optimization leverages the accuracy of the MC dose calculation and the efficiency of well-developed optimization methods by precalculating the fluence-to-dose relationship within a patient with MC methods and subsequently optimizing the fluence weights. However, the sequential nature of this implementation is computationally time consuming and memory intensive. Methods to reduce the overhead of the MC precalculation have been explored in the past, demonstrating promising reductions of computational time overhead, but with limited impact on the memory overhead due to the sequential nature of the dose calculation and fluence optimization. The authors propose an entirely new form of “concurrent” Monte Carlo treatment plan optimization: a platform which optimizes the fluence during the dose calculation, reduces wasted computation time spent on beamlets that weakly contribute to the final dose distribution, and requires only a low memory footprint to function. In this initial investigation, the authors explore the key theoretical and practical considerations of optimizing fluence in such a manner. Methods: The authors present a novel derivation and implementation of a gradient descent algorithm that allows for optimization during MC particle transport, based on highly stochastic information generated through particle transport of very few histories.
A gradient rescaling and renormalization algorithm, and the concept of momentum from stochastic gradient descent were used to address obstacles unique to performing gradient descent fluence optimization during MC particle transport. The authors have applied their method to two simple geometrical phantoms, and one clinical patient geometry to examine the capability of this platform to generate conformal plans as well as assess its computational scaling and efficiency, respectively. Results: The authors obtain a reduction of at least 50% in total histories transported in their investigation compared to a theoretical unweighted beamlet calculation and subsequent fluence optimization method, and observe a roughly fixed optimization time overhead consisting of ∼10% of the total computation time in all cases. Finally, the authors demonstrate a negligible increase in memory overhead of ∼7–8 MB to allow for optimization of a clinical patient geometry surrounded by 36 beams using their platform. Conclusions: This study demonstrates a fluence optimization approach, which could significantly improve the development of next generation radiation therapy solutions while incurring minimal additional computational overhead.
NASA Astrophysics Data System (ADS)
Suresh, K.; Balaji, S.; Saravanan, K.; Navas, J.; David, C.; Panigrahi, B. K.
2018-02-01
We developed a simple, low-cost, user-friendly automated indirect ion beam fluence measurement system for ion irradiation and analysis experiments requiring indirect beam fluence measurements unperturbed by sample conditions such as low temperature, high temperature, and sample biasing, as well as for regular ion implantation experiments in ion implanters and electrostatic accelerators with continuous beams. The system, which uses simple, low-cost, off-the-shelf components and two distinct layers of in-house built software, not only eliminates the need for costly data acquisition systems but also overcomes the difficulties of using proprietary software. The hardware is centered around a personal computer, a PIC16F887-based embedded system, a Faraday cup drive-cum-monitor circuit, a pair of Faraday cups, and a beam current integrator; the in-house developed software includes C-based microcontroller firmware and LabVIEW-based virtual instrument automation software. The automatic fluence measurement involves two phases: a current sampling phase lasting 20-30 seconds, during which the ion beam current is continuously measured by intercepting the ion beam and the averaged beam current value is computed, and a subsequent charge computation phase lasting 700-900 seconds, during which the ion beam irradiates the samples and the incremental fluence received by the sample is estimated using the latest averaged beam current value from the sampling phase. The current sampling-charge computation cycle is repeated until the required fluence is reached. Besides simplicity and cost-effectiveness, other important advantages of the developed system include easy reconfiguration to suit customised experiments, scalability, easy debugging and maintenance of the hardware/software, and the ability to work as a standalone system.
The system was tested with different sets of samples and ion fluences, and the results were verified using the Rutherford backscattering technique, which showed satisfactory functioning of the system. The fluence measurement error is found to be less than 2%, which meets the demands of the irradiation experiments undertaken using the developed setup. The system has been incorporated for regular use at the existing ultra-high vacuum (UHV) ion irradiation chamber of the 1.7 MV Tandem accelerator, and several ion implantation experiments on a variety of samples, such as SS304, D9, and ODS alloys, have been successfully carried out.
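The sample-then-integrate cycle described above reduces to a short control loop: average the intercepted beam current, then accumulate charge over the irradiation interval, Φ += I·t/(q·e·A). Here `read_current_a` is a placeholder for the real current-integrator readout, and the timing constants are taken from the ranges quoted in the abstract:

```python
ELEM_CHARGE = 1.602176634e-19  # elementary charge [C]

def run_fluence_cycle(read_current_a, target_fluence, beam_area_cm2,
                      charge_state=1, sample_s=25, irradiate_s=800):
    """Repeat the sample/integrate cycle until the target fluence is reached.

    read_current_a -- callable returning one beam-current reading [A]
                      (placeholder for the Faraday cup / integrator readout)
    target_fluence -- requested fluence [ions/cm^2]
    sample_s       -- sampling-phase duration, one reading per second
    irradiate_s    -- irradiation-phase duration [s]
    """
    fluence = 0.0
    while fluence < target_fluence:
        # Phase 1: intercept the beam and average its current over ~20-30 s.
        i_avg = sum(read_current_a() for _ in range(sample_s)) / sample_s
        # Phase 2: irradiate for ~700-900 s; ions delivered = I*t/(q*e).
        ions = i_avg * irradiate_s / (charge_state * ELEM_CHARGE)
        fluence += ions / beam_area_cm2
    return fluence
```

Because the fluence during each irradiation phase is estimated from the preceding average rather than measured directly, the achieved fluence overshoots the target by at most one cycle's worth of charge, which is consistent with the ~2% accuracy the authors report for slowly drifting beams.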
Rapid Monte Carlo simulation of detector DQE(f)
Star-Lack, Josh; Sun, Mingshan; Meyer, Andre; Morf, Daniel; Constantin, Dragos; Fahrig, Rebecca; Abel, Eric
2014-01-01
Purpose: Performance optimization of indirect x-ray detectors requires proper characterization of both ionizing (gamma) and optical photon transport in a heterogeneous medium. As the tool of choice for modeling detector physics, Monte Carlo methods have failed to gain traction as a design utility, due mostly to excessive simulation times and a lack of convenient simulation packages. The most important figure-of-merit in assessing detector performance is the detective quantum efficiency (DQE), for which most of the computational burden has traditionally been associated with the determination of the noise power spectrum (NPS) from an ensemble of flood images, each conventionally having 10^7 − 10^9 detected gamma photons. In this work, the authors show that the idealized conditions inherent in a numerical simulation allow for a dramatic reduction in the number of gamma and optical photons required to accurately predict the NPS. Methods: The authors derived an expression for the mean squared error (MSE) of a simulated NPS when computed using the International Electrotechnical Commission-recommended technique based on taking the 2D Fourier transform of flood images. It is shown that the MSE is inversely proportional to the number of flood images, and is independent of the input fluence provided that the input fluence is above a minimal value that avoids biasing the estimate. The authors then propose to further lower the input fluence so that each event creates a point-spread function rather than a flood field. The authors use this finding as the foundation for a novel algorithm in which the characteristic MTF(f), NPS(f), and DQE(f) curves are simultaneously generated from the results of a single run. The authors also investigate lowering the number of optical photons used in a scintillator simulation to further increase efficiency.
Simulation results are compared with measurements performed on a Varian AS1000 portal imager, and with a previously published simulation performed using clinical fluence levels. Results: On the order of only 10–100 gamma photons per flood image were required to be detected to avoid biasing the NPS estimate. This allowed for a factor of 10^7 reduction in fluence compared to clinical levels with no loss of accuracy. An optimal signal-to-noise ratio (SNR) was achieved by increasing the number of flood images from a typical value of 100 up to 500, thereby illustrating the importance of flood image quantity over the number of gammas per flood. For the point-spread ensemble technique, an additional 2× reduction in the number of incident gammas was realized. As a result, when modeling gamma transport in a thick pixelated array, the simulation time was reduced from 2.5 × 10^6 CPU min if using clinical fluence levels to 3.1 CPU min if using optimized fluence levels while also producing a higher SNR. The AS1000 DQE(f) simulation entailing both optical and radiative transport matched experimental results to within 11%, and required 14.5 min to complete on a single CPU. Conclusions: The authors demonstrate the feasibility of accurately modeling x-ray detector DQE(f) with completion times on the order of several minutes using a single CPU. Convenience of simulation can be achieved using GEANT4 which offers both gamma and optical photon transport capabilities. PMID:24593734
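The flood-image NPS estimate at the heart of this method can be sketched as follows: subtract the ensemble-mean frame, take the 2D FFT of each residual, and average the squared magnitudes. The normalization convention shown (pixel area over pixel count) is one common choice and may differ in detail from the paper's:

```python
import numpy as np

def nps_2d(floods, pixel_pitch_mm):
    """Noise power spectrum from an ensemble of flood images via the
    2D Fourier method named in the abstract.

    floods         -- array of shape (n_frames, ny, nx)
    pixel_pitch_mm -- detector pixel pitch [mm]
    Returns a 2D NPS in units of (signal^2 * mm^2).
    """
    floods = np.asarray(floods, dtype=float)
    n, ny, nx = floods.shape
    residuals = floods - floods.mean(axis=0)   # remove the fixed pattern
    spectra = np.abs(np.fft.fft2(residuals)) ** 2
    # Average over the ensemble; scale by pixel area over pixel count.
    return spectra.mean(axis=0) * pixel_pitch_mm ** 2 / (nx * ny)
```

For uncorrelated (white) noise of variance σ² the spectrum is flat with mean ≈ σ² × pitch², and, as the derivation in the abstract predicts, the variance of the estimate shrinks with the number of flood frames rather than with the per-frame fluence.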
Rapid Monte Carlo simulation of detector DQE(f)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Star-Lack, Josh, E-mail: josh.starlack@varian.com; Sun, Mingshan; Abel, Eric
2014-03-15
Purpose: Performance optimization of indirect x-ray detectors requires proper characterization of both ionizing (gamma) and optical photon transport in a heterogeneous medium. As the tool of choice for modeling detector physics, Monte Carlo methods have failed to gain traction as a design utility, due mostly to excessive simulation times and a lack of convenient simulation packages. The most important figure-of-merit in assessing detector performance is the detective quantum efficiency (DQE), for which most of the computational burden has traditionally been associated with the determination of the noise power spectrum (NPS) from an ensemble of flood images, each conventionally having 10^7 − 10^9 detected gamma photons. In this work, the authors show that the idealized conditions inherent in a numerical simulation allow for a dramatic reduction in the number of gamma and optical photons required to accurately predict the NPS. Methods: The authors derived an expression for the mean squared error (MSE) of a simulated NPS when computed using the International Electrotechnical Commission-recommended technique based on taking the 2D Fourier transform of flood images. It is shown that the MSE is inversely proportional to the number of flood images, and is independent of the input fluence provided that the input fluence is above a minimal value that avoids biasing the estimate. The authors then propose to further lower the input fluence so that each event creates a point-spread function rather than a flood field. The authors use this finding as the foundation for a novel algorithm in which the characteristic MTF(f), NPS(f), and DQE(f) curves are simultaneously generated from the results of a single run. The authors also investigate lowering the number of optical photons used in a scintillator simulation to further increase efficiency.
Simulation results are compared with measurements performed on a Varian AS1000 portal imager, and with a previously published simulation performed using clinical fluence levels. Results: On the order of only 10–100 gamma photons per flood image were required to be detected to avoid biasing the NPS estimate. This allowed for a factor of 10^7 reduction in fluence compared to clinical levels with no loss of accuracy. An optimal signal-to-noise ratio (SNR) was achieved by increasing the number of flood images from a typical value of 100 up to 500, thereby illustrating the importance of flood image quantity over the number of gammas per flood. For the point-spread ensemble technique, an additional 2× reduction in the number of incident gammas was realized. As a result, when modeling gamma transport in a thick pixelated array, the simulation time was reduced from 2.5 × 10^6 CPU min if using clinical fluence levels to 3.1 CPU min if using optimized fluence levels while also producing a higher SNR. The AS1000 DQE(f) simulation entailing both optical and radiative transport matched experimental results to within 11%, and required 14.5 min to complete on a single CPU. Conclusions: The authors demonstrate the feasibility of accurately modeling x-ray detector DQE(f) with completion times on the order of several minutes using a single CPU. Convenience of simulation can be achieved using GEANT4 which offers both gamma and optical photon transport capabilities.
NASA Technical Reports Server (NTRS)
Tada, H. Y.; Carter, J. R., Jr.
1977-01-01
Solar cell theory, how cells are manufactured, and how they are modeled mathematically are reviewed. The interaction of energetic charged-particle radiation with solar cells is discussed in detail and the concept of 1 MeV equivalent electron fluence is introduced. The space radiation environment is described and methods of calculating equivalent fluences for the space environment are developed. A computer program was written to perform the equivalent fluence calculations and a FORTRAN listing of the program is included. Finally, an extensive body of data detailing the degradation of solar cell electrical parameters as a function of 1 MeV electron fluence is presented.
Results of dosimetric measurements in space missions
NASA Astrophysics Data System (ADS)
Reitz, G.; Beaujean, R.; Heilmann, C.; Kopp, J.; Leicher, M.; Strauch, K.
Detector packages consisting of plastic nuclear track detectors, nuclear emulsions, and thermoluminescence detectors were exposed at different locations inside the space laboratory Spacelab, on the astronauts' bodies, and in different sections of the MIR space station. Total dose, particle fluence rate and linear energy transfer (LET) spectra of heavy ions, number of nuclear disintegrations, and fast neutron fluence rates were determined for each exposure. The dose equivalents received by the Payload Specialists (PSs) were calculated from the measurements; they range from 190 μSv d^-1 to 770 μSv d^-1. Finally, a preliminary investigation of results from a particle telescope of two silicon detectors, first used in the last BIORACK mission on STS-76, is reported.
NASA Technical Reports Server (NTRS)
Tada, H. Y.; Carter, J. R., Jr.; Anspaugh, B. E.; Downing, R. G.
1982-01-01
The handbook to predict the degradation of solar cell electrical performance in any given space radiation environment is presented. Solar cell theory, cell manufacturing, and how cells are modeled mathematically are described. The interaction of energetic charged-particle radiation with solar cells is discussed and the concept of 1 MeV equivalent electron fluence is introduced. The space radiation environment is described and methods of calculating equivalent fluences for the space environment are developed. A computer program was written to perform the equivalent fluence calculations and a FORTRAN listing of the program is included. Data detailing the degradation of solar cell electrical parameters as a function of 1 MeV electron fluence are presented.
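The equivalent-fluence concept folds a mission's electron spectrum with energy-dependent relative damage coefficients normalized to unity at 1 MeV. A minimal sketch of that fold (the function name, units, and trapezoidal integration are assumptions of this sketch; the handbook's actual coefficients and FORTRAN code are not reproduced here):

```python
import numpy as np

def equivalent_1mev_fluence(energies, differential_fluence, damage_coeff):
    """1 MeV equivalent electron fluence (e-/cm^2) for a mission spectrum.

    energies: electron energies (MeV), ascending.
    differential_fluence: dPhi/dE at those energies (e-/cm^2/MeV).
    damage_coeff: relative damage coefficients D(E), with D(1 MeV) = 1.
    """
    y = np.asarray(differential_fluence) * np.asarray(damage_coeff)
    e = np.asarray(energies)
    # Trapezoidal integration of D(E) * dPhi/dE over the spectrum
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(e)))
```

With D(E) = 1 everywhere the result reduces to the integrated fluence, which is a convenient check.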
NASA Astrophysics Data System (ADS)
Mathews, A. J.; Gang, G.; Levinson, R.; Zbijewski, W.; Kawamoto, S.; Siewerdsen, J. H.; Stayman, J. W.
2017-03-01
Acquisition of CT images with comparable diagnostic power can potentially be achieved with lower radiation exposure than the current standard of care through the adoption of hardware-based fluence-field modulation (e.g., dynamic bowtie filters). While modern CT scanners employ elements such as static bowtie filters and tube-current modulation, such solutions are limited in the fluence patterns that they can achieve, and thus are limited in their ability to adapt to broad classes of patient morphology. Fluence-field modulation also enables new applications such as region-of-interest imaging, task-specific imaging, reduced measurement noise, and improved image quality. The work presented in this paper leverages a novel fluence modulation strategy that uses "Multiple Aperture Devices" (MADs), which are, in essence, binary filters, blocking or passing x-rays on a fine scale. Utilizing two MADs in series provides the capability of generating a large number of fluence patterns via small relative motions between the MAD filters. We present the first experimental evaluation of fluence-field modulation using a dual-MAD system, and demonstrate the efficacy of this technique with a characterization of achievable fluence patterns and an investigation of experimental projection data.
Natto, S A; Lewis, D G; Ryde, S J
1998-01-01
The Monte Carlo computer code MCNP (version 4A) has been used to develop a personal computer-based model of the Swansea in vivo neutron activation analysis (IVNAA) system. The model included specification of the neutron source (252Cf), collimators, reflectors and shielding. The MCNP model was 'benchmarked' against fast neutron and thermal neutron fluence data obtained experimentally from the IVNAA system. The Swansea system allows two irradiation geometries using 'short' and 'long' collimators, which provide alternative dose rates for IVNAA. The data presented here relate to the short collimator, although results of similar accuracy were obtained using the long collimator. The fast neutron fluence was measured in air at a series of depths inside the collimator. The measurements agreed with the MCNP simulation within the statistical uncertainty (5-10%) of the calculations. The thermal neutron fluence was measured and calculated inside the cuboidal water phantom. The depth of maximum thermal fluence was 3.2 cm (measured) and 3.0 cm (calculated). The width of the 50% thermal fluence level across the phantom at its mid-depth was found to be the same by both MCNP and experiment. This benchmarking exercise has given us a high degree of confidence in MCNP as a tool for the design of IVNAA systems.
Li, Mengkai; Li, Wentao; Qiang, Zhimin; Blatchley, Ernest R
2017-07-18
At present, on-site fluence (distribution) determination and monitoring of an operating UV system represent a considerable challenge. The recently developed microfluorescent silica detector (MFSD) is able to measure the approximate true fluence rate (FR) at a fixed position in a UV reactor that can be compared with an FR model directly. Hence it has provided a connection between model calculation and real-time fluence determination. In this study, an on-site determination and monitoring method of fluence delivery for an operating UV reactor was developed. True FR detectors, a UV transmittance (UVT) meter, and a flow rate meter were used for fundamental measurements. The fluence distribution, as well as the reduction equivalent fluence (REF), 10th-percentile dose in the UV fluence distribution (F10), minimum fluence (Fmin), and mean fluence (Fmean) of a test reactor, was calculated in advance by the combined use of computational fluid dynamics and FR field modeling. A field test was carried out on the test reactor for disinfection of a secondary water supply. The estimated real-time REF, F10, Fmin, and Fmean decreased 73.6%, 71.4%, 69.6%, and 72.9%, respectively, during a 6-month period, which was attributable to lamp output attenuation and sleeve fouling. The results were analyzed with synchronous data from a previously developed triparameter UV monitoring system and a water temperature sensor. This study demonstrated an accurate method for on-site, real-time fluence determination which could be used to enhance the security of, and public confidence in, UV-based water treatment processes.
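Given a sample of per-path fluences from a CFD/FR-field simulation, the summary statistics named above (Fmin, F10, Fmean) reduce to simple order statistics. A minimal sketch, not the authors' code; the units and the linear-interpolation percentile rule are assumptions here:

```python
import numpy as np

def fluence_statistics(fluences):
    """Summary statistics of a simulated fluence (dose) distribution.

    fluences: per-particle-path UV fluence samples (e.g. mJ/cm^2) from a
    CFD + fluence-rate-field simulation.
    Returns (F_min, F_10, F_mean): minimum fluence, 10th-percentile
    fluence, and mean fluence.
    """
    f = np.asarray(fluences, dtype=float)
    return f.min(), np.percentile(f, 10.0), f.mean()
```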
Gang, Grace J; Siewerdsen, Jeffrey H; Stayman, J Webster
2017-12-01
This paper presents a joint optimization of dynamic fluence field modulation (FFM) and regularization in quadratic penalized-likelihood reconstruction that maximizes a task-based imaging performance metric. We adopted a task-driven imaging framework for prospective design of the imaging parameters. A maxi-min objective function was adopted to maximize the minimum detectability index throughout the image. The optimization algorithm alternates between FFM (represented by low-dimensional basis functions) and local regularization (including the regularization strength and directional penalty weights). The task-driven approach was compared with three FFM strategies commonly proposed for FBP reconstruction (as well as a task-driven TCM strategy) for a discrimination task in an abdomen phantom. The task-driven FFM assigned more fluence to less attenuating anteroposterior views and yielded approximately constant fluence behind the object. The optimal regularization was almost uniform throughout the image. Furthermore, the task-driven FFM strategy redistributed fluence across detector elements in order to prescribe more fluence to the more attenuating central region of the phantom. Compared with all strategies, the task-driven FFM strategy not only improved the minimum detectability index by at least 17.8%, but also yielded higher detectability over a large area inside the object. The optimal FFM was highly dependent on the amount of regularization, indicating the importance of a joint optimization. Sample reconstructions of simulated data generally support the performance estimates based on the computed detectability index. The improvements in detectability show the potential of the task-driven imaging framework to improve imaging performance at a fixed dose, or, equivalently, to provide a similar level of performance at reduced dose.
Evaluating focused ion beam patterning for position-controlled nanowire growth using computer vision
NASA Astrophysics Data System (ADS)
Mosberg, A. B.; Myklebost, S.; Ren, D.; Weman, H.; Fimland, B. O.; van Helvoort, A. T. J.
2017-09-01
To efficiently evaluate the novel approach of focused ion beam (FIB) direct patterning of substrates for nanowire growth, a reference matrix of hole arrays has been used to study the effect of ion fluence and hole diameter on nanowire growth. Self-catalyzed GaAsSb nanowires were grown using molecular beam epitaxy and studied by scanning electron microscopy (SEM). To ensure an objective analysis, SEM images were analyzed with computer vision to automatically identify nanowires and characterize each array. It is shown that FIB milling parameters can be used to control the nanowire growth. Lower ion fluences and smaller-diameter holes result in a higher yield (up to 83%) of single vertical nanowires, while higher fluences and hole diameters produce a regime of multiple nanowires. The catalyst size distribution and placement uniformity of vertical nanowires are best for low-value parameter combinations, indicating how to improve the FIB parameters for position-controlled nanowire growth.
Development of a new version of the Vehicle Protection Factor Code (VPF3)
NASA Astrophysics Data System (ADS)
Jamieson, Terrance J.
1990-10-01
The Vehicle Protection Factor (VPF) Code is an engineering tool for estimating the radiation protection afforded by armoured vehicles and other structures exposed to neutron and gamma ray radiation from fission, thermonuclear, and fusion sources. A number of suggestions for modifications have been offered by users of early versions of the code. These include: implementing some of the more advanced features of the air transport rating code, ATR5, used to perform the air-over-ground radiation transport analyses; allowing specific vehicle orientations within the free field to be studied; implementing an adjoint transport scheme to reduce the number of transport runs required; investigating the possibility of accelerating the transport scheme; and upgrading the computer-aided design (CAD) package used by VPF. The generation of radiation free-field fluences for infinite air geometries, as required for aircraft analysis, can be accomplished by using ATR with the air-over-ground correction factors disabled. Analysis of the effects of fallout-bearing debris clouds on aircraft will require additional modelling in VPF.
NASA Astrophysics Data System (ADS)
Lam, Hing-Lan
2017-01-01
A statistical study of relativistic electron (>2 MeV) fluence derived from geosynchronous satellites and Pc5 ultralow frequency (ULF) wave power computed from data of a ground magnetic observatory located in Canada's auroral zone has been carried out. The ground observations were made near the foot points of field lines passing through the GOES satellites from 1987 to 2009 (cycles 22 and 23). We determine statistical relationships between the two quantities for different phases of a solar cycle and validate these relationships in two different cycles. There is a positive linear relationship between log fluence and log Pc5 power for all solar phases; however, the power-law indices vary for different phases of the cycle. High index values existed during the descending phase. The Pearson cross correlation between electron fluence and Pc5 power indicates fluence enhancement 2-3 days after strong Pc5 wave activity for all solar phases. The lag between the two quantities is shorter for extremely high fluence (due to high Pc5 power), which tends to occur during the declining phases of both cycles. Most occurrences of extremely low fluence were observed during the extended solar minimum of cycle 23. The precursory attribute of Pc5 power with respect to fluence and the enhancement of fluence due to rising Pc5 power both support the notion of an electron acceleration mechanism driven by Pc5 ULF waves. This precursor behavior establishes the potential of using Pc5 power to predict relativistic electron fluence.
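The 2-3 day precursor relationship described above is the kind of result a lagged Pearson correlation scan produces. A hedged sketch of such a scan (not the authors' analysis; the daily cadence and log-log units are assumptions of this sketch):

```python
import numpy as np

def best_lag(log_pc5, log_fluence, max_lag=5):
    """Lag (in days) at which Pc5 power best correlates with later fluence.

    log_pc5, log_fluence: equal-length daily series (log10 units).
    Returns (lag, r): the lag maximizing Pearson's r, with the fluence
    series shifted to *follow* the Pc5 series by `lag` days.
    """
    best = (0, -2.0)
    for lag in range(0, max_lag + 1):
        x = log_pc5[: len(log_pc5) - lag] if lag else log_pc5
        y = log_fluence[lag:]
        r = np.corrcoef(x, y)[0, 1]
        if r > best[1]:
            best = (lag, r)
    return best
```

A positive best lag with high r supports the precursor interpretation: Pc5 power leads the fluence response.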
Optimizing fluence and debridement effects on cutaneous resurfacing carbon dioxide laser surgery.
Weisberg, N K; Kuo, T; Torkian, B; Reinisch, L; Ellis, D L
1998-10-01
To develop methods to compare carbon dioxide (CO2) resurfacing lasers, fluence, and debridement effects on tissue shrinkage and histological thermal denaturation. In vitro human or in vivo porcine skin samples received up to 5 passes with scanner or short-pulsed CO2 resurfacing lasers. Fluences ranging from 2.19 to 17.58 J/cm2 (scanner) and 1.11 to 5.56 J/cm2 (short pulsed) were used to determine each laser's threshold energy for clinical effect. Variable amounts of debridement were also studied. Tissue shrinkage was evaluated by using digital photography to measure linear distance change of the treated tissue. Tissue histological studies were evaluated using quantitative computer image analysis. Fluence-independent in vitro tissue shrinkage was seen with the scanned and short-pulsed lasers above threshold fluence levels of 5.9 and 2.5 J/cm2, respectively. Histologically, fluence-independent thermal depths of damage of 77 microns (scanner) and 25 microns (pulsed) were observed. Aggressive debridement of the tissue increased the shrinkage per pass of the laser, and decreased the fluence required for the threshold effect. In vivo experiments confirmed the in vitro results, although the in vivo threshold fluence level was slightly higher and the shrinkage obtained was slightly lower per pass. Our methods allow comparison of different resurfacing lasers' acute effects. We found equivalent laser tissue effects using lower fluences than those currently accepted clinically. This suggests that the morbidity associated with CO2 laser resurfacing may be minimized by lowering levels of tissue input energy and controlling for tissue debridement.
Gandhi, Varun N; Roberts, Philip J W; Kim, Jae-Hong
2012-12-18
Evaluating the performance of typical water treatment UV reactors is challenging due to the complexity in assessing spatial and temporal variation of UV fluence, resulting from the highly unsteady, turbulent nature of the flow and variation in UV intensity. In this study, three-dimensional laser-induced fluorescence (3DLIF) was applied to visualize and quantitatively analyze a lab-scale UV reactor consisting of one lamp sleeve placed perpendicular to flow. Mapping the spatial and temporal fluence delivery and MS2 inactivation revealed the highest local fluence in the wake zone due to longer residence time and higher UV exposure, while the lowest local fluence occurred in a region near the walls due to short-circuiting flow and lower UV fluence rate. Comparing the tracer-based decomposition between hydrodynamics and IT revealed similar coherent structures, showing the dependency of fluence delivery on the reactor flow. The location of tracer injection, varied in height and upstream distance from the lamp center, was found to significantly affect the UV fluence received by the tracer. A Lagrangian-based analysis was also employed to predict the fluence along specific paths of travel, which agreed with the experiments. The 3DLIF technique developed in this study provides new insight on dose delivery that fluctuates both spatially and temporally, and is expected to aid design and optimization of UV reactors as well as validate computational fluid dynamics models that are widely used to simulate UV reactor performance.
NASA Astrophysics Data System (ADS)
Lukić, M.; Ćojbašić, Ž.; Rabasović, M. D.; Markushev, D. D.; Todorović, D. M.
2017-11-01
In this paper, the possibilities of computational intelligence applications for trace gas monitoring are discussed. For this, pulsed infrared photoacoustics is used to investigate SF6-Ar mixtures in a multiphoton regime, assisted by artificial neural networks. Feedforward multilayer perceptron networks are applied in order to recognize both the spatial characteristics of the laser beam and the values of the laser fluence Φ from the given photoacoustic signal and prevent changes. Neural networks are trained in an offline batch training regime to simultaneously estimate four parameters from theoretical or experimental photoacoustic signals: the laser beam spatial profile R(r), the vibrational-to-translational relaxation time τ_V-T, the distance from the laser beam to the absorbing molecules in the photoacoustic cell r*, and the laser fluence Φ. The results presented in this paper show that neural networks can estimate an unknown laser beam spatial profile and the parameters of photoacoustic signals in real time and with high precision. Real-time operation, high accuracy, and applicability at higher radiation intensities over a wide range of laser fluences are factors that classify the computational intelligence approach as efficient and powerful for the in situ measurement of atmospheric pollutants.
Adaptive treatment-length optimization in spatiobiologically integrated radiotherapy
NASA Astrophysics Data System (ADS)
Ajdari, Ali; Ghate, Archis; Kim, Minsun
2018-04-01
Recent theoretical research on spatiobiologically integrated radiotherapy has focused on optimization models that adapt fluence-maps to the evolution of tumor state, for example, cell densities, as observed in quantitative functional images acquired over the treatment course. We propose an optimization model that adapts the length of the treatment course as well as the fluence-maps to such imaged tumor state. Specifically, after observing the tumor cell densities at the beginning of a session, the treatment planner solves a group of convex optimization problems to determine an optimal number of remaining treatment sessions, and a corresponding optimal fluence-map for each of these sessions. The objective is to minimize the total number of tumor cells remaining (TNTCR) at the end of this proposed treatment course, subject to upper limits on the biologically effective dose delivered to the organs-at-risk. This fluence-map is administered in future sessions until the next image is available, and then the number of sessions and the fluence-map are re-optimized based on the latest cell density information. We demonstrate via computer simulations on five head-and-neck test cases that such adaptive treatment-length and fluence-map planning reduces the TNTCR and increases the biological effect on the tumor while employing shorter treatment courses, as compared to only adapting fluence-maps and using a pre-determined treatment course length based on one-size-fits-all guidelines.
On mechanism of explosive boiling in nanosecond regime
NASA Astrophysics Data System (ADS)
Çelen, Serap
2016-06-01
Today laser-based machining is used to manufacture vital parts for the biomedical, aviation, and aerospace industries. The aim of this paper is to report theoretical, numerical, and experimental investigations of explosive boiling under nanosecond pulsed ytterbium fiber laser irradiation. Experiments were performed on pure titanium specimens in an effective peak power density range between 1397 and 1450 MW/cm^2. The threshold laser fluence for phase explosion, the pressure and temperature at the target surface, and the velocity of the expulsed material are reported. A narrow transition zone was observed between the normal vaporization and phase explosion regimes. Evidence of heterogeneous boiling is given with detailed micrographs. A novel thermal model is proposed for laser-induced splashing at high fluences. Packaging-factor and scattering-arc-radius terms are proposed to characterize the level of the melt ejection process. Results of the present investigation explain the explosive boiling during high-power laser interaction with metal.
NASA Technical Reports Server (NTRS)
Banks, Bruce A.
2011-01-01
This innovation enables a means for actively measuring atomic oxygen fluence (accumulated atoms of atomic oxygen per area) that has impinged upon spacecraft surfaces. Telemetered data from the device provide spacecraft designers, researchers, and mission managers with real-time measurement of atomic oxygen fluence, which is useful for prediction of the durability of spacecraft materials and components. The innovation is a compact fluence-measuring device that allows in-space measurement and transmittance of measured atomic oxygen fluence as a function of time, based on atomic oxygen erosion yields (the erosion yield of a material is the volume of material that is oxidized per incident oxygen atom) of materials that have been measured in low Earth orbit. It has a linear electrical response to atomic oxygen fluence, and is capable of measuring high atomic oxygen fluences (>10^22 atoms/cm^2), which are representative of multi-year low-Earth-orbit missions (such as the International Space Station). The durability or remaining structural lifetime of solar arrays that consist of polymer blankets on which the solar cells are attached can be predicted if one knows the atomic oxygen fluence that the solar array blanket has been exposed to. In addition, numerous organizations that launch space experiments into low-Earth orbit want to know the accumulated atomic oxygen fluence that their materials or components have been exposed to. The device is based on the erosion yield of pyrolytic graphite. It uses two 12° inclined wedges of graphite that are over a grit-blasted fused silica window covering a photodiode. As the wedges erode, a greater area of solar illumination reaches the photodiode. A reference photodiode is also used that receives unobstructed solar illumination and is oriented in the same direction as the pyrolytic-graphite-covered photodiode.
The short-circuit current from the photodiodes is measured and either sent to an onboard data logger, or transmitted to a receiving station on Earth. By comparison of the short-circuit currents from the fluence-measuring photodiode and the reference photodiode, one can compute the accumulated atomic oxygen fluence arriving in the direction that the fluence monitor is pointing. The device produces a signal that is linear with atomic oxygen fluence using a material whose atomic oxygen erosion yield has been measured over a period of several years in low-Earth orbit.
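The current-ratio readout described above can be sketched as a simple geometric inversion. This is a hedged illustration only: the erosion yield, window length, and the assumption that the uncovered window fraction equals the current ratio are all placeholders of this sketch, not the flight calibration.

```python
import math

# Illustrative constants (NOT the flight calibration values)
EROSION_YIELD = 4.2e-25    # cm^3/atom, assumed for pyrolytic graphite
WEDGE_ANGLE_DEG = 12.0     # incline of the graphite wedges (from the abstract)
WINDOW_LENGTH_CM = 1.0     # assumed window length under the wedge

def ao_fluence(i_signal, i_reference):
    """Atomic-oxygen fluence (atoms/cm^2) from photodiode currents.

    As the wedge erodes downward by depth d = Ey * fluence, its edge
    retreats laterally by d / tan(theta), uncovering window area in
    proportion; the uncovered fraction is read as i_signal / i_reference.
    """
    exposed_fraction = i_signal / i_reference
    lateral_retreat = exposed_fraction * WINDOW_LENGTH_CM
    erosion_depth = lateral_retreat * math.tan(math.radians(WEDGE_ANGLE_DEG))
    return erosion_depth / EROSION_YIELD
```

The shallow 12° wedge trades a small erosion depth for a large lateral retreat, which is what makes the current response linear in fluence.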
SU-E-T-422: Fast Analytical Beamlet Optimization for Volumetric Intensity-Modulated Arc Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chan, Kenny S K; Lee, Louis K Y; Xing, L
2015-06-15
Purpose: To implement a fast optimization algorithm on a CPU/GPU heterogeneous computing platform and to obtain an optimal fluence for a given target dose distribution from pre-calculated beamlets in an analytical approach. Methods: The 2D target dose distribution was modeled as an n-dimensional vector and estimated by a linear combination of independent basis vectors. The basis set was composed of the pre-calculated beamlet dose distributions at every 6 degrees of gantry angle, and the cost function was set as the magnitude square of the vector difference between the target and the estimated dose distribution. The optimal weighting of the basis, which corresponds to the optimal fluence, was obtained analytically by the least squares method. Those basis vectors with a positive weighting were selected for entering into the next level of optimization. In total, 7 levels of optimization were implemented in the study. Ten head-and-neck and ten prostate carcinoma cases were selected for the study and mapped to a round water phantom with a diameter of 20 cm. The Matlab computation was performed in a heterogeneous programming environment with an Intel i7 CPU and an NVIDIA GeForce 840M GPU. Results: In all selected cases, the estimated dose distribution was in good agreement with the given target dose distribution and their correlation coefficients were found to be in the range of 0.9992 to 0.9997. The root-mean-square error was monotonically decreasing and converged after 7 cycles of optimization. The computation took only about 10 seconds and the optimal fluence maps at each gantry angle throughout an arc were quickly obtained. Conclusion: An analytical approach is derived for finding the optimal fluence for a given target dose distribution, and a fast optimization algorithm implemented on the CPU/GPU heterogeneous computing environment greatly reduces the optimization time.
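The multi-level scheme above — least-squares solve, keep only positive-weight beamlets, re-solve — can be sketched compactly. This is a minimal NumPy illustration of the idea, not the authors' MATLAB/GPU implementation; the function name and the clipping of negative weights are assumptions of this sketch.

```python
import numpy as np

def optimize_fluence(beamlets, target, n_levels=7):
    """Analytic beamlet-weight optimization (sketch of the abstract's scheme).

    beamlets: (n_voxels, n_beamlets) matrix; each column is one beamlet's
    dose distribution, flattened.
    target: (n_voxels,) desired dose distribution.
    At each level, weights come from unconstrained least squares and only
    beamlets with positive weight survive to the next level.
    """
    active = np.arange(beamlets.shape[1])
    w = np.zeros(beamlets.shape[1])
    for _ in range(n_levels):
        sol, *_ = np.linalg.lstsq(beamlets[:, active], target, rcond=None)
        w[:] = 0.0
        w[active] = np.maximum(sol, 0.0)   # physical fluence cannot be negative
        keep = sol > 0
        if keep.all() or not keep.any():
            break
        active = active[keep]
    return w
```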
A method for radiological characterization based on fluence conversion coefficients
NASA Astrophysics Data System (ADS)
Froeschl, Robert
2018-06-01
Radiological characterization of components in accelerator environments is often required to ensure adequate radiation protection during maintenance, transport, and handling, as well as for the selection of the proper disposal pathway. The relevant quantities are typically weighted sums of specific activities with radionuclide-specific weighting coefficients. Traditional Monte Carlo methods either score radionuclide creation events directly, or score the particle fluences in the regions of interest and weight them off-line with radionuclide production cross sections. The presented method bases the radiological characterization on a set of fluence conversion coefficients. For a given irradiation profile and cool-down time, radionuclide production cross sections, material composition, and radionuclide-specific weighting coefficients, a set of particle-type- and energy-dependent fluence conversion coefficients is computed. These fluence conversion coefficients can then be used in a Monte Carlo transport code to perform on-line weighting and directly obtain the desired radiological characterization, either by using built-in multiplier features such as in the PHITS code or by writing a dedicated user routine such as for the FLUKA code. The presented method has been validated against the standard event-based methods directly available in Monte Carlo transport codes.
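The pre-computation step can be illustrated for the simplest case of a constant irradiation profile: fold the production cross sections with the build-up/decay factor and the radionuclide weights, leaving an energy-dependent coefficient to be multiplied by the scored fluence. This sketch assumes a single target material and a constant beam; all names and units are this sketch's conventions, not those of the paper's implementation.

```python
import numpy as np

def fluence_conversion_coeffs(xs, weights, number_density, decay_consts,
                              t_irr, t_cool):
    """Fold production cross sections into fluence conversion coefficients.

    xs: (n_nuclides, n_energy) production cross sections (cm^2).
    weights: (n_nuclides,) radionuclide-specific weighting coefficients.
    number_density: target atoms per cm^3.
    decay_consts: (n_nuclides,) decay constants (1/s).
    t_irr, t_cool: irradiation and cool-down times (s).

    Returns (n_energy,) coefficients that, multiplied by the fluence-rate
    spectrum scored in a transport code, give the weighted specific
    activity directly.
    """
    # Saturation during irradiation, then decay over the cool-down time
    build_decay = (1.0 - np.exp(-decay_consts * t_irr)) * np.exp(-decay_consts * t_cool)
    return number_density * (weights * build_decay) @ xs
```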
Expected neutrino fluence from short Gamma-Ray Burst 170817A and off-axis angle constraints
NASA Astrophysics Data System (ADS)
Biehl, D.; Heinze, J.; Winter, W.
2018-05-01
We compute the expected neutrino fluence from SGRB 170817A, associated with the gravitational wave event GW 170817, directly based on Fermi observations in two scenarios: structured jet and off-axis (observed) top-hat jet. While the expected neutrino fluence for the structured jet case is very small, large off-axis angles imply high radiation densities in the jet, which can enhance the neutrino production efficiency. In the most optimistic allowed scenario, the neutrino fluence can reach only 10^-4 of the sensitivity of the neutrino telescopes. We furthermore demonstrate that the fact that gamma-rays can escape limits the baryonic loading (energy in protons versus photons) and the off-axis angle for the internal shock scenario. In particular, for a baryonic loading of 10, the off-axis angle is more strongly constrained by the baryonic loading than by the time delay between the gravitational wave event and the onset of the gamma-ray emission.
Chow, James C.L.; Grigorov, Grigor N.; Yazdani, Nuri
2006-01-01
A custom‐made computer program, SWIMRT, to construct “multileaf collimator (MLC) machine” file for intensity‐modulated radiotherapy (IMRT) fluence maps was developed using MATLAB® and the sliding window algorithm. The user can either import a fluence map with a graphical file format created by an external treatment‐planning system such as Pinnacle3 or create his or her own fluence map using the matrix editor in the program. Through comprehensive calibrations of the dose and the dimension of the imported fluence field, the user can use associated image‐processing tools such as field resizing and edge trimming to modify the imported map. When the processed fluence map is suitable, a “MLC machine” file is generated for our Varian 21 EX linear accelerator with a 120‐leaf Millennium MLC. This machine file is transferred to the MLC console of the LINAC to control the continuous motions of the leaves during beam irradiation. An IMRT field is then irradiated with the 2D intensity profiles, and the irradiated profiles are compared to the imported or modified fluence map. This program was verified and tested using film dosimetry to address the following uncertainties: (1) the mechanical limitation due to the leaf width and maximum traveling speed, and (2) the dosimetric limitation due to the leaf leakage/transmission and penumbra effect. Because the fluence map can be edited, resized, and processed according to the requirement of a study, SWIMRT is essential in studying and investigating the IMRT technique using the sliding window algorithm. Using this program, future work on the algorithm may include redistributing the time space between segmental fields to enhance the fluence resolution, and readjusting the timing of each leaf during delivery to avoid small fields. Possible clinical utilities and examples for SWIMRT are given in this paper. PACS numbers: 87.53.Kn, 87.53.St, 87.53.Uv PMID:17533330
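The sliding-window idea SWIMRT builds on can be sketched for a single leaf pair in one dimension: where the fluence profile rises, the leading leaf sweeps at maximum speed while the trailing leaf absorbs the increment, and vice versa where it falls. This is a textbook sketch under simplified assumptions (constant dose rate, one leaf pair, no transmission or penumbra), not SWIMRT itself.

```python
def sliding_window(fluence, leaf_speed=1.0, dx=1.0):
    """1D sliding-window leaf sequencing (textbook sketch).

    fluence: desired fluence profile sampled at positions dx apart (MU).
    Returns (uncover, cover): cumulative MU at which the leading leaf
    uncovers, and the trailing leaf covers, each sample point. The
    delivered fluence at sample i is cover[i] - uncover[i].
    """
    inv_v = dx / leaf_speed          # MU spent crossing one sample at max speed
    uncover, cover = [0.0], [fluence[0]]
    for i in range(1, len(fluence)):
        df = fluence[i] - fluence[i - 1]
        if df >= 0:
            # Fluence rises: leading leaf moves at max speed, trailing lags
            uncover.append(uncover[-1] + inv_v)
            cover.append(cover[-1] + inv_v + df)
        else:
            # Fluence falls: trailing leaf moves at max speed, leading lags
            uncover.append(uncover[-1] + inv_v - df)
            cover.append(cover[-1] + inv_v)
    return uncover, cover
```

Both trajectories are monotone and never exceed the maximum leaf speed, which is exactly the constraint set the MLC console enforces.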
WE-AB-209-10: Optimizing the Delivery of Sequential Fluence Maps for Efficient VMAT Delivery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, D; Balvert, M
2016-06-15
Purpose: To develop an optimization model and solution approach for computing MLC leaf trajectories and dose rates for high-quality matching of a set of optimized fluence maps to be delivered sequentially around a patient in a VMAT treatment. Methods: We formulate the fluence map matching problem as a nonlinear optimization problem where time is discretized but dose rates and leaf positions are continuous variables. For a given allotted time, which is allocated across the fluence maps based on the complexity of each map, the optimization problem searches for the best leaf trajectories and dose rates such that the original fluence maps are closely recreated. Constraints include maximum leaf speed, maximum dose rate, and leaf collision avoidance, as well as the constraint that the ending leaf positions for one map are the starting leaf positions for the next map. The resulting model is non-convex but smooth, and therefore we solve it by local searches from a variety of starting positions. We improve solution time by a custom decomposition approach which allows us to decouple the rows of the fluence maps and solve each leaf pair individually. This decomposition also makes the problem easily parallelized. Results: We demonstrate the method on a prostate case and a head-and-neck case and show that one can recreate fluence maps to a high degree of fidelity in modest total delivery time (minutes). Conclusion: We present a VMAT sequencing method that reproduces optimal fluence maps by searching over a vast number of possible leaf trajectories. By varying the total allotted time, this approach is the first of its kind to allow users to produce VMAT solutions that span the range from wide-field coarse VMAT deliveries to narrow-field high-MU sliding-window-like approaches.
Radiation damage of gallium arsenide production cells
NASA Technical Reports Server (NTRS)
Mardesich, N.; Joslin, D.; Garlick, J.; Lillington, D.; Gillanders, M.; Cavicchi, B.; Scott-Monck, J.; Kachare, R.; Anspaugh, B.
1987-01-01
High efficiency liquid phase epitaxy (LPE) gallium arsenide cells were irradiated with 1 MeV electrons up to fluences of 1×10^16 cm^-2. Measurements of spectral response and dark and illuminated I-V data were made at each fluence and then, using computer codes, the experimental data were fitted to gallium arsenide cell models. In this way it was possible to determine the extent of the damage, and hence the damage coefficients in both the emitter and base of the cell.
NASA Technical Reports Server (NTRS)
King, J. H.; Stassinopoulos, E. G.
1975-01-01
The relative importance of solar and trapped proton fluxes in the consideration of shielding requirements for geocentric space missions is analyzed. Using models of these particles, their fluences encountered by spacecraft in circular orbits are computed as functions of orbital altitude and inclination, mission duration, threshold energy (10 to 100 MeV), and risk factor (for solar protons only), and ratios of solar-to-trapped fluences are derived. It is shown that solar protons predominate for low-altitude polar and very high-altitude missions, while trapped protons predominate for missions at low and medium altitudes and low inclinations. It is recommended that if the ratio of solar-to-trapped protons falls between 0.1 and 10, both fluences should be considered in planning shielding systems.
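The recommendation in the abstract amounts to a simple decision rule on the solar-to-trapped fluence ratio, which can be written down directly (a sketch; the thresholds are the 0.1 and 10 quoted above, and the fluence values in the example are made up):

```python
def shielding_guidance(solar_fluence, trapped_fluence):
    """Decision rule from the abstract: if the solar-to-trapped proton
    fluence ratio falls between 0.1 and 10, both populations should be
    considered in shielding design; outside that band one dominates."""
    ratio = solar_fluence / trapped_fluence
    if ratio > 10.0:
        return "solar-dominated"
    if ratio < 0.1:
        return "trapped-dominated"
    return "consider both"
```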
NASA Technical Reports Server (NTRS)
Harris, Richard D.
2008-01-01
Commercial silicon carbide and silicon Schottky barrier power diodes have been subjected to 203 MeV proton irradiation and the effects of the resultant displacement damage on the I-V characteristics have been observed. Changes in forward bias I-V characteristics are reported for fluences up to 4×10^14 p/cm^2. For devices of both material types, the series resistance is observed to increase as the fluence increases. The changes in series resistance result from changes in the free carrier concentration due to carrier removal by the defects produced. A simple model is presented that allows calculation of the series resistance of the device and then relates the carrier removal rate to the changes in series resistance. Using this model to calculate the carrier removal rate in both materials reveals that the carrier removal rate in silicon is less than that in silicon carbide, indicating that silicon is the more radiation tolerant material.
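One plausible reading of the simple model described here, offered only as a sketch, is linear carrier removal with fluence, n(Φ) = n0 − Rc·Φ, feeding a drift-region resistance R = L/(q·μ·n·A). All symbol names and the numbers in the example are illustrative, not taken from the paper.

```python
Q = 1.602e-19  # elementary charge, C

def series_resistance(phi, n0, rc, mu, length, area):
    """Toy carrier-removal model: fluence phi removes carriers at rate
    rc (n = n0 - rc*phi), and the series resistance of the drift
    region of length `length` and area `area` is L/(q*mu*n*A)."""
    n = n0 - rc * phi
    if n <= 0:
        raise ValueError("all carriers removed at this fluence")
    return length / (Q * mu * n * area)
```

Because n decreases with fluence, the computed resistance rises monotonically, reproducing the qualitative trend reported in the abstract; a larger removal rate rc (as inferred for SiC) gives a faster rise.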
ERIC Educational Resources Information Center
Pollard, Jim
This report reviews eight IBM-compatible software packages that are available to secondary schools to teach computer-aided drafting (CAD). Software packages to be considered were selected following reviews of CAD periodicals, computers in education periodicals, advertisements, and recommendations of teachers. The packages were then rated by…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thiyagarajan, Rajesh; Karrthick, KP; Kataria, Tejinder
Purpose: Performing DQA for bilateral (B-L) breast tomotherapy is a challenging task due to the limitations of commercially available detector arrays and film. The aim of this study is to perform DQA for a B-L breast tomotherapy plan using the MLC fluence sinogram. Methods: A treatment plan was generated on the Tomotherapy system for a B-L breast tumour. The B-L breast targets were prescribed 50.4 Gy over 28 fractions. The plan was generated with a 6 MV photon beam and the pitch was set to 0.3. The total target width is 39 cm (left and right) and the length is 20 cm. The DQA plan was delivered without any phantom onto the megavoltage computed tomography (MVCT) detector system. The pulses recorded by the MVCT system were exported to the delivery analysis software (Tomotherapy Inc.) for reconstruction. The detector signals were reconstructed into a sinogram and converted to an MLC fluence sinogram, which was compared with the planned fluence sinogram. A point dose was also measured with a cheese phantom and ionization chamber to verify the absolute dose component. Results: The planned fluence sinogram and the reconstructed MLC fluence sinogram were compared using the gamma metric, with MLC positional difference and beamlet intensity as the evaluation parameters. A 3 mm positional difference and a 3% beamlet intensity difference were set for the gamma calculation. A total of 26784 non-zero beamlets were included in the analysis, of which 161 beamlets had gamma greater than 1, giving a gamma passing rate of 99.4%. Point dose measurements were within 1.3% of the calculated dose. Conclusion: MLC fluence sinogram based delivery quality assurance was performed for bilateral breast irradiation. This would be a suitable alternative for large-volume targets such as bilateral breast and total body irradiation. However, the conventional method of DQA should be used to validate this method periodically.
Constraints on the Early Terrestrial Surface UV Environment Relevant to Prebiotic Chemistry.
Ranjan, Sukrit; Sasselov, Dimitar D
2017-03-01
The UV environment is a key boundary condition to abiogenesis. However, considerable uncertainty exists as to planetary conditions and hence surface UV at abiogenesis. Here, we present two-stream multilayer clear-sky calculations of the UV surface radiance on Earth at 3.9 Ga to constrain the UV surface fluence as a function of albedo, solar zenith angle (SZA), and atmospheric composition. Variation in albedo and latitude (through SZA) can affect maximum photoreaction rates by a factor of >10.4; for the same atmosphere, photoreactions can proceed an order of magnitude faster at the equator of a snowball Earth than at the poles of a warmer world. Hence, surface conditions are important considerations when computing prebiotic UV fluences. For climatically reasonable levels of CO2, fluence shortward of 189 nm is screened out, meaning that prebiotic chemistry is robustly shielded from variations in UV fluence due to solar flares or variability. Strong shielding from CO2 also means that the UV surface fluence is insensitive to plausible levels of CH4, O2, and O3. At scattering wavelengths, UV fluence drops off comparatively slowly with increasing CO2 levels. However, if SO2 and/or H2S can build up to the ≥1-100 ppm level as hypothesized by some workers, then they can dramatically suppress surface fluence and hence prebiotic photoprocesses. H2O is a robust UV shield for λ < 198 nm. This means that regardless of the levels of other atmospheric gases, fluence ≲198 nm is only available for cold, dry atmospheres, meaning sources with emission ≲198 nm (e.g., ArF excimer lasers) can only be used in simulations of cold environments with a low abundance of volcanogenic gases. On the other hand, fluence at 254 nm is unshielded by H2O and is available across a broad range of NCO2, meaning that mercury lamps are suitable for initial studies regardless of the uncertainty in primordial H2O and CO2 levels.
Key Words: Radiative transfer-Origin of life-Planetary environments-UV radiation-Prebiotic chemistry. Astrobiology 17, 169-204.
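The qualitative dependence of surface fluence on atmospheric column and solar zenith angle can be illustrated with a pure-absorption Beer-Lambert sketch. The paper itself uses two-stream multilayer radiative transfer, which also treats scattering; the function and the numbers in the example are illustrative only.

```python
import math

def surface_fluence(top_of_atmosphere, column, sigma, mu0=1.0):
    """Beer-Lambert attenuation of a top-of-atmosphere UV fluence:
    column is the absorber column density (cm^-2), sigma the absorption
    cross section at the wavelength of interest (cm^2), and
    mu0 = cos(solar zenith angle) lengthens the slant path."""
    return top_of_atmosphere * math.exp(-column * sigma / mu0)
```

Even this crude model reproduces the two behaviours stressed in the abstract: a large enough absorber column (e.g. CO2 shortward of 189 nm) screens the surface almost completely, and higher SZA (smaller mu0) further suppresses the fluence.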
AGC-2 Graphite Pre-irradiation Data Package
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Swank; Joseph Lord; David Rohrbaugh
2010-08-01
The NGNP Graphite R&D program is currently establishing the safe operating envelope of graphite core components for a Very High Temperature Reactor (VHTR) design. The program is generating quantitative data necessary for predicting the behavior and operating performance of the new nuclear graphite grades. To determine the in-service behavior of the graphite for pebble bed and prismatic designs, the Advanced Graphite Creep (AGC) experiment is underway. This experiment is examining the properties and behavior of nuclear grade graphite over a large spectrum of temperatures, neutron fluences and compressive loads. Each experiment consists of over 400 graphite specimens that are characterized prior to irradiation and following irradiation. Six experiments are planned with the first, AGC-1, currently being irradiated in the Advanced Test Reactor (ATR) and pre-irradiation characterization of the second, AGC-2, completed. This data package establishes the readiness of 512 specimens for assembly into the AGC-2 capsule.
Plasma focus ion beam-scaling laws
NASA Astrophysics Data System (ADS)
Saw, S. H.
2014-08-01
Measurements on plasma focus ion beams include various advanced techniques producing a variety of data which has yet to produce benchmark numbers. Recent numerical experiments using an extended version of the Lee Code have produced reference numbers and scaling trends for the number and energy fluence of deuteron beams as functions of stored energy E0. At the pinch exit the ion number fluence (ions m^-2) and energy fluence (J m^-2), computed as 2.4-7.8×10^20 and 2.2-33×10^6 respectively, were found to be independent of E0 from 0.4 to 486 kJ. This work was extended to the ion beams for various gases. The results show that, for a given plasma focus, the fluence, flux, ion number and ion current decrease from the lightest to the heaviest gas except for trend-breaking higher values for Ar fluence and flux. The energy fluence, energy flux, power flow and damage factors are relatively constant from H2 to N2 but increase for Ne, Ar, Kr and Xe due to radiative cooling and collapse effects. This paper reviews this work and in a concluding section attempts to put the accumulating large amounts of data into the form of a scaling law of beam energy Ebeam versus storage energy E0, taking the form for deuterons: Ebeam = 18.2 E0^1.23, where Ebeam is in J and E0 is in kJ. It is hoped that the establishment of such scaling laws places the reference quantitative ideas for plasma focus ion beams on a firm footing.
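The quoted deuteron scaling law is a one-line power law; writing it out makes the units explicit (E0 in kJ, Ebeam in J, as stated in the abstract):

```python
def beam_energy_J(e0_kJ):
    """Deuteron beam-energy scaling law quoted in the abstract:
    Ebeam [J] = 18.2 * E0^1.23, with the stored energy E0 in kJ."""
    return 18.2 * e0_kJ ** 1.23
```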
Neutron Fluence And DPA Rate Analysis In Pebble-Bed HTR Reactor Vessel Using MCNP
NASA Astrophysics Data System (ADS)
Hamzah, Amir; Suwoto; Rohanda, Anis; Adrial, Hery; Bakhri, Syaiful; Sunaryo, Geni Rina
2018-02-01
In the Pebble-bed HTR reactor, the reactor vessel is very close to the core and the media in between are carbon and He gas. The neutron moderation capability of graphite is theoretically lower than that of the moderator in water-moderated reactors, so many more fast neutrons are expected to reach the reactor vessel. Collisions of fast neutrons with the atoms of the reactor vessel cause radiation damage and could reduce the vessel's life. The purpose of this study was to obtain the magnitude of the neutron fluence in the Pebble-bed HTR reactor vessel. Neutron fluence calculations in the pebble-bed HTR reactor vessel were performed using the MCNP computer program. By placing tallies at the position of the Pebble-bed HTR reactor vessel, the flux, spectrum and neutron fluence there can be calculated. The calculated total and fast neutron fluxes in the reactor vessel are 1.82×10^8 n/cm^2/s and 1.79×10^8 n/cm^2/s, respectively. The fast neutron fluence in the reactor vessel is 3.4×10^17 n/cm^2 for 60 years of reactor operation. Radiation damage in stainless steel caused by high-energy neutrons (> 1.0 MeV) occurs once the neutron fluence reaches 1.0×10^24 n/cm^2. The neutron fluence results show that there will be no radiation damage in the Pebble-bed HTR reactor vessel, so it is predicted to be safe to operate for at least 60 years.
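The lifetime-fluence arithmetic behind the conclusion is just fluence = flux × time, which can be checked directly against the numbers in the abstract:

```python
def fast_fluence(flux_n_cm2_s, years):
    """Accumulated neutron fluence (n/cm^2) from a constant flux
    (n/cm^2/s) over a given number of years of operation."""
    seconds_per_year = 365.25 * 24 * 3600
    return flux_n_cm2_s * years * seconds_per_year
```

With the quoted fast flux of 1.79×10^8 n/cm^2/s over 60 years this reproduces the abstract's ~3.4×10^17 n/cm^2, many orders of magnitude below the 1.0×10^24 n/cm^2 damage level cited for stainless steel.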
Function Package for Computing Quantum Resource Measures
NASA Astrophysics Data System (ADS)
Huang, Zhiming
2018-05-01
In this paper, we present a function package to calculate quantum resource measures and the dynamics of open systems. Our package includes common operators and operator lists, and frequently-used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use it with several typical examples. We expect that this package will be a useful tool for future research and education.
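As an illustration of the kind of measure such a package provides, here is the l1-norm of coherence, a standard quantum-coherence quantifier (the sum of absolute values of the density matrix's off-diagonal elements). This is a generic sketch, not code from the package.

```python
def l1_coherence(rho):
    """l1-norm of coherence of a density matrix given as a nested list
    (entries may be complex): sum of |rho_ij| over all i != j."""
    total = 0.0
    for i, row in enumerate(rho):
        for j, elem in enumerate(row):
            if i != j:
                total += abs(elem)
    return total
```

For the pure state |+><+| = [[0.5, 0.5], [0.5, 0.5]] the measure equals 1, its maximum for a qubit, while any diagonal (incoherent) state gives 0.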
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.
Prediction of LDEF ionizing radiation environment
NASA Astrophysics Data System (ADS)
Watts, John W.; Parnell, T. A.; Derrickson, James H.; Armstrong, T. W.; Benton, E. V.
1992-01-01
The Long Duration Exposure Facility (LDEF) spacecraft flew in a 28.5 deg inclination circular orbit with an altitude in the range from 172 to 258.5 nautical miles. For this orbital altitude and inclination, two components contribute most of the penetrating charged-particle radiation encountered: the galactic cosmic rays and the geomagnetically trapped Van Allen protons. Where shielding is less than 1.0 g/sq cm, geomagnetically trapped electrons make a significant contribution. The 'Vette' models together with the associated magnetic field models were used to obtain the trapped electron and proton fluences. The mission proton doses were obtained from the fluence using the Burrell proton dose program. For the electron and bremsstrahlung dose we used the Marshall Space Flight Center (MSFC) electron dose program. The predicted doses were in general agreement with those measured with on-board thermoluminescent detector (TLD) dosimeters. The NRL package of programs, Cosmic Ray Effects on MicroElectronics (CREME), was used to calculate the linear energy transfer (LET) spectrum due to galactic cosmic rays (GCR) and trapped protons for comparison with LDEF measurements.
NASA Astrophysics Data System (ADS)
Chytyk-Praznik, Krista Joy
Radiation therapy is continuously increasing in complexity due to technological innovation in delivery techniques, necessitating thorough dosimetric verification. Comparing accurately predicted portal dose images to measured images obtained during patient treatment can determine if a particular treatment was delivered correctly. The goal of this thesis was to create a method to predict portal dose images that was versatile and accurate enough to use in a clinical setting. All measured images in this work were obtained with an amorphous silicon electronic portal imaging device (a-Si EPID), but the technique is applicable to any planar imager. A detailed, physics-motivated fluence model was developed to characterize fluence exiting the linear accelerator head. The model was further refined using results from Monte Carlo simulations and schematics of the linear accelerator. The fluence incident on the EPID was converted to a portal dose image through a superposition of Monte Carlo-generated, monoenergetic dose kernels specific to the a-Si EPID. Predictions of clinical IMRT fields with no patient present agreed with measured portal dose images within 3% and 3 mm. The dose kernels were applied ignoring the geometrically divergent nature of incident fluence on the EPID. A computational investigation into this parallel dose kernel assumption determined its validity under clinically relevant situations. Introducing a patient or phantom into the beam required the portal image prediction algorithm to account for patient scatter and attenuation. Primary fluence was calculated by attenuating raylines cast through the patient CT dataset, while scatter fluence was determined through the superposition of pre-calculated scatter fluence kernels. Total dose in the EPID was calculated by convolving the total predicted incident fluence with the EPID-specific dose kernels. The algorithm was tested on water slabs with square fields, agreeing with measurement within 3% and 3 mm. 
The method was then applied to five prostate and six head-and-neck IMRT treatment courses (˜1900 clinical images). Deviations between the predicted and measured images were quantified. The portal dose image prediction model developed in this thesis work has been shown to be accurate, and it was demonstrated to be able to verify patients' delivered radiation treatments.
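The primary-fluence step described above (attenuating raylines cast through the patient CT dataset) reduces, for a single ray, to accumulating exp(-Σ μ·Δl) over the voxels the ray crosses. A minimal sketch, with illustrative names and a uniform step length:

```python
import math

def primary_transmission(mu_values, step_cm):
    """Transmission of primary fluence along one rayline: mu_values
    are the linear attenuation coefficients (1/cm) of the voxels the
    ray traverses, each crossed over a path length step_cm."""
    return math.exp(-sum(mu_values) * step_cm)
```

The predicted primary fluence at the EPID is then the in-air fluence multiplied by this transmission factor, ray by ray; scatter is handled separately via the pre-calculated kernels.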
An accurate method for computer-generating tungsten anode x-ray spectra from 30 to 140 kV.
Boone, J M; Seibert, J A
1997-11-01
A tungsten anode spectral model using interpolating polynomials (TASMIP) was used to compute x-ray spectra at 1 keV intervals over the range from 30 kV to 140 kV. The TASMIP is not semi-empirical and uses no physical assumptions regarding x-ray production, but rather interpolates measured constant potential x-ray spectra published by Fewell et al. [Handbook of Computed Tomography X-ray Spectra (U.S. Government Printing Office, Washington, D.C., 1981)]. X-ray output measurements (mR/mAs measured at 1 m) were made on a calibrated constant potential generator in our laboratory from 50 kV to 124 kV, and with 0-5 mm added aluminum filtration. The Fewell spectra were slightly modified (numerically hardened) and normalized based on the attenuation and output characteristics of a constant potential generator and metal-insert x-ray tube in our laboratory. Then, using the modified Fewell spectra of different kVs, the photon fluence phi at each 1 keV energy bin (E) over energies from 10 keV to 140 keV was characterized using polynomial functions of the form phi(E) = a0[E] + a1[E]·kV + a2[E]·kV^2 + ... + an[E]·kV^n. A total of 131 polynomial functions were used to calculate accurate x-ray spectra, each function requiring between two and four terms. The resulting TASMIP algorithm produced x-ray spectra that match both the quality and quantity characteristics of the x-ray system in our laboratory. For photon fluences above 10% of the peak fluence in the spectrum, the average percent difference (and standard deviation) between the modified Fewell spectra and the TASMIP photon fluence was -1.43% (3.8%) for the 50 kV spectrum, -0.89% (1.37%) for the 70 kV spectrum, and for the 80, 90, 100, 110, 120, 130 and 140 kV spectra, the mean differences between spectra were all less than 0.20% and the standard deviations were less than approximately 1.1%. The model was also extended to include the effects of generator-induced kV ripple.
Finally, the x-ray photon fluence in the units of photons/mm2 per mR was calculated as a function of HVL, kV, and ripple factor, for various (water-equivalent) patient thicknesses (0, 10, 20, and 30 cm). These values may be useful for computing the detective quantum efficiency, DQE(f), of x-ray detector systems. The TASMIP algorithm and ancillary data are made available on line at http:/(/)www.aip.org/epaps/epaps.html.
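Evaluating a TASMIP-style model is just a per-bin polynomial in tube potential; a minimal sketch (the coefficients in the example are made up — the real ones are fitted per 1-keV bin to the modified Fewell spectra):

```python
def tasmip_fluence(kv, coeffs):
    """Photon fluence in one 1-keV energy bin as a polynomial in tube
    potential: phi(E) = a0 + a1*kV + a2*kV^2 + ..., clamped at zero
    since a fitted polynomial can dip negative below threshold."""
    phi = 0.0
    for n, a in enumerate(coeffs):
        phi += a * kv ** n
    return max(phi, 0.0)
```

A full spectrum is then assembled by evaluating one such polynomial (two to four terms each, per the abstract) for every energy bin from 10 to 140 keV.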
NASA Astrophysics Data System (ADS)
Levi, Michele; Steinhoff, Jan
2017-12-01
We present a novel public package ‘EFTofPNG’ for high precision computation in the effective field theory of post-Newtonian (PN) gravity, including spins. We created this package in view of the timely need to publicly share automated computation tools, which integrate the various types of physics manifested in the expected increasing influx of gravitational wave (GW) data. Hence, we created a free and open source package, which is self-contained, modular, all-inclusive, and accessible to the classical gravity community. The ‘EFTofPNG’ Mathematica package also uses the power of the ‘xTensor’ package, suited for complicated tensor computation, where our coding also strategically approaches the generic generation of Feynman contractions, which is universal to all perturbation theories in physics, by efficiently treating n-point functions as tensors of rank n. The package currently contains four independent units, which serve as subsidiaries to the main one. Its final unit serves as a pipeline for obtaining the final GW templates, and provides the full computation of derivatives and physical observables of interest. The upcoming ‘EFTofPNG’ package version 1.0 should cover the point mass sector, and all the spin sectors, up to the fourth PN order, and the two-loop level. We expect and strongly encourage public development of the package to improve its efficiency, and to extend it to further PN sectors, and observables useful for the waveform modelling.
LDEF: Dosimetric measurement results (AO 138-7 experiment)
NASA Technical Reports Server (NTRS)
Bourrieau, J.
1992-01-01
One of the objectives of the AO 138-7 experiment on board the LDEF was a total dose measurement with Thermo Luminescent Detectors (TLD 100). Two identical cases, both including 5 TLDs inside various aluminum shields, are exposed to the space environment in order to obtain the absorbed dose profile induced. The radiation fluence received during the total mission length was computed, taking into account the trapped particles (solar maximum and solar minimum periods) and the cosmic rays; due to the magnetospheric shielding, the solar proton fluences are negligible on the LDEF orbit. The total dose induced by these radiations inside a semi-infinite plane shield of Al is computed with radiation transport codes. TLD readings are performed after flight; due to the mission duration increase, a post-flight calibration was necessary in order to cover the range of the flight-induced dose. The results obtained, similar (±30%) in both cases, are compared with the dose profile computation. In practice, these LDEF results, with less than a factor of 1.4 between measurements and forecasts, reinforce the validity of the computation methods and models used for the long term evaluation of space radiation intensity on low inclination Earth orbits.
Differential pencil beam dose computation model for photons.
Mohan, R; Chui, C; Lidofsky, L
1986-01-01
Differential pencil beam (DPB) is defined as the dose distribution relative to the position of the first collision, per unit collision density, for a monoenergetic pencil beam of photons in an infinite homogeneous medium of unit density. We have generated DPB dose distribution tables for a number of photon energies in water using the Monte Carlo method. The three-dimensional (3D) nature of the transport of photons and electrons is automatically incorporated in DPB dose distributions. Dose is computed by evaluating 3D integrals of DPB dose. The DPB dose computation model has been applied to calculate dose distributions for 60Co and accelerator beams. Calculations for the latter are performed using energy spectra generated with the Monte Carlo program. To predict dose distributions near the beam boundaries defined by the collimation system as well as blocks, we utilize the angular distribution of incident photons. Inhomogeneities are taken into account by attenuating the primary photon fluence exponentially utilizing the average total linear attenuation coefficient of intervening tissue, by multiplying photon fluence by the linear attenuation coefficient to yield the number of collisions in the scattering volume, and by scaling the path between the scattering volume element and the computation point by an effective density.
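The superposition at the heart of the DPB model, reduced to one dimension for illustration, is: dose at a point = sum over first-collision sites of (collision density) × (DPB dose per unit collision) at the corresponding offset. This is a hedged sketch with illustrative names, not the authors' 3D implementation.

```python
def dose_from_dpb(collision_density, dpb_kernel):
    """1D DPB superposition: collision_density[i] is the number of
    first collisions per unit length at position i; dpb_kernel[k] is
    the dose per unit collision deposited k positions away (assumed
    symmetric about the collision site)."""
    n = len(collision_density)
    dose = [0.0] * n
    for i, c in enumerate(collision_density):
        for j in range(n):
            k = abs(j - i)
            if k < len(dpb_kernel):
                dose[j] += c * dpb_kernel[k]
    return dose
```

In the real model the same sum is a 3D integral, with the collision density obtained from the attenuated primary photon fluence times the linear attenuation coefficient, and the kernel path scaled by effective density for inhomogeneities.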
The gputools package enables GPU computing in R.
Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan
2010-01-01
By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers. R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu
Measurement of X-ray intensity in mammography by a ferroelectric dosimeter
NASA Astrophysics Data System (ADS)
Alter, Albert J.
2005-07-01
Each year in the US over 20 million women undergo mammography, a relatively high dose x-ray examination of the breast, an organ relatively sensitive to the carcinogenic effect of ionizing radiation. The radiation risk from mammography is usually expressed in terms of mean glandular dose (MGD), which is calculated as the product of the measured entrance exposure (ESE) and a dose conversion factor that is a function of anode material, peak tube voltage (23 to 35 kVp), half-value layer, filtration, compressed breast thickness and breast composition. Mammographic units may have anodes made of molybdenum, rhodium or tungsten and filters of molybdenum, rhodium, or aluminum. In order to accommodate all these parameters, multiple extensive tables of conversion factors are required to cover the range of possibilities. Energy fluence and energy imparted are alternative measures of radiation hazard, which have been used in situations where geometry or filtration is unconventional, such as computed tomography or fluoroscopy. Unfortunately, at present there is no way to directly measure these quantities clinically. In radiation therapy applications, calorimetry has been used to measure energy absorbed. A ferroelectric-based detector has been described that measures energy fluence rate (x-ray intensity) for diagnostic x rays, 50 to 140 kVp, with an aluminum-filtered tungsten spectrum [Carvalho & Alter: IEEE Transactions 44(6) 1997]. This work explores the use of ferroelectric detectors to measure energy fluence, energy fluence rate and energy imparted in mammography. A detector interfaced with a laptop computer was developed to allow measurements on clinical units from five different manufacturers with targets of molybdenum, rhodium and tungsten and filters of molybdenum, rhodium, and aluminum of various thicknesses. The measurements provide the first values of energy fluence and energy imparted in mammography.
These measurements are compared with conventional parameters such as entrance exposure and mean glandular dose, as well as published values of energy imparted for other types of x-ray examinations. Advantages of measuring dose in terms of energy imparted in mammography are the simplicity of comparison with other sources of radiation exposure and the potential (relative ease) of measurement across a variety of anode and filter combinations.
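The MGD calculation described above is a single product of a measurement and a tabulated factor; the point of the extensive tables is that the factor varies with anode/filter, kVp, HVL and breast thickness/composition. A sketch, with made-up numbers in the example:

```python
def mean_glandular_dose(entrance_exposure_R, dgn_mGy_per_R):
    """MGD = measured entrance skin exposure (R) times a tabulated
    dose conversion factor (mGy per R), itself a function of anode
    material, kVp, HVL, filtration and compressed breast thickness."""
    return entrance_exposure_R * dgn_mGy_per_R
```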
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Uytven, Eric, E-mail: eric.vanuytven@cancercare.mb.ca; Van Beek, Timothy; McCowan, Peter M.
2015-12-15
Purpose: Radiation treatments are trending toward delivering higher doses per fraction under stereotactic radiosurgery and hypofractionated treatment regimens. There is a need for accurate 3D in vivo patient dose verification using electronic portal imaging device (EPID) measurements. This work presents a model-based technique to compute full three-dimensional patient dose reconstructed from on-treatment EPID portal images (i.e., transmission images). Methods: EPID dose is converted to incident fluence entering the patient using a series of steps which include converting measured EPID dose to fluence at the detector plane and then back-projecting the primary source component of the EPID fluence upstream of the patient. Incident fluence is then recombined with predicted extra-focal fluence and used to calculate 3D patient dose via a collapsed-cone convolution method. This method is implemented in an iterative manner, although in practice it provides accurate results in a single iteration. The robustness of the dose reconstruction technique is demonstrated with several simple slab phantom and nine anthropomorphic phantom cases. Prostate, head and neck, and lung treatments are all included as well as a range of delivery techniques including VMAT and dynamic intensity modulated radiation therapy (IMRT). Results: Results indicate that the patient dose reconstruction algorithm compares well with treatment planning system computed doses for controlled test situations. For simple phantom and square field tests, agreement was excellent with a 2%/2 mm 3D chi pass rate ≥98.9%. On anthropomorphic phantoms, the 2%/2 mm 3D chi pass rates ranged from 79.9% to 99.9% in the planning target volume (PTV) region and 96.5% to 100% in the low dose region (>20% of prescription, excluding PTV and skin build-up region). Conclusions: An algorithm to reconstruct delivered patient 3D doses from EPID exit dosimetry measurements was presented.
The method was applied to phantom and patient data sets, and to dynamic IMRT and VMAT delivery techniques. Results indicate that the EPID dose reconstruction algorithm presented in this work is suitable for clinical implementation.
Application of fluence field modulation to proton computed tomography for proton therapy imaging.
Dedes, G; De Angelis, L; Rit, S; Hansen, D; Belka, C; Bashkirov, V; Johnson, R P; Coutrakon, G; Schubert, K E; Schulte, R W; Parodi, K; Landry, G
2017-07-12
This simulation study presents the application of fluence field modulated computed tomography, initially developed for x-ray CT, to proton computed tomography (pCT). By using pencil beam (PB) scanning, fluence modulated pCT (FMpCT) may achieve variable image quality in a pCT image and imaging dose reduction. Three virtual phantoms, a uniform cylinder and two patients, were studied using Monte Carlo simulations of an ideal list-mode pCT scanner. Regions of interest (ROI) were selected for high image quality and only PBs intercepting them preserved full fluence (FF). Image quality was investigated in terms of accuracy (mean) and noise (standard deviation) of the reconstructed proton relative stopping power compared to reference values. Dose calculation accuracy on FMpCT images was evaluated in terms of dose volume histograms (DVH), range difference (RD) for beam-eye-view (BEV) dose profiles and gamma evaluation. Pseudo FMpCT scans were created from broad beam experimental data acquired with a list-mode pCT prototype. FMpCT noise in ROIs was equivalent to FF images and accuracy better than -1.3%(-0.7%) by using 1% of FF for the cylinder (patients). Integral imaging dose reduction of 37% and 56% was achieved for the two patients for that level of modulation. Corresponding DVHs from proton dose calculation on FMpCT images agreed to those from reference images and 96% of BEV profiles had RD below 2 mm, compared to only 1% for uniform 1% of FF. Gamma pass rates (2%, 2 mm) were 98% for FMpCT while for uniform 1% of FF they were as low as 59%. Applying FMpCT to preliminary experimental data showed that low noise levels and accuracy could be preserved in a ROI, down to 30% modulation. We have shown, using both virtual and experimental pCT scans, that FMpCT is potentially feasible and may allow a means of imaging dose reduction for a pCT scanner operating in PB scanning mode. 
This may be of particular importance to proton therapy given the low integral dose found outside the target.
10 CFR 431.92 - Definitions concerning commercial air conditioners and heat pumps.
Code of Federal Regulations, 2013 CFR
2013-01-01
... expressed in identical units of measurement. Commercial package air-conditioning and heating equipment means... application. Computer Room Air Conditioner means a basic model of commercial package air-conditioning and heating equipment (packaged or split) that is: Used in computer rooms, data processing rooms, or other...
Experimental characterization of the AFIT neutron facility. Master's thesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lessard, O.J.
1993-09-01
AFIT's Neutron Facility was characterized for room-return neutrons using a (252)Cf source and a Bonner sphere spectrometer with three experimental models: the shadow shield model; the Eisenhauer, Schwartz, and Johnson (ESJ) model; and the polynomial model. The free-field fluences at one meter from the ESJ and polynomial models were compared to the equivalent value from the accepted experimental shadow shield model to determine the suitability of the models in the AFIT facility. The polynomial model behaved erratically, as expected, while the ESJ model agreed to within 4.8% of the shadow shield model results for the four Bonner sphere calibration. The ratio of total fluence to free-field fluence at one meter for the ESJ model was then compared to the equivalent ratio obtained by a Monte Carlo Neutron-Photon transport code (MCNP), an accepted computational model. The ESJ model agreed to within 6.2% of the MCNP results. AFIT's fluence ratios were compared to equivalent ratios reported by three other neutron facilities, which verified that AFIT's results fit previously published trends based on room volumes. The ESJ model appeared adequate for health physics applications and was chosen for calibration of the AFIT facility. Neutron Detector, Bonner Sphere, Neutron Dosimetry, Room Characterization.
Consistent improvements in processor speed and computer access have substantially increased the use of computer modeling by experts and non-experts alike. Several new computer modeling packages operating under graphical operating systems (i.e. Microsoft Windows or Macintosh) m...
Structural Changes in Polymer Films by Fast Ion Implantation
NASA Astrophysics Data System (ADS)
Parada, M. A.; Minamisawa, R. A.; Muntele, C.; Muntele, I.; De Almeida, A.; Ila, D.
2006-11-01
In applications from food wrapping to solar sails, polymer films can be subjected to intense charged particle bombardment and implantation. ETFE (ethylene-tetrafluoroethylene), with its high impact resistance, is used for pumps, valves, tie wraps, and electrical components. PFA (tetrafluoroethylene-perfluoromethoxyethylene) and FEP (tetrafluoroethylene-hexafluoropropylene) are sufficiently biocompatible to be used as transcutaneous implants; since they resist damage from ionizing space radiation, they can also be used in aerospace engineering applications. PVDC (polyvinylidene chloride) is used for food packaging and, combined with other plastics, improves the oxygen barrier responsible for food preservation. Fluoropolymers are also known for their radiation dosimetry applications, which depend on the type and energy of the radiation as well as the beam intensity. In this work, ETFE, PFA, FEP and PVDC were irradiated with ions of keV and MeV energies at several fluences and were analyzed with techniques such as RGA, OAP, FTIR, ATR and Raman spectrophotometry. CF3 is the main species emitted from PFA and FEP when irradiated with MeV protons. H and HF are released from ETFE due to broken C-F and C-H bonds when the polymer is irradiated with keV nitrogen ions and protons. At high fluence, especially for keV Si and N, damage due to carbonization is observed, with the formation of hydroperoxide and polymer dehydrofluorination. The main broken bonds in PVDC are C-O and C-Cl, with the release of Cl and the formation of double carbon bonds. The ion fluence that causes damage, which could compromise fluoropolymer film applications, has been determined.
Open-source Software for Exoplanet Atmospheric Modeling
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Blecic, Jasmina; Harrington, Joseph
2018-01-01
I will present a suite of self-standing open-source tools to model and retrieve exoplanet spectra implemented for Python. These include: (1) a Bayesian-statistical package to run Levenberg-Marquardt optimization and Markov-chain Monte Carlo posterior sampling, (2) a package to compress line-transition data from HITRAN or Exomol without loss of information, (3) a package to compute partition functions for HITRAN molecules, (4) a package to compute collision-induced absorption, and (5) a package to produce radiative-transfer spectra of transit and eclipse exoplanet observations and atmospheric retrievals.
NASA Astrophysics Data System (ADS)
Christiansen, Christian; Hartmann, Daniel
This paper documents a package of menu-driven POLYPASCAL87 computer programs for handling grouped observations data from both sieving (increment data) and settling tube procedures (cumulative data). The package is designed deliberately for use on IBM-compatible personal computers. Two of the programs solve the numerical problem of determining the estimates of the four (main) parameters of the log-hyperbolic distribution and their derivatives. The package also contains a program for determining the mean, sorting, skewness, and kurtosis according to the standard moments. Moreover, the package contains procedures for smoothing and grouping of settling tube data. A graphic part of the package plots the data in a log-log plot together with the estimated log-hyperbolic curve; all estimated parameters accompany the plot. Another graphic option is a plot of the log-hyperbolic shape triangle with the (χ,ζ) position of the sample.
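For readers unfamiliar with the moment measures mentioned above, here is a minimal sketch in Python (not the original POLYPASCAL87 code; the function name is illustrative) of mean, sorting, skewness, and kurtosis computed from the standard moments of a grain-size sample:

```python
# Sketch: the four standard moment measures reported by such packages,
# computed here from raw phi values of a sample. Illustrative only.

def moment_statistics(phi):
    n = len(phi)
    mean = sum(phi) / n
    m2 = sum((x - mean) ** 2 for x in phi) / n   # second central moment
    m3 = sum((x - mean) ** 3 for x in phi) / n   # third central moment
    m4 = sum((x - mean) ** 4 for x in phi) / n   # fourth central moment
    sorting = m2 ** 0.5            # standard deviation ("sorting")
    skewness = m3 / sorting ** 3   # 0 for a symmetric distribution
    kurtosis = m4 / sorting ** 4   # 3.0 for a Gaussian
    return mean, sorting, skewness, kurtosis
```

In practice these would be evaluated on grouped (binned) data with class weights, but the definitions are the same.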
DOE Office of Scientific and Technical Information (OSTI.GOV)
Persson, Mats, E-mail: mats.persson@mi.physics.kth
Purpose: The highest photon fluence rate that a computed tomography (CT) detector must be able to measure is an important parameter. The authors calculate the maximum transmitted fluence rate in a commercial CT scanner as a function of patient size for standard head, chest, and abdomen protocols. Methods: The authors scanned an anthropomorphic phantom (Kyoto Kagaku PBU-60) with the reference CT protocols provided by AAPM on a GE LightSpeed VCT scanner and noted the tube current applied with the tube current modulation (TCM) system. By rescaling this tube current using published measurements on the tube current modulation of a GE scanner [N. Keat, “CT scanner automatic exposure control systems,” MHRA Evaluation Report 05016, ImPACT, London, UK, 2005], the authors could estimate the tube current that these protocols would have resulted in for other patient sizes. An ECG gated chest protocol was also simulated. Using measured dose rate profiles along the bowtie filters, the authors simulated imaging of anonymized patient images with a range of sizes on a GE VCT scanner and calculated the maximum transmitted fluence rate. In addition, the 99th and the 95th percentiles of the transmitted fluence rate distribution behind the patient are calculated, and the effect of omitting projection lines passing just below the skin line is investigated. Results: The highest transmitted fluence rates on the detector for the AAPM reference protocols with centered patients are found for head images and for intermediate-sized chest images, both with a maximum of 3.4 × 10⁸ mm⁻² s⁻¹, at 949 mm distance from the source. Miscentering the head by 50 mm downward increases the maximum transmitted fluence rate to 5.7 × 10⁸ mm⁻² s⁻¹. The ECG gated chest protocol gives fluence rates up to 2.3 × 10⁸ to 3.6 × 10⁸ mm⁻² s⁻¹ depending on miscentering. 
Conclusions: The fluence rate on a CT detector reaches 3 × 10⁸ to 6 × 10⁸ mm⁻² s⁻¹ in standard imaging protocols, with the highest rates occurring for ECG gated chest and miscentered head scans. These results will be useful to developers of CT detectors, in particular photon counting detectors.
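As a rough illustration of why thin attenuation paths (miscentered heads, lung) dominate the maximum rate, a simple Beer-Lambert estimate reproduces the order of magnitude quoted above. This is not the authors' simulation method; the attenuation coefficient and unattenuated rate below are assumed round numbers:

```python
import math

# Illustrative estimate: transmitted fluence rate behind a
# water-equivalent path of length t falls roughly as exp(-mu * t).
# Both constants below are assumptions, not measured values.
MU_WATER = 0.02   # 1/mm, rough effective attenuation at CT energies
PHI_AIR = 1.0e9   # 1/(mm^2 s), assumed unattenuated rate at the detector

def transmitted_rate(path_mm, mu=MU_WATER, phi0=PHI_AIR):
    """Estimated fluence rate (1/mm^2/s) behind a water path of path_mm."""
    return phi0 * math.exp(-mu * path_mm)
```

With these assumed numbers, a 50 mm water path gives about 3.7 × 10⁸ mm⁻² s⁻¹, comparable to the maxima reported above, while a 300 mm abdominal path gives orders of magnitude less.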
NMRbox: A Resource for Biomolecular NMR Computation.
Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C
2017-04-25
Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y M; Bush, K; Han, B
Purpose: Accurate and fast dose calculation is a prerequisite of precision radiation therapy in modern photon and particle therapy. While Monte Carlo (MC) dose calculation provides high dosimetric accuracy, the drastically increased computational time hinders its routine use. Deterministic dose calculation methods are fast, but problematic in the presence of tissue density inhomogeneity. We leverage the useful features of deterministic methods and MC to develop a hybrid dose calculation platform with autonomous utilization of MC and deterministic calculation depending on the local geometry, for optimal accuracy and speed. Methods: Our platform utilizes a Geant4 based “localized Monte Carlo” (LMC) method that isolates MC dose calculations to only those volumes that have potential for dosimetric inaccuracy. In our approach, additional structures are created encompassing heterogeneous volumes. Deterministic methods calculate dose and energy fluence up to the volume surfaces, where the energy fluence distribution is sampled into discrete histories and transported using MC. Histories exiting the volume are converted back into energy fluence and transported deterministically. By matching boundary conditions at both interfaces, the deterministic dose calculation accounts for dose perturbations “downstream” of localized heterogeneities. Hybrid dose calculation was performed for water and anthropomorphic phantoms. Results: We achieved <1% agreement between deterministic and MC calculations in the water benchmark for photon and proton beams, and dose differences of 2%–15% could be observed in heterogeneous phantoms. The saving in computational time (a factor of ∼4–7 compared to a full Monte Carlo dose calculation) was found to be approximately proportional to the volume of the heterogeneous region. 
Conclusion: Our hybrid dose calculation approach takes advantage of the computational efficiency of deterministic methods and the accuracy of MC, providing a practical tool for high performance dose calculation in modern RT. The approach is generalizable to all modalities where heterogeneities play a large role, notably particle therapy.
Spin wave Feynman diagram vertex computation package
NASA Astrophysics Data System (ADS)
Price, Alexander; Javernick, Philip; Datta, Trinanjan
Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involves a laborious by-hand derivation of Feynman diagram vertices. The process is tedious and time consuming. Hence, to improve productivity and have another means to check the analytical calculations, we have devised a Feynman Diagram Vertex Computation package. In this talk, we will describe our research group's effort to implement a Mathematica based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions of a nearest neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model, where non-collinear terms contribute to the vertex interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Velişa, G.; Wendler, E.; Zhao, S.
2017-12-17
A combined experimental and computational evaluation of damage accumulation in ion-irradiated Ni, NiFe, and NiFeCoCr is presented. Furthermore, a suppressed damage accumulation, at early stages (low-fluence irradiation), is revealed in NiFeCoCr, with a linear dependence as a function of ion fluence, in sharp contrast with Ni and NiFe. This effect, observed at 16 K, is attributed to the complex energy landscape in these alloys that limits defect mobility and therefore enhances defect interaction and recombination. Our results, together with previous room-temperature and high-temperature investigations, suggest "self-healing" as an intrinsic property of complex alloys that is not a thermally activated process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
C.A. Baldwin; F.B.K. Kam; I. Remec
1998-10-01
This report describes the computational methodology for the least-squares adjustment of the dosimetry data from the HSSI 10.OD dosimetry capsule with neutronics calculations. It presents exposure rates at each dosimetry location for the neutron fluence greater than 1.0 MeV, fluence greater than 0.1 MeV, and displacements per atom. Exposure parameter distributions are also described in terms of three-dimensional fitting functions. When fitting functions are used, it is suggested that an uncertainty of 6% (1σ) should be associated with the exposure rate values. The specific activity of each dosimeter at the end of irradiation is listed in the Appendix.
NASA Technical Reports Server (NTRS)
Olmedo, L.
1980-01-01
The changes, modifications, and inclusions which were adapted to the current version of the MINIVER program are discussed. Extensive modifications were made to various subroutines, and a new plot package was added. This plot package is the Johnson Space Center DISSPLA Graphics System, currently driven under an 1110 EXEC 8 configuration. User instructions on executing the MINIVER program are provided, and the plot package is described.
ERIC Educational Resources Information Center
Gambari, Isiaka Amosa; Ezenwa, Victoria Ifeoma; Anyanwu, Romanus Chogozie
2014-01-01
The study examined the effects of two modes of computer-assisted instructional package on solid geometry achievement amongst senior secondary school students in Minna, Niger State, Nigeria. Also, the influence of gender on the performance of students exposed to the CAI(AT) and CAI(AN) packages was examined. This study adopted a pretest-posttest…
Computers and Writing. Learning Package No. 33.
ERIC Educational Resources Information Center
Simic, Marge, Comp.; Smith, Carl, Ed.
Originally developed as part of a project for the Department of Defense Schools (DoDDS) system, this learning package on computers and writing is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a lecture giving an…
3DHZETRN: Inhomogeneous Geometry Issues
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.
2017-01-01
Historical methods for assessing radiation exposure inside complicated geometries for space applications were limited by computational constraints and lack of knowledge associated with nuclear processes occurring over a broad range of particles and energies. Various methods were developed and utilized to simplify geometric representations and enable coupling with simplified but efficient particle transport codes. Recent transport code development efforts, leading to 3DHZETRN, now enable such approximate methods to be carefully assessed to determine if past exposure analyses and validation efforts based on those approximate methods need to be revisited. In this work, historical methods of representing inhomogeneous spacecraft geometry for radiation protection analysis are first reviewed. Two inhomogeneous geometry cases, previously studied with 3DHZETRN and Monte Carlo codes, are considered with various levels of geometric approximation. Fluence, dose, and dose equivalent values are computed in all cases and compared. It is found that although these historical geometry approximations can induce large errors in neutron fluences up to 100 MeV, errors on dose and dose equivalent are modest (<10%) for the cases studied here.
An Interactive Computer Aided Design and Analysis Package.
1986-03-01
Naval Postgraduate School, Monterey, California. Unclassified thesis by Terrence L. Ewald, March 1986.
Williams, C; Aubin, S; Harkin, P; Cottrell, D
2001-09-01
Computer-based teaching may allow effective teaching of important psychiatric knowledge and skills. To investigate the effectiveness and acceptability of computer-based teaching. A single-blind, randomized, controlled study of 166 undergraduate medical students at the University of Leeds, involving an educational intervention of either a structured lecture or a computer-based teaching package (both of equal duration). There was no difference in knowledge between the groups at baseline or immediately after teaching. Both groups made significant gains in knowledge after teaching. Students who attended the lecture rated their subjective knowledge and skills at a statistically significantly higher level than students who had used the computers. Students who had used the computer package scored higher on an objective measure of assessment skills. Students did not perceive the computer package to be as useful as the traditional lecture format, despite finding it easy to use and recommending its use to other students. Medical students rate themselves subjectively as learning less from computer-based as compared with lecture-based teaching. Objective measures suggest equivalence in knowledge acquisition and significantly greater skills acquisition for computer-based teaching.
ERIC Educational Resources Information Center
Pollard, Jim
This report reviews software packages for Apple Macintosh and Apple II computers available to secondary schools to teach computer-aided drafting (CAD). Products for the report were gathered through reviews of CAD periodicals, computers in education periodicals, advertisements, and teacher recommendations. The first section lists the primary…
Thaysen-Petersen, D; Barbet-Pfeilsticker, M; Beerwerth, F; Nash, J F; Philipsen, P A; Staubach, P; Haedersdal, M
2015-01-01
At-home laser and intense pulsed-light hair removal continues to grow in popularity and availability. A relatively limited body of evidence is available on the course of hair growth during and after low-fluence laser usage. To assess growing hair counts, thickness and colour quantitatively during and after cessation of low-fluence laser treatment. Thirty-six women with skin phototypes I-IV and light to dark-brown axillary hairs were included. Entire axillary regions were randomized to zero or eight self-administered weekly treatments with an 810-nm home-use laser at 5·0–6·4 J cm⁻². Standardized clinical photographs were taken before each treatment and up to 3 months after the final treatment for computer-aided quantification of growing hair counts, thickness and colour. Thirty-two women completed the study protocol. During sustained treatment, there was a reduction in growing hair that reached a plateau of up to 59%, while remaining hairs became up to 38% thinner and 5% lighter (P < 0·001). The majority of subjects (77%) reported 'moderately' to 'much less hair' in the treated than the untreated axilla, and assessed remaining hairs as thinner and lighter (≥ 60%). After treatment cessation, hair growth gradually returned to baseline levels, and 3 months after the final treatment the count and thickness of actively growing hair exceeded pretreatment values by 29% and 7%, respectively (P ≤ 0·04). Sustained usage of low-fluence laser induced a stable reduction of growing hair counts, thickness and colour. The reduction was reversible and hairs regrew beyond baseline values after cessation of usage. Computer-aided image analysis was qualified for quantification of hair counts, thickness and colour after laser epilation. © 2014 British Association of Dermatologists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Folkerts, MM; University of California San Diego, La Jolla, California; Long, T
Purpose: To provide a tool to generate large sets of realistic virtual patient geometries and beamlet doses for treatment optimization research. This tool enables countless studies exploring the fundamental interplay between patient geometry, objective functions, weight selections, and achievable dose distributions for various algorithms and modalities. Methods: Generating realistic virtual patient geometries requires a small set of real patient data. We developed a normalized patient shape model (PSM) which captures organ and target contours in a correspondence-preserving manner. Using PSM-processed data, we perform principal component analysis (PCA) to extract major modes of variation from the population. These PCA modes can be shared without exposing patient information. The modes are re-combined with different weights to produce sets of realistic virtual patient contours. Because virtual patients lack imaging information, we developed a shape-based dose calculation (SBD) relying on the assumption that the region inside the body contour is water. SBD utilizes a 2D fluence-convolved scatter kernel, derived from Monte Carlo simulations, and can either compute the full dose for a given set of fluence maps or produce a dose matrix (dose per fluence pixel) for many modalities. Combining the shape model with SBD provides the data needed for treatment plan optimization research. Results: We used PSM to capture organ and target contours for 96 prostate cases, extracted the first 20 PCA modes, and generated 2048 virtual patient shapes by randomly sampling mode scores. Nearly half of the shapes were thrown out for failing anatomical checks; the remaining 1124 were used in computing dose matrices via SBD and a standard 7-beam protocol. As a proof of concept, and to generate data for later study, we performed fluence map optimization emphasizing PTV coverage. 
Conclusions: We successfully developed and tested a tool for creating customizable sets of virtual patients suitable for large-scale radiation therapy optimization research.
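The PCA shape-model step described above can be sketched as follows, assuming each patient's contours have already been resampled to the same number of corresponding points (the role of the PSM). Function names are illustrative, not from the authors' tool:

```python
import numpy as np

def build_shape_model(shapes, n_modes):
    """shapes: (n_patients, n_coords) matrix of flattened contour coords."""
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # PCA via SVD; rows of vt are the population's modes of variation
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    modes = vt[:n_modes]
    stds = s[:n_modes] / np.sqrt(len(shapes) - 1)  # per-mode std deviation
    return mean, modes, stds

def sample_virtual_shape(mean, modes, stds, rng):
    # Re-combine the modes with random weights to get a new virtual shape;
    # anatomically implausible samples would be rejected downstream.
    scores = rng.standard_normal(len(stds)) * stds
    return mean + scores @ modes
```

Each sampled shape is a plausible new member of the population without corresponding to any real patient, which is why the modes can be shared freely.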
Advance Directives and Do Not Resuscitate Orders
... a form. Call a lawyer. Use a computer software package for legal documents. Advance directives and living ... you write by yourself or with a computer software package should follow your state laws. You may ...
EQS Goes R: Simulations for SEM Using the Package REQS
ERIC Educational Resources Information Center
Mair, Patrick; Wu, Eric; Bentler, Peter M.
2010-01-01
The REQS package is an interface between the R environment of statistical computing and the EQS software for structural equation modeling. The package consists of 3 main functions that read EQS script files and import the results into R, call EQS script files from R, and run EQS script files from R and import the results after EQS computations.…
The Computer as an Aid to Reading Instruction. Learning Package No. 27.
ERIC Educational Resources Information Center
Simic, Marge, Comp.; Smith, Carl, Ed.
Originally developed for the Department of Defense Schools (DoDDS) system, this learning package on computer use in reading is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a lecture giving an overview on the…
Secondary bremsstrahlung and the energy-conservation aspects of kerma in photon-irradiated media.
Kumar, Sudhir; Nahum, Alan E
2016-02-07
Kerma, collision kerma and absorbed dose in media irradiated by megavoltage photons are analysed with respect to energy conservation. The user-code DOSRZnrc was employed to compute absorbed dose D, kerma K and a special form of kerma, K_ncpt, obtained by setting the charged-particle transport energy cut-off very high, thereby preventing the generation of 'secondary bremsstrahlung' along the charged-particle paths. The user-code FLURZnrc was employed to compute photon fluence, differential in energy, from which collision kerma, K_col, and K were derived. The ratios K/D, K_ncpt/D and K_col/D have thereby been determined over very large volumes of water, aluminium and copper irradiated by broad, parallel beams of 0.1 to 25 MeV monoenergetic photons, and 6, 10 and 15 MV 'clinical' radiotherapy qualities. Concerning depth-dependence, the 'area under the kerma, K, curve' exceeded that under the dose curve, demonstrating that kerma does not conserve energy when computed over a large volume. This is due to the 'double counting' of the energy of the secondary bremsstrahlung photons, this energy being (implicitly) included in the kerma 'liberated' in the irradiated medium, at the same time as this secondary bremsstrahlung is included in the photon fluence which gives rise to kerma elsewhere in the medium. For 25 MeV photons this 'violation' amounts to 8.6%, 14.2% and 25.5% in large volumes of water, aluminium and copper respectively, but only 0.6% for a 'clinical' 6 MV beam in water. By contrast, K_col/D and K_ncpt/D, also computed over very large phantoms of the same three media, for the same beam qualities, are equal to unity within (very low) statistical uncertainties, demonstrating that collision kerma and the special type of kerma, K_ncpt, do conserve energy over a large volume. 
A comparison of photon fluence spectra for the 25 MeV beam at a depth of ≈51 g cm⁻² for both very high and very low charged-particle transport cut-offs reveals the considerable contribution to the total photon fluence by secondary bremsstrahlung in the latter case. Finally, a correction to the 'kerma integral' has been formulated to account for the energy transferred to charged particles by photons with initial energies below the Monte Carlo photon transport cut-off PCUT; for 25 MeV photons this 'photon track end' correction is negligible for all PCUT below 10 keV.
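The energy-balance argument can be compactly restated (notation assumed, not taken verbatim from the paper): kerma splits into collision and radiative parts; over a volume large enough to absorb essentially all radiation, collision kerma integrates to the same total as dose; and the excess of the kerma integral over the dose integral is approximately the secondary-bremsstrahlung energy, which is counted twice:

```latex
K = K_{\mathrm{col}} + K_{\mathrm{rad}}, \qquad
\int_V K_{\mathrm{col}}\,\mathrm{d}m \;\approx\; \int_V D\,\mathrm{d}m, \qquad
\int_V K\,\mathrm{d}m - \int_V D\,\mathrm{d}m \;\approx\; E_{\mathrm{brems}}
```

This is why the ratios K_col/D and K_ncpt/D integrate to unity while K/D does not.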
NASA Astrophysics Data System (ADS)
Yao, Weiguang; Merchant, Thomas E.; Farr, Jonathan B.
2016-10-01
The lateral homogeneity assumption is used in most analytical algorithms for proton dose, such as the pencil-beam algorithms and our simplified analytical random walk model. To improve the dose calculation in the distal fall-off region in heterogeneous media, we analyzed primary proton fluence near heterogeneous media and propose to calculate the lateral fluence with voxel-specific Gaussian distributions. The lateral fluence from a beamlet is no longer expressed by a single Gaussian for all the lateral voxels, but by a specific Gaussian for each lateral voxel. The voxel-specific Gaussian for the beamlet of interest is calculated by re-initializing the fluence deviation on an effective surface where the proton energies of the beamlet of interest and the beamlet passing the voxel are the same. The dose improvement from the correction scheme was demonstrated by the dose distributions in two sets of heterogeneous phantoms consisting of cortical bone, lung, and water, and by evaluating distributions in example patients with a head-and-neck tumor and metal spinal implants. The dose distributions from Monte Carlo simulations were used as the reference. The correction scheme effectively improved the dose calculation accuracy in the distal fall-off region and increased the gamma test pass rate. The extra computation for the correction was about 20% of that for the original algorithm, but depends upon patient geometry.
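The voxel-specific Gaussian idea can be sketched as follows. This is illustrative only: in the actual algorithm each voxel's sigma is re-initialized on the effective surface where the beamlet energies match, whereas here the per-voxel sigma is simply supplied by the caller:

```python
import math

def lateral_fluence(x, sigma):
    """Normalized 1D Gaussian lateral fluence at offset x from the axis."""
    return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def beamlet_profile(offsets, sigma_for_voxel):
    # One Gaussian evaluation per lateral voxel, each with that voxel's
    # own sigma (a single shared sigma recovers the conventional model).
    return [lateral_fluence(x, sigma_for_voxel(x)) for x in offsets]
```

In a homogeneous medium, sigma_for_voxel returns the same value everywhere and the profile reduces to the usual single-Gaussian pencil beam; near a heterogeneity it varies voxel by voxel.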
A New Streamflow-Routing (SFR1) Package to Simulate Stream-Aquifer Interaction with MODFLOW-2000
Prudic, David E.; Konikow, Leonard F.; Banta, Edward R.
2004-01-01
The increasing concern for water and its quality requires improved methods to evaluate the interaction between streams and aquifers and the strong influence that streams can have on the flow and transport of contaminants through many aquifers. For this reason, a new Streamflow-Routing (SFR1) Package was written for use with the U.S. Geological Survey's MODFLOW-2000 ground-water flow model. The SFR1 Package is linked to the Lake (LAK3) Package, and both have been integrated with the Ground-Water Transport (GWT) Process of MODFLOW-2000 (MODFLOW-GWT). SFR1 replaces the previous Stream (STR1) Package, with the most important difference being that stream depth is computed at the midpoint of each reach instead of at the beginning of each reach, as was done in the original Stream Package. This approach allows for the addition and subtraction of water from runoff, precipitation, and evapotranspiration within each reach. Because the SFR1 Package computes stream depth differently than that for the original package, a different name was used to distinguish it from the original Stream (STR1) Package. The SFR1 Package has five options for simulating stream depth and four options for computing diversions from a stream. The options for computing stream depth are: a specified value; Manning's equation (using a wide rectangular channel or an eight-point cross section); a power equation; or a table of values that relate flow to depth and width. Each stream segment can have a different option. Outflow from lakes can be computed using the same options. Because the wetted perimeter is computed for the eight-point cross section and width is computed for the power equation and table of values, the streambed conductance term no longer needs to be calculated externally whenever the area of streambed changes as a function of flow. The concentration of solute is computed in a stream network when MODFLOW-GWT is used in conjunction with the SFR1 Package. 
The concentration of a solute in a stream reach is based on a mass-balance approach and accounts for exchanges with (inputs from or losses to) ground-water systems. Two test examples are used to illustrate some of the capabilities of the SFR1 Package. The first test simulation was designed to illustrate how pumping of ground water from an aquifer connected to streams can affect streamflow, depth, width, and streambed conductance using the different options. The second test simulation was designed to illustrate solute transport through interconnected lakes, streams, and aquifers. Because of the need to examine time series results from the model simulations, the Gage Package first described in the LAK3 documentation was revised to include time series results of selected variables (streamflows, stream depth and width, streambed conductance, solute concentrations, and solute loads) for specified stream reaches. The mass-balance or continuity approach for routing flow and solutes through a stream network may not be applicable for all interactions between streams and aquifers. The SFR1 Package is best suited for modeling long-term changes (months to hundreds of years) in ground-water flow and solute concentrations using averaged flows in streams. The Package is not recommended for modeling the transient exchange of water between streams and aquifers when the objective is to examine short-term (minutes to days) effects caused by rapidly changing streamflows.
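For the wide-rectangular-channel case, Manning's equation inverts to a closed form for depth. The sketch below is illustrative only, not the SFR1 code itself; the function names and SI units (metric Manning coefficient of 1.0) are assumptions:

```python
import math

def manning_depth(Q, n, w, S):
    """Depth d (m) of a wide rectangular channel from Manning's equation.

    For a wide channel the hydraulic radius R is approximately d, so
    Q = (1/n) * w * d**(5/3) * sqrt(S), which inverts to a closed form.
    Q: flow (m^3/s), n: Manning roughness, w: width (m), S: slope.
    """
    return (Q * n / (w * math.sqrt(S))) ** 0.6

def manning_flow(d, n, w, S):
    """Inverse relation: flow for a given depth, used here as a check."""
    return (1.0 / n) * w * d ** (5.0 / 3.0) * math.sqrt(S)

# Hypothetical reach: 10 m^3/s in a 20 m wide channel, n = 0.03, S = 0.001
d = manning_depth(10.0, 0.03, 20.0, 0.001)
```

Evaluating depth from the averaged flow at the reach midpoint, rather than at the upstream end, is what lets SFR1 account for runoff, precipitation, and evapotranspiration gains and losses within the reach.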
Optimal segmentation and packaging process
Kostelnik, Kevin M.; Meservey, Richard H.; Landon, Mark D.
1999-01-01
A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D&D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items are determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are then actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded.
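The patent describes coupling 3-D facility models with optimization algorithms under container constraints. As a toy stand-in only (the actual optimizer, its constraints, and all names below are not from the patent), a first-fit-decreasing heuristic packing segmented items into volume-limited containers conveys the flavor of the problem:

```python
def first_fit_decreasing(volumes, capacity):
    """Greedy packing heuristic: place each item (largest first) into the
    first container with room, opening a new container when none fits."""
    containers = []  # each container is a list of item volumes
    for v in sorted(volumes, reverse=True):
        if v > capacity:
            raise ValueError("item exceeds container capacity")
        for c in containers:
            if sum(c) + v <= capacity:
                c.append(v)
                break
        else:
            containers.append([v])
    return containers

# Five segmented items (arbitrary volumes) into 5.0-unit containers:
bins = first_fit_decreasing([4.0, 3.0, 2.0, 2.0, 1.0], capacity=5.0)
```

The patented process optimizes much more than container count (cut locations, orientation, worker exposure), but any such optimizer needs a feasibility check of this kind at its core.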
Hijnen, W A M; Beerendonk, E F; Medema, G J
2006-01-01
UV disinfection technology is of growing interest in the water industry since it was demonstrated that UV radiation is very effective against (oo)cysts of Cryptosporidium and Giardia, two pathogenic micro-organisms of major importance for the safety of drinking water. Quantitative Microbial Risk Assessment, the new concept for microbial safety of drinking water and wastewater, requires quantitative data of the inactivation or removal of pathogenic micro-organisms by water treatment processes. The objective of this study was to review the literature on UV disinfection and extract quantitative information about the relation between the inactivation of micro-organisms and the applied UV fluence. The quality of the available studies was evaluated and only high-quality studies were incorporated in the analysis of the inactivation kinetics. The results show that UV is effective against all waterborne pathogens. The inactivation of micro-organisms by UV could be described with first-order kinetics using fluence-inactivation data from laboratory studies in collimated beam tests. No inactivation at low fluences (offset) and/or no further increase of inactivation at higher fluences (tailing) was observed for some micro-organisms. Where observed, these were included in the description of the inactivation kinetics, even though the cause of tailing is still a matter of debate. The parameters that were used to describe inactivation are the inactivation rate constant k (cm²/mJ), the maximum inactivation demonstrated and (only for bacterial spores and Acanthamoeba) the offset value. These parameters were the basis for the calculation of the microbial inactivation credit (MIC="log-credits") that can be assigned to a certain UV fluence. The most UV-resistant organisms are viruses, specifically Adenoviruses, and bacterial spores. The protozoon Acanthamoeba is also highly UV resistant. 
Bacteria and (oo)cysts of Cryptosporidium and Giardia are more susceptible, with a fluence requirement of <20 mJ/cm² for an MIC of 3 log. Several studies have reported an increased UV resistance of environmental bacteria and bacterial spores compared to lab-grown strains, meaning that higher UV fluences are required to obtain the same level of inactivation. Hence, for bacteria and spores, correction factors of 2 and 4, respectively, were included in the MIC calculation, whereas some wastewater studies suggest that a correction factor of 7 is needed under those conditions. For phages and viruses this phenomenon appears to be of little significance, and for protozoan (oo)cysts this aspect needs further investigation. Correction of the required fluence for DNA repair is considered unnecessary under the conditions of drinking water practice (no photo-repair, and dark repair insignificant, especially at higher (60 mJ/cm²) fluences) and probably also wastewater practice (photo-repair limited by light absorption). To enable accurate assessment of the effective fluence in continuous-flow UV systems in water treatment practice, biodosimetry is still essential, although the use of computational fluid dynamics (CFD) improves the description of reactor hydraulics and fluence distribution. For UV systems that are primarily dedicated to inactivating the more sensitive pathogens (Cryptosporidium, Giardia, pathogenic bacteria), additional model organisms are needed to serve as biodosimeters.
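The fluence-inactivation relation described in the review can be sketched as first-order kinetics with an offset, a tailing cap, and an environmental-strain correction factor. The parameter values below are placeholders, not the review's fitted constants:

```python
def mic_log_credits(fluence, k, offset=0.0, max_log=float("inf"),
                    correction=1.0):
    """Microbial inactivation credit (log10 units) for a given UV fluence.

    First-order kinetics with an optional low-fluence offset, a tailing
    cap (maximum demonstrated inactivation), and a correction factor that
    raises the fluence requirement for environmental strains.
    fluence in mJ/cm^2, k in cm^2/mJ.
    """
    effective = max(0.0, fluence / correction - offset)
    return min(k * effective, max_log)

# Hypothetical pathogen with k = 0.2 cm^2/mJ: 3 log at 15 mJ/cm^2 ...
credit = mic_log_credits(15.0, k=0.2)
# ... but an environmental-strain correction factor of 2 halves the credit.
corrected = mic_log_credits(15.0, k=0.2, correction=2.0)
```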
NASA Astrophysics Data System (ADS)
Pan, A. F.; Wang, W. J.; Mei, X. S.; Zheng, B. X.; Yan, Z. X.
2016-11-01
This study reports the formation of sub-5-μm microstructures on titanium through crack growth under 10-ns laser irradiation at a wavelength of 532 nm, and the light modification this induces for the production of nanostructures. The electric field intensity and the laser power density absorbed by commercially pure titanium were computed to investigate the self-trapping introduced by cracks and the effect of surface morphology on laser propagation characteristics. It was found that nanostructures can form on surfaces with a curvature radius below 20 μm. Variable laser fluences were then applied to explore the evolution of cracks on commercially pure titanium, with or without melting, as the spot overlap number increased. Experiments were first performed at a peak laser fluence of 1.063 J/cm2 to investigate microstructures induced by crack growth alone. Angular microstructures between 1.68 μm and 4.74 μm in size were obtained, with no nanostructure coverage. At a peak laser fluence of 2.126 J/cm2, some nanostructures did form on the melt-induced curved microstructured surface; however, molten surface material filled most of the cracks at a spot overlap number of 744, and the old cracks disappeared. This indicates that a peak fluence of 2.126 J/cm2 produced too much molten material over too long a melt duration to yield well-defined micro/nano structures. On this basis, the peak laser fluence was reduced to 1.595 J/cm2, and sharp sub-5-μm microstructures covered with nanostructures were obtained at a spot overlap number of 3720.
Reactive oxygen species explicit dosimetry (ROSED) of a type 1 photosensitizer
NASA Astrophysics Data System (ADS)
Ong, Yi Hong; Kim, Michele M.; Huang, Zheng; Zhu, Timothy C.
2018-02-01
Type I photodynamic therapy (PDT) is based on photochemical reactions mediated through an interaction between a tumor-selective photosensitizer, photoexcitation with a specific wavelength of light, and the production of reactive oxygen species (ROS). The goal of this study is to develop a model to calculate reactive oxygen species concentration ([ROS]rx) after Tookad®-mediated vascular PDT. Mice with radiation-induced fibrosarcoma (RIF) tumors were treated with different light fluence and fluence rate conditions. Explicit measurements of photosensitizer drug concentration were made via the diffuse reflective absorption spectrum using a contact probe before and after PDT. Blood flow and tissue oxygen concentration over time were measured during PDT as a means of validating the photochemical parameters for the ROSED calculation. A cure index was computed from the rate of tumor regrowth after treatment and was compared against three calculated dose metrics: total light fluence, PDT dose, and reacted [ROS]rx. The tumor growth study demonstrates that [ROS]rx serves as a better dosimetric quantity for predicting treatment outcome, as a clinically relevant tumor growth endpoint.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, X. R.; Poenisch, F.; Lii, M.
2013-04-15
Purpose: To present our method and experience in commissioning dose models in water for spot scanning proton therapy in a commercial treatment planning system (TPS). Methods: The input data required by the TPS included in-air transverse profiles and integral depth doses (IDDs). All input data were obtained from Monte Carlo (MC) simulations that had been validated by measurements. MC-generated IDDs were converted to units of Gy mm²/MU using the measured IDDs at a depth of 2 cm employing the largest commercially available parallel-plate ionization chamber. The sensitive area of the chamber was insufficient to fully encompass the entire lateral dose deposited at depth by a pencil beam (spot). To correct for the detector size, correction factors as a function of proton energy were defined and determined using MC. The fluence of individual spots was initially modeled as a single Gaussian (SG) function and later as a double Gaussian (DG) function. The DG fluence model was introduced to account for the spot fluence due to contributions of large angle scattering from the devices within the scanning nozzle, especially from the spot profile monitor. To validate the DG fluence model, we compared calculations and measurements, including doses at the center of spread out Bragg peaks (SOBPs) as a function of nominal field size, range, and SOBP width, lateral dose profiles, and depth doses for different widths of SOBP. Dose models were validated extensively with patient treatment field-specific measurements. Results: We demonstrated that the DG fluence model is necessary for predicting the field size dependence of dose distributions. With this model, the calculated doses at the center of SOBPs as a function of nominal field size, range, and SOBP width, lateral dose profiles and depth doses for rectangular target volumes agreed well with respective measured values. 
With the DG fluence model for our scanning proton beam line, we successfully treated more than 500 patients from March 2010 through June 2012 with acceptable agreement between TPS calculated and measured dose distributions. However, the current dose model still has limitations in predicting field size dependence of doses at some intermediate depths of proton beams with high energies. Conclusions: We have commissioned a DG fluence model for clinical use. It is demonstrated that the DG fluence model is significantly more accurate than the SG fluence model. However, some deficiencies in modeling the low-dose envelope in the current dose algorithm still exist. Further improvements to the current dose algorithm are needed. The method presented here should be useful for commissioning pencil beam dose algorithms in new versions of TPS in the future.
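The single- versus double-Gaussian spot model can be sketched numerically. The sigma values and halo weight below are illustrative assumptions, not the commissioned beam data:

```python
import math

def gauss2d(r, sigma):
    """Radially symmetric 2-D Gaussian, normalized to unit integral."""
    return math.exp(-r * r / (2.0 * sigma * sigma)) / (2.0 * math.pi * sigma * sigma)

def spot_fluence_sg(r, sigma1=5.0):
    """Single-Gaussian (SG) spot fluence at radius r (mm)."""
    return gauss2d(r, sigma1)

def spot_fluence_dg(r, sigma1=5.0, sigma2=15.0, w=0.1):
    """Double-Gaussian (DG): a narrow core plus a broad, low-weight halo
    representing large-angle scattering from devices in the nozzle."""
    return (1.0 - w) * gauss2d(r, sigma1) + w * gauss2d(r, sigma2)
```

Both models carry the same total fluence, but the DG halo moves a fraction w of it far off-axis; summed over the many spots of a scanned field, that halo is what produces the field-size dependence of the central dose that the SG model misses.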
Zhu, X. R.; Poenisch, F.; Lii, M.; Sawakuchi, G. O.; Titt, U.; Bues, M.; Song, X.; Zhang, X.; Li, Y.; Ciangaru, G.; Li, H.; Taylor, M. B.; Suzuki, K.; Mohan, R.; Gillin, M. T.; Sahoo, N.
2013-01-01
Purpose: To present our method and experience in commissioning dose models in water for spot scanning proton therapy in a commercial treatment planning system (TPS). Methods: The input data required by the TPS included in-air transverse profiles and integral depth doses (IDDs). All input data were obtained from Monte Carlo (MC) simulations that had been validated by measurements. MC-generated IDDs were converted to units of Gy mm2/MU using the measured IDDs at a depth of 2 cm employing the largest commercially available parallel-plate ionization chamber. The sensitive area of the chamber was insufficient to fully encompass the entire lateral dose deposited at depth by a pencil beam (spot). To correct for the detector size, correction factors as a function of proton energy were defined and determined using MC. The fluence of individual spots was initially modeled as a single Gaussian (SG) function and later as a double Gaussian (DG) function. The DG fluence model was introduced to account for the spot fluence due to contributions of large angle scattering from the devices within the scanning nozzle, especially from the spot profile monitor. To validate the DG fluence model, we compared calculations and measurements, including doses at the center of spread out Bragg peaks (SOBPs) as a function of nominal field size, range, and SOBP width, lateral dose profiles, and depth doses for different widths of SOBP. Dose models were validated extensively with patient treatment field-specific measurements. Results: We demonstrated that the DG fluence model is necessary for predicting the field size dependence of dose distributions. With this model, the calculated doses at the center of SOBPs as a function of nominal field size, range, and SOBP width, lateral dose profiles and depth doses for rectangular target volumes agreed well with respective measured values. 
With the DG fluence model for our scanning proton beam line, we successfully treated more than 500 patients from March 2010 through June 2012 with acceptable agreement between TPS calculated and measured dose distributions. However, the current dose model still has limitations in predicting field size dependence of doses at some intermediate depths of proton beams with high energies. Conclusions: We have commissioned a DG fluence model for clinical use. It is demonstrated that the DG fluence model is significantly more accurate than the SG fluence model. However, some deficiencies in modeling the low-dose envelope in the current dose algorithm still exist. Further improvements to the current dose algorithm are needed. The method presented here should be useful for commissioning pencil beam dose algorithms in new versions of TPS in the future. PMID:23556893
Zhu, X R; Poenisch, F; Lii, M; Sawakuchi, G O; Titt, U; Bues, M; Song, X; Zhang, X; Li, Y; Ciangaru, G; Li, H; Taylor, M B; Suzuki, K; Mohan, R; Gillin, M T; Sahoo, N
2013-04-01
To present our method and experience in commissioning dose models in water for spot scanning proton therapy in a commercial treatment planning system (TPS). The input data required by the TPS included in-air transverse profiles and integral depth doses (IDDs). All input data were obtained from Monte Carlo (MC) simulations that had been validated by measurements. MC-generated IDDs were converted to units of Gy mm(2)/MU using the measured IDDs at a depth of 2 cm employing the largest commercially available parallel-plate ionization chamber. The sensitive area of the chamber was insufficient to fully encompass the entire lateral dose deposited at depth by a pencil beam (spot). To correct for the detector size, correction factors as a function of proton energy were defined and determined using MC. The fluence of individual spots was initially modeled as a single Gaussian (SG) function and later as a double Gaussian (DG) function. The DG fluence model was introduced to account for the spot fluence due to contributions of large angle scattering from the devices within the scanning nozzle, especially from the spot profile monitor. To validate the DG fluence model, we compared calculations and measurements, including doses at the center of spread out Bragg peaks (SOBPs) as a function of nominal field size, range, and SOBP width, lateral dose profiles, and depth doses for different widths of SOBP. Dose models were validated extensively with patient treatment field-specific measurements. We demonstrated that the DG fluence model is necessary for predicting the field size dependence of dose distributions. With this model, the calculated doses at the center of SOBPs as a function of nominal field size, range, and SOBP width, lateral dose profiles and depth doses for rectangular target volumes agreed well with respective measured values. 
With the DG fluence model for our scanning proton beam line, we successfully treated more than 500 patients from March 2010 through June 2012 with acceptable agreement between TPS calculated and measured dose distributions. However, the current dose model still has limitations in predicting field size dependence of doses at some intermediate depths of proton beams with high energies. We have commissioned a DG fluence model for clinical use. It is demonstrated that the DG fluence model is significantly more accurate than the SG fluence model. However, some deficiencies in modeling the low-dose envelope in the current dose algorithm still exist. Further improvements to the current dose algorithm are needed. The method presented here should be useful for commissioning pencil beam dose algorithms in new versions of TPS in the future.
NASA Technical Reports Server (NTRS)
Hesser, R. J.; Gershman, R.
1975-01-01
A valve opening-response problem encountered during development of a control valve for the Skylab thruster attitude control system (TACS) is described. The problem involved effects of dynamic interaction among valves in the quad-redundant valve package. Also described is a detailed computer simulation of the quad-valve package which was helpful in resolving the problem.
Response Functions for Computing Absorbed Dose to Skeletal Tissues from Photon Irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eckerman, Keith F; Bolch, W E; Zankl, M
2007-01-01
The calculation of absorbed dose in skeletal tissues at radiogenic risk has been a difficult problem because the relevant structures cannot be represented in conventional geometric terms nor can they be visualised in the tomographic image data used to define the computational models of the human body. The active marrow, the tissue of concern in leukaemia induction, is present within the spongiosa regions of trabecular bone, whereas the osteoprogenitor cells at risk for bone cancer induction are considered to be within the soft tissues adjacent to the mineral surfaces. The International Commission on Radiological Protection (ICRP) recommends averaging the absorbed energy over the active marrow within the spongiosa and over the soft tissues within 10 μm of the mineral surfaces for leukaemia and bone cancer induction, respectively. In its forthcoming recommendation, it is expected that the latter guidance will be changed to include soft tissues within 50 μm of the mineral surfaces. To address the computational problems, the skeleton of the proposed ICRP reference computational phantom has been subdivided to identify those voxels associated with cortical shell, spongiosa and the medullary cavity of the long bones. It is further proposed that the Monte Carlo calculations with these phantoms compute the energy deposition in the skeletal target tissues as the product of the particle fluence in the skeletal subdivisions and applicable fluence-to-dose response functions. This paper outlines the development of such response functions for photons.
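The proposed product of particle fluence and a fluence-to-dose response function is, in discrete form, a dot product over energy bins. A schematic sketch with made-up numbers (the bin structure and response values are illustrative, not the paper's tabulated functions):

```python
def absorbed_dose(fluence, response):
    """Dose as the sum over energy bins of fluence (particles/cm^2)
    times the fluence-to-dose response function (Gy cm^2) for that bin."""
    if len(fluence) != len(response):
        raise ValueError("energy binning must match")
    return sum(phi * r for phi, r in zip(fluence, response))

# Three hypothetical photon energy bins scored in a spongiosa region:
phi = [1.0e6, 5.0e5, 2.0e5]          # particles/cm^2 per bin
resp = [2.0e-12, 5.0e-12, 9.0e-12]   # Gy cm^2 per bin (illustrative)
dose = absorbed_dose(phi, resp)
```

Scoring fluence in coarse skeletal regions and folding in precomputed response functions avoids transporting particles through microscopic trabecular structure in every Monte Carlo run.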
Computer Managed Instruction: An Application in Teaching Introductory Statistics.
ERIC Educational Resources Information Center
Hudson, Walter W.
1985-01-01
This paper describes a computer managed instruction package for teaching introductory or advanced statistics. The instructional package is described and anecdotal information concerning its performance and student responses to its use over two semesters are given. (Author/BL)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sofrata, H.; Khoshaim, B.; Megahed, M.
1980-12-01
In this paper a computer package for the design and optimization of the simple Li-Br absorption air-conditioning system, operated by solar energy, is developed in order to study its performance. This was a necessary first step before carrying out any computations regarding the dual system (1-3). The computer package provides facilities for examining any parameter that may control the system, namely the generator, evaporator, condenser, and absorber temperatures and the pumping factor. The output may be tabulated and also fed to the graph plotter. The flow chart of the programme is explained in a straightforward manner and a typical example is included.
NASA Technical Reports Server (NTRS)
1993-01-01
Developed under a Small Business Innovation Research (SBIR) contract, RAMPANT is a CFD software package for computing flow around complex shapes. The package is flexible, fast and easy to use. It has found a great number of applications, including computation of air flow around a Nordic ski jumper, prediction of flow over an airfoil and computation of the external aerodynamics of motor vehicles.
A Freeware Path to Neutron Computed Tomography
NASA Astrophysics Data System (ADS)
Schillinger, Burkhard; Craft, Aaron E.
Neutron computed tomography has become a routine method at many neutron sources due to the availability of digital detection systems, powerful computers and advanced software. The commercial packages Octopus by Inside Matters and VGStudio by Volume Graphics have been established as a quasi-standard for high-end computed tomography. However, these packages require a stiff investment and are available to the users only on-site at the imaging facility to do their data processing. There is a demand from users to have image processing software at home to do further data processing; in addition, neutron computed tomography is now being introduced even at smaller and older reactors. Operators need to show a first working tomography setup before they can obtain a budget to build an advanced tomography system. Several packages are available on the web for free; however, these have been developed for X-rays or synchrotron radiation and are not immediately useable for neutron computed tomography. Three reconstruction packages and three 3D-viewers have been identified and used even for Gigabyte datasets. This paper is not a scientific publication in the classic sense, but is intended as a review to provide searchable help to make the described packages usable for the tomography community. It presents the necessary additional preprocessing in ImageJ, some workarounds for bugs in the software, and undocumented or badly documented parameters that need to be adapted for neutron computed tomography. The result is a slightly complicated, but surprisingly high-quality path to neutron computed tomography images in 3D, but not a replacement for the even more powerful commercial software mentioned above.
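One preprocessing step every freeware reconstruction pipeline needs before the tomographic step is open-beam (flat-field) and dark-field normalization followed by the log transform. A minimal per-pixel sketch (pure Python lists for clarity; in ImageJ this is whole-frame image arithmetic, and the function and parameter names here are assumptions):

```python
import math

def normalize_projection(raw, flat, dark, eps=1e-6):
    """Convert raw projection counts to line-integral attenuation values:
    transmission T = (raw - dark) / (flat - dark), attenuation = -ln(T).
    eps guards against zero or negative values from detector noise."""
    out = []
    for p, f, d in zip(raw, flat, dark):
        t = max((p - d) / max(f - d, eps), eps)
        out.append(-math.log(t))
    return out

# Toy 1-D detector row: open beam 1000 counts, dark frame 50 counts.
# First pixel sits behind the sample, second sees the open beam.
att = normalize_projection([525.0, 1000.0], [1000.0, 1000.0], [50.0, 50.0])
```

Free reconstruction packages written for X-ray or synchrotron data often assume this normalization has already been done, which is one reason neutron datasets need the extra ImageJ preprocessing the paper describes.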
Computer assisted learning (CAL) of oral manifestations of HIV disease.
Porter, S R; Telford, A; Chandler, K; Furber, S; Williams, J; Price, S; Scully, C; Triantos, D; Bain, L
1996-09-07
General dental practitioners (GDPs) in the UK may want additional education on relevant aspects of human immunodeficiency virus (HIV) disease. The aim of the present study was to develop and assess a computer assisted learning package on the oral manifestations of HIV disease of relevance to GDPs. A package was developed using a commercially available software development tool and assessed by a group of 75 GDPs interested in education and computers. Fifty-four (72%) of the GDPs completed a self-administered questionnaire recording their opinions of the package. The majority reported that the package was easy to load and run, that it provided clear instructions and displays, and that it was a more effective educational tool than videotapes, audiotapes, professional journals and textbooks, and of similar benefit to post-graduate courses. The GDPs often commented favourably on the effectiveness of the clinical images and the use of questions and answers, although some had criticisms of these and other aspects of the package. As a consequence of this investigation the package has been modified and distributed to GDPs in England and Wales.
An Ada Linear-Algebra Software Package Modeled After HAL/S
NASA Technical Reports Server (NTRS)
Klumpp, Allan R.; Lawson, Charles L.
1990-01-01
New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPACK for solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.
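The quaternion support mentioned above comes down to a handful of small routines; the core one is the Hamilton product used to compose attitude rotations. A sketch in plain Python (the package itself is Ada generics, and the names here are illustrative):

```python
def quat_mul(q1, q2):
    """Hamilton product of quaternions given as (w, x, y, z) tuples,
    the composition operation used for attitude rotations."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

# Unit basis quaternions compose like i * j = k:
k = quat_mul((0.0, 1.0, 0.0, 0.0), (0.0, 0.0, 1.0, 0.0))
```

Quaternion composition, unlike accumulated rotation matrices, stays numerically well-behaved with a cheap renormalization step, which is why flight software such as the Shuttle's favors it.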
Accounting utility for determining individual usage of production level software systems
NASA Technical Reports Server (NTRS)
Garber, S. C.
1984-01-01
An accounting package was developed which determines the computer resources utilized by a user during the execution of a particular program and updates a file containing accumulated resource totals. The accounting package is divided into two separate programs. The first program determines the total amount of computer resources utilized by a user during the execution of a particular program. The second program uses these totals to update a file containing accumulated totals of computer resources utilized by a user for a particular program. This package is useful to those persons who have several other users continually accessing and running programs from their accounts. The package provides the ability to determine which users are accessing and running specified programs along with their total level of usage.
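The two-step design described above, measuring one run and then folding it into accumulated per-user, per-program totals, can be sketched with an in-memory totals table. The field names and resource categories are invented for illustration, not taken from the package:

```python
def record_run(totals, user, program, cpu_seconds, io_ops):
    """Fold one run's resource usage into the accumulated totals,
    keyed by (user, program) as the abstract's package does per account."""
    key = (user, program)
    cpu, io, runs = totals.get(key, (0.0, 0, 0))
    totals[key] = (cpu + cpu_seconds, io + io_ops, runs + 1)

totals = {}
record_run(totals, "alice", "orbit_sim", 12.5, 300)
record_run(totals, "alice", "orbit_sim", 7.5, 200)
record_run(totals, "bob", "orbit_sim", 1.0, 10)
```

In the real package the totals live in a file rather than memory, which is what lets an account owner see, across sessions, which users are running which programs and at what level of usage.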
Optimal segmentation and packaging process
Kostelnik, K.M.; Meservey, R.H.; Landon, M.D.
1999-08-10
A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D and D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items are determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are then actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded. 3 figs.
Hypertext-based computer vision teaching packages
NASA Astrophysics Data System (ADS)
Marshall, A. David
1994-10-01
The World Wide Web Initiative has provided a means of delivering hypertext- and multimedia-based information across the whole INTERNET, and many applications have been developed on such http servers. At Cardiff we have developed an http-based hypertext multimedia server, the Cardiff Information Server, using the widely available Mosaic system. The server provides a variety of information, ranging from teaching modules, on-line documentation and timetables for departmental activities to more light-hearted hobby interests. One important and novel addition to the server has been the development of courseware facilities, ranging from on-line lecture notes, exercises and their solutions to more interactive teaching packages. A variety of disciplines have benefited, notably Computer Vision and Image Processing, but also C programming, X Windows, Computer Graphics and Parallel Computing. This paper addresses the issues involved in implementing the Computer Vision and Image Processing packages and the advantages gained from using a hypertext-based system, and relates practical experiences of using the packages in a class environment. It considers how best to provide information in such a hypertext-based system and how interactive image processing packages can be developed and integrated into courseware. The suite of tools developed facilitates a flexible and powerful courseware package that has proved popular in the classroom and over the Internet. The paper also details many future developments we see as possible. One of the key points raised in the paper is that Mosaic's hypertext language (HTML) is extremely powerful and yet relatively straightforward to use. It is also possible to link in Unix calls so that programs and shells can be executed, providing a powerful suite of utilities that can be exploited to develop many packages.
Dosimetry on the Spacelab missions IML1 and IML2, and D2 and on MIR.
Reitz, G; Beaujean, R; Heilmann, C; Kopp, J; Leicher, M; Strauch, K
1996-11-01
Detector packages consisting of plastic nuclear track detectors, nuclear emulsions, and thermoluminescence detectors were exposed inside BIORACK during the Spacelab missions IML1 and IML2, in different sections of the MIR space station, and, during D2, inside the Spacelab module at rack front panels or stowage lockers and in the Spacelab tunnel. In addition, during D2, each Payload Specialist (PS) wore three permanent detector packages: one at the neck, one at the waist, and one at the ankle. Total dose measurements, particle fluence rates and LET spectra, numbers of nuclear disintegrations, and neutron doses from this exposure are given in this report. The results are compared with theoretical calculations and with results from previous missions. The dose equivalents (total radiation exposure) received by the PSs were calculated from the measurements and range from 190 to 770 microSv d-1. Finally, a cursory investigation of results from a particle telescope consisting of two silicon detectors, first used in the last BIORACK mission on STS76, is reported.
Analysis of reference transactions using packaged computer programs.
Calabretta, N; Ross, R
1984-01-01
Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.
ERIC Educational Resources Information Center
Nosik, Melissa R.; Williams, W. Larry; Garrido, Natalia; Lee, Sarah
2013-01-01
In the current study, behavior skills training (BST) is compared to a computer-based training package for teaching discrete-trial instruction to staff who were teaching an adult with autism. The computer-based training package consisted of instructions, video modeling, and feedback; BST consisted of instructions, modeling, rehearsal, and feedback. Following…
Virginia Transit Performance Evaluation Package (VATPEP).
DOT National Transportation Integrated Search
1987-01-01
The Virginia Transit Performance Evaluation Package (VATPEP), a computer software package, is documented. This is the computerized version of the methodology used by the Virginia Department of Transportation to evaluate the performance of public tran...
21 CFR 1314.110 - Reports for mail-order sales.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...
21 CFR 1314.110 - Reports for mail-order sales.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...
21 CFR 1314.110 - Reports for mail-order sales.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...
21 CFR 1314.110 - Reports for mail-order sales.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...
21 CFR 1314.110 - Reports for mail-order sales.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Administration, submit the report in electronic form, either via computer disk or direct electronic data... units (e.g., 100 doses per package); (11) Package type (blister pack, etc.); (12) Number of packages...
spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains
NASA Astrophysics Data System (ADS)
Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo
2016-09-01
The paper presents the spatial Markov chain (spMC) R package and a case study of subsoil simulation/prediction at a plain site in northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection, and it implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Simulation methods based on well-known prediction methods (such as indicator kriging and co-kriging) are implemented in the package, and further, more advanced methods are available for simulation, e.g. path methods and Bayesian procedures that exploit the maximum entropy. Since the spMC package was developed for intensive geostatistical computations, part of the code is parallelized via OpenMP constructs. A final analysis compares the computational efficiency of the simulation/prediction algorithms for different numbers of CPU cores, using the case-study data set included in the package.
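The core estimation step described above — counting transitions between lithological categories observed along a direction — can be sketched independently of the R package. The following Python fragment is a minimal illustration with a made-up facies log, not the spMC API:

```python
from collections import Counter, defaultdict

def transition_probabilities(sequence):
    """Estimate empirical one-step transition probabilities of a
    categorical (e.g. lithological) sequence logged along one direction."""
    counts = defaultdict(Counter)
    for a, b in zip(sequence, sequence[1:]):
        counts[a][b] += 1  # tally observed category-to-category transitions
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

# Hypothetical facies logged down a synthetic borehole:
log = ["sand", "sand", "clay", "clay", "clay", "sand", "gravel", "sand"]
probs = transition_probabilities(log)
```

Real spMC estimation additionally handles multiple spatial directions and lag distances; this sketch shows only the counting idea for a single ordered sequence.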
NASA Astrophysics Data System (ADS)
Waghorn, Ben J.; Shah, Amish P.; Ngwa, Wilfred; Meeks, Sanford L.; Moore, Joseph A.; Siebers, Jeffrey V.; Langen, Katja M.
2010-07-01
Intra-fraction organ motion during intensity-modulated radiation therapy (IMRT) treatment can cause differences between the planned and the delivered dose distribution. To investigate the extent of these dosimetric changes, a computational model was developed and validated. The computational method allows for calculation of the rigid motion perturbed three-dimensional dose distribution in the CT volume and therefore a dose volume histogram-based assessment of the dosimetric impact of intra-fraction motion on a rigidly moving body. The method was developed and validated for both step-and-shoot IMRT and solid compensator IMRT treatment plans. For each segment (or beam), fluence maps were exported from the treatment planning system. Fluence maps were shifted according to the target position deduced from a motion track. These shifted, motion-encoded fluence maps were then re-imported into the treatment planning system and were used to calculate the motion-encoded dose distribution. To validate the accuracy of the motion-encoded dose distribution the treatment plan was delivered to a moving cylindrical phantom using a programmed four-dimensional motion phantom. Extended dose response (EDR-2) film was used to measure a planar dose distribution for comparison with the calculated motion-encoded distribution using a gamma index analysis (3% dose difference, 3 mm distance-to-agreement). A series of motion tracks incorporating both inter-beam step-function shifts and continuous sinusoidal motion were tested. The method was shown to accurately predict the film's dose distribution for all of the tested motion tracks, both for the step-and-shoot IMRT and compensator plans. The average gamma analysis pass rate for the measured dose distribution with respect to the calculated motion-encoded distribution was 98.3 ± 0.7%. For static delivery the average film-to-calculation pass rate was 98.7 ± 0.2%. 
In summary, a computational technique has been developed to calculate the dosimetric effect of intra-fraction motion. This technique has the potential to evaluate a given plan's sensitivity to anticipated organ motion. With knowledge of the organ's motion it can also be used as a tool to assess the impact of measured intra-fraction motion after dose delivery.
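The bookkeeping behind such a technique — shift each segment's fluence map by the target offset deduced from the motion track, then accumulate — can be sketched as follows. This is a minimal Python illustration with hypothetical integer pixel offsets, not the authors' treatment-planning-system workflow:

```python
def shift_map(fmap, dx, dy):
    """Shift a 2-D fluence map by (dx, dy) pixels, zero-filling edges."""
    ny, nx = len(fmap), len(fmap[0])
    out = [[0.0] * nx for _ in range(ny)]
    for j in range(ny):
        for i in range(nx):
            js, is_ = j - dy, i - dx
            if 0 <= js < ny and 0 <= is_ < nx:
                out[j][i] = fmap[js][is_]
    return out

def motion_encoded_fluence(segments, offsets):
    """Sum per-segment fluence maps, each shifted by the target offset
    active while that segment was delivered."""
    ny, nx = len(segments[0]), len(segments[0][0])
    total = [[0.0] * nx for _ in range(ny)]
    for fmap, (dx, dy) in zip(segments, offsets):
        shifted = shift_map(fmap, dx, dy)
        for j in range(ny):
            for i in range(nx):
                total[j][i] += shifted[j][i]
    return total

# Two identical toy segments; the second is delivered after a 1-pixel shift.
seg = [[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.0]]
total = motion_encoded_fluence([seg, seg], [(0, 0), (1, 0)])
```

In the paper the shifted maps are re-imported into the planning system for a full 3-D dose calculation; the sketch stops at the fluence-accumulation step.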
Study of the TRAC Airfoil Table Computational System
NASA Technical Reports Server (NTRS)
Hu, Hong
1999-01-01
The report documents the application of the TRAC airfoil table computational package (TRACFOIL) to the prediction of 2D airfoil force and moment data over a wide range of angle of attack and Mach number. TRACFOIL generates the standard C-81 airfoil table for input into rotorcraft comprehensive codes such as CAMRAD. The existing TRACFOIL computer package was successfully modified to run on Digital Alpha workstations and on Cray C90 supercomputers, and step-by-step instructions for using the package on both platforms are provided. The new version of TRACFOIL was applied to two airfoil sections, and the C-81 data obtained with the TRACFOIL method are compared with wind-tunnel data.
Tuning wettability of hydrogen titanate nanowire mesh by Na+ irradiation
NASA Astrophysics Data System (ADS)
Das, Pritam; Chatterjee, Shyamal
2018-04-01
Hydrogen titanate (HT) nanowires have been widely studied for their remarkable properties and various potential applications. However, only a handful of studies address ion-beam-induced structural changes and their influence on the wetting behavior of the HT nanowire surface. In this work, we exposed HT nanowires to 5 keV Na+ ions at a fluence of 1 × 10^16 ions/cm^2. Scanning electron microscopy shows that at this fluence the nanowires bend arbitrarily and weld to each other, forming an interlinked network structure. Computer simulation shows that the ion beam induces defect formation in the nanowires, which plays a major role in these structural modifications. An interesting alteration of the surface wetting property is observed upon irradiation: the hydrophilic pristine surface turns hydrophobic.
The Hidden Cost of Buying a Computer.
ERIC Educational Resources Information Center
Johnson, Michael
1983-01-01
In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)
Implementation and use of direct-flow connections in a coupled ground-water and surface-water model
Swain, Eric D.
1994-01-01
The U.S. Geological Survey's MODFLOW finite-difference ground-water flow model has been coupled with three surface-water packages - the MODBRANCH, River, and Stream packages - to simulate surface water and its interaction with ground water. Prior to the development of the coupling packages, the only interaction between these modeling packages was that leakage values could be passed between MODFLOW and the three surface-water packages. To facilitate wider and more flexible uses of the models, a computer program was developed and added to MODFLOW to allow direct flows or stages to be passed between any of the packages and MODFLOW. The flows or stages calculated in one package can be set as boundary discharges or stages to be used in another package. Several modeling packages can be used in the same simulation depending upon the level of sophistication needed in the various reaches being modeled. This computer program is especially useful when any of the River, Stream, or MODBRANCH packages are used to model a river flowing directly into or out of wetlands in direct connection with the aquifer and represented in the model as an aquifer block. A field case study is shown to illustrate an application.
Development of a computer-assisted learning software package on dental traumatology.
Tolidis, K; Crawford, P; Stephens, C; Papadogiannis, Y; Plakias, C
1998-10-01
The development of computer-assisted learning software packages is a relatively new field of computer application. The progress made in personal computer technology toward more user-friendly operating systems has stimulated the academic community to develop computer-assisted learning for pre- and postgraduate students. The ability of computers to combine audio and visual data in an interactive form provides a powerful educational tool. The purpose of this study was to develop and evaluate a computer-assisted learning package on dental traumatology. This program contains background information on the diagnosis, classification, and management of dental injuries in both the permanent and the deciduous dentitions. It is structured into chapters according to the nature of the injury and whether injury has occurred in the primary or permanent dentition. At the end of each chapter there is a self-assessment questionnaire as well as references to relevant literature. Extensive use of pictures and video provides a comprehensive overview of the subject.
Isotopic Dependence of GCR Fluence behind Shielding
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Wilson, John W.; Saganti, Premkumar; Kim, Myung-Hee Y.; Cleghorn, Timothy; Zeitlin, Cary; Tripathi, Ram K.
2006-01-01
In this paper we consider the effects of the isotopic composition of the primary galactic cosmic rays (GCR), nuclear fragmentation cross sections, and the isotopic grid on the solutions of transport models used for shielding studies. Satellite measurements are used to describe the isotopic composition of the GCR. For the nuclear interaction database and transport solution, we use the quantum multiple-scattering theory of nuclear fragmentation (QMSFRG) and the high-charge-and-energy (HZETRN) transport code, respectively. The QMSFRG model is shown to accurately describe existing fragmentation data, including a proper description of the odd-even effects as a function of the isospin of the projectile nucleus. The principal finding of this study is that large errors (±100%) occur in the mass-fluence spectra when comparing transport models that use a complete isotopic grid (approx. 170 ions) to ones that use a reduced grid, for example the 59-ion grid used in the HZETRN code in the past; less significant errors (<±20%) occur in the elemental-fluence spectra. Because a complete isotopic grid is readily handled on small computer workstations and is needed for several applications studying GCR propagation and scattering, it is recommended that complete grids be used for future GCR studies.
An Introduction to Research and the Computer: A Self-Instructional Package.
ERIC Educational Resources Information Center
Vasu, Ellen Storey; Palmer, Richard I.
This self-instructional package includes learning objectives, definitions, exercises, and feedback for learning some basic concepts and skills involved in using computers for analyzing data and understanding basic research terminology. Learning activities are divided into four sections: research and research hypotheses; variables, cases, and…
Ahn, Woo-Young; Haines, Nathaniel; Zhang, Lei
2017-01-01
Reinforcement learning and decision-making (RLDM) provide a quantitative framework and computational theories with which we can disentangle psychiatric conditions into the basic dimensions of neurocognitive functioning. RLDM offer a novel approach to assessing and potentially diagnosing psychiatric patients, and there is growing enthusiasm for both RLDM and computational psychiatry among clinical researchers. Such a framework can also provide insights into the brain substrates of particular RLDM processes, as exemplified by model-based analysis of data from functional magnetic resonance imaging (fMRI) or electroencephalography (EEG). However, researchers often find the approach too technical and have difficulty adopting it for their research. Thus, a critical need remains to develop a user-friendly tool for the wide dissemination of computational psychiatric methods. We introduce an R package called hBayesDM (hierarchical Bayesian modeling of Decision-Making tasks), which offers computational modeling of an array of RLDM tasks and social exchange games. The hBayesDM package offers state-of-the-art hierarchical Bayesian modeling, in which both individual and group parameters (i.e., posterior distributions) are estimated simultaneously in a mutually constraining fashion. At the same time, the package is extremely user-friendly: users can perform computational modeling, output visualization, and Bayesian model comparisons, each with a single line of coding. Users can also extract the trial-by-trial latent variables (e.g., prediction errors) required for model-based fMRI/EEG. With the hBayesDM package, we anticipate that anyone with minimal knowledge of programming can take advantage of cutting-edge computational-modeling approaches to investigate the underlying processes of and interactions between multiple decision-making (e.g., goal-directed, habitual, and Pavlovian) systems. 
In this way, we expect that the hBayesDM package will contribute to the dissemination of advanced modeling approaches and enable a wide range of researchers to easily perform computational psychiatric research within different populations. PMID:29601060
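hBayesDM itself is an R package built on hierarchical Bayesian estimation, but the kind of trial-by-trial latent variable it exports can be illustrated with the simplest RLDM building block, a delta-rule (Rescorla-Wagner) value update. The Python sketch below is an illustration of that model class, not the hBayesDM API:

```python
def q_learning_trace(choices, rewards, alpha=0.2, n_options=2):
    """Delta-rule value learning. Returns final values and the
    trial-by-trial reward prediction errors -- the latent variables
    typically regressed against fMRI/EEG signals."""
    q = [0.0] * n_options
    pes = []
    for c, r in zip(choices, rewards):
        pe = r - q[c]        # reward prediction error on this trial
        q[c] += alpha * pe   # Rescorla-Wagner update of the chosen option
        pes.append(pe)
    return q, pes

# Two trials, always choosing option 0 and always rewarded:
q, pes = q_learning_trace([0, 0], [1.0, 1.0])
```

In hBayesDM the learning rate would not be fixed but estimated per subject under a group-level prior; here it is a plain parameter for clarity.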
Development of high performance scientific components for interoperability of computing packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulabani, Teena Pratap
2008-01-01
Three major high-performance quantum chemistry computational packages, NWChem, GAMESS, and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by their different communication patterns and software designs. Chemistry algorithms are hard to develop and time-consuming to implement; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinventing the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of component-based software engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process for interfacing two widely used and important computational chemistry methodologies: quantum mechanics and molecular mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
Review and analysis of dense linear system solver package for distributed memory machines
NASA Technical Reports Server (NTRS)
Narang, H. N.
1993-01-01
A dense linear system solver package recently developed at the University of Texas at Austin for distributed memory machines (e.g., the Intel Paragon) has been reviewed and analyzed. The package contains about 45 software routines, some written in FORTRAN and some in C, and forms the basis for parallel/distributed solutions of systems of linear equations encountered in many problems of a scientific and engineering nature. The package, being studied by the Computer Applications Branch of the Analysis and Computation Division, may provide a significant computational resource for NASA scientists and engineers in parallel/distributed computing. Since the package is new and not well tested or documented, many of its underlying concepts and implementations were unclear; our task was to review, analyze, and critique the package as a step in the process that will enable scientists and engineers to apply it to the solution of their problems. All routines in the package were reviewed and analyzed. Underlying theory or concepts, which exist in the form of published papers, technical reports, or memos, were obtained either from the author or from the scientific literature, and general algorithms, explanations, examples, and critiques are provided to explain the workings of these programs. Wherever things remained unclear, the developer (author) was contacted by telephone or electronic mail to clarify the workings of the routines. Whenever possible, tests were made to verify the concepts and logic employed in the implementations. A detailed report explaining the workings of these routines is being documented separately.
Energy-based dosimetry of low-energy, photon-emitting brachytherapy sources
NASA Astrophysics Data System (ADS)
Malin, Martha J.
Model-based dose calculation algorithms (MBDCAs) for low-energy, photon-emitting brachytherapy sources have advanced to the point where the algorithms may be used in clinical practice. Before these algorithms can be used, a methodology must be established to verify the accuracy of the source models used by the algorithms. Additionally, the source strength metric for these algorithms must be established. This work explored the feasibility of verifying the source models used by MBDCAs by measuring the differential photon fluence emitted from the encapsulation of the source. The measured fluence could be compared to that modeled by the algorithm to validate the source model. This work examined how the differential photon fluence varied with position and angle of emission from the source, and the resolution that these measurements would require for dose computations to be accurate to within 1.5%. Both the spatial and angular resolution requirements were determined. The techniques used to determine the resolution required for measurements of the differential photon fluence were applied to determine why dose-rate constants determined using a spectroscopic technique disagreed with those computed using Monte Carlo techniques. The discrepancy between the two techniques had been previously published, but the cause of the discrepancy was not known. This work determined the impact that some of the assumptions used by the spectroscopic technique had on the accuracy of the calculation. The assumption of isotropic emission was found to cause the largest discrepancy in the spectroscopic dose-rate constant. Finally, this work improved the instrumentation used to measure the rate at which energy leaves the encapsulation of a brachytherapy source. This quantity is called emitted power (EP), and is presented as a possible source strength metric for MBDCAs. A calorimeter that measured EP was designed and built. 
The theoretical framework that the calorimeter relied upon to measure EP was established. Four clinically relevant 125I brachytherapy sources were measured with the instrument. The accuracy of the measured EP was compared to an air-kerma strength-derived EP to test the accuracy of the instrument. The instrument was accurate to within 10%, with three out of the four source measurements accurate to within 4%.
Can I Trust This Software Package? An Exercise in Validation of Computational Results
ERIC Educational Resources Information Center
Shacham, Mordechai; Brauner, Neima; Ashurst, W. Robert; Cutlip, Michael B.
2008-01-01
Mathematical software packages such as Polymath, MATLAB, and Mathcad are currently widely used for engineering problem solving. Applications of several of these packages to typical chemical engineering problems have been demonstrated by Cutlip et al. The main characteristic of these packages is that they provide a "problem-solving environment…
An Interactive Computer Aided Electrical Engineering Education Package.
ERIC Educational Resources Information Center
Cavati, Cicero Romao
This paper describes an educational software package designed to support the learning process, presented through a case study of an energy distribution course in the Electrical Engineering Department at the Federal University of Espirito Santo (UFES). The advantages of the developed package are shown by comparing it with the traditional academic textbook. This package presents…
Eddylicious: A Python package for turbulent inflow generation
NASA Astrophysics Data System (ADS)
Mukha, Timofey; Liefvendahl, Mattias
2018-01-01
A Python package for generating inflow for scale-resolving computer simulations of turbulent flow is presented. The purpose of the package is to unite existing inflow generation methods in a single code-base and make them accessible to users of various Computational Fluid Dynamics (CFD) solvers. The currently existing functionality consists of an accurate inflow generation method suitable for flows with a turbulent boundary layer inflow and input/output routines for coupling with the open-source CFD solver OpenFOAM.
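For orientation, the simplest conceivable inflow generator — a mean velocity profile plus uncorrelated random fluctuations — can be sketched in a few lines. Note that packages like eddylicious exist precisely because realistic generators must do better than this (spatially and temporally correlated fluctuations); the sketch below, with a hypothetical profile, is a crude stand-in and not the package's method:

```python
import random

def inflow_plane(mean_profile, ti, ny, nz, seed=0):
    """One time sample of a streamwise inflow velocity plane:
    mean profile U(y) plus uncorrelated Gaussian fluctuations of
    turbulence intensity `ti`. Real inflow generators add the
    spatial/temporal correlation this sketch lacks."""
    rng = random.Random(seed)
    return [[mean_profile[j] * (1.0 + ti * rng.gauss(0.0, 1.0))
             for _ in range(nz)]
            for j in range(ny)]

profile = [0.5, 0.8, 1.0]  # hypothetical U(y) in m/s, wall to free stream
plane = inflow_plane(profile, ti=0.05, ny=3, nz=4)
```

A CFD solver would read one such plane per time step at the inlet boundary; eddylicious's role is to produce those planes with physically realistic statistics and to write them in the format the solver (e.g. OpenFOAM) expects.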
Application of GA package in functional packaging
NASA Astrophysics Data System (ADS)
Belousova, D. A.; Noskova, E. E.; Kapulin, D. V.
2018-05-01
An application-program approach to the task of configuring the elements of a commutation circuit in the design of radio-electronic equipment, based on a genetic algorithm, is proposed. The efficiency of the approach is demonstrated for commutation circuits with different characteristics in computer-aided design for radio-electronic manufacturing. A prototype computer-aided design subsystem was programmed on the basis of the GA package for R, with a set of general functions for the optimization of multivariate models.
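The genetic-algorithm machinery underlying such a subsystem can be sketched generically. The following Python toy (with a made-up fitness function over binary placement vectors, not the authors' R-based code) shows the usual select/crossover/mutate loop:

```python
import random

def genetic_algorithm(fitness, n_genes, pop_size=20, gens=50,
                      p_mut=0.1, seed=1):
    """Minimal GA: tournament selection, one-point crossover, and
    bit-flip mutation; maximizes `fitness` over binary vectors."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(gens):
        def pick():  # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_genes)   # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < p_mut:          # occasional bit flip
                i = rng.randrange(n_genes)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy objective: an all-ones "placement" vector is optimal.
best = genetic_algorithm(sum, n_genes=12)
```

In a real placement task the fitness would score wire length or routability of a candidate element configuration rather than this toy objective.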
A Maple package for computing Gröbner bases for linear recurrence relations
NASA Astrophysics Data System (ADS)
Gerdt, Vladimir P.; Robertz, Daniel
2006-04-01
A Maple package for computing Gröbner bases of linear difference ideals is described. The underlying algorithm is based on Janet and Janet-like monomial divisions associated with finite difference operators. The package can be used, for example, for automatic generation of difference schemes for linear partial differential equations and for reduction of multiloop Feynman integrals. These two possible applications are illustrated by simple examples of the Laplace equation and a one-loop scalar integral of propagator type.
Radiation hardness of Efratom M-100 rubidium frequency standard
NASA Technical Reports Server (NTRS)
English, T. C.; Vorwerk, H.; Rudie, N. J.
1983-01-01
The effects of nuclear radiation on rubidium gas cell frequency standards and components are presented, including the results of recent tests in which a continuously operating rubidium frequency standard (Efratom Model M-100) was subjected to simultaneous neutron/gamma radiation. At the highest neutron fluence (7.5 × 10^12 n/cm^2) and total dose (11 krad(Si)) tested, the unit operated satisfactorily; the total frequency change over the 2.5-hour test period due to all causes, including repeated retraction from and insertion into the reactor, was less than 1 × 10^-10. The effects of combined neutron/gamma radiation on the components of the rubidium frequency standard physics package were also studied, and the results are presented.
An Innovative Learning Model for Computation in First Year Mathematics
ERIC Educational Resources Information Center
Tonkes, E. J.; Loch, B. I.; Stace, A. W.
2005-01-01
MATLAB is a sophisticated software tool for numerical analysis and visualization. The University of Queensland has adopted MATLAB as its official teaching package across large first-year mathematics courses. In the past, the package met severe resistance from students who did not appreciate their computational experience. Several main…
Datson, D J; Carter, N G
1988-10-01
The use of personal computers in accountancy, and in business generally, has been stimulated by the availability of flexible software packages. We describe the implementation of a commercial software package designed for interfacing with laboratory instruments and highlight the ease with which it can be brought into use without the need for specialist computer programming staff.
WiLE: A Mathematica package for weak coupling expansion of Wilson loops in ABJ(M) theory
NASA Astrophysics Data System (ADS)
Preti, M.
2018-06-01
We present WiLE, a Mathematica® package designed to perform the weak coupling expansion of any Wilson loop in ABJ(M) theory at arbitrary perturbative order. For a given set of fields on the loop and internal vertices, the package displays all possible Feynman diagrams and their integral representations. The user can also choose to exclude non-planar diagrams, tadpoles, and self-energies. Through the use of interactive input windows, the package should be easily accessible to users with little or no previous experience. The package manual provides some pedagogical examples and the computation of all ladder diagrams at three loops relevant for the cusp anomalous dimension in ABJ(M). The latter application also supports some recent results computed in different contexts.
1988-03-01
PACKAGE BODY, TLCSC P661 (CATALOG #P106-0). This package contains the CAMP parts required to do the waypoint steering portion of navigation. The ... 3.3.4.1.6 PROCESSING: The following describes the processing performed by this part:

   package body WaypointSteering is
      package body ...Steering_Vector_Operations is separate;
      package body Steering_Vector_Operations_with_Arcsin is separate;
      procedure Compute_Turn_Angle_and_Direction (Unit_Normal_C
NASA Technical Reports Server (NTRS)
Janoudi, A.; Poff, K. L.
1990-01-01
The relationship between the amount of light and the amount of response for any photobiological process can be based on the number of incident quanta per unit time (fluence rate-response) or on the number of incident quanta during a given period of irradiation (fluence-response). Fluence-response and fluence rate-response relationships have been measured for second positive phototropism by seedlings of Arabidopsis thaliana. The fluence-response relationships exhibit a single limiting threshold at about 0.01 micromole per square meter when measured at fluence rates from 2.4 × 10^-5 to 6.5 × 10^-3 micromoles per square meter per second. The threshold values in the fluence rate-response curves decrease with increasing time of irradiation, but show a common fluence threshold at about 0.01 micromole per square meter. These thresholds are the same as the threshold of about 0.01 micromole per square meter measured for first positive phototropism. Based on these data, it is suggested that second positive curvature has a threshold in time of about 10 minutes. Moreover, if the times of irradiation exceed the time threshold, there is a single limiting fluence threshold at about 0.01 micromole per square meter. Thus, the limiting fluence threshold for second positive phototropism is the same as the fluence threshold for first positive phototropism. Based on these data, we suggest that this common fluence threshold for first positive and second positive phototropism is set by a single photoreceptor pigment system.
SU-F-T-540: Comprehensive Fluence Delivery Optimization with Multileaf Collimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weppler, S; Villarreal-Barajas, J; Department of Medical Physics, Tom Baker Cancer Center, Calgary, Alberta
2016-06-15
Purpose: Multileaf collimator (MLC) leaf sequencing is performed via commercial black-box implementations to which the user has limited or no access. We have developed an explicit, generic MLC sequencing model to serve as a tool for future investigations of fluence map optimization, fluence delivery optimization, and rotational collimator delivery methods. Methods: We have developed a novel, comprehensive model that accounts for a variety of transmission and penumbra effects previously treated on an ad hoc basis in the literature. As the model is capable of quantifying a variety of effects, we utilize the asymmetric leakage intensity across each leaf to deliver fluence maps with pixel size smaller than the narrowest leaf width. Developed using linear programming and mixed integer programming formulations, the model is implemented using state-of-the-art open-source solvers. To demonstrate the versatility of the algorithm, a graphical user interface (GUI) was developed in MATLAB, capable of accepting custom leaf specifications and transmission parameters. As a preliminary proof-of-concept, we have sequenced the leaves of a Varian 120-leaf Millennium MLC for five prostate cancer patient fields and one head and neck field. Predetermined fluence maps were processed by data smoothing methods to obtain pixel sizes of 2.5 cm^2. The quality of the output was analyzed using computer simulations. Results: For the prostate fields, an average root mean squared error (RMSE) of 0.82 and a gamma (0.5 mm/0.5%) pass rate of 91.4% were observed, compared to RMSE and gamma (0.5 mm/0.5%) values of 7.04 and 34.0% when the leakage considerations were omitted. Similar results were observed for the head and neck case. Conclusion: A model to sequence MLC leaves to optimality has been proposed.
Future work will involve extensive testing and evaluation of the method on clinical MLCs and comparison with the black-box leaf-sequencing algorithms currently used by commercial treatment planning systems.
Packaging printed circuit boards: A production application of interactive graphics
NASA Technical Reports Server (NTRS)
Perrill, W. A.
1975-01-01
The structure and use of an Interactive Graphics Packaging Program (IGPP), conceived to apply computer graphics to the design of packaging electronic circuits onto printed circuit boards (PCBs), are described. The intent was to combine the data storage and manipulative power of the computer with the imaginative, intuitive power of a human designer. The hardware includes a CDC 6400 computer and two CDC 777 terminals with CRT screens, light pens, and keyboards. The program is written in FORTRAN IV Extended, with the exception of a few functions coded in COMPASS (assembly language). The IGPP performs four major functions for the designer: (1) data input and display, (2) component placement (automatic or manual), (3) conductor path routing (automatic or manual), and (4) data output. The most complex PCB packaged to date measured 16.5 cm by 19 cm and contained 380 components, two layers of ground planes, and four layers of conductors mixed with ground planes.
Software package for modeling spin-orbit motion in storage rings
NASA Astrophysics Data System (ADS)
Zyuzin, D. V.
2015-12-01
A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of the obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10⁶-10⁹ particles in a beam during 10⁹ turns in an accelerator (about 10¹²-10¹⁵ integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.
The `TTIME' Package: Performance Evaluation in a Cluster Computing Environment
NASA Astrophysics Data System (ADS)
Howe, Marico; Berleant, Daniel; Everett, Albert
2011-06-01
The objective of translating developmental event timing across mammalian species is to gain an understanding of the timing of human developmental events based on the known timing of those events in animals. The potential benefits include improvements to diagnostic and intervention capabilities. The CRAN `ttime' package provides the functionality to infer unknown event timings and investigate phylogenetic proximity utilizing hierarchical clustering of both known and predicted event timings. The original generic mammalian model included nine eutherian mammals: Felis domestica (cat), Mustela putorius furo (ferret), Mesocricetus auratus (hamster), Macaca mulatta (monkey), Homo sapiens (human), Mus musculus (mouse), Oryctolagus cuniculus (rabbit), Rattus norvegicus (rat), and Acomys cahirinus (spiny mouse). However, the model is expected to grow as data on additional developmental events are identified and incorporated into the analysis. Performance evaluation of the `ttime' package in a cluster computing environment versus a comparative analysis in a serial computing environment provides an important computational performance assessment. A theoretical analysis is the first stage of a process in which the second stage, if justified by the theoretical analysis, is to investigate an actual implementation of the `ttime' package in a cluster computing environment and to understand the parallelization process that underlies the implementation.
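The hierarchical-clustering idea behind `ttime' can be sketched with SciPy (the `ttime' package itself is written in R; the species timing values below are invented for illustration, not data from the package):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical event-timing matrix: rows = species, columns = developmental
# events (days post-conception; illustrative values only).
species = ["mouse", "rat", "rabbit", "human"]
timings = np.array([
    [9.0, 11.0, 13.0],   # mouse
    [10.0, 12.0, 14.0],  # rat
    [13.0, 16.0, 19.0],  # rabbit
    [30.0, 40.0, 52.0],  # human
])

# Average-linkage hierarchical clustering on Euclidean distances between
# species timing profiles, as a proxy for phylogenetic proximity.
Z = linkage(timings, method="average", metric="euclidean")
labels = fcluster(Z, t=2, criterion="maxclust")
print(dict(zip(species, labels)))  # rodents/rabbit group apart from human
```

With two clusters requested, the three small-rodent-like profiles group together and the human profile stands alone, mirroring how clustering of event timings reflects proximity between species.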
Kumar, Sudhir; Fenwick, John D; Underwood, Tracy S A; Deshpande, Deepak D; Scott, Alison J D; Nahum, Alan E
2015-10-21
In small photon fields ionisation chambers can exhibit large deviations from Bragg-Gray behaviour; the EGSnrc Monte Carlo (MC) code system has been employed to investigate this 'Bragg-Gray breakdown'. The total electron (+positron) fluence in small water and air cavities in a water phantom has been computed for a full linac beam model as well as for a point source spectrum for 6 MV and 15 MV qualities for field sizes from 0.25 × 0.25 cm² to 10 × 10 cm². A water-to-air perturbation factor has been derived as the ratio of total electron (+positron) fluence, integrated over all energies, in a tiny water volume to that in a 'PinPoint 3D-chamber-like' air cavity; for the 0.25 × 0.25 cm² field size the perturbation factors are 1.323 and 2.139 for 6 MV and 15 MV full linac geometries respectively. For the 15 MV full linac geometry for field sizes of 1 × 1 cm² and smaller not only the absolute magnitude but also the 'shape' of the total electron fluence spectrum in the air cavity is significantly different to that in the water 'cavity'. The physics of this 'Bragg-Gray breakdown' is fully explained, making reference to the Fano theorem. For the 15 MV full linac geometry in the 0.25 × 0.25 cm² field the directly computed MC dose ratio, water-to-air, differs by 5% from the product of the Spencer-Attix stopping-power ratio (SPR) and the perturbation factor; this 'difference' is explained by the difference in the shapes of the fluence spectra and is also formulated theoretically. We show that the dimensions of an air cavity with a perturbation factor within 5% of unity would have to be impractically small in these highly non-equilibrium photon fields.
In contrast the dose to water in a 0.25 × 0.25 cm² field derived by multiplying the dose in the single-crystal diamond dosimeter (SCDDo) by the Spencer-Attix ratio is within 2.9% of the dose computed directly in the water voxel for full linac geometry at both 6 and 15 MV, thereby demonstrating that this detector exhibits quasi Bragg-Gray behaviour over a wide range of field sizes and beam qualities.
I-deas TMG to NX Space Systems Thermal Model Conversion and Computational Performance Comparison
NASA Technical Reports Server (NTRS)
Somawardhana, Ruwan
2011-01-01
CAD/CAE packages change continuously as the power of the tools increases to meet demands. End-users must adapt to new products as they come to market and replace legacy packages. CAE modeling has continued to evolve and is constantly becoming more detailed and complex, though this comes at the cost of increased computing requirements. Parallel processing coupled with appropriate hardware can minimize computation time. Users of Maya Thermal Model Generator (TMG) are faced with transitioning from NX I-deas to NX Space Systems Thermal (SST). It is important to understand what differences arise when changing software packages; we are looking for consistency in results.
Use of symbolic computation in robotics education
NASA Technical Reports Server (NTRS)
Vira, Naren; Tunstel, Edward
1992-01-01
An application of symbolic computation in robotics education is described. A software package is presented which combines generality, user interaction, and user-friendliness with the systematic usage of symbolic computation and artificial intelligence techniques. The software utilizes MACSYMA, a LISP-based symbolic algebra language, to automatically generate closed-form expressions representing forward and inverse kinematics solutions, the Jacobian transformation matrices, robot pose error-compensation model equations, and the Lagrange dynamics formulation for N-degree-of-freedom, open-chain robotic manipulators. The goal of such a package is to aid faculty and students in robotics courses by removing the burdensome tasks of mathematical manipulation. The software package has been successfully tested for its accuracy using commercially available robots.
NASA Astrophysics Data System (ADS)
Tian, Wei; Kushner, Mark J.
2015-09-01
Tissue covered by a thin liquid layer treated by atmospheric pressure plasmas for biomedical applications ultimately requires a reproducible protocol for human healthcare. The outcomes of wet tissue treatment by dielectric barrier discharges (DBDs) depend on the plasma dose, which determines the integral fluences of radicals and ions onto the tissue. These fluences are controlled in part by frequency and liquid thickness. In this paper, we report on results from a computational investigation of multipulse DBDs interacting with wet tissue. The DBDs were simulated for 100 stationary or random streamers at different repetition rates and liquid thicknesses followed by 10 s to 2 min of afterglow. At 100 Hz, NO(aq) and OH(aq) are mixed by randomly striking streamers, although they have different rates of solvation. NO(aq) is nearly completely consumed by reactions with OH(aq) at the liquid surface. Only H2O2(aq), produced through OH(aq) mutual reactions, survives to reach the tissue. After 100 pulses, the liquid becomes ozone-rich, in which the nitrite ion, NO2−(aq), is converted to the nitrate ion, NO3−(aq). Reducing the pulse frequency to 10 Hz results in significant fluence of NO(aq) to the tissue, as NO(aq) can escape during the interpulse period from the liquid surface where OH(aq) is formed. For the same reason, NO2−(aq) can also reach deeper into the liquid at lower frequency. Frequency and thickness of the liquid are methods to control the plasma-produced aqueous species reaching the underlying tissue. Work supported by DOE (DE-SC0001319) and NSF (CHE-1124724).
Project SOLWIND: Space radiation exposure. [evaluation of particle fluxes
NASA Technical Reports Server (NTRS)
Stassinopoulos, E. G.
1975-01-01
A special orbital radiation study was conducted for the SOLWIND project to evaluate mission-encountered energetic particle fluxes. Magnetic field calculations were performed with a current field model, extrapolated to the tentative spacecraft launch epoch with linear time terms. Orbital flux integrations for circular flight paths were performed with the latest proton and electron environment models, using new improved computational methods. Temporal variations in the ambient electron environment are considered and partially accounted for. Estimates of average energetic solar proton fluences are given for a one year mission duration at selected integral energies ranging from E greater than 10 to E greater than 100 MeV; the predicted annual fluence is found to relate to the period of maximum solar activity during the next solar cycle. The results are presented in graphical and tabular form; they are analyzed, explained, and discussed.
Radiation damage of gallium arsenide production cells
NASA Technical Reports Server (NTRS)
Mardesich, N.; Garlick, G. F. J.
1987-01-01
High-efficiency gallium arsenide cells, made by the liquid-phase epitaxy (LPE) method, have been irradiated with 1-MeV electrons up to fluences of 10¹⁶ e/cm². Measurements have been made of cell spectral response and dark and light-excited current-voltage characteristics, and analyzed using computer-based models to determine underlying parameters such as damage coefficients. It is possible to use spectral response to sort out damage effects in the different cell component layers. Damage coefficients are similar to others reported in the literature for the emitter and buffer (base). However, there is also a damage effect in the window layer and possibly at the window-emitter interface similar to that found for proton-irradiated liquid-phase epitaxy-grown cells. Depletion layer recombination is found to be less than theoretically expected at high fluence.
METLIN-PC: An applications-program package for problems of mathematical programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pshenichnyi, B.N.; Sobolenko, L.A.; Sosnovskii, A.A.
1994-05-01
The METLIN-PC applications-program package (APP) was developed at the V.M. Glushkov Institute of Cybernetics of the Academy of Sciences of Ukraine on IBM PC XT and AT computers. The present version of the package was written in Turbo Pascal and Fortran-77. The METLIN-PC is chiefly designed for the solution of smooth problems of mathematical programming and is a further development of the METLIN prototype, which was created earlier on a BESM-6 computer. The principal property of the previous package is retained: the applications modules employ a single approach based on the linearization method of B.N. Pshenichnyi. Hence the name "METLIN."
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Q; Read, P
Purpose: Multiple error pathways can lead to delivery errors during the treatment course that cannot be caught with pre-treatment QA. While in vivo solutions are being developed for linacs, no such solution exists for tomotherapy. The purpose of this study is to develop a near real-time system for tomotherapy that can monitor the delivery and dose accumulation process during treatment delivery, enabling the user to assess the impact of delivery variations and/or errors and to interrupt the treatment if necessary. Methods: A program running on a tomotherapy planning station fetches the raw DAS data during treatment. Exit detector data is extracted as well as output, gantry angle, and other machine parameters. For each sample, the MLC open-close state is determined. The delivered plan is compared with the original plan via a Monte Carlo dose engine which transports fluence deviations from a pre-treatment Monte Carlo run. A report containing the difference in fluence, dose and DVH statistics is created in html format. This process is repeated until the treatment is completed. Results: Since we only need to compute the dose for the difference in fluence for a few projections each time, dose with 2% statistical uncertainty can be computed in less than 1 second on a 4-core CPU. However, the current bottleneck in this near real-time system is the repeated fetching and processing of the growing DAS data file throughout the delivery. The frame rate drops from 10 Hz at the beginning of treatment to 5 Hz after 3 minutes and to 2 Hz after 10 minutes. Conclusion: A during-treatment delivery monitoring system has been built to monitor tomotherapy treatments. The system improves patient safety by allowing operators to assess delivery variations and errors during treatment delivery and take appropriate actions.
Language Analysis Package (L.A.P.) Version I System Design.
ERIC Educational Resources Information Center
Porch, Ann
To permit researchers to use the speed and versatility of the computer to process natural language text as well as numerical data without undergoing special training in programing or computer operations, a language analysis package has been developed partially based on several existing programs. An overview of the design is provided and system…
Macintosh Computer Classroom and Laboratory Security: Preventing Unwanted Changes to the System.
ERIC Educational Resources Information Center
Senn, Gary J.; Smyth, Thomas J. C.
Because of the graphical interface and "openness" of the operating system, Macintosh computers are susceptible to undesirable changes by the user. This presentation discusses the advantages and disadvantages of software packages that offer protection for the Macintosh system. The two basic forms of software security packages include a…
Janoudi, Abdul; Poff, Kenneth L.
1990-01-01
The relationship between the amount of light and the amount of response for any photobiological process can be based on the number of incident quanta per unit time (fluence rate-response) or on the number of incident quanta during a given period of irradiation (fluence-response). Fluence-response and fluence rate-response relationships have been measured for second positive phototropism by seedlings of Arabidopsis thaliana. The fluence-response relationships exhibit a single limiting threshold at about 0.01 micromole per square meter when measured at fluence rates from 2.4 × 10⁻⁵ to 6.5 × 10⁻³ micromoles per square meter per second. The threshold values in the fluence rate-response curves decrease with increasing time of irradiation, but show a common fluence threshold at about 0.01 micromole per square meter. These thresholds are the same as the threshold of about 0.01 micromole per square meter measured for first positive phototropism. Based on these data, it is suggested that second positive curvature has a threshold in time of about 10 minutes. Moreover, if the times of irradiation exceed the time threshold, there is a single limiting fluence threshold at about 0.01 micromole per square meter. Thus, the limiting fluence threshold for second positive phototropism is the same as the fluence threshold for first positive phototropism. Based on these data, we suggest that this common fluence threshold for first positive and second positive phototropism is set by a single photoreceptor pigment system. PMID:11537470
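The reciprocity bookkeeping underlying these threshold measurements — fluence equals fluence rate times irradiation time — can be sketched trivially, using the threshold and the two extreme fluence rates quoted in the abstract:

```python
# Fluence is the product of fluence rate and irradiation time, so widely
# different fluence rates reach the same ~0.01 umol m^-2 threshold at
# different exposure times (values taken from the abstract above).
FLUENCE_THRESHOLD = 0.01  # micromoles per square meter

def fluence(fluence_rate, exposure_time):
    """Fluence (umol m^-2) = fluence rate (umol m^-2 s^-1) x time (s)."""
    return fluence_rate * exposure_time

def above_threshold(fluence_rate, exposure_time):
    return fluence(fluence_rate, exposure_time) >= FLUENCE_THRESHOLD

# Time needed to reach the common threshold at the lowest and highest rates:
for rate in (2.4e-5, 6.5e-3):
    t = FLUENCE_THRESHOLD / rate
    print(f"rate {rate:g} umol m^-2 s^-1: threshold reached after {t:.1f} s")
```

The same fluence threshold is reached after roughly 1.5 s at the highest rate but only after several minutes at the lowest, consistent with the time threshold of about 10 minutes discussed for second positive curvature.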
Lourenço, Ana; Thomas, Russell; Bouchard, Hugo; Kacperek, Andrzej; Vondracek, Vladimir; Royle, Gary; Palmans, Hugo
2016-07-01
The aim of this study was to determine fluence corrections necessary to convert absorbed dose to graphite, measured by graphite calorimetry, to absorbed dose to water. Fluence corrections were obtained from experiments and Monte Carlo simulations in low- and high-energy proton beams. Fluence corrections were calculated to account for the difference in fluence between water and graphite at equivalent depths. Measurements were performed with narrow proton beams. Plane-parallel-plate ionization chambers with a large collecting area compared to the beam diameter were used to intercept the whole beam. High- and low-energy proton beams were provided by a scanning and double scattering delivery system, respectively. A mathematical formalism was established to relate fluence corrections derived from Monte Carlo simulations, using the fluka code [A. Ferrari et al., "fluka: A multi-particle transport code," in CERN 2005-10, INFN/TC 05/11, SLAC-R-773 (2005) and T. T. Böhlen et al., "The fluka Code: Developments and challenges for high energy and medical applications," Nucl. Data Sheets 120, 211-214 (2014)], to partial fluence corrections measured experimentally. A good agreement was found between the partial fluence corrections derived by Monte Carlo simulations and those determined experimentally. For a high-energy beam of 180 MeV, the fluence corrections from Monte Carlo simulations were found to increase from 0.99 to 1.04 with depth. In the case of a low-energy beam of 60 MeV, the magnitude of fluence corrections was approximately 0.99 at all depths when calculated in the sensitive area of the chamber used in the experiments. Fluence correction calculations were also performed for a larger area and found to increase from 0.99 at the surface to 1.01 at greater depths. Fluence corrections obtained experimentally are partial fluence corrections because they account for differences in the primary and part of the secondary particle fluence. 
A correction factor, F(d), has been established to relate fluence corrections defined theoretically to partial fluence corrections derived experimentally. The findings presented here are also relevant to water and tissue-equivalent-plastic materials given their carbon content.
SPARSKIT: A basic tool kit for sparse matrix computations
NASA Technical Reports Server (NTRS)
Saad, Youcef
1990-01-01
Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.
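A Python analogue of the kinds of operations described — data-structure conversion, simple matrix statistics, and basic sparse linear algebra — can be sketched with SciPy (SPARSKIT itself is a FORTRAN library working with Harwell/Boeing files; the toy matrix below is purely illustrative):

```python
import numpy as np
from scipy.sparse import csr_matrix

# A small sparse matrix in dense form (illustrative only)
dense = np.array([[4.0, 0.0, 0.0, 1.0],
                  [0.0, 3.0, 0.0, 0.0],
                  [0.0, 0.0, 5.0, 2.0]])

A = csr_matrix(dense)   # data-structure conversion: dense -> compressed sparse row
coo = A.tocoo()         # CSR -> coordinate (triplet) format

# Simple statistics of the kind a sparse tool kit reports
nnz = A.nnz
density = nnz / (A.shape[0] * A.shape[1])
print(f"{A.shape[0]}x{A.shape[1]} matrix, {nnz} nonzeros, density {density:.2f}")

# Basic linear algebra: sparse matrix-vector product
x = np.ones(A.shape[1])
print(A @ x)  # → [5. 3. 7.]
```

Each of these steps (format conversion, matrix statistics, matrix-vector products) corresponds to a family of routines the SPARSKIT package provides in FORTRAN.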
Sato, Tatsuhiko; Endo, Akira; Zankl, Maria; Petoussi-Henss, Nina; Niita, Koji
2009-04-07
The fluence to organ-dose and effective-dose conversion coefficients for neutrons and protons with energies up to 100 GeV were calculated using the PHITS code coupled to male and female adult reference computational phantoms, which are to be released as a common ICRP/ICRU publication. For the calculation, the radiation and tissue weighting factors, w(R) and w(T), respectively, as revised in ICRP Publication 103 were employed. The conversion coefficients for effective dose equivalents derived using the radiation quality factors of both the Q(L) and Q(y) relationships were also estimated, utilizing the functions for calculating the probability densities of the absorbed dose in terms of LET (L) and lineal energy (y), respectively, implemented in PHITS. By comparing these data with the corresponding data for the effective dose, we found that the revised w(R) is numerically compatible with both the Q(L) and Q(y) relationships. The calculated dose conversion coefficients are indispensable for constructing radiation protection systems based on the new recommendations given in ICRP Publication 103 for aircrews and astronauts, as well as for workers at accelerators and nuclear facilities.
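The weighted-sum bookkeeping behind such coefficients follows the ICRP definition of effective dose, E = Σ_T w_T Σ_R w_R D_(T,R). A minimal sketch (tissue weights and absorbed doses below are illustrative placeholders, not values from the ICRP 103 tables; the proton w_R = 2 is from ICRP 103):

```python
# E = sum over tissues T of w_T * H_T, with H_T = sum over radiations R of
# w_R * D_(T,R): the ICRP effective-dose definition.

def equivalent_dose(absorbed_doses, radiation_weights):
    """H_T = sum over radiation types R of w_R * D_(T,R)."""
    return sum(w_R * D for w_R, D in zip(radiation_weights, absorbed_doses))

def effective_dose(tissue_weights, equivalent_doses):
    """E = sum over tissues T of w_T * H_T; the w_T must sum to 1."""
    assert abs(sum(tissue_weights) - 1.0) < 1e-9
    return sum(w_T * H for w_T, H in zip(tissue_weights, equivalent_doses))

w_R = [1.0, 2.0]  # photons (w_R = 1) and protons (w_R = 2 in ICRP 103)
doses = {"lung": [1.0, 0.5], "skin": [0.2, 0.1]}  # D_(T,R) in Gy, illustrative
H = {t: equivalent_dose(d, w_R) for t, d in doses.items()}
E = effective_dose([0.7, 0.3], [H["lung"], H["skin"]])  # illustrative w_T
print(H, E)
```

Dividing such a dose by the incident particle fluence yields the fluence-to-dose conversion coefficient the abstract tabulates.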
Fission Product Release and Survivability of UN-Kernel LWR TRISO Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Besmann, Theodore M; Ferber, Mattison K; Lin, Hua-Tay
2014-01-01
A thermomechanical assessment of the LWR application of TRISO fuel with UN kernels was performed. Fission product release under operational and transient temperature conditions was determined by extrapolation from range calculations and limited data from irradiated UN pellets. Both fission recoil and diffusive release were considered, and internal particle pressures were computed for both 650 and 800 μm diameter kernels as a function of buffer layer thickness. These pressures were used in conjunction with a finite element program to compute the radial and tangential stresses generated within a TRISO particle as a function of fluence. Creep and swelling of the inner and outer pyrolytic carbon layers were included in the analyses. A measure of reliability of the TRISO particle was obtained by computing the probability of survival of the SiC barrier layer and the maximum tensile stress generated in the pyrolytic carbon layers as a function of fluence. These reliability estimates were obtained as functions of the kernel diameter, buffer layer thickness, and pyrolytic carbon layer thickness. The probability of survival at the end of irradiation was inversely proportional to the maximum pressure.
Novel Ruggedized Packaging Technology for VCSELs
2017-03-01
Charlie Kuznia, ckuznia@ultracomm-inc.com, Ultra Communications, Inc., Vista, CA, USA, 92081... achieve low-power, EMI-immune links within high-performance military computing and sensor systems. Figure 1. Chip-scale-packaging of
Introduction to Software Packages. [Final Report.
ERIC Educational Resources Information Center
Frankel, Sheila, Ed.; And Others
This document provides an introduction to applications computer software packages that support functional managers in government and encourages the use of such packages as an alternative to in-house development. A review of current application areas includes budget/project management, financial management/accounting, payroll, personnel,…
NASA Astrophysics Data System (ADS)
Liu, Fengshan; Rogak, Steven; Snelling, David R.; Saffaripour, Meghdad; Thomson, Kevin A.; Smallwood, Gregory J.
2016-11-01
Multimode pulsed Nd:YAG lasers are commonly used in auto-compensating laser-induced incandescence (AC-LII) measurements of soot in flames and engine exhaust as well as black carbon in the atmosphere. Such lasers possess a certain degree of fluence non-uniformity across the laser beam even with the use of beam shaping optics. Recent research showed that the measured volume fraction of ambient-temperature soot using AC-LII increases significantly, by about a factor of 5-8, as the laser fluence is increased within the low-fluence regime from a very low fluence to a relatively high fluence near sublimation. The causes of this so-called soot volume fraction anomaly are currently not understood. The effects of laser fluence non-uniformity on the measured soot volume fraction using AC-LII were investigated. Three sets of LII experiments were conducted in the exhaust of a MiniCAST soot generator under conditions of high elemental carbon using Nd:YAG lasers operated at 1064 nm. The laser beams were shaped and relay imaged to achieve a relatively uniform fluence distribution in the measurement volume. To further homogenize the laser fluence, one set of LII experiments was conducted using a diffractive optical element. The measured soot volume fractions in all three sets of LII experiments increase strongly with laser fluence before a peak value is reached, and then start to decrease at higher fluences. Numerical calculations were conducted using the experimental laser fluence histograms. Laser fluence non-uniformity is found to be partially responsible for the soot volume fraction anomaly, but is insufficient to explain the degree of soot volume fraction anomaly observed experimentally. Representing the laser fluence variations by a histogram derived from high-resolution images of the laser beam energy profile gives a more accurate definition of inhomogeneity than a simple averaged linear profile across the laser beam.
Optimization of light source parameters in the photodynamic therapy of heterogeneous prostate
NASA Astrophysics Data System (ADS)
Li, Jun; Altschuler, Martin D.; Hahn, Stephen M.; Zhu, Timothy C.
2008-08-01
The three-dimensional (3D) heterogeneous distributions of optical properties in a patient prostate can now be measured in vivo. Such data can be used to obtain a more accurate light-fluence kernel. (For specified sources and points, the kernel gives the fluence delivered to a point by a source of unit strength.) In turn, the kernel can be used to solve the inverse problem that determines the source strengths needed to deliver a prescribed photodynamic therapy (PDT) dose (or light-fluence) distribution within the prostate (assuming uniform drug concentration). We have developed and tested computational procedures to use the new heterogeneous data to optimize delivered light-fluence. New problems arise, however, in quickly obtaining an accurate kernel following the insertion of interstitial light sources and data acquisition. (1) The light-fluence kernel must be calculated in 3D and separately for each light source, which increases kernel size. (2) An accurate kernel for light scattering in a heterogeneous medium requires ray tracing and volume partitioning, thus significant calculation time. To address these problems, two different kernels were examined and compared for speed of creation and accuracy of dose. Kernels derived more quickly involve simpler algorithms. Our goal is to achieve optimal dose planning with patient-specific heterogeneous optical data applied through accurate kernels, all within clinical times. The optimization process is restricted to accepting the given (interstitially inserted) sources, and determining the best source strengths with which to obtain a prescribed dose. The Cimmino feasibility algorithm is used for this purpose. The dose distribution and source weights obtained for each kernel are analyzed. In clinical use, optimization will also be performed prior to source insertion to obtain initial source positions, source lengths and source weights, but with the assumption of homogeneous optical properties. 
For this reason, we compare the results from heterogeneous optical data with those obtained from average homogeneous optical properties. The optimized treatment plans are also compared with the reference clinical plan, defined as the plan with sources of equal strength, distributed regularly in space, which delivers a mean value of prescribed fluence at detector locations within the treatment region. The study suggests that comprehensive optimization of source parameters (i.e. strengths, lengths and locations) is feasible, thus allowing acceptable dose coverage in a heterogeneous prostate PDT within the time constraints of the PDT procedure.
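The Cimmino feasibility algorithm used above finds a point satisfying a set of linear constraints by averaging simultaneous projections onto the constraint hyperplanes. A minimal sketch for a small consistent linear system (a toy stand-in for the dose-prescription constraints, not the clinical formulation in the paper):

```python
import numpy as np

def cimmino(A, b, iters=500, lam=1.0):
    """Cimmino's simultaneous-projection method for the linear system A x = b.

    Each iteration averages the orthogonal projections of the current point
    onto every hyperplane a_i . x = b_i; for a consistent system the iterates
    converge to a solution (lam in (0, 2) is the relaxation parameter).
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    m = len(b)
    x = np.zeros(A.shape[1])
    row_norm2 = np.sum(A * A, axis=1)  # squared norms of the constraint rows
    for _ in range(iters):
        residual = (b - A @ x) / row_norm2   # per-hyperplane projection offsets
        x = x + lam * (A.T @ residual) / m   # average the projections
    return x

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # toy constraints, consistent with x = (1, 2)
b = [1.0, 2.0, 3.0]
print(np.round(cimmino(A, b), 3))  # ≈ [1. 2.]
```

Because each step uses all constraints simultaneously, the method parallelizes naturally, one reason it suits inverse treatment-planning problems with many dose constraints.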
Effects of Computer Animation Instructional Package on Students' Achievement in Practical Biology
ERIC Educational Resources Information Center
Hamzat, Abdulrasaq; Bello, Ganiyu; Abimbola, Isaac Olakanmi
2017-01-01
This study examined the effects of a computer animation instructional package on secondary school students' achievement in practical biology in Ilorin, Nigeria. The study adopted a pre-test, post-test, control-group, non-randomised, non-equivalent quasi-experimental design with a 2×2×3 factorial arrangement. Two intact classes from two secondary…
Sigma 2 Graphic Display Software Program Description
NASA Technical Reports Server (NTRS)
Johnson, B. T.
1973-01-01
A general-purpose, user-oriented graphic support package was implemented. A comprehensive description of the two software components comprising this package is given: Display Librarian and Display Controller. These programs have been implemented in FORTRAN on the XDS Sigma 2 Computer Facility. This facility consists of an XDS Sigma 2 general-purpose computer coupled to a Computek Display Terminal.
NASA Astrophysics Data System (ADS)
Williams, G. V. M.; Prakash, T.; Kennedy, J.
2017-10-01
Superparamagnetic Ni1-yFey nanoparticles were made in a SiO2 film by 10 keV ion beam implantation of Ni followed by Fe, with a Ni fluence of 4 × 10¹⁶ at. cm⁻² and a Fe fluence fraction of 0.47. Nearly all of the moments magnetically ordered, which was not reported for an implanted film made with a Fe fluence fraction of 0.56 and half the Ni fluence. The temperature dependence of the saturation moment is remarkably similar for low and high Ni fluences, where there is also the presence of very thin spin-disordered shells. The higher Ni fluence leads to a significant enhancement of the susceptibility, by a factor of 9, when compared with the lower fluence sample. This enhancement is likely to be due to a larger magnetically ordered volume fraction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malin, Martha J.; Bartol, Laura J.; DeWerd, Larry A., E-mail: mmalin@wisc.edu, E-mail: ladewerd@wisc.edu
2015-05-15
Purpose: To investigate why dose-rate constants for ¹²⁵I and ¹⁰³Pd seeds computed using the spectroscopic technique, Λ_spec, differ from those computed with standard Monte Carlo (MC) techniques. A potential cause of these discrepancies is the spectroscopic technique's use of approximations of the true fluence distribution leaving the source, φ_full. In particular, the fluence distribution used in the spectroscopic technique, φ_spec, approximates the spatial, angular, and energy distributions of φ_full. This work quantified the extent to which each of these approximations affects the accuracy of Λ_spec. Additionally, this study investigated how the simplified water-only model used in the spectroscopic technique impacts the accuracy of Λ_spec. Methods: Dose-rate constants as described in the AAPM TG-43U1 report, Λ_full, were computed with MC simulations using the full source geometry for each of 14 different ¹²⁵I and 6 different ¹⁰³Pd source models. In addition, the spectrum emitted along the perpendicular bisector of each source was simulated in vacuum using the full source model and used to compute Λ_spec. Λ_spec was compared to Λ_full to verify the discrepancy reported by Rodriguez and Rogers. Using MC simulations, a phase space of the fluence leaving the encapsulation of each full source model was created. The spatial and angular distributions of φ_full were extracted from the phase spaces and were qualitatively compared to those used by φ_spec. Additionally, each phase space was modified to reflect one of the approximated distributions (spatial, angular, or energy) used by φ_spec. The dose-rate constant resulting from using approximated distribution i, Λ_approx,i, was computed using the modified phase space and compared to Λ_full.
For each source, this process was repeated for each approximation in order to determine which approximations used in the spectroscopic technique affect the accuracy of Λ_spec. Results: For all sources studied, the angular and spatial distributions of φ_full were more complex than the distributions used in φ_spec. Differences between Λ_spec and Λ_full ranged from −0.6% to +6.4%, confirming the discrepancies found by Rodriguez and Rogers. The largest contribution to the discrepancy was the assumption of isotropic emission in φ_spec, which caused differences in Λ of up to +5.3% relative to Λ_full. Use of the approximated spatial and energy distributions caused smaller average discrepancies in Λ of −0.4% and +0.1%, respectively. The water-only model introduced an average discrepancy in Λ of −0.4%. Conclusions: The approximations used in φ_spec caused discrepancies between Λ_approx,i and Λ_full of up to 7.8%. With the exception of the energy distribution, the approximations used in φ_spec contributed to this discrepancy for all source models studied. To improve the accuracy of Λ_spec, the spatial and angular distributions of φ_full could be measured, with the measurements replacing the approximated distributions. The methodology used in this work could be used to determine the resolution that such measurements would require by computing the dose-rate constants from phase spaces modified to reflect φ_full binned at different spatial and angular resolutions.
Paperless Work Package Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kilgore, Jr., William R.; Morrell, Jr., Otto K.; Morrison, Dan
2014-07-31
The Paperless Work Package (PWP) System is a computer program that takes information from Asset Suite, provides a platform for other electronic inputs, processes the inputs into an electronic package that can be downloaded onto an electronic work tablet or laptop computer, provides a platform for electronic inputs on the work tablet, and then transposes those inputs back into Asset Suite and to permanent SRS records. The PWP System essentially eliminates paper requirements from the maintenance work control system. The program electronically relays the instructions given by the planner to work on a piece of equipment, which are currently relayed via a printed work package. The program does not control or approve what is done. The planner will continue to plan the work package, and the package will continue to be routed, approved, and scheduled. The supervisor reviews and approves the work to be performed and assigns work to individuals or to a work group. (The supervisor conducts pre-job briefings with the workers involved in the job.) The Operations Manager (Work Controlling Entity) approves the work package electronically for the work that will be done in his facility prior to work starting. The PWP System will provide the package in electronic form. All the reviews, approvals, and safety measures taken by people outside the electronic package do not change from the paper-driven work packages.
10 CFR 431.92 - Definitions concerning commercial air conditioners and heat pumps.
Code of Federal Regulations, 2014 CFR
2014-01-01
... measurement. Commercial package air-conditioning and heating equipment means air-cooled, water-cooled... Conditioner means a basic model of commercial package air-conditioning and heating equipment (packaged or split) that is: Used in computer rooms, data processing rooms, or other information technology cooling...
Packaging strategies for printed circuit board components. Volume I, materials & thermal stresses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neilsen, Michael K.; Austin, Kevin N.; Adolf, Douglas Brian
2011-09-01
Decisions on material selection for electronics packaging can be complicated by the need to balance the criteria of withstanding severe impacts while surviving deep thermal cycles intact. Material choices are often based on historical precedent, perhaps without knowing whether those initial choices were carefully investigated or whether the requirements on the new component match those of previous units. The goal of this program is to develop both increased intuition for generic packaging guidelines and computational methodologies for optimizing packaging in specific components. Initial efforts centered on characterization of classes of materials common to packaging strategies and on computational analyses of stresses generated during thermal cycling to identify strengths and weaknesses of various material choices. Future studies will analyze the same example problems, incorporating the effects of curing stresses as needed and analyzing dynamic loadings to compare trends with the quasi-static conclusions.
Besnier, Francois; Glover, Kevin A.
2013-01-01
This software package provides an R-based framework for making use of multi-core computers when running analyses in the population genetics program STRUCTURE. It is especially addressed to STRUCTURE users who run numerous and repeated data analyses and who could take advantage of an efficient script to automatically distribute STRUCTURE jobs among multiple processors. It also provides functions to divide analyses among combinations of populations within a single data set without the need to manually produce multiple projects, as is currently the case in STRUCTURE. The package consists of two main functions, MPI_structure() and parallel_structure(), as well as an example data file. We compared the performance in computing time for this example data set on two computer architectures and showed that the use of these functions can result in several-fold improvements in computation time. ParallelStructure is freely available at https://r-forge.r-project.org/projects/parallstructure/. PMID:23923012
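The job-distribution idea the package implements can be illustrated outside of R. The sketch below (Python; `run_structure_job` is a hypothetical placeholder standing in for an actual STRUCTURE invocation, and the K values and population subsets are invented for illustration) shows how repeated analyses can be farmed out to a pool of workers:

```python
from multiprocessing import Pool

def run_structure_job(job):
    # Placeholder for one STRUCTURE run; in ParallelStructure this would be
    # a call to the STRUCTURE program for a given K and population subset.
    k, subset = job
    return (k, subset, f"results for K={k}, subset={subset}")

# Hypothetical job list: replicate runs across K values and population subsets.
jobs = [(k, s) for k in range(1, 4) for s in ("pop1", "pop2", "pop1+pop2")]

if __name__ == "__main__":
    # Distribute the independent jobs among 4 worker processes.
    with Pool(processes=4) as pool:
        results = pool.map(run_structure_job, jobs)
    print(len(results))  # one result per job
```

Because the runs are independent, the speedup scales with the number of cores, which is the several-fold improvement the abstract reports.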
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Jin, Shuangshuang; Chen, Yousu
This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACK™ high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.
Software package for modeling spin–orbit motion in storage rings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zyuzin, D. V., E-mail: d.zyuzin@fz-juelich.de
2015-12-15
A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10{sup 6}–10{sup 9} particles in a beam during 10{sup 9} turns in an accelerator (about 10{sup 12}–10{sup 15} integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin–orbit dynamics.
Space-Shuttle Emulator Software
NASA Technical Reports Server (NTRS)
Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram;
2007-01-01
A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.
Bethe-Salpeter Eigenvalue Solver Package (BSEPACK) v0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shao, Meiyeu; Yang, Chao
2017-04-25
BSEPACK contains a set of subroutines for solving the Bethe-Salpeter Eigenvalue (BSE) problem. This type of problem arises in the study of optical excitation of nanoscale materials. The BSE problem is a structured non-Hermitian eigenvalue problem. The BSEPACK software can be used to compute all or a subset of the eigenpairs of a BSE Hamiltonian. It can also be used to compute the optical absorption spectrum without computing BSE eigenvalues and eigenvectors explicitly. The package makes use of ScaLAPACK, LAPACK, and BLAS.
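The "structured non-Hermitian" nature of the problem can be made concrete with a small sketch (Python/NumPy, illustrative only, not BSEPACK's API): a BSE-type Hamiltonian has the block form [[A, B], [-B*, -A*]] with A Hermitian and B complex symmetric, and that structure forces its spectrum to be closed under λ → -λ̄, a symmetry a general eigensolver would ignore but a structured solver can exploit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# A Hermitian, B complex symmetric: the blocks of a BSE-type Hamiltonian.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (A + A.conj().T) / 2
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = (B + B.T) / 2

# Structured non-Hermitian Hamiltonian H = [[A, B], [-conj(B), -conj(A)]].
H = np.block([[A, B], [-B.conj(), -A.conj()]])

evals = np.linalg.eigvals(H)
# The block structure makes the spectrum closed under lam -> -conj(lam),
# i.e. eigenvalues occur in paired form.
paired = all(np.min(np.abs(evals + lam.conj())) < 1e-8 for lam in evals)
print(paired)  # -> True
```

A structured solver such as BSEPACK can compute only one member of each pair (and preserve the pairing exactly), which is part of what distinguishes it from a generic non-Hermitian eigensolver.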
ERIC Educational Resources Information Center
Gratz, Zandra S.; And Others
A study was conducted at a large, state-supported college in the Northeast to establish a mechanism by which a popular software package, Statistical Package for the Social Sciences (SPSS), could be used in psychology program statistics courses in such a way that no prior computer expertise would be needed on the part of the faculty or the…
Physics of the Isotopic Dependence of Galactic Cosmic Ray Fluence Behind Shielding
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Saganti, Premkumar B.; Hu, Xiao-Dong; Kim, Myung-Hee Y.; Cleghorn, Timothy F.; Wilson, John W.; Tripathi, Ram K.; Zeitlin, Cary J.
2003-01-01
For over 25 years, NASA has supported the development of space radiation transport models for shielding applications. The NASA space radiation transport model now predicts dose and dose equivalent in Earth and Mars orbit to an accuracy of plus or minus 20%. However, because larger errors may occur in particle fluence predictions, there is interest in further assessments and improvements in NASA's space radiation transport model. In this paper, we consider the effects of the isotopic composition of the primary galactic cosmic rays (GCR) and the isotopic dependence of nuclear fragmentation cross-sections on the solution to transport models used for shielding studies. Satellite measurements are used to describe the isotopic composition of the GCR. Using NASA's quantum multiple-scattering theory of nuclear fragmentation (QMSFRG) and high-charge and energy (HZETRN) transport code, we study the effect of the isotopic dependence of the primary GCR composition and secondary nuclei on shielding calculations. The QMSFRG is shown to accurately describe the iso-spin dependence of nuclear fragmentation. The principal finding of this study is that large errors (plus or minus 100%) will occur in the mass-fluence spectra when comparing transport models that use a complete isotope grid (approximately 170 ions) to ones that use a reduced isotope grid, for example the 59-ion grid used in the HZETRN code in the past; however, less significant errors (less than 20%) occur in the elemental-fluence spectra. Because a complete isotope grid is readily handled on small computer workstations and is needed for several applications studying GCR propagation and scattering, it is recommended that the complete grid be used for future GCR studies.
Ultrafast dynamics of Al-doped zinc oxide under optical excitation (Presentation Recording)
NASA Astrophysics Data System (ADS)
Kinsey, Nathaniel; DeVault, Clayton T.; Kim, Jongbum; Ferrera, Marcello; Kildishev, Alexander V.; Shalaev, Vladimir M.; Boltasseva, Alexandra
2015-09-01
There is a continual need to explore new and promising dynamic materials to power next-generation switchable devices. In recent years, transparent conducting oxides have been shown to be vital materials for such systems, allowing for both optical and electrical tunability. Using a pump-probe technique, we investigate the optical tunability of CMOS-compatible, highly aluminum doped zinc oxide (AZO) thin films. The sample was pumped at 325 nm and probed with a weak beam at 1.3 μm to determine the timescale and magnitude of the changes by altering the temporal delay between the pulses with a delay line. For an incident fluence of 3.9 mJ/cm2, a change of 40% in reflection and 30% (max 6.3 dB/μm modulation depth) in transmission is observed, which is fully recovered within 1 ps. Using a computational model, the experimental results were fitted for the given fluence, allowing the recombination time and induced carrier density to be extracted. For a fluence of 3.9 mJ/cm2 the average excess carrier density within the material is 0.7×10^20 cm-3 and the recombination time is 88 fs. The ultrafast temporal response is the result of Auger recombination due to the extremely high carrier concentration present in our films, ~10^21 cm-3. By measuring and fitting the results at several incident fluence levels, the recombination time versus carrier density was determined and fitted with an Auger model, resulting in an Auger coefficient of C = 1.03×10^20 cm6/sec. Consequently, AZO is shown to be a unique, promising, and CMOS-compatible material for high performance dynamic devices in the near future.
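The Auger model used in the fit links lifetime to carrier density as τ = 1/(Cn²). A minimal sketch (Python, with illustrative round numbers rather than the paper's fitted values) shows the quadratic scaling that makes the response ultrafast at high carrier densities:

```python
# Auger-dominated recombination: rate ~ C * n^2, so the carrier lifetime
# shortens quadratically with the excess carrier density n.
def auger_lifetime(n_cm3: float, C_cm6_per_s: float) -> float:
    """Return the Auger lifetime tau = 1 / (C * n^2) in seconds."""
    return 1.0 / (C_cm6_per_s * n_cm3 ** 2)

# Illustrative (not fitted) values: a 10x increase in carrier density
# yields a 100x shorter lifetime -- the signature of the Auger model.
tau_low = auger_lifetime(1e20, 1e-31)
tau_high = auger_lifetime(1e21, 1e-31)
print(tau_low / tau_high)  # -> 100.0 (up to floating-point rounding)
```

Fitting measured lifetimes against densities at several fluences, as the abstract describes, is what pins down the coefficient C.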
An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1991-01-01
Developed are the three-dimensional quasi-analytical sensitivity analysis and the ancillary driver programs needed to carry out the studies and perform comparisons. The code is essentially contained in one unified package which includes the following: (1) a three-dimensional transonic wing analysis program (ZEBRA); (2) a quasi-analytical portion which determines the matrix elements in the quasi-analytical equations; (3) a method for computing the sensitivity coefficients from the resulting quasi-analytical equations; (4) a package to determine, for comparison purposes, sensitivity coefficients via the finite-difference approach; and (5) a graphics package.
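Item (4), the finite-difference approach used for comparison, amounts to perturbing one design variable and differencing the analysis output. A minimal sketch (Python; `lift_coeff` is a hypothetical smooth response standing in for the ZEBRA transonic analysis):

```python
# Finite-difference estimate of a sensitivity coefficient: perturb one
# design variable and difference the analysis outputs.
def lift_coeff(alpha_deg: float) -> float:
    # Hypothetical smooth response of lift coefficient to angle of attack.
    return 0.1 * alpha_deg + 0.002 * alpha_deg ** 2

def fd_sensitivity(f, x: float, h: float = 1e-5) -> float:
    """Central finite-difference approximation of df/dx."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Analytic derivative at alpha = 2 deg is 0.1 + 0.004 * 2 = 0.108.
print(round(fd_sensitivity(lift_coeff, 2.0), 6))  # -> 0.108
```

The quasi-analytical method instead differentiates the governing equations directly, avoiding the repeated (expensive) flow solutions the finite-difference comparison requires.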
High-Performance Computing for the Electromagnetic Modeling and Simulation of Interconnects
NASA Technical Reports Server (NTRS)
Schutt-Aine, Jose E.
1996-01-01
The electromagnetic modeling of packages and interconnects plays a very important role in the design of high-speed digital circuits, and is most efficiently performed by using computer-aided design algorithms. In recent years, packaging has become a critical area in the design of high-speed communication systems and fast computers, and the importance of the software support for their development has increased accordingly. Throughout this project, our efforts have focused on the development of modeling and simulation techniques and algorithms that permit the fast computation of the electrical parameters of interconnects and the efficient simulation of their electrical performance.
NASA Astrophysics Data System (ADS)
Prates, Renato A.; da Silva, Eriques G.; Yamada, Aécio M.; Suzuki, Luis C.; Paula, Claudete R.; Ribeiro, Martha S.
2009-05-01
The aim of this study was to investigate the influence of light parameters on yeast cells. It has been proposed for many years that photodynamic therapy (PDT) can inactivate microbial cells. A number of photosensitizers and light sources have been reported with different light parameters and over a range of dye concentrations. However, much more knowledge concerning the importance of fluence, fluence rate, and exposure time is required for a better understanding of photodynamic efficiency. Suspensions (10^6 CFU/mL) of Candida albicans, Candida krusei, and Cryptococcus neoformans var. grubii were used. Two fluence rates, 100 and 300 mW/cm2, were compared at 3, 6, and 9 min of irradiation, resulting in fluences from 18 to 162 J/cm2. The light source was a laser emitting at λ = 660 nm with output power adjusted to 30 and 90 mW. Methylene blue at 100 μM was used as the photosensitizer. Temperature was monitored to check for a possible heating effect, and reactive oxygen species (ROS) formation was evaluated. The same fluence delivered at different fluence rates showed dissimilar levels of inactivation of yeast cells as well as of ROS formation. In addition, increasing the fluence rate improved cell photoinactivation. PDT was efficient against yeast cells (6 log reduction), and no significant temperature increase was observed. Fluence per se should not be used as an isolated parameter to compare photoinactivation effects on yeast cells. The higher fluence rate was more effective than the lower one. Furthermore, an adequate duration of light exposure cannot be discarded.
Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun
2008-05-28
Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly-used statistic packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing and its packages to conduct non-parallel genetic statistical packages on a centralized HPC system or distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs performed about 14.4-15.9 times faster, while Unphased jobs performed 1.1-18.6 times faster compared to the accumulated computation duration. Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance.
Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun
2008-01-01
Background Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and subsequently high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly-used statistic packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing and its packages to conduct non-parallel genetic statistical packages on a centralized HPC system or distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Results Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs performed about 14.4–15.9 times faster, while Unphased jobs performed 1.1–18.6 times faster compared to the accumulated computation duration. Conclusion Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance. PMID:18541045
Diagnostic Testing Package DX v 2.0 Technical Specification. Methodology Project.
ERIC Educational Resources Information Center
McArthur, David
This paper contains the technical specifications, schematic diagrams, and program printout for a computer software package for the development and administration of diagnostic tests. The second version of the Diagnostic Testing Package DX consists of a PASCAL-based set of modules located in two main programs: (1) EDITTEST creates, modifies, and…
Scoria: a Python module for manipulating 3D molecular data.
Ropp, Patrick; Friedman, Aaron; Durrant, Jacob D
2017-09-18
Third-party packages have transformed the Python programming language into a powerful computational-biology tool. Package installation is easy for experienced users, but novices sometimes struggle with dependencies and compilers. This presents a barrier that can hinder the otherwise broad adoption of new tools. We present Scoria, a Python package for manipulating three-dimensional molecular data. Unlike similar packages, Scoria requires no dependencies, compilation, or system-wide installation. Users can incorporate the Scoria source code directly into their own programs. But Scoria is not designed to compete with other similar packages. Rather, it complements them. Our package leverages others (e.g. NumPy, SciPy), if present, to speed and extend its own functionality. To show its utility, we use Scoria to analyze a molecular dynamics trajectory. Our FootPrint script colors the atoms of one chain by the frequency of their contacts with a second chain. We are hopeful that Scoria will be a useful tool for the computational-biology community. A copy is available for download free of charge (Apache License 2.0) at http://durrantlab.com/scoria/ .
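The phrase "leverages others (e.g. NumPy, SciPy), if present" describes an optional-dependency pattern, which can be sketched as follows (Python; the `centroid` helper is a hypothetical example, not Scoria's actual API):

```python
# Optional-dependency pattern: use NumPy to accelerate a computation when
# it is installed, otherwise fall back to pure Python.
try:
    import numpy as _np
except ImportError:
    _np = None

def centroid(coords):
    """Mean of a list of (x, y, z) points, with or without NumPy."""
    if _np is not None:
        return tuple(_np.asarray(coords, dtype=float).mean(axis=0))
    n = len(coords)
    return tuple(sum(c[i] for c in coords) / n for i in range(3))

print(centroid([(0.0, 0.0, 0.0), (2.0, 4.0, 6.0)]))  # -> (1.0, 2.0, 3.0)
```

Both branches give the same answer; the package simply runs faster when the optional dependency is available, without ever requiring it at install time.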
Wang, Likun; Yang, Luhe; Peng, Zuohan; Lu, Dan; Jin, Yan; McNutt, Michael; Yin, Yuxin
2015-01-01
With the burgeoning development of cloud technology and services, there are an increasing number of users who prefer cloud to run their applications. All software and associated data are hosted on the cloud, allowing users to access them via a web browser from any computer, anywhere. This paper presents cisPath, an R/Bioconductor package deployed on cloud servers for client users to visualize, manage, and share functional protein interaction networks. With this R package, users can easily integrate downloaded protein-protein interaction information from different online databases with private data to construct new and personalized interaction networks. Additional functions allow users to generate specific networks based on private databases. Since the results produced with the use of this package are in the form of web pages, cloud users can easily view and edit the network graphs via the browser, using a mouse or touch screen, without the need to download them to a local computer. This package can also be installed and run on a local desktop computer. Depending on user preference, results can be publicized or shared by uploading to a web server or cloud driver, allowing other users to directly access results via a web browser. This package can be installed and run on a variety of platforms. Since all network views are shown in web pages, such package is particularly useful for cloud users. The easy installation and operation is an attractive quality for R beginners and users with no previous experience with cloud services.
2015-01-01
Background With the burgeoning development of cloud technology and services, there are an increasing number of users who prefer cloud to run their applications. All software and associated data are hosted on the cloud, allowing users to access them via a web browser from any computer, anywhere. This paper presents cisPath, an R/Bioconductor package deployed on cloud servers for client users to visualize, manage, and share functional protein interaction networks. Results With this R package, users can easily integrate downloaded protein-protein interaction information from different online databases with private data to construct new and personalized interaction networks. Additional functions allow users to generate specific networks based on private databases. Since the results produced with the use of this package are in the form of web pages, cloud users can easily view and edit the network graphs via the browser, using a mouse or touch screen, without the need to download them to a local computer. This package can also be installed and run on a local desktop computer. Depending on user preference, results can be publicized or shared by uploading to a web server or cloud driver, allowing other users to directly access results via a web browser. Conclusions This package can be installed and run on a variety of platforms. Since all network views are shown in web pages, such package is particularly useful for cloud users. The easy installation and operation is an attractive quality for R beginners and users with no previous experience with cloud services. PMID:25708840
Efficient analysis of large-scale genome-wide data with two R packages: bigstatsr and bigsnpr.
Privé, Florian; Aschard, Hugues; Ziyatdinov, Andrey; Blum, Michael G B
2017-03-30
Genome-wide datasets produced for association studies have dramatically increased in size over the past few years, with modern datasets commonly including millions of variants measured in tens of thousands of individuals. This increase in data size is a major challenge severely slowing down genomic analyses, leading to some software becoming obsolete and researchers having limited access to diverse analysis tools. Here we present two R packages, bigstatsr and bigsnpr, allowing the analysis of large-scale genomic data to be performed within R. To address large data size, the packages use memory-mapping for accessing data matrices stored on disk instead of in RAM. To perform data pre-processing and data analysis, the packages integrate most of the tools that are commonly used, either through transparent system calls to existing software, or through updated or improved implementations of existing methods. In particular, the packages implement fast and accurate computations of principal component analysis and association studies, functions to remove SNPs in linkage disequilibrium, and algorithms to learn polygenic risk scores on millions of SNPs. We illustrate applications of the two R packages by analyzing a case-control genomic dataset for celiac disease, performing an association study and computing polygenic risk scores. Finally, we demonstrate the scalability of the R packages by analyzing a simulated genome-wide dataset including 500,000 individuals and 1 million markers on a single desktop computer. https://privefl.github.io/bigstatsr/ & https://privefl.github.io/bigsnpr/. florian.prive@univ-grenoble-alpes.fr & michael.blum@univ-grenoble-alpes.fr. Supplementary materials are available at Bioinformatics online.
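The memory-mapping strategy the packages rely on can be illustrated with NumPy's memmap (a sketch of the idea only, not the bigstatsr file format): the matrix lives on disk, and only the block a computation touches is paged into RAM.

```python
import os
import tempfile
import numpy as np

# Toy genotype-like matrix on disk (real datasets are far larger; a
# 500,000 x 1,000,000 int8 matrix would be ~500 GB -- far beyond RAM).
path = os.path.join(tempfile.mkdtemp(), "genotypes.dat")
n_ind, n_snp = 1000, 5000

m = np.memmap(path, dtype=np.int8, mode="w+", shape=(n_ind, n_snp))
m[:] = np.random.default_rng(0).integers(0, 3, size=(n_ind, n_snp))
m.flush()

# Re-open read-only and compute per-SNP allele frequencies block by block,
# so only one column block is resident in memory at a time.
m2 = np.memmap(path, dtype=np.int8, mode="r", shape=(n_ind, n_snp))
freq = np.concatenate([m2[:, j:j + 512].mean(axis=0) / 2
                       for j in range(0, n_snp, 512)])
print(freq.shape)  # -> (5000,)
```

The blockwise access pattern is what lets a single desktop machine process a dataset that could never fit in its RAM.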
ERIC Educational Resources Information Center
Wainwright, Camille L.
Four classes of high school chemistry students (N=108) were randomly assigned to experimental and control groups to investigate the effectiveness of a computer assisted instruction (CAI) package during a unit on writing/naming of chemical formulas and balancing equations. Students in the experimental group received drill, review, and reinforcement…
Computers in medical education 1: evaluation of a problem-orientated learning package.
Devitt, P; Palmer, E
1998-04-01
A computer-based learning package has been developed, aimed at expanding students' knowledge base, as well as improving data-handling abilities and clinical problem-solving skills. The program was evaluated by monitoring its use by students, canvassing users' opinions and measuring its effectiveness as a learning tool compared to tutorials on the same material. Evaluation was undertaken using three methods: initially, by a questionnaire on computers as a learning tool and the applicability of the content; second, through monitoring by the computer of student use, decisions and performance; finally, through pre- and post-test assessment of fifth-year students who either used a computer package or attended a tutorial on equivalent material. Most students provided positive comments on the learning material and expressed a willingness to see computer-aided learning (CAL) introduced into the curriculum. Over a 3-month period, 26 modules in the program were used on 1246 occasions. Objective measurement showed a significant gain in knowledge, data handling and problem-solving skills. Computer-aided learning is a valuable learning resource that deserves better attention in medical education. When used appropriately, the computer can be an effective learning resource, not only for the delivery of knowledge, but also to help students develop their problem-solving skills.
NASA Astrophysics Data System (ADS)
Tekin, Tolga; Töpper, Michael; Reichl, Herbert
2009-05-01
Technological frontiers between semiconductor technology, packaging, and system design are disappearing. Scaling down geometries [1] alone does not provide improved performance, lower power, smaller size, and lower cost. These will require "More than Moore" [2] through the tighter integration of system-level components at the package level. System-in-Package (SiP) will deliver the efficient use of three dimensions (3D) through innovation in packaging and interconnect technology. A key bottleneck to the implementation of high-performance microelectronic systems, including SiP, is the lack of low-latency, high-bandwidth, and high-density off-chip interconnects. Some of the challenges in achieving high-bandwidth chip-to-chip communication using electrical interconnects include the high losses in the substrate dielectric, reflections and impedance discontinuities, and susceptibility to crosstalk [3]. Obviously, the incentive to use photonics to overcome these challenges and leverage low-latency and high-bandwidth communication will enable the vision of optical computing within next-generation architectures. Supercomputers of today offer sustained performance of more than petaflops, which can be increased by utilizing optical interconnects. Next-generation computing architectures are needed with ultra-low power consumption and ultra-high performance, enabled by novel interconnection technologies. In this paper we discuss a CMOS-compatible underlying technology to enable next-generation optical computing architectures. By introducing a new optical layer within the 3D SiP, the development of converged microsystems and their deployment in next-generation optical computing architectures will be leveraged.
Computational modelling of a thermoforming process for thermoplastic starch
NASA Astrophysics Data System (ADS)
Szegda, D.; Song, J.; Warby, M. K.; Whiteman, J. R.
2007-05-01
Plastic packaging waste currently forms a significant part of municipal solid waste and as such is causing increasing environmental concerns. Such packaging is largely non-biodegradable and is particularly difficult to recycle or to reuse due to its complex composition. Apart from limited recycling of some easily identifiable packaging wastes, such as bottles, most packaging waste ends up in landfill sites. In recent years, in an attempt to address this problem in the case of plastic packaging, the development of packaging materials from renewable plant resources has received increasing attention and a wide range of bioplastic materials based on starch are now available. Environmentally these bioplastic materials also reduce reliance on oil resources and have the advantage that they are biodegradable and can be composted upon disposal to reduce the environmental impact. Many food packaging containers are produced by thermoforming processes in which thin sheets are inflated under pressure into moulds to produce the required thin wall structures. Hitherto these thin sheets have almost exclusively been made of oil-based polymers and it is for these that computational models of thermoforming processes have been developed. Recently, in the context of bioplastics, commercial thermoplastic starch sheet materials have been developed. The behaviour of such materials is influenced both by temperature and, because of the inherent hydrophilic characteristics of the materials, by moisture content. Both of these aspects affect the behaviour of bioplastic sheets during the thermoforming process. This paper describes experimental work and work on the computational modelling of thermoforming processes for thermoplastic starch sheets in an attempt to address the combined effects of temperature and moisture content. 
After a discussion of the background of packaging and biomaterials, a mathematical model for the deformation of a membrane into a mould is presented, together with its finite element discretisation. The model depends on material parameters of the thermoplastic; details of the tests undertaken to determine these parameters, together with the results produced, are given. Finally, the computational model is applied to a thin sheet of commercially available thermoplastic starch material that is thermoformed into a specific mould. Numerical results for the thickness and shape of the formed sheet are given.
NASA Technical Reports Server (NTRS)
Willis, Jerry W.
1993-01-01
For a number of years, the Software Technology Branch of the Information Systems Directorate has been involved in the application of cutting-edge hardware and software technologies to instructional tasks related to NASA projects. The branch has developed intelligent computer-aided training shells, instructional applications of virtual reality and multimedia, and computer-based instructional packages that use fuzzy logic for both instructional and diagnostic decision making. One outcome of the work on space-related technology-supported instruction has been the creation of a significant pool of human talent in the branch with current expertise on the cutting edges of instructional technologies. When the human talent is combined with advanced technologies for graphics, sound, video, CD-ROM, and high-speed computing, the result is a powerful research and development group that both contributes to the applied foundations of instructional technology and creates effective instructional packages that take advantage of a range of advanced technologies. Several branch projects are currently underway that apply NASA-developed expertise to significant instructional problems in public education. The branch, for example, has developed intelligent computer-aided software to help high school students learn physics, and staff are currently working on a project to produce educational software for young children with language deficits. This report deals with another project, the adult literacy tutor. Unfortunately, while there are a number of computer-based instructional packages available for adult literacy instruction, most of them are based on the same instructional models that failed these students when they were in school. 
The teacher-centered, discrete, skill-and-drill-oriented instructional strategies that form the foundation for most of the computer-based literacy packages currently on the market may not be the most effective or most desirable way to use computer technology in literacy programs, even when they are supported by color computer graphics and animation. This project is developing a series of instructional packages based on a different instructional model: authentic instruction. The instructional development model used to create these packages is also different. Instead of using the traditional five-stage linear, sequential model based on behavioral learning theory, the project uses the recursive, reflective design and development model (R2D2), which is based on cognitive learning theory, particularly the social constructivism of Vygotsky, and an epistemology based on critical theory. Using these alternative instructional and instructional development theories, the result of the summer faculty fellowship is LiteraCity, a multimedia adult literacy instructional package that is a simulation of finding and applying for a job. The program, which is about 120 megabytes, is distributed on CD-ROM.
ERIC Educational Resources Information Center
Preising, Paul P.; Frost, Robert
The first of two studies reported was conducted to determine whether unemployed aerospace engineers who received computer science training as well as the Nightingale-Conant attitude change packages would have a significantly higher course completion rate than control classes who were given the same training without the attitude change packages.…
Dependence of the phototropic response of Arabidopsis thaliana on fluence rate and wavelength
DOE Office of Scientific and Technical Information (OSTI.GOV)
Konjevic, R.; Steinitz, B.; Poff, K.L.
1989-12-01
In the phototropic response of Arabidopsis thaliana seedlings, the shape of the fluence-response relation depends on fluence rate and wavelength. At low fluence rates, the response to 450-nm light is characterized by a single maximum at about 0.3 µmol·m⁻². At higher fluence rates, the response shows two distinct maxima, I and II, at 0.3 and 3.5 µmol·m⁻², respectively. The response to 500-nm light shows a single maximum at 2 µmol·m⁻², and the response to 510-nm light shows a single maximum at 4.5 µmol·m⁻², independent of fluence rate. The response to 490-nm light shows a maximum at 4.5 µmol·m⁻² and a shoulder at about 0.6 µmol·m⁻². Preirradiation with high-fluence 510-nm light from above, immediately followed by unilateral 450-nm light, eliminates maximum II but not maximum I. Preirradiation with high-fluence 450-nm light from above eliminates the response to subsequent unilateral irradiation with either 450-nm or 510-nm light. The recovery of the response following high-fluence 450-nm light is considerably slower than the recovery following high-fluence 510-nm light. Unilateral irradiation with low-fluence 510-nm light followed by 450-nm light results in curvature that is approximately the sum of those produced by either irradiation alone. Based on these results, it is proposed that phototropism in A. thaliana seedlings is mediated by at least two blue-light photoreceptor pigments.
Comprehensive studies of ultrashort laser pulse ablation of tin target at terawatt power
NASA Astrophysics Data System (ADS)
Elsied, Ahmed M.; Diwakar, Prasoon K.; Hassanein, Ahmed
2018-01-01
The fundamental properties of ultrashort laser interactions with metals at up to terawatt power were comprehensively studied, specifically mass ablation, nanoparticle formation, and ion dynamics, using a multitude of diagnostic techniques. Results of this study can be useful in many fields of research, including spectroscopy, micromachining, thin film fabrication, particle acceleration, physics of warm dense matter, and equation-of-state determination. A Ti:Sapphire femtosecond laser system (110 mJ maximum energy, 40 fs, 800 nm, P-polarized, single pulse mode) was used, which delivered up to 3 terawatt laser power to ablate a 1 mm tin film in vacuum. The experimental analysis includes the effect of the incident laser fluence on the ablated mass, the size of the ablated area, and the depth of ablation, measured using a white-light profilometer. An atomic force microscope (AFM) was used to measure the size distribution of the emitted particles at different laser fluences. A Faraday cup (FC) detector was used to analyze the emitted ion flux by measuring the velocity and total charge of the emitted ions. The study shows that the size of the emitted particles follows a log-normal distribution whose peak shifts with incident laser fluence. The size of the ablated particles ranges from 20 to 80 nm. AFM images show that the nanoparticles deposited on the wafer tend to aggregate and become denser as the incident laser fluence increases. Laser ablation depth was found to increase logarithmically with laser fluence before leveling off at fluences > 400 J/cm2. The total ablated mass increases logarithmically with laser fluence up to 60 J/cm2, then increases gradually at higher fluences due to the increase in the ablated area. The measured emitted ion flux shows a linear dependence on laser fluence with two distinct regimes; a strong dependence on laser fluence was observed at fluences < 350 J/cm2. 
Also, a slight enhancement in ion velocity was observed with increasing laser fluence up to 350 J/cm2.
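The log-normal size distribution reported above has a well-defined peak (mode) that can be estimated directly from measured diameters. A minimal Python sketch, not part of the study's analysis pipeline; the sample diameters below are hypothetical values chosen to fall in the reported 20-80 nm range:

```python
import numpy as np

def lognormal_mode(diameters):
    """Estimate the mode (peak) of a log-normal size distribution.

    For a log-normal with parameters mu, sigma of the log-diameter,
    the mode is exp(mu - sigma**2).
    """
    logs = np.log(diameters)
    mu, sigma = logs.mean(), logs.std()
    return np.exp(mu - sigma**2)

rng = np.random.default_rng(0)
# Hypothetical particle diameters (nm) drawn from a log-normal distribution
sample = rng.lognormal(mean=np.log(40.0), sigma=0.3, size=10_000)
print(f"estimated peak diameter: {lognormal_mode(sample):.1f} nm")
```

A fluence-dependent peak shift, as observed in the AFM data, would show up as a change in this estimated mode between fluence settings.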
FREQ: A computational package for multivariable system loop-shaping procedures
NASA Technical Reports Server (NTRS)
Giesy, Daniel P.; Armstrong, Ernest S.
1989-01-01
Many approaches in the field of linear, multivariable time-invariant systems analysis and controller synthesis employ loop-shaping procedures wherein design parameters are chosen to shape frequency-response singular value plots of selected transfer matrices. A software package, FREQ, is documented for computing within one unified framework many of the most used multivariable transfer matrices for both continuous and discrete systems. The matrices are evaluated at user-selected frequencies, and their singular values are plotted against frequency. Example computations are presented to demonstrate the use of the FREQ code.
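The core computation the abstract describes, evaluating a transfer matrix at selected frequencies and taking its singular values, can be sketched as follows. This is an illustrative Python reconstruction, not the FREQ FORTRAN code itself, and the two-state system is a made-up example:

```python
import numpy as np

def singular_values_vs_freq(A, B, C, D, freqs):
    """Singular values of G(jw) = C (jwI - A)^-1 B + D at each frequency w."""
    n = A.shape[0]
    out = []
    for w in freqs:
        G = C @ np.linalg.solve(1j * w * np.eye(n) - A, B) + D
        out.append(np.linalg.svd(G, compute_uv=False))
    return np.array(out)  # rows: frequencies; columns: sorted singular values

# Hypothetical two-state, two-input/two-output continuous-time system
A = np.array([[-1.0, 0.0], [0.0, -10.0]])
B = np.eye(2)
C = np.eye(2)
D = np.zeros((2, 2))
freqs = np.logspace(-2, 2, 5)
print(singular_values_vs_freq(A, B, C, D, freqs))
```

Plotting each column against frequency gives exactly the singular-value loop-shaping plot the package automates.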
The development of an engineering computer graphics laboratory
NASA Technical Reports Server (NTRS)
Anderson, D. C.; Garrett, R. E.
1975-01-01
Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN 4 subroutine package, in conjunction with a PDP 11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.
Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt
2015-01-01
One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This amount of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants for participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or influence only specific aspects of participation. PMID:26554005
X based interactive computer graphics applications for aerodynamic design and education
NASA Technical Reports Server (NTRS)
Benson, Thomas J.; Higgs, C. Fred, III
1995-01-01
Six computer applications packages have been developed to solve a variety of aerodynamic problems in an interactive environment on a single workstation. The packages perform classical one-dimensional analyses under the control of a graphical user interface and can be used for preliminary design or educational purposes. The programs were originally developed on a Silicon Graphics workstation and used the GL version of the FORMS library as the graphical user interface. These programs have recently been converted to the XFORMS library of X-based graphics widgets and have been tested on SGI, IBM, Sun, HP, and PC-Linux computers. The paper will show results from the new VU-DUCT program as a prime example. VU-DUCT has been developed as an educational package for the study of subsonic open- and closed-loop wind tunnels.
BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.
Huang, Hailiang; Tata, Sandeep; Prill, Robert J
2013-01-01
Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity, outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
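The permutation approach mentioned above is simple to state: shuffle the phenotype, recompute the test statistic, and count how often the shuffled statistic is at least as extreme as the observed one. A minimal single-machine Python sketch (BlueSNP itself distributes this over Hadoop, and the genotype/phenotype data here are synthetic):

```python
import numpy as np

def empirical_pvalue(stat_obs, genotype, phenotype, stat_fn, n_perm=1000, seed=0):
    """Empirical p-value: fraction of phenotype permutations whose test
    statistic is at least as extreme as the observed one."""
    rng = np.random.default_rng(seed)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(phenotype)
        if abs(stat_fn(genotype, perm)) >= abs(stat_obs):
            count += 1
    # add-one correction avoids reporting an impossible p = 0
    return (count + 1) / (n_perm + 1)

corr = lambda g, p: np.corrcoef(g, p)[0, 1]
rng = np.random.default_rng(1)
g = rng.integers(0, 3, size=200).astype(float)   # synthetic genotypes (0/1/2)
p = 0.5 * g + rng.normal(size=200)               # phenotype with a real association
print(empirical_pvalue(corr(g, p), g, p, corr))
```

The cost scales with the number of permutations times the number of SNPs, which is precisely why BlueSNP parallelizes it with MapReduce.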
Community-driven computational biology with Debian Linux.
Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles
2010-12-21
The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.
MGtoolkit: A python package for implementing metagraphs
NASA Astrophysics Data System (ADS)
Ranathunga, D.; Nguyen, H.; Roughan, M.
In this paper we present MGtoolkit: an open-source Python package for implementing metagraphs, the first of its kind. Metagraphs are commonly used to specify and analyse business and computer-network policies alike. MGtoolkit can help verify such policies and promotes learning and experimentation with metagraphs. The package currently provides purely textual output for visualising metagraphs and their analysis results.
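Unlike an ordinary graph edge, a metagraph edge connects a set of elements to a set of elements, which is what makes metagraphs useful for policy analysis. A minimal sketch of the idea in plain Python (this is not the MGtoolkit API; class and method names are invented for illustration):

```python
class Metagraph:
    """Toy metagraph: each edge maps an invertex (a set of elements)
    to an outvertex (another set of elements)."""

    def __init__(self, generating_set):
        self.generating_set = frozenset(generating_set)
        self.edges = []

    def add_edge(self, invertex, outvertex):
        inv, outv = frozenset(invertex), frozenset(outvertex)
        assert inv <= self.generating_set and outv <= self.generating_set
        self.edges.append((inv, outv))

    def reachable(self, source):
        """Elements derivable from `source` by repeatedly firing edges
        whose entire invertex is already known."""
        known = set(source)
        changed = True
        while changed:
            changed = False
            for inv, outv in self.edges:
                if inv <= known and not outv <= known:
                    known |= outv
                    changed = True
        return known

# Hypothetical access-policy example
mg = Metagraph({"alice", "bob", "read", "write"})
mg.add_edge({"alice"}, {"read"})
mg.add_edge({"alice", "read"}, {"write"})
print(sorted(mg.reachable({"alice"})))  # ['alice', 'read', 'write']
```

Policy verification then amounts to set-reachability questions of this kind, posed over the metagraph.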
General-Purpose Ada Software Packages
NASA Technical Reports Server (NTRS)
Klumpp, Allan R.
1991-01-01
Collection of subprograms brings to Ada many features from other programming languages. All generic packages designed to be easily instantiated for types declared in user's facility. Most packages have widespread applicability, although some oriented for avionics applications. All designed to facilitate writing new software in Ada. Written on IBM/AT personal computer running under PC DOS, v.3.1.
Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang
2016-12-23
A gene regulatory network (GRN) represents the interactions of genes inside a cell or tissue, in which vertexes and edges stand for genes and their regulatory interactions, respectively. Reconstruction of gene regulatory networks, in particular genome-scale networks, is essential for the comparative exploration of different species and the mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive: they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but have difficulty constructing GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interaction is measured by a conditional mutual information measurement using a parallel computing framework (hence the package is named CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application on a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/ .
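The conditional mutual information I(X;Y|Z) that CMIP uses to score gene pairs can be sketched under a Gaussian assumption, where it reduces to a function of the partial correlation of X and Y given Z. This is an illustrative Python reconstruction of the measure, not CMIP's actual implementation, with synthetic expression data:

```python
import numpy as np

def gaussian_cmi(x, y, z):
    """I(X;Y|Z) under a Gaussian assumption: -0.5 * ln(1 - r^2), where r is
    the partial correlation of x and y given z."""
    def resid(a, b):
        # residual of a after least-squares regression on b (with intercept)
        Zm = np.column_stack([b, np.ones_like(b)])
        coef, *_ = np.linalg.lstsq(Zm, a, rcond=None)
        return a - Zm @ coef
    rx, ry = resid(x, z), resid(y, z)
    r = np.corrcoef(rx, ry)[0, 1]
    return -0.5 * np.log(1.0 - r**2)

rng = np.random.default_rng(0)
z = rng.normal(size=5000)            # synthetic regulator expression
x = z + 0.1 * rng.normal(size=5000)  # two targets driven by z alone
y = z + 0.1 * rng.normal(size=5000)
print(gaussian_cmi(x, y, z))  # near 0: x and y are conditionally independent given z
```

A near-zero score lets the algorithm delete the spurious x-y edge that plain correlation would keep, which is the heart of the PCA-CMI pruning step.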
Proton Particle Test Fluence: What's the Right Number?
NASA Technical Reports Server (NTRS)
LaBel, Kenneth A.; Ladbury, Raymond
2015-01-01
While we have been utilizing standard fluence levels such as those listed in the JESD57 document, we have begun revisiting what an appropriate test fluence is when it comes to qualifying a device for single events. Instead of a fixed fluence level or until a specific number of events occurs, a different thought process is required.
Particle Test Fluence: What's the Right Number?
NASA Technical Reports Server (NTRS)
LaBel, Kenneth A.
2014-01-01
While we have been utilizing standard fluence levels such as those listed in the JESD57 document, we have begun revisiting what an appropriate test fluence is when it comes to qualifying a device for single events. Instead of a fixed fluence level or until a specific number of events occurs, a different thought process is required.
The Amber Biomolecular Simulation Programs
CASE, DAVID A.; CHEATHAM, THOMAS E.; DARDEN, TOM; GOHLKE, HOLGER; LUO, RAY; MERZ, KENNETH M.; ONUFRIEV, ALEXEY; SIMMERLING, CARLOS; WANG, BING; WOODS, ROBERT J.
2006-01-01
We describe the development, current features, and some directions for future development of the Amber package of computer programs. This package evolved from a program that was constructed in the late 1970s to do Assisted Model Building with Energy Refinement, and now contains a group of programs embodying a number of powerful tools of modern computational chemistry, focused on molecular dynamics and free energy calculations of proteins, nucleic acids, and carbohydrates. PMID:16200636
Link, J; Pachaly, J
1975-08-01
In a retrospective 18-month study, the infusion therapy applied in a large anesthesia institute is examined. The data on the course of anesthesia, routinely recorded on magnetic tape, were analysed for this purpose by a computer with the statistical program SPSS. It could be shown that the behaviour of the individual anesthetists differs considerably. Various correlations are discussed.
1986 Petroleum Software Directory. [800 mini, micro and mainframe computer software packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
Pennwell's 1986 Petroleum Software Directory is a complete listing of software created specifically for the petroleum industry. Details are provided on over 800 mini, micro and mainframe computer software packages from more than 250 different companies. An accountant can locate programs to automate bookkeeping functions in large oil and gas production firms. A pipeline engineer will find programs designed to calculate line flow and wellbore pressure drop.
Scilab software as an alternative low-cost computing in solving the linear equations problem
NASA Astrophysics Data System (ADS)
Agus, Fahrul; Haviluddin
2017-02-01
Numerical computation packages are widely used in both teaching and research. These packages include both licensed (proprietary) and open-source (non-proprietary) software. One reason to use such a package is the complexity of the mathematical functions involved (e.g., linear problems); moreover, the number of variables in linear and non-linear functions has increased. The aim of this paper was to reflect on key aspects related to method, didactics, and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing environment. In this paper, Scilab was applied to activities related to mathematical models. In the experiment, four numerical methods were implemented: Gaussian Elimination, Gauss-Jordan, Inverse Matrix, and Lower-Upper (LU) Decomposition. The results of this study showed that routines for these numerical methods could be created and explored using Scilab procedures, and that such routines can then serve as teaching material for a course.
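The first of the four methods, Gaussian elimination with partial pivoting, is short enough to sketch in full. The paper's routines are written in Scilab; the equivalent below is in Python for illustration, applied to a made-up 3x3 system:

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting,
    followed by back substitution."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))          # pivot row
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]  # swap rows k and p
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]                    # elimination multiplier
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):                   # back substitution
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])
print(gaussian_elimination(A, b))  # solution: x = [2, 3, -1]
```

Gauss-Jordan, the inverse-matrix method, and LU decomposition differ only in how this elimination work is organized, which makes the four routines a natural progression for a course.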
Ultrafast modulation of the plasma frequency of vertically aligned indium tin oxide rods.
Tice, Daniel B; Li, Shi-Qiang; Tagliazucchi, Mario; Buchholz, D Bruce; Weiss, Emily A; Chang, Robert P H
2014-03-12
Light-matter interaction at the nanoscale is of particular interest for future photonic integrated circuits and devices with applications ranging from communication to sensing and imaging. In this Letter a combination of transient absorption (TA) and the use of third harmonic generation as a probe (THG-probe) has been adopted to investigate the response of the localized surface plasmon resonances (LSPRs) of vertically aligned indium tin oxide rods (ITORs) upon ultraviolet light (UV) excitation. TA experiments, which are sensitive to the extinction of the LSPR, show a fluence-dependent increase in the frequency and intensity of the LSPR. The THG-probe experiments show a fluence-dependent decrease of the LSPR-enhanced local electric field intensity within the rod, consistent with a shift of the LSPR to higher frequency. The kinetics from both TA and THG-probe experiments are found to be independent of the fluence of the pump. These results indicate that UV excitation modulates the plasma frequency of ITO on the ultrafast time scale by the injection of electrons into, and their subsequent decay from, the conduction band of the rods. Increases to the electron concentration in the conduction band of ∼13% were achieved in these experiments. Computer simulation and modeling have been used throughout the investigation to guide the design of the experiments and to map the electric field distribution around the rods for interpreting far-field measurement results.
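In the Drude picture behind this result, the plasma frequency scales with the square root of the conduction-band electron density, so the reported ~13% increase in electron concentration implies a blue-shift of about 6%. A back-of-the-envelope Python sketch; the carrier density and the 0.4 m_e effective mass are illustrative assumptions (typical literature values for ITO), not numbers from the Letter:

```python
import math

E = 1.602176634e-19      # electron charge, C
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
M_E = 9.1093837015e-31   # electron rest mass, kg

def plasma_frequency(n_e, m_eff=0.4 * M_E):
    """Drude angular plasma frequency for carrier density n_e (m^-3).
    The effective mass default is an assumed value for ITO."""
    return math.sqrt(n_e * E**2 / (EPS0 * m_eff))

n0 = 1e27                          # assumed ITO carrier density, m^-3
w0 = plasma_frequency(n0)          # before UV excitation
w1 = plasma_frequency(1.13 * n0)   # after ~13% more conduction electrons
print(f"relative blue-shift of plasma frequency: {w1 / w0 - 1:.3f}")
```

Because the shift depends only on the density ratio, the assumed absolute density cancels out of the fractional result, consistent with the LSPR moving to higher frequency in both the TA and THG-probe experiments.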
Leake, S.A.; Prudic, David E.
1991-01-01
Removal of ground water by pumping from aquifers may result in compaction of compressible fine-grained beds that are within or adjacent to the aquifers. Compaction of the sediments and resulting land subsidence may be permanent if the head declines result in vertical stresses beyond the previous maximum stress. The process of permanent compaction is not routinely included in simulations of ground-water flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U.S. Geological Survey modular finite-difference ground- water flow model. The new program, the Interbed-Storage Package, is designed to be incorporated into this model. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of the skeletal component of elastic specific storage and the thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the ground-water flow model by adding an additional term to the right-hand side of the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum (preconsolidation) head. Two tests were performed to verify that the package works correctly. The first test compared model-calculated storage and compaction changes to hand-calculated values for a three-dimensional simulation. Model and hand-calculated values were essentially equal. The second test was performed to compare the results of the Interbed-Storage Package with results of the one-dimensional Helm compaction model. 
This test problem simulated compaction in doubly draining confining beds stressed by head changes in adjacent aquifers. The Interbed-Storage Package and the Helm model computed essentially equal values of compaction. Documentation of the Interbed-Storage Package includes data input instructions, flow charts, narratives, and listings for each of the five modules included in the package. The documentation also includes an appendix describing input instructions and a listing of a computer program for time-variant specified-head boundaries. That package was developed to reduce the amount of data input and output associated with one of the Interbed-Storage Package test problems.
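The elastic/inelastic apportionment logic described above can be sketched compactly: head changes above the preconsolidation head produce recoverable elastic storage change, while declines below it produce permanent inelastic compaction and reset the preconsolidation head. A simplified single-bed Python illustration, not the Interbed-Storage Package code itself; the function name, heads, and storage coefficients are hypothetical:

```python
def interbed_storage_change(h_old, h_new, h_precon, Ske, Skv):
    """Apportion an interbed storage change between elastic and inelastic
    components for one head change, following the preconsolidation-head rule.

    Ske, Skv: skeletal elastic and inelastic storage coefficients
    (specific storage times bed thickness). Returns (elastic change,
    inelastic change, updated preconsolidation head).
    """
    if h_new >= h_precon:
        # entire change stays in the elastic (recoverable) range
        return Ske * (h_old - h_new), 0.0, h_precon
    # head drops below the previous minimum: elastic down to h_precon,
    # inelastic (permanent compaction) below it; preconsolidation head resets
    elastic = Ske * (h_old - h_precon) if h_old > h_precon else 0.0
    inelastic = Skv * (min(h_old, h_precon) - h_new)
    return elastic, inelastic, h_new

# Head falls from 100 m to 90 m past a 95 m preconsolidation head
e, v, h_pc = interbed_storage_change(100.0, 90.0, 95.0, Ske=1e-4, Skv=1e-2)
print(e, v, h_pc)
```

Because Skv is typically orders of magnitude larger than Ske, most of the storage release (and hence subsidence) in this example comes from the 5 m of head decline below the preconsolidation head.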
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grant, Robert
Under this grant, three significant software packages were developed or improved, all with the goal of improving the ease-of-use of HPC libraries. The first component is a Python package, named DistArray (originally named Odin), that provides a high-level interface to distributed array computing. This interface is based on the popular and widely used NumPy package and is integrated with the IPython project for enhanced interactive parallel distributed computing. The second Python package is the Distributed Array Protocol (DAP) that enables separate distributed array libraries to share arrays efficiently without copying or sending messages. If a distributed array library supports the DAP, it is then automatically able to communicate with any other library that also supports the protocol. This protocol allows DistArray to communicate with the Trilinos library via PyTrilinos, which was also enhanced during this project. A third package, PyTrilinos, was extended to support distributed structured arrays (in addition to the unstructured arrays of its original design), allow more flexible distributed arrays (i.e., the restriction to double precision data was lifted), and implement the DAP. DAP support includes both exporting the protocol so that external packages can use distributed Trilinos data structures, and importing the protocol so that PyTrilinos can work with distributed data from external packages.
Desensitization and recovery of phototropic responsiveness in Arabidopsis thaliana
NASA Technical Reports Server (NTRS)
Janoudi, A. K.; Poff, K. L.
1993-01-01
Phototropism is induced by blue light, which also induces desensitization, a partial or total loss of phototropic responsiveness. The fluence and fluence-rate dependence of desensitization and recovery from desensitization have been measured for etiolated and red light (669-nm) preirradiated Arabidopsis thaliana seedlings. The extent of desensitization increased as the fluence of the desensitizing 450-nm light was increased from 0.3 to 60 micromoles m-2 s-1. At equal fluences, blue light caused more desensitization when given at a fluence rate of 1.0 micromole m-2 s-1 than at 0.3 micromole m-2 s-1. In addition, seedlings irradiated with blue light at the higher fluence rate required a longer recovery time than seedlings irradiated at the lower fluence rate. A red light preirradiation, probably mediated via phytochrome, decreased the time required for recovery from desensitization. The minimum time for detectable recovery was about 65 s, and the maximum time observed was about 10 min. It is proposed that the descending arm of the fluence-response relationship for first positive phototropism is a consequence of desensitization, and that the time threshold for second positive phototropism establishes a period during which recovery from desensitization occurs.
AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.
Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld
2016-08-01
There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge due to source code changes over time and dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory of existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org, and the source code is available under the GPL license at https://github.com/algorun. Contact: laubenbacher@uchc.edu. Supplementary data are available at Bioinformatics online.
System and Method for Providing a Climate Data Persistence Service
NASA Technical Reports Server (NTRS)
Schnase, John L. (Inventor); Ripley, III, William David (Inventor); Duffy, Daniel Q. (Inventor); Thompson, John H. (Inventor); Strong, Savannah L. (Inventor); McInerney, Mark (Inventor); Sinno, Scott (Inventor); Tamkin, Glenn S. (Inventor); Nadeau, Denis (Inventor)
2018-01-01
A system, method and computer-readable storage devices for providing a climate data persistence service. A system configured to provide the service can include a climate data server that performs data and metadata storage and management functions for climate data objects, a compute-storage platform that provides the resources needed to support a climate data server, provisioning software that allows climate data server instances to be deployed as virtual climate data servers in a cloud computing environment, and a service interface, wherein persistence service capabilities are invoked by software applications running on a client device. The climate data objects can be in various formats, such as International Organization for Standards (ISO) Open Archival Information System (OAIS) Reference Model Submission Information Packages, Archive Information Packages, and Dissemination Information Packages. The climate data server can enable scalable, federated storage, management, discovery, and access, and can be tailored for particular use cases.
diffuStats: an R package to compute diffusion-based scores on biological networks.
Picart-Armada, Sergio; Thompson, Wesley K; Buil, Alfonso; Perera-Lluna, Alexandre
2018-02-01
Label propagation and diffusion over biological networks are a common mathematical formalism in computational biology for giving context to molecular entities and for prioritizing novel candidates in the area of study. There are several choices in conceiving the diffusion process (the graph kernel, the score definitions, and the presence of a posterior statistical normalization), and these choices have an impact on the results. This manuscript describes diffuStats, an R package that provides a collection of graph kernels and diffusion scores, as well as a parallel permutation analysis for the normalized scores, easing both the computation of the scores and their benchmarking for an optimal choice. The R package diffuStats is publicly available in Bioconductor, https://bioconductor.org, under the GPL-3 license. Contact: sergi.picart@upc.edu. Supplementary data are available at Bioinformatics online.
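As an illustration of the general idea behind such diffusion scores, the following minimal Python sketch propagates a seed label over a small graph with a restart parameter. The function, the graph, and the parameter values are illustrative stand-ins, not diffuStats' actual kernels or R API.

```python
# Minimal sketch of diffusion-based node scoring on a small graph.
# This shows one common label-propagation variant (diffusion with restart);
# it is NOT diffuStats' actual kernel set or API; all names are illustrative.

def diffuse(adj, seeds, alpha=0.5, iters=100):
    """Iterate f <- (1 - alpha) * seeds + alpha * W f, where W is the
    column-normalized adjacency (random-walk) matrix."""
    n = len(adj)
    deg = [sum(row[j] for row in adj) for j in range(n)]  # column sums
    f = seeds[:]
    for _ in range(iters):
        f = [(1 - alpha) * seeds[i]
             + alpha * sum(adj[i][j] * f[j] / deg[j] for j in range(n))
             for i in range(n)]
    return f

# Path graph 0 - 1 - 2, with node 0 as the only labeled seed.
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
scores = diffuse(adj, [1.0, 0.0, 0.0])
# Scores decay with distance from the seed node.
```

The fixed point of this particular iteration can be solved by hand (scores 7/12, 1/3, 1/12), which makes the decay with graph distance explicit.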
NASA Technical Reports Server (NTRS)
Rathjen, K. A.; Burk, H. O.
1983-01-01
The computer code CAVE (Conduction Analysis via Eigenvalues) is a convenient and efficient code for predicting two-dimensional temperature histories within thermal protection systems for hypersonic vehicles. The capabilities of CAVE were enhanced by incorporating the following features into the code: real-gas effects in the aerodynamic heating predictions; a geometry and aerodynamic heating package for analyses of cone-shaped bodies; an input option to change from laminar to turbulent heating predictions on leading edges; a modification to account for the reduction in adiabatic wall temperature with increasing leading-edge sweep; a geometry package for two-dimensional scramjet engine sidewalls, with an option for heat transfer to external and internal surfaces; a printout modification to provide tables of selected temperatures for plotting and storage; and modifications to the radiation calculation procedure to eliminate temperature oscillations induced by high heating rates. These new features are described.
Schuettler, Martin; Kohler, Fabian; Ordonez, Juan S; Stieglitz, Thomas
2012-01-01
Future brain-computer interfaces (BCIs) for severely impaired patients will be implanted to make electrical contact with the brain tissue. Avoiding percutaneous cables requires that the amplifier and telemetry electronics be implanted as well. We developed a hermetic package that protects the electronic circuitry of a BCI from body moisture while permitting infrared communication through the package wall, which is made from alumina ceramic. The ceramic package is cast in medical-grade silicone adhesive, for which we identified MED2-4013 as a promising candidate.
Extension of the BGL Broad Group Cross Section Library
NASA Astrophysics Data System (ADS)
Kirilova, Desislava; Belousov, Sergey; Ilieva, Krassimira
2009-08-01
The broad-group cross-section libraries BUGLE and BGL are applied in reactor shielding calculations using the DOORS package, which is based on the discrete ordinates method and a multigroup approximation of the neutron cross sections. The BUGLE and BGL libraries are problem-oriented for PWR and VVER reactor types, respectively. They were generated by collapsing the problem-independent fine-group library VITAMIN-B6 with the SCALE software package, using one-dimensional radial models of the PWR and VVER reactor middle plane. The surveillance assemblies (SA) of the VVER-1000/320 are located on the baffle above the reactor core upper edge, in a region whose geometry and materials differ from those of the middle plane and where the neutron field gradient is very high, resulting in a different neutron spectrum. Applying the aforementioned libraries to neutron fluence calculations in the SA region could therefore introduce additional inaccuracy. This was the main reason to study whether the BGL library needs to be extended with cross sections appropriate for the SA region. A comparative analysis of the neutron spectra in the SA region, calculated with the VITAMIN-B6 and BGL libraries using the two-dimensional code DORT, was performed to evaluate the applicability of BGL to SA calculations.
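The flux-weighted collapsing operation described above can be sketched as follows; the cross-section and flux numbers are purely illustrative, not actual library data.

```python
# Sketch of flux-weighted group collapsing, the operation used to produce
# broad-group libraries such as BGL from a fine-group library such as
# VITAMIN-B6. All numbers are illustrative, not actual cross-section data.

def collapse(sigma_fine, flux_fine, broad_groups):
    """Collapse fine-group cross sections into broad groups:
    sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g), for g in group G."""
    out = []
    for group in broad_groups:
        num = sum(sigma_fine[g] * flux_fine[g] for g in group)
        den = sum(flux_fine[g] for g in group)
        out.append(num / den)
    return out

sigma = [10.0, 5.0, 2.0, 1.0]   # fine-group cross sections (barns)
phi   = [1.0, 2.0, 4.0, 8.0]    # weighting spectrum (fine-group fluxes)
broad = collapse(sigma, phi, [[0, 1], [2, 3]])
# A different weighting spectrum (e.g. the SA region's) gives different
# broad-group values -- the motivation for extending BGL.
```

Since the collapsed values depend on the weighting spectrum, a library collapsed with a middle-plane spectrum need not be adequate for the SA region, which is exactly the question the abstract investigates.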
A review of evaluative studies of computer-based learning in nursing education.
Lewis, M J; Davies, R; Jenkins, D; Tait, M I
2001-01-01
Although there have been numerous attempts to evaluate the learning benefits of computer-based learning (CBL) packages in nursing education, the results obtained have been equivocal. A literature search conducted for this review found 25 reports of the evaluation of nursing CBL packages since 1966. Detailed analysis of the evaluation methods used in these reports revealed that most had significant design flaws, including the use of too small a sample group, the lack of a control group, etc. Because of this, the conclusions reached were not always valid. More effort is required in the design of future evaluation studies of nursing CBL packages.
NASA Technical Reports Server (NTRS)
Banks, Bruce A.; Miller, Sharon K.; Waters, Deborah L.
2010-01-01
An atomic oxygen fluence monitor was flown as part of the Materials International Space Station Experiment-6 (MISSE-6). The monitor was designed to measure the accumulation of atomic oxygen fluence with time as it impinged upon the ram surface of the MISSE 6B Passive Experiment Container (PEC). This was an active experiment, for which data were to be stored on a battery-powered data logger for post-flight retrieval and analysis. The atomic oxygen fluence measurement was accomplished by allowing atomic oxygen to erode two opposing wedges of pyrolytic graphite that partially covered a photodiode. As the wedges of pyrolytic graphite erode, the area of the photodiode illuminated by the Sun increases. The short-circuit current, which is proportional to the illuminated area, was to be measured and recorded as a function of time. The short-circuit current from a different photodiode, which was oriented in the same direction and had an unobstructed view of the Sun, was also to be recorded as a reference current. The ratio of the two recorded currents should bear a linear relationship to the accumulated atomic oxygen fluence and be independent of the intensity of solar illumination. Ground hyperthermal atomic oxygen exposure facilities were used to evaluate the linearity of the ratio of short-circuit currents with atomic oxygen fluence. In flight, the current measurement circuitry failed to operate properly, so the overall mission atomic oxygen fluence could only be estimated from the physical erosion of the pyrolytic graphite wedges. The atomic oxygen fluence was calculated based on knowledge of the in-space atomic oxygen erosion yield of pyrolytic graphite measured from samples on MISSE 2. This paper presents the atomic oxygen fluence monitor, the expected results, and a comparison of the mission atomic oxygen fluence based on the erosion of the pyrolytic graphite with that from Kapton H atomic oxygen fluence witness samples.
Liu, Bin; Wu, Hao; Zhang, Deyuan; Wang, Xiaolong; Chou, Kuo-Chen
2017-02-21
To expedite genome/proteome analysis, we have developed a Python package called Pse-Analysis. The powerful package can automatically complete the following five procedures: (1) sample feature extraction, (2) optimal parameter selection, (3) model training, (4) cross-validation, and (5) evaluation of prediction quality. All a user needs to do is input a benchmark dataset along with the query biological sequences concerned. Based on the benchmark dataset, Pse-Analysis automatically constructs an ideal predictor and then yields the predicted results for the submitted query samples. All the aforementioned tedious jobs are done automatically by the computer. Moreover, multiprocessing was adopted to enhance computational speed by about six-fold. The Pse-Analysis Python package is freely accessible to the public at http://bioinformatics.hitsz.edu.cn/Pse-Analysis/, and can be run directly on Windows, Linux, and Unix.
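The five-step pipeline that Pse-Analysis automates can be sketched in miniature as follows; the 1-mer features, the nearest-centroid model, and the toy sequences are illustrative stand-ins for the package's actual feature modes, parameter search, and learners.

```python
# Sketch of the pipeline shape Pse-Analysis automates: feature extraction,
# model training, cross-validation, and evaluation. Naive nucleotide
# frequencies and a nearest-centroid classifier stand in for the package's
# actual pseudo-composition features and learners; all names are illustrative.

def features(seq):
    """Nucleotide-frequency feature vector (a stand-in for Pse features)."""
    return [seq.count(b) / len(seq) for b in "ACGT"]

def dist2(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def loo_accuracy(samples):
    """Leave-one-out cross-validation with a nearest-centroid classifier."""
    hits = 0
    for k, (seq, label) in enumerate(samples):
        rest = [s for i, s in enumerate(samples) if i != k]
        cents = {lab: centroid([features(s) for s, l in rest if l == lab])
                 for lab in {l for _, l in rest}}
        pred = min(cents, key=lambda lab: dist2(features(seq), cents[lab]))
        hits += (pred == label)
    return hits / len(samples)

# Toy benchmark dataset: AT-rich vs GC-rich sequences, trivially separable.
bench = [("AATTAATT", "AT"), ("ATATATTA", "AT"), ("TTAAATTA", "AT"),
         ("GGCCGGCC", "GC"), ("GCGCGCCG", "GC"), ("CCGGGCCG", "GC")]
acc = loo_accuracy(bench)
```

On this deliberately well-separated toy dataset the cross-validated accuracy is perfect; the point is the pipeline shape, not the classifier.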
NORTICA—a new code for cyclotron analysis
NASA Astrophysics Data System (ADS)
Gorelov, D.; Johnson, D.; Marti, F.
2001-12-01
The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis), a set of computer codes for beam dynamics simulations, is under development at the NSCL. The package was started as a replacement for the code MONSTER [1], developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of the package is to assist in setting up and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computing platform for the package is an Alpha Station with the UNIX operating system and an X-Windows graphical interface. A multiple-programming-language approach was used in order to combine the reliability of numerical algorithms developed in the laboratory over a long period of time with the friendliness of a modern-style user interface. This paper describes the capabilities and features of the codes in their present state.
SU-F-T-261: Reconstruction of Initial Photon Fluence Based On EPID Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seliger, T; Engenhart-Cabillic, R; Czarnecki, D
2016-06-15
Purpose: To verify an algorithm for reconstructing relative initial photon fluence for clinical use. Clinical EPID and CT images were acquired to reconstruct an external photon radiation treatment field. The reconstructed initial photon fluence could be used to verify the treatment or to calculate the dose applied to the patient. Methods: The acquired EPID images were corrected for scatter caused by the patient and the EPID with an iterative reconstruction algorithm. The transmitted photon fluence behind the patient was calculated subsequently. Based on the transmitted fluence, the initial photon fluence was calculated using a back-projection algorithm that takes the patient geometry and its energy-dependent linear attenuation into account. This attenuation was obtained from the acquired cone-beam CT or the planning CT by calculating a water-equivalent radiological thickness for each irradiation direction. To verify the algorithm, an inhomogeneous phantom containing three inhomogeneities was irradiated with a static 6 MV photon field and compared to a reference flood-field image. Results: The mean deviation between the reconstructed relative photon fluence for the inhomogeneous phantom and the flood-field EPID image was 3%, rising to 7% for off-axis fluence. This was probably caused by the clinical EPID calibration used, which flattens the inhomogeneous fluence profile of the beam. Conclusion: In this clinical experiment the algorithm achieved good results in the center of the field, while it showed large deviations in the lateral fluence. These could be reduced by optimizing the EPID calibration to account for the off-axis differential energy response. In further work, this and other aspects of the EPID, e.g. field-size dependency, CT and dose calibration, have to be studied to achieve a clinically acceptable accuracy of 2%.
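The back-projection step described in the Methods can be sketched with a simple Beer-Lambert model. The single effective attenuation coefficient and all numerical values here are assumptions for illustration; the actual algorithm uses an energy-dependent attenuation derived from CT.

```python
# Sketch of the back-projection idea: recover relative initial fluence from
# transmitted fluence using a water-equivalent radiological thickness per
# ray (Beer-Lambert law). One effective attenuation coefficient stands in
# for the energy-dependent attenuation described above; values illustrative.
import math

MU_EFF = 0.005  # assumed effective linear attenuation coefficient (1/mm)

def transmit(initial, t_weq):
    """Forward model: attenuate the initial fluence along each ray."""
    return [f * math.exp(-MU_EFF * t) for f, t in zip(initial, t_weq)]

def back_project(transmitted, t_weq):
    """Invert the attenuation to estimate the relative initial fluence."""
    return [f * math.exp(MU_EFF * t) for f, t in zip(transmitted, t_weq)]

initial   = [1.00, 0.98, 0.95, 0.98, 1.00]       # relative fluence profile
thickness = [120.0, 150.0, 180.0, 150.0, 120.0]  # water-equivalent mm
recovered = back_project(transmit(initial, thickness), thickness)
# In this noise-free toy the round trip is exact; in practice scatter and
# the EPID response limit the accuracy, as the Results section reports.
```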
Eigensolver for a Sparse, Large Hermitian Matrix
NASA Technical Reports Server (NTRS)
Tisdale, E. Robert; Oyafuso, Fabiano; Klimeck, Gerhard; Brown, R. Chris
2003-01-01
A parallel-processing computer program finds a few eigenvalues in a sparse Hermitian matrix that contains as many as 100 million diagonal elements. This program finds the eigenvalues faster, using less memory, than do other, comparable eigensolver programs. This program implements a Lanczos algorithm in the American National Standards Institute/ International Organization for Standardization (ANSI/ISO) C computing language, using the Message Passing Interface (MPI) standard to complement an eigensolver in PARPACK. [PARPACK (Parallel Arnoldi Package) is an extension, to parallel-processing computer architectures, of ARPACK (Arnoldi Package), which is a collection of Fortran 77 subroutines that solve large-scale eigenvalue problems.] The eigensolver runs on Beowulf clusters of computers at the Jet Propulsion Laboratory (JPL).
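The Lanczos recurrence at the heart of such eigensolvers can be sketched as follows; dense Python lists stand in for sparse storage and MPI, and a full implementation would also reorthogonalize, restart, and diagonalize the small tridiagonal matrix, as ARPACK does.

```python
# Minimal sketch of the Lanczos three-term recurrence for a real symmetric
# (Hermitian) matrix, the algorithm underlying ARPACK/PARPACK-style
# eigensolvers. This is an illustration, not the article's parallel code.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def lanczos(matvec, v0, m):
    """Build the m x m tridiagonal T = V^T A V from the Krylov space of v0;
    returns its diagonal (alpha), off-diagonal (beta), and basis V."""
    nrm = dot(v0, v0) ** 0.5
    v = [x / nrm for x in v0]
    V, alpha, beta = [v], [], []
    w = matvec(v)
    a = dot(w, v)
    alpha.append(a)
    w = [w[i] - a * v[i] for i in range(len(v))]
    for j in range(1, m):
        b = dot(w, w) ** 0.5          # breakdown (b == 0) not handled here
        beta.append(b)
        v = [x / b for x in w]
        V.append(v)
        w = matvec(v)
        a = dot(w, v)
        alpha.append(a)
        w = [w[i] - a * v[i] - b * V[j - 1][i] for i in range(len(v))]
    return alpha, beta, V

A = [[4.0, 1.0, 0.0, 0.0],
     [1.0, 3.0, 1.0, 0.0],
     [0.0, 1.0, 2.0, 1.0],
     [0.0, 0.0, 1.0, 1.0]]
apply_a = lambda v: [dot(row, v) for row in A]
alpha, beta, V = lanczos(apply_a, [1.0, 0.0, 0.0, 0.0], 4)
# With m = n, T is orthogonally similar to A, so the traces agree.
```

Eigenvalues of the small tridiagonal T (the Ritz values) approximate extreme eigenvalues of A; that final diagonalization is omitted here for brevity.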
PRROC: computing and visualizing precision-recall and receiver operating characteristic curves in R.
Grau, Jan; Grosse, Ivo; Keilwagen, Jens
2015-08-01
Precision-recall (PR) and receiver operating characteristic (ROC) curves are valuable measures of classifier performance. Here, we present the R-package PRROC, which allows for computing and visualizing both PR and ROC curves. In contrast to available R-packages, PRROC allows for computing PR and ROC curves and areas under these curves for soft-labeled data using a continuous interpolation between the points of PR curves. In addition, PRROC provides a generic plot function for generating publication-quality graphics of PR and ROC curves.
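The quantity PRROC reports as the area under the ROC curve can be computed from raw classifier scores via the pairwise (Mann-Whitney) formulation. Since PRROC itself is an R package, this Python fragment only illustrates the underlying computation, with made-up scores.

```python
# Sketch of ROC AUC from classifier scores via the pairwise formulation:
# the AUC equals the fraction of (positive, negative) pairs ranked
# correctly, with ties counting one half. Scores below are illustrative.

def roc_auc(pos_scores, neg_scores):
    """Mann-Whitney estimate of the area under the ROC curve."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

auc = roc_auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2])  # 8 of 9 pairs correct
```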
Quantum Computing Architectural Design
NASA Astrophysics Data System (ADS)
West, Jacob; Simms, Geoffrey; Gyure, Mark
2006-03-01
Large scale quantum computers will invariably require scalable architectures in addition to high fidelity gate operations. Quantum computing architectural design (QCAD) addresses the problems of actually implementing fault-tolerant algorithms given physical and architectural constraints beyond those of basic gate-level fidelity. Here we introduce a unified framework for QCAD that enables the scientist to study the impact of varying error correction schemes, architectural parameters including layout and scheduling, and physical operations native to a given architecture. Our software package, aptly named QCAD, provides compilation, manipulation/transformation, multi-paradigm simulation, and visualization tools. We demonstrate various features of the QCAD software package through several examples.
An Implemented Strategy for Campus Connectivity and Cooperative Computing.
ERIC Educational Resources Information Center
Halaris, Antony S.; Sloan, Lynda W.
1989-01-01
ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)
Aspects of perturbation theory in quantum mechanics: The BenderWu MATHEMATICA® package
NASA Astrophysics Data System (ADS)
Sulejmanpasic, Tin; Ünsal, Mithat
2018-07-01
We discuss a general setup which allows the study of the perturbation theory of an arbitrary, locally harmonic 1D quantum mechanical potential as well as its multi-variable (many-body) generalization. The latter may form a prototype for regularized quantum field theory. We first generalize the method of Bender and Wu, and derive exact recursion relations which allow the determination of the perturbative wave-function and energy corrections to an arbitrary order, at least in principle. For 1D systems, we implement these equations in an easy-to-use MATHEMATICA® package we call BenderWu. Our package enables quick home-computer computation of high orders of perturbation theory (about 100 orders in 10-30 s, and 250 orders in 1-2 h) and enables practical study of a large class of problems in Quantum Mechanics. We have two hopes concerning the BenderWu package. One is that, due to resurgence, a large amount of non-perturbative information, such as non-perturbative energies and wave-functions (e.g. WKB wave functions), can in principle be extracted from the perturbative data. We also hope that the package may be used as a teaching tool, providing an effective bridge between perturbation theory and non-perturbative physics in textbooks. Finally, we show that for the multi-variable case, the recursion relation acquires a geometric character, and has a structure which allows parallelization to computer clusters.
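For orientation, the low-order corrections that any such recursive scheme must reproduce are the textbook Rayleigh-Schrödinger expressions below. This is the standard operator formulation, shown only as context; the Bender-Wu method itself recurses on position-space wave-function coefficients order by order.

```latex
% Standard Rayleigh-Schrodinger corrections for H = H_0 + g V (context only;
% the Bender-Wu recursion works instead with position-space coefficients).
E_n^{(1)} = \langle n | V | n \rangle, \qquad
E_n^{(2)} = \sum_{m \neq n}
  \frac{|\langle m | V | n \rangle|^{2}}{E_n^{(0)} - E_m^{(0)}}, \qquad
|n^{(1)}\rangle = \sum_{m \neq n}
  \frac{\langle m | V | n \rangle}{E_n^{(0)} - E_m^{(0)}}\, |m\rangle .
```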
Huda, Walter; Lieberman, Kristin A; Chang, Jack; Roskopf, Marsha L
2004-03-01
We investigated how patient head characteristics, as well as the choice of x-ray technique factors, affect lesion contrast and noise values in computed tomography (CT) images. Head sizes and mean Hounsfield unit (HU) values were obtained from head CT images for five classes of patients ranging from newborns to adults. X-ray spectra with tube voltages ranging from 80 to 140 kV were used to compute the average photon energy, and the energy fluence, transmitted through the heads of patients of varying size. Image contrast, and the corresponding contrast-to-noise ratios (CNRs), were determined for lesions of fat, muscle, and iodine relative to a uniform water background. The energy imparted to the patient was also computed while maintaining a constant image CNR for each lesion, in order to identify the x-ray tube voltage that minimized the radiation dose. For adults, increasing the tube voltage from 80 to 140 kV changed the iodine HU from 2.62 × 10^5 to 1.27 × 10^5, the fat HU from -138 to -108, and the muscle HU from 37.1 to 33.0. Increasing the x-ray tube voltage from 80 to 140 kV increased the percentage energy fluence transmission by up to a factor of 2. For a fixed x-ray tube voltage, the percentage transmitted energy fluence in adults was more than a factor of 4 lower than in newborns. For adults, increasing the x-ray tube voltage from 80 to 140 kV improved the CNR for muscle lesions by 130%, for fat lesions by a factor of 2, and for iodine lesions by 25%. As patient size increased from newborn to adult, lesion CNR was reduced by about a factor of 2. The mAs value can be reduced by 80% when scanning newborns while maintaining the same lesion CNR as for adults. Maintaining the CNR of an iodine lesion at a constant level, use of 140 kV increases the energy imparted to an adult patient by nearly a factor of 3.5 in comparison to 80 kV.
For fat and muscle lesions, raising the x-ray tube voltage from 80 to 140 kV at a constant CNR increased the patient dose by 37% and 7%, respectively. Our two key findings are that for head CT examinations performed at a constant CNR, the mAs can be substantially reduced when scanning infants, and that use of the lowest x-ray tube voltage will generally reduce patient doses.
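The constant-CNR mAs trade-off reported above can be checked with back-of-envelope arithmetic. The square-root quantum-noise model and all numbers except the factor-of-4 transmission difference (taken from the abstract) are illustrative assumptions.

```python
# Back-of-envelope sketch of the constant-CNR mAs trade-off. Quantum noise
# is modeled as 1/sqrt(transmitted photons), and the transmitted fluence
# per mAs is taken as 4x higher for a newborn head than for an adult head
# (the factor reported above); all other numbers are illustrative.
import math

def noise(transmission, mas):
    """Relative image noise for a given transmission and tube output."""
    return 1.0 / math.sqrt(transmission * mas)

def cnr(contrast, transmission, mas):
    return contrast / noise(transmission, mas)

contrast  = 40.0                   # lesion contrast in HU (illustrative)
mas_adult = 200.0                  # illustrative adult technique
t_adult, t_newborn = 0.05, 0.20    # relative transmitted fractions (4x)

target = cnr(contrast, t_adult, mas_adult)
# Solving cnr(contrast, t_newborn, mas) == target for mas gives:
mas_newborn = mas_adult * t_adult / t_newborn
reduction = 1.0 - mas_newborn / mas_adult
# A 4x transmission gain alone permits a 75% mAs reduction at equal CNR,
# the same order as the ~80% reduction reported above.
```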
Naser, Mohamed A.; Patterson, Michael S.
2010-01-01
Reconstruction algorithms are presented for a two-step solution of the bioluminescence tomography (BLT) problem. In the first step, a priori anatomical information provided by x-ray computed tomography or by other methods is used to solve the continuous-wave (cw) diffuse optical tomography (DOT) problem. A Taylor series expansion approximates the dependence of the light fluence rate on the optical properties of each region, where first- and second-order direct derivatives of the light fluence rate with respect to the scattering and absorption coefficients are obtained and used for the reconstruction. In the second step, the reconstructed optical properties at different wavelengths are used to calculate the Green's function of the system. An iterative minimization solution based on the L1 norm then shrinks the permissible region for the sources by selecting points with a higher probability of contributing to the source distribution. This provides an efficient BLT reconstruction algorithm with the ability to determine relative source magnitudes and positions in the presence of noise. PMID:21258486
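The L1-norm shrinkage that drives the "shrinking permissible region" idea can be sketched with the soft-thresholding operator. Taking the Green's-function matrix to be the identity (a toy assumption, not the paper's operator) reduces one iterative-shrinkage update to plain soft-thresholding of the candidate source strengths.

```python
# Sketch of the L1-norm shrinkage step behind sparse source localization:
# an ISTA-style update zeroes weak candidate sources, so only points likely
# to carry source strength survive into the next iteration. With an
# identity forward operator (toy assumption), one update is exactly the
# soft-thresholding operator below; all numbers are illustrative.

def soft_threshold(x, lam):
    """Proximal operator of lam * |x|: shrink toward zero, clip at zero."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

candidates = [0.05, 0.80, -0.30, 0.02]   # candidate source strengths
shrunk = [soft_threshold(q, 0.1) for q in candidates]
# Weak candidates are eliminated; strong ones shrink but survive, which is
# how the permissible source region contracts over iterations.
```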
PR-EDB: Power Reactor Embrittlement Database - Version 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jy-An John; Subramani, Ranjit
2008-03-01
The aging and degradation of light-water reactor pressure vessels is of particular concern because of their relevance to plant integrity and the magnitude of the expected irradiation embrittlement. The radiation embrittlement of reactor pressure vessel materials depends on many factors, such as neutron fluence, flux, and energy spectrum, irradiation temperature, and preirradiation material history and chemical compositions. These factors must be considered to reliably predict pressure vessel embrittlement and to ensure the safe operation of the reactor. Large amounts of data from surveillance capsules are needed to develop a generally applicable damage prediction model that can be used for industry standards and regulatory guides. Furthermore, the investigations of regulatory issues such as vessel integrity over plant life, vessel failure, and sufficiency of current codes, Standard Review Plans (SRPs), and Guides for license renewal can be greatly expedited by the use of a well-designed computerized database. The Power Reactor Embrittlement Database (PR-EDB) is such a comprehensive collection of data for U.S.-designed commercial nuclear reactors. The current version of the PR-EDB lists the test results of 104 heat-affected-zone (HAZ) materials, 115 weld materials, and 141 base materials, including 103 plates, 35 forgings, and 3 correlation monitor materials that were irradiated in 321 capsules from 106 commercial power reactors. The data files are given in dBASE format and can be accessed with any personal computer using the Windows operating system. "User-friendly" utility programs have been written to investigate radiation embrittlement using this database. Utility programs allow the user to retrieve, select and manipulate specific data, display data to the screen or printer, and fit and plot Charpy impact data. The PR-EDB Version 3.0 upgrades Version 2.0.
The package was developed based on Microsoft .NET framework technology and uses Microsoft Access for backend data storage and Microsoft Excel for plotting graphs. This software package is compatible with Windows (98 or higher) and has been built with a highly versatile user interface. PR-EDB Version 3.0 also contains an "Evaluated Residual File" utility for generating the evaluated processed files used for radiation embrittlement study.
The equipment access software for a distributed UNIX-based accelerator control system
NASA Astrophysics Data System (ADS)
Trofimov, Nikolai; Zelepoukine, Serguei; Zharkov, Eugeny; Charrue, Pierre; Gareyte, Claire; Poirier, Hervé
1994-12-01
This paper presents a generic equipment access software package for a distributed control system using computers with UNIX or UNIX-like operating systems. The package consists of three main components: an application Equipment Access Library, a Message Handler, and an Equipment Data Base. An application task, which may run on any computer in the network, sends requests to access equipment through Equipment Library calls. The basic request is of the form Equipment-Action-Data and is routed via a remote procedure call to the computer to which the given equipment is connected. In this computer the request is received by the Message Handler. According to the type of the equipment connection, the Message Handler either passes the request to the specific process software in the same computer or forwards it to a lower-level network of equipment controllers using MIL1553B, GPIB, RS232 or BITBUS communication. The answer is then returned to the calling application. Descriptive information required for request routing and processing is stored in the real-time Equipment Data Base. The package has been written to be portable and is currently available on DEC Ultrix, LynxOS, HPUX, XENIX, OS-9 and Apollo Domain.
Single-Walled Carbon Nanotubes, Carbon Nanofibers and Laser-Induced Incandescence
NASA Technical Reports Server (NTRS)
Schubert, Kathy (Technical Monitor); VanderWal, Randy L.; Ticich, Thomas M.; Berger, Gordon M.; Patel, Premal D.
2004-01-01
Laser-induced incandescence applied to heterogeneous, multi-element reacting flows is characterized by (a) temporally resolved emission spectra, (b) time-resolved emission at selected detection wavelengths, and (c) fluence dependence. Laser fluences above 0.6 Joules per square centimeter at 1064 nm initiate laser-induced vaporization, yielding a lower incandescence intensity, as found through fluence-dependence measurements. Spectrally derived temperatures show that excitation laser fluences beyond this value lead to a super-heated plasma, well above the vaporization temperature of carbon. The temporal evolution of the emission signal at these fluences is consistent with plasma dissipation processes, not incandescence from solid-like structures.
Heavy Ion Irradiation Fluence Dependence for Single-Event Upsets of NAND Flash Memory
NASA Technical Reports Server (NTRS)
Chen, Dakai; Wilcox, Edward; Ladbury, Raymond; Kim, Hak; Phan, Anthony; Seidleck, Christina; LaBel, Kenneth
2016-01-01
We investigated the single-event effect (SEE) susceptibility of the Micron 16 nm NAND flash and found that the single-event upset (SEU) cross section varied inversely with fluence: the SEU cross section decreased with increasing fluence. We attribute the effect to the variable upset sensitivities of the memory cells. The current test standards and procedures assume that SEUs follow a Poisson process and do not take into account the variability of the error rate with fluence. Therefore, heavy ion irradiation of devices with a variable upset-sensitivity distribution using typical fluence levels may underestimate the cross section and the on-orbit event rate.
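How a mixed-sensitivity cell population produces a fluence-dependent measured cross section can be shown with a two-population toy model; the cell counts and per-cell cross sections below are invented for illustration and are not Micron data.

```python
# Sketch of why a mix of upset sensitivities makes the measured SEU cross
# section depend on fluence: a small, highly sensitive cell population
# saturates early, so cross section = upsets / fluence falls as fluence
# grows. Populations and cross sections are illustrative, not device data.
import math

def expected_upsets(fluence, populations):
    """populations: list of (cell_count, per-cell cross section in cm^2).
    Each cell upsets with probability 1 - exp(-sigma * fluence)."""
    return sum(n * (1.0 - math.exp(-sigma * fluence))
               for n, sigma in populations)

cells = [(100, 1e-6),        # few highly sensitive cells (saturate early)
         (1_000_000, 1e-12)] # many weakly sensitive cells

sigma_low  = expected_upsets(1e5, cells) / 1e5   # low-fluence measurement
sigma_high = expected_upsets(1e7, cells) / 1e7   # high-fluence measurement
# The cross section measured at the higher test fluence is markedly lower,
# reproducing the inverse fluence dependence described above.
```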
QDENSITY—A Mathematica quantum computer simulation
NASA Astrophysics Data System (ADS)
Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank
2009-03-01
This Mathematica 6.0 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc. New version program summaryProgram title: QDENSITY 2.0 Catalogue identifier: ADXH_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 26 055 No. of bytes in distributed program, including test data, etc.: 227 540 Distribution format: tar.gz Programming language: Mathematica 6.0 Operating system: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4 Catalogue identifier of previous version: ADXH_v1_0 Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914 Classification: 4.15 Does the new version supersede the previous version?: Offers an alternative, more up to date, implementation Nature of problem: Analysis and design of quantum circuits, quantum algorithms and quantum clusters. Solution method: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples: Teleportation, Shor's Algorithm and Grover's search are explained in detail. A tutorial, Tutorial.nb is also enclosed. 
Reasons for new version: The package has been updated to make it fully compatible with Mathematica 6.0. Summary of revisions: The package has been updated to make it fully compatible with Mathematica 6.0. Running time: Most examples included in the package, e.g., the tutorial, Shor's examples, Teleportation examples and Grover's search, run in less than a minute on a Pentium 4 processor (2.6 GHz). The running time for a quantum computation depends crucially on the number of qubits employed.
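The density-matrix operations at the core of such a simulation can be sketched in a few lines; plain Python 2x2 matrices stand in for QDENSITY's Mathematica multiqubit machinery, and the example only illustrates the rho' = U rho U† gate rule.

```python
# Sketch of the density-matrix formalism QDENSITY is built around, in plain
# Python rather than Mathematica: apply a Hadamard gate to rho = |0><0| as
# rho' = H rho H^dagger and read measurement probabilities off the diagonal.

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def dagger(a):
    return [[a[j][i].conjugate() for j in range(len(a))]
            for i in range(len(a))]

h = 2 ** -0.5
H = [[h, h], [h, -h]]            # Hadamard gate
rho = [[1.0, 0.0], [0.0, 0.0]]   # pure state |0><0|

rho2 = matmul(matmul(H, rho), dagger(H))
probs = [rho2[0][0], rho2[1][1]]  # computational-basis probabilities
# The Hadamard takes |0> to an equal superposition: probabilities 1/2, 1/2,
# and the trace of the density matrix stays 1.
```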
NASA Astrophysics Data System (ADS)
Matott, L. S.; Hymiak, B.; Reslink, C. F.; Baxter, C.; Aziz, S.
2012-12-01
As part of the NSF-sponsored 'URGE (Undergraduate Research Group Experiences) to Compute' program, Dr. Matott has been collaborating with talented Math majors to explore the design of cost-effective systems to safeguard groundwater supplies from contaminated sites. Such activity is aided by a combination of groundwater modeling, simulation-based optimization, and high-performance computing - disciplines largely unfamiliar to the students at the outset of the program. To help train and engage the students, a number of interactive and graphical software packages were utilized. Examples include: (1) a tutorial for exploring the behavior of evolutionary algorithms and other heuristic optimizers commonly used in simulation-based optimization; (2) an interactive groundwater modeling package for exploring alternative pump-and-treat containment scenarios at a contaminated site in Billings, Montana; (3) the R software package for visualizing various concepts related to subsurface hydrology; and (4) a job visualization tool for exploring the behavior of numerical experiments run on a large distributed computing cluster. Further engagement and excitement in the program was fostered by entering (and winning) a computer art competition run by the Coalition for Academic Scientific Computation (CASC). The winning submission visualizes an exhaustively mapped optimization cost surface and dramatically illustrates the phenomena of artificial minima - valley locations that correspond to designs whose costs are only partially optimal.
Seismic waveform modeling over cloud
NASA Astrophysics Data System (ADS)
Luo, Cong; Friederich, Wolfgang
2016-04-01
With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved great success, and obtaining synthetic waveforms through numerical simulation is receiving increasing attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve: users are expected to master a considerable amount of computer knowledge and data-processing skills. Training users to operate the numerical packages and to correctly access and utilize the computational resources is a troublesome task, and access to HPC is a further common difficulty for many users. To solve these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, while an HPC cluster and a dedicated pipeline for it form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing professional access to the computational code through its interfaces and delivering our computational resources to users over the cloud, the platform lets users customize simulations at expert level and submit and run jobs through it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rouet, François-Henry; Li, Xiaoye S.; Ghysels, Pieter
In this paper, we present a distributed-memory library for computations with dense structured matrices. A matrix is considered structured if its off-diagonal blocks can be approximated by a rank-deficient matrix with low numerical rank. Here, we use Hierarchically Semi-Separable (HSS) representations. Such matrices appear in many applications, for example, finite-element methods, boundary element methods, and so on. Exploiting this structure allows for fast solution of linear systems and/or fast computation of matrix-vector products, which are the two main building blocks of matrix computations. The compression algorithm that we use, that computes the HSS form of an input dense matrix, reliesmore » on randomized sampling with a novel adaptive sampling mechanism. We discuss the parallelization of this algorithm and also present the parallelization of structured matrix-vector product, structured factorization, and solution routines. The efficiency of the approach is demonstrated on large problems from different academic and industrial applications, on up to 8,000 cores. Finally, this work is part of a more global effort, the STRUctured Matrices PACKage (STRUMPACK) software package for computations with sparse and dense structured matrices. Hence, although useful on their own right, the routines also represent a step in the direction of a distributed-memory sparse solver.« less
Rouet, François-Henry; Li, Xiaoye S.; Ghysels, Pieter; ...
2016-06-30
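The randomized-sampling idea underlying HSS compression can be illustrated in a few lines of NumPy. This is a generic randomized range-finder sketch (Halko-style), not STRUMPACK's actual adaptive implementation:

```python
import numpy as np

def randomized_lowrank(A, rank, oversample=10, seed=0):
    """Approximate A ~ Q @ B with Q orthonormal, using random sampling."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Probe the range of A with random Gaussian test vectors
    Omega = rng.standard_normal((n, rank + oversample))
    Y = A @ Omega
    Q, _ = np.linalg.qr(Y)   # orthonormal basis for the sampled range
    B = Q.T @ A              # project A onto that basis
    return Q, B

# Build a matrix of exact rank 15 and compress it
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 15)) @ rng.standard_normal((15, 200))
Q, B = randomized_lowrank(A, rank=15)
err = np.linalg.norm(A - Q @ B) / np.linalg.norm(A)  # near machine precision
```

In an HSS code this kind of sampling is applied block-wise to the off-diagonal structure, and the number of random vectors is grown adaptively until the approximation error stagnates.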
Toward a web-based real-time radiation treatment planning system in a cloud computing environment.
Na, Yong Hum; Suh, Tae-Suk; Kapp, Daniel S; Xing, Lei
2013-09-21
To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (instance type m2.xlarge: 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied to evaluate the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm^2) from the Varian TrueBeam™ STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing.
The resultant plans from the cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.
Toward a web-based real-time radiation treatment planning system in a cloud computing environment
NASA Astrophysics Data System (ADS)
Hum Na, Yong; Suh, Tae-Suk; Kapp, Daniel S.; Xing, Lei
2013-09-01
To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (instance type m2.xlarge: 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied to evaluate the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm^2) from the Varian TrueBeam™ STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing.
The resultant plans from the cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.
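Total-variation regularization favors piecewise-constant fluence maps because it penalizes the sum of absolute differences between neighboring beamlets. A minimal 1-D sketch of this effect (plain gradient descent on a smoothed TV objective, with made-up parameters; this is not the paper's distributed optimizer):

```python
import numpy as np

def tv_denoise(b, lam=1.0, eps=1e-4, lr=0.05, iters=4000):
    # Minimize 0.5*||x - b||^2 + lam * sum_i sqrt((x[i+1]-x[i])^2 + eps)
    # by gradient descent; eps smooths |d| so the gradient is defined at 0.
    x = b.copy()
    for _ in range(iters):
        d = np.diff(x)
        g = d / np.sqrt(d * d + eps)  # derivative of the smoothed |d|
        grad = x - b                  # data-fidelity gradient
        grad[:-1] -= lam * g          # each difference term touches two entries
        grad[1:] += lam * g
        x -= lr * grad
    return x

# Noisy piecewise-constant "fluence profile"
rng = np.random.default_rng(0)
x_true = np.concatenate([np.zeros(30), 5.0 * np.ones(30), 2.0 * np.ones(30)])
b = x_true + rng.normal(0.0, 0.5, x_true.size)
x_hat = tv_denoise(b)  # flat plateaus with sharp jumps are preserved
```

The recovered profile is much closer to the underlying piecewise-constant signal than the noisy input, which is exactly the property that makes TVR-optimized fluence maps easier to segment into deliverable apertures.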
NASA Technical Reports Server (NTRS)
Faith, T. J.; Obenschain, A. F.
1974-01-01
Empirical equations have been derived from measurements of solar cell photovoltaic characteristics relating light-generated current and open circuit voltage to cell temperature, intensity of illumination, and 1-MeV electron fluence. Both 2-ohm-cm and 10-ohm-cm cells were tested over the temperature range from 120 to 470 K, the illumination intensity range from 5 to 1830 mW/sq cm, and the electron fluence range from 1 × 10^13 to 1 × 10^16 electrons/sq cm. The normalized temperature coefficient of the light-generated current varies as the 0.18 power of the fluence for temperatures above approximately 273 K and is independent of fluence at lower temperatures. At 140 mW/sq cm, a power-law expression was derived which shows that the light-generated current decreases at a rate proportional to the 0.153 power of the fluence for both resistivities. The coefficient of the expression is larger for 2-ohm-cm cells; consequently, the advantage of 10-ohm-cm cells increased with increasing fluence.
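A power-law exponent like the 0.153 reported above is typically recovered by a straight-line fit in log-log space. A sketch on synthetic data (only the exponent comes from the abstract; the coefficient `k_true` and the degradation model are made up for illustration):

```python
import numpy as np

# Synthetic degradation data: assume current loss ∝ fluence^0.153
phi = np.logspace(13, 16, 10)   # 1-MeV electron fluence, electrons/cm^2
k_true, p_true = 2.1e-3, 0.153  # hypothetical coefficient, abstract's exponent
loss = k_true * phi ** p_true

# A power law is a straight line in log-log space: log(loss) = p*log(phi) + log(k)
p_fit, log_k_fit = np.polyfit(np.log(phi), np.log(loss), 1)
```

On noise-free data the fitted slope reproduces the assumed exponent exactly; with real measurements the same fit yields the empirical exponent and its scatter.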
NASA Astrophysics Data System (ADS)
Wang, Jun; Zhu, Fei; Zhang, Bei; Liu, Huixian; Jia, Guangyi; Liu, Changlong
2012-11-01
Polymethylmethacrylate (PMMA) specimens were implanted with 30 keV carbon ions in a fluence range of 1 × 10^16 to 2 × 10^17 cm^-2, and photoluminescence (PL) and reflectivity of the implanted samples were examined. A luminescent band with one peak was found in PL spectra excited by the 480 nm line, but its intensity did not vary in parallel with ion fluence. The strongest PL occurred at the fluence of 5 × 10^16 cm^-2. Results from visible-light-excited micro-Raman spectra indicated that the formation of hydrogenated amorphous carbon structures in the subsurface layer and their evolution with ion fluence could be responsible for the observed PL responses. Measurements of the small-angle reflectance spectra from both the implanted and rear surfaces of samples in the ultraviolet-visible (UV-vis) range demonstrated both fluence-dependent and wavelength-related reflectivity variations, which were attributed to the structural changes induced by ion implantation. A noticeable reflectivity modification, which may be practically used, could be found at the fluence of 1 × 10^16 cm^-2.
Surface modifications of ultra-thin gold films by swift heavy ion irradiation
NASA Astrophysics Data System (ADS)
Dash, P.; Mallick, P.; Rath, H.; Dash, B. N.; Tripathi, A.; Prakash, Jai; Avasthi, D. K.; Satyam, P. V.; Mishra, N. C.
2010-10-01
Gold films of thickness 10 and 20 nm grown on float glass substrates by the thermal evaporation technique were irradiated with 107 MeV Ag^8+ and 58 MeV Ni^5+ ions at different fluences and characterized by Grazing Incidence X-ray Diffraction (GIXRD) and Atomic Force Microscopy (AFM). The pristine films were continuous, and no island structures were found even at these small thicknesses. The surface roughness estimated from AFM data did not show a monotonic increase or decrease with ion fluence. Instead, it increased at low fluences and decreased at high fluences for the 20 nm thick film. In the 10 nm film, roughness first increased with ion fluence, then decreased, and again increased at higher fluences. The pattern of variation, however, was identical for the Ni and Ag beams. Both beams led to the formation of cracks on the film surface at intermediate fluences. The observed ion-irradiation induced, thickness-dependent topographic modification is explained by the spatial confinement of the energy deposited by ions in the reduced dimension of the films.
Developments in blade shape design for a Darrieus vertical axis wind turbine
NASA Astrophysics Data System (ADS)
Ashwill, T. D.; Leonard, T. M.
1986-09-01
A new computer program package has been developed that determines the troposkein shape for a Darrieus Vertical Axis Wind Turbine Blade with any geometrical configuration or rotation rate. This package allows users to interact and develop a buildable blade whose shape closely approximates the troposkein. Use of this package can significantly reduce flatwise mean bending stresses in the blade and increase fatigue life.
MC-GenomeKey: a multicloud system for the detection and annotation of genomic variants.
Elshazly, Hatem; Souilmi, Yassine; Tonellato, Peter J; Wall, Dennis P; Abouelhoda, Mohamed
2017-01-20
Next-generation genome sequencing techniques have become affordable for the massive sequencing efforts devoted to clinical characterization of human diseases. However, the cost of providing cloud-based data analysis of the mounting datasets remains a concerning bottleneck for providing cost-effective clinical services. To address this computational problem, it is important to optimize the variant analysis workflow and the analysis tools used, to reduce the overall computational processing time and concomitantly reduce the processing cost. Furthermore, it is important to capitalize on recent developments in the cloud computing market, which has witnessed more providers competing in terms of products and prices. In this paper, we present a new package called MC-GenomeKey (Multi-Cloud GenomeKey) that efficiently executes the variant analysis workflow for detecting and annotating mutations using cloud resources from different commercial cloud providers. Our package supports the Amazon, Google, and Azure clouds, as well as any other cloud platform based on OpenStack. Our package allows different scenarios of execution with different levels of sophistication, up to one where a workflow can be executed using a cluster whose nodes come from different clouds. MC-GenomeKey also supports scenarios that exploit the spot instance model of Amazon in combination with the use of other cloud platforms to provide significant cost reduction. To the best of our knowledge, this is the first solution that optimizes the execution of the workflow using computational resources from different cloud providers. MC-GenomeKey provides an efficient multicloud-based solution to detect and annotate mutations. The package can run on different commercial cloud platforms, which enables the user to seize the best offers.
The package also provides a reliable means to make use of the low-cost spot instance model of Amazon, as it provides an efficient solution to the sudden termination of spot machines as a result of a sudden price increase. The package has a web-interface and it is available for free for academic use.
Turbofan noise generation. Volume 2: Computer programs
NASA Technical Reports Server (NTRS)
Ventres, C. S.; Theobald, M. A.; Mark, W. D.
1982-01-01
The use of a package of computer programs developed to calculate the in-duct acoustic modes excited by a fan/stator stage operating at subsonic tip speed is described. The following three noise source mechanisms are included: (1) sound generated by the rotor blades interacting with turbulence ingested into, or generated within, the inlet duct; (2) sound generated by the stator vanes interacting with the turbulent wakes of the rotor blades; and (3) sound generated by the stator vanes interacting with the velocity deficits in the mean wakes of the rotor blades. The computations for the three noise mechanisms are coded as three separate computer program packages. The computer codes are described by means of block diagrams, tables of data and variables, and example program executions; FORTRAN listings are included.
Turbofan noise generation. Volume 2: Computer programs
NASA Astrophysics Data System (ADS)
Ventres, C. S.; Theobald, M. A.; Mark, W. D.
1982-07-01
The Design and Implementation of NASA's Advanced Flight Computing Module
NASA Technical Reports Server (NTRS)
Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce
1995-01-01
This paper describes a working flight computer Multichip Module (MCM) developed jointly, in a collaborative fashion, by JPL and TRW under their respective research programs. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program; development of the Mass Memory and the programmable I/O MCM modules will follow. The three building-block modules will then be stacked into a 3D MCM configuration. The mass and volume of the flight computer MCM, 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.
MultivariateResidues: A Mathematica package for computing multivariate residues
NASA Astrophysics Data System (ADS)
Larsen, Kasper J.; Rietkerk, Robbert
2018-01-01
Multivariate residues appear in many different contexts in theoretical physics and algebraic geometry. In theoretical physics, for example, they give the proper definition of generalized-unitarity cuts, and they play a central role in the Grassmannian formulation of the S-matrix by Arkani-Hamed et al. In realistic cases their evaluation can be non-trivial. In this paper we provide a Mathematica package for efficient evaluation of multivariate residues based on methods from computational algebraic geometry.
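In the univariate case, which the multivariate theory generalizes, residues can already be computed symbolically in open-source tools. A short SymPy check (SymPy's `residue` function, not the Mathematica package described above) of the global residue theorem for a simple rational function:

```python
import sympy as sp

z = sp.symbols('z')
f = 1 / (z * (z - 1))

r0 = sp.residue(f, z, 0)  # simple pole at z = 0, residue -1
r1 = sp.residue(f, z, 1)  # simple pole at z = 1, residue +1
# For a rational function decaying fast enough at infinity,
# the residues at all finite poles sum to zero.
total = r0 + r1
```

The multivariate residues handled by the package arise when several denominator factors vanish simultaneously on a common zero locus, where no such one-line contour computation is available and Gröbner-basis techniques take over.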
The Computer: An Effective Research Assistant
Gancher, Wendy
1984-01-01
The development of software packages such as data management systems and statistical packages has made it possible to process large amounts of research data. Data management systems make the organization and manipulation of such data easier. Floppy disks ease the problem of storing and retrieving records. Patient information can be kept confidential by limiting access to computer passwords linked with research files, or by using floppy disks. These attributes make the microcomputer essential to modern primary care research. PMID:21279042
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grote, D. P.
Forthon generates links between Fortran and Python. Python is a high level, object oriented, interactive and scripting language that allows a flexible and versatile interface to computational tools. The Forthon package generates the necessary wrapping code which allows access to the Fortran database and to the Fortran subroutines and functions. This provides a development package where the computationally intensive parts of a code can be written in efficient Fortran, and the high level controlling code can be written in the much more versatile Python language.
NASA Technical Reports Server (NTRS)
Smith, Leigh M.; Parker, Nelson C. (Technical Monitor)
2002-01-01
This paper analyzes the use of Computer Aided Design (CAD) packages at NASA's Marshall Space Flight Center (MSFC). It examines the effectiveness of recent efforts to standardize CAD practices across MSFC engineering activities. An assessment of the roles played by management, designers, analysts, and manufacturers in this initiative will be explored. Finally, solutions are presented for better integration of CAD across MSFC in the future.
Computer Applications in the Design Process.
ERIC Educational Resources Information Center
Winchip, Susan
Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…
Castaño-Díez, Daniel
2017-01-01
Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistics-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance.
PMID:28580909
Castaño-Díez, Daniel
2017-06-01
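A core numerical kernel in subtomogram alignment is finding the translational offset that maximizes the cross-correlation between a reference and a particle volume. A generic FFT-based sketch in NumPy (not Dynamo's actual code), demonstrated on a circularly shifted copy of a random volume:

```python
import numpy as np

def find_shift(ref, vol):
    # Cross-correlate via the FFT: the argmax of the correlation volume
    # is the circular translation that best aligns vol onto ref.
    cc = np.fft.ifftn(np.conj(np.fft.fftn(ref)) * np.fft.fftn(vol)).real
    return np.unravel_index(np.argmax(cc), cc.shape)

rng = np.random.default_rng(0)
ref = rng.standard_normal((16, 16, 16))
vol = np.roll(ref, shift=(3, 5, 2), axis=(0, 1, 2))  # known displacement
shift = find_shift(ref, vol)  # recovers (3, 5, 2)
```

Real subtomogram alignment repeats this search over sampled rotations and applies a missing-wedge mask in Fourier space, which is where GPU acceleration pays off.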
Parallel computation and the Basis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, G.R.
1992-12-16
A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communication costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
Parallel computation and the basis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, G.R.
1993-05-01
A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communication costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
Community-driven computational biology with Debian Linux
2010-01-01
Background The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984
Measured Thermal and Fast Neutron Fluence Rates for ATF-1 Holders During ATR Cycle 157D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Larry Don; Miller, David Torbet
This report contains the thermal (2200 m/s) and fast (E > 1 MeV) neutron fluence rate data for the ATF-1 holders located in core for ATR Cycle 157D, which were measured by the Radiation Measurements Laboratory (RML) as requested by the Power Reactor Programs (ATR Experiments) Radiation Measurements Work Order. This report contains measurements of the fluence rates corresponding to particular elevations relative to the 80-ft. core elevation. The data in this report consist of (1) a table of the ATR power history and distribution, (2) a hard copy listing of all thermal and fast neutron fluence rates, and (3) plots of both the thermal and fast neutron fluence rates. The fluence rates reported are for the average power levels given in the table of power history and distribution.
Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry
2013-08-01
Several statistical packages are capable of estimating generalized linear mixed models, and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performance of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages, SAS GLIMMIX Laplace and SuperMix Gaussian quadrature, perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
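The Gauss-Hermite method mentioned above approximates the intractable integral over each random effect with a weighted sum over quadrature nodes. A minimal NumPy sketch for the marginal success probability of a logistic model with one Gaussian random intercept (the values of `eta` and `sigma` are illustrative, not from the study):

```python
import numpy as np

def marginal_prob_gh(eta, sigma, n_nodes=20):
    # Gauss-Hermite approximation of E_b[ logit^{-1}(eta + sigma*b) ]
    # for b ~ N(0, 1); the change of variables b = sqrt(2)*x maps the
    # e^{-x^2} Hermite weight onto the standard Gaussian density.
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    b = np.sqrt(2.0) * nodes
    p = 1.0 / (1.0 + np.exp(-(eta + sigma * b)))
    return (weights / np.sqrt(np.pi)) @ p

# Compare against brute-force quadrature on a dense grid
eta, sigma = 0.5, 1.3
bg = np.linspace(-8.0, 8.0, 200001)
dens = np.exp(-bg ** 2 / 2.0) / np.sqrt(2.0 * np.pi)
ref = np.trapz(1.0 / (1.0 + np.exp(-(eta + sigma * bg))) * dens, bg)
gh = marginal_prob_gh(eta, sigma)  # agrees with ref to high accuracy
```

With several correlated random effects this one-dimensional sum becomes a tensor product over dimensions, which is why the cost of adaptive Gauss-Hermite grows quickly and packages fall back on Laplace (effectively a one-node approximation) or penalized quasi-likelihood.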
Network Meta-Analysis Using R: A Review of Currently Available Automated Packages
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687
Network meta-analysis using R: a review of currently available automated packages.
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.
ERIC Educational Resources Information Center
Ruben, Barbara
1994-01-01
Reviews a number of interactive environmental computer education networks and software packages. Computer networks include National Geographic Kids Network, Global Lab, and Global Rivers Environmental Education Network. Computer software packages involve environmental decision making, simulation games, tropical rainforests, the ocean, the greenhouse…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tharrington, Arnold N.
2015-09-09
The NCCS Regression Test Harness is a software package that provides a framework to perform regression and acceptance testing on NCCS High Performance Computers. The package is written in Python; its only dependency is a Subversion repository used to store the regression tests.
ERIC Educational Resources Information Center
Meloy, Jim; And Others
1990-01-01
The relationship between computer-aided design (CAD), computer-aided manufacturing (CAM), and computer numerical control (CNC) computer applications is described. Tips for helping educate the CAM buyer on what to look for and what to avoid when searching for the most appropriate instructional CAM package are provided. (KR)
Research and Development of Fully Automatic Alien Smoke Stack and Packaging System
NASA Astrophysics Data System (ADS)
Yang, Xudong; Ge, Qingkuan; Peng, Tao; Zuo, Ping; Dong, Weifu
2017-12-01
To address the low efficiency of manual sorting and packaging in current tobacco distribution centers, a safe, efficient, fully automatic alien (irregularly shaped) smoke stacking and packaging system was developed. The fully automatic system combines PLC control technology, servo control technology, robot technology, image recognition technology and human-computer interaction technology. The characteristics, principles, control process and key technologies of the system are discussed in detail. Installation and commissioning show that the fully automatic stacking and packaging system performs well and meets the requirements for shaped cigarettes.
MODFLOW-2005 : the U.S. Geological Survey modular ground-water model--the ground-water flow process
Harbaugh, Arlen W.
2005-01-01
This report presents MODFLOW-2005, which is a new version of the finite-difference ground-water model commonly called MODFLOW. Ground-water flow is simulated using a block-centered finite-difference approach. Layers can be simulated as confined or unconfined. Flow associated with external stresses, such as wells, areal recharge, evapotranspiration, drains, and rivers, also can be simulated. The report includes detailed explanations of physical and mathematical concepts on which the model is based, an explanation of how those concepts are incorporated in the modular structure of the computer program, instructions for using the model, and details of the computer code. The modular structure consists of a MAIN Program and a series of highly independent subroutines. The subroutines are grouped into 'packages.' Each package deals with a specific feature of the hydrologic system that is to be simulated, such as flow from rivers or flow into drains, or with a specific method of solving the set of simultaneous equations resulting from the finite-difference method. Several solution methods are incorporated, including the Preconditioned Conjugate-Gradient method. The division of the program into packages permits the user to examine specific hydrologic features of the model independently. This also facilitates development of additional capabilities because new packages can be added to the program without modifying the existing packages. The input and output systems of the computer program also are designed to permit maximum flexibility. The program is designed to allow other capabilities, such as transport and optimization, to be incorporated, but this report is limited to describing the ground-water flow capability. The program is written in Fortran 90 and will run without modification on most computers that have a Fortran 90 compiler.
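The package-based modular structure described above, where each hydrologic feature contributes its own terms to the flow equation and new packages can be added without touching existing ones, can be sketched in outline. The class and method names below are illustrative only, not MODFLOW's actual Fortran interfaces, and the sign conventions are simplified:

```python
class Package:
    """One hydrologic feature (rivers, drains, wells, ...)."""
    def contribute(self, head, rhs, coeff):
        raise NotImplementedError

class DrainPackage(Package):
    def __init__(self, elevation, conductance):
        self.elevation = elevation
        self.conductance = conductance

    def contribute(self, head, rhs, coeff):
        # Drains remove water only when the head is above the drain elevation;
        # accumulate this package's terms into the cell's equation (signs simplified).
        if head > self.elevation:
            coeff += self.conductance
            rhs += self.conductance * self.elevation
        return rhs, coeff

# The MAIN program loops over whichever packages the user activated;
# a new feature is supported by adding a new Package subclass.
packages = [DrainPackage(elevation=10.0, conductance=0.5)]
rhs, coeff = 0.0, 0.0
for p in packages:
    rhs, coeff = p.contribute(head=12.0, rhs=rhs, coeff=coeff)
print(rhs, coeff)
```

The design choice this mirrors is the one the report highlights: the solver only ever sees assembled coefficients, so packages stay independent of one another.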
Practical new method of measuring thermal-neutron fluence
NASA Technical Reports Server (NTRS)
Siebold, J. R.; Warman, E. A.
1967-01-01
Thermoluminescence dosimeter technique measures thermal-neutron fluence by encapsulating lithium fluoride phosphor powder and exposing it to a neutron environment. The capsule is heated in a dosimeter reader, which results in light emission proportional to the neutron fluence.
Palmans, H; Al-Sulaiti, L; Andreo, P; Shipley, D; Lühr, A; Bassler, N; Martinkovič, J; Dobrovodský, J; Rossomme, S; Thomas, R A S; Kacperek, A
2013-05-21
The conversion of absorbed dose-to-graphite in a graphite phantom to absorbed dose-to-water in a water phantom is performed by means of water-to-graphite stopping power ratios. If, however, the charged particle fluence is not equal at equivalent depths in graphite and water, a fluence correction factor, k_fl, is required as well. This is particularly relevant to the derivation of absorbed dose-to-water, the quantity of interest in radiotherapy, from a measurement of absorbed dose-to-graphite obtained with a graphite calorimeter. In this work, fluence correction factors for the conversion from dose-to-graphite in a graphite phantom to dose-to-water in a water phantom for 60 MeV mono-energetic protons were calculated using an analytical model and five different Monte Carlo codes (Geant4, FLUKA, MCNPX, SHIELD-HIT and McPTRAN.MEDIA). In general the fluence correction factors are found to be close to unity, and the analytical and Monte Carlo codes give consistent values when considering the differences in secondary particle transport. When considering only protons, the fluence correction factors are unity at the surface and increase with depth by 0.5% to 1.5% depending on the code. When the fluence of all charged particles is considered, the fluence correction factor is about 0.5% lower than unity at shallow depths, predominantly due to the contributions from alpha particles, and increases to values above unity near the Bragg peak. Fluence correction factors directly derived from the fluence distributions differential in energy at equivalent depths in water and graphite can be described by k_fl = 0.9964 + 0.0024·z_w-eq with a relative standard uncertainty of 0.2%. Fluence correction factors derived from a ratio of calculated doses at equivalent depths in water and graphite can be described by k_fl = 0.9947 + 0.0024·z_w-eq with a relative standard uncertainty of 0.3%.
These results are of direct relevance to graphite calorimetry in low-energy protons but given that the fluence correction factor is almost solely influenced by non-elastic nuclear interactions the results are also relevant for plastic phantoms that consist of carbon, oxygen and hydrogen atoms as well as for soft tissues.
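The two linear fits quoted above can be evaluated directly. A minimal sketch, assuming z_w_eq denotes the water-equivalent depth in the units used by the authors:

```python
def k_fl_fluence(z_w_eq):
    """Fluence correction factor from the fluence-distribution fit."""
    return 0.9964 + 0.0024 * z_w_eq

def k_fl_dose(z_w_eq):
    """Fluence correction factor from the dose-ratio fit."""
    return 0.9947 + 0.0024 * z_w_eq

# The two fits share a slope and differ only by a constant offset of 0.0017,
# which is within their combined standard uncertainties (0.2% and 0.3%).
for z in (0.0, 1.0, 2.0):
    print(z, round(k_fl_fluence(z), 4), round(k_fl_dose(z), 4))
```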
CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.
ERIC Educational Resources Information Center
Skrein, Dale
1994-01-01
CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)
ERIC Educational Resources Information Center
Gambari, Isiaka A.; Gbodi, Bimpe E.; Olakanmi, Eyitao U.; Abalaka, Eneojo N.
2016-01-01
The role of computer-assisted instruction in promoting intrinsic and extrinsic motivation among Nigerian secondary school chemistry students was investigated in this study. The study employed two modes of computer-assisted instruction (computer simulation instruction and computer tutorial instructional packages) and two levels of gender (male and…
Swain, J.E.; Stokowski, S.E.; Milam, D.; Kennedy, G.C.; Rainer, F.
1982-07-07
The bulk optical damage threshold fluence of potassium dihydrogen phosphate (KDP) crystals is increased by irradiating the crystals with laser pulses of 1 to 20 nanoseconds duration at increasing fluence, below the optical damage threshold fluence for untreated crystals, or by baking the crystals for times of the order of 24 hours at temperatures of 110 to 165 °C, or by a combination of laser irradiation and baking.
Desensitization and recovery of phototropic responsiveness in Arabidopsis thaliana
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janoudi, A.K.; Poff, K.L.
1993-04-01
Phototropism is induced by blue light, which also induces desensitization, a partial or total loss of phototropic responsiveness. The fluence and fluence-rate dependence of desensitization and recovery from desensitization have been measured for etiolated and red light (669-nm) preirradiated Arabidopsis thaliana seedlings. The extent of desensitization increased as the fluence of the desensitizing 450-nm light was increased from 0.3 to 60 µmol m-2 s-1. At equal fluences, blue light caused more desensitization when given at a fluence rate of 1.0 µmol m-2 s-1 than at 0.3 µmol m-2 s-1. In addition, seedlings irradiated with blue light at the higher fluence rate required a longer recovery time than seedlings irradiated at the lower fluence rate. A red light preirradiation, probably mediated via phytochrome, decreased the time required for recovery from desensitization. The minimum time for detectable recovery was about 65 s, and the maximum time observed was about 10 min. It is proposed that the descending arm of the fluence-response relationship for first positive phototropism is a consequence of desensitization, and that the time threshold for second positive phototropism establishes a period during which recovery from desensitization occurs. 11 refs., 6 figs.
Grossman, Craig E.; Carter, Shirron L.; Czupryna, Julie; Wang, Le; Putt, Mary E.; Busch, Theresa M.
2016-01-01
Photodynamic therapy (PDT) of the thoracic cavity can be performed in conjunction with surgery to treat cancers of the lung and its pleura. However, illumination of the cavity results in tissue exposure to a broad range of fluence rates. In a murine model of intrathoracic PDT, we studied the efficacy of 2-(1-hexyloxyethyl)-2-devinyl pyropheophorbide-a (HPPH; Photochlor®)-mediated PDT in reducing the burden of non-small cell lung cancer for treatments performed at different incident fluence rates (75 versus 150 mW/cm). To better understand a role for growth factor signaling in disease progression after intrathoracic PDT, the expression and activation of epidermal growth factor receptor (EGFR) was evaluated in areas of post-treatment proliferation. The low fluence rate of 75 mW/cm produced the largest reductions in tumor burden. Bioluminescent imaging and histological staining for cell proliferation (anti-Ki-67) identified areas of disease progression at both fluence rates after PDT. However, increased EGFR activation in proliferative areas was detected only after treatment at the higher fluence rate of 150 mW/cm. These data suggest that fluence rate may affect the activation of survival factors, such as EGFR, and weaker activation at lower fluence rate could contribute to a smaller tumor burden after PDT at 75 mW/cm. PMID:26784170
Surface morphology correlated with field emission properties of laser irradiated nickel
NASA Astrophysics Data System (ADS)
Jalil, S. A.; Bashir, S.; Akram, M.; Ahmed, Q. S.; Haq, F. U.
2017-08-01
The effect of laser fluence on the surface morphology and field emission properties of nickel (Ni) has been investigated. Circular Ni targets are irradiated with an Nd:YAG laser (1064 nm, 10 Hz, 10 ns) at various fluences ranging from 5.2 to 26 J/cm2 in air. At low fluences, from 5.2 to 10.4 J/cm2, SEM analysis reveals the growth of unorganized channels, grains, droplets, and ridges, whereas at a moderate fluence of 15.6 J/cm2 the formation of ridges and cones along with a few holes is observed. In the high fluence regime, from 20 to 26 J/cm2, a sharp transition in morphology from ridges to holes is observed. The laser-structured Ni targets are also investigated for field emission properties by recording their I-V characteristics and Fowler-Nordheim (F-N) plots. The enhancement in field emission factor (β) and the reduction in turn-on field are found to depend on the laser fluence and the morphology of the grown structures. For samples treated at low and moderate fluences, the growth of cones, channels and ridges is responsible for enhancement of the β factor, ranging from 121 to 178, whereas for samples treated in the high fluence region, the formation of pores and holes is responsible for significant field convergence, resulting in a substantial enhancement of the β factor to 276.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hep, J.; Konecna, A.; Krysl, V.
2011-07-01
This paper describes the application of the effective source method in forward calculations and of the adjoint method to the solution of fast neutron fluence and activation detector activities in the reactor pressure vessel (RPV) and RPV cavity of a VVER-440 reactor. Its objective is the demonstration of both methods on a practical task. The effective source method applies the Boltzmann transport operator to time-integrated source data in order to obtain neutron fluence and detector activities. By weighting the source data by the time-dependent decay of the detector activity, the result of the calculation is the detector activity. Alternatively, if the weighting is uniform with respect to time, the result is the fluence. The approach works because of the inherent linearity of radiation transport in non-multiplying, time-invariant media. Integrated in this way, the source data are referred to as the effective source. The effective source method in forward calculations thereby enables the analyst to replace numerous intensive transport calculations with a single transport calculation in which the time dependence and magnitude of the source are correctly represented. In this work, the effective source method has been expanded slightly in the following way: neutron source data were prepared with a few-group calculation using the active core calculation code MOBY-DICK, and the follow-up multigroup neutron transport calculation was performed using the neutron transport code TORT. For comparison, an alternative method of calculation has been used, based upon adjoint functions of the Boltzmann transport equation. The three-dimensional (3-D) adjoint function for each required computational outcome has been obtained using the deterministic code TORT and the cross section library BGL440.
Adjoint functions appropriate to the required fast neutron flux density and neutron reaction rates have been calculated for several significant points within the RPV and RPV cavity of the VVER-440 reactor, located axially at the position of maximum power and at the position of the weld. Both of these methods (the effective source and the adjoint function) are briefly described in the present paper. The paper also describes their application to the solution of fast neutron fluence and detector activities for the VVER-440 reactor. (authors)
Computer Aided Design Parameters for Forward Basing
1988-12-01
21 meters. Systematic errors within limits stated for absolute accuracy are tolerated at this level. DEM data acquired photogrammetrically using manual ... This is a professional drawing package, capable of the manipulation required for this project. With the AutoLISP programming language (a variation on ... (Table 2). Data Conversion Package II: GWN System's Digital Terrain Modeling (DTM) package was used. This AutoLISP-based third-party software is
smwrGraphs—An R package for graphing hydrologic data, version 1.1.2
Lorenz, David L.; Diekoff, Aliesha L.
2017-01-31
This report describes an R package called smwrGraphs, which consists of a collection of graphing functions for hydrologic data within R, a programming language and software environment for statistical computing. The functions in the package have been developed by the U.S. Geological Survey to create high-quality graphs for publication or presentation of hydrologic data that meet U.S. Geological Survey graphics guidelines.
Ultra high speed image processing techniques. [electronic packaging techniques
NASA Technical Reports Server (NTRS)
Anthony, T.; Hoeschele, D. F.; Connery, R.; Ehland, J.; Billings, J.
1981-01-01
Packaging techniques for ultra high speed image processing were developed. These techniques involve the development of a signal feedthrough technique through LSI/VLSI sapphire substrates. This allows the stacking of LSI/VLSI circuit substrates in a three-dimensional package with greatly reduced length of interconnecting lines between the LSI/VLSI circuits. The reduced parasitic capacitances result in higher LSI/VLSI computational speeds at significantly reduced power consumption levels.
Leake, S.A.; Prudic, David E.
1988-01-01
The process of permanent compaction is not routinely included in simulations of groundwater flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U. S. Geological Survey modular finite-difference groundwater flow model. The new program is called the Interbed-Storage Package. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of skeletal component of elastic specific storage and thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the groundwater flow model by adding an additional term to the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum head. Another package that allows for a time-varying specified-head boundary is also documented. This package was written to reduce the data requirements for test simulations of the Interbed-Storage Package. (USGS)
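The apportioning between elastic and inelastic storage described above, based on the relation of simulated head to the previous minimum (preconsolidation) head, can be sketched as follows. This is a minimal conceptual sketch; the variable names and per-step logic are illustrative, not the Interbed-Storage Package's actual code:

```python
def storage_change(head_new, head_old, head_min, sske, sskv, thickness):
    """Split a head change into elastic and inelastic storage terms.

    sske, sskv: skeletal elastic / inelastic specific storage.
    head_min:   lowest head previously simulated (preconsolidation head).
    Returns (elastic, inelastic) storage changes per unit area.
    """
    if head_new >= head_min:
        # Head stays at or above the previous minimum: fully elastic.
        elastic = sske * thickness * (head_new - head_old)
        inelastic = 0.0
    else:
        # Head drops below the previous minimum: elastic down to head_min,
        # inelastic (permanent compaction) for the decline beyond it.
        elastic = sske * thickness * (head_min - head_old)
        inelastic = sskv * thickness * (head_new - head_min)
    return elastic, inelastic

# Head falls from 12 to 8 past a preconsolidation head of 10:
e, i = storage_change(head_new=8.0, head_old=12.0, head_min=10.0,
                      sske=1e-5, sskv=1e-3, thickness=20.0)
```

Because sskv is typically much larger than sske, the inelastic term dominates once head falls below the preconsolidation level, which is why permanent compaction matters for the water budget.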
Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1
NASA Technical Reports Server (NTRS)
Schlosser, E. H.
1980-01-01
The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.
Determinant Computation on the GPU using the Condensation Method
NASA Astrophysics Data System (ADS)
Anisul Haque, Sardar; Moreno Maza, Marc
2012-02-01
We report on a GPU implementation of the condensation method designed by Abdelmalek Salem and Kouachi Said for computing the determinant of a matrix. We consider two types of coefficients: modular integers and floating point numbers. We evaluate the performance of our code by measuring its effective bandwidth and argue that it is numerically stable in the floating point case. In addition, we compare our code with serial implementations of determinant computation from well-known mathematical packages. Our results suggest that a GPU implementation of the condensation method has a large potential for improving those packages in terms of running time and numerical stability.
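Condensation methods compute a determinant by repeatedly shrinking an n×n matrix to an (n−1)×(n−1) one. The sketch below uses Dodgson's classical condensation to illustrate the idea; it is not the Salem-Kouachi variant the authors implemented, and it fails if an interior entry of an intermediate matrix is zero:

```python
def dodgson_det(matrix):
    """Determinant by Dodgson condensation (integer matrix, nonzero interiors)."""
    prev = None                        # matrix from two steps ago
    cur = [row[:] for row in matrix]
    while len(cur) > 1:
        n = len(cur)
        # Each new entry is the 2x2 minor of four adjacent entries...
        nxt = [[cur[i][j] * cur[i + 1][j + 1] - cur[i][j + 1] * cur[i + 1][j]
                for j in range(n - 1)] for i in range(n - 1)]
        if prev is not None:
            # ...divided by the interior entry of the matrix two steps back
            # (the division is exact for integer input).
            nxt = [[nxt[i][j] // prev[i + 1][j + 1]
                    for j in range(n - 1)] for i in range(n - 1)]
        prev, cur = cur, nxt
    return cur[0][0]

print(dodgson_det([[2, 1, 3], [1, 4, 2], [3, 2, 1]]))  # -25
```

The appeal for GPUs is visible even in this sketch: every entry of the condensed matrix is an independent 2x2 minor, so each step is embarrassingly parallel.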
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; LaBel, Kenneth A.
2018-01-01
The following are updated or new subjects added to the FPGA SEE Test Guidelines manual: academic versus mission-specific device evaluation, single event latch-up (SEL) test and analysis, SEE response visibility enhancement during radiation testing, mitigation evaluation (embedded and user-implemented), unreliable design and its effects on SEE data, testing flushable versus non-flushable architectures, intellectual property core (IP core) test and evaluation (addressing embedded and user-inserted cores), heavy-ion energy and linear energy transfer (LET) selection, proton versus heavy-ion testing, fault injection, mean fluence to failure analysis, and mission-specific system-level single event upset (SEU) response prediction. Most sections within the guidelines manual provide information regarding best practices for test structure and test system development. The scope of this manual addresses academic versus mission-specific device evaluation and visibility enhancement in IP core testing.
Sato, Tatsuhiko; Endo, Akira; Niita, Koji
2010-04-21
The fluence to organ-absorbed-dose and effective-dose conversion coefficients for heavy ions with atomic numbers up to 28 and energies from 1 MeV/nucleon to 100 GeV/nucleon were calculated using the PHITS code coupled to the ICRP/ICRU adult reference computational phantoms, following the instruction given in ICRP Publication 103 (2007 (Oxford: Pergamon)). The conversion coefficients for effective dose equivalents derived using the radiation quality factors of both Q(L) and Q(y) relationships were also estimated, utilizing the functions for calculating the probability densities of absorbed dose in terms of LET (L) and lineal energy (y), respectively, implemented in PHITS. The calculation results indicate that the effective dose can generally give a conservative estimation of the effective dose equivalent for heavy-ion exposure, although it is occasionally too conservative especially for high-energy lighter-ion irradiations. It is also found from the calculation that the conversion coefficients for the Q(y)-based effective dose equivalents are generally smaller than the corresponding Q(L)-based values because of the conceptual difference between LET and y as well as the numerical incompatibility between the Q(L) and Q(y) relationships. The calculated data of these dose conversion coefficients are very useful for the dose estimation of astronauts due to cosmic-ray exposure.
Nosik, Melissa R; Williams, W Larry; Garrido, Natalia; Lee, Sarah
2013-01-01
In the current study, behavior skills training (BST) is compared to a computer-based training package for teaching discrete trial instruction to staff teaching an adult with autism. The computer-based training package consisted of instructions, video modeling and feedback. BST consisted of instructions, modeling, rehearsal and feedback. Following training, participants were evaluated on their accuracy in completing critical skills for running a discrete trial program. Six participants completed training; three received behavior skills training and three received the computer-based training. Participants in the BST group performed better overall after training and during six-week probes than those in the computer-based training group. There were differences across both groups between research-assistant and natural-environment competency levels. Copyright © 2012 Elsevier Ltd. All rights reserved.
Femtosecond laser fluence based nanostructuring of W and Mo in ethanol
NASA Astrophysics Data System (ADS)
Bashir, Shazia; Rafique, Muhammad Shahid; Nathala, Chandra Sekher; Ajami, Ali Asghar; Husinsky, Wolfgang
2017-05-01
The effect of femtosecond laser fluence on nanostructuring of tungsten (W) and molybdenum (Mo) has been investigated after ablation in an ethanol environment. A Ti:Sapphire laser (800 nm, 30 fs) at fluences ranging from 0.6 to 5.7 J cm-2 was employed to ablate the targets. The growth of structures on the surface of irradiated targets is investigated by Field Emission Scanning Electron Microscope (FESEM) analysis. SEM was performed for both the central and the peripheral ablated regions. It is observed that both the development and the shape of nanoscale features depend on the energy deposited on the target surface as well as the nature of the material. Nanostructures grown on Mo are more distinct and well defined than those on W. In central ablated areas of W, unorganized Laser Induced Periodic Surface Structures (LIPSS) are grown at low fluences, whereas nonuniform melting along with cracking is observed at higher fluences. In the case of Mo, well-defined and organized LIPSS are observed at low fluences. With increasing fluence, LIPSS become unorganized and broken, with an appearance of cracks, and vanish completely with the formation of nanoscale cavities and conical structures. In peripheral ablated areas, broken and bifurcated LIPSS are grown at all fluences for both materials. The ablated diameter, ablation depth, ablation rate and the dependence of the periodicity of LIPSS on the laser fluence are also estimated for both W and Mo. Parametric instabilities of the laser-induced plasma, along with generation and scattering of surface plasmons, are considered a possible cause for the formation of LIPSS. For ethanol-assisted ablation, the roles of bubble cavitation, precipitation, confinement and convective flow are considered responsible for inducing increased hydrodynamic instabilities at the liquid-solid interface.
PyBoolNet: a python package for the generation, analysis and visualization of boolean networks.
Klarner, Hannes; Streck, Adam; Siebert, Heike
2017-03-01
The goal of this project is to provide a simple interface for working with Boolean networks. Emphasis is put on easy access to a large number of common tasks, including the generation and manipulation of networks, attractor and basin computation, model checking and trap space computation, execution of established graph algorithms, as well as graph drawing and layouts. PyBoolNet is a Python package for working with Boolean networks that supports simple access to model checking via NuSMV, standard graph algorithms via NetworkX and visualization via dot. In addition, state-of-the-art attractor computation exploiting Potassco ASP is implemented. The package is function-based and uses only native Python and NetworkX data types. https://github.com/hklarner/PyBoolNet. hannes.klarner@fu-berlin.de. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
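To illustrate the kind of attractor computation such a package automates, here is a minimal brute-force sketch for a small synchronous Boolean network, written from scratch rather than using PyBoolNet's API (which scales far beyond this exhaustive enumeration):

```python
from itertools import product

def step(state, rules):
    """Synchronous update: apply every node's rule to the current state."""
    return tuple(rule(state) for rule in rules)

def attractors(rules, n):
    """Enumerate all 2**n states and follow each trajectory to its cycle."""
    found = set()
    for start in product((0, 1), repeat=n):
        seen = []
        s = start
        while s not in seen:
            seen.append(s)
            s = step(s, rules)
        cycle = seen[seen.index(s):]          # the periodic part of the orbit
        # Canonicalize the cycle so rotations of the same cycle compare equal.
        k = cycle.index(min(cycle))
        found.add(tuple(cycle[k:] + cycle[:k]))
    return found

# Toy 2-node network: x' = y, y' = x  (two fixed points and one 2-cycle).
rules = [lambda s: s[1], lambda s: s[0]]
print(attractors(rules, 2))
```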
NORAD LOOK ANGLES AND PIO SATELLITE PACKAGE
NASA Technical Reports Server (NTRS)
ANONYMOUS
1994-01-01
This program package consists of two programs. First is the NORAD Look Angles Program, which computes satellite look angles (azimuth, elevation, and range) as well as the subsatellite points (latitude, longitude, and height). The second program in this package is the PIO Satellite Program, which computes sighting directions, visibility times, and the maximum elevation angle attained during each pass of an earth-orbiting satellite. Computations take into consideration the observing location and the effect of the earth's shadow on the satellite visibility. Input consists of a magnetic tape prepared by the NORAD Look Angles Program and punched cards containing reference Julian date, right ascension, declination, mean sidereal time at zero hours universal time of the reference date, and daily changes of these quantities. Output consists of a tabulated listing of the satellite's rise and set times, direction, and the maximum elevation angle visible from each observing location. This program has been implemented on the GE 635. The program Assembler code can easily be replaced by FORTRAN statements.
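The look-angle computation the first program performs can be sketched in modern terms: given observer and satellite positions in an Earth-fixed frame, rotate the range vector into the observer's local east-north-up frame and read off azimuth, elevation and range. This is a simplified sketch of the standard topocentric transformation, not the original GE 635 code:

```python
import math

def look_angles(obs_lat_deg, obs_lon_deg, obs_xyz, sat_xyz):
    """Azimuth (deg), elevation (deg) and range of a satellite from an observer.

    obs_xyz, sat_xyz: Earth-fixed Cartesian positions in the same units.
    """
    lat = math.radians(obs_lat_deg)
    lon = math.radians(obs_lon_deg)
    dx, dy, dz = (s - o for s, o in zip(sat_xyz, obs_xyz))
    # Rotate the range vector into the local east-north-up (ENU) frame.
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy
          + math.sin(lat) * dz)
    rng = math.sqrt(dx * dx + dy * dy + dz * dz)
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    elevation = math.degrees(math.asin(up / rng))
    return azimuth, elevation, rng

# Observer on the equator at 0 deg longitude; satellite 1000 km directly overhead.
az, el, rng = look_angles(0.0, 0.0, (6378.0, 0.0, 0.0), (7378.0, 0.0, 0.0))
```

A satellite is above the observer's horizon when the elevation is positive, which is the visibility test the second program's pass predictions rest on.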
ChemoPy: freely available python package for computational biology and chemoinformatics.
Cao, Dong-Sheng; Xu, Qing-Song; Hu, Qian-Nan; Liang, Yi-Zeng
2013-04-15
Molecular representation for small molecules has been routinely used in QSAR/SAR, virtual screening, database search, ranking, drug ADME/T prediction and other drug discovery processes. To facilitate extensive studies of drug molecules, we developed a freely available, open-source python package called chemoinformatics in python (ChemoPy) for calculating the commonly used structural and physicochemical features. It computes 16 drug feature groups composed of 19 descriptors that include 1135 descriptor values. In addition, it provides seven types of molecular fingerprint systems for drug molecules, including topological fingerprints, electro-topological state (E-state) fingerprints, MACCS keys, FP4 keys, atom pairs fingerprints, topological torsion fingerprints and Morgan/circular fingerprints. By applying a semi-empirical quantum chemistry program MOPAC, ChemoPy can also compute a large number of 3D molecular descriptors conveniently. The python package, ChemoPy, is freely available via http://code.google.com/p/pychem/downloads/list, and it runs on Linux and MS-Windows. Supplementary data are available at Bioinformatics online.
Computerizing the Accounting Curriculum.
ERIC Educational Resources Information Center
Nash, John F.; England, Thomas G.
1986-01-01
Discusses the use of computers in college accounting courses. Argues that the success of new efforts in using computers in teaching accounting is dependent upon increasing instructors' computer skills, and choosing appropriate hardware and software, including commercially available business software packages. (TW)
Analysis of multiple photoreceptor pigments for phototropism in a mutant of Arabidopsis thaliana
NASA Technical Reports Server (NTRS)
Konjevic, R.; Khurana, J. P.; Poff, K. L.
1992-01-01
The shape of the fluence-response relationship for the phototropic response of the JK224 strain of Arabidopsis thaliana depends on the fluence rate and wavelength of the actinic light. At low fluence rate (0.1 micromole m-2 s-1), the response to 450-nm light is characterized by a single maximum at about 9 micromoles m-2. At higher fluence rate (0.4 micromole m-2 s-1), the response shows two maxima, at 4.5 and 9 micromoles m-2. The response to 510-nm light shows a single maximum at 4.5 micromoles m-2. Unilateral preirradiation with high fluence rate (25 micromoles m-2 s-1) 510-nm light eliminates the maximum at 4.5 micromoles m-2 in the fluence response curve to a subsequent unilateral 450-nm irradiation, while the second maximum at 9 micromoles m-2 is unaffected. Based on these results, it is concluded that a single photoreceptor pigment has been altered in the JK224 strain of Arabidopsis thaliana.
NASA Technical Reports Server (NTRS)
Xapsos, M. A.; Barth, J. L.; Stassinopoulos, E. G.; Burke, E. A.; Gee, G. B.
1999-01-01
The effects that solar proton events have on microelectronics and solar arrays are important considerations for spacecraft in geostationary and polar orbits and for interplanetary missions. Designers of spacecraft and mission planners are required to assess the performance of microelectronic systems under a variety of conditions. A number of useful approaches exist for predicting information about solar proton event fluences and, to a lesser extent, peak fluxes. This includes the cumulative fluence over the course of a mission, the fluence of a worst-case event during a mission, the frequency distribution of event fluences, and the frequency distribution of large peak fluxes. Naval Research Laboratory (NRL) and NASA Goddard Space Flight Center, under the sponsorship of NASA's Space Environments and Effects (SEE) Program, have developed a new model for predicting cumulative solar proton fluences and worst-case solar proton events as functions of mission duration and user confidence level. This model is called the Emission of Solar Protons (ESP) model.
NASA Technical Reports Server (NTRS)
Xapsos, M. A.; Barth, J. L.; Stassinopoulos, E. G.; Burke, Edward A.; Gee, G. B.
1999-01-01
The effects that solar proton events have on microelectronics and solar arrays are important considerations for spacecraft in geostationary and polar orbits and for interplanetary missions. Designers of spacecraft and mission planners are required to assess the performance of microelectronic systems under a variety of conditions. A number of useful approaches exist for predicting information about solar proton event fluences and, to a lesser extent, peak fluxes. This includes the cumulative fluence over the course of a mission, the fluence of a worst-case event during a mission, the frequency distribution of event fluences, and the frequency distribution of large peak fluxes. Naval Research Laboratory (NRL) and NASA Goddard Space Flight Center, under the sponsorship of NASA's Space Environments and Effects (SEE) Program, have developed a new model for predicting cumulative solar proton fluences and worst-case solar proton events as functions of mission duration and user confidence level. This model is called the Emission of Solar Protons (ESP) model.
NASA Technical Reports Server (NTRS)
Obenschain, A. F.; Faith, T. J.
1973-01-01
Empirical equations have been derived from measurements of solar cell photovoltaic characteristics relating light-generated current, IL, and open-circuit voltage, VOC, to cell temperature, T, intensity of illumination, W, and 1 MeV electron fluence, phi. Both 2 ohm-cm and 10 ohm-cm cells were tested. The temperature dependence of IL is similar for both resistivities at 140 mW/sq cm; at high temperature the coefficient varies with fluence as phi^0.18, while at low temperatures the coefficient is relatively independent of fluence. Fluence-dependent degradation causes a decrease in IL at a rate proportional to phi^0.153 for both resistivities. At all intensities other than 560 mW/sq cm, a linear dependence of IL on illumination was found. The temperature coefficient of voltage was, to a good approximation, independent of both temperature and illumination for both resistivities. The illumination dependence of VOC was logarithmic, while the decrease of VOC with fluence varied as phi^0.25 for both resistivities.
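The power-law dependencies reported above can be turned into a toy degradation model. The exponent 0.153 comes from the abstract, but the proportionality constant `k` and reference fluence `phi0` below are purely illustrative assumptions, not fitted values:

```python
def degraded_il(il0, phi, k=1e-3, phi0=1e14):
    """Hypothetical power-law degradation of light-generated current:
    the abstract reports the decrease in IL proportional to phi**0.153.
    k and phi0 are illustrative placeholders, not measured constants."""
    return il0 * (1.0 - k * (phi / phi0) ** 0.153)
```

The same template with exponent 0.25 would sketch the reported VOC degradation.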
NASA Technical Reports Server (NTRS)
Lord, Kenneth R., II; Walters, Michael R.; Woodyard, James R.
1994-01-01
The radiation resistance of commercial solar cells fabricated from hydrogenated amorphous silicon alloys is reported. A number of different device structures were irradiated with 1.0 MeV protons. The cells were insensitive to proton fluences below 1E12/sq cm. The parameters of the irradiated cells were restored by annealing at 200 C. The annealing time was dependent on proton fluence: annealing devices for one hour restores cell parameters for fluences below 1E14/sq cm, while fluences above 1E14/sq cm require longer annealing times. A parametric fitting model was used to characterize current mechanisms observed in dark I-V measurements. The current mechanisms were explored as functions of irradiation fluence, voltage, and light-soaking time. The thermal generation current density and quality factor increased with proton fluence. Device simulation shows that the degradation in cell characteristics may be explained by the reduction of the electric field in the intrinsic layer.
Equivalent electron fluence for solar proton damage in GaAs shallow junction cells
NASA Technical Reports Server (NTRS)
Wilson, J. W.; Stock, L. V.
1984-01-01
The short-circuit current reduction in GaAs shallow junction heteroface solar cells was calculated according to a simplified solar cell damage model in which the nonuniformity of the damage as a function of penetration depth is treated explicitly. Although the equivalent electron fluence was not uniquely defined for low-energy monoenergetic proton exposure, an equivalent electron fluence is found for proton spectra characteristic of the space environment. The equivalent electron fluence ratio was calculated for a typical large solar flare event, for which the proton spectrum is Phi_p(E) = A/E (protons/sq cm), where E is in MeV. The equivalent fluence ratio is a function of the cover glass shield thickness or the corresponding cutoff energy E_c. In terms of the cutoff energy, the equivalent 1 MeV electron fluence ratio is r_p(E_c) = 10^9/E_c^1.8, where E_c is in units of keV.
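The closed-form ratio quoted above is straightforward to evaluate; a minimal sketch using the abstract's formula, with the cutoff energy in keV:

```python
def equivalent_fluence_ratio(e_c_kev):
    """Equivalent 1 MeV electron fluence ratio from the abstract:
    r_p(E_c) = 1e9 / E_c**1.8, with cutoff energy E_c in keV."""
    return 1e9 / e_c_kev ** 1.8
```

Thicker cover glass raises E_c, so the ratio falls steeply with shielding.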
Social Mathematics and Media: Using Pictures, Maps, Charts, and Graphs. Media Corner.
ERIC Educational Resources Information Center
Braun, Joseph A., Jr., Ed.
1993-01-01
Asserts that integrating disciplines is a goal of elementary social studies education. Presents a bibliographic essay describing instructional materials that can be used to integrate mathematics and social studies. Includes recommended photograph packages, computer databases, and data interpretation packages. (CFR)
Evaluation of Five Microcomputer CAD Packages.
ERIC Educational Resources Information Center
Leach, James A.
1987-01-01
Discusses the similarities, differences, advanced features, applications and number of users of five microcomputer computer-aided design (CAD) packages. Included are: "AutoCAD (V.2.17)"; "CADKEY (V.2.0)"; "CADVANCE (V.1.0)"; "Super MicroCAD"; and "VersaCAD Advanced (V.4.00)." Describes the…
graphkernels: R and Python packages for graph comparison
Ghisu, M Elisabetta; Llinares-López, Felipe; Borgwardt, Karsten
2018-01-01
Summary: Measuring the similarity of graphs is a fundamental step in the analysis of graph-structured data, which is omnipresent in computational biology. Graph kernels have been proposed as a powerful and efficient approach to this problem of graph comparison. Here we provide graphkernels, the first R and Python graph kernel libraries including baseline kernels such as label histogram based kernels, classic graph kernels such as random walk based kernels, and the state-of-the-art Weisfeiler-Lehman graph kernel. The core of all graph kernels is implemented in C++ for efficiency. Using the kernel matrices computed by the package, we can easily perform tasks such as classification, regression and clustering on graph-structured samples. Availability and implementation: The R and Python packages including source code are available at https://CRAN.R-project.org/package=graphkernels and https://pypi.python.org/pypi/graphkernels. Contact: mahito@nii.ac.jp or elisabetta.ghisu@bsse.ethz.ch. Supplementary information: Supplementary data are available online at Bioinformatics. PMID:29028902
Quantitative evaluation of software packages for single-molecule localization microscopy.
Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael
2015-08-01
The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.
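One of the criteria above, detection rate, is typically computed by matching detected points to ground-truth points within a tolerance radius. The sketch below uses a simple greedy nearest-neighbor matching and a Jaccard-style score; the exact matching rules used in the published evaluation may differ, so treat this as an illustration of the idea:

```python
import math

def detection_jaccard(truth, detections, tol):
    """Greedily match each detected 2D point to the nearest unmatched
    ground-truth point within `tol`, then return the Jaccard index
    TP / (TP + FP + FN). A simplified stand-in for the matching used
    in localization-software benchmarks."""
    unmatched = list(truth)
    tp = 0
    for d in detections:
        best, best_dist = None, tol
        for t in unmatched:
            dist = math.hypot(d[0] - t[0], d[1] - t[1])
            if dist <= best_dist:
                best, best_dist = t, dist
        if best is not None:
            unmatched.remove(best)
            tp += 1
    fp = len(detections) - tp
    fn = len(truth) - tp
    return tp / (tp + fp + fn)

# Two true emitters; one detection is close, one is a false positive:
score = detection_jaccard([(0, 0), (5, 5)], [(0.1, 0), (9, 9)], tol=1.0)
```

Here one true positive, one false positive, and one miss give a Jaccard index of 1/3.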
graphkernels: R and Python packages for graph comparison.
Sugiyama, Mahito; Ghisu, M Elisabetta; Llinares-López, Felipe; Borgwardt, Karsten
2018-02-01
Measuring the similarity of graphs is a fundamental step in the analysis of graph-structured data, which is omnipresent in computational biology. Graph kernels have been proposed as a powerful and efficient approach to this problem of graph comparison. Here we provide graphkernels, the first R and Python graph kernel libraries including baseline kernels such as label histogram based kernels, classic graph kernels such as random walk based kernels, and the state-of-the-art Weisfeiler-Lehman graph kernel. The core of all graph kernels is implemented in C++ for efficiency. Using the kernel matrices computed by the package, we can easily perform tasks such as classification, regression and clustering on graph-structured samples. The R and Python packages including source code are available at https://CRAN.R-project.org/package=graphkernels and https://pypi.python.org/pypi/graphkernels. Contact: mahito@nii.ac.jp or elisabetta.ghisu@bsse.ethz.ch. Supplementary data are available online at Bioinformatics. © The Author(s) 2017. Published by Oxford University Press.
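The simplest baseline mentioned above, a label histogram kernel, can be stated in a few lines: represent each graph by the counts of its node labels and take an inner product. A pure-Python sketch (graphkernels itself implements its kernels in C++; function names here are ours):

```python
from collections import Counter

def label_histogram_kernel(labels_g1, labels_g2):
    """Linear kernel between node-label histograms of two graphs:
    k(G1, G2) = <h1, h2>, where h counts occurrences of each label."""
    h1, h2 = Counter(labels_g1), Counter(labels_g2)
    return sum(h1[label] * h2[label] for label in h1)

def kernel_matrix(graphs):
    """Gram matrix over a list of graphs, each given by its node labels."""
    return [[label_histogram_kernel(a, b) for b in graphs] for a in graphs]
```

The resulting Gram matrix is symmetric positive semi-definite and can be fed directly to kernel methods such as an SVM for graph classification.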
NASA Astrophysics Data System (ADS)
Nogueira, P.; Zankl, M.; Schlattl, H.; Vaz, P.
2011-11-01
The radiation-induced posterior subcapsular cataract has long been generally accepted to be a deterministic effect that does not occur at doses below a threshold of at least 2 Gy. Recent epidemiological studies indicate that the threshold for cataract induction may be much lower or that there may be no threshold at all. A thorough study of this subject requires more accurate dose estimates for the eye lens than those available in ICRP Publication 74. Eye lens absorbed dose per unit fluence conversion coefficients for electron irradiation were calculated using a geometrical model of the eye that takes into account different cell populations of the lens epithelium, together with the MCNPX Monte Carlo radiation transport code package. For the cell population most sensitive to ionizing radiation—the germinative cells—absorbed dose per unit fluence conversion coefficients were determined that are up to a factor of 4.8 higher than the mean eye lens absorbed dose conversion coefficients for electron energies below 2 MeV. Comparison of the results with previously published values for a slightly different eye model showed generally good agreement for all electron energies. Finally, the influence of individual anatomical variability was quantified by positioning the lens at various depths below the cornea. A depth difference of 2 mm between the shallowest and the deepest location of the germinative zone can lead to a difference between the resulting absorbed doses of up to nearly a factor of 5000 for electron energy of 0.7 MeV.
Nogueira, P; Zankl, M; Schlattl, H; Vaz, P
2011-11-07
The radiation-induced posterior subcapsular cataract has long been generally accepted to be a deterministic effect that does not occur at doses below a threshold of at least 2 Gy. Recent epidemiological studies indicate that the threshold for cataract induction may be much lower or that there may be no threshold at all. A thorough study of this subject requires more accurate dose estimates for the eye lens than those available in ICRP Publication 74. Eye lens absorbed dose per unit fluence conversion coefficients for electron irradiation were calculated using a geometrical model of the eye that takes into account different cell populations of the lens epithelium, together with the MCNPX Monte Carlo radiation transport code package. For the cell population most sensitive to ionizing radiation, the germinative cells, absorbed dose per unit fluence conversion coefficients were determined that are up to a factor of 4.8 higher than the mean eye lens absorbed dose conversion coefficients for electron energies below 2 MeV. Comparison of the results with previously published values for a slightly different eye model showed generally good agreement for all electron energies. Finally, the influence of individual anatomical variability was quantified by positioning the lens at various depths below the cornea. A depth difference of 2 mm between the shallowest and the deepest location of the germinative zone can lead to a difference between the resulting absorbed doses of up to nearly a factor of 5000 for electron energy of 0.7 MeV.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sonzogni, A.A.
2005-05-24
A package of computer codes has been developed to process and display nuclear structure and decay data stored in the ENSDF (Evaluated Nuclear Structure Data File) library. The codes were written in an object-oriented fashion using the Java language. This allows for easy implementation across multiple platforms as well as deployment on web pages. The structure of the different Java classes that make up the package is discussed, as well as several different implementations.
Wide-field Imaging System and Rapid Direction of Optical Zoom (WOZ)
2010-09-25
commercial software packages: SolidWorks, COMSOL Multiphysics, and ZEMAX optical design. SolidWorks is a computer-aided design package, which has a live interface to COMSOL. COMSOL is a finite element analysis/partial differential equation solver. ZEMAX is an optical design package. Both COMSOL and ZEMAX have live interfaces to MATLAB. Our initial investigations have enabled a model in SolidWorks to be updated in COMSOL, an FEA calculation
The LARSYS Educational Package: Instructor's Notes for Use with the Data 100
NASA Technical Reports Server (NTRS)
Lindenlaub, J. C.; Russell, J. D.
1977-01-01
The LARSYS Educational Package is a set of instructional materials developed to train people to analyze remotely sensed multispectral data using LARSYS, a computer software system. The materials included in this volume have been designed to assist LARSYS instructors as they guide students through the LARSYS Educational Package. All of the materials have been updated from the previous version to reflect the use of a Data 100 Remote Terminal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janoudi, A.K.; Gordon, W.R.; Poff, K.L.
1997-03-01
The amplitude of phototropic curvature to blue light is enhanced by a prior exposure of seedlings to red light. This enhancement is mediated by phytochrome. Fluence-response relationships have been constructed for red-light-induced enhancement in the phytochrome A (phyA) null mutant, the phytochrome B- (phyB) deficient mutant, and in two transgenic lines of Arabidopsis thaliana that overexpress either phyA or phyB. These fluence-response relationships demonstrate the existence of two responses in enhancement, a response in the very-low-to-low-fluence range and a response in the high-fluence range. Only the response in the high-fluence range is present in the phyA null mutant. In contrast, the phyB-deficient mutant is indistinguishable from the wild-type parent in red-light responsiveness. These data indicate that phyA is necessary for the very-low-to-low but not the high-fluence response, and that phyB is not necessary for either response range. Based on these results, the high-fluence response, if controlled by a single phytochrome, must be controlled by a phytochrome other than phyA or phyB. Overexpression of phyA has a negative effect and overexpression of phyB has an enhancing effect in the high-fluence range. These results suggest that overexpression of either phytochrome perturbs the function of the endogenous photoreceptor system in an unpredictable fashion. 25 refs., 3 figs.
Tunnell, J W; Nelson, J S; Torres, J H; Anvari, B
2000-01-01
Higher laser fluences than currently used in therapy (5-10 J/cm^2) are expected to result in more effective treatment of port wine stain (PWS) birthmarks. However, higher incident fluences increase the risk of epidermal damage caused by absorption of light by melanin. Cryogen spray cooling offers an effective method to reduce epidermal injury during laser irradiation. The objective of this study was to determine whether high laser incident fluences (15-30 J/cm^2) could be used while still protecting the epidermis in ex vivo human skin samples. Non-PWS skin from a human cadaver was irradiated with a Candela ScleroPlus laser (lambda = 585 nm; pulse duration = 1.5 msec) using various incident fluences (8-30 J/cm^2), without and with cryogen spray cooling (refrigerant R-134a; spurt durations: 40-250 msec). Assessment of epidermal damage was based on histologic analysis. Relatively short spurt durations (40-100 msec) protected the epidermis for laser incident fluences comparable to current therapeutic levels (8-10 J/cm^2). However, longer spurt durations (100-250 msec) increased the fluence threshold for epidermal damage by a factor of three (up to 30 J/cm^2) in these ex vivo samples. Results of this ex vivo study show that epidermal protection from high laser incident fluences can be achieved by increasing the cryogen spurt duration immediately before pulsed laser exposure. Copyright 2000 Wiley-Liss, Inc.
Amaroli, Andrea; Ravera, Silvia; Parker, Steven; Panfoli, Isabella; Benedicenti, Alberico; Benedicenti, Stefano
2016-11-01
Photobiomodulation is proposed as a non-linear process. Only the action of light at a low intensity and fluence is assumed to stimulate cells, whereas a higher light intensity and fluence generates negative effects, exhausting the cell's energy reserve as a consequence of too strong a stimulation. In our work, we detected the photobiomodulatory effect of an 808-nm higher-fluence diode laser [64 J/cm^2, 1 W, continuous wave (CW)], irradiated through a flat-top handpiece, on mitochondrial activities such as oxygen consumption, the activity of mitochondrial complexes I, II, III, and IV, and cytochrome c, as well as ATP synthesis. The experiments were performed by standard procedures on mitochondria purified from bovine liver. Our higher-fluence diode laser positively photobiomodulates mitochondrial oxygen consumption, the activity of complexes III and IV, and ATP production, with a P/O = 2.6. The other activities are not influenced. Our data show for the first time that even higher fluences (64 J/cm^2, 1 W), like low fluences, can photobiostimulate the mitochondrial respiratory chain without uncoupling it and can induce an increment in ATP production. These results suggest that the negative effects of higher fluences observed to date are not unequivocally due to higher fluence per se but might be a consequence of irradiation carried out by handpieces with a Gaussian profile.
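Fluence for a CW source is simply delivered energy per unit area. A minimal sketch; the 1 cm^2 spot area and 64 s exposure below are assumptions for illustration, not parameters reported by the study:

```python
def fluence_j_per_cm2(power_w, exposure_s, spot_area_cm2):
    """Delivered fluence in J/cm^2 for continuous-wave irradiation:
    energy = power * time, divided by the irradiated area."""
    return power_w * exposure_s / spot_area_cm2

# With a 1 W CW source and an assumed 1 cm^2 flat-top spot,
# 64 s of irradiation delivers 64 J/cm^2:
dose = fluence_j_per_cm2(1.0, 64.0, 1.0)
```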
ERIC Educational Resources Information Center
Journal of Chemical Education, 1989
1989-01-01
Presented are reviews of two computer software packages for Apple II computers; "Organic Spectroscopy," and "Videodisc Display Program" for use with "The Periodic Table Videodisc." A sample spectrograph from "Organic Spectroscopy" is included. (CW)
NLM microcomputer-based tutorials (for microcomputers). Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perkins, M.
1990-04-01
The package consists of TOXLEARN--a microcomputer-based training package for TOXLINE (Toxicology Information Online), CHEMLEARN--a microcomputer-based training package for CHEMLINE (Chemical Information Online), MEDTUTOR--a microcomputer-based training package for MEDLINE (Medical Information Online), and ELHILL LEARN--a microcomputer-based training package for the ELHILL search and retrieval software that supports the above-mentioned databases. Software Description: The programs were developed under PILOTplus using the NLM LEARN Programmer. They run on IBM-PC, XT, AT, PS/2, and fully compatible computers. The programs require 512K RAM, one disk drive, and DOS 2.0 or higher. The software supports most monochrome, color graphics, enhanced color graphics, or visual graphics displays.
Pulse fluence dependent nanograting inscription on the surface of fused silica
NASA Astrophysics Data System (ADS)
Liang, Feng; Vallée, Réal; Leang Chin, See
2012-06-01
Pulse fluence dependent nanograting inscription on the surface of fused silica is investigated. The nanograting period is found to decrease with the increase of the incident pulse fluence. Local intensity distribution and incubation effect are responsible for the change of the nanograting period.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salama, A.; Mikhail, M.
Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation, (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, evaluation of density and size partition characteristics and attrition curves, and (3) generation of graphics output. The Separation ChARacteristics Estimation software packages (SCARE) were developed to balance raw density or size separation data. The cases of density and size separation data are considered. The generated balanced data can take the balanced or normalized forms. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The computer software packages described in this paper are valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).
ASP: Automated symbolic computation of approximate symmetries of differential equations
NASA Astrophysics Data System (ADS)
Jefferson, G. F.; Carminati, J.
2013-03-01
A recent paper (Pakdemirli et al. (2004) [12]) compared three methods of determining approximate symmetries of differential equations. Two of these methods are well known and involve either a perturbation of the classical Lie symmetry generator of the differential system (Baikov, Gazizov and Ibragimov (1988) [7], Ibragimov (1996) [6]) or a perturbation of the dependent variable/s and subsequent determination of the classical Lie point symmetries of the resulting coupled system (Fushchych and Shtelen (1989) [11]), both up to a specified order in the perturbation parameter. The third method, proposed by Pakdemirli, Yürüsoy and Dolapçi (2004) [12], simplifies the calculations required by Fushchych and Shtelen's method through the assignment of arbitrary functions to the non-linear components prior to computing symmetries. All three methods have been implemented in the new MAPLE package ASP (Automated Symmetry Package), which is an add-on to the MAPLE symmetry package DESOLVII (Vu, Jefferson and Carminati (2012) [25]). To our knowledge, this is the first computer package to automate all three methods of determining approximate symmetries for differential systems. Extensions to the theory have also been suggested for the third method, which generalise the first method to systems of differential equations. Finally, a number of approximate symmetries and corresponding solutions are compared with results in the literature.
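The first method mentioned above, generator perturbation, can be stated compactly. A sketch of the first-order criterion in the Baikov-Gazizov-Ibragimov sense (notation ours, for a single small parameter):

```latex
% Perturbed system and perturbed symmetry generator, first order in \varepsilon:
F_0(x, u, u_{(1)}, \dots) + \varepsilon\, F_1(x, u, u_{(1)}, \dots) \approx 0,
\qquad X = X_0 + \varepsilon X_1 .
% Approximate invariance: the prolonged generator annihilates the system
% on its solution manifold up to O(\varepsilon^2):
\operatorname{pr} X \,\bigl( F_0 + \varepsilon F_1 \bigr)
\Big|_{\,F_0 + \varepsilon F_1 \,\approx\, 0} \;=\; O(\varepsilon^2).
```

Splitting this condition by powers of the perturbation parameter yields the determining equations for the zeroth-order generator X_0 and its correction X_1.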
DOE Office of Scientific and Technical Information (OSTI.GOV)
Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.
1995-03-01
MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.
ppcor: An R Package for a Fast Calculation to Semi-partial Correlation Coefficients.
Kim, Seongho
2015-11-01
Lack of a general matrix formula hampers implementation of the semi-partial correlation, also known as part correlation, to the higher-order coefficient. This is because the higher-order semi-partial correlation calculation using a recursive formula requires an enormous number of recursive calculations to obtain the correlation coefficients. To resolve this difficulty, we derive a general matrix formula of the semi-partial correlation for fast computation. The semi-partial correlations are then implemented on an R package ppcor along with the partial correlation. Owing to the general matrix formulas, users can readily calculate the coefficients of both partial and semi-partial correlations without computational burden. The package ppcor further provides users with the level of the statistical significance with its test statistic.
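The residual definition of the semi-partial correlation offers a direct cross-check of the matrix formula the package implements: residualize one variable on the controls and correlate the residual with the other, raw variable. A first-order sketch with NumPy (ppcor itself is an R package; this is an illustration of the statistic, not its API):

```python
import numpy as np

def semipartial_corr(x, y, z):
    """First-order semi-partial (part) correlation: correlate y with the
    residual of x after regressing x on z (with an intercept term)."""
    A = np.column_stack([np.ones_like(z), z])       # design matrix [1, z]
    beta, *_ = np.linalg.lstsq(A, x, rcond=None)    # least-squares fit
    resid = x - A @ beta                            # part of x not due to z
    return float(np.corrcoef(y, resid)[0, 1])
```

For one control variable this agrees exactly with the textbook closed form (r_xy - r_yz r_xz) / sqrt(1 - r_xz^2); ppcor's contribution is a general matrix formula that avoids recursion for higher-order coefficients.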
Xarray: multi-dimensional data analysis in Python
NASA Astrophysics Data System (ADS)
Hoyer, Stephan; Hamman, Joe; Maussion, Fabien
2017-04-01
xarray (http://xarray.pydata.org) is an open source project and Python package that provides a toolkit and data structures for N-dimensional labeled arrays, which are the bread and butter of modern geoscientific data analysis. Key features of the package include label-based indexing and arithmetic, interoperability with the core scientific Python packages (e.g., pandas, NumPy, Matplotlib, Cartopy), out-of-core computation on datasets that don't fit into memory, a wide range of input/output options, and advanced multi-dimensional data manipulation tools such as group-by and resampling. In this contribution we will present the key features of the library and demonstrate its great potential for a wide range of applications, from (big-)data processing on supercomputers to data exploration in front of a classroom.
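The label-based indexing and named-dimension reductions described above look like this in practice. A small sketch using the public xarray API; the station names and values are made up for illustration:

```python
import numpy as np
import xarray as xr

# Monthly temperatures at three stations, with labeled dimensions.
temps = xr.DataArray(
    np.arange(12.0).reshape(3, 4),
    dims=("station", "month"),
    coords={"station": ["a", "b", "c"], "month": [1, 2, 3, 4]},
)

january = temps.sel(month=1)              # label-based indexing
station_mean = temps.mean(dim="month")    # reduce over a named dimension
```

Selecting by label rather than positional index keeps analysis code readable and robust to reordered axes, which is the core idea behind the package.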
Enhanced Electric Power Transmission by Hybrid Compensation Technique
NASA Astrophysics Data System (ADS)
Palanichamy, C.; Kiu, G. Q.
2015-04-01
In today's competitive environment, new power system engineers are expected to contribute to their tasks immediately, without years of seasoning through on-the-job training, mentoring, and rotation assignments. At the same time, it is becoming obligatory to prepare power system engineering graduates for an increasingly quality-minded corporate environment. To achieve this, better tools are needed for educating and training power system engineering students as well as in-service system engineers. As a result of rapid advances in computer hardware and software, many Windows-based software packages have been developed for education and training. In line with those packages, a simulation package called Hybrid Series-Shunt Compensators (HSSC) has been developed and is presented in this paper for educational purposes.
Software Reviews. Programs Worth a Second Look.
ERIC Educational Resources Information Center
Schneider, Roxanne; Eiser, Leslie
1989-01-01
Reviewed are three computer software packages for use in middle/high school classrooms. Included are "MacWrite II," a word-processing program for MacIntosh computers; "Super Story Tree," a word-processing program for Apple and IBM computers; and "Math Blaster Mystery," for IBM, Apple, and Tandy computers. (CW)
Computer-Aided Engineering Education at the K.U. Leuven.
ERIC Educational Resources Information Center
Snoeys, R.; Gobin, R.
1987-01-01
Describes some recent initiatives and developments in the computer-aided design program in the engineering faculty of the Katholieke Universiteit Leuven (Belgium). Provides a survey of the engineering curriculum, the computer facilities, and the main software packages available. (TW)
ERIC Educational Resources Information Center
Dwyer, Donna; And Others
1989-01-01
Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)
Computer Aided Management for Information Processing Projects.
ERIC Educational Resources Information Center
Akman, Ibrahim; Kocamustafaogullari, Kemal
1995-01-01
Outlines the nature of information processing projects and discusses some project management programming packages. Describes an in-house interface program developed to utilize a selected project management package (TIMELINE) by using Oracle Data Base Management System tools and Pascal programming language for the management of information system…
International Inventory of Software Packages in the Information Field.
ERIC Educational Resources Information Center
Keren, Carl, Ed.; Sered, Irina, Ed.
Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…
A Curriculum Review: The Voyage of the Mimi.
ERIC Educational Resources Information Center
Johns, Kenneth W.
1988-01-01
The curriculum package, "The Voyage of the Mimi," uses computer, videocassette, student text, and workbook for integrated study of the great whales and the impact of social actions on society and the environment. This review suggests that the package also offers many ancillary teaching opportunities. (CB)
Measurement of trapped proton fluences in main stack of P0006 experiment
NASA Technical Reports Server (NTRS)
Nefedov, N.; Csige, I.; Benton, E. V.; Henke, R. P.; Benton, E. R.; Frigo, L. A.
1995-01-01
We have measured the directional distribution and the Eastward-directed mission fluence of trapped protons at two different energies with plastic nuclear track detectors (CR-39 with DOP) in the main stack of the P0006 experiment on LDEF. The results show that the arrival directions of trapped protons are highly anisotropic, with most protons arriving from the West. By selecting these particles we determined the mission fluence of Eastward-directed trapped protons, and found the experimental fluences to be slightly higher than the model calculations of Armstrong and Colborn.
Individual Members of the Cab Gene Family Differ Widely in Fluence Response.
White, M. J.; Kaufman, L. S.; Horwitz, B. A.; Briggs, W. R.; Thompson, W. F.
1995-01-01
Chlorophyll a/b-binding protein genes (Cab genes) can be extremely sensitive to light. Transcript accumulation following a red light pulse increases with fluence over 8 orders of magnitude (L.S. Kaufman, W.F. Thompson, W.R. Briggs [1984] Science 226: 1447-1449). We have constructed fluence-response curves for individual Cab genes. At least two Cab genes (Cab-8 and AB96) show a very low fluence response to a single red light pulse. In contrast, two other Cab genes (AB80 and AB66) fail to produce detectable transcript following a single pulse of either red or blue light but are expressed in continuous red light. Thus, very low fluence responses and high irradiance responses occur in the same gene family. PMID:12228352
Shim, Hyung-Sup; Jun, Dai-Won; Kim, Sang-Wha; Jung, Sung-No; Kwon, Ho
2015-01-01
Purpose. Early postoperative fractional laser treatment has been used to reduce scarring in many institutions, but the most effective energy parameters have not yet been established. This study sought to determine effective parameters in the treatment of facial laceration scars. Methods. From September 2012 to September 2013, 57 patients were enrolled in the study. To compare the low and high fluence parameters of 1,550 nm fractional erbium-glass laser treatment, we virtually divided the scar of each individual patient in half, and each half was treated with a high or low fluence setting, respectively. A total of four treatment sessions were performed at one-month intervals and clinical photographs were taken at every visit. Results. Results were assessed using the Vancouver Scar Scale (VSS) and global assessment of the two portions of each individual scar. Final evaluation revealed that the portions treated with the high fluence parameter showed greater improvement from pretreatment VSS scores and global assessment values, indicating more favorable cosmetic results. Conclusion. We compared the effects of high fluence and low fluence 1,550 nm fractional erbium-glass laser treatment for facial scarring in the early postoperative period and found that the high fluence parameter was more effective for scar management. PMID:26236738
Transient Heat Conduction Simulation around Microprocessor Die
NASA Astrophysics Data System (ADS)
Nishi, Koji
This paper explains the fundamental formulas for calculating the power consumption of CMOS (Complementary Metal-Oxide-Semiconductor) devices and their voltage and temperature dependency, and then introduces an equation for estimating the power consumption of a notebook PC (Personal Computer) microprocessor. The equation is applied to a heat conduction simulation with a simplified thermal model and evaluated with sub-millisecond time steps. In addition, the microprocessor has two major heat conduction paths: one from the top of the silicon die via the thermal solution, and the other from the package substrate and pins via the PGA (Pin Grid Array) socket. Even though the former path dominates, the latter path, from the package substrate and pins, plays an important role in transient heat conduction behavior. This paper therefore focuses on the path from the package substrate and pins and investigates a more accurate method of estimating the heat conduction paths of the microprocessor. The cooling-performance expression for the heatsink fan is also a key point in assuring practical accuracy, while a finer expression requires more computational resources and hence longer computation time. The paper discusses an expression that minimizes the computational workload while keeping practical accuracy of the result.
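The abstract does not reproduce the power equation itself, but the standard first-order CMOS power model it refers to (dynamic switching power plus temperature-dependent leakage) can be sketched as follows. All coefficients below are illustrative placeholders, not values from the paper:

```python
def cmos_power(v, f_hz, temp_c,
               c_eff=1.0e-9,       # effective switched capacitance [F] (illustrative)
               activity=0.2,       # average switching activity factor (illustrative)
               i_leak_25c=0.5,     # leakage current at 25 degC, nominal V [A] (illustrative)
               leak_doubling_c=10.0):
    """First-order CMOS power: P = a*C*V^2*f + V*I_leak(T).

    Leakage is modeled as doubling every `leak_doubling_c` degrees,
    a common rule-of-thumb approximation.
    """
    p_dynamic = activity * c_eff * v * v * f_hz
    i_leak = i_leak_25c * 2.0 ** ((temp_c - 25.0) / leak_doubling_c)
    p_static = v * i_leak
    return p_dynamic + p_static

# Power rises with both supply voltage and die temperature:
p_cool = cmos_power(v=1.0, f_hz=2.0e9, temp_c=40.0)
p_hot  = cmos_power(v=1.0, f_hz=2.0e9, temp_c=80.0)
```

This temperature dependence is exactly what couples the power model to the transient thermal simulation: a hotter die leaks more, which feeds back as additional heat.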
Kim, Yoonsang; Emery, Sherry
2013-01-01
Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415
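As a concrete illustration of the Gauss-Hermite method the article reviews: the marginal likelihood of a random-intercept logistic model integrates the conditional likelihood over the random effect, and Gauss-Hermite quadrature replaces that integral with a weighted sum over fixed nodes. The sketch below (plain NumPy, not any of the reviewed packages) approximates the one-dimensional integral for a single cluster; the data and linear predictor are made up for illustration:

```python
import numpy as np

def cluster_marginal_lik(y, eta_fixed, sigma_b, n_nodes=20):
    """Marginal likelihood of one cluster in a random-intercept
    logistic model, via Gauss-Hermite quadrature:

        L = (1/sqrt(pi)) * sum_k w_k * prod_j p(y_j | eta_j + sqrt(2)*sigma_b*x_k)
    """
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    b_vals = np.sqrt(2.0) * sigma_b * nodes          # random-intercept values at the nodes
    lik = 0.0
    for b, w in zip(b_vals, weights):
        p = 1.0 / (1.0 + np.exp(-(eta_fixed + b)))   # conditional response probabilities
        lik += w * np.prod(np.where(y == 1, p, 1.0 - p))
    return lik / np.sqrt(np.pi)

y = np.array([1, 0, 1, 1])                 # binary responses for one cluster (illustrative)
eta = np.array([0.2, -0.1, 0.4, 0.0])      # fixed-effect linear predictor (illustrative)
gh = cluster_marginal_lik(y, eta, sigma_b=1.0)
```

With 20 nodes the quadrature already agrees closely with a brute-force dense integration, which is why adaptive Gauss-Hermite is the accuracy benchmark in comparisons like this one; the cost grows quickly, however, as the number of correlated random effects increases.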
XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.
Ching, Daniel J; Gürsoy, Doğa
2017-03-01
The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.
Micro Computer Tomography for medical device and pharmaceutical packaging analysis.
Hindelang, Florine; Zurbach, Raphael; Roggo, Yves
2015-04-10
Biomedical device and medicine product manufacturing are long processes facing global competition. As technology evolves with time, the level of quality, safety and reliability increases simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing a deep investigation of products: it can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed CT suitability for verification of integrity, measurements and defect detections in a non-destructive manner.
1984-12-01
Development of a User Support Package for CPESIM II. Appendices include a CPESIM II Student Manual, a CPESIM II Instructor Manual, and an abridged report. A student and instructor user's manual is provided. The process of collecting student changes was a manual one; the student changes should be collected into a database to ease the instructor workload and to provide a "history" of the evolution of the package.
Radiation damage study of thin YAG:Ce scintillator using low-energy protons
NASA Astrophysics Data System (ADS)
Novotný, P.; Linhart, V.
2017-07-01
The radiation hardness of a 50 μm thin YAG:Ce scintillator, in the form of the dependence of signal efficiency on 3.1 MeV proton fluence, was measured and analysed using an X-ray beam. The signal efficiency is the ratio of the signals given by a CCD chip after and before radiation damage. The CCD chip was placed outside the primary beam to protect it from radiation damage. Using simplified assumptions, the 3.1 MeV proton fluences were converted to: 150 MeV proton fluences, to estimate the radiation damage of this sample under the conditions at proton therapy centres during medical treatment; 150 MeV proton doses, to allow comparison of the radiation hardness of the studied sample with that of other detectors used in medical physics; and 1 MeV neutron equivalent fluences, to compare the radiation hardness of the studied sample with the properties of position-sensitive silicon and diamond detectors used in nuclear and particle physics. The following results were obtained. The signal efficiency of the studied sample varies only slightly (±3%) up to a 3.1 MeV proton fluence of c. (4-8) × 10^14 cm^-2. This limit is equivalent to a 150 MeV proton fluence of (5-9) × 10^16 cm^-2, a 150 MeV proton dose of (350-600) kGy and a 1 MeV neutron fluence of (1-2) × 10^16 cm^-2. Beyond this limit, the signal efficiency gradually decreases. A fifty percent decrease in signal efficiency is reached around a 3.1 MeV proton fluence of (1-2) × 10^16 cm^-2, which is equivalent to a 150 MeV proton fluence of around 2 × 10^18 cm^-2, a 150 MeV proton dose of around 15 MGy and a 1 MeV neutron equivalent fluence of (4-8) × 10^17 cm^-2. In contrast to position-sensitive silicon and diamond radiation detectors, the studied sample has at least two orders of magnitude greater radiation resistance. Therefore, YAG:Ce scintillator is a suitable material for monitoring primary beams of ionizing radiation.
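The conversions described above follow the usual NIEL (non-ionizing energy loss) scaling idea: a fluence of one particle type is mapped to an equivalent fluence of another by the ratio of their displacement-damage factors. A minimal sketch, with placeholder damage factors (real values depend on the material and are not given in the abstract):

```python
def equivalent_fluence(fluence, damage_factor_src, damage_factor_ref):
    """NIEL-style scaling: two fluences are damage-equivalent when
    fluence * damage_factor is equal, so
        phi_ref = phi_src * (D_src / D_ref).
    The damage factors used here are hypothetical placeholders.
    """
    return fluence * damage_factor_src / damage_factor_ref

# Hypothetical damage factors (arbitrary units), chosen only to reproduce
# the ~25x ratio implied by the abstract's numbers
# (4e14 p/cm^2 at 3.1 MeV ~ 1e16 n_eq/cm^2 at 1 MeV):
D_p_3MeV = 25.0
D_n_1MeV = 1.0
phi_neq = equivalent_fluence(4.0e14, D_p_3MeV, D_n_1MeV)  # -> 1.0e16
```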
Case Study: Audio-Guided Learning, with Computer Graphics.
ERIC Educational Resources Information Center
Koumi, Jack; Daniels, Judith
1994-01-01
Describes teaching packages which involve the use of audiotape recordings with personal computers in Open University (United Kingdom) mathematics courses. Topics addressed include software development; computer graphics; pedagogic principles for distance education; feedback, including course evaluations and student surveys; and future plans.…
Computer Applications in Teaching and Learning.
ERIC Educational Resources Information Center
Halley, Fred S.; And Others
Some examples of the use of computers in teaching and learning are examination generation, automatic exam grading, student tracking, problem generation, computational examination generators, program packages, simulation, and programming skills for problem solving. These applications are non-trivial and do fulfill the basic assumptions necessary…
Software for Computing, Archiving, and Querying Semisimple Braided Monoidal Category Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
This software package collects various open source and freely available codes and algorithms to compute and archive the categorical data for certain semisimple braided monoidal categories. In particular, it computes the data for group-theoretical categories for academic research.
Hypercard: Another Computer Tool.
ERIC Educational Resources Information Center
Geske, Joel
1991-01-01
Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…
An Object-Oriented Serial DSMC Simulation Package
NASA Astrophysics Data System (ADS)
Liu, Hongli; Cai, Chunpei
2011-05-01
A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. The package utilizes the concept of a simulation engine, many C++ features, and software design patterns, and has an open architecture that benefits further development and maintenance of the code. To reduce the engineering time for three-dimensional models, a hybrid grid scheme combined with a flexible data structure implemented in C++ is employed in the package. This scheme uses a local data structure based on the computational cell to achieve high performance on workstation processors. The data structure allows the DSMC algorithm to be efficiently parallelized with domain decomposition, and it provides much flexibility in terms of grid types. The package can use traditional structured, unstructured, or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that the package has satisfactory accuracy for complex rarefied gas flows.
A Software Development Approach for Computer Assisted Language Learning
ERIC Educational Resources Information Center
Cushion, Steve
2005-01-01
Over the last 5 years we have developed, produced, tested, and evaluated an authoring software package to produce web-based, interactive, audio-enhanced language-learning material. That authoring package has been used to produce language-learning material in French, Spanish, German, Arabic, and Tamil. We are currently working on increasing…
1980-06-01
courseware package on how to program lessons for an automated system. Since PLANIT (Programming Language for Interactive Teaching) is the student/author...assisted instruction (CAI), how to program PLANIT lessons, and to evaluate the effectiveness of the package for select Army users. The resultant courseware
Effectiveness of Simulation in a Hybrid and Online Networking Course.
ERIC Educational Resources Information Center
Cameron, Brian H.
2003-01-01
Reports on a study that compares the performance of students enrolled in two sections of a Web-based computer networking course: one utilizing a simulation package and the second utilizing a static, graphical software package. Analysis shows statistically significant improvements in performance in the simulation group compared to the…
Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages
ERIC Educational Resources Information Center
Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro
2017-01-01
Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…
Light dosimetry and dose verification for pleural PDT
NASA Astrophysics Data System (ADS)
Dimofte, Andreea; Sharikova, Anna V.; Meo, Julia L.; Simone, Charles B.; Friedberg, Joseph S.; Zhu, Timothy C.
2013-03-01
In-vivo light dosimetry for patients undergoing photodynamic therapy (PDT) is critical for predicting PDT outcome. Patients in this study are enrolled in a Phase I clinical trial of HPPH-mediated PDT for the treatment of non-small cell lung cancer with pleural effusion. They are administered 4mg per kg body weight HPPH 48 hours before the surgery and receive light therapy with a fluence of 15-45 J/cm2 at 661 and 665nm. Fluence rate (mW/cm2) and cumulative fluence (J/cm2) are monitored at 7 sites during the light treatment delivery using isotropic detectors. Light fluence (rate) delivered to patients is examined as a function of treatment time, volume and surface area. In a previous study, a correlation between the treatment time and the treatment volume and surface area was established. However, we did not include the direct light and the effect of the shape of the pleural surface on the scattered light. A real-time infrared (IR) navigation system was used to separate the contribution from the direct light. An improved expression that accurately calculates the total fluence at the cavity wall as a function of light source location, cavity geometry and optical properties is determined based on theoretical and phantom studies. The theoretical study includes an expression for light fluence rate in an elliptical geometry instead of the spheroid geometry used previously. The calculated light fluence is compared to the measured fluence in patients of different cavity geometries and optical properties. The result can be used as a clinical guideline for future pleural PDT treatment.
Li, Wenhai; Liu, Chengyi; Chen, Zhou; Cai, Lin; Zhou, Cheng; Xu, Qianxi; Li, Houmin; Zhang, Jianzhong
2016-11-01
High-fluence diode lasers with contact cooling have emerged as the gold standard to remove unwanted hair. Lowering the energy should result in less pain and could theoretically affect the efficacy of the therapy. To compare the safety and efficacy of a low fluence high repetition rate 810-nm diode laser to those of a high fluence, low repetition rate diode laser for permanent axillary hair removal in Chinese women. Ninety-two Chinese women received four axillae laser hair removal treatments at 4-week intervals using the low fluence, high repetition rate 810-nm diode laser in super hair removal (SHR) mode on one side and the high fluence, low repetition rate diode laser in hair removal (HR) mode on the other side. Hair counts were done at each follow-up visit and 6-month follow-up after the final laser treatment using a "Hi Quality Hair Analysis Program System"; the immediate pain score after each treatment session was recorded by a visual analog scale. The overall median reduction of hair was 90.2% with the 810-nm diode laser in SHR mode and 87% with the same laser in HR mode at 6-month follow-up. The median pain scores in SHR mode and in HR mode were 2.75 and 6.75, respectively. Low fluence, high repetition rate diode laser can efficiently remove unwanted hair but also significantly improve tolerability and reduce adverse events during the course of treatment.
A quality assurance program for clinical PDT
NASA Astrophysics Data System (ADS)
Dimofte, Andreea; Finlay, Jarod; Ong, Yi Hong; Zhu, Timothy C.
2018-02-01
Successful outcome of photodynamic therapy (PDT) depends on accurate delivery of the prescribed light dose, and a quality assurance program is necessary to ensure that light dosimetry is correctly measured. We have instituted a QA program that includes examination of the long-term calibration uncertainty of isotropic detectors for light fluence rate, power meter head intercomparison for laser power, stability of the light-emitting diode (LED) integrating-sphere light source as a light fluence standard, laser output, and calibration of in-vivo reflective fluorescence and absorption spectrometers. We examined the long-term calibration uncertainty of isotropic detector sensitivity, defined as fluence rate per voltage. The detector is calibrated against the known light fluence rate of the LED source built into an internally baffled 4" integrating sphere; the LED sources themselves were examined using a 1 mm diameter isotropic detector calibrated in a collimated beam. Wavelengths from 632 nm to 690 nm were used. The internal LED method gives an overall calibration accuracy of +/- 4%. Intercomparison among power meters was performed to determine the consistency of laser power and light fluence rate measured by different power meters. A comparison of power and fluence readings among several power heads shows long-term consistency of the power and light fluence rate calibration to within 3%, regardless of wavelength. The standard LED light source is also used to calibrate the transmission difference between channels for the diffuse reflective absorption and fluorescence contact probe, as well as for the isotropic detectors used in the PDT dose dosimeter.
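The detector-sensitivity check described above reduces to simple arithmetic: calibrate each isotropic detector against the known fluence rate of the LED sphere, then track how the calibration factor drifts between sessions. A sketch with illustrative numbers (not values from the program):

```python
def calibration_factor(known_fluence_rate, measured_voltage):
    """Detector sensitivity, defined as fluence rate per voltage [mW/cm^2 per V]."""
    return known_fluence_rate / measured_voltage

def percent_drift(factor_old, factor_new):
    """Relative change between two calibration sessions, in percent."""
    return 100.0 * (factor_new - factor_old) / factor_old

# Illustrative values: the LED sphere delivers a known 50 mW/cm^2.
k_old = calibration_factor(50.0, measured_voltage=2.00)   # 25.0 mW/cm^2 per V
k_new = calibration_factor(50.0, measured_voltage=1.92)
drift = percent_drift(k_old, k_new)                       # ~ +4.2%
```

A drift beyond the program's stated +/-4% accuracy would flag the detector for recalibration.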
Impact of Mg-ion implantation with various fluence ranges on optical properties of n-type GaN
NASA Astrophysics Data System (ADS)
Tsuge, Hirofumi; Ikeda, Kiyoji; Kato, Shigeki; Nishimura, Tomoaki; Nakamura, Tohru; Kuriyama, Kazuo; Mishima, Tomoyoshi
2017-10-01
Optical characteristics of Mg-ion implanted GaN layers with various fluence ranges were evaluated. Mg ion implantation was performed twice, at energies of 30 and 60 keV, on n-GaN layers. The first implantation at 30 keV was performed with three different fluences of 1.0 × 10^14, 1.0 × 10^15 and 5.0 × 10^15 cm^-2. The second implantation, at an energy of 60 keV, was performed with a fluence of 6.5 × 10^13 cm^-2. After implantation, samples were annealed at 1250 °C for 1 min under an N2 atmosphere. The photoluminescence (PL) spectrum of the GaN layer implanted with Mg at the fluence of 1.0 × 10^14 cm^-2 at 30 keV was similar to that of Mg-doped p-GaN layers grown by MOVPE (Metal-Organic Vapor Phase Epitaxy) on free-standing GaN substrates, while the spectra at fluences of 1.0 × 10^15 cm^-2 and above were largely degraded.
Fast determination of the spatially distributed photon fluence for light dose evaluation of PDT
NASA Astrophysics Data System (ADS)
Zhao, Kuanxin; Chen, Weiting; Li, Tongxin; Yan, Panpan; Qin, Zhuanping; Zhao, Huijuan
2018-02-01
Photodynamic therapy (PDT) has shown the advantages of noninvasiveness and high efficiency in the treatment of early-stage skin cancer. Rapid and accurate determination of the spatially distributed photon fluence in turbid tissue is essential for the dosimetry evaluation of PDT. Photon fluence can be obtained accurately by Monte Carlo (MC) methods, but at great computational cost, especially for complex light source modes or online real-time dosimetry evaluation of PDT. In this work, a method to rapidly calculate the spatially distributed photon fluence in a turbid medium is proposed, implementing a classical perturbation and iteration theory on mesh Monte Carlo (MMC). In the proposed method, the photon fluence is obtained by superposing a perturbed, iteratively computed solution, accounting for defects in the turbid medium, onto an unperturbed solution for the background medium, so that repetitive MMC simulations can be avoided. To validate the method, a non-melanoma skin cancer model is used. The simulation results show that the photon fluence can be obtained quickly and correctly by the perturbation algorithm.
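The key idea, avoiding a fresh Monte Carlo run for each perturbed absorption value, can be illustrated with the classic path-length reweighting trick: when absorption is handled by weight attenuation, the photon paths from one baseline simulation suffice to evaluate any perturbed absorption coefficient. This toy 1D slab sketch is not the authors' mesh-based code, just the underlying principle:

```python
import math
import random

def track_paths(n_photons, mu_s, thickness, rng):
    """Trace photon random walks through a 1D slab with scattering
    coefficient mu_s. Absorption is NOT sampled here; it is applied
    afterwards as a weight exp(-mu_a * L), so the stored geometric
    paths remain valid for ANY absorption coefficient.
    Returns the total path length L of each transmitted photon."""
    transmitted = []
    for _ in range(n_photons):
        z, mu_z, length = 0.0, 1.0, 0.0              # start at front face, heading in
        while True:
            step = -math.log(1.0 - rng.random()) / mu_s   # distance to next scatter
            new_z = z + mu_z * step
            if new_z > thickness:                    # crosses the back face
                length += (thickness - z) / mu_z     # count path only up to the exit
                transmitted.append(length)
                break
            if new_z < 0.0:                          # escapes back out the front face
                break
            z, length = new_z, length + step
            mu_z = 2.0 * rng.random() - 1.0          # isotropic rescatter (1D cosine)
    return transmitted

def transmittance(paths, n_photons, mu_a):
    """Transmitted weight fraction for absorption mu_a, reusing stored paths."""
    return sum(math.exp(-mu_a * L) for L in paths) / n_photons

N = 30000
paths = track_paths(N, mu_s=2.0, thickness=1.0, rng=random.Random(1))
T_base = transmittance(paths, N, mu_a=0.1)   # baseline medium
T_pert = transmittance(paths, N, mu_a=0.3)   # perturbed medium, no re-simulation

# Cross-check: an independent fresh simulation tracked directly at mu_a = 0.3
paths2 = track_paths(N, mu_s=2.0, thickness=1.0, rng=random.Random(2))
T_direct = transmittance(paths2, N, mu_a=0.3)
```

The reweighted estimate agrees with the direct run to within statistical noise, while costing only one pass over the stored path lengths.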
Laser fluence dependence on emission dynamics of ultrafast laser induced copper plasma
Anoop, K. K.; Harilal, S. S.; Philip, Reji; ...
2016-11-14
The characteristic emission features of a laser-produced plasma depend strongly on the laser fluence. We investigated the spatial and temporal dynamics of neutrals and ions in femtosecond laser (800 nm, ≈40 fs, Ti:Sapphire) induced copper plasma in vacuum using both optical emission spectroscopy (OES) and spectrally resolved two-dimensional (2D) imaging over a wide fluence range of 0.5-77.5 J/cm^2. 2D fast-gated monochromatic images showed distinct plume splitting between the neutrals and ions, especially at moderate to high fluences. OES studies in the low to moderate laser fluence regime confirm that neutral line emission dominates over ion emission, whereas this trend reverses at higher laser fluence, where ion emission dominates. This evidences a clear change in the physical processes involved in femtosecond laser-matter interaction at high input laser intensity. The ion dynamics obtained from OES and spectrally resolved 2D imaging were compared with charged particle measurements employing a Faraday cup and a Langmuir probe, and the results showed good correlation.
Study of the effects of neutron irradiation on silicon strip detectors
NASA Astrophysics Data System (ADS)
Guibellino, P.; Panizza, G.; Hall, G.; Sotthibandhu, S.; Ziock, H. J.; Ferguson, P.; Sommer, W. F.; Edwards, M.; Cartiglia, N.; Hubbard, B.; Lesloe, J.; Pitzl, D.; O'Shaughnessy, K.; Rowe, W.; Sadoziski, H. F.-W.; Seiden, A.; Spencer, E.
1992-05-01
Silicon strip detectors and test structures were exposed to neutron fluences up to Φ = 6.1 × 10^14 n/cm^2, using the ISIS neutron source at the Rutherford Appleton Laboratory (UK). In this paper we report some of our results concerning the effects of displacement damage, with a comparison of devices made of silicon of different resistivity. The various samples exposed showed a very similar dependence of the leakage current on the fluence received. We studied the change of effective doping concentration, and observed a behaviour suggesting the onset of type inversion at a fluence of ~2.0 × 10^13 n/cm^2, a value which depends on the initial doping concentration. The linear increase of the depletion voltage for fluences higher than the inversion point could eventually determine the maximum fluence tolerable by silicon detectors.
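The type-inversion behaviour reported above is commonly parametrized as exponential donor removal plus linear introduction of radiation-induced acceptors, with the depletion voltage tracking the magnitude of the effective doping. A minimal sketch of that standard picture; the constants are illustrative placeholders, not fitted to these data:

```python
import math

def n_eff(phi, n_d0=3.0e12, c=1.0e-13, beta=0.02):
    """Effective doping [cm^-3] vs. fluence phi [n/cm^2]:
        N_eff = N_D0 * exp(-c * phi) - beta * phi
    (donor removal + acceptor introduction). Positive = n-type,
    negative = type-inverted (effectively p-type).
    All constants are illustrative placeholders."""
    return n_d0 * math.exp(-c * phi) - beta * phi

def depletion_voltage(phi, thickness_cm=0.03, eps=1.04e-12, q=1.6e-19):
    """Full-depletion voltage V_dep = q * |N_eff| * d^2 / (2 * eps),
    for a 300 um thick detector (eps = silicon permittivity in F/cm)."""
    return q * abs(n_eff(phi)) * thickness_cm ** 2 / (2.0 * eps)

# Before irradiation the bulk is n-type; at high fluence it inverts,
# after which |N_eff| (and hence V_dep) grows roughly linearly:
sign_before = n_eff(0.0) > 0.0
sign_after = n_eff(5.0e13) > 0.0
```

With these placeholder constants the sign change occurs near 2 × 10^13 n/cm^2, mirroring the inversion fluence quoted in the abstract.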
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, Anne A.; Katoh, Yutai; Snead, Mary A.
A new, fine-grain nuclear graphite, grade G347A from Tokai Carbon Co., Ltd., has been irradiated in the High Flux Isotope Reactor at Oak Ridge National Laboratory to study the materials property changes that occur when it is exposed to neutron irradiation at temperatures of interest for Generation-IV nuclear reactor applications. Specimen temperatures ranged from 290 °C to 800 °C with a maximum neutron fluence of 40 × 10^25 n/m^2 [E > 0.1 MeV] (~30 dpa). Observed behaviors include: anisotropic dimensional change in an isotropic graphite; parabolic fluence dependence of Young's modulus; electrical resistivity increasing at low fluence with an additional increase at high fluence; thermal conductivity decreasing rapidly at low fluence followed by continued degradation; and a similar plateau value of the mean coefficient of thermal expansion for all irradiation temperatures.
Anisotropy of the neutron fluence from a plasma focus.
NASA Technical Reports Server (NTRS)
Lee, J. H.; Shomo, L. P.; Kim, K. H.
1972-01-01
The fluence of neutrons from a plasma focus was measured by gamma spectrometry of an activated silver target. This method results in a significant increase in accuracy over the beta-counting method. Multiple detectors were used in order to measure the anisotropy of the fluence of neutrons. The fluence was found to be concentrated in a cone with a half-angle of 30 deg about the axis, and to drop off rapidly outside of this cone; the anisotropy was found to depend upon the total yield of neutrons. This dependence was strongest on the axis. Neither the axial concentration of the fluence of neutrons nor its dependence on the total yield of neutrons is explained by any of the currently proposed models. Some other explanations, including the possibility of an axially distributed source, are considered.
System and Method for Determining Fluence of a Substance
NASA Technical Reports Server (NTRS)
Banks, Bruce A. (Inventor)
2016-01-01
A system and method for measuring a fluence of gas are disclosed. The system has a first light detector capable of outputting an electrical signal based on the amount of light received. A barrier is positionable adjacent to the first light detector and is susceptible to a change in dimension from the fluence of the gas. The barrier blocks a portion of the light from being received by the first light detector, so a change in the barrier's dimension changes the electrical signal output from the first light detector. A second light detector is positionable to receive light representative of that at the first light detector without the barrier. The system and method have broad application in detecting the fluence of gases that cause erosion or chemical reactions leading to erosive deterioration. One application is in low Earth orbit, for detecting the fluence of atomic oxygen.
Budiarto, E; Keijzer, M; Storchi, P R M; Heemink, A W; Breedveld, S; Heijmen, B J M
2014-01-20
Radiotherapy dose delivery in the tumor and surrounding healthy tissues is affected by movements and deformations of the corresponding organs between fractions. The random variations may be characterized by non-rigid, anisotropic principal component analysis (PCA) modes. In this article new dynamic dose deposition matrices, based on established PCA modes, are introduced as a tool to evaluate the mean and the variance of the dose at each target point resulting from any given set of fluence profiles. The method is tested for a simple cubic geometry and for a prostate case. The movements spread out the distributions of the mean dose and cause the variance of the dose to be highest near the edges of the beams. The non-rigidity and anisotropy of the movements are reflected in both quantities. The dynamic dose deposition matrices facilitate the inclusion of the mean and the variance of the dose in the existing fluence-profile optimizer for radiotherapy planning, to ensure robust plans with respect to the movements.
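The quantity that the dynamic dose deposition matrices deliver, the per-point mean and variance of dose under PCA-distributed organ motion, can be illustrated by brute-force sampling: displace each dose point along a PCA mode with a Gaussian coefficient and look up the static dose at the displaced location. A toy 1D sketch (not the authors' matrix formulation; the dose profile and mode are made up):

```python
import numpy as np

def static_dose(x):
    """Toy 1D static dose: a flat beam with soft edges at x = 2 and x = 8."""
    return (1.0 / (1.0 + np.exp(-(x - 2.0) / 0.3))
            * 1.0 / (1.0 + np.exp((x - 8.0) / 0.3)))

def dose_stats(points, mode, sigma, n_samples=5000, seed=0):
    """Mean and variance of dose at each point when points move as
    x + c * mode(x), with c ~ N(0, sigma^2). A single PCA mode is
    used for brevity; several modes would add independent coefficients."""
    rng = np.random.default_rng(seed)
    c = rng.normal(0.0, sigma, size=n_samples)
    doses = static_dose(points[None, :] + c[:, None] * mode[None, :])
    return doses.mean(axis=0), doses.var(axis=0)

x = np.linspace(0.0, 10.0, 101)
mode = np.ones_like(x)            # simplest displacement field: a uniform shift
mean_d, var_d = dose_stats(x, mode, sigma=0.5)
```

As in the paper, the variance concentrates near the beam edges, where a small displacement carries a point across the steep dose gradient, while the mean dose is smeared out there.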
An investigation of voxel geometries for MCNP-based radiation dose calculations.
Zhang, Juying; Bednarz, Bryan; Xu, X George
2006-11-01
Voxelized geometries such as those obtained from medical images are increasingly used in Monte Carlo calculations of absorbed dose. One useful application of the calculated absorbed dose is the determination of fluence-to-dose conversion factors for different organs. However, confusion still exists about how such a geometry is defined and how the energy deposition is best computed, especially with a popular code, MCNP5. This study investigated two different types of geometry definition in the MCNP5 code: cell and lattice definitions. A 10 cm x 10 cm x 10 cm test phantom, containing an embedded 2 cm x 2 cm x 2 cm target at its center, was considered, together with a planar source emitting parallel photons. The results revealed that MCNP5 does not calculate the total target volume for multi-voxel geometries; therefore, the user must divide tallies that involve the total target volume by the total number of voxels to obtain a correct dose result. Also, using planar source areas greater than the phantom size results in the same fluence-to-dose conversion factor.
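The practical correction identified above, dividing a lattice tally by the number of voxels, is simple arithmetic. A sketch with made-up numbers (not actual MCNP output):

```python
def corrected_dose(summed_tally, n_voxels):
    """Per the study's finding: MCNP5 does not compute the total target
    volume for a multi-voxel (lattice) geometry, so a tally that should
    be normalized to the whole target must be divided by the user by
    the number of voxels making up the target."""
    return summed_tally / n_voxels

# Hypothetical example: a 2 cm cubic target represented as 8 voxels of 1 cm^3.
raw = 8.0e-6                                  # summed lattice tally (made-up value)
dose = corrected_dose(raw, n_voxels=8)        # -> 1.0e-6
```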
NASA Astrophysics Data System (ADS)
Torsello, Daniele; Mino, Lorenzo; Bonino, Valentina; Agostino, Angelo; Operti, Lorenza; Borfecchia, Elisa; Vittone, Ettore; Lamberti, Carlo; Truccato, Marco
2018-01-01
We investigate the microscopic mechanism responsible for the change of macroscopic electrical properties of the Bi2Sr2CaCu2O8+δ high-temperature superconductor induced by intense synchrotron hard x-ray beams. The possible effects of secondary electrons on the oxygen content via the knock-on interaction are studied by Monte Carlo simulations. The change in the oxygen content expected from the knock-on model is computed by convoluting the fluence of photogenerated electrons in the material with the Seitz-Koehler cross section. This approach has been adopted to analyze several experimental irradiation sessions with increasing x-ray fluences. A close comparison between the expected variations in oxygen content and the experimental results allows determining the irradiation regime in which the knock-on mechanism can satisfactorily explain the observed changes. Finally, we estimate the threshold displacement energy of loosely bound oxygen atoms in this material, Td = 0.15 (+0.025/-0.01) eV.
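The convolution at the heart of the knock-on estimate (displacements per atom from an electron fluence spectrum times a displacement cross section) can be sketched numerically. Both spectra below are made-up placeholders: the fluence is an assumed exponential, and the cross section is a toy threshold model, not the Seitz-Koehler expression or the paper's simulated data:

```python
import numpy as np

# dpa ~= integral over energy of phi(E) * sigma(E) dE
E = np.linspace(0.05, 0.5, 400)                 # electron energy grid (MeV)
phi = 1e22 * np.exp(-E / 0.1)                   # assumed fluence spectrum (1/m^2/MeV)
E_thr = 0.12                                    # assumed displacement threshold (MeV)
sigma = np.where(E > E_thr, 1e-28 * (1.0 - E_thr / E), 0.0)  # toy cross section (m^2)

# trapezoidal integration (written out to avoid version-specific helpers)
f = phi * sigma
dpa = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(E)))  # displacements per atom
```

Comparing a dpa estimate of this form against the measured oxygen-content change is what lets the authors delimit the regime where knock-on alone explains the observations.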
Effects of neutron irradiation on pinning force scaling in state-of-the-art Nb3Sn wires
NASA Astrophysics Data System (ADS)
Baumgartner, T.; Eisterer, M.; Weber, H. W.; Flükiger, R.; Scheuerlein, C.; Bottura, L.
2014-01-01
We present an extensive irradiation study involving five state-of-the-art Nb3Sn wires which were subjected to sequential neutron irradiation up to a fast neutron fluence of 1.6 × 10^22 m^-2 (E > 0.1 MeV). The volume pinning force of short wire samples was assessed in the temperature range from 4.2 to 15 K in applied fields of up to 7 T by means of SQUID magnetometry in the unirradiated state and after each irradiation step. Pinning force scaling computations revealed that the exponents in the pinning force function differ significantly from those expected for pure grain boundary pinning, and that fast neutron irradiation causes a substantial change in the functional dependence of the volume pinning force. A model is presented, which describes the pinning force function of irradiated wires using a two-component ansatz involving a point-pinning contribution stemming from radiation induced pinning centers. The dependence of this point-pinning contribution on fast neutron fluence appears to be a universal function for all examined wire types.
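A two-component ansatz of the class described above can be sketched in reduced field b = B/Bc2. The exponents used here are the textbook Dew-Hughes values for grain-boundary pinning (p = 0.5, q = 2) and point pinning (p = 1, q = 2), assumed for illustration; the paper fits its own exponents and fluence-dependent weight:

```python
import numpy as np

def pinning_force(b, w):
    """Normalized two-component pinning force: weight w of point pinning."""
    gb = b**0.5 * (1.0 - b) ** 2     # grain-boundary pinning term
    pt = b**1.0 * (1.0 - b) ** 2     # point pinning from radiation-induced defects
    return (1.0 - w) * gb + w * pt

b = np.linspace(1e-4, 1.0, 2001)
peak_unirr = b[np.argmax(pinning_force(b, 0.0))]  # pure grain-boundary pinning peaks at b = 0.2
peak_irr = b[np.argmax(pinning_force(b, 0.6))]    # point-pinning admixture shifts the peak upward
```

The shift of the pinning-force maximum toward higher reduced fields with increasing point-pinning weight is the kind of change in functional dependence that sequential irradiation reveals.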
ERIC Educational Resources Information Center
Nee, John G.; Kare, Audhut P.
1987-01-01
Explores several concepts in computer-assisted design/computer-assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)
Some Experience with Interactive Computing in Teaching Introductory Statistics.
ERIC Educational Resources Information Center
Diegert, Carl
Students in two biostatistics courses at the Cornell Medical College and in a course in applications of computer science given in Cornell's School of Industrial Engineering were given access to an interactive package of computer programs enabling them to perform statistical analysis without the burden of hand computation. After a general…
The Implications of Cognitive Psychology for Computer-Based Learning Tools.
ERIC Educational Resources Information Center
Kozma, Robert B.
1987-01-01
Defines cognitive computer tools as software programs that use the control capabilities of computers to amplify, extend, or enhance human cognition; suggests seven ways in which computers can aid learning; and describes the "Learning Tool," a software package for the Apple Macintosh microcomputer that is designed to aid learning of…
Commonsense System Pricing; Or, How Much Will that $1,200 Computer Really Cost?
ERIC Educational Resources Information Center
Crawford, Walt
1984-01-01
Three methods employed to price and sell computer equipment are discussed: computer pricing, hardware pricing, and system pricing (a system comprising the complete computer, the supporting hardware, and a relatively complete software package). Advantages of system pricing are detailed, the author's system is described, and 10 systems currently available are…
Numerical Package in Computer Supported Numeric Analysis Teaching
ERIC Educational Resources Information Center
Tezer, Murat
2007-01-01
In university faculties of Engineering, Sciences, Business and Economics, and in higher education in Computing, calculators and computers can be used in Numerical Analysis (NA) because of the difficulty of hand computation. In this study, computer-supported learning of NA will be discussed together with important usage of the…
Equivalent electron fluence for space qualification of shallow junction heteroface GaAs solar cells
NASA Technical Reports Server (NTRS)
Wilson, J. W.; Stock, L. V.
1984-01-01
It is desirable to perform qualification tests prior to deployment of solar cells in space power applications. Such test procedures are complicated by the complex mixture of differing radiation components in space which are difficult to simulate in ground test facilities. Although it has been shown that an equivalent electron fluence ratio cannot be uniquely defined for monoenergetic proton exposure of GaAs shallow junction cells, an equivalent electron fluence test can be defined for common spectral components of protons found in space. Equivalent electron fluence levels for the geosynchronous environment are presented.
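The notion of an equivalent electron fluence can be sketched as a damage-weighted sum over the proton spectrum, referenced to 1-MeV electrons. All numbers below are hypothetical placeholders, not the GaAs damage coefficients of the report:

```python
# Equivalent 1-MeV electron fluence: weight each proton fluence bin by a
# relative damage coefficient, then divide by the electron reference.
proton_fluence = {1.0: 1e13, 5.0: 5e12, 10.0: 2e12}       # energy (MeV) -> fluence (cm^-2), assumed
damage_coeff_p = {1.0: 3000.0, 5.0: 1500.0, 10.0: 900.0}  # relative damage per proton, assumed
damage_coeff_e = 1.0                                      # 1-MeV electron reference

equivalent_e_fluence = sum(
    phi * damage_coeff_p[E] for E, phi in proton_fluence.items()
) / damage_coeff_e
```

The abstract's point is that such a ratio is not unique for monoenergetic protons but becomes a usable test quantity when the weighting is done over the common spectral shapes found in space.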
Teaching "Filing Rules"--Via Computer-Aided Instruction.
ERIC Educational Resources Information Center
Agneberg, Craig
A computer software package has been developed to teach and test students on the Rules for Alphabetical Filing of the Association of Records Managers and Administrators (ARMA). The following computer assisted instruction principles were used in developing the program: gaining attention, stating objectives, providing direction, reviewing…
Gang, G J; Siewerdsen, J H; Stayman, J W
2017-02-11
This work presents a task-driven joint optimization of fluence field modulation (FFM) and regularization in quadratic penalized-likelihood (PL) reconstruction. Conventional FFM strategies proposed for filtered-backprojection (FBP) are evaluated in the context of PL reconstruction for comparison. We present a task-driven framework that leverages prior knowledge of the patient anatomy and imaging task to identify FFM and regularization. We adopted a maxi-min objective that ensures a minimum level of detectability index (d') across sample locations in the image volume. The FFM designs were parameterized by 2D Gaussian basis functions to reduce the dimensionality of the optimization, and basis function coefficients were estimated using the covariance matrix adaptation evolutionary strategy (CMA-ES) algorithm. The FFM was jointly optimized with both space-invariant and spatially varying regularization strength (β): the former via an exhaustive search through discrete values and the latter using an alternating optimization in which β was exhaustively optimized locally and interpolated to form a spatially varying map. The optimal FFM inverts as β increases, demonstrating the importance of a joint optimization. For the task and object investigated, the optimal FFM assigns more fluence through less attenuating views, counter to conventional FFM schemes proposed for FBP. The maxi-min objective homogenizes detectability throughout the image and achieves a higher minimum detectability than conventional FFM strategies. The task-driven FFM designs found in this work are counter to conventional patterns for FBP and yield better performance in terms of the maxi-min objective, suggesting opportunities for improved image quality and/or dose reduction when model-based reconstructions are applied in conjunction with FFM.
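The maxi-min objective can be sketched in a few lines: evaluate detectability at several sample locations for a candidate design and score the design by its *worst* location, which the optimizer (CMA-ES in the paper) then maximizes. The detectability model below is a made-up stand-in for the full imaging-task computation:

```python
import numpy as np

def detectability(theta, locations):
    # toy model: fluence allocation theta trades off detectability per location
    return np.array([theta[i % len(theta)] / (1.0 + loc)
                     for i, loc in enumerate(locations)])

def maximin_objective(theta, locations):
    """Score a design by its minimum detectability across sample locations."""
    return detectability(theta, locations).min()

locations = [0.0, 1.0, 2.0]
uniform = np.array([1.0, 1.0, 1.0])
tailored = np.array([0.5, 1.0, 1.5])   # shift fluence toward the hardest locations
```

Under this toy model, the tailored allocation raises the worst-case detectability relative to the uniform one, which is the homogenizing behavior the abstract reports.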
HEPMath 1.4: A mathematica package for semi-automatic computations in high energy physics
NASA Astrophysics Data System (ADS)
Wiebusch, Martin
2015-10-01
This article introduces the Mathematica package HEPMath which provides a number of utilities and algorithms for High Energy Physics computations in Mathematica. Its functionality is similar to packages like FormCalc or FeynCalc, but it takes a more complete and extensible approach to implementing common High Energy Physics notations in the Mathematica language, in particular those related to tensors and index contractions. It also provides a more flexible method for the generation of numerical code which is based on new features for C code generation in Mathematica. In particular it can automatically generate Python extension modules which make the compiled functions callable from Python, thus eliminating the need to write any code in a low-level language like C or Fortran. It also contains seamless interfaces to LHAPDF, FeynArts, and LoopTools.
Mesoscale and Severe Storms (MASS) data management and analysis system
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.; Dickerson, M.
1984-01-01
Progress on the Mesoscale and Severe Storms (MASS) data management and analysis system is described. An interactive atmospheric database management software package to convert four types of data (Sounding, Single Level, Grid, Image) into standard random-access formats is implemented and integrated with the MASS AVE80 Series general-purpose plotting and graphics display data analysis software package. An interactive analysis and display graphics software package (AVE80) for analyzing large volumes of conventional and satellite-derived meteorological data is enhanced to provide imaging/color graphics display utilizing color video hardware integrated into the MASS computer system. Local and remote smart-terminal capability is provided by installing APPLE III computer systems within individual scientists' offices and integrating them with the MASS system, thus providing color video display, graphics, and character display of the four data types.
Out of the lab and into the fab: Nano-alignment as an enabler for Silicon Photonics' next chapter
NASA Astrophysics Data System (ADS)
Jordan, Scott
2017-06-01
The rapid advent of Silicon Photonics presents many challenges for test and packaging. Here we concisely review SiP device attributes that differ significantly from classical photonic configurations, with a view to the future beyond current, connectivity-oriented silicon photonics developments, looking to such endeavors as all-optical computing and quantum computing. The necessity for nano-precision alignment of optical elements in test and packaging operations quickly emerges as the unfilled need. We review the industrial test and packaging solutions developed during the 1997-2001 photonics boom to address the needs of that era's devices, and map their gaps against the new SiP device classes. Finally, we review recent state-of-the-art advances in the field that address these gaps.
The Effect of Low Energy Nitrogen Ion Implantation on Graphene Nanosheets
NASA Astrophysics Data System (ADS)
Mishra, Mukesh; Alwarappan, Subbiah; Kanjilal, Dinakar; Mohanty, Tanuja
2018-03-01
Herein, we report the effect of 50 keV nitrogen ion implantation at varying fluence on the optical properties of graphene nanosheets (number of layers < 5). Initially, graphene nanosheets synthesized by direct liquid exfoliation of graphite layers were deposited on a cleaned Si substrate by the drop-cast method. These graphene nanosheets were implanted with a 50 keV nitrogen-ion beam at six different fluences. Raman spectroscopic results show that the D, D' and G peaks broaden up to a nitrogen ion fluence of 1 × 10^15 ions/cm^2, while the 2D peak of the graphene nanosheets disappears for nitrogen-ion fluences above 10^14 ions/cm^2. Further increase of the fluence causes an indistinguishable superimposition of the D, D' and G peaks. Surface contact potential analysis of the ion-implanted graphene nanosheets shows an increase in defect concentration from 1.15 × 10^12 to 1.98 × 10^14 defects/cm^2 with increasing nitrogen ion fluence, consistent with a Fermi-level shift towards the conduction band. XRD spectra confirmed that the crystallinity of the graphene nanosheets deteriorates with increasing fluence. These results reveal that the graphene nanosheets withstand nitrogen ion implantation up to a limit of 10^15 ions/cm^2 in their vibrational behavior, which opens up the scope for application of graphene nanosheets in device fabrication for ion-active environments and space applications.
Test Generators: Teacher's Tool or Teacher's Headache?
ERIC Educational Resources Information Center
Eiser, Leslie
1988-01-01
Discusses the advantages and disadvantages of test generation programs. Includes setting up, printing exams and "bells and whistles." Reviews eight computer packages for Apple and IBM personal computers. Compares features, costs, and usage. (CW)
ERIC Educational Resources Information Center
Science and Children, 1990
1990-01-01
Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…
A User-Friendly Software Package for HIFU Simulation
NASA Astrophysics Data System (ADS)
Soneson, Joshua E.
2009-04-01
A freely-distributed, MATLAB (The Mathworks, Inc., Natick, MA)-based software package for simulating axisymmetric high-intensity focused ultrasound (HIFU) beams and their heating effects is discussed. The package (HIFU_Simulator) consists of a propagation module which solves the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation and a heating module which solves Pennes' bioheat transfer (BHT) equation. The pressure, intensity, heating rate, temperature, and thermal dose fields are computed and plotted, and the output is released to the MATLAB workspace for further user analysis or postprocessing.
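The heating module's governing equation, Pennes' bioheat transfer (BHT) equation, can be sketched with a minimal explicit finite-difference scheme (1D here; the package itself is axisymmetric, and the tissue parameters and focal heating term below are typical literature values assumed for illustration, not HIFU_Simulator's defaults):

```python
import numpy as np

# Pennes BHT:  rho*c * dT/dt = k * d2T/dx2 - w_b*c_b*(T - T_a) + Q
rho, c, k = 1000.0, 4180.0, 0.6      # tissue density (kg/m^3), heat capacity, conductivity
w_b, c_b, T_a = 5.0, 4180.0, 37.0    # perfusion rate (kg/m^3/s), blood heat capacity, arterial temp (C)

nx, dx, dt = 101, 1e-3, 0.01         # 10 cm domain, explicit time step (stable for these values)
x = np.arange(nx) * dx
T = np.full(nx, 37.0)
Q = 5e5 * np.exp(-((x - 0.05) / 0.005) ** 2)   # assumed focal heating rate (W/m^3)

for _ in range(500):                 # 5 s of continuous heating
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T = T + dt * (k * lap - w_b * c_b * (T - T_a) + Q) / (rho * c)
    T[0] = T[-1] = 37.0              # body-temperature boundaries

peak_rise = T.max() - 37.0           # temperature rise at the focus (C)
```

A thermal dose field of the kind the package reports would then be accumulated from this temperature history (e.g. via the CEM43 formalism).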
NASA Technical Reports Server (NTRS)
1984-01-01
Boeing Commercial Airplane Company's Flight Control Department engineers relied on a Langley-developed software package known as ORACLS to develop an advanced control synthesis package for both continuous and discrete control systems. The package was used by Boeing for computerized analysis of new system designs. Resulting applications include a multiple input/output control system for the terrain-following navigation equipment of the Air Force's B-1 bomber, and another for controlling in-flight changes of wing camber on an experimental airplane. ORACLS is one of 1,300 computer programs available from COSMIC.
Neural Network Prototyping Package Within IRAF
NASA Technical Reports Server (NTRS)
Bazell, David
1997-01-01
The purpose of this contract was to develop a neural network package within the IRAF environment to allow users to easily understand and use different neural network algorithms for the analysis of astronomical data. The package was developed for use within IRAF to allow portability to different computing environments and to provide a familiar and easy-to-use interface to the routines. In addition to developing the software and supporting documentation, we planned to use the system for the analysis of several sample problems to prove its viability and usefulness.
DAM package version 7807: Software fixes and enhancements
NASA Technical Reports Server (NTRS)
Schlosser, E.
1979-01-01
The Detection and Mapping package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered, formatted, and interpreted maps from digital LANDSAT multispectral scanner data. This report documents changes to the DAM package in support of its use by the Corps of Engineers for inventorying impounded surface water. Although these changes are presented in terms of their application to detecting and mapping surface water, they are equally relevant to other land surface materials.
Perspectives for imaging single protein molecules with the present design of the European XFEL.
Ayyer, Kartik; Geloni, Gianluca; Kocharyan, Vitali; Saldin, Evgeni; Serkez, Svitozar; Yefanov, Oleksandr; Zagorodnov, Igor
2015-07-01
The Single Particles, Clusters and Biomolecules & Serial Femtosecond Crystallography (SPB/SFX) instrument at the European XFEL is located behind the SASE1 undulator and aims to support imaging and structure determination of biological specimens between about 0.1 μm and 1 μm in size. The instrument is designed to work at photon energies from 3 keV up to 16 keV. Here, we propose a cost-effective proof-of-principle experiment, aiming to demonstrate the actual feasibility of a single molecule diffraction experiment at the European XFEL. To this end, we assume self-seeding capabilities at SASE1 and we suggest making use of the baseline European XFEL accelerator complex, with the addition of a slotted-foil setup, and of the SPB/SFX instrument. As a first step towards the realization of an actual experiment, we developed a complete package of computational tools for start-to-end simulations predicting its performance. Single biomolecule imaging capabilities at the European XFEL can be reached by exploiting special modes of operation of the accelerator complex and of the SASE1 undulator. The output peak power can be increased to more than 1.5 TW, which allows the requirements on the focusing efficiency of the optics to be relaxed and the required fluence to be reached without changing the present design of the SPB/SFX instrument. Explicit simulations are presented using the 15-nm size RNA Polymerase II molecule as a case study. Noisy diffraction patterns were generated and processed to recover the 3D intensity distribution. We discuss the requirements on the signal-to-background ratio needed to obtain a correct pattern orientation. When these are fulfilled, our results indicate that one can achieve diffraction without destruction with about 0.1 photons per Shannon pixel per shot at 4 Å resolution with 10^13 photons in a 4 fs pulse at 4 keV photon energy and in a 0.3 μm focus, corresponding to a fluence of 10^14 photons/μm^2. We assume negligible structured background.
At this signal level, one needs only about 30 000 diffraction patterns to recover full 3D information. At the highest repetition rate manageable by detectors at European XFEL, one will be able to accumulate these data within a fraction of an hour, even assuming a relatively low hit probability of about a percent.
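The quoted fluence follows directly from the pulse and focus numbers, as a quick consistency check shows (treating the 0.3 μm figure as the focal spot diameter, which the abstract does not state explicitly):

```python
import math

# 10^13 photons into an assumed 0.3 um diameter spot -> fluence in photons/um^2
n_photons = 1e13
focus_diameter_um = 0.3
area_um2 = math.pi * (focus_diameter_um / 2.0) ** 2   # ~0.071 um^2
fluence = n_photons / area_um2                        # order 10^14 photons/um^2
```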
Prudic, David E.
1989-01-01
Computer models are widely used to simulate groundwater flow for evaluating and managing the groundwater resource of many aquifers, but few are designed to also account for surface flow in streams. A computer program was written for use in the US Geological Survey modular finite difference groundwater flow model to account for the amount of flow in streams and to simulate the interaction between surface streams and groundwater. The new program is called the Streamflow-Routing Package. The Streamflow-Routing Package is not a true surface water flow model, but rather is an accounting program that tracks the flow in one or more streams which interact with groundwater. The program limits the amount of groundwater recharge to the available streamflow. It permits two or more streams to merge into one with flow in the merged stream equal to the sum of the tributary flows. The program also permits diversions from streams. The groundwater flow model with the Streamflow-Routing Package has an advantage over the analytical solution in simulating the interaction between aquifer and stream because it can be used to simulate complex systems that cannot be readily solved analytically. The Streamflow-Routing Package does not include a time function for streamflow but rather streamflow entering the modeled area is assumed to be instantly available to downstream reaches during each time period. This assumption is generally reasonable because of the relatively slow rate of groundwater flow. Another assumption is that leakage between streams and aquifers is instantaneous. This assumption may not be reasonable if the streams and aquifers are separated by a thick unsaturated zone. Documentation of the Streamflow-Routing Package includes data input instructions; flow charts, narratives, and listings of the computer program for each of four modules; and input data sets and printed results for two test problems, and one example problem. (Lantz-PTT)
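The accounting behavior the abstract describes, routing flow reach by reach, summing tributaries at a junction, and never letting recharge to the aquifer exceed the streamflow actually available, can be sketched in a few lines. The reach layout and leakage rates below are invented for illustration and are not the package's input format:

```python
def route(inflow, desired_leakage):
    """Route flow through a sequence of reaches.

    Positive leakage recharges the aquifer and is limited to the streamflow
    available in that reach; negative leakage is aquifer discharge to the stream.
    """
    flow = inflow
    actual = []
    for q in desired_leakage:
        leak = min(q, flow) if q > 0 else q   # cap recharge at available streamflow
        flow -= leak
        actual.append(leak)
    return flow, actual

# Two tributaries merge: flow in the merged stream is the sum of tributary flows
trib_a, _ = route(10.0, [2.0])               # 10 in, 2 leaks to the aquifer -> 8 out
trib_b, _ = route(5.0, [1.0])                # 5 in, 1 leaks -> 4 out
merged_in = trib_a + trib_b                  # 12

# Desired leakage exceeding the available flow is capped, as in the package
outflow, actual = route(merged_in, [20.0])
```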
Alloy Design Workbench-Surface Modeling Package Developed
NASA Technical Reports Server (NTRS)
Abel, Phillip B.; Noebe, Ronald D.; Bozzolo, Guillermo H.; Good, Brian S.; Daugherty, Elaine S.
2003-01-01
NASA Glenn Research Center's Computational Materials Group has integrated a graphical user interface with in-house-developed surface modeling capabilities, with the goal of using computationally efficient atomistic simulations to aid the development of advanced aerospace materials, through the modeling of alloy surfaces, surface alloys, and segregation. The software is also ideal for modeling nanomaterials, since surface and interfacial effects can dominate material behavior and properties at this level. Through the combination of an accurate atomistic surface modeling methodology and an efficient computational engine, it is now possible to directly model these types of surface phenomenon and metallic nanostructures without a supercomputer. Fulfilling a High Operating Temperature Propulsion Components (HOTPC) project level-I milestone, a graphical user interface was created for a suite of quantum approximate atomistic materials modeling Fortran programs developed at Glenn. The resulting "Alloy Design Workbench-Surface Modeling Package" (ADW-SMP) is the combination of proven quantum approximate Bozzolo-Ferrante-Smith (BFS) algorithms (refs. 1 and 2) with a productivity-enhancing graphical front end. Written in the portable, platform independent Java programming language, the graphical user interface calls on extensively tested Fortran programs running in the background for the detailed computational tasks. Designed to run on desktop computers, the package has been deployed on PC, Mac, and SGI computer systems. The graphical user interface integrates two modes of computational materials exploration. One mode uses Monte Carlo simulations to determine lowest energy equilibrium configurations. The second approach is an interactive "what if" comparison of atomic configuration energies, designed to provide real-time insight into the underlying drivers of alloying processes.
Performance assessment of small-package-class nonintrusive inspection systems
NASA Astrophysics Data System (ADS)
Spradling, Michael L.; Hyatt, Roger
1997-02-01
The DoD Counterdrug Technology Development Program has addressed the development and demonstration of technology to enhance nonintrusive inspection of small packages such as passenger baggage, commercially delivered parcels, and breakbulk cargo items. Within the past year they have supported several small package-class nonintrusive inspection system performance assessment activities. All performance assessment programs involved the use of a red/blue team concept and were conducted in accordance with approved assessment protocols. This paper presents a discussion related to the systematic performance assessment of small package-class nonintrusive inspection technologies, including transmission, backscatter and computed tomography x-ray imaging, and protocol-related considerations for the assessment of these systems.
tscvh R Package: Computational of the two samples test on microarray-sequencing data
NASA Astrophysics Data System (ADS)
Fajriyah, Rohmatul; Rosadi, Dedi
2017-12-01
We present a new R package, tscvh (two-sample cross-variance homogeneity), as we call it. This package implements the cross-variance statistical test proposed and introduced by Fajriyah ([3] and [4]), based on the cross-variance concept. The test can be used as an alternative test for a significant difference between two means when the sample size is small, a situation that commonly arises in bioinformatics research. Based on its statistical distribution, the p-value can also be provided. The package is built under the assumption of homogeneity of variance between samples.
Benmakhlouf, Hamza; Andreo, Pedro
2017-02-01
Correction factors for the relative dosimetry of narrow megavoltage photon beams have recently been determined in several publications. These corrections are required because of the several small-field effects generally thought to be caused by the lack of lateral charged particle equilibrium (LCPE) in narrow beams. Correction factors for relative dosimetry are ultimately necessary to account for the fluence perturbation caused by the detector. For most small field detectors the perturbation depends on field size, resulting in large correction factors when the field size is decreased. In this work, electron and photon fluence differential in energy will be calculated within the radiation sensitive volume of a number of small field detectors for 6 MV linear accelerator beams. The calculated electron spectra will be used to determine electron fluence perturbation as a function of field size and its implication on small field dosimetry analyzed. Fluence spectra were calculated with the user code PenEasy, based on the PENELOPE Monte Carlo system. The detectors simulated were one liquid ionization chamber, two air ionization chambers, one diamond detector, and six silicon diodes, all manufactured either by PTW or IBA. The spectra were calculated for broad (10 cm × 10 cm) and narrow (0.5 cm × 0.5 cm) photon beams in order to investigate the field size influence on the fluence spectra and its resulting perturbation. The photon fluence spectra were used to analyze the impact of absorption and generation of photons. These will have a direct influence on the electrons generated in the detector radiation sensitive volume. The electron fluence spectra were used to quantify the perturbation effects and their relation to output correction factors. The photon fluence spectra obtained for all detectors were similar to the spectrum in water except for the shielded silicon diodes. 
The photon fluence in the latter group was strongly influenced, mostly in the low-energy region, by photoabsorption in the high-Z shielding material. For the ionization chambers and the diamond detector, the electron fluence spectra were found to be similar to that in water for both field sizes. In contrast, electron spectra in the silicon diodes were much higher than that in water for both field sizes. The estimated perturbations of the fluence spectra for the silicon diodes were 11-21% for the large fields and 14-27% for the small fields. These perturbations are related to the atomic number, density and mean excitation energy (I-value) of silicon, as well as to the influence of the "extracameral" components surrounding the detector sensitive volume. For most detectors the fluence perturbation was also found to increase when the field size was decreased, consistent with the increased small-field effects observed for the smallest field sizes. The present work improves the understanding of small-field effects by relating output correction factors to spectral fluence perturbations in small field detectors. It is shown that the main reasons for the well-known small-field effects in silicon diodes are the high atomic number and density of the "extracameral" detector components and the high I-value of silicon relative to that of water and diamond. Compared to these parameters, the density and atomic number of the radiation sensitive volume material play a less significant role. © 2016 American Association of Physicists in Medicine.
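The link between fluence perturbation and an output correction factor can be sketched as a ratio of ratios: the perturbation is the detector-to-water fluence ratio in a given field, and the correction factor removes the field-size dependence of that ratio relative to the reference field. The fluence values below are hypothetical, chosen only to mimic a diode's over-response in the ranges the abstract quotes:

```python
def perturbation(fluence_detector, fluence_water):
    """Electron fluence perturbation: detector-to-water fluence ratio."""
    return fluence_detector / fluence_water

# Assumed integrated electron fluences (arbitrary units) for a silicon diode
p_large = perturbation(1.18, 1.00)   # ~18% over-response in a 10 x 10 cm field
p_small = perturbation(1.24, 1.00)   # ~24% over-response in a 0.5 x 0.5 cm field

# Output correction factor: compensates the extra over-response that appears
# only when the field is made small (k < 1 for an over-responding diode)
k_output = p_large / p_small
```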
White, Gary C.; Hines, J.E.
2004-01-01
The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different from the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements the multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided for. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease-of-use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user-discussion groups to share knowledge and ideas. (A starting point for wildlife-related applications is http://www.phidot.org.)
NASA Astrophysics Data System (ADS)
Prisbrey, Shon Thomas
Knowledge of the fundamental structure and magnetic characteristics of dilute magnetic semiconductors (DMSs) is an essential step towards the development of spin-polarized electronics (spintronics). Recently (2001), the report of ferromagnetism in cobalt-doped anatase titania films synthesized by pulsed laser deposition (PLD) elicited interest in this system as a possible DMS oxide. Other investigations of the CoxTi1-xO2-δ material system, utilizing a myriad of deposition techniques, yielded conflicting results as to the source of magnetism and the local environment of the deposited cobalt. No complete characterization of PLD-synthesized films has been reported. This dissertation quantifies the effect of laser fluence on film morphology, structure, and magnetic properties by fully characterizing CoxTi1-xO2-δ films grown under optimal PLD deposition conditions that were identified separately in prior published work. The construction of a custom PLD system that provided repeatable laser/target interaction via a combination of fluence control and target movement is addressed. A brief outline of magnetism and its relation to structure is also given. The remainder of the dissertation details the effect of laser fluence on Co0.049Ti0.951O2-δ and Co0.038Ti0.962O2-δ films. Film structure, morphology, and magnetic properties were determined for illumination conditions corresponding to laser fluences varying from 0.57 to 1.37 J/cm^2. The local cobalt environment is strongly correlated with laser fluence. Cobalt in 4.9% concentration films grown with a laser fluence between 0.7 and 0.93 J/cm^2 was octahedrally coordinated, as it was in 3.8% films grown with a fluence less than 0.93 J/cm^2. Departure of the laser fluence from these ranges results in a multitude of cobalt environments in the films.
The film magnetization is observed to be a function of laser fluence, with a maximum moment of ~3.19 μB per cobalt atom occurring at 0.93 J/cm2 in the 4.9% films and ~1.9 μB per cobalt atom at 0.57 J/cm2 in the 3.8% films. There is no evidence of cobalt segregation and subsequent formation of metallic cobalt in the high-moment films. A departure in laser fluence from the maximum-moment conditions results in a drop in moment to ~1 μB. An appendix detailing previous work that investigated iridium as an oxidation-resistant capping layer is also included.
N+ ion-target interactions in PPO polymer: A structural characterization
NASA Astrophysics Data System (ADS)
Das, A.; Dhara, S.; Patnaik, A.
1999-01-01
N+ ion beam induced effects on spin-coated amorphous poly(2,6-dimethyl phenylene oxide) (PPO) films, in terms of chemical structure and electronic and vibrational properties, were investigated using Fourier Transform Infrared (FTIR) spectroscopy and Ultraviolet-Visible (UV-VIS) spectroscopy. Both techniques revealed that PPO is only weakly stable towards 100 keV N+ ions, with a threshold fluence of 10^14 ions/cm^2 for fragmentation of the polymer. FTIR analysis showed the disappearance of all characteristic IR bands at a total fluence of 10^14 ions/cm^2 except for the C=C band at 1608 cm^-1, which shifted to a lower wavenumber along with an enhancement of its full width at half maximum (FWHM) with increasing fluence. A new band appeared as a shoulder at 1680 cm^-1 in the FTIR spectra, indicating the presence of a C=O type bond formed by oxidation as a result of N+ implantation in the PPO films. The optical band gap (Eg), deduced from the absorption spectra, decreased from 4.4 to 0.5 eV with fluence. The implantation-induced carbonaceous clusters, sized using Robertson's formula for the optical band gap, were found to consist of ~160 fused hexagonal aromatic rings at the maximum energy fluence. An enhanced absorption coefficient as a function of fluence indicated either a much larger concentration of charge carriers or a higher carrier mobility than in the pristine sample. The band-tail width calculated from the Urbach region for the implanted samples showed that the band-edge sharpness depends strongly on fluence, indicating increased disorder with increasing fluence.
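The cluster-size estimate quoted above follows from Robertson's relation between the optical gap of an sp2 carbon cluster and the number of fused aromatic rings it contains. A minimal sketch of that arithmetic; the value of the interaction energy β is an assumption here (2.9 eV is a commonly quoted figure), so the result reproduces only the order of magnitude of the ~160 rings reported:

```python
# Robertson's relation for sp2 carbon clusters: Eg ~ 2|beta| / sqrt(M),
# where M is the number of fused six-fold aromatic rings and beta is the
# nearest-neighbour pi-interaction energy (beta ~ 2.9 eV assumed here).
BETA_EV = 2.9

def cluster_size(eg_ev):
    """Number of fused aromatic rings implied by an optical gap Eg (eV)."""
    return (2.0 * BETA_EV / eg_ev) ** 2

# Gap of the fully implanted film (0.5 eV) implies clusters of order
# a hundred-plus rings; the pristine 4.4 eV gap implies far smaller ones.
m_final = cluster_size(0.5)
m_pristine = cluster_size(4.4)
```

With these assumed constants the 0.5 eV gap maps to roughly 135 rings, the same order as the ~160 reported (the paper's exact β is not stated in the abstract).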
NASA Technical Reports Server (NTRS)
deGroh, Kim; Berger, Lauren; Roberts, Lily
2009-01-01
The purpose of this study was to determine the effect of atomic oxygen (AO) exposure on the hydrophilicity of nine different polymers for biomedical applications. Atomic oxygen treatment can alter the chemistry and morphology of polymer surfaces, which may increase the adhesion and spreading of cells on Petri dishes and enhance implant growth. Therefore, nine different polymers were exposed to atomic oxygen, and the water-contact angle, a measure of hydrophilicity, was measured after exposure. To determine whether hydrophilicity remains static after initial atomic oxygen exposure or changes with higher-fluence exposures, the contact angle between the polymer's surface and a water droplet placed on it was measured versus AO fluence. The polymers were exposed to atomic oxygen in a 100-W, 13.56-MHz radio frequency (RF) plasma asher, and the treatment was found to significantly alter the hydrophilicity of non-fluorinated polymers. Pristine samples were compared with samples that had been exposed to AO at various fluence levels. Minimum and maximum fluences for the ashing trials were set based on the effective AO erosion of a Kapton witness coupon in the asher. The exposure levels were spaced logarithmically: the difference between the logarithms of the minimum and maximum fluences was divided by the desired number of intervals (ideally 10), giving a constant ratio of 2.37 between successive fluences, so each desired fluence was the previous one multiplied by this factor. The flux in the asher was determined to be approximately 3.0 x 10(exp 15) atoms/sq cm/sec, and each polymer was exposed to a maximum fluence of 5.16 x 10(exp 20) atoms/sq cm.
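The logarithmic spacing of exposure levels described above amounts to a geometric fluence schedule. A sketch of that arithmetic; the minimum fluence below is an illustrative assumption chosen so ten intervals reach the reported maximum, only the maximum fluence (5.16e20 atoms/cm^2) and the ~2.37 ratio come from the abstract:

```python
def fluence_schedule(f_min, f_max, n_intervals=10):
    """Fluence levels spaced evenly in log space: successive levels differ
    by the constant ratio (f_max / f_min) ** (1 / n_intervals)."""
    ratio = (f_max / f_min) ** (1.0 / n_intervals)
    return [f_min * ratio ** k for k in range(n_intervals + 1)], ratio

# Assumed minimum fluence (atoms/cm^2) for illustration; with ten intervals
# it reproduces the reported step ratio of ~2.37 up to the reported maximum.
levels, ratio = fluence_schedule(9.2e16, 5.16e20)
```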
SU-E-J-174: Adaptive PET-Based Dose Painting with Tomotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darwish, N; Mackie, T; Thomadsen, B
2014-06-01
Purpose: A PET image can be converted into a dose prescription directly. Because of the variability in PET image intensity, a PET-based prescription may be superior to a uniform dose prescription. Furthermore, unlike image reconstruction, where the image solution is not known in advance, the prescribed dose is known a priori from the PET image. Therefore, optimum beam orientations are derivable. Methods: We assume the PET image to be the prescribed dose and invert it to determine the energy fluence. The same method used to reconstruct tissue images from projections can be used to solve the inverse problem of determining beam orientations and modulation patterns from a dose prescription [10]. Unlike standard tomographic reconstruction of images from measured projection profiles, inversion of the prescribed dose yields photon fluences that may be negative and therefore unphysical. Two-dimensional modulated beams can be modeled in terms of the attenuated (exponential) Radon transform of the prescribed dose function (assumed here to be the PET image), application of a Ram-Lak filter, and inversion by backprojection. Unlike the case in PET processing, however, the filtered beam obtained from the inversion represents a physical photon fluence, so a positivity constraint (setting negative fluence to zero) must be applied (Brahme et al 1982, Bortfeld et al 1990). Results: Truncating the negative profiles from the PET data results in an approximation of the deliverable energy fluence. Backprojection of the deliverable fluence is an approximation of the dose delivered, which is comparable to the original PET image. Conclusion: It is possible to use the PET data or image as a direct indicator of deliverable fluence for cylindrical radiotherapy systems such as TomoTherapy.
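The Methods paragraph above is essentially filtered backprojection with a positivity constraint on the fluence profiles. A toy numpy sketch of that pipeline; the nearest-neighbour Radon transform and the disc-shaped "prescription" are illustrative assumptions, not the paper's implementation (which uses the attenuated transform):

```python
import numpy as np

def ram_lak_filter(proj):
    """Apply a Ram-Lak (ramp) filter to each projection row in frequency space."""
    n = proj.shape[-1]
    ramp = np.abs(np.fft.fftfreq(n))
    return np.real(np.fft.ifft(np.fft.fft(proj, axis=-1) * ramp, axis=-1))

def positivity(fluence):
    """Physical beams cannot deliver negative fluence: truncate at zero."""
    return np.clip(fluence, 0.0, None)

def backproject(profiles, angles, size):
    """Smear each (clipped) fluence profile back across the image grid."""
    img = np.zeros((size, size))
    c = (size - 1) / 2.0
    ys, xs = np.mgrid[0:size, 0:size]
    for profile, theta in zip(profiles, angles):
        # signed distance of each pixel from the beam axis, shifted to bins
        t = (xs - c) * np.cos(theta) + (ys - c) * np.sin(theta) + c
        img += profile[np.clip(np.round(t).astype(int), 0, size - 1)]
    return img * np.pi / len(angles)

# Toy "prescription": a hot disc in a 64x64 grid.
size = 64
ys, xs = np.mgrid[0:size, 0:size]
target = ((xs - 32) ** 2 + (ys - 32) ** 2 < 100).astype(float)

angles = np.linspace(0, np.pi, 60, endpoint=False)
proj = []  # forward projections (simple nearest-neighbour Radon transform)
for theta in angles:
    t = (xs - 31.5) * np.cos(theta) + (ys - 31.5) * np.sin(theta) + 31.5
    idx = np.clip(np.round(t).astype(int), 0, size - 1)
    row = np.zeros(size)
    np.add.at(row, idx.ravel(), target.ravel())
    proj.append(row)
proj = np.array(proj)

fluence = positivity(ram_lak_filter(proj))   # deliverable (non-negative) fluence
approx = backproject(fluence, angles, size)  # approximate delivered dose
```

Clipping the negative lobes of the filtered profiles is exactly the truncation step the abstract describes; the backprojected result approximates, rather than exactly reproduces, the prescription.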
NASA Astrophysics Data System (ADS)
Kurotobi, K.; Suzuki, Y.; Nakajima, H.; Suzuki, H.; Iwaki, M.
2003-05-01
He+ ion implanted collagen-coated tubes with a fluence of 1 × 10^14 ions/cm^2 exhibited antithrombogenicity. To investigate the mechanisms of antithrombogenicity of these samples, plasma protein adsorption assays and platelet adhesion experiments were performed. The adsorption of fibrinogen (Fg) and von Willebrand factor (vWf) was minimal on the He+ ion implanted collagen with a fluence of 1 × 10^14 ions/cm^2. Platelet adhesion (using platelet-rich plasma) was inhibited on the He+ ion implanted collagen with a fluence of 1 × 10^14 ions/cm^2 and was accelerated on the untreated collagen and on ion implanted collagen with fluences of 1 × 10^13, 1 × 10^15 and 1 × 10^16 ions/cm^2. Platelet activation with washed platelets was observed on untreated collagen and on He+ ion implanted collagen with a fluence of 1 × 10^14 ions/cm^2, and was inhibited at fluences of 1 × 10^13, 1 × 10^15 and 1 × 10^16 ions/cm^2. Generally, platelets react with a specific ligand inside collagen (the GFOGER sequence). The results of the platelet adhesion experiments using washed platelets indicated that no ligands such as GFOGER remained on the He+ ion implanted collagen at fluences above 1 × 10^13 ions/cm^2. On the 1 × 10^14 ions/cm^2 implanted collagen, no platelet activation was observed owing to the influence of plasma proteins. From the above, it is concluded that the decrease in adsorbed Fg and vWf caused the antithrombogenicity of He+ ion implanted collagen with a fluence of 1 × 10^14 ions/cm^2, and that plasma protein adsorption played an important role in repairing the graft surface.
Nanosecond laser-metal ablation at different ambient conditions
NASA Astrophysics Data System (ADS)
Elsied, Ahmed M.; Dieffenbach, Payson C.; Diwakar, Prasoon K.; Hassanein, Ahmed
2018-05-01
Ablation of metals under different ambient conditions and laser fluences was investigated through a series of experiments. A 1064 nm, 6 ns Nd:YAG laser was used to ablate 1 mm thick metal targets with laser energies ranging from 2 mJ to 300 mJ. The experiments were designed to study the effects of material properties, laser fluence, ambient gas, and ambient pressure on laser-metal ablation. The first experiment was conducted under vacuum to study the effects of laser fluence and material properties on metal ablation, using a wide range of laser fluences (2 J/cm2 up to 300 J/cm2) and two different targets, Al and W. The second experiment was conducted at atmospheric pressure using two different ambient gases, air and argon, to understand the effect of ambient gas on the laser-metal ablation process. The third experiment was conducted at two different pressures (10 Torr and 760 Torr) using the same ambient gas to investigate the effect of ambient pressure on laser-metal ablation. To compare the different ablation processes, the ablated mass, ablation depth, crater profile and melt formation were measured using a white light profilometer (WLP). The experimental results show that at low laser fluence the ablated mass, ablation depth, and height of the molten layer follow a logarithmic function of the incident laser fluence, while at high laser fluence they follow a linear function. This dependence on laser fluence was found to be independent of ambient conditions and irradiated material. The effect of ambient pressure was more pronounced than the effect of ambient gas type. Plasma shielding was very pronounced in the presence of an ambient gas and led to a significant reduction in the total ablated mass.
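The logarithmic-then-linear fluence dependence reported above can be captured by a simple piecewise model, joined continuously in value and slope at a crossover fluence. A sketch with illustrative constants (the threshold and crossover fluences below are assumptions, not the measured values):

```python
import numpy as np

def ablation_depth(F, F_th=2.0, a=0.8, F_x=60.0):
    """Piecewise depth-per-pulse model (illustrative constants):
    logarithmic below a crossover fluence F_x, linear above it,
    with the linear slope chosen to match the log branch at F_x."""
    F = np.asarray(F, dtype=float)
    b = a / F_x                          # slope of log branch at F_x
    d_x = a * np.log(F_x / F_th)         # depth at the crossover
    low = a * np.log(np.maximum(F, F_th) / F_th)
    high = d_x + b * (F - F_x)
    return np.where(F < F_x, low, high)

# Fluence range comparable to the experiment (J/cm^2, log-spaced).
F = np.logspace(1, 2.5, 50)
d = ablation_depth(F)
```

The log branch reflects plasma shielding saturating the energy coupling at modest fluence; the linear branch reflects the roughly proportional mass removal seen at high fluence.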
Denny, Margaret; Higgins, Agnes
2003-06-01
Despite the available literature identifying the value of integrating computer-assisted learning into the curriculum, psychiatric nurse education lags behind in this area of curriculum development. The purpose of this paper is to report on a pilot project involving the use of a computer-assisted learning (CAL) interactive multimedia (IMM) package called 'Admissions' as a self-directed learning tool with two second-year psychiatric nursing students. The students were on a practice placement in an Irish mental health service. The aim of using the multimedia resource was to augment the students' learning during their practice placement and enable them to re-examine the issue of psychosis from a multiplicity of perspectives. This paper provides a brief description of the interactive multimedia package, together with a discussion of the support offered to the students during its use. An experiential taxonomy is used as a framework to guide the discussion of the learning and evaluation process. Feedback from the students suggests that the CAL package was easy to use and informative, and that it promoted independence and self-directed study.
User's Manual for FOMOCO Utilities-Force and Moment Computation Tools for Overset Grids
NASA Technical Reports Server (NTRS)
Chan, William M.; Buning, Pieter G.
1996-01-01
In the numerical computations of flows around complex configurations, accurate calculations of force and moment coefficients for aerodynamic surfaces are required. When overset grid methods are used, the surfaces on which force and moment coefficients are sought typically consist of a collection of overlapping surface grids. Direct integration of flow quantities on the overlapping grids would result in the overlapped regions being counted more than once. The FOMOCO Utilities is a software package for computing flow coefficients (force, moment, and mass flow rate) on a collection of overset surfaces with accurate accounting of the overlapped zones. FOMOCO Utilities can be used in stand-alone mode or in conjunction with the Chimera overset grid compressible Navier-Stokes flow solver OVERFLOW. The software package consists of two modules corresponding to a two-step procedure: (1) hybrid surface grid generation (MIXSUR module), and (2) flow quantities integration (OVERINT module). Instructions on how to use this software package are described in this user's manual. Equations used in the flow coefficients calculation are given in Appendix A.
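The double-counting problem described above comes down to weighting each surface panel so that overlapped regions contribute exactly once to the integral. A minimal sketch of the integration step, with panel weights standing in for the hybrid-grid bookkeeping that the MIXSUR module performs (all numbers below are illustrative):

```python
import numpy as np

def integrate_force(pressures, areas, normals, weights):
    """Force = sum over panels of w_i * p_i * A_i * n_i. The weight w_i is
    1 for a panel counted fully, 0 for a blanked (fully overlapped) panel,
    and fractional for a partially overlapped one."""
    pA = np.asarray(pressures)[:, None] * np.asarray(areas)[:, None]
    return np.sum(np.asarray(weights)[:, None] * pA * np.asarray(normals), axis=0)

# Two overlapping patches covering the same surface at uniform pressure:
# naive integration (all weights 1) double-counts the shared region.
pressures = [2.0, 2.0]           # same pressure on both patches
areas     = [1.0, 0.5]           # second patch entirely overlaps the first
normals   = [[0, 0, 1], [0, 0, 1]]
naive   = integrate_force(pressures, areas, normals, [1.0, 1.0])
correct = integrate_force(pressures, areas, normals, [1.0, 0.0])
```

With the overlap blanked, the integrated force matches the true value (2.0 in z here) instead of the inflated naive sum.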
Pevzner, Yuri; Frugier, Emilie; Schalk, Vinushka; Caflisch, Amedeo; Woodcock, H Lee
2014-09-22
Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening.
Technology Trends: Buying a Computer.
ERIC Educational Resources Information Center
Strot, Melody; Benno, Mark
1997-01-01
Provides guidelines for buying computers for parents of gifted children. Steps for making decisions include deciding who will use the computer, deciding its purposes and what software packages will be used, determining current and future needs, setting a budget, and reviewing needs with salespersons and school-based technology specialists. (CR)
Re-crystallization of ITO films after carbon irradiation
NASA Astrophysics Data System (ADS)
Usman, Muhammad; Khan, Shahid; Khan, Majid; Abbas, Turab Ali
2017-01-01
2.0 MeV carbon ion irradiation effects on Indium Tin Oxide (ITO) thin films on glass substrates are investigated. The films were irradiated with carbon ions at fluences ranging from 1 × 1013 to 1 × 1015 ions/cm2. The irradiation-induced effects in ITO are compared before and after ion bombardment through a systematic study of the structural, optical and electrical properties of the films. The XRD results show the polycrystalline nature of the un-irradiated ITO films, which turn amorphous after a carbon ion fluence of 1 × 1013 ions/cm2. A further increase in ion fluence to 1 × 1014 ions/cm2 re-crystallizes the structure, which is retained at even higher fluences. A gradual decrease in the electrical conductivity and transmittance of the irradiated samples is observed with increasing ion fluence. The band gap of the films decreases after carbon irradiation.
Radiation Resistance Studies of Amorphous Silicon Alloy Photovoltaic Materials
NASA Technical Reports Server (NTRS)
Woodyard, James R.
1994-01-01
The radiation resistance of commercial solar cells fabricated from hydrogenated amorphous silicon alloys was investigated. A number of different device structures were irradiated with 1.0 MeV protons. The cells were insensitive to proton fluences below 1E12 per sq cm. The parameters of the irradiated cells were restored by annealing at 200 C. The annealing time was dependent on proton fluence. Annealing devices for one hour restores cell parameters for fluences below 1E14 per sq cm; fluences above 1E14 per sq cm require longer annealing times. A parametric fitting model was used to characterize current mechanisms observed in dark I-V measurements. The current mechanisms were explored as functions of irradiation fluence, voltage, and light-soaking time. The thermal generation current density and quality factor increased with proton fluence. Device simulation shows that the degradation in cell characteristics may be explained by the reduction of the electric field in the intrinsic layer.
NASA Technical Reports Server (NTRS)
Lord, Kenneth R., II; Walters, Michael R.; Woodyard, James R.
1994-01-01
The radiation resistance of commercial solar cells fabricated from hydrogenated amorphous silicon alloys is reported. A number of different device structures were irradiated with 1.0 MeV protons. The cells were annealed at 200 C; the annealing time was dependent on proton fluence. Annealing devices for one hour restores cell parameters for fluences below 10(exp 14) cm(exp -2); fluences above 10(exp 14) cm(exp -2) require longer annealing times. A parametric fitting model was used to characterize current mechanisms observed in dark I-V measurements. The current mechanisms were explored as functions of irradiation fluence, voltage, and light-soaking time. The thermal generation current density and quality factor increased with proton fluence. Device simulation shows that the degradation in cell characteristics may be explained by the reduction of the electric field in the intrinsic layer.
NASA Astrophysics Data System (ADS)
Kumar, Manoj; Bhargava, P.; Biswas, A. K.; Sahu, Shasikiran; Mandloi, V.; Ittoop, M. O.; Khattak, B. Q.; Tiwari, M. K.; Kukreja, L. M.
2013-03-01
It is shown that the threshold fluence for laser paint stripping can be accurately estimated from the heat of gasification and the absorption coefficient of the epoxy paint. The threshold fluence determined experimentally by stripping epoxy paint from a substrate using a TEA CO2 laser agrees closely with the calculated value. The calculated threshold fluence and the measured absorption coefficient of the paint allowed us to determine the epoxy paint thickness that would be removed per pulse at a given laser fluence even without experimental trials. This was used to predict the optimum scan speed required to strip epoxy paint of a given thickness using a high average power TEA CO2 laser. Energy Dispersive X-Ray Fluorescence (EDXRF) studies were also carried out on the laser-stripped concrete substrate to show the high efficacy of this modality.
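The estimate described above follows a Beer-Lambert blow-off picture: ablation proceeds to the depth at which the absorbed fluence falls to the threshold set by the heat of gasification. A sketch with illustrative constants (the absorption coefficient and volumetric heat of gasification below are assumptions, not the paper's measured values):

```python
import math

# Illustrative material constants (assumptions, not the measured values):
alpha = 2.0e4          # absorption coefficient of the paint at 10.6 um, 1/cm
rho_H = 2.0e3          # volumetric heat of gasification, J/cm^3

F_th = rho_H / alpha   # threshold fluence from the energy balance, J/cm^2

def depth_per_pulse(F):
    """Blow-off model: material is removed down to the depth where the
    exponentially attenuated fluence falls to the threshold value."""
    return (1.0 / alpha) * math.log(F / F_th) if F > F_th else 0.0

def scan_speed(F, thickness_cm, spot_cm, rep_rate_hz):
    """Speed at which each surface point receives just enough pulses to
    remove the full paint thickness (spot size and rep rate assumed)."""
    n_pulses = thickness_cm / depth_per_pulse(F)
    return rep_rate_hz * spot_cm / n_pulses

v = scan_speed(1.0, thickness_cm=0.01, spot_cm=0.5, rep_rate_hz=100)
```

The logarithmic depth law is what lets a single absorption measurement predict removal per pulse, and hence the scan speed, without further trials.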
Property changes of G347A graphite due to neutron irradiation
Campbell, Anne A.; Katoh, Yutai; Snead, Mary A.; ...
2016-08-18
A new fine-grain nuclear graphite, grade G347A from Tokai Carbon Co., Ltd., has been irradiated in the High Flux Isotope Reactor at Oak Ridge National Laboratory to study the materials property changes that occur on exposure to neutron irradiation at temperatures of interest for Generation-IV nuclear reactor applications. Specimen temperatures ranged from 290 °C to 800 °C with a maximum neutron fluence of 40 × 10^25 n/m^2 [E > 0.1 MeV] (~30 dpa). Observed behaviors include: anisotropic dimensional change in an isotropic graphite; a parabolic fluence dependence of Young's modulus; electrical resistivity that increases at low fluence and increases further at high fluence; thermal conductivity that decreases rapidly at low fluence followed by continued degradation; and a similar plateau value of the mean coefficient of thermal expansion for all irradiation temperatures.
Practical Issues in Estimating Classification Accuracy and Consistency with R Package cacIRT
ERIC Educational Resources Information Center
Lathrop, Quinn N.
2015-01-01
There are two main lines of research in estimating classification accuracy (CA) and classification consistency (CC) under Item Response Theory (IRT). The R package cacIRT provides computer implementations of both approaches in an accessible and unified framework. Even with available implementations, there remain decisions a researcher faces when…
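One of the two lines of research the abstract refers to is Rudner's normal-approximation approach to classification accuracy. A sketch of that idea in Python (this is not the cacIRT implementation; the function name and arguments are hypothetical):

```python
from math import erf, sqrt

def rudner_accuracy(thetas, ses, cut):
    """Rudner-style expected classification accuracy: for each examinee,
    the normal-approximation probability that the observed theta estimate
    lands on the same side of the cut score as the true theta."""
    def phi(z):  # standard normal CDF
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))
    probs = []
    for th, se in zip(thetas, ses):
        p_above = 1.0 - phi((cut - th) / se)
        probs.append(p_above if th >= cut else 1.0 - p_above)
    return sum(probs) / len(probs)
```

Examinees far from the cut with small standard errors are classified accurately almost surely, while an examinee exactly at the cut has accuracy 0.5, which is why CA depends on both the ability distribution and the test information near the cut.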
Individualized Math Problems in Fractions. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. This package contains problems involving computation with common…
Hypermedia for Teaching--A European Collaborative Venture.
ERIC Educational Resources Information Center
Barker, Philip; Bartolome, Antonio
The "Hypermedia for Teaching" project is a European collaborative venture designed to produce a hypermedia learning package that is published on CD-ROM. Two versions of the package are to be developed. One of these is intended to be used on a multimedia personal computer (MPC), while the other is to be used in conjunction with…
Teacher's Corner: Structural Equation Modeling with the Sem Package in R
ERIC Educational Resources Information Center
Fox, John
2006-01-01
R is free, open-source, cooperatively developed software that implements the S statistical programming language and computing environment. The current capabilities of R are extensive, and it is in wide use, especially among statisticians. The sem package provides basic structural equation modeling facilities in R, including the ability to fit…
An interactive interface for NCAR Graphics
NASA Technical Reports Server (NTRS)
Buzbee, Bill; Lackman, Bob; Alpert, Ethan
1994-01-01
The NCAR Graphics package has been a valuable research tool for over 20 years. As a low level Fortran library, however, it was difficult to use for nonprogramming researchers. With this grant and NSF support, an interactive interface has been created which greatly facilitates use of the package by researchers of diverse computer skill levels.
Education and Training Packages for CAD/CAM.
ERIC Educational Resources Information Center
Wright, I. C.
1986-01-01
Discusses educational efforts in the fields of Computer Assisted Design and Manufacturing (CAD/CAM). Describes two educational training initiatives underway in the United Kingdom, one of which is a resource materials package for teachers of CAD/CAM at the undergraduate level, and the other a training course for managers of CAD/CAM systems. (TW)
Meeting the needs of an ever-demanding market.
Rigby, Richard
2002-04-01
Balancing cost and performance in packaging is critical. This article outlines techniques to assist in this whilst delivering added value and product differentiation. The techniques include a rigorous statistical process capable of delivering cost reduction and improved quality and a computer modelling process that can save time when validating new packaging options.
ERIC Educational Resources Information Center
Rhein, Deborah; Alibrandi, Mary; Lyons, Mary; Sammons, Janice; Doyle, Luther
This bibliography, developed by Project RIMES (Reading Instructional Methods of Efficacy with Students) lists 80 software packages for teaching early reading and spelling to students at risk for reading and spelling failure. The software packages are presented alphabetically by title. Entries usually include a grade level indicator, a brief…
45 CFR Appendix C to Part 1355 - Electronic Data Transmission Format
Code of Federal Regulations, 2010 CFR
2010-10-01
... mainframe-to-mainframe data exchange system using the Sterling Software data transfer package called “SUPERTRACS.” This package will allow data exchange between most computer platforms (both mini and mainframe...
45 CFR Appendix C to Part 1355 - Electronic Data Transmission Format
Code of Federal Regulations, 2011 CFR
2011-10-01
... mainframe-to-mainframe data exchange system using the Sterling Software data transfer package called “SUPERTRACS.” This package will allow data exchange between most computer platforms (both mini and mainframe...
Using R in Introductory Statistics Courses with the pmg Graphical User Interface
ERIC Educational Resources Information Center
Verzani, John
2008-01-01
The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek
Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bastian, Mark; Trigueros, Jose V.
Phoenix is a Java Virtual Machine (JVM) based library for performing mathematical and astrodynamics calculations. It consists of two primary sub-modules, phoenix-math and phoenix-astrodynamics. The mathematics package has a variety of mathematical classes for performing 3D transformations, geometric reasoning, and numerical analysis. The astrodynamics package has various classes and methods for computing locations, attitudes, accesses, and other values useful for general satellite modeling and simulation. Methods for computing celestial locations, such as the locations of the Sun and Moon, are also included. Phoenix is meant to be used as a library within the context of a larger application. For example, it could be used for a web service or a desktop client, or to compute simple values in a scripting environment.
Planned development of a 3D computer based on free-space optical interconnects
NASA Astrophysics Data System (ADS)
Neff, John A.; Guarino, David R.
1994-05-01
Free-space optical interconnection has the potential to provide upwards of a million data channels between planes of electronic circuits. This may result in the planar board and backplane structures of today giving way to 3-D stacks of wafers or multi-chip modules interconnected via channels running perpendicular to the processor planes, thereby eliminating much of the packaging overhead. Three-dimensional packaging is very appealing for tightly coupled fine-grained parallel computing, where the need for massive numbers of interconnections severely taxes the capabilities of planar structures. This paper describes a coordinated effort by four research organizations to demonstrate an operational fine-grained parallel computer that achieves global connectivity through the use of free-space optical interconnects.
Winslow, Luke; Zwart, Jacob A.; Batt, Ryan D.; Dugan, Hilary; Woolway, R. Iestyn; Corman, Jessica; Hanson, Paul C.; Read, Jordan S.
2016-01-01
Metabolism is a fundamental process in ecosystems that crosses multiple scales of organization from individual organisms to whole ecosystems. To improve sharing and reuse of published metabolism models, we developed LakeMetabolizer, an R package for estimating lake metabolism from in situ time series of dissolved oxygen, water temperature, and, optionally, additional environmental variables. LakeMetabolizer implements 5 different metabolism models with diverse statistical underpinnings: bookkeeping, ordinary least squares, maximum likelihood, Kalman filter, and Bayesian. Each of these 5 metabolism models can be combined with 1 of 7 models for computing the coefficient of gas exchange across the air–water interface (k). LakeMetabolizer also features a variety of supporting functions that compute conversions and implement calculations commonly applied to raw data prior to estimating metabolism (e.g., oxygen saturation and optical conversion models). These tools have been organized into an R package that contains example data, example use-cases, and function documentation. The release package version is available on the Comprehensive R Archive Network (CRAN), and the full open-source GPL-licensed code is freely available for examination and extension online. With this unified, open-source, and freely available package, we hope to improve access and facilitate the application of metabolism in studies and management of lentic ecosystems.
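The bookkeeping model mentioned first in that list can be sketched in a few lines. This Python version is an illustration of the accounting only, not LakeMetabolizer's R code (which also handles air-water gas exchange via the seven k models):

```python
import numpy as np

def bookkeeping_metabolism(do_mgL, is_day, dt_hours=1.0):
    """Bookkeeping-style estimate: respiration R from the mean nighttime
    DO decline, GPP from the daytime DO change with daytime respiration
    added back. Gas exchange is ignored here for brevity."""
    d_do = np.diff(do_mgL) / dt_hours        # mg O2 / L / h per interval
    day = is_day[:-1]                        # day/night flag per interval
    r_hourly = -np.mean(d_do[~day])          # positive value = O2 consumed
    nep_day = np.sum(d_do[day]) * dt_hours   # net daytime DO change
    gpp = nep_day + r_hourly * np.sum(day) * dt_hours
    return gpp, r_hourly * len(d_do) * dt_hours   # daily GPP, R (mg O2/L)

# Synthetic diel curve: respiration 0.05 mg/L/h all day, photosynthesis
# adding 0.20 mg/L/h during the 12 daylight hours (06:00-18:00).
hours = np.arange(25)
is_day = (hours % 24 >= 6) & (hours % 24 < 18)
rates = np.where(is_day[:-1], 0.20 - 0.05, -0.05)
do = 8.0 + np.concatenate([[0.0], np.cumsum(rates)])
gpp, r = bookkeeping_metabolism(do, is_day)
```

On this synthetic series the routine recovers the generating rates exactly (GPP = 2.4, R = 1.2 mg O2/L/day), which is the sanity check one would run before applying any of the statistically richer models to field data.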
MELCOR computer code manuals: Primer and user's guides, Version 1.8.3 September 1994. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.
1995-03-01
MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users' Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.
Culbertson, C N; Wangerin, K; Ghandourah, E; Jevremovic, T
2005-08-01
The goal of this study was to evaluate the COG Monte Carlo radiation transport code, developed and tested by Lawrence Livermore National Laboratory, for neutron capture therapy related modeling. A boron neutron capture therapy model was analyzed, comparing COG results with results from the widely used MCNP4B (Monte Carlo N-Particle) transport code. The approach for computing neutron fluence rate and each dose component relevant in boron neutron capture therapy is described, and calculated values are shown in detail. The differences between the COG and MCNP predictions are qualified and quantified. The differences are generally small and suggest that the COG code can be applied to BNCT-related research problems.
ERIC Educational Resources Information Center
Journal of Chemical Education, 1988
1988-01-01
Reviews three computer software packages for Apple II computers. Includes "Simulation of Hemoglobin Function,""Solution Equilibrium Problems," and "Thin-Layer Chromatography." Contains ratings of ease of use, subject matter content, pedagogic value, and student reaction according to two separate reviewers for each…
Computer Center: Software Review.
ERIC Educational Resources Information Center
Duhrkopf, Richard, Ed.; Belshe, John F., Ed.
1988-01-01
Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color-graphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peavler, J.
1979-06-01
This publication gives details about hardware, software, procedures, and services of the Central Computing Facility, as well as information about how to become an authorized user. Languages, compilers, libraries, and application packages available are described. 17 tables. (RWR)
Ku-Band rendezvous radar performance computer simulation model
NASA Technical Reports Server (NTRS)
Magnusson, H. G.; Goff, M. F.
1984-01-01
All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.
Ku-Band rendezvous radar performance computer simulation model
NASA Astrophysics Data System (ADS)
Magnusson, H. G.; Goff, M. F.
1984-06-01
All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.
NASA Astrophysics Data System (ADS)
Youssef, Nabil L.; Elgendi, S. G.
2014-03-01
The book “Handbook of Finsler geometry” is accompanied by a CD containing an elegant Maple package, FINSLER, for calculations in Finsler geometry. Using this package, an example concerning a Finsler generalization of Einstein’s vacuum field equations was treated. In this example, the calculation of the components of the hv-curvature of the Cartan connection leads to wrong expressions. Moreover, the FINSLER package works only in dimension four. We introduce a new Finsler package that fixes these two problems. We also extend the package to compute not only the geometric objects associated with the Cartan connection but also those associated with the Berwald, Chern and Hashiguchi connections, in any dimension. These improvements are illustrated by a concrete example. Furthermore, the problem of simplifying tensor expressions is treated. This paper is intended to make calculations in Finsler geometry easier and simpler.
Casey, D T; Volegov, P L; Merrill, F E; Munro, D H; Grim, G P; Landen, O L; Spears, B K; Fittinghoff, D N; Field, J E; Smalyuk, V A
2016-11-01
The Neutron Imaging System at the National Ignition Facility is used to observe the primary ∼14 MeV neutrons from the hotspot and down-scattered neutrons (6-12 MeV) from the assembled shell. Due to the strong spatial dependence of the primary neutron fluence through the dense shell, the down-scattered image is convolved with the primary-neutron fluence much like a backlighter profile. Using a characteristic scattering angle assumption, we estimate the primary neutron fluence and compensate the down-scattered image, which reveals information about asymmetry that is otherwise difficult to extract without invoking complicated models.
Chen, Hu; Liu, Jing; Li, Hong; Ge, Wenqi; Sun, Yuchun; Wang, Yong; Lü, Peijun
2015-02-01
The objective was to study the relationship between laser fluence and ablation efficiency of a femtosecond laser with a Gaussian-shaped pulse used to ablate dentin and enamel for prosthodontic tooth preparation. A diode-pumped thin-disk femtosecond laser with a wavelength of 1025 nm and a pulse width of 400 fs was used for the ablation of dentin and enamel. The laser spot was guided in a line on the dentin and enamel surfaces to form a groove-shaped ablation zone under a series of laser pulse energies. The width and volume of the ablated line were measured under a three-dimensional confocal microscope to calculate the ablation efficiency. Ablation efficiency for dentin reached a maximum value of 0.020 mm3∕J when the laser fluence was set at 6.51 J∕cm2. For enamel, the maximum ablation efficiency was 0.009 mm3∕J at a fluence of 7.59 J∕cm2. Ablation efficiency of the femtosecond laser on dentin and enamel is closely related to the laser fluence and may reach a maximum when the laser fluence is set to an appropriate value. © 2015 Society of Photo-Optical Instrumentation Engineers (SPIE)
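The two reported quantities follow from their definitions: ablation efficiency is ablated volume per delivered energy, and for a Gaussian spot the peak fluence is commonly taken as 2E/(πw₀²). A numeric sketch with illustrative values, not the paper's measured data:

```python
import math

# Ablation efficiency (mm^3/J) as ablated volume per total delivered
# energy, and peak fluence (J/cm^2) of a Gaussian spot via the standard
# relation F_peak = 2E / (pi * w0^2). Illustrative inputs only.

def peak_fluence(pulse_energy_j, waist_cm):
    return 2.0 * pulse_energy_j / (math.pi * waist_cm**2)

def ablation_efficiency(volume_mm3, n_pulses, pulse_energy_j):
    return volume_mm3 / (n_pulses * pulse_energy_j)
```

For example, a 1 mJ pulse focused to a 25 µm waist gives a peak fluence of roughly 102 J/cm².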
Formation of porous networks on polymeric surfaces by femtosecond laser micromachining
NASA Astrophysics Data System (ADS)
Assaf, Youssef; Kietzig, Anne-Marie
2017-02-01
In this study, porous network structures were successfully created on various polymer surfaces by femtosecond laser micromachining. Six different polymers (poly(tetrafluoroethylene) (PTFE), poly(methyl methacrylate) (PMMA), high density poly(ethylene) (HDPE), poly(lactic acid) (PLA), poly(carbonate) (PC), and poly(ethylene terephthalate) (PET)) were machined at different fluences and pulse numbers, and the resulting structures were identified and compared by lacunarity analysis. At low fluence and pulse numbers, porous networks were confirmed to form on all materials except PLA. Furthermore, all networks except for PMMA were shown to bundle up at high fluence and pulse numbers. In the case of PC, a complete breakdown of the structure at such conditions was observed. Operation slightly above threshold fluence and at low pulse numbers is therefore recommended for porous network formation. Finally, the thickness over which these structures formed was measured and compared to two intrinsic material dependent parameters: the single pulse threshold fluence and the incubation coefficient. Results indicate that a lower threshold fluence at operating conditions favors material removal over structure formation and is hence detrimental to porous network formation. Favorable machining conditions and material-dependent parameters for the formation of porous networks on polymer surfaces have thus been identified.
Laurence, Ted A; Bude, Jeff D; Ly, Sonny; Shen, Nan; Feit, Michael D
2012-05-07
Surface laser damage limits the lifetime of optics for systems guiding high fluence pulses, particularly damage in silica optics used for inertial confinement fusion-class lasers (nanosecond-scale high energy pulses at 355 nm/3.5 eV). The density of damage precursors at low fluence has been measured using large beams (1-3 cm); higher fluences cannot be measured easily since the high density of resulting damage initiation sites results in clustering. We developed automated experiments and analysis that allow us to damage test thousands of sites with small beams (10-30 µm), and automatically image the test sites to determine if laser damage occurred. We developed an analysis method that provides a rigorous connection between these small beam damage test results of damage probability versus laser pulse energy and the large beam damage results of damage precursor densities versus fluence. We find that for uncoated and coated fused silica samples, the distribution of precursors nearly flattens at very high fluences, up to 150 J/cm2, providing important constraints on the physical distribution and nature of these precursors.
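The standard bridge between small-beam damage probabilities and precursor densities is Poisson statistics: if ρ(φ) is the areal density of precursors that damage at or below fluence φ, a beam of effective area A damages a site with probability P = 1 - exp(-ρ(φ)A). A sketch of that relation only (illustrative; the paper's analysis additionally accounts for the beam's fluence profile):

```python
import math

# Poisson-statistics link between precursor density rho (sites/cm^2
# damaging at or below fluence phi) and the damage probability of a
# test site illuminated with effective beam area A (cm^2):
#   P = 1 - exp(-rho * A)
# Inverting measured small-beam probabilities recovers rho(phi).

def damage_probability(rho_per_cm2, area_cm2):
    return 1.0 - math.exp(-rho_per_cm2 * area_cm2)

def precursor_density(p_damage, area_cm2):
    return -math.log(1.0 - p_damage) / area_cm2
```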
NASA Astrophysics Data System (ADS)
Puttaraksa, Nitipon; Norarat, Rattanaporn; Laitinen, Mikko; Sajavaara, Timo; Singkarat, Somsorn; Whitlow, Harry J.
2012-02-01
Poly(methyl methacrylate) is a common polymer used as a lithographic resist for all forms of particle (photon, ion and electron) beam writing. Faithful lithographic reproduction requires that the exposure dose, Θ, lies in the window Θ0 ⩽ Θ < Θx, where Θ0 and Θx represent the clearing and cross-linking onset doses, respectively. In this work we have used the programmable proximity aperture ion beam lithography systems in Chiang Mai and Jyväskylä to determine the exposure characteristics in terms of fluence for 2 MeV protons, 3 MeV ⁴He and 6 MeV ¹²C ions, respectively. After exposure the samples were developed in a 7:3 by volume propan-2-ol:de-ionised water mixture. At fluences below the clearing fluence, the exposed regions were characterised by rough regions, particularly for He, with holes around the ion tracks. As the fluence (dose) increases beyond the clearing dose, the PMMA is uniformly removed with sharp vertical walls. When Θ exceeds the cross-linking onset fluence, the bottom of the exposed regions shows undissolved PMMA.
Fast approximate delivery of fluence maps for IMRT and VMAT
NASA Astrophysics Data System (ADS)
Balvert, Marleen; Craft, David
2017-02-01
In this article we provide a method to generate the trade-off between delivery time and fluence map matching quality for dynamically delivered fluence maps. At the heart of our method lies a mathematical programming model that, for a given duration of delivery, optimizes leaf trajectories and dose rates such that the desired fluence map is reproduced as well as possible. We begin with the single fluence map case and then generalize the model and the solution technique to the delivery of sequential fluence maps. The resulting large-scale, non-convex optimization problem was solved using a heuristic approach. We test our method using a prostate case and a head and neck case, and present the resulting trade-off curves. Analysis of the leaf trajectories reveals that short time plans have larger leaf openings in general than longer delivery time plans. Our method allows one to explore the continuum of possibilities between coarse, large segment plans characteristic of direct aperture approaches and narrow field plans produced by sliding window approaches. Exposing this trade-off will allow for an informed choice between plan quality and solution time. Further research is required to speed up the optimization process to make this method clinically implementable.
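A useful baseline for the delivery-time end of this trade-off is the classic sliding-window result: for unidirectional leaf travel, the minimum beam-on time needed to reproduce a 1D fluence profile exactly equals the sum of the profile's positive increments. A sketch of that baseline (a known general result, not the paper's optimization model):

```python
# Minimum beam-on time (in MU, up to a dose-rate factor) to deliver a
# 1D fluence profile with a unidirectional sliding-window leaf pair:
# the sum of positive increments of the profile. Sketch of the classic
# leaf-sequencing decomposition, not the paper's trajectory model.

def min_beam_on_time(profile):
    prev, total = 0.0, 0.0
    for v in profile:
        if v > prev:
            total += v - prev
        prev = v
    return total
```

Highly modulated profiles (many positive increments) therefore cost delivery time, which is the quantity traded against fluence-map fidelity above.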
'spup' - an R package for uncertainty propagation in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2016-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including applicability to case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally.
Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.
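The core Monte Carlo loop that 'spup' automates can be illustrated in a few lines of plain Python (a generic sketch of the technique, not the package's R interface; the example model and distributions are assumptions):

```python
import random
import statistics

# Generic Monte Carlo uncertainty propagation: sample uncertain inputs
# from their distributions, run the model once per realization, and
# summarize the output distribution. Plain-Python sketch of the idea
# behind 'spup', not its API.

def propagate(model, samplers, n=2000, seed=42):
    rng = random.Random(seed)
    outs = [model(*(s(rng) for s in samplers)) for _ in range(n)]
    return statistics.mean(outs), statistics.stdev(outs)

# Hypothetical model y = a * x with a ~ N(2, 0.1) and x ~ N(5, 0.5)
mean, sd = propagate(lambda a, x: a * x,
                     [lambda r: r.gauss(2.0, 0.1),
                      lambda r: r.gauss(5.0, 0.5)])
```

Stratified or Latin hypercube sampling, as implemented in the package, replaces the simple random draws to reduce the number of realizations needed.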
'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2017-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or the ability to deal with case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally.
Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.
QDENSITY—A Mathematica Quantum Computer simulation
NASA Astrophysics Data System (ADS)
Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank
2006-06-01
This Mathematica 5.2 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m, which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc. Selected examples of the basic commands are presented here, and a tutorial notebook, Tutorial.nb, is provided with the package (available on our website) that serves as a full guide to the package. Application is then made to a variety of relevant cases, including Teleportation, Quantum Fourier transform, Grover's search and Shor's algorithm, in separate notebooks: QFT.nb, Teleportation.nb, Grover.nb and Shor.nb, where each algorithm is explained in detail. Finally, two examples of the construction and manipulation of cluster states, which are part of "one way computing" ideas, are included as an additional tool in the notebook Cluster.nb. A Mathematica palette containing most commands in QDENSITY is also included: QDENSpalette.nb.
Program summary
Title of program: QDENSITY
Catalogue identifier: ADXH_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXH_v1_0
Program available from: CPC Program Library, Queen's University of Belfast, N. Ireland
Operating systems: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4
Programming language used: Mathematica 5.2
No. of bytes in distributed program, including test data, etc.: 180 581
No. of lines in distributed program, including test data, etc.: 19 382
Distribution format: tar.gz
Method of solution: A Mathematica package is provided which contains commands to create and analyze quantum circuits.
Several Mathematica notebooks containing relevant examples (Teleportation, Shor's algorithm and Grover's search) explain each algorithm in detail. A tutorial, Tutorial.nb, is also enclosed. QDENSITY is available at http://www.pitt.edu/~tabakin/QDENSITY.
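The density-matrix formalism QDENSITY emphasizes reduces to conjugating ρ by each gate, U ρ U†. A minimal NumPy sketch of that idea (a generic illustration, not the Mathematica package itself):

```python
import numpy as np

# One-qubit density-matrix circuit step: a gate U acts on the state
# rho as U rho U^dagger. Applying a Hadamard to |0><0| yields |+><+|.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

def apply_gate(rho, U):
    return U @ rho @ U.conj().T

rho0 = np.array([[1, 0], [0, 0]], dtype=complex)   # pure state |0><0|
rho1 = apply_gate(rho0, H)                         # |+><+|, all entries 0.5
```

Multi-qubit circuits follow the same pattern with Kronecker products of single-qubit gates, which is essentially what the package's gate commands build.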
A computational framework for automation of point defect calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goyal, Anuj; Gorai, Prashun; Peng, Haowei
We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
A computational framework for automation of point defect calculations
Goyal, Anuj; Gorai, Prashun; Peng, Haowei; ...
2017-01-13
We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
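Of the three correction schemes, the image-charge correction has a compact leading-order form, the first term of the Makov-Payne expression. A sketch of that term only, with illustrative inputs in atomic-style units; the framework's actual implementation and conventions may differ:

```python
# Leading-order (point-charge) image-charge correction for a charged
# defect in a cubic supercell, the first Makov-Payne term:
#   E_corr = q^2 * alpha_M / (2 * eps * L)
# q: defect charge (e), alpha_M: lattice-dependent Madelung constant,
# eps: static dielectric constant, L: supercell length (bohr);
# energy in hartree. Illustrative sketch, not the package's API.

def image_charge_correction(q, alpha_m, eps, L):
    return q**2 * alpha_m / (2.0 * eps * L)
```

For a doubly charged defect in a 20 bohr simple-cubic cell (αM ≈ 2.8373) with ε = 10, this gives about 0.028 hartree.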
Bird impact analysis package for turbine engine fan blades
NASA Technical Reports Server (NTRS)
Hirschbein, M. S.
1982-01-01
A computer program has been developed to analyze the gross structural response of turbine engine fan blades subjected to bird strikes. The program couples a NASTRAN finite element model and modal analysis of a fan blade with a multi-mode bird impact analysis computer program. The impact analysis uses the NASTRAN blade model and a fluid jet model of the bird to interactively calculate blade loading during a bird strike event. The analysis package is computationally efficient, easy to use, and provides a comprehensive history of the gross structural blade response. Example cases are presented for a representative fan blade.
Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Lívia Almeida Bueno; Freitas, Deborah Queiroz
2015-01-01
This article aimed at comparing the accuracy of linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with the OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the greatest, with the XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
Efficient Predictions of Excited State for Nanomaterials Using Aces 3 and 4
2017-12-20
Excited states of nanomaterials are predicted by first-principles methods in the software package ACES using large parallel computers, scaling toward the exascale. Subject terms: computer modeling, excited states, optical properties, structure, stability, activation barriers, first-principles methods, parallel computing.
Computers as Teaching Tools: Some Examples and Guidelines.
ERIC Educational Resources Information Center
Beins, Bernard C.
The use of computers in the classroom has been touted as an important innovation in education. This article features some recently developed software for use in teaching psychology and different approaches to classroom computer use. Uses of software packages for psychological research designs are included as are applications and limitations of…
The Use of Microcomputers in Distance Teaching Systems. ZIFF Papiere 70.
ERIC Educational Resources Information Center
Rumble, Greville
Microcomputers have revolutionized distance education in virtually every area. Used alone, personal computers provide students with a wide range of utilities, including word processing, graphics packages, and spreadsheets. When linked to a mainframe computer or connected to other personal computers in local area networks, microcomputers can…
Instructional Support Software System. Final Report.
ERIC Educational Resources Information Center
McDonnell Douglas Astronautics Co. - East, St. Louis, MO.
This report describes the development of the Instructional Support System (ISS), a large-scale, computer-based training system that supports both computer-assisted instruction and computer-managed instruction. Written in the Ada programming language, the ISS software package is designed to be machine independent. It is also grouped into functional…
Application of Computer Graphics to Graphing in Algebra and Trigonometry. Final Report.
ERIC Educational Resources Information Center
Morris, J. Richard
This project was designed to improve the graphing competency of students in elementary algebra, intermediate algebra, and trigonometry courses at Virginia Commonwealth University. Computer graphics programs were designed using an Apple II Plus computer and implemented using Pascal. The software package is interactive and gives students control…
ROUTES: a computer program for preliminary route location.
S.E. Reutebuch
1988-01-01
An analytical description of the ROUTES computer program is presented. ROUTES is part of the integrated preliminary harvest- and transportation-planning software package, PLANS. The ROUTES computer program is useful where grade and sideslope limitations are important in determining routes for vehicular travel. With the program, planners can rapidly identify route...
A Novel Use of Computer Simulation in an Applied Pharmacokinetics Course.
ERIC Educational Resources Information Center
Sullivan, Timothy J.
1982-01-01
The use of a package of interactive computer programs designed to simulate pharmacokinetic monitoring of drug therapy in a required undergraduate applied pharmacokinetics course is described. Students were assigned the problem of maintaining therapeutic drug concentrations in a computer generated "patient" as an adjunct to classroom instruction.…
Real-time simulator for designing electron dual scattering foil systems.
Carver, Robert L; Hogstrom, Kenneth R; Price, Michael J; LeBlanc, Justin D; Pitcher, Garrett M
2014-11-08
The purpose of this work was to develop a user friendly, accurate, real-time computer simulator to facilitate the design of dual foil scattering systems for electron beams on radiotherapy accelerators. The simulator allows for a relatively quick, initial design that can be refined and verified with subsequent Monte Carlo (MC) calculations and measurements. The simulator is also a powerful educational tool. The simulator consists of an analytical algorithm for calculating electron fluence and X-ray dose and a graphical user interface (GUI) C++ program. The algorithm predicts electron fluence using Fermi-Eyges multiple Coulomb scattering theory with the reduced Gaussian formalism for scattering powers. The simulator also estimates central-axis and off-axis X-ray dose arising from the dual foil system. Once the geometry of the accelerator is specified, the simulator allows the user to continuously vary primary scattering foil material and thickness, secondary scattering foil material and Gaussian shape (thickness and sigma), and beam energy. The off-axis electron relative fluence or total dose profile and central-axis X-ray dose contamination are computed and displayed in real time. The simulator was validated by comparison of off-axis electron relative fluence and X-ray percent dose profiles with those calculated using EGSnrc MC. Over the energy range 7-20 MeV, using present foils on an Elekta radiotherapy accelerator, the simulator was able to reproduce MC profiles to within 2% out to 20 cm from the central axis. The central-axis X-ray percent dose predictions matched measured data to within 0.5%. The calculation time was approximately 100 ms using a single Intel 2.93 GHz processor, which allows for real-time variation of foil geometrical parameters using slider bars.
This work demonstrates how the user-friendly GUI and real-time nature of the simulator make it an effective educational tool for gaining a better understanding of the effects that various system parameters have on a relative dose profile. This work also demonstrates a method for using the simulator as a design tool for creating custom dual scattering foil systems in the clinical range of beam energies (6-20 MeV).
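The Fermi-Eyges machinery underlying the fluence algorithm can be sketched compactly: the off-axis electron profile is Gaussian, with a variance accumulated from the scattering power T along the path, σ² = ∫ (z_det − z)² T(z) dz. A toy Python version with a uniform scatterer (not the simulator's reduced-Gaussian formalism; units and numbers are illustrative):

```python
import math

# Fermi-Eyges small-angle scattering: relative planar fluence off axis
# is Gaussian, f(r) = exp(-r^2 / (2 sigma^2)), with sigma^2 accumulated
# from the scattering power T(z) weighted by lever arm to the detector:
#   sigma^2 = integral of (z_det - z)^2 * T(z) dz
# Toy midpoint-rule sketch with uniform T, not the simulator's model.

def sigma2(T_per_cm, z_det_cm, n=1000):
    dz = z_det_cm / n
    return sum((z_det_cm - (i + 0.5) * dz) ** 2 * T_per_cm * dz
               for i in range(n))

def rel_fluence(r_cm, sigma_cm):
    return math.exp(-r_cm**2 / (2.0 * sigma_cm**2))
```

For uniform T the integral reduces to T·z³/3, a quick analytic check of the numerical accumulation.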
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y M; Han, B; Xing, L
2016-06-15
Purpose: EPID-based patient-specific quality assurance provides verification of the planning setup and delivery process that phantomless QA and log-file based virtual dosimetry methods cannot achieve. We present a method for EPID-based QA utilizing spatially-variant EPID response kernels that allows for direct calculation of the entrance fluence and 3D phantom dose. Methods: An EPID dosimetry system was utilized for 3D dose reconstruction in a cylindrical phantom for the purposes of end-to-end QA. Monte Carlo (MC) methods were used to generate pixel-specific point-spread functions (PSFs) characterizing the spatially non-uniform EPID portal response in the presence of phantom scatter. The spatially-variant PSFs were decomposed into spatially-invariant basis PSFs with the symmetric central-axis kernel as the primary basis kernel and off-axis kernels representing orthogonal perturbations in pixel-space. This compact and accurate characterization enables the use of a modified Richardson-Lucy deconvolution algorithm to directly reconstruct entrance fluence from EPID images without iterative scatter subtraction. High-resolution phantom dose kernels were cogenerated in MC with the PSFs, enabling direct recalculation of the resulting phantom dose by rapid forward convolution once the entrance fluence was calculated. A Delta4 QA phantom was used to validate the dose reconstructed in this approach. Results: The spatially-invariant representation of the EPID response accurately reproduced the entrance fluence with >99.5% fidelity with a simultaneous reduction of >60% in computational overhead. 3D dose for 10⁶ voxels was reconstructed for the entire phantom geometry. A 3D global gamma analysis demonstrated a >95% pass rate at 3%/3mm. Conclusion: Our approach demonstrates the capabilities of an EPID-based end-to-end QA methodology that is more efficient than traditional EPID dosimetry methods.
Displacing the point of measurement external to the QA phantom reduces the necessary complexity of the phantom itself while offering a method that is highly scalable and inherently generalizable to rotational and trajectory based deliveries. This research was partially supported by Varian.« less
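The deconvolution step at the heart of this approach can be illustrated with a minimal sketch. The paper's method uses spatially-variant basis kernels; here a single 1D Gaussian kernel and standard Richardson-Lucy iteration stand in for it, so the code shows only the general technique, not the authors' implementation:

```python
import numpy as np

def richardson_lucy(image, psf, iterations=100):
    """Standard Richardson-Lucy deconvolution in 1D: iteratively re-weight
    the fluence estimate by the ratio of the measured image to the estimate
    re-blurred with the point-spread function."""
    estimate = np.full_like(image, image.mean())
    psf_mirror = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)  # avoid division by zero
        estimate = estimate * np.convolve(ratio, psf_mirror, mode="same")
    return estimate
```

Once the entrance fluence is recovered this way, the forward step (phantom dose by convolution with a dose kernel) is a single convolution per profile, which is why the overall pipeline avoids iterative scatter subtraction.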
Kheyrandish, Ataollah; Mohseni, Madjid; Taghipour, Fariborz
2018-06-15
Determining fluence is essential to derive the inactivation kinetics of microorganisms and to design ultraviolet (UV) reactors for water disinfection. UV light-emitting diodes (UV-LEDs) are emerging UV sources with various advantages compared to conventional UV lamps. Unlike conventional mercury lamps, no standard method is available to determine the average fluence of UV-LEDs, and conventional methods used to determine the fluence of UV mercury lamps are not applicable to UV-LEDs due to their relatively low power output, polychromatic output, and specific radiation profile. In this study, a method was developed to determine the average fluence inside a water suspension in a UV-LED experimental setup. In this method, the average fluence was estimated by measuring the irradiance at a few points for a collimated and uniform radiation on a Petri dish surface. New correction parameters were defined and proposed, and several of the existing parameters for determining the fluence of the UV mercury lamp apparatus were revised to measure and quantify the collimation and uniformity of the radiation. To study the effect of the polychromatic output and radiation profile of UV-LEDs, two UV-LEDs with peak wavelengths of 262 and 275 nm and different radiation profiles were selected as representatives of typical UV-LEDs applied to microbial inactivation. The proper setup configuration for microorganism inactivation studies was also determined based on the defined correction factors.
Two solar proton fluence models based on ground level enhancement observations
NASA Astrophysics Data System (ADS)
Raukunen, Osku; Vainio, Rami; Tylka, Allan J.; Dietrich, William F.; Jiggens, Piers; Heynderickx, Daniel; Dierckxsens, Mark; Crosby, Norma; Ganse, Urs; Siipola, Robert
2018-01-01
Solar energetic particles (SEPs) constitute an important component of the radiation environment in interplanetary space. Accurate modeling of SEP events is crucial for the mitigation of radiation hazards in spacecraft design. In this study we present two new statistical models of high energy solar proton fluences based on ground level enhancement (GLE) observations during solar cycles 19-24. As the basis of our modeling, we utilize a four parameter double power law function (known as the Band function) fits to integral GLE fluence spectra in rigidity. In the first model, the integral and differential fluences for protons with energies between 10 MeV and 1 GeV are calculated using the fits, and the distributions of the fluences at certain energies are modeled with an exponentially cut-off power law function. In the second model, we use a more advanced methodology: by investigating the distributions and relationships of the spectral fit parameters we find that they can be modeled as two independent and two dependent variables. Therefore, instead of modeling the fluences separately at different energies, we can model the shape of the fluence spectrum. We present examples of modeling results and show that the two methodologies agree well except for a short mission duration (1 year) at low confidence level. We also show that there is a reasonable agreement between our models and three well-known solar proton models (JPL, ESP and SEPEM), despite the differences in both the modeling methodologies and the data used to construct the models.
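The four-parameter double power law in rigidity described above has a standard closed form: a power law with exponential rollover below a break rigidity, matched continuously to a pure power law above it. A sketch of that parameterization follows; the parameter values in the test are purely illustrative, not fitted GLE values:

```python
import math

def band_fluence(R, J0, gamma1, gamma2, R0):
    """Band-function integral fluence at rigidity R (GV).
    Below the break rigidity Rb = (gamma2 - gamma1) * R0 the spectrum is a
    power law with exponential rollover; above Rb it is a pure power law,
    with the prefactor chosen so the two branches join continuously."""
    Rb = (gamma2 - gamma1) * R0
    if R <= Rb:
        return J0 * R ** (-gamma1) * math.exp(-R / R0)
    return (J0 * R ** (-gamma2) * Rb ** (gamma2 - gamma1)
            * math.exp(gamma1 - gamma2))
```

Modeling the shape of the spectrum, as in the second model, then reduces to modeling the joint distribution of (J0, gamma1, gamma2, R0) rather than the fluence at each energy separately.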
Computing Spacecraft Solar-Cell Damage by Charged Particles
NASA Technical Reports Server (NTRS)
Gaddy, Edward M.
2006-01-01
General EQFlux is a computer program that converts the measure of the damage done to solar cells in outer space by impingement of electrons and protons having many different kinetic energies into the measure of the damage done by an equivalent fluence of electrons, each having a kinetic energy of 1 MeV. Prior to the development of General EQFlux, there was no single computer program offering this capability: for a given type of solar cell, it was necessary either to perform the calculations manually or to use one of three Fortran programs, each of which was applicable to only one type of solar cell. The problem in developing General EQFlux was to rewrite and combine the three programs into a single program that could perform the calculations for three types of solar cells and run in a Windows environment with a Windows graphical user interface. In comparison with the three prior programs, General EQFlux is easier to use.
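The conversion such a program performs amounts to a damage-weighted sum over the incident spectrum. The sketch below illustrates the idea only; the relative damage coefficients are made-up placeholders, since real values are cell-type-specific and tabulated in the solar cell radiation handbooks:

```python
def equivalent_fluence(spectrum, rdc):
    """Collapse an electron/proton spectrum into an equivalent fluence of
    1-MeV electrons: sum each fluence component times its relative damage
    coefficient (RDC), normalized so that RDC(1 MeV electron) = 1."""
    return sum(phi * rdc[species] for species, phi in spectrum.items())
```

A program like General EQFlux effectively wraps this sum with per-cell-type RDC tables and interpolation over the full energy grid.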
Tretola, M; Di Rosa, A R; Tirloni, E; Ottoboni, M; Giromini, C; Leone, F; Bernardi, C E M; Dell'Orto, V; Chiofalo, V; Pinotti, L
2017-08-01
The use of alternative feed ingredients in farm animals' diets can be an interesting choice from several standpoints, including safety. In this respect, this study investigated the safety features of selected former food products (FFPs) intended for animal nutrition produced in the framework of the IZS PLV 06/14 RC project by an FFP processing plant. Six FFP samples, both mash and pelleted, were analysed for the enumeration of total viable count (TVC) (ISO 4833), Enterobacteriaceae (ISO 21528-1), Escherichia coli (ISO 16649-1), coagulase-positive Staphylococci (CPS) (ISO 6888), presumptive Bacillus cereus and its spores (ISO 7932), sulphite-reducing Clostridia (ISO 7937), yeasts and moulds (ISO 21527-1), and the presence in 25 g of Salmonella spp. (ISO 6579). On the same samples, the presence of undesired ingredients, which can be identified as remnants of packaging materials, was evaluated by two different methods: stereomicroscopy according to published methods; and stereomicroscopy coupled with a computer vision system (IRIS Visual Analyzer VA400). All FFPs analysed were safe from a microbiological point of view. TVC was limited and Salmonella was always absent. When remnants of packaging materials were considered, the contamination level was below 0.08% (w/w). Of note, packaging remnants were found mainly in the 1-mm sieve mesh fractions. Finally, the innovative computer vision system demonstrated the possibility of rapid detection of packaging remnants in FFPs when combined with a stereomicroscope. In conclusion, the FFPs analysed in the present study can be considered safe, even though some improvements in FFP processing in the feeding plant could be useful in further reducing their microbial loads and impurity.
QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.
Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei
2014-01-01
Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
NASA Astrophysics Data System (ADS)
Wang, Yu; Liu, Qun
2013-01-01
Surplus-production models are widely used in fish stock assessment and fisheries management due to their simplicity and lower data demands than age-structured models such as Virtual Population Analysis. The CEDA (catch-effort data analysis) and ASPIC (a surplus-production model incorporating covariates) computer packages are data-fitting or parameter-estimation tools that have been developed to analyze catch-and-effort data using non-equilibrium surplus production models. We applied CEDA and ASPIC to the hairtail (Trichiurus japonicus) fishery in the East China Sea. Both packages produced robust results and yielded similar estimates. In CEDA, the Schaefer surplus production model with a log-normal error assumption produced results close to those of ASPIC. CEDA is sensitive to the choice of initial proportion, while ASPIC is not. However, CEDA produced higher R^2 values than ASPIC.
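The non-equilibrium Schaefer dynamics that both packages fit can be sketched in a few lines. Parameter values below are illustrative; CEDA and ASPIC estimate r, K, and the catchability coefficient from the observed catch-effort series rather than taking them as given:

```python
def schaefer_biomass(r, K, B0, catches):
    """Project stock biomass under the Schaefer surplus-production model:
    B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t], floored at zero."""
    B = [B0]
    for C in catches:
        B.append(max(B[-1] + r * B[-1] * (1.0 - B[-1] / K) - C, 0.0))
    return B

def msy(r, K):
    """Maximum sustainable yield implied by the Schaefer model: rK/4."""
    return r * K / 4.0
```

Fitting then means choosing (r, K, B0, q) so that predicted CPUE, q times projected biomass, best matches the observed effort-standardized catches under the chosen error assumption (e.g., log-normal).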
The Ettention software package.
Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp
2016-02-01
We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building blocks for tomographic reconstruction algorithms. The well-known block-iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphics processing units (GPUs) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.
Parallel Adaptive Mesh Refinement Library
NASA Technical Reports Server (NTRS)
Mac-Neice, Peter; Olson, Kevin
2005-01-01
Parallel Adaptive Mesh Refinement Library (PARAMESH) is a package of Fortran 90 subroutines designed to provide a computer programmer with an easy route to extension of (1) a previously written serial code that uses a logically Cartesian structured mesh into (2) a parallel code with adaptive mesh refinement (AMR). Alternatively, in its simplest use, and with minimal effort, PARAMESH can operate as a domain-decomposition tool for users who want to parallelize their serial codes but who do not wish to utilize adaptivity. The package builds a hierarchy of sub-grids to cover the computational domain of a given application program, with spatial resolution varying to satisfy the demands of the application. The sub-grid blocks form the nodes of a tree data structure (a quad-tree in two or an oct-tree in three dimensions). Each grid block has a logically Cartesian mesh. The package supports one-, two- and three-dimensional models.
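The tree-of-blocks idea behind such a library can be sketched in a few lines. This is a toy 2D quad-tree, not the actual Fortran 90 API of PARAMESH; block contents (the logically Cartesian mesh each node carries) are omitted for brevity:

```python
class Block:
    """One node of a PARAMESH-style quad-tree: a sub-grid covering the
    rectangle (x0, y0)-(x1, y1) that can be split into four children."""
    def __init__(self, x0, y0, x1, y1, level=0):
        self.bounds, self.level, self.children = (x0, y0, x1, y1), level, []

    def refine(self):
        x0, y0, x1, y1 = self.bounds
        xm, ym = 0.5 * (x0 + x1), 0.5 * (y0 + y1)
        self.children = [Block(a, b, c, d, self.level + 1)
                         for a, b, c, d in ((x0, y0, xm, ym), (xm, y0, x1, ym),
                                            (x0, ym, xm, y1), (xm, ym, x1, y1))]

def refine_where(block, needs_refinement, max_level=3):
    """Recursively refine blocks flagged by the application's own criterion,
    up to a maximum refinement level."""
    if block.level < max_level and needs_refinement(block):
        block.refine()
        for child in block.children:
            refine_where(child, needs_refinement, max_level)
```

In the real package the leaves of this tree are distributed across processors, which is also how its simplest domain-decomposition mode works: one level, blocks parceled out with no adaptivity.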
NASA Technical Reports Server (NTRS)
Peredo, James P.
1988-01-01
Like many large companies, Ames relies heavily on its computing power to get work done. And, like many other large companies, Ames has found the IBM PC a reliable tool and uses it for many of the same types of functions as other companies. Presentation and clarification needs place heavy demands on graphics packages. Programming and text-editing needs require simple yet powerful packages. The storage space needed by NASA's scientists and users for the monumental amounts of data that Ames must keep demands database packages that are large and easy to use. Access to the Micom Switching Network combines the power of the IBM PC with the capabilities of other computers and mainframes and allows users to communicate electronically. These four primary capabilities of the PC are vital to the needs of NASA's users and help to continue and support the vast amounts of work done by NASA employees.
Implementation of Audio Computer-Assisted Interviewing Software in HIV/AIDS Research
Pluhar, Erika; Yeager, Katherine A.; Corkran, Carol; McCarty, Frances; Holstad, Marcia McDonnell; Denzmore-Nwagbara, Pamela; Fielder, Bridget; DiIorio, Colleen
2007-01-01
Computer assisted interviewing (CAI) has begun to play a more prominent role in HIV/AIDS prevention research. Despite the increased popularity of CAI, particularly audio computer assisted self-interviewing (ACASI), some research teams are still reluctant to implement ACASI technology due to lack of familiarity with the practical issues related to using these software packages. The purpose of this paper is to describe the implementation of one particular ACASI software package, the Questionnaire Development System™ (QDS™), in several nursing and HIV/AIDS prevention research settings. We present acceptability and satisfaction data from two large-scale public health studies in which we have used QDS with diverse populations. We also address issues related to developing and programming a questionnaire, discuss practical strategies related to planning for and implementing ACASI in the field, including selecting equipment, training staff, and collecting and transferring data, and summarize advantages and disadvantages of computer assisted research methods. PMID:17662924
The Computer Aided Aircraft-design Package (CAAP)
NASA Technical Reports Server (NTRS)
Yalif, Guy U.
1994-01-01
The preliminary design of an aircraft is a complex, labor-intensive, and creative process. Since the 1970's, many computer programs have been written to help automate preliminary airplane design. Time and resource analyses have identified 'a substantial decrease in project duration with the introduction of an automated design capability'. Proof-of-concept studies have been completed which establish 'a foundation for a computer-based airframe design capability'. Unfortunately, today's design codes exist in many different languages on many, often expensive, hardware platforms. Through the use of a module-based system architecture, the Computer Aided Aircraft-design Package (CAAP) will eventually bring together many of the most useful features of existing programs. Through the use of an expert system, it will add an additional feature that could be described as indispensable to entry-level engineers and students: the incorporation of 'expert' knowledge into the automated design process.
NASA Astrophysics Data System (ADS)
Hoeferkamp, M. R.; Grummer, A.; Rajen, I.; Seidel, S.
2018-05-01
Methods are developed for the application of forward-biased p-i-n photodiodes to measurements of charged particle fluence beyond 10^15 1-MeV-neutron-equivalent/cm^2. An order-of-magnitude extension of the regime where forward voltage can be used to infer fluence is achieved for OSRAM BPW34F devices.
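The inference step, forward voltage at a fixed drive current to accumulated fluence, amounts to inverting a measured calibration curve. A sketch follows with invented calibration points; any real curve must be measured per device type, drive current, and annealing history:

```python
import numpy as np

def fluence_from_vf(vf, cal_vf, cal_log10_fluence):
    """Infer fluence (1-MeV-neutron-equivalent/cm^2) from p-i-n forward
    voltage by interpolating log10(fluence) on a monotone calibration
    curve of (forward voltage, log10 fluence) pairs."""
    return 10.0 ** np.interp(vf, cal_vf, cal_log10_fluence)
```

Interpolating in log-fluence reflects the fact that the forward voltage grows smoothly while the fluence of interest spans several decades.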
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saunders, Sara L; Andreozzi, Jacqueline M; Pogue, Brian W
Purpose: The irradiation of photodynamic agents with radiotherapy beams has been demonstrated to enhance tumor killing in various studies, and one proposed mechanism is the optical fluence of Cherenkov emission activating the photosensitizer. This mechanism is explored in Monte Carlo simulations of fluence as well as laboratory measurements of fluence and radical oxygen species. Methods: Simulations were completed using GAMOS/GEANT4 with a 6 MV photon beam in tissue. The effects of blood vessel diameter, blood oxygen saturation, and beam size were examined, recording spectral fluence. Experiments were carried out in solutions of photosensitizer and phantoms. Results: Cherenkov produced by a 100×100 µm^2 6 MV beam resulted in fluence of less than 1 nJ/cm^2/Gy per 1 nm wavelength. At this microscopic level, differences in the absorption of blood and water in the tissue affected the fluence spectrum, but variation in blood oxygenation had little effect. Light in tissue resulting from larger (10 mm × 10 mm) 6 MV beams had greater fluence due to light transport and elastic scattering of optical photons, but this transport process also resulted in higher absorption shifts. Therefore, the spectrum produced by a microscopic beam was weighted more heavily in UV/blue wavelengths than the spectrum at the macroscopic level. At the macroscopic level, the total fluence available for absorption by Verteporfin (BPD) in tissue approached µJ/cm^2 for a high radiation dose, indicating that photodynamic activation seems unlikely. Tissue phantom confirmation of these light levels supported this observation, and photosensitization measurements with a radical oxygen species reporter are ongoing.
Conclusion: Simulations demonstrated that fluence produced by Cherenkov in tissue by 6 MV photon beams at typical radiotherapy doses appears insufficient to activate photosensitizers to the level required for threshold effects, yet this disagrees with published biological experiments. Experimental validation in tissue phantoms and cell studies is ongoing to clarify this discrepancy. Funding from NIH grant R01CA109558.
Effects of very low fluences of high-energy protons or iron ions on irradiated and bystander cells.
Yang, H; Magpayo, N; Rusek, A; Chiang, I-H; Sivertz, M; Held, K D
2011-12-01
In space, astronauts are exposed to radiation fields consisting of energetic protons and high atomic number, high-energy (HZE) particles at very low dose rates or fluences. Under these conditions, it is likely that, in addition to cells in an astronaut's body being traversed by ionizing radiation particles, unirradiated cells can also receive intercellular bystander signals from irradiated cells. Thus this study was designed to determine the dependence of DNA damage induction on dose at very low fluences of charged particles. Novel techniques to quantify particle fluence have been developed at the NASA Space Radiation Biology Laboratory (NSRL) at Brookhaven National Laboratory (BNL). The approach uses a large ionization chamber to visualize the radiation beam coupled with a scintillation counter to measure fluence. This development has allowed us to irradiate cells with 1 GeV/nucleon protons and iron ions at particle fluences as low as 200 particles/cm^2 and quantify biological responses. Our results show an increased fraction of cells with DNA damage in both the irradiated population and bystander cells sharing medium with irradiated cells after low fluences. The fraction of cells with damage, manifest as micronucleus formation and 53BP1 focus induction, is about 2-fold higher than background at doses as low as ∼0.47 mGy iron ions (∼0.02 iron ions/cell) or ∼70 μGy protons (∼2 protons/cell). In the irradiated population, irrespective of radiation type, the fraction of damaged cells is constant from the lowest damaging fluence to about 1 cGy, above which the fraction of damaged cells increases with dose. In the bystander population, the level of damage is the same as in the irradiated population up to 1 cGy, but it does not increase above that plateau level with increasing dose. The data suggest that at fluences of high-energy protons or iron ions less than about 5 cGy, the response in irradiated cell populations may be dominated by the bystander response.
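The per-cell hit statistics quoted above follow directly from fluence times target area, with the hit fraction Poisson-distributed. The cell area in the test below is an arbitrary illustration, not the geometry used at NSRL:

```python
import math

def traversal_stats(fluence_per_cm2, area_cm2):
    """Mean particle traversals per cell and the Poisson probability that a
    cell is hit at least once, for a given fluence and sensitive area."""
    mean_hits = fluence_per_cm2 * area_cm2
    frac_hit = 1.0 - math.exp(-mean_hits)
    return mean_hits, frac_hit
```

At the fluences studied here the mean is far below one traversal per cell, which is exactly the regime where bystander signaling between hit and unhit cells becomes observable.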
Analysis of USAREUR Family Housing.
1985-04-01
Glossary excerpts: Standard Installation/Division Personnel System; SJA, Staff Judge Advocate; SPSS, Statistical Package for the Social Sciences. Attempts to define USAREUR's programmable family housing deficit based on the FHS have caused anguish. Responses to the ESC housing questionnaire (Annex E) were analyzed using the Statistical Package for the Social Sciences (SPSS) computer program.
A Microcomputer-Based Software Package for Eye-Monitoring Research. Technical Report No. 434.
ERIC Educational Resources Information Center
McConkie, George W.; And Others
A software package is described that collects and reduces eye behavior data (eye position and pupil size) using an IBM-PC compatible computer. Written in C language for speed and portability, it includes several features: (1) data can be simultaneously collected from other sources (such as electroencephalography and electromyography); (2)…
Parachute Dynamics Investigations Using a Sensor Package Airdropped from a Small-Scale Airplane
NASA Technical Reports Server (NTRS)
Dooley, Jessica; Lorenz, Ralph D.
2005-01-01
We explore the utility of various sensors by recovering parachute-probe dynamics information from a package released from a small-scale, remote-controlled airplane. The airdrops aid in the development of datasets for the exploration of planetary probe trajectory recovery algorithms, supplementing data collected from instrumented, full-scale tests and computer models.
A Computer Evolution in Teaching Undergraduate Time Series
ERIC Educational Resources Information Center
Hodgess, Erin M.
2004-01-01
In teaching undergraduate time series courses, we have used a mixture of various statistical packages. We have finally been able to teach all of the applied concepts within one statistical package; R. This article describes the process that we use to conduct a thorough analysis of a time series. An example with a data set is provided. We compare…
21 CFR 111.35 - Under this subpart D, what records must you make and keep?
Code of Federal Regulations, 2013 CFR
2013-04-01
..., and any other contact surfaces that are used to manufacture, package, label, or hold components or... current software is not able to retrieve such records) and of data entered into computer systems that you use to manufacture, package, label, or hold dietary supplements. (i) Your backup file (e.g., a hard...
21 CFR 111.35 - Under this subpart D, what records must you make and keep?
Code of Federal Regulations, 2010 CFR
2010-04-01
..., and any other contact surfaces that are used to manufacture, package, label, or hold components or... current software is not able to retrieve such records) and of data entered into computer systems that you use to manufacture, package, label, or hold dietary supplements. (i) Your backup file (e.g., a hard...
21 CFR 111.35 - Under this subpart D, what records must you make and keep?
Code of Federal Regulations, 2011 CFR
2011-04-01
..., and any other contact surfaces that are used to manufacture, package, label, or hold components or... current software is not able to retrieve such records) and of data entered into computer systems that you use to manufacture, package, label, or hold dietary supplements. (i) Your backup file (e.g., a hard...
BEARCLAW: Boundary Embedded Adaptive Refinement Conservation LAW package
NASA Astrophysics Data System (ADS)
Mitran, Sorin
2011-04-01
The BEARCLAW package is a multidimensional, Eulerian AMR-capable computational code written in Fortran to solve hyperbolic systems for astrophysical applications. It is part of AstroBEAR, a hydrodynamic & magnetohydrodynamic code environment designed for a variety of astrophysical applications which allows simulations in 2, 2.5 (i.e., cylindrical), and 3 dimensions, in either Cartesian or curvilinear coordinates.
ARM Data-Oriented Metrics and Diagnostics Package for Climate Model Evaluation Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chengzhu; Xie, Shaocheng
A Python-based metrics and diagnostics package is currently being developed by the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Infrastructure Team at Lawrence Livermore National Laboratory (LLNL) to facilitate the use of long-term, high-frequency measurements from the ARM Facility in evaluating the regional climate simulation of clouds, radiation, and precipitation. This metrics and diagnostics package computes climatological means of targeted climate model simulations and generates tables and plots for comparing the model simulation with ARM observational data. The Coupled Model Intercomparison Project (CMIP) model data sets are also included in the package to enable model intercomparison, as demonstrated in Zhang et al. (2017). The mean of the CMIP models can serve as a reference for individual models. Basic performance metrics are computed to measure the accuracy of the mean state and variability of climate models. The evaluated physical quantities include cloud fraction, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, and radiative fluxes, with plans to extend to more fields, such as aerosol and microphysics properties. Process-oriented diagnostics focusing on individual cloud- and precipitation-related phenomena are also being developed for the evaluation and development of specific model physical parameterizations. The version 1.0 package is designed based on data collected at ARM's Southern Great Plains (SGP) Research Facility, with the plan to extend to other ARM sites. The metrics and diagnostics package is currently built upon standard Python libraries and additional Python packages developed by DOE (such as CDMS and CDAT). The ARM metrics and diagnostics package is available publicly with the hope that it can serve as an easy entry point for climate modelers to compare their models with ARM data.
In this report, we first present the input data, which constitute the core content of the metrics and diagnostics package, in section 2, followed by a user's guide documenting the workflow and structure of the version 1.0 code, including step-by-step instructions for running the package, in section 3.
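The core computation, climatological means plus simple skill metrics against observations, reduces to a few lines. This is a schematic of the idea only, not the package's actual API:

```python
from collections import defaultdict

def monthly_climatology(samples):
    """Climatological mean per calendar month from (month, value) pairs."""
    sums = defaultdict(lambda: [0.0, 0])
    for month, value in samples:
        sums[month][0] += value
        sums[month][1] += 1
    return {m: s / n for m, (s, n) in sums.items()}

def rmse(model, obs):
    """Root-mean-square error between paired model and observed means."""
    return (sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs)) ** 0.5
```

In the package these scalars would be computed per variable and per site (e.g., SGP) and tabulated alongside the CMIP multi-model reference.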
High resolution X-ray CT for advanced electronics packaging
NASA Astrophysics Data System (ADS)
Oppermann, M.; Zerna, T.
2017-02-01
Advanced electronics packaging is a challenge for non-destructive testing (NDT): modern electronic components and systems are dominated by more, smaller, and mostly hidden interconnects. To meet customer demands for products with high functionality at low volume, weight, and price (e.g., mobile phones, personal medical monitoring systems), designers often use System-in-Package (SiP) solutions. The non-destructive testing of such devices is a big challenge. This paper therefore imparts fundamentals and applications of the non-destructive evaluation of inner structures of electronics packaging for quality assurance and reliability investigations, with a focus on X-ray methods, especially high-resolution X-ray computed tomography (CT).
Probabilistic Models for Solar Particle Events
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Xapsos, Michael
2009-01-01
Probabilistic models of solar particle events (SPEs) are used in space mission design studies to describe the radiation environment that can be expected at a specified confidence level. The task of the designer is then to choose a design that will operate in the model radiation environment. Probabilistic models have already been developed for solar proton events that describe the peak flux, event-integrated fluence, and mission-integrated fluence. In addition, a probabilistic model has been developed that describes the mission-integrated fluence for the Z>2 elemental spectra. This talk will focus on completing this suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements.
Heavy Ion Irradiation Fluence Dependence for Single-Event Upsets in a NAND Flash Memory
NASA Technical Reports Server (NTRS)
Chen, Dakai; Wilcox, Edward; Ladbury, Raymond L.; Kim, Hak; Phan, Anthony; Seidleck, Christina; Label, Kenneth
2016-01-01
We investigated the single-event effect (SEE) susceptibility of the Micron 16 nm NAND flash and found that the single-event upset (SEU) cross section varied inversely with cumulative fluence. We attribute the effect to the variable upset sensitivities of the memory cells. Furthermore, the effect impacts only single-cell upsets in general; the rate of multiple-bit upsets remained relatively constant with fluence. The current test standards and procedures assume that SEUs follow a Poisson process and do not take into account the variability in the error rate with fluence. Therefore, traditional SEE testing techniques may underestimate the on-orbit event rate for a device with variable upset sensitivity.
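The reported fluence dependence can be reproduced with a toy population of cells whose upset cross sections vary. The sensitivity values in the test are invented; the point is the trend, not the magnitudes:

```python
import math

def observed_cross_section(sigmas, fluence):
    """Mean upset cross section a test would report after irradiating cells
    with per-cell sensitivities `sigmas` (cm^2) to `fluence` (cm^-2):
    expected upsets divided by fluence. With variable sensitivities the most
    sensitive cells saturate first, so the apparent cross section falls as
    fluence accumulates; with uniform sensitivity it stays constant."""
    expected_upsets = sum(1.0 - math.exp(-s * fluence) for s in sigmas)
    return expected_upsets / fluence
```

This is why a single high-fluence measurement, interpreted under the Poisson assumption, can understate the event rate contributed by the most sensitive cells early in a mission.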
Integrating Commercial Off-The-Shelf (COTS) graphics and extended memory packages with CLIPS
NASA Technical Reports Server (NTRS)
Callegari, Andres C.
1990-01-01
This paper addresses how to combine CLIPS with graphics and how to overcome the PC's memory limitations by using the computer's extended memory. By adding graphics and extended-memory capabilities, CLIPS can be converted into a complete and powerful system development tool on the most economical and popular computer platform. New models of PCs have processing capabilities and graphics resolutions that cannot be ignored and should be used to the fullest of their resources. CLIPS is a powerful expert system development tool, but it cannot be complete without the support of a graphics package, needed to create user interfaces and general-purpose graphics, or without enough memory to handle large knowledge bases. A well-known limitation of the PC is its use of real memory, which restricts CLIPS to only 640 KB, but that problem can be solved by developing a version of CLIPS that uses extended memory. The user then has access to up to 16 MB of memory on 80286-based computers and practically all available memory (4 GB) on computers that use the 80386 processor. So if we give CLIPS a self-configuring graphics package that automatically detects the graphics hardware and pointing device present in the computer, and we add the availability of the extended memory that already exists in the computer (no special hardware needed), users will be able to create more powerful systems at a fraction of the cost on the most popular, portable, and economical platform available: the PC.
Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F
1997-12-01
Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as X Windows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "virtual lab" of the post-genome era.
A Modular Three-Dimensional Finite-Difference Ground-Water Flow Model
McDonald, Michael G.; Harbaugh, Arlen W.; Guo, Weixing; Lu, Guoping
1988-01-01
This report presents a finite-difference model and its associated modular computer program. The model simulates flow in three dimensions. The report includes detailed explanations of physical and mathematical concepts on which the model is based and an explanation of how those concepts are incorporated in the modular structure of the computer program. The modular structure consists of a Main Program and a series of highly independent subroutines called 'modules.' The modules are grouped into 'packages.' Each package deals with a specific feature of the hydrologic system which is to be simulated, such as flow from rivers or flow into drains, or with a specific method of solving linear equations which describe the flow system, such as the Strongly Implicit Procedure or Slice-Successive Overrelaxation. The division of the program into modules permits the user to examine specific hydrologic features of the model independently. This also facilitates the development of additional capabilities because new packages can be added to the program without modifying the existing packages. The input and output systems of the computer program are also designed to permit maximum flexibility. Ground-water flow within the aquifer is simulated using a block-centered finite-difference approach. Layers can be simulated as confined, unconfined, or a combination of confined and unconfined. Flow associated with external stresses, such as wells, areal recharge, evapotranspiration, drains, and streams, can also be simulated. The finite-difference equations can be solved using either the Strongly Implicit Procedure or Slice-Successive Overrelaxation. The program is written in FORTRAN 77 and will run without modification on most computers that have a FORTRAN 77 compiler. For each program module, this report includes a narrative description, a flow chart, a list of variables, and a module listing.
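The block-centered finite-difference approach described above can be illustrated in one dimension. The sketch below is not the report's FORTRAN 77 code; it is a minimal Python illustration, with hypothetical transmissivity and recharge values, that solves T d2h/dx2 + W = 0 for heads at block centers between two fixed-head boundaries using the Thomas tridiagonal algorithm:

```python
# Hypothetical 1-D illustration of block-centered finite differences for
# steady confined ground-water flow: T * d2h/dx2 + W = 0.
def solve_heads(n, dx, T, W, h_left, h_right):
    """Solve for heads at n interior block centers (Thomas algorithm)."""
    a = [0.0] * n  # sub-diagonal coefficients
    b = [0.0] * n  # diagonal coefficients
    c = [0.0] * n  # super-diagonal coefficients
    d = [0.0] * n  # right-hand side
    for i in range(n):
        a[i] = c[i] = T / dx**2
        b[i] = -2.0 * T / dx**2
        d[i] = -W
    # fold the fixed-head boundary values into the right-hand side
    d[0] -= (T / dx**2) * h_left
    d[-1] -= (T / dx**2) * h_right
    # forward elimination
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    # back substitution
    h = [0.0] * n
    h[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        h[i] = (d[i] - c[i] * h[i + 1]) / b[i]
    return h
```

A standard sanity check for such solvers: with zero recharge (W = 0) the computed heads vary linearly between the two boundary heads.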
GillesPy: A Python Package for Stochastic Model Building and Simulation.
Abel, John H; Drawert, Brian; Hellander, Andreas; Petzold, Linda R
2016-09-01
GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithm (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy-to-understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community.
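For readers unfamiliar with the underlying method, the direct-method Gillespie SSA can be sketched in a few lines of plain Python. This is not the GillesPy API; the single decay reaction A -> 0 and its rate value are illustrative:

```python
import random

def ssa_decay(n0, k, t_end, seed=1):
    """Direct-method Gillespie SSA for the reaction A -> 0 with rate k.

    Returns the trajectory as a list of (time, molecule count) pairs.
    """
    random.seed(seed)
    t, n = 0.0, n0
    trajectory = [(t, n)]
    while n > 0:
        a0 = k * n                    # total propensity of all reactions
        t += random.expovariate(a0)   # exponentially distributed waiting time
        if t > t_end:
            break
        n -= 1                        # fire the (only) reaction
        trajectory.append((t, n))
    return trajectory
```

Each step draws an exponential waiting time from the total propensity and then fires one reaction, which is exactly the structure a model built in an SSA package expands to for larger reaction networks.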
GillesPy: A Python Package for Stochastic Model Building and Simulation
Abel, John H.; Drawert, Brian; Hellander, Andreas; Petzold, Linda R.
2017-01-01
GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithm (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy-to-understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community. PMID:28630888
"Do-it-yourself" software program calculates boiler efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1984-03-01
An easy-to-use software package is described which runs on the IBM Personal Computer. The package calculates boiler efficiency, an important indicator of operating costs and equipment condition. The program stores inputs and calculated results for 20 sets of boiler operating data, called cases. Cases can be displayed and modified on the CRT screen through multiple display pages or copied to a printer. All intermediate calculations are performed by the package, including steam enthalpy; water enthalpy; air humidity; gas, oil, coal, and wood heat capacity; and radiation losses.
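The abstract does not give the package's formulas, but boiler efficiency by the common heat-loss method is 100% minus the sum of the individual losses. A hypothetical sketch (the loss names and percentage values below are illustrative, not taken from the article):

```python
def boiler_efficiency(losses_percent):
    """Heat-loss method: efficiency (%) = 100 - sum of individual losses (%).

    `losses_percent` maps loss names to their percentage of fuel heat input.
    """
    return 100.0 - sum(losses_percent.values())

# Illustrative loss breakdown for a hypothetical gas-fired boiler.
losses = {
    "dry_flue_gas": 6.2,          # sensible heat leaving the stack
    "moisture_in_fuel": 4.1,      # latent heat of water vapour
    "radiation": 1.0,             # shell radiation/convection loss
    "unburned_combustibles": 0.5,
}
eff = boiler_efficiency(losses)   # about 88.2 %
```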
Basic mathematical function libraries for scientific computation
NASA Technical Reports Server (NTRS)
Galant, David C.
1989-01-01
Ada packages implementing selected mathematical functions for the support of scientific and engineering applications were written. The packages provide the Ada programmer with the mathematical function support found in the languages Pascal and FORTRAN as well as an extended precision arithmetic and a complete complex arithmetic. The algorithms used are fully described and analyzed. Implementation assumes that the Ada type FLOAT objects fully conform to the IEEE 754-1985 standard for single binary floating-point arithmetic, and that INTEGER objects are 32-bit entities. Codes for the Ada packages are included as appendixes.
Parallel Climate Data Assimilation PSAS Package
NASA Technical Reports Server (NTRS)
Ding, Hong Q.; Chan, Clara; Gennery, Donald B.; Ferraro, Robert D.
1996-01-01
We have designed and implemented a set of highly efficient and highly scalable algorithms for an unstructured computational package, the PSAS data assimilation package, as demonstrated by detailed performance analysis of systematic runs on up to a 512-node Intel Paragon. The equation solver achieves a sustained 18 Gflops performance. As a result, we achieved an unprecedented 100-fold solution-time reduction on the Intel Paragon parallel platform over the Cray C90. This not only meets and exceeds the DAO time requirements, but also significantly enlarges the window of exploration in climate data assimilation.
PCE: web tools to compute protein continuum electrostatics
Miteva, Maria A.; Tufféry, Pierre; Villoutreix, Bruno O.
2005-01-01
PCE (protein continuum electrostatics) is an online service for protein electrostatic computations, presently based on the MEAD (macroscopic electrostatics with atomic detail) package initially developed by D. Bashford [(2004) Front Biosci., 9, 1082–1099]. This method uses a macroscopic electrostatic model for the calculation of protein electrostatic properties, such as pKa values of titratable groups and electrostatic potentials. The MEAD package generates electrostatic energies via finite difference solution to the Poisson–Boltzmann equation. Users submit a PDB file and PCE returns potentials and pKa values as well as color (static or animated) figures displaying electrostatic potentials mapped on the molecular surface. This service is intended to facilitate electrostatic analyses of proteins and thereby broaden access to continuum electrostatics for the biological community. PCE can be accessed at . PMID:15980492
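The pKa values such services report relate a titratable group's protonation state to pH through the Henderson-Hasselbalch relation. A small sketch of that relation (background chemistry, not part of the PCE/MEAD code):

```python
def protonated_fraction(pH, pKa):
    """Henderson-Hasselbalch relation: fraction of a titratable group
    that is protonated at a given pH.

    f = 1 / (1 + 10**(pH - pKa)); f = 0.5 exactly when pH == pKa.
    """
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))
```

For example, a group with a computed pKa of 4 is half-protonated at pH 4 and almost fully deprotonated at pH 7, which is why electrostatically shifted pKa values change predicted charge states under physiological conditions.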
Automatic computation of the travelling wave solutions to nonlinear PDEs
NASA Astrophysics Data System (ADS)
Liang, Songxin; Jeffrey, David J.
2008-05-01
Various extensions of the tanh-function method and their implementations for finding explicit travelling wave solutions to nonlinear partial differential equations (PDEs) have been reported in the literature. However, some solutions are often missed by these packages. In this paper, a new algorithm and its implementation, called TWS, for solving single nonlinear PDEs are presented. TWS is implemented in Maple 10. It turns out that, for PDEs whose balancing numbers are not positive integers, TWS works much better than existing packages. Furthermore, TWS obtains more solutions than existing packages in most cases.
Program summary
Program title: TWS
Catalogue identifier: AEAM_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAM_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 1250
No. of bytes in distributed program, including test data, etc.: 78 101
Distribution format: tar.gz
Programming language: Maple 10
Computer: A laptop with a 1.6 GHz Pentium CPU
Operating system: Windows XP Professional
RAM: 760 Mbytes
Classification: 5
Nature of problem: Finding the travelling wave solutions to single nonlinear PDEs.
Solution method: Based on the tanh-function method.
Restrictions: The current version of this package can only deal with single autonomous PDEs or ODEs, not systems of PDEs or ODEs. However, the PDEs can have any finite number of independent space variables in addition to time t.
Unusual features: For PDEs whose balancing numbers are not positive integers, TWS works much better than existing packages. Furthermore, TWS obtains more solutions than existing packages in most cases.
Additional comments: It is easy to use.
Running time: Less than 20 seconds for most cases, between 20 and 100 seconds for some cases, over 100 seconds for a few cases.
References: [1] E.S. Cheb-Terrab, K. von Bulow, Comput. Phys. Comm. 90 (1995) 102. [2] S.A. Elwakil, S.K. El-Labany, M.A. Zahran, R. Sabry, Phys. Lett. A 299 (2002) 179. [3] E. Fan, Phys. Lett. A 277 (2000) 212. [4] W. Malfliet, Amer. J. Phys. 60 (1992) 650. [5] W. Malfliet, W. Hereman, Phys. Scripta 54 (1996) 563. [6] E.J. Parkes, B.R. Duffy, Comput. Phys. Comm. 98 (1996) 288.
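As background on the tanh-function method that TWS automates, a textbook application to Burgers' equation (an illustrative example, not taken from the paper) runs as follows:

```latex
For $u_t + u u_x = \nu u_{xx}$, seek a travelling wave $u(\xi)$ with
$\xi = k(x - ct)$ and substitute the finite tanh series
$u = a_0 + a_1 Y$, $Y = \tanh\xi$ (the balancing number here is $1$).
One integration of $-c\,u' + u\,u' = \nu k\,u''$ gives
\[
  -c\,u + \tfrac{1}{2}u^2 = \nu k\,u' + A .
\]
Using $u' = a_1(1 - Y^2)$ and matching powers of $Y$:
\[
  Y^2:\;\; \tfrac{1}{2}a_1^2 = -\nu k a_1 \;\Rightarrow\; a_1 = -2\nu k,
  \qquad
  Y^1:\;\; a_1(a_0 - c) = 0 \;\Rightarrow\; a_0 = c ,
\]
so $u(x,t) = c - 2\nu k \tanh\!\big(k(x - ct)\big)$,
a one-parameter family of travelling kink solutions.
```

Packages of this kind mechanize exactly these steps: determine the balancing number, substitute the truncated tanh series, and solve the resulting algebraic system for the coefficients.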
Radiation damage induced in Al2O3 single crystal by 90 MeV Xe ions
NASA Astrophysics Data System (ADS)
Zirour, H.; Izerrouken, M.; Sari, A.
2015-12-01
Radiation damage induced in Al2O3 single crystal by 90 MeV Xe ions was investigated by optical absorption measurements, Raman spectroscopy and X-ray diffraction (XRD) techniques. The irradiations were performed at the GANIL accelerator in Caen, France, for fluences in the range from 10^12 to 6 × 10^13 cm^-2 at room temperature under normal incidence. The kinetics of the F^+ and F2^2+ centers as a function of fluence, deduced from the optical measurements, indicate that the single defects (F and F^+) aggregate into F-center clusters (F2, F2^+, F2^2+) during irradiation at high fluence (>10^13 cm^-2). Raman and XRD analyses reveal a partial disorder of 40% of the Al2O3 in the studied fluence range, in accordance with the Kabir et al. (2008) study. The result suggests that this is due to the stress-relaxation process which occurs at high fluence (>10^13 cm^-2).
308-nm excimer laser ablation of human cartilage
NASA Astrophysics Data System (ADS)
Prodoehl, John A.; Rhodes, Anthony L.; Meller, Menachem M.; Sherk, Henry H.
1993-07-01
The XeCl excimer laser was investigated as an ablating tool for human fibrocartilage and hyaline cartilage. Quantitative measurements were made of tissue ablation rates as a function of fluence in meniscal fibrocartilage and articular hyaline cartilage. A force of 1.47 Newtons was applied to an 800 micrometer fiber with the laser delivering a range of fluences (40 to 190 mJ/mm^2) firing at a frequency of 5 Hz. To assess the effect of repetition rate on ablation rate, a set of measurements was made at a constant fluence of 60 mJ/mm^2, with the repetition rate varying from 10 to 40 Hz. Histologic and morphometric analysis was performed using light microscopy. The results of these studies revealed that the ablation rate was directly proportional to fluence over the range tested. Fibrocartilage was ablated at a rate 2.56 times faster than hyaline cartilage at the maximum fluence tested. Repetition rate had no effect on the penetration per pulse. Adjacent tissue damage was noted to be minimal (10-70 micrometers).
Atomic Oxygen Erosion Yield Dependence Upon Texture Development in Polymers
NASA Technical Reports Server (NTRS)
Banks, Bruce A.; Loftus, Ryan J.; Miller, Sharon K.
2016-01-01
The atomic oxygen erosion yield (volume of a polymer that is lost due to oxidation per incident atom) of polymers is typically assumed to be reasonably constant with increasing fluence. However, polymers containing ash or inorganic pigments tend to have erosion yields that decrease with fluence due to an increasing presence of protective particles on the polymer surface. This paper investigates two additional possible causes of fluence-dependent erosion yields: the development of surface texture, which can cause the erosion yield to change with fluence as the aspect ratio of the texture evolves, and polymer-specific atomic oxygen interaction parameters. Surface texture development under directed hyperthermal attack produces higher-aspect-ratio surface texture than isotropic thermal-energy atomic oxygen attack. The fluence dependence of erosion yields is documented at low Kapton H (DuPont, Wilmington, DE) effective fluences for a variety of polymers under directed hyperthermal and isotropic thermal-energy attack.
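The erosion yield defined above is routinely obtained from mass loss. A short sketch of the standard relation (the relation itself is standard practice; the numerical values below are illustrative, chosen near Kapton's commonly quoted density of 1.42 g/cm^3 and yield of about 3 x 10^-24 cm^3/atom):

```python
def erosion_yield(mass_loss_g, density_g_cm3, area_cm2, fluence_atoms_cm2):
    """Atomic oxygen erosion yield (cm^3/atom) from the mass-loss relation
    E_y = dm / (rho * A * F), where dm is mass loss, rho is density,
    A is exposed area, and F is the atomic oxygen fluence."""
    return mass_loss_g / (density_g_cm3 * area_cm2 * fluence_atoms_cm2)

# Illustrative values in the neighborhood of a Kapton witness sample.
e_y = erosion_yield(mass_loss_g=3.0e-4, density_g_cm3=1.42,
                    area_cm2=1.0, fluence_atoms_cm2=7.042e19)
```

A fluence-dependent yield shows up in this framework as E_y computed over successive fluence increments drifting rather than staying constant.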
Lattice disorder produced in GaN by He-ion implantation
NASA Astrophysics Data System (ADS)
Han, Yi; Peng, Jinxin; Li, Bingsheng; Wang, Zhiguang; Wei, Kongfang; Shen, Tielong; Sun, Jianrong; Zhang, Limin; Yao, Cunfeng; Gao, Ning; Gao, Xing; Pang, Lilong; Zhu, Yabin; Chang, Hailong; Cui, Minghuan; Luo, Peng; Sheng, Yanbin; Zhang, Hongpeng; Zhang, Li; Fang, Xuesong; Zhao, Sixiang; Jin, Jin; Huang, Yuxuan; Liu, Chao; Tai, Pengfei; Wang, Dong; He, Wenhao
2017-09-01
The lattice disorder induced by He-ion implantation in GaN epitaxial films to fluences of 2 × 10^16, 5 × 10^16 and 1 × 10^17 cm^-2 at room temperature (RT) has been investigated by a combination of Raman spectroscopy, high-resolution X-ray diffraction (HRXRD), nano-indentation, and transmission electron microscopy (TEM). The experimental results show that Raman intensity decreases with increasing fluence and that a Raman frequency "red shift" occurs after He-ion implantation. Strain increases with increasing fluence, and the hardness of the highly damaged layer increases monotonically with increasing fluence. Microstructural results demonstrate that the width of the damage band and the number density of observed dislocation loops increase with increasing fluence. High-resolution TEM images show that He-ion implantation leads to the formation of planar defects and that most of the lattice defects are interstitial-type basal loops. The relationships of Raman intensity, lattice strain, swelling and hardness to He-implantation-induced lattice disorder are discussed.
Design of dual multiple aperture devices for dynamical fluence field modulated CT.
Mathews, Aswin John; Tilley, Steven; Gang, Grace; Kawamoto, Satomi; Zbijewski, Wojciech; Siewerdsen, Jeffrey H; Levinson, Reuven; Webster Stayman, J
2016-07-01
A Multiple Aperture Device (MAD) is a novel x-ray beam modulator that uses binary filtration on a fine scale to spatially modulate an x-ray beam. Using two MADs in series enables a large variety of fluence profiles by shifting the MADs relative to each other. This work details the design and control of dual MADs for a specific class of desired fluence patterns. Specifically, models of MAD operation are integrated into a best-fit objective followed by CMA-ES optimization. To illustrate this framework, we demonstrate the design process for an abdominal phantom with the goal of uniform detected signal. Achievable fluence profiles show good agreement with target fluence profiles, and the ability to flatten projections when a phantom is scanned is demonstrated. Simulated data reconstructions using traditional tube current modulation (TCM) and MAD filtering with TCM are investigated, with the dual MAD system demonstrating more uniformity in noise and illustrating the potential for dose reduction under a maximum noise-level constraint.
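To make the dual-MAD idea concrete, here is a toy one-dimensional model. The binary thickness patterns, attenuation coefficient, and wrap-around shifting are assumptions for illustration, and an exhaustive search over shifts stands in for the CMA-ES optimizer used in the paper:

```python
import math

def mad_fluence(pattern1, pattern2, shift, mu=1.0, i0=1.0):
    """Fluence profile transmitted through two binary (MAD) filters,
    the second shifted by `shift` pixels (with wrap-around).
    Beer-Lambert attenuation through the summed filter thicknesses."""
    n = len(pattern1)
    return [i0 * math.exp(-mu * (pattern1[i] + pattern2[(i + shift) % n]))
            for i in range(n)]

def best_shift(pattern1, pattern2, target):
    """Exhaustive search for the relative shift minimizing the squared
    error against a target profile (a stand-in for the CMA-ES step)."""
    n = len(pattern1)
    def err(s):
        f = mad_fluence(pattern1, pattern2, s)
        return sum((a - b) ** 2 for a, b in zip(f, target))
    return min(range(n), key=err)
```

With complementary patterns such as [0, 1, 0, 1] and [1, 0, 1, 0], a zero relative shift yields a uniform total thickness and hence a flat fluence profile, which mirrors the "flatten the projection" goal in the abstract.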
Synthesis of sponge-like hydrophobic NiBi3 surface by 200 keV Ar ion implantation
NASA Astrophysics Data System (ADS)
Siva, Vantari; Datta, D. P.; Chatterjee, S.; Varma, S.; Kanjilal, D.; Sahoo, Pratap K.
2017-07-01
Sponge-like nanostructures develop under Ar-ion implantation of a Ni-Bi bilayer with increasing ion fluence at room temperature. The surface morphology passes through different stages of evolution as a function of ion fluence, finally resulting in a planar surface at the highest fluence. Our investigations of the chemical composition reveal a spontaneous formation of the NiBi3 phase on the surface of the as-deposited bilayer film. Interestingly, we observe a competition between crystallization and amorphization of the existing poly-crystalline phases as a function of the implanted fluence. Measurements of contact angle by the sessile drop method clearly show the ion-fluence-dependent hydrophobic nature of the nano-structured surfaces. The wettability has been correlated with the variation in roughness and composition of the implanted surface. In fact, our experimental results confirm the dominant effect of ion sputtering as well as ion-induced mixing at the bilayer interface in the evolution of the sponge-like surface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Böcklin, Christoph, E-mail: boecklic@ethz.ch; Baumann, Dirk; Fröhlich, Jürg
A novel way to attain three-dimensional fluence rate maps from Monte-Carlo simulations of photon propagation is presented in this work. The propagation of light in a turbid medium is described by the radiative transfer equation and formulated in terms of radiance. For many applications, particularly in biomedical optics, the fluence rate is a more useful quantity and is directly derived from the radiance by integrating over all directions. Contrary to the usual approach, which calculates the fluence rate from absorbed photon power, the fluence rate in this work is directly calculated from the photon packet trajectory. The voxel-based algorithm works in arbitrary geometries and material distributions. It is shown that the new algorithm is more efficient and also works in materials with a low or even zero absorption coefficient. The capabilities of the new algorithm are demonstrated on a curved layered structure, where a non-scattering, non-absorbing layer is sandwiched between two highly scattering layers.
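The trajectory-based fluence estimate described here is essentially a track-length estimator: each photon-packet step contributes its weight times the path length inside a voxel, divided by the voxel volume, with no dependence on local absorption. A minimal 1-D sketch (the segment data are hypothetical, and a real implementation would traverse only the voxels a step actually crosses rather than loop over all of them):

```python
def track_length_fluence(segments, n_voxels, voxel_size):
    """Track-length estimator for fluence in a 1-D voxel grid.

    `segments` is a list of photon-packet steps (x_start, x_end, weight).
    Each step adds weight * path-length-in-voxel / voxel-volume to the
    voxels it crosses, which works even for zero absorption.
    """
    volume = voxel_size  # per unit cross-sectional area in this 1-D sketch
    fluence = [0.0] * n_voxels
    for x0, x1, w in segments:
        lo, hi = sorted((x0, x1))
        for v in range(n_voxels):
            a, b = v * voxel_size, (v + 1) * voxel_size
            overlap = max(0.0, min(hi, b) - max(lo, a))
            fluence[v] += w * overlap / volume
    return fluence
```

Because the score depends only on the trajectory, a packet crossing a non-absorbing voxel still deposits fluence there, which is exactly the property that absorbed-power estimators lack.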
New method for estimation of fluence complexity in IMRT fields and correlation with gamma analysis
NASA Astrophysics Data System (ADS)
Hanušová, T.; Vondráček, V.; Badraoui-Čuprová, K.; Horáková, I.; Koniarová, I.
2015-01-01
A new method for estimation of fluence complexity in Intensity Modulated Radiation Therapy (IMRT) fields is proposed. Unlike other previously published works, it is based on portal images calculated by the Portal Dose Calculation algorithm in Eclipse (version 8.6, Varian Medical Systems) in the plane of the EPID aS500 detector (Varian Medical Systems). Fluence complexity is given by the number and the amplitudes of dose gradients in these matrices. Our method is validated using a set of clinical plans in which the fluence has been smoothed manually so that each plan has a different level of complexity. Fluence complexity calculated with our tool is in accordance with the different levels of smoothing as well as with the results of gamma analysis when calculated and measured dose matrices are compared. Thus, it is possible to estimate plan complexity before carrying out the measurement. If appropriate thresholds are determined that distinguish between acceptably and overly modulated plans, this might save time in the re-planning and re-measuring process.
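A complexity score built from the number and amplitude of dose gradients can be sketched as follows. This is a simplified stand-in for the published metric, summing the absolute finite differences of a dose matrix; smoother fluences give lower scores:

```python
def fluence_complexity(dose):
    """Complexity score of a 2-D dose/fluence matrix: the summed amplitude
    of horizontal and vertical dose gradients (absolute finite differences).
    A perfectly flat matrix scores 0; heavy modulation scores high."""
    rows, cols = len(dose), len(dose[0])
    total = 0.0
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:                          # horizontal gradient
                total += abs(dose[r][c + 1] - dose[r][c])
            if r + 1 < rows:                          # vertical gradient
                total += abs(dose[r + 1][c] - dose[r][c])
    return total
```

Ranking plans by such a score before measurement is the use case the abstract describes: manually smoothed variants of the same plan should score progressively lower.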
Geant4 simulation of the CERN-EU high-energy reference field (CERF) facility.
Prokopovich, D A; Reinhard, M I; Cornelius, I M; Rosenfeld, A B
2010-09-01
The CERN-EU high-energy reference field facility is used for testing and calibrating both active and passive radiation dosemeters for radiation protection applications in space and aviation. Through a combination of a primary particle beam, a target and a suitably designed shielding configuration, the facility is able to reproduce the neutron component of the high-altitude radiation field relevant to the jet aviation industry. Simulations of the facility using the GEANT4 (GEometry ANd Tracking) toolkit provide an improved understanding of the neutron particle fluence as well as the particle fluence of the other radiation components present. The secondary particle fluence as a function of the primary particle fluence incident on the target, and the associated dose equivalent rates, were determined at the 20 designated irradiation positions available at the facility. Comparisons of the simulated results are made with previously published simulations obtained using the FLUKA Monte Carlo code, as well as with experimental results of the neutron fluence obtained with a Bonner sphere spectrometer.
ERIC Educational Resources Information Center
Journal of Chemical Education, 1988
1988-01-01
Reviews three computer software packages for chemistry education including "Osmosis and Diffusion" and "E.M.E. Titration Lab" for Apple II and "Simplex-V: An Interactive Computer Program for Experimental Optimization" for IBM PC. Summary ratings include ease of use, content, pedagogic value, student reaction, and…
Technology in the College Classroom.
ERIC Educational Resources Information Center
Earl, Archie W., Sr.
An analysis was made of the use of computing tools at the graduate and undergraduate levels in colleges and universities in the United States. Topics ranged from hand-held calculators to the use of main-frame computers and the assessment of the SPSSX, SPSS, LINDO, and MINITAB computer software packages. Hand-held calculators are being increasingly…
NASA Technical Reports Server (NTRS)
Svalbonas, V.; Ogilvie, P.
1975-01-01
A special data debugging package called SAT-1P created for the STARS-2P computer program is described. The program was written exclusively in FORTRAN IV for the IBM 370-165 computer, and then converted to the UNIVAC 1108.
A Systematic Approach for Understanding Slater-Gaussian Functions in Computational Chemistry
ERIC Educational Resources Information Center
Stewart, Brianna; Hylton, Derrick J.; Ravi, Natarajan
2013-01-01
A systematic way to understand the intricacies of quantum mechanical computations done by a software package known as "Gaussian" is undertaken via an undergraduate research project. These computations involve the evaluation of key parameters in a fitting procedure to express a Slater-type orbital (STO) function in terms of the linear…
Psychology on Computers: Simulations, Experiments and Projects.
ERIC Educational Resources Information Center
Belcher, Duane M.; Smith, Stephen D.
PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…
An Application of a Computer Instructional Management Package.
ERIC Educational Resources Information Center
Sullivan, David W.
Following the presentation of a conceptual framework for computer-based education (CBE), this paper examines the use of one aspect of CBE, computer-managed instruction (CMI), in a Major Appliance Serving Program. The paper begins with definitions and a graphic illustration of CBE and its components and uses, i.e., CMI, tutorial or…