Sample records for the TRANSP analysis code

  1. Status and Plans for the TRANSP Interpretive and Predictive Simulation Code

    NASA Astrophysics Data System (ADS)

Kaye, Stanley; Andre, Robert; Gorelenkova, Marina; Yuan, Xingqiu; Hawryluk, Richard; Jardin, Steven; Poli, Francesca

    2015-11-01

TRANSP is an integrated interpretive and predictive transport analysis tool that incorporates state-of-the-art heating/current drive sources and transport models. The treatments and transport solvers are becoming increasingly sophisticated and comprehensive. For instance, the ISOLVER component provides a free-boundary equilibrium solution, while the PT_SOLVER transport solver is especially suited for stiff transport models such as TGLF. TRANSP also incorporates source models such as NUBEAM for neutral beam injection, and GENRAY, TORAY, TORBEAM, TORIC and CQL3D for ICRH, LHCD, ECH and HHFW. The implementation of selected components makes efficient use of MPI to speed up code calculations. TRANSP has a wide international user base, and it is run on the FusionGrid to allow for timely support and quick turnaround by the PPPL Computational Plasma Physics Group. It is being used as a basis for both analysis and development of control algorithms and discharge operational scenarios, including simulation of ITER plasmas. This poster will describe present uses of the code worldwide, as well as plans for upgrading the physics modules and code framework. Progress on implementing TRANSP as a component in the ITER IMAS will also be described. This research was supported by the U.S. Department of Energy under contract DE-AC02-09CH11466.

  2. Implementation of a 3D halo neutral model in the TRANSP code and application to projected NSTX-U plasmas

    NASA Astrophysics Data System (ADS)

    Medley, S. S.; Liu, D.; Gorelenkova, M. V.; Heidbrink, W. W.; Stagner, L.

    2016-02-01

A 3D halo neutral code developed at the Princeton Plasma Physics Laboratory and implemented for analysis using the TRANSP code is applied to projected National Spherical Torus eXperiment-Upgrade (NSTX-U) plasmas. The legacy TRANSP code did not handle halo neutrals properly, since it distributed them over the plasma volume rather than keeping them in the vicinity of the neutral beam footprint, as is actually the case. The 3D halo neutral code uses a ‘beam-in-a-box’ model that encompasses both injected beam neutrals and resulting halo neutrals. Upon deposition by charge exchange, a subset of the full, one-half and one-third beam energy components produce first generation halo neutrals that are tracked through successive generations until an ionization event occurs or the descendant halos exit the box. The 3D halo neutral model and neutral particle analyzer (NPA) simulator in the TRANSP code have been benchmarked with the Fast-Ion D-Alpha simulation (FIDAsim) code, which provides Monte Carlo simulations of beam neutral injection, attenuation, halo generation, halo spatial diffusion, and photoemission processes. When using the same atomic physics database, TRANSP and FIDAsim simulations achieve excellent agreement on the spatial profile and magnitude of beam and halo neutral densities and the NPA energy spectrum. The simulations show that the halo neutral density can be comparable to the beam neutral density. These halo neutrals can double the NPA flux, but they have minor effects on the NPA energy spectrum shape. The TRANSP and FIDAsim simulations also suggest that the magnitudes of beam and halo neutral densities are relatively sensitive to the choice of the atomic physics databases.
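The generation-by-generation tracking described above can be sketched as a toy Monte Carlo. This is a hypothetical illustration only: the survival probability `p_cx` and all counts are invented, and the real TRANSP model tracks 3D positions and atomic physics rather than a single probability.

```python
import random

def track_halo_generations(n_source=10000, p_cx=0.5, max_gen=10, seed=1):
    """Toy Monte Carlo of successive halo-neutral generations.

    Each neutral in a generation either charge-exchanges again with
    probability p_cx, spawning a member of the next generation, or its
    history ends (ionization or exit from the box). All probabilities
    here are illustrative stand-ins, not physics.
    """
    rng = random.Random(seed)
    counts = []
    current = n_source
    for _ in range(max_gen):
        if current == 0:
            break
        counts.append(current)
        # survive into the next generation only via charge exchange
        current = sum(1 for _ in range(current) if rng.random() < p_cx)
    return counts

counts = track_halo_generations()
# generation sizes decay roughly geometrically with ratio p_cx
```

The point of the sketch is the termination structure: every halo history ends in ionization or escape, so the total halo population is a finite geometric-like sum over generations.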

  3. Implementation of a 3D halo neutral model in the TRANSP code and application to projected NSTX-U plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Medley, S. S.; Liu, D.; Gorelenkova, M. V.

    2016-01-12

A 3D halo neutral code developed at the Princeton Plasma Physics Laboratory and implemented for analysis using the TRANSP code is applied to projected National Spherical Torus eXperiment-Upgrade (NSTX-U) plasmas. The legacy TRANSP code did not handle halo neutrals properly, since it distributed them over the plasma volume rather than keeping them in the vicinity of the neutral beam footprint, as is actually the case. The 3D halo neutral code uses a 'beam-in-a-box' model that encompasses both injected beam neutrals and resulting halo neutrals. Upon deposition by charge exchange, a subset of the full, one-half and one-third beam energy components produce first generation halo neutrals that are tracked through successive generations until an ionization event occurs or the descendant halos exit the box. The 3D halo neutral model and neutral particle analyzer (NPA) simulator in the TRANSP code have been benchmarked with the Fast-Ion D-Alpha simulation (FIDAsim) code, which provides Monte Carlo simulations of beam neutral injection, attenuation, halo generation, halo spatial diffusion, and photoemission processes. When using the same atomic physics database, TRANSP and FIDAsim simulations achieve excellent agreement on the spatial profile and magnitude of beam and halo neutral densities and the NPA energy spectrum. The simulations show that the halo neutral density can be comparable to the beam neutral density. These halo neutrals can double the NPA flux, but they have minor effects on the NPA energy spectrum shape. The TRANSP and FIDAsim simulations also suggest that the magnitudes of beam and halo neutral densities are relatively sensitive to the choice of the atomic physics databases.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold H. Kritz

PTRANSP, which is the predictive version of the TRANSP code, was developed in a collaborative effort involving the Princeton Plasma Physics Laboratory, General Atomics Corporation, Lawrence Livermore National Laboratory, and Lehigh University. The PTRANSP/TRANSP suite of codes is the premier integrated tokamak modeling software in the United States. A production service for PTRANSP/TRANSP simulations is maintained at the Princeton Plasma Physics Laboratory; the server has a simple command-line client interface and is subscribed to by about 100 researchers from tokamak projects in the US, Europe, and Asia. This service produced nearly 13,000 PTRANSP/TRANSP simulations in the four-year period FY 2005 through FY 2008. Major archives of TRANSP results are maintained at PPPL, MIT, General Atomics, and JET. Recent utilization, counting experimental analysis simulations as well as predictive simulations, more than doubled from slightly over 2000 simulations per year in FY 2005 and FY 2006 to over 4300 simulations per year in FY 2007 and FY 2008. PTRANSP predictive simulations applied to ITER increased eightfold, from 30 simulations per year in FY 2005 and FY 2006 to 240 simulations per year in FY 2007 and FY 2008, accounting for more than half of the combined PTRANSP/TRANSP service CPU resource utilization in FY 2008. PTRANSP studies focused on ITER played a key role in journal articles. Examples of validation studies carried out for momentum transport in PTRANSP simulations were presented at the 2008 IAEA conference. The increase in the number of PTRANSP simulations has continued (more than 7000 TRANSP/PTRANSP simulations in 2010), and results of PTRANSP simulations appear in conference proceedings, for example the 2010 IAEA conference, and in peer-reviewed papers. PTRANSP provides a bridge to the Fusion Simulation Program (FSP) and to the future of integrated modeling.
Through years of widespread usage, each of the many parts of the PTRANSP suite of codes has been thoroughly validated against experimental data and benchmarked against other codes. At the same time, architectural modernizations are improving the modularity of the PTRANSP code base. The NUBEAM neutral beam and fusion products fast ion model, the Plasma State data repository (developed originally in the SWIM SciDAC project and adapted for use in PTRANSP), and other components are already shared with the SWIM, FACETS, and CPES SciDAC FSP prototype projects. Thus, the PTRANSP code is already serving as a bridge between our present integrated modeling capability and future capability. As the Fusion Simulation Program builds toward the capability currently available in the PTRANSP suite of codes, early versions of the FSP core plasma model will need to be benchmarked against PTRANSP simulations. This will be necessary to build user confidence in FSP, but this benchmarking can only be done if PTRANSP itself is maintained and developed.

  5. TRANSP: status and planning

    NASA Astrophysics Data System (ADS)

    Andre, R.; Carlsson, J.; Gorelenkova, M.; Jardin, S.; Kaye, S.; Poli, F.; Yuan, X.

    2016-10-01

TRANSP is an integrated interpretive and predictive transport analysis tool that incorporates state-of-the-art heating/current drive sources and transport models. The treatments and transport solvers are becoming increasingly sophisticated and comprehensive. For instance, the ISOLVER component provides a free-boundary equilibrium solution, while the PT_SOLVER transport solver is especially suited for stiff transport models such as TGLF. TRANSP incorporates high-fidelity heating and current drive source models, such as NUBEAM for neutral beam injection, the beam tracing code TORBEAM for EC, TORIC for ICRF, and the ray tracing codes TORAY and GENRAY for EC. The implementation of selected components makes efficient use of MPI to speed up code calculations. Recently the GENRAY-CQL3D solver for modeling of LH heating and current drive has been implemented; it is currently being extended to multiple antennas, to allow modeling of EAST discharges. GENRAY+CQL3D is also being extended to the use of EC/EBW and of HHFW for NSTX-U. This poster will describe present uses of the code worldwide, as well as plans for upgrading the physics modules and code framework. Work supported by the US Department of Energy under contract DE-AC02-09CH11466.

  6. Tearing Mode Stability of Evolving Toroidal Equilibria

    NASA Astrophysics Data System (ADS)

    Pletzer, A.; McCune, D.; Manickam, J.; Jardin, S. C.

    2000-10-01

There are a number of toroidal equilibrium codes (such as JSOLVER, ESC, EFIT, and VMEC) and transport codes (such as TRANSP, BALDUR, and TSC) in our community that utilize differing equilibrium representations. There are also many heating and current drive codes (LSC and TORAY) and stability codes (PEST1-3, GATO, NOVA, MARS, DCON, M3D) that require this equilibrium information. In an effort to provide seamless compatibility between the codes that produce and need these equilibria, we have developed two Fortran 90 modules, MEQ and XPLASMA, that serve as common interfaces between these two classes of codes. XPLASMA provides a common equilibrium representation for the heating and current drive applications, while MEQ provides the common equilibrium and associated metric information needed by MHD stability codes. We illustrate the utility of this approach by presenting results of PEST-3 tearing stability calculations of an NSTX discharge performed on profiles provided by the TRANSP code. Using the MEQ module, the TRANSP equilibrium data are stored in a Fortran 90 derived type and passed to PEST-3 as a subroutine argument. All calculations are performed on the fly, as the profiles evolve.
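The role of a common equilibrium interface of the MEQ/XPLASMA kind (a producer code fills one shared representation; consumer codes interpolate from it without knowing the producer's native format) can be illustrated with a minimal sketch. The `Equilibrium` class, its fields, and the profiles below are hypothetical stand-ins, not the actual Fortran 90 derived types.

```python
from dataclasses import dataclass

@dataclass
class Equilibrium:
    """Toy stand-in for a shared equilibrium representation.

    Profiles are stored on a common radial grid so that any consumer
    code (e.g. a stability code) can interpolate without knowing how
    the producer (e.g. a transport code) represents its equilibrium.
    """
    rho: list       # normalized radius grid
    q: list         # safety factor profile
    pressure: list  # pressure profile (arbitrary units)

    def q_at(self, r):
        """Linear interpolation of q onto an arbitrary radius."""
        for i in range(len(self.rho) - 1):
            if self.rho[i] <= r <= self.rho[i + 1]:
                t = (r - self.rho[i]) / (self.rho[i + 1] - self.rho[i])
                return (1 - t) * self.q[i] + t * self.q[i + 1]
        raise ValueError("radius outside grid")

# A 'producer' fills the shared structure once...
eq = Equilibrium(rho=[0.0, 0.5, 1.0], q=[1.0, 1.5, 3.0],
                 pressure=[2.0, 1.0, 0.1])
# ...and any 'consumer' queries it through the common interface.
q_mid = eq.q_at(0.25)
```

The design point is the single shared data type: adding a new producer or consumer requires implementing only one conversion, not one per code pair.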

  7. Benchmark of 3D halo neutral simulation in TRANSP and FIDASIM and application to projected neutral-beam-heated NSTX-U plasmas

    NASA Astrophysics Data System (ADS)

    Liu, D.; Medley, S. S.; Gorelenkova, M. V.; Heidbrink, W. W.; Stagner, L.

    2014-10-01

A cloud of halo neutrals is created in the vicinity of the beam footprint during neutral beam injection, and the halo neutral density can be comparable with the beam neutral density. Proper modeling of halo neutrals is critical to correctly interpret neutral particle analyzer (NPA) and fast ion D-alpha (FIDA) signals, since these signals strongly depend on the local beam and halo neutral density. A 3D halo neutral model has recently been developed and implemented inside the TRANSP code. The 3D halo neutral code uses a ``beam-in-a-box'' model that encompasses both injected beam neutrals and resulting halo neutrals. Upon deposition by charge exchange, a subset of the full, one-half and one-third beam energy components produce thermal halo neutrals that are tracked through successive halo neutral generations until an ionization event occurs or a descendant halo exits the box. A benchmark between the 3D halo neutral model in TRANSP and the FIDA/NPA synthetic diagnostic code FIDASIM is carried out. A detailed comparison of halo neutral density profiles from the two codes will be shown. The NPA and FIDA simulations with and without 3D halos are applied to projections of plasma performance for the National Spherical Torus eXperiment-Upgrade (NSTX-U), and the effects of halo neutral density on NPA and FIDA signal amplitude and profile will be presented. Work supported by US DOE.

  8. Measurements of confined alphas and tritons in the MHD quiescent core of TFTR plasmas using the pellet charge exchange diagnostic

    NASA Astrophysics Data System (ADS)

    Medley, S. S.; Budny, R. V.; Mansfield, D. K.; Redi, M. H.; Roquemore, A. L.; Fisher, R. K.; Duong, H. H.; McChesney, J. M.; Parks, P. B.; Petrov, M. P.; Gorelenkov, N. N.

    1996-10-01

The energy distributions and radial density profiles of the fast confined trapped alpha particles in DT experiments on TFTR are being measured in the energy range 0.5-3.5 MeV using the pellet charge exchange (PCX) diagnostic. A brief description of the measurement technique, which involves active neutral particle analysis using the ablation cloud surrounding an injected impurity pellet as the neutralizer, is presented. This paper focuses on alpha and triton measurements in the core of MHD quiescent TFTR discharges, where the expected classical slowing-down and pitch angle scattering effects are not complicated by stochastic ripple diffusion and sawtooth activity. In particular, the first measurement of the alpha slowing-down distribution up to the birth energy, obtained using boron pellet injection, is presented. The measurements are compared with predictions using the TRANSP Monte Carlo code and/or a Fokker-Planck post-TRANSP processor code, which assumes that the alphas and tritons are well confined and slow down classically. Both the shape of the measured alpha and triton energy distributions and their density ratios are in good agreement with the code calculations. We conclude that the PCX measurements are consistent with classical thermalization of the fusion-generated alphas and tritons.

  9. Comparison of simulated heat transport in NSTX via high frequency Alfvén eigenmode-induced electron orbit modification with TRANSP power balance modeling

    NASA Astrophysics Data System (ADS)

    Crocker, N. A.; Tritz, K.; White, R. B.; Fredrickson, E. D.; Gorelenkov, N. N.; NSTX-U Team

    2016-10-01

Compressional (CAE) and global (GAE) AEs have been hypothesized to cause an anomalously high electron thermal diffusivity (χe) routinely inferred via TRANSP power balance modeling in the core (r / a < 0.3) of NSTX beam heated plasmas. New simulations with the guiding-center code ORBIT test a leading proposed transport mechanism: electron orbit stochastization by multiple modes. Simulations with a set of modes identified as GAEs in a high performance, beam heated plasma (using experimentally determined amplitudes, frequencies and wave numbers) yield a χe insufficient to match TRANSP. To produce a comparable χe, the amplitudes must be increased by a factor of 10, which is outside the bounds of measurement uncertainty. Many observed modes, identified as CAEs, could not be included without modifications to ORBIT; these are in progress. However, given the uncertainties in identification, it is informative to calculate χe assuming all the observed modes are GAEs. This leads to a substantially higher χe, although an amplitude increase by a factor > 3 is still necessary to match TRANSP. Supported by US DOE Contracts DE-SC0011810, DE-FG02-99ER54527 and DE-AC02-09CH11466.
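The effective diffusivity inferred from test-particle runs of this kind is essentially a mean-square-displacement estimate, χ ≈ ⟨Δr²⟩/(2t). A toy random-walk version (kick sizes and scalings invented; this is not the ORBIT guiding-center calculation) illustrates why the inferred χe is so sensitive to mode amplitude:

```python
import random

def effective_chi(amplitude, n_particles=2000, n_steps=200, dt=1e-6, seed=3):
    """Toy mean-square-displacement diffusivity, chi ~ <dr^2> / (2 t).

    Each particle takes random radial kicks whose width scales linearly
    with mode amplitude; all numbers are illustrative, not physical.
    """
    rng = random.Random(seed)
    t = n_steps * dt
    total = 0.0
    for _ in range(n_particles):
        r = 0.0
        for _ in range(n_steps):
            r += rng.gauss(0.0, amplitude * 1e-4)  # per-step radial kick (m)
        total += r * r
    return total / n_particles / (2.0 * t)

chi_1 = effective_chi(1.0)
chi_10 = effective_chi(10.0)
# kick width ~ amplitude, so the diffusivity scales as amplitude squared:
# a 10x amplitude increase raises the inferred chi by roughly 100x
```

This quadratic scaling is why a factor-of-10 amplitude increase, as quoted in the abstract, makes such a large difference to the computed χe.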

  10. Feedback control design for non-inductively sustained scenarios in NSTX-U using TRANSP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyer, M. D.; Andre, R. G.; Gates, D. A.

This paper examines a method for real-time control of non-inductively sustained scenarios in NSTX-U by using TRANSP, a time-dependent integrated modeling code for prediction and interpretive analysis of tokamak experimental data, as a simulator. The actuators considered for control in this work are the six neutral beam sources and the plasma boundary shape. To understand the response of the plasma current, stored energy, and central safety factor to these actuators and to enable systematic design of control algorithms, simulations were run in which the actuators were modulated and a linearized dynamic response model was generated. A multi-variable model-based control scheme that accounts for the coupling and slow dynamics of the system while mitigating the effect of actuator limitations was designed and simulated. Simulations show that modest changes in the outer gap and heating power can improve the response time of the system, reject perturbations, and track target values of the controlled quantities.

  11. Feedback control design for non-inductively sustained scenarios in NSTX-U using TRANSP

    DOE PAGES

    Boyer, M. D.; Andre, R. G.; Gates, D. A.; ...

    2017-04-24

This paper examines a method for real-time control of non-inductively sustained scenarios in NSTX-U by using TRANSP, a time-dependent integrated modeling code for prediction and interpretive analysis of tokamak experimental data, as a simulator. The actuators considered for control in this work are the six neutral beam sources and the plasma boundary shape. To understand the response of the plasma current, stored energy, and central safety factor to these actuators and to enable systematic design of control algorithms, simulations were run in which the actuators were modulated and a linearized dynamic response model was generated. A multi-variable model-based control scheme that accounts for the coupling and slow dynamics of the system while mitigating the effect of actuator limitations was designed and simulated. Simulations show that modest changes in the outer gap and heating power can improve the response time of the system, reject perturbations, and track target values of the controlled quantities.

  12. Feedback control design for non-inductively sustained scenarios in NSTX-U using TRANSP

    NASA Astrophysics Data System (ADS)

    Boyer, M. D.; Andre, R. G.; Gates, D. A.; Gerhardt, S. P.; Menard, J. E.; Poli, F. M.

    2017-06-01

This paper examines a method for real-time control of non-inductively sustained scenarios in NSTX-U by using TRANSP, a time-dependent integrated modeling code for prediction and interpretive analysis of tokamak experimental data, as a simulator. The actuators considered for control in this work are the six neutral beam sources and the plasma boundary shape. To understand the response of the plasma current, stored energy, and central safety factor to these actuators and to enable systematic design of control algorithms, simulations were run in which the actuators were modulated and a linearized dynamic response model was generated. A multi-variable model-based control scheme that accounts for the coupling and slow dynamics of the system while mitigating the effect of actuator limitations was designed and simulated. Simulations show that modest changes in the outer gap and heating power can improve the response time of the system, reject perturbations, and track target values of the controlled quantities.
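The identify-then-design step described in this abstract (modulate the actuators, fit a linearized response model to the simulated output, then build a controller on the fitted model) can be sketched on a scalar toy system. The dynamics, coefficients, and least-squares fit below are illustrative assumptions, not the NSTX-U response model:

```python
import numpy as np

# 1. "Experiment": modulate an actuator u and record a scalar response x.
#    Hidden first-order dynamics dx/dt = a*x + b*u with a = -2, b = 4.
dt, n = 0.01, 2000
u = np.sign(np.sin(2 * np.pi * 0.5 * np.arange(n) * dt))  # square-wave modulation
x = np.zeros(n)
for k in range(n - 1):
    x[k + 1] = x[k] + dt * (-2.0 * x[k] + 4.0 * u[k])

# 2. Fit a linearized response model x[k+1] = A x[k] + B u[k] by least squares.
H = np.column_stack([x[:-1], u[:-1]])
A_hat, B_hat = np.linalg.lstsq(H, x[1:], rcond=None)[0]
a_hat = (A_hat - 1.0) / dt   # recovered continuous-time coefficients
b_hat = B_hat / dt

# 3. Use the fitted model for control design, e.g. the feedforward input
#    that holds a target steady state: 0 = a*x_ss + b*u_ss.
target = 1.0
u_ss = -a_hat * target / b_hat
```

Because the toy data are noise-free, the fit recovers the hidden coefficients essentially exactly; with noisy diagnostics the same regression gives the best linear approximation, which is the object the controller is designed against.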

13. Computation of Alfvén eigenmode stability and saturation through a reduced fast ion transport model in the TRANSP tokamak transport code

    NASA Astrophysics Data System (ADS)

    Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.; White, R. B.

    2017-09-01

Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) the expected unstable AE spectrum and (ii) the resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral beam heated NSTX discharge is used as reference to illustrate the potential of a reduced fast ion transport model, known as the kick model, which has recently been implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Additional information from the actual experiment enables further tuning of the model’s parameters to achieve a close match with measurements.
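The core idea of a kick-type reduced model (fast ions accumulate stochastic transport "kicks" whose size scales with the mode amplitude) can be caricatured in a few lines. All distributions and numbers here are invented for illustration; the actual model uses correlated energy and canonical-momentum kick probabilities computed from guiding-center simulations.

```python
import random

def apply_kicks(particles, amplitude, n_steps=100, dE0=0.5, seed=2):
    """Toy 'kick model': each step, every fast ion receives a random
    energy kick drawn from a zero-mean distribution whose width scales
    with mode amplitude (the correlated dE/dP_phi structure of the real
    model is omitted). Ions kicked below E = 0 count as lost from the
    fast population.
    """
    rng = random.Random(seed)
    lost = 0
    survivors = []
    for E in particles:
        for _ in range(n_steps):
            E += rng.gauss(0.0, dE0 * amplitude)
            if E <= 0.0:
                lost += 1
                break
        else:
            survivors.append(E)
    return survivors, lost

ions = [10.0] * 1000            # mono-energetic toy population (arb. units)
weak = apply_kicks(ions, amplitude=0.1)
strong = apply_kicks(ions, amplitude=2.0)
# a larger saturated mode amplitude gives broader kicks and more loss
```

This captures the qualitative link the abstract relies on: the predicted saturation amplitude sets the kick strength, which in turn sets the degradation of the fast ion population.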

14. Computation of Alfvén eigenmode stability and saturation through a reduced fast ion transport model in the TRANSP tokamak transport code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.

Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) the expected unstable AE spectrum and (ii) the resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral beam heated NSTX discharge is used as reference to illustrate the potential of a reduced fast ion transport model, known as the kick model, which has recently been implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Finally, additional information from the actual experiment enables further tuning of the model's parameters to achieve a close match with measurements.

15. Computation of Alfvén eigenmode stability and saturation through a reduced fast ion transport model in the TRANSP tokamak transport code

    DOE PAGES

    Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.; ...

    2017-07-20

Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) the expected unstable AE spectrum and (ii) the resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral beam heated NSTX discharge is used as reference to illustrate the potential of a reduced fast ion transport model, known as the kick model, which has recently been implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Finally, additional information from the actual experiment enables further tuning of the model's parameters to achieve a close match with measurements.

  16. Initial applications of the non-Maxwellian extension of the full-wave TORIC v.5 code in the mid/high harmonic and minority heating regimes

    NASA Astrophysics Data System (ADS)

    Bertelli, N.; Valeo, E. J.; Phillips, C. K.

    2015-11-01

A non-Maxwellian extension of the full-wave TORIC v.5 code in the mid/high harmonic and minority heating regimes has been revisited. In both regimes, treatment of non-Maxwellian ions is needed in order to improve the analysis of combined fast wave (FW) and neutral beam injection (NBI) heated discharges in current fusion devices. Additionally, this extension is also needed for time-dependent analysis, where combined heating experiments are generally considered. Initial numerical cases with thermal ions and with non-Maxwellian ions are presented for both regimes. The simulations are then compared with results from the AORSA code, which has already been extended to include non-Maxwellian ions. First attempts to apply this extension self-consistently with the NUBEAM module, which is included in the TRANSP code, are also discussed. Work supported by US DOE Contracts # DE-FC02-01ER54648 and DE-AC02-09CH11466.

  17. Analysis of activation and shutdown contact dose rate for EAST neutral beam port

    NASA Astrophysics Data System (ADS)

    Chen, Yuqing; Wang, Ji; Zhong, Guoqiang; Li, Jun; Wang, Jinfang; Xie, Yahong; Wu, Bin; Hu, Chundong

    2017-12-01

For the safe operation and maintenance of the neutral beam injector (NBI), the specific activity and shutdown contact dose rate of the sample material SS316 are estimated around the experimental advanced superconducting tokamak (EAST) neutral beam port. First, the neutron emission intensity is calculated with the TRANSP code while the neutral beam is co-injected into EAST. Second, the neutron activation and shutdown contact dose rates for the neutral beam sample material SS316 are derived with the Monte Carlo code MCNP and the inventory code FISPACT-2007. The simulations indicate that the primary radioactive nuclides of SS316 are 58Co and 54Mn. The peak contact dose rate is 8.52 × 10⁻⁶ Sv/h one second after EAST shutdown, which is below the International Thermonuclear Experimental Reactor (ITER) design value of 1 × 10⁻⁵ Sv/h.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goumiri, I. R.; Rowley, C. W.; Sabbagh, S. A.

In this study, a model-based feedback system is presented enabling the simultaneous control of the stored energy through βN and the toroidal rotation profile of the plasma in the National Spherical Torus eXperiment Upgrade (NSTX-U) device. Actuation is obtained using the momentum from six injected neutral beams and the neoclassical toroidal viscosity generated by applying three-dimensional magnetic fields. Based on a model of the momentum diffusion and torque balance, a feedback controller is designed and tested in closed-loop simulations using TRANSP, a time-dependent transport analysis code, in predictive mode. Promising results for the ongoing experimental implementation of the controllers are obtained.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Guoyong; Budny, Robert; Gorelenkov, Nikolai

We report here the work done for the FY14 OFES Theory Performance Target as given below: "Understanding alpha particle confinement in ITER, the world's first burning plasma experiment, is a key priority for the fusion program. In FY 2014, determine linear instability trends and thresholds of energetic particle-driven shear Alfven eigenmodes in ITER for a range of parameters and profiles using a set of complementary simulation models (gyrokinetic, hybrid, and gyrofluid). Carry out initial nonlinear simulations to assess the effects of the unstable modes on energetic particle transport". In the past year (FY14), a systematic study of the alpha-driven Alfvén modes in ITER has been carried out jointly by researchers from six institutions involving seven codes, including the transport simulation code TRANSP (R. Budny and F. Poli, PPPL), three gyrokinetic codes: GEM (Y. Chen, Univ. of Colorado), GTC (J. McClenaghan, Z. Lin, UCI), and GYRO (E. Bass, R. Waltz, UCSD/GA), the hybrid code M3D-K (G.Y. Fu, PPPL), the gyro-fluid code TAEFL (D. Spong, ORNL), and the linear kinetic stability code NOVA-K (N. Gorelenkov, PPPL). A range of ITER parameters and profiles is specified by TRANSP simulation of a hybrid scenario case and a steady-state scenario case. Based on the specified ITER equilibria, linear stability calculations are done to determine the stability boundary of alpha-driven high-n TAEs using the five initial value codes (GEM, GTC, GYRO, M3D-K, and TAEFL) and the kinetic stability code (NOVA-K). Both the effects of alpha particles and of beam ions have been considered. Finally, the effects of the unstable modes on energetic particle transport have been explored using GEM and M3D-K.

  20. Orchestrating TRANSP Simulations for Interpretative and Predictive Tokamak Modeling with OMFIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grierson, B. A.; Yuan, X.; Gorelenkova, M.

TRANSP simulations are being used in the OMFIT workflow manager to enable a machine-independent means of experimental analysis, postdictive validation, and predictive time-dependent simulations on the DIII-D, NSTX, JET and C-MOD tokamaks. The procedures for preparing the input data from plasma profile diagnostics and equilibrium reconstruction, as well as processing of the time-dependent heating and current drive sources and assumptions about the neutral recycling, vary across machines, but are streamlined by using a common workflow manager. Settings for TRANSP simulation fidelity are incorporated into the OMFIT framework, contrasting between-shot analysis, power balance, and fast-particle simulations. A previously established series of data consistency metrics is computed, such as comparison of experimental vs. calculated neutron rate, equilibrium stored energy vs. total stored energy from profile and fast-ion pressure, and experimental vs. computed surface loop voltage. Discrepancies between data consistency metrics can indicate errors in input quantities such as the electron density profile or Zeff, or indicate anomalous fast-particle transport. Measures to assess the sensitivity of the verification metrics to input quantities are provided by OMFIT, including scans of the input profiles and standardized post-processing visualizations. For predictive simulations, TRANSP uses GLF23 or TGLF to predict core plasma profiles, with user-defined boundary conditions in the outer region of the plasma. ITPA validation metrics are provided in post-processing to assess the transport model validity. By using OMFIT to orchestrate the steps for experimental data preparation, selection of operating mode, submission, post-processing and visualization, we have streamlined and standardized the usage of TRANSP.
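The data-consistency check described above amounts to comparing measured and code-computed quantities channel by channel and flagging discrepancies that exceed a tolerance. A minimal sketch, in which the channel names, values, and the 10% tolerance are all hypothetical:

```python
def consistency_metrics(measured, computed):
    """Relative discrepancy for each data-consistency channel
    (e.g. neutron rate, stored energy, surface loop voltage).

    Values well above the expected uncertainty flag possible errors in
    input quantities or anomalous fast-particle transport.
    """
    return {k: abs(computed[k] - measured[k]) / abs(measured[k])
            for k in measured}

# Hypothetical example values for three channels.
measured = {"neutron_rate": 1.00e14, "W_total_MJ": 0.95, "V_loop": 0.80}
computed = {"neutron_rate": 1.12e14, "W_total_MJ": 0.93, "V_loop": 1.05}

TOLERANCE = 0.10  # illustrative 10% consistency threshold
flags = {k: v > TOLERANCE
         for k, v in consistency_metrics(measured, computed).items()}
# here the neutron rate and loop voltage exceed the tolerance,
# while the stored energy does not
```

A scan over perturbed input profiles, as the abstract describes, then shows which inputs each flagged metric is most sensitive to.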

21. Orchestrating TRANSP Simulations for Interpretative and Predictive Tokamak Modeling with OMFIT

    DOE PAGES

    Grierson, B. A.; Yuan, X.; Gorelenkova, M.; ...

    2018-02-21

TRANSP simulations are being used in the OMFIT workflow manager to enable a machine-independent means of experimental analysis, postdictive validation, and predictive time-dependent simulations on the DIII-D, NSTX, JET and C-MOD tokamaks. The procedures for preparing the input data from plasma profile diagnostics and equilibrium reconstruction, as well as processing of the time-dependent heating and current drive sources and assumptions about the neutral recycling, vary across machines, but are streamlined by using a common workflow manager. Settings for TRANSP simulation fidelity are incorporated into the OMFIT framework, contrasting between-shot analysis, power balance, and fast-particle simulations. A previously established series of data consistency metrics is computed, such as comparison of experimental vs. calculated neutron rate, equilibrium stored energy vs. total stored energy from profile and fast-ion pressure, and experimental vs. computed surface loop voltage. Discrepancies between data consistency metrics can indicate errors in input quantities such as the electron density profile or Zeff, or indicate anomalous fast-particle transport. Measures to assess the sensitivity of the verification metrics to input quantities are provided by OMFIT, including scans of the input profiles and standardized post-processing visualizations. For predictive simulations, TRANSP uses GLF23 or TGLF to predict core plasma profiles, with user-defined boundary conditions in the outer region of the plasma. ITPA validation metrics are provided in post-processing to assess the transport model validity. By using OMFIT to orchestrate the steps for experimental data preparation, selection of operating mode, submission, post-processing and visualization, we have streamlined and standardized the usage of TRANSP.

  2. Central safety factor and β N control on NSTX-U via beam power and plasma boundary shape modification, using TRANSP for closed loop simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyer, M. D.; Andre, R.; Gates, D. A.

    The high-performance operational goals of NSTX-U will require development of advanced feedback control algorithms, including control of βN and the safety factor profile. In this work, a novel approach to simultaneously controlling βN and the value of the safety factor on the magnetic axis, q0, through manipulation of the plasma boundary shape and total beam power, is proposed. Simulations of the proposed scheme show promising results and motivate future experimental implementation and eventual integration into a more complex current profile control scheme planned to include actuation of individual beam powers, density, and loop voltage. As part of this work, a flexible framework for closed loop simulations within the high-fidelity code TRANSP was developed. The framework, used here to identify control-design-oriented models and to tune and test the proposed controller, exploits many of the predictive capabilities of TRANSP and provides a means for performing control calculations based on user-supplied data (controller matrices, target waveforms, etc.). The flexible framework should enable high-fidelity testing of a variety of control algorithms, thereby reducing the amount of expensive experimental time needed to implement new control algorithms on NSTX-U and other devices.
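
The closed-loop structure described above can be sketched schematically. Everything below is a stand-in: a 2x2 proportional gain matrix plays the role of the user-supplied controller matrices, and a toy integrator plant replaces TRANSP's predictive step:

```python
import numpy as np

K = np.array([[2.0, 0.0],
              [0.0, 0.5]])        # user-supplied controller gains (illustrative)
target = np.array([2.5, 1.2])     # desired (betaN, q0)

def controller(state):
    # proportional feedback on the (betaN, q0) error
    return K @ (target - state)

state = np.array([1.0, 0.8])      # initial (betaN, q0)
for _ in range(200):
    u = controller(state)         # actuator request: (beam power, shape) surrogate
    state = state + 0.05 * u      # toy integrator plant standing in for TRANSP
```

The point of the TRANSP framework is precisely that the toy plant in the last line is replaced by a high-fidelity predictive simulation step, while the controller side stays a user-supplied calculation.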

  3. TRANSP-based Trajectory Optimization of the Current Profile Evolution to Facilitate Robust Non-inductive Ramp-up in NSTX-U

    NASA Astrophysics Data System (ADS)

    Wehner, William; Schuster, Eugenio; Poli, Francesca

    2016-10-01

    Initial progress towards the design of non-inductive current ramp-up scenarios in the National Spherical Torus Experiment Upgrade (NSTX-U) has been made through the use of TRANSP predictive simulations. The strategy involves, first, ramping the plasma current with high harmonic fast waves (HHFW) to about 400 kA, and then further ramping to 900 kA with neutral beam injection (NBI). However, the early ramping of neutral beams and application of HHFW leads to an undesirably peaked current profile, making the plasma unstable to ballooning modes. We present an optimization-based control approach to improve on the non-inductive ramp-up strategy. We combine the TRANSP code with an optimization algorithm based on sequential quadratic programming to search for time evolutions of the NBI powers, the HHFW powers, and the line averaged density that define an open-loop actuator strategy that maximizes the non-inductive current while satisfying constraints associated with the current profile evolution for MHD stable plasmas. This technique has the potential to play a critical role in achieving robustly stable non-inductive ramp-up, which will ultimately be necessary to demonstrate applicability of the spherical torus concept to larger devices without sufficient room for a central coil. Supported by the US DOE under the SCGSR Program.
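
The optimization step can be illustrated with SciPy's SLSQP implementation of sequential quadratic programming, using a cheap quadratic surrogate in place of a full TRANSP evaluation. The objective, the peaking-style constraint, and all numbers are invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def noninductive_current(p):
    # surrogate objective: total driven current with diminishing returns
    # (stand-in for a TRANSP-evaluated figure of merit)
    return np.sum(p) - 0.05 * np.sum(p**2)

res = minimize(lambda p: -noninductive_current(p),   # maximize via minimize
               x0=np.ones(3),                        # 3 knots of an NBI power waveform
               method="SLSQP",
               bounds=[(0.0, 12.0)] * 3,
               constraints=[{"type": "ineq",
                             "fun": lambda p: 8.0 - p}])  # peaking-style limit, >= 0 required
```

The unconstrained optimum of this surrogate sits at 10 per knot, so the stability-motivated constraint is active and SLSQP lands on the constrained optimum of 8 per knot. In the actual workflow each objective/constraint evaluation would be a TRANSP run, which is why an efficient SQP search matters.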

  4. Central safety factor and βN control on NSTX-U via beam power and plasma boundary shape modification, using TRANSP for closed loop simulations

    NASA Astrophysics Data System (ADS)

    Boyer, M. D.; Andre, R.; Gates, D. A.; Gerhardt, S.; Goumiri, I. R.; Menard, J.

    2015-05-01

    The high-performance operational goals of NSTX-U will require development of advanced feedback control algorithms, including control of βN and the safety factor profile. In this work, a novel approach to simultaneously controlling βN and the value of the safety factor on the magnetic axis, q0, through manipulation of the plasma boundary shape and total beam power, is proposed. Simulations of the proposed scheme show promising results and motivate future experimental implementation and eventual integration into a more complex current profile control scheme planned to include actuation of individual beam powers, density, and loop voltage. As part of this work, a flexible framework for closed loop simulations within the high-fidelity code TRANSP was developed. The framework, used here to identify control-design-oriented models and to tune and test the proposed controller, exploits many of the predictive capabilities of TRANSP and provides a means for performing control calculations based on user-supplied data (controller matrices, target waveforms, etc). The flexible framework should enable high-fidelity testing of a variety of control algorithms, thereby reducing the amount of expensive experimental time needed to implement new control algorithms on NSTX-U and other devices.

  5. Statistical validation of predictive TRANSP simulations of baseline discharges in preparation for extrapolation to JET D-T

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Tae; Romanelli, M.; Yuan, X.; Kaye, S.; Sips, A. C. C.; Frassinetti, L.; Buchanan, J.; Contributors, JET

    2017-06-01

    This paper presents for the first time a statistical validation of predictive TRANSP simulations of plasma temperature using two transport models, GLF23 and TGLF, over a database of 80 baseline H-mode discharges in JET-ILW. While the accuracy of the predicted Te with TRANSP-GLF23 is affected by plasma collisionality, the dependency of predictions on collisionality is less significant when using TRANSP-TGLF, indicating that the latter model has a broader applicability across plasma regimes. TRANSP-TGLF also shows good agreement of the predicted Ti with experimental measurements, allowing for a more accurate prediction of the neutron yields. The impact of input data and assumptions prescribed in the simulations is also investigated in this paper. The statistical validation and the assessment of uncertainty level in predictive TRANSP simulations for JET-ILW-DD will constitute the basis for the extrapolation to JET-ILW-DT experiments.
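
A statistical validation of this kind reduces, in sketch form, to computing an error metric per discharge and testing whether it correlates with collisionality. The synthetic numbers below are illustrative only, not the JET-ILW database; "model A" mimics a collisionality-dependent prediction error and "model B" a flat one:

```python
import numpy as np

rng = np.random.default_rng(0)
nu_star = rng.uniform(0.05, 1.0, 80)    # synthetic collisionality, one value per shot

# per-shot temperature prediction error (RMS-style), synthetic:
err_A = 0.05 + 0.30 * nu_star + 0.02 * rng.standard_normal(80)  # grows with nu*
err_B = 0.10 + 0.02 * rng.standard_normal(80)                   # independent of nu*

# correlation of prediction error with collisionality
corr_A = np.corrcoef(nu_star, err_A)[0, 1]
corr_B = np.corrcoef(nu_star, err_B)[0, 1]
```

A strong correlation (model A) signals a regime-dependent deficiency, while a near-zero correlation (model B) indicates broader applicability across plasma regimes, which is the sense in which TGLF outperforms GLF23 in the paper.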

  6. Transport and stability analyses supporting disruption prediction in high beta KSTAR plasmas

    NASA Astrophysics Data System (ADS)

    Ahn, J.-H.; Sabbagh, S. A.; Park, Y. S.; Berkery, J. W.; Jiang, Y.; Riquezes, J.; Lee, H. H.; Terzolo, L.; Scott, S. D.; Wang, Z.; Glasser, A. H.

    2017-10-01

    KSTAR plasmas have reached high stability parameters in dedicated experiments, with normalized beta βN exceeding 4.3 at relatively low plasma internal inductance li (βN/li > 6). Transport and stability analyses have begun on these plasmas to best understand a disruption-free path toward the design target of βN = 5 while aiming to maximize the non-inductive fraction of these plasmas. Initial analysis using the TRANSP code indicates that the non-inductive current fraction in these plasmas has exceeded 50 percent. The advent of KSTAR kinetic equilibrium reconstructions now allows more accurate computation of the MHD stability of these plasmas. Attention is placed on code validation of mode stability using the PEST-3 and resistive DCON codes. Initial evaluation of these analyses for disruption prediction is made using the disruption event characterization and forecasting (DECAF) code. The present global mode kinetic stability model in DECAF, developed for low aspect ratio plasmas, is evaluated to determine modifications required for successful disruption prediction of KSTAR plasmas. Work supported by U.S. DoE under contract DE-SC0016614.

  7. RF current profile control studies in the alcator C-mod tokamak

    NASA Astrophysics Data System (ADS)

    Bonoli, P. T.; Porkolab, M.; Wukitch, S. J.; Bernabei, S.; Kaita, R.; Mikkelsen, D.; Phillips, C. K.; Schilling, G.

    1999-09-01

    Time dependent calculations of lower hybrid (LH) current profile control in Alcator C-Mod have been done using the TRANSP [1], FPPRF [2], and LSC [3] codes. Up to 3 MW of LH current drive power was applied in plasmas with high power ICRF minority heating (P_ICH = 1.8-3 MW) and fast current ramp up. Using the experimentally measured temperature profiles, off-axis current generation resulted in nonmonotonic q-profiles with q_min ≈ 1.6. Self-consistent effects of off-axis electron heating by the LH power were also included in the analysis, and significant broadening of the electron temperature profile was found, with q_min ≳ 2 and a larger shear reversal radius.

  8. Simultaneous feedback control of plasma rotation and stored energy on NSTX-U using neoclassical toroidal viscosity and neutral beam injection

    NASA Astrophysics Data System (ADS)

    Goumiri, I. R.; Rowley, C. W.; Sabbagh, S. A.; Gates, D. A.; Boyer, M. D.; Gerhardt, S. P.; Kolemen, E.; Menard, J. E.

    2017-05-01

    A model-based feedback system is presented enabling the simultaneous control of the stored energy through βN and the toroidal rotation profile of the plasma in the National Spherical Torus eXperiment Upgrade (NSTX-U) device. Actuation is obtained using the momentum from six injected neutral beams and the neoclassical toroidal viscosity generated by applying three-dimensional magnetic fields. Based on a model of the momentum diffusion and torque balance, a feedback controller is designed and tested in closed-loop simulations using TRANSP, a time dependent transport analysis code, in predictive mode. Promising results for the ongoing experimental implementation of controllers are obtained.
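
The torque-balance idea behind the controller can be caricatured with a zero-dimensional momentum equation in which beam torque spins the plasma up and NTV from 3D fields brakes it. All parameters below are invented, and a simple proportional split between the two actuators replaces the paper's model-based design:

```python
# Toy torque balance: dL/dt = T_nbi - T_ntv - L/tau, with each actuator
# engaged on one side of the rotation target. Units are arbitrary.

tau = 0.1                       # momentum confinement time (s), invented
target = 50.0                   # rotation target (arbitrary units)
L, dt = 0.0, 1e-3
for _ in range(2000):
    err = target - L
    t_nbi = max(0.0, 5.0 * err)     # beams torque up when rotation is low
    t_ntv = max(0.0, -2.0 * err)    # 3D-field NTV brakes when rotation is high
    L += dt * (t_nbi - t_ntv - L / tau)
```

A pure proportional law like this leaves a steady-state offset (here L settles near 250/15 ≈ 16.7 rather than 50), which is one reason the paper designs the controller from an identified model of momentum diffusion and torque balance rather than ad hoc gains.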

  9. Simultaneous feedback control of plasma rotation and stored energy on NSTX-U using neoclassical toroidal viscosity and neutral beam injection

    PubMed Central

    Goumiri, I. R.; Sabbagh, S. A.; Boyer, M. D.; Gerhardt, S. P.; Kolemen, E.; Menard, J. E.

    2017-01-01

    A model-based feedback system is presented enabling the simultaneous control of the stored energy through βN and the toroidal rotation profile of the plasma in the National Spherical Torus eXperiment Upgrade (NSTX-U) device. Actuation is obtained using the momentum from six injected neutral beams and the neoclassical toroidal viscosity generated by applying three-dimensional magnetic fields. Based on a model of the momentum diffusion and torque balance, a feedback controller is designed and tested in closed-loop simulations using TRANSP, a time dependent transport analysis code, in predictive mode. Promising results for the ongoing experimental implementation of controllers are obtained. PMID:28435207

  10. Simultaneous feedback control of plasma rotation and stored energy on NSTX-U using neoclassical toroidal viscosity and neutral beam injection

    DOE PAGES

    Goumiri, I. R.; Rowley, C. W.; Sabbagh, S. A.; ...

    2017-02-23

    In this study, a model-based feedback system is presented enabling the simultaneous control of the stored energy through βN and the toroidal rotation profile of the plasma in the National Spherical Torus eXperiment Upgrade (NSTX-U) device. Actuation is obtained using the momentum from six injected neutral beams and the neoclassical toroidal viscosity generated by applying three-dimensional magnetic fields. Based on a model of the momentum diffusion and torque balance, a feedback controller is designed and tested in closed-loop simulations using TRANSP, a time dependent transport analysis code, in predictive mode. Promising results for the ongoing experimental implementation of controllers are obtained.

  11. Ion absorption of the high harmonic fast wave in the National Spherical Torus Experiment

    NASA Astrophysics Data System (ADS)

    Rosenberg, Adam Lewis

    Ion absorption of the high harmonic fast wave in a spherical torus is of critical importance to assessing the viability of the wave as a means of heating and driving current. Analysis of recent NSTX shots has revealed that under some conditions when neutral beam and RF power are injected into the plasma simultaneously, a fast ion population with energy above the beam injection energy is sustained by the wave. In agreement with modeling, these experiments find the RF-induced fast ion tail strength and neutron rate at lower B-fields to be less enhanced, likely due to a larger β profile, which promotes greater off-axis absorption where the fast ion population is small. Ion loss codes find the increased loss fraction with decreased B insufficient to account for the changes in tail strength, providing further evidence that this is an RF interaction effect. Though greater ion absorption is predicted with lower k∥, surprisingly little variation in the tail was observed, along with a neutron rate enhancement with higher k∥. Data from the neutral particle analyzer, neutron detectors, x-ray crystal spectrometer, and Thomson scattering are presented, along with results from the TRANSP transport analysis code, the ray-tracing codes HPRT and CURRAY, the full-wave code AORSA, the quasilinear code CQL3D, and the ion loss codes EIGOL and CONBEAM.

  12. Study of the effect of sawteeth on fast ions and neutron emission in MAST using a neutron camera

    NASA Astrophysics Data System (ADS)

    Cecconello, M.; Sperduti, A.; the MAST team

    2018-05-01

    The effect of the sawtooth instability on the confinement of fast ions on MAST, and the impact it has on the neutron emission, has been studied in detail using the TRANSP/NUBEAM codes coupled to a full orbit following code. The sawtooth models in TRANSP/NUBEAM indicate that, on MAST, passing and trapped fast ions are redistributed in approximately equal number and on a level that is consistent with the observations. It has not been possible to discriminate between the different sawtooth models since their predictions are all compatible with the neutron camera observations. Full orbit calculations of the fast ion motion have been used to estimate the characteristic time scales and energy thresholds that according to theoretical predictions govern the fast ion redistribution: no energy threshold for the redistribution of either passing or trapped fast ions was found. The characteristic times correspond, however, to frequencies that are comparable with the frequencies of an m = 1, n = 1 perturbation and its harmonics with toroidal mode numbers n = 2, ..., 4, suggesting that on spherical tokamaks, in addition to the classical sawtooth-induced transport mechanisms of fast ions by attachment to the evolving perturbation and the associated E × B drift, a resonance mechanism between the m = 1 perturbation and the fast ion orbits might be at play.

  13. Real-time diamagnetic flux measurements on ASDEX Upgrade.

    PubMed

    Giannone, L; Geiger, B; Bilato, R; Maraschek, M; Odstrčil, T; Fischer, R; Fuchs, J C; McCarthy, P J; Mertens, V; Schuhbeck, K H

    2016-05-01

    Real-time diamagnetic flux measurements are now available on ASDEX Upgrade. In contrast to the majority of diamagnetic flux measurements on other tokamaks, no analog summation of signals is necessary for measuring the change in toroidal flux or for removing contributions arising from unwanted coupling to the plasma and poloidal field coil currents. To achieve the highest possible sensitivity, the diamagnetic measurement and compensation coil integrators are triggered shortly before plasma initiation when the toroidal field coil current is close to its maximum. In this way, the integration time can be chosen to measure only the small changes in flux due to the presence of plasma. Two identical plasma discharges with positive and negative magnetic field have shown that the alignment error with respect to the plasma current is negligible. The measured diamagnetic flux is compared to that predicted by TRANSP simulations. The poloidal beta inferred from the diamagnetic flux measurement is compared to the values calculated from magnetic equilibrium reconstruction codes. The diamagnetic flux measurement and TRANSP simulation can be used together to estimate the coupled power in discharges with dominant ion cyclotron resonance heating.
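
For orientation, one textbook large-aspect-ratio relation connects the diamagnetic flux to poloidal beta. Sign and geometry conventions vary between machines, so the relation and numbers below are purely illustrative and are not the ASDEX Upgrade analysis:

```python
from math import pi

MU0 = 4e-7 * pi  # vacuum permeability (T m / A)

def beta_p_from_dia(delta_phi, ip, bt):
    """Illustrative screw-pinch relation (conventions vary):
    delta_phi = MU0**2 * ip**2 * (1 - beta_p) / (8 * pi * bt)."""
    return 1.0 - 8.0 * pi * bt * delta_phi / (MU0**2 * ip**2)

# hypothetical numbers: Ip = 1 MA, Bt = 2.5 T, measured flux change 2 mWb
bp = beta_p_from_dia(2e-3, 1e6, 2.5)
```

The smallness of the flux change relative to the total toroidal flux is what motivates the careful integrator triggering and compensation described in the abstract.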

  14. Performance Assessment of Model-Based Optimal Feedforward and Feedback Current Profile Control in NSTX-U using the TRANSP Code

    NASA Astrophysics Data System (ADS)

    Ilhan, Z.; Wehner, W. P.; Schuster, E.; Boyer, M. D.; Gates, D. A.; Gerhardt, S.; Menard, J.

    2015-11-01

    Active control of the toroidal current density profile is crucial to achieve and maintain high-performance, MHD-stable plasma operation in NSTX-U. A first-principles-driven, control-oriented model describing the temporal evolution of the current profile has been proposed earlier by combining the magnetic diffusion equation with empirical correlations obtained at NSTX-U for the electron density, electron temperature, and non-inductive current drives. A feedforward + feedback control scheme for the regulation of the current profile is constructed by embedding the proposed nonlinear, physics-based model into the control design process. Firstly, nonlinear optimization techniques are used to design feedforward actuator trajectories that steer the plasma to a desired operating state, with the objective of supporting the traditional trial-and-error experimental process of advanced scenario planning. Secondly, a feedback control algorithm to track a desired current profile evolution is developed with the goal of adding robustness to the overall control scheme. The effectiveness of the combined feedforward + feedback control algorithm for current profile regulation is tested in predictive simulations carried out in TRANSP. Supported by PPPL.
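
The feedforward + feedback split can be illustrated on a scalar toy system: the feedforward inverts a nominal model to track a target trajectory, and feedback compensates for the mismatch between the nominal model and the true plant. The dynamics, gains, and mismatch value below are invented:

```python
import numpy as np

dt, a = 0.01, 1.0
r = np.linspace(0.5, 1.5, 300)      # target trajectory for the toy "profile" state
u_ff = a * r                        # feedforward: inverts the nominal plant dx/dt = -a*x + u

def run(k_fb, mismatch=0.2):
    x, xs = 0.0, []
    for rt, uf in zip(r, u_ff):
        u = uf + k_fb * (rt - x)                # feedforward + feedback actuation
        x += dt * (-(a + mismatch) * x + u)     # true plant differs from the model
        xs.append(x)
    return np.array(xs)

err_ff_only = abs(run(0.0)[-1] - r[-1])     # feedforward alone: large tracking error
err_with_fb = abs(run(20.0)[-1] - r[-1])    # feedback shrinks the mismatch error
```

This mirrors the abstract's design logic: optimized feedforward trajectories do the steering, while feedback adds robustness against model error.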

  15. Calculation of prompt loss and toroidal field ripple loss under neutral beam injection on EAST

    NASA Astrophysics Data System (ADS)

    Wu, Bin; Hao, Baolong; White, Roscoe; Wang, Jinfang; Zang, Qing; Han, Xiaofeng; Hu, Chundong

    2017-02-01

    Neutral beam injection is a major auxiliary heating method in the EAST experimental campaign. This paper gives detailed calculations of beam loss with different plasma equilibria using the guiding center code ORBIT and NUBEAM/TRANSP. Increasing plasma current can dramatically lower the beam ion prompt loss and ripple loss. Countercurrent beam injection gives a much larger prompt loss fraction than co-injection, and ripple-induced collisionless stochastic diffusion is the dominant loss channel.

  16. Calculation of prompt loss and toroidal field ripple loss under neutral beam injection on EAST

    DOE PAGES

    Wu, Bin; Hao, Baolong; White, Roscoe; ...

    2016-12-09

    Here, neutral beam injection is a major auxiliary heating method in the EAST experimental campaign. This paper gives detailed calculations of beam loss with different plasma equilibria using the guiding center code ORBIT and NUBEAM/TRANSP. Increasing plasma current can dramatically lower the beam ion prompt loss and ripple loss. Countercurrent beam injection gives a much larger prompt loss fraction than co-injection, and ripple-induced collisionless stochastic diffusion is the dominant loss channel.

  17. Calculation of prompt loss and toroidal field ripple loss under neutral beam injection on EAST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Bin; Hao, Baolong; White, Roscoe

    Here, neutral beam injection is a major auxiliary heating method in the EAST experimental campaign. This paper gives detailed calculations of beam loss with different plasma equilibria using the guiding center code ORBIT and NUBEAM/TRANSP. Increasing plasma current can dramatically lower the beam ion prompt loss and ripple loss. Countercurrent beam injection gives a much larger prompt loss fraction than co-injection, and ripple-induced collisionless stochastic diffusion is the dominant loss channel.

  18. Modeling and control of plasma rotation and βn for NSTX-U using Neoclassical Toroidal Viscosity and Neutral Beam Injection

    NASA Astrophysics Data System (ADS)

    Goumiri, Imene; Rowley, Clarence; Sabbagh, Steven; Gates, David; Gerhardt, Stefan; Boyer, Mark

    2015-11-01

    A model-based system is presented allowing control of the plasma rotation profile in a magnetically confined toroidal fusion device to maintain plasma stability for long pulse operation. The analysis, using NSTX data and NSTX-U TRANSP simulations, is aimed at controlling plasma rotation using momentum from six injected neutral beams and neoclassical toroidal viscosity generated by three-dimensional applied magnetic fields as actuators. Based on the momentum diffusion and torque balance model obtained, a feedback controller is designed and predictive simulations using TRANSP will be presented. Robustness of the model and the rotation controller will be discussed.

  19. Beam ion acceleration by ICRH in JET discharges

    NASA Astrophysics Data System (ADS)

    Budny, R. V.; Gorelenkova, M.; Bertelli, N.; JET Collaboration

    2015-11-01

    The ion Monte-Carlo orbit integrator NUBEAM, used in TRANSP, has been enhanced to include an "RF-kick" operator to simulate the interaction of RF fields and fast ions. The RF quasi-linear operator (localized in space) uses a second R-Z orbit integrator. We apply this to analysis of recent JET discharges using ICRH with the ITER-like first wall. As an example, for a high-performance hybrid discharge for which standard TRANSP analysis predicted a DD neutron emission rate below measurements, re-analysis using the RF-kick operator results in increased beam parallel and perpendicular energy densities (≈40% and ≈15%, respectively) and increased beam-thermal neutron emission (≈35%), bringing the total rate closer to the measurement. Checks of the numerics, comparisons with measurements, and ITER implications will be presented. Supported in part by the US DoE contract DE-AC02-09CH11466 and by EUROfusion No 633053.

  20. Investigation of fast ion pressure effects in ASDEX Upgrade by spectral MSE measurements

    NASA Astrophysics Data System (ADS)

    Reimer, René; Dinklage, Andreas; Wolf, Robert; Dunne, Mike; Geiger, Benedikt; Hobirk, Jörg; Reich, Matthias; ASDEX Upgrade Team; McCarthy, Patrick J.

    2017-04-01

    High precision measurements of fast ion effects on the magnetic equilibrium in the ASDEX Upgrade tokamak have been conducted in a high-power (10 MW) neutral-beam injection discharge. An improved analysis of the spectral motional Stark effect data based on forward-modeling, including the Zeeman effect, fine structure and non-statistical sub-level distribution, revealed changes on the order of 1% in |B|. The results were found to be consistent with results from the equilibrium solver CLISTE. The measurements allowed us to derive the fast ion pressure fraction to be Δp_FI/p_mhd ≈ 10%, and variations of the fast ion pressure are consistent with calculations of the transport code TRANSP. The results advance the understanding of fast ion confinement and magneto-hydrodynamic stability in the presence of fast ions.

  1. Bounce frequency fishbone analysis

    NASA Astrophysics Data System (ADS)

    White, Roscoe; Fredrickson, Eric; Chen, Liu

    2002-11-01

    Large amplitude bursting modes are observed on NSTX, which are identified as bounce frequency fishbone modes [PDX Group, Princeton Plasma Physics Lab, Phys. Rev. Lett. 50, 891 (1983); L. Chen, R. B. White, and M. N. Rosenbluth, Phys. Rev. Lett. 52, 1122 (1984)]. The identification is carried out using numerical equilibria obtained from TRANSP [R. V. Budny, M. G. Bell, A. C. Janos et al., Nucl. Fusion 35, 1497 (1995)] and the numerical guiding center code ORBIT [R. B. White, Phys. Fluids B 2(4), 845 (1990)]. These modes are important for high energy particle distributions which have large average bounce angle, such as the nearly tangentially injected beam ions in NSTX and isotropic alpha particle distributions. They are particularly important in high q, low shear advanced plasma scenarios. Different ignited plasma scenarios are investigated with these modes in view.

  2. Improvements to the National Transport Code Collaboration Data Server

    NASA Astrophysics Data System (ADS)

    Alexander, David A.

    2001-10-01

    The data server of the National Transport Code Collaboration Project provides a universal network interface to interpolated or raw transport data accessible by a universal set of names. Data can be acquired from a local copy of the International Multi-Tokamak (ITER) profile database as well as from TRANSP trees of MDSplus data systems on the net. Data is provided to the user's network client via a CORBA interface, thus providing stateful data server instances, which have the advantage of remembering the desired interpolation, data set, etc. This paper will review the status and discuss the recent improvements made to the data server, such as the modularization of the data server and the addition of HDF5 and MDSplus data file writing capability.
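
The stateful, name-based access pattern described above might look like the following sketch, with an in-memory dictionary standing in for the MDSplus and ITER-profile-database back ends. The class and method names are hypothetical, not the actual CORBA interface:

```python
import bisect

class DataServer:
    """Toy stateful server instance: remembers its data set and
    interpolation choice between calls, as described in the abstract."""

    def __init__(self, store, interpolate=True):
        self.store = store              # {signal name: (times, values)}
        self.interpolate = interpolate  # remembered state

    def get(self, name, t):
        times, values = self.store[name]
        if not self.interpolate:
            # raw access: nearest-sample lookup
            i = min(range(len(times)), key=lambda j: abs(times[j] - t))
            return values[i]
        # interpolated access: linear between bracketing samples
        i = max(1, min(bisect.bisect_left(times, t), len(times) - 1))
        f = (t - times[i - 1]) / (times[i] - times[i - 1])
        return values[i - 1] + f * (values[i] - values[i - 1])

server = DataServer({"TE0": ([0.0, 1.0, 2.0], [1.0, 3.0, 2.0])})
```

The statefulness is the design point: a client asks once for "interpolated" mode and then issues simple `get(name, time)` calls for any signal in the universal name set.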

  3. Fast-ion transport in low density L-mode plasmas at TCV using FIDA spectroscopy and the TRANSP code

    NASA Astrophysics Data System (ADS)

    Geiger, B.; Karpushov, A. N.; Duval, B. P.; Marini, C.; Sauter, O.; Andrebe, Y.; Testa, D.; Marascheck, M.; Salewski, M.; Schneider, P. A.; the TCV Team; the EUROfusion MST1 Team

    2017-11-01

    Experiments with the new neutral beam injection source of TCV have been performed with high fast-ion fractions (>20%) that exhibit a clear reduction of the loop voltage and a clear increase of the plasma pressure in on- and off-axis heating configurations. However, good quantitative agreement between the experimental data and TRANSP predictions is only found when including strong additional fast-ion losses. These losses could in part be caused by turbulence or MHD activity as, e.g., high frequency modes near the frequency of toroidicity induced Alfvén eigenmodes are observed. In addition, a newly installed fast-ion D-alpha (FIDA) spectroscopy system measures strong passive radiation and, hence, indicates the presence of high background neutral densities such that charge-exchange losses are substantial. Also the active radiation measured with the FIDA diagnostic, as well as data from a neutral particle analyzer, suggest strong fast-ion losses and large neutral densities. The large neutral densities can be justified since high electron temperatures (3-4 keV), combined with low electron densities (about 2 × 10^19 m^-3), yield long mean free paths for the neutrals penetrating from the walls.

  4. Anomalous transport in the H-mode pedestal of Alcator C-Mod discharges

    NASA Astrophysics Data System (ADS)

    Pankin, A. Y.; Hughes, J. W.; Greenwald, M. J.; Kritz, A. H.; Rafiq, T.

    2017-02-01

    Anomalous transport in the H-mode pedestal region of five Alcator C-Mod discharges, representing a collisionality scan, is analyzed. The understanding of anomalous transport in the pedestal region is important for the development of a comprehensive model for the H-mode pedestal slope. In this research, a possible role of the drift resistive inertial ballooning modes (DRIBMs) (Rafiq et al 2010 Phys. Plasmas 17 082511) in the edge of Alcator C-Mod discharges is analyzed. The stability analysis, carried out using the TRANSP code, indicates that the DRIBMs are strongly unstable in Alcator C-Mod discharges with large electron collisionality. An improved interpretive analysis of H-mode pedestal experimental data is carried out utilizing the additive flux minimization technique (Pankin et al 2013 Phys. Plasmas 20 102501) together with the guiding-center neoclassical kinetic XGC0 code. The neoclassical and neutral physics are simulated in the XGC0 code and the anomalous fluxes are computed using the additive flux minimization technique. The anomalous fluxes are reconstructed and compared with each other for the collisionality-scan Alcator C-Mod discharges. It is found that the electron thermal anomalous diffusivities at the pedestal top increase with the electron collisionality. This dependence can also point to the drift resistive inertial ballooning modes as the modes that drive the anomalous transport in the plasma edge of highly collisional discharges.
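
The additive-flux idea reduces, schematically, to subtracting the kinetically simulated neoclassical flux from the total flux required to match the observed profile evolution; the remainder is attributed to anomalous transport. The pedestal profiles below are synthetic stand-ins, not C-Mod data or XGC0 output:

```python
import numpy as np

rho = np.linspace(0.85, 1.0, 50)             # normalized radius across the pedestal
q_total = 1.0 + 4.0 * (rho - 0.85)           # total flux implied by measured profiles (synthetic)
q_neo = 0.3 + 0.5 * (rho - 0.85)             # neoclassical flux (the kinetic code's role, synthetic)
q_anom = q_total - q_neo                     # additive anomalous remainder
```

Comparing `q_anom` profiles across a collisionality scan is then the step that exposes trends like the pedestal-top diffusivity increase reported above.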

  5. Fast ion beta limit measurements by collimated neutron detection in MST plasmas

    NASA Astrophysics Data System (ADS)

    Capecchi, William; Anderson, Jay; Bonofiglo, Phillip; Kim, Jungha; Sears, Stephanie

    2015-11-01

    Fast ion orbits in the reversed field pinch (RFP) are well ordered and classically confined despite magnetic field stochasticity generated by multiple tearing modes. Classical TRANSP modeling of a 1MW tangentially injected hydrogen neutral beam in MST deuterium plasmas predicts a core-localized fast ion density that can be up to 25% of the electron density and a fast ion beta of many times the local thermal beta. However, neutral particle analysis of an NBI-driven mode (presumably driven by a fast ion pressure gradient) shows mode-induced transport of core-localized fast ions and a saturated fast ion density. The TRANSP modeling is presumed valid until the onset of the beam-driven mode and gives an initial estimate of the volume-averaged fast ion beta of 1-2% (local core value up to 10%). A collimated neutron detector for fusion product profile measurements will be used to determine the spatial distribution of fast ions, allowing for a first measurement of the critical fast-ion pressure gradient required for mode destabilization. Testing/calibration data and initial fast-ion profiles will be presented. Characterization of both the local and global fast ion beta will be done for deuterium beam injection into deuterium plasmas for comparison to TRANSP predictions. Work supported by US DOE.
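
The quoted fast-ion beta can be checked with a back-of-envelope estimate, assuming an isotropic fast-ion population so that p_fast = (2/3) n_fast⟨E⟩. The density, mean energy, and field below are illustrative MST-like values, not measured data:

```python
from math import pi

MU0 = 4e-7 * pi  # vacuum permeability (T m / A)

def fast_ion_beta(n_fast, e_mean_kev, b_tesla):
    """beta_fast = 2 * mu0 * p_fast / B^2, isotropic pressure assumed."""
    p_fast = (2.0 / 3.0) * n_fast * e_mean_kev * 1e3 * 1.602e-19  # pressure in Pa
    return 2.0 * MU0 * p_fast / b_tesla**2

# e.g. n_fast = 2.5e18 m^-3 (25% of n_e = 1e19 m^-3), <E> ~ 20 keV, B ~ 0.3 T
beta_f = fast_ion_beta(2.5e18, 20.0, 0.3)
```

These invented numbers land at roughly beta_f ≈ 0.15, i.e. the same order as the local core values quoted above and far above the thermal beta, which is why mode destabilization by the fast-ion pressure gradient is the concern.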

  6. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  7. An Efficient Method for Verifying Gyrokinetic Microstability Codes

    NASA Astrophysics Data System (ADS)

    Bravenec, R.; Candy, J.; Dorland, W.; Holland, C.

    2009-11-01

    Benchmarks for gyrokinetic microstability codes can be developed through successful "apples-to-apples" comparisons among them. Unlike previous efforts, we perform the comparisons for actual discharges, rendering the verification efforts relevant to existing experiments and future devices (ITER). The process requires i) assembling the experimental analyses at multiple times, radii, discharges, and devices, ii) creating the input files, ensuring that the input parameters are faithfully translated code-to-code, iii) running the codes, and iv) comparing the results, all in an organized fashion. The purpose of this work is to automate this process as much as possible: At present, a python routine is used to generate and organize GYRO input files from TRANSP or ONETWO analyses. Another routine translates the GYRO input files into GS2 input files. (Translation software for other codes has not yet been written.) Other python codes submit the multiple GYRO and GS2 jobs, organize the results, and collect them into a table suitable for plotting. (These separate python routines could easily be consolidated.) An example of the process -- a linear comparison between GYRO and GS2 for a DIII-D discharge at multiple radii -- will be presented.
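
The code-to-code translation step amounts to a parameter-name mapping with convention conversions applied where the codes differ. The dictionary below is a hypothetical placeholder, not the real GYRO/GS2 parameter correspondence; the sign flip is an invented example of a convention difference:

```python
# Hypothetical GYRO -> GS2 parameter map: (target name, conversion factor).
GYRO_TO_GS2 = {
    "RADIUS":        ("rhoc", 1.0),
    "SAFETY_FACTOR": ("qinp", 1.0),
    "SHEAR":         ("shat", 1.0),
    "RATE_GAMMA_E":  ("g_exb", -1.0),   # invented sign-convention flip
}

def translate(gyro_params):
    """Translate a GYRO-style parameter dict into a GS2-style one."""
    gs2 = {}
    for key, value in gyro_params.items():
        name, factor = GYRO_TO_GS2[key]
        gs2[name] = factor * value
    return gs2

gs2_inputs = translate({"RADIUS": 0.6, "SAFETY_FACTOR": 1.4,
                        "SHEAR": 0.8, "RATE_GAMMA_E": 0.05})
```

Keeping the map in one table is what makes "faithfully translated code-to-code" auditable: every convention difference is recorded explicitly rather than buried in ad hoc edits to input files.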

  8. Database-driven web interface automating gyrokinetic simulations for validation

    NASA Astrophysics Data System (ADS)

    Ernst, D. R.

    2010-11-01

    We are developing a web interface to connect plasma microturbulence simulation codes with experimental data. The website automates the preparation of gyrokinetic simulations utilizing plasma profile and magnetic equilibrium data from TRANSP analysis of experiments, read from MDSPLUS over the internet. This database-driven tool saves user sessions, allowing searches of previous simulations, which can be restored to repeat the same analysis for a new discharge. The website includes a multi-tab, multi-frame, publication-quality Java plotter, Webgraph, developed as part of this project. Input files can be uploaded as templates and edited with context-sensitive help. The website creates inputs for GS2 and GYRO using a well-tested and verified back-end, in use for several years for the GS2 code [D. R. Ernst et al., Phys. Plasmas 11(5) 2637 (2004)]. A centralized web site has the advantage that users receive bug fixes instantaneously, while avoiding the duplicated effort of local compilations. Possible extensions to the database to manage run outputs, toward prototyping for the Fusion Simulation Project, are envisioned. Much of the web development utilized support from the DoE National Undergraduate Fellowship program [e.g., A. Suarez and D. R. Ernst, http://meetings.aps.org/link/BAPS.2005.DPP.GP1.57].

  9. DIGITAL IMAGE ANALYSIS REPORTS: THE CONVERSION OF EPIC'S TRADITIONAL SITE CHARACTERIZATION PRODUCT

    EPA Science Inventory

    Over the past several years EPIC has been exploring the practicality and cost-effectiveness of providing its traditional hard-copy report product in digital form. This conversion has a number of practical uses including- 1) compatibility for use as data layers in a GIS; 2) transp...

  10. Integrated Modeling of Time Evolving 3D Kinetic MHD Equilibria and NTV Torque

    NASA Astrophysics Data System (ADS)

    Logan, N. C.; Park, J.-K.; Grierson, B. A.; Haskey, S. R.; Nazikian, R.; Cui, L.; Smith, S. P.; Meneghini, O.

    2016-10-01

    New analysis tools and integrated modeling of plasma dynamics developed in the OMFIT framework are used to study the evolution of kinetic MHD equilibria on the transport time scale. The experimentally observed profile dynamics following the application of 3D error fields are described using a new OMFITprofiles workflow that directly addresses the need for rapid and comprehensive analysis of dynamic equilibria for next-step theory validation. The workflow treats all diagnostic data as fundamentally time dependent, provides physics-based manipulations such as ELM-phase data selection, and is consistent across multiple machines, including DIII-D and NSTX-U. The seamless integration of tokamak data and simulation is demonstrated by using the self-consistent kinetic EFIT equilibria and profiles as input into 2D particle, momentum and energy transport calculations using TRANSP, as well as 3D kinetic MHD equilibrium stability and neoclassical transport modeling using the General Perturbed Equilibrium Code (GPEC). The result is a smooth evolution of kinetic stability and NTV torque over transport time scales. Work supported by DE-AC02-09CH11466.
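    The ELM-phase data selection mentioned above can be illustrated with a minimal sketch: keep only measurements that fall in a chosen fraction of the inter-ELM cycle. The ELM times, the 60-80% phase window, and the function names are illustrative assumptions, not the OMFITprofiles API:

```python
import bisect

def elm_phase(t, elm_times):
    """Fractional position of time t within its inter-ELM interval (0 to 1),
    or None if t lies before the first or after the last recorded ELM."""
    i = bisect.bisect_right(elm_times, t)
    if i == 0 or i == len(elm_times):
        return None
    t0, t1 = elm_times[i - 1], elm_times[i]
    return (t - t0) / (t1 - t0)

def select(times, elm_times, lo=0.6, hi=0.8):
    """Indices of samples whose ELM phase falls in the [lo, hi) window."""
    keep = []
    for j, t in enumerate(times):
        p = elm_phase(t, elm_times)
        if p is not None and lo <= p < hi:
            keep.append(j)
    return keep

elms = [1.00, 1.02, 1.04, 1.06]           # hypothetical ELM onset times [s]
samples = [1.001, 1.013, 1.035, 1.055]    # hypothetical measurement times [s]
idx = select(samples, elms)               # samples late in the inter-ELM cycle
```

    Conditioning profile fits on ELM phase in this way avoids averaging together the pedestal just after an ELM crash with the recovered pedestal just before the next one.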

  11. Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Schissel, D. P.

    2004-11-01

    The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy-to-use, network-available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure so that stakeholders can control their own resources and fair use of resources can be ensured. The collaborative control room is being developed using the open-source Access Grid software, which enables secure group-to-group collaboration with capabilities beyond teleconferencing, including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP, and it included tools for run preparation, submission, monitoring and management. This approach saves user sites the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding the support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.

  12. Fishbone Mode Excited by Deeply Trapped Energetic Beam Ions in EAST

    NASA Astrophysics Data System (ADS)

    Zheng, Ting; Wu, Bin; Xu, Liqing; Hu, Chundong; Zang, Qing; Ding, Siye; Li, Yingying; Wu, Xingquan; Wang, Jinfang; Shen, Biao; Zhong, Guoqiang; Li, Hao; Shi, Tonghui; EAST Team

    2016-06-01

    This paper describes the fishbone mode phenomena during the injection of high-power neutral beams in EAST (Experimental Advanced Superconducting Tokamak). The features of the fishbone mode are presented. During a fishbone burst, the mode frequency varies between 1 kHz and 6 kHz. The nonlinear behavior of the fishbone mode is analyzed using a prey-predator model, which is consistent with the experimental results. This model indicates that the periodic oscillations of the fishbone mode always occur near the critical value of the fast-ion beta. Furthermore, the neutral beam analysis for the discharge is done using the NUBEAM module of the TRANSP code. According to the numerical simulation results and theoretical calculations, it can be concluded that the fishbone mode is driven by the deeply trapped energetic beam ions in EAST. This work was supported by the National Magnetic Confinement Fusion Science Program of China (Nos. 2013GB101001, 2014DFG61950 and 2013GB112003) and the National Natural Science Foundation of China (Nos. 11175211 and 11275233).

  13. Non-inductive current drive and transport in high βN plasmas in JET

    NASA Astrophysics Data System (ADS)

    Voitsekhovitch, I.; Alper, B.; Brix, M.; Budny, R. V.; Buratti, P.; Challis, C. D.; Ferron, J.; Giroud, C.; Joffrin, E.; Laborde, L.; Luce, T. C.; McCune, D.; Menard, J.; Murakami, M.; Park, J. M.; JET-EFDA contributors

    2009-05-01

    A route to stationary MHD stable operation at high βN has been explored at the Joint European Torus (JET) by optimizing the current ramp-up, heating start time and the waveform of neutral beam injection (NBI) power. In these scenarios the current ramp-up has been accompanied by plasma pre-heat (or the NBI has been started before the current flat-top) and NBI power up to 22 MW has been applied during the current flat-top. In the discharges considered, transient total βN ≈ 3.3 and stationary (during the high power phase) βN ≈ 3 have been achieved by applying feedback control of βN with the NBI power, in configurations with a monotonic or flat core safety factor profile and without an internal transport barrier (ITB). The transport and current drive in this scenario are analysed here using the TRANSP and ASTRA codes. The interpretative analysis performed with TRANSP shows that 50-70% of the current is driven non-inductively; half of this current is due to the bootstrap current, which has a broad profile since an ITB was deliberately avoided. The GLF23 transport model predicts the temperature profiles to within ±22% of the measurements over the explored parameter space. Predictive simulations with this model show that the E × B rotational shear plays an important role for thermal ion transport in this scenario, producing up to a 40% increase in the ion temperature. By applying transport and current drive models validated in self-consistent simulations of the reference scenarios to a wider parameter space, the requirements for fully non-inductive stationary operation at JET are estimated. It is shown that the strong stiffness of the temperature profiles predicted by the GLF23 model restricts the bootstrap current at larger heating power. In this situation, full non-inductive operation without an ITB can be rather expensive, relying strongly on external non-inductive current drive sources.

  14. A new numerical benchmark for variably saturated variable-density flow and transport in porous media

    NASA Astrophysics Data System (ADS)

    Guevara, Carlos; Graf, Thomas

    2016-04-01

    In subsurface hydrological systems, spatial and temporal variations in solute concentration and/or temperature may affect fluid density and viscosity. These variations can lead to potentially unstable situations, in which a dense fluid overlies a less dense fluid. Such situations can produce instabilities that appear as dense plume fingers migrating downwards, counteracted by vertical upwards flow of freshwater (Simmons et al., Transp. Porous Media, 2002). As a result of unstable variable-density flow, solute transport rates are increased over large distances and times as compared to constant-density flow. The numerical simulation of variable-density flow in saturated and unsaturated media requires corresponding benchmark problems against which a computer model is validated (Diersch and Kolditz, Adv. Water Resour., 2002). Recorded data from a laboratory-scale experiment of variable-density flow and solute transport in saturated and unsaturated porous media (Simmons et al., Transp. Porous Media, 2002) is used to define a new numerical benchmark. The HydroGeoSphere code (Therrien et al., 2004) coupled with PEST (www.pesthomepage.org) is used to obtain an optimized parameter set capable of adequately representing the data set by Simmons et al. (2002). Fingering in the numerical model is triggered using random hydraulic conductivity fields. Due to the inherent randomness, a large number of simulations were conducted in this study. The optimized benchmark model adequately predicts the plume behavior and the fate of solutes. This benchmark is useful for model verification of variable-density flow problems in saturated and/or unsaturated media.
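    The random-conductivity trigger for fingering can be sketched as follows; the log-normal statistics, grid size, and lack of spatial correlation are simplifying assumptions for illustration, not the calibrated benchmark parameters:

```python
import random

def random_log_conductivity_field(nx, ny, mean_log10_k=-4.0, sigma=0.5, seed=0):
    """Uncorrelated log-normally distributed hydraulic conductivity K [m/s].
    Real benchmark fields usually carry spatial correlation; omitted here.
    A fixed seed makes each realization reproducible."""
    rng = random.Random(seed)
    return [[10.0 ** rng.gauss(mean_log10_k, sigma) for _ in range(nx)]
            for _ in range(ny)]

# One realization on a 10 x 10 grid; varying the seed yields the ensemble
# of realizations needed to average over the inherent randomness.
field = random_log_conductivity_field(10, 10)
```

    Small-scale heterogeneity of this kind perturbs the initially flat dense-over-light interface, which is what allows the finger instability to develop in the simulation rather than remaining in an unstable equilibrium.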

  15. Modeling and control of plasma rotation for NSTX using neoclassical toroidal viscosity and neutral beam injection

    NASA Astrophysics Data System (ADS)

    Goumiri, I. R.; Rowley, C. W.; Sabbagh, S. A.; Gates, D. A.; Gerhardt, S. P.; Boyer, M. D.; Andre, R.; Kolemen, E.; Taira, K.

    2016-03-01

    A model-based feedback system is presented to control plasma rotation in a magnetically confined toroidal fusion device, to maintain plasma stability for long-pulse operation. This research uses experimental measurements from the National Spherical Torus Experiment (NSTX) and is aimed at controlling plasma rotation using two different types of actuation: momentum from injected neutral beams and neoclassical toroidal viscosity generated by three-dimensional applied magnetic fields. Based on the data-driven model obtained, a feedback controller is designed, and predictive simulations using the TRANSP plasma transport code show that the controller is able to attain desired plasma rotation profiles given practical constraints on the actuators and the available measurements of rotation.
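    The model-based feedback idea, inverting a reduced linear model of the rotation response to compute actuator requests, can be sketched with a toy two-state model. The matrices and the one-step (deadbeat) design below are illustrative assumptions only, not the NSTX model or controller, which must also honor actuator constraints:

```python
# Toy reduced-order rotation model x[k+1] = A x[k] + B u[k]: x holds rotation
# at two radial points, u holds the two actuator requests (beam torque and
# 3D-field NTV torque). All numbers are illustrative, not fitted NSTX data.
A = [[0.90, 0.05],
     [0.05, 0.90]]
B = [[0.8, -0.1],
     [0.2, -0.6]]

def matvec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def solve2(M, b):
    """Solve the 2x2 linear system M u = b by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(b[0] * M[1][1] - M[0][1] * b[1]) / det,
            (M[0][0] * b[1] - b[0] * M[1][0]) / det]

def control(x, x_target):
    """One-step (deadbeat) feedback: choose u so that A x + B u = x_target.
    A real controller must also respect actuator limits and noise."""
    ax = matvec(A, x)
    return solve2(B, [x_target[0] - ax[0], x_target[1] - ax[1]])

x = [0.0, 0.0]                 # current rotation profile (arb. units)
x_target = [1.0, 0.6]          # desired rotation profile
u = control(x, x_target)
x_next = [a + b for a, b in zip(matvec(A, x), matvec(B, u))]
```

    With unconstrained actuators the model is driven exactly to the target in one step; bounding u and adding measurement noise is what turns this toy inversion into the constrained profile-control problem the abstract describes.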

  16. Modeling and control of plasma rotation for NSTX using neoclassical toroidal viscosity and neutral beam injection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goumiri, I. R.; Rowley, C. W.; Sabbagh, S. A.

    2016-02-19

    A model-based feedback system is presented to control plasma rotation in a magnetically confined toroidal fusion device, to maintain plasma stability for long-pulse operation. This research uses experimental measurements from the National Spherical Torus Experiment (NSTX) and is aimed at controlling plasma rotation using two different types of actuation: momentum from injected neutral beams and neoclassical toroidal viscosity generated by three-dimensional applied magnetic fields. Based on the data-driven model obtained, a feedback controller is designed, and predictive simulations using the TRANSP plasma transport code show that the controller is able to attain desired plasma rotation profiles given practical constraints on the actuators and the available measurements of rotation.

  17. Measurements and modelling of fast-ion redistribution due to resonant MHD instabilities in MAST

    NASA Astrophysics Data System (ADS)

    Jones, O. M.; Cecconello, M.; McClements, K. G.; Klimek, I.; Akers, R. J.; Boeglin, W. U.; Keeling, D. L.; Meakins, A. J.; Perez, R. V.; Sharapov, S. E.; Turnyanskiy, M.; the MAST Team

    2015-12-01

    The results of a comprehensive investigation into the effects of toroidicity-induced Alfvén eigenmodes (TAE) and energetic particle modes on the NBI-generated fast-ion population in MAST plasmas are reported. Fast-ion redistribution due to frequency-chirping TAE in the range 50-100 kHz and frequency-chirping energetic particle modes known as fishbones in the range 20-50 kHz is observed. TAE and fishbones are also observed to cause losses of fast ions from the plasma. The spatial and temporal evolution of the fast-ion distribution is determined using a fission chamber, a radially-scanning collimated neutron flux monitor, a fast-ion deuterium alpha spectrometer and a charged fusion product detector. Modelling using the global transport analysis code Transp, with ad hoc anomalous diffusion and fishbone loss models introduced, reproduces the coarsest features of the affected fast-ion distribution in the presence of energetic particle-driven modes. The spectrally and spatially resolved measurements show, however, that these models do not fully capture the effects of chirping modes on the fast-ion distribution.

  18. Critical Gradient Behavior of Alfvén Eigenmode Induced Fast-Ion Transport in Phase Space

    NASA Astrophysics Data System (ADS)

    Collins, C. S.; Pace, D. C.; van Zeeland, M. A.; Heidbrink, W. W.; Stagner, L.; Zhu, Y. B.; Kramer, G. J.; Podesta, M.; White, R. B.

    2016-10-01

    Experiments on DIII-D have shown that energetic particle (EP) transport suddenly increases when multiple Alfvén eigenmodes (AEs) cause particle orbits to become stochastic. Several key features have been observed: (1) the transport threshold is phase-space dependent and occurs above the AE linear stability threshold, (2) EP losses become intermittent above threshold and appear to depend on the types of AEs present, and (3) stiff transport causes the EP density profile to remain unchanged even if the source increases. Theoretical analysis using the NOVA and ORBIT codes shows that the threshold corresponds to the point at which particle orbits become stochastic due to wave-particle resonances with AEs in the region of phase space measured by the diagnostics. The kick model in NUBEAM (TRANSP) is used to evolve the EP distribution function to study which modes cause the most transport and to further characterize intermittent bursts of EP losses, which are associated with large-scale redistribution through the domino effect. Work supported by the US DOE under DE-FC02-04ER54698.

  19. Optimization and Control of Burning Plasmas Through High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pankin, Alexei

    This project has revived the FACETS code, which was developed under SciDAC funding in 2008-2012. The code had been dormant for a number of years after the SciDAC funding stopped. FACETS depends on external packages. The external packages and libraries such as PETSc, FFTW, HDF5 and NETCDF that are included in FACETS have evolved during these years. Some packages in FACETS are also parts of other codes such as PlasmaState, NUBEAM, GACODES, and UEDGE. These packages have also evolved together with their host codes, which include TRANSP, TGYRO and XPTOR. Finally, there is also a set of packages in FACETS that are being developed and maintained by Tech-X. These packages include BILDER, SciMake, and FcioWrappers. Many of these packages evolved significantly during the last several years, and FACETS had to be updated to synchronize with the recent progress in the external packages. The PI has introduced new changes to the BILDER package to support the updated interfaces to the external modules. During the last year of the project, the FACETS version of the UEDGE code was extracted from FACETS as a standalone package. The PI collaborates with scientists from LLNL on the updated UEDGE model in FACETS. Drs. T. Rognlien, M. Umansky and A. Dimits from LLNL are contributing to this task.

  20. Fast-ion D(alpha) measurements and simulations in DIII-D

    NASA Astrophysics Data System (ADS)

    Luo, Yadong

    The fast-ion Dα diagnostic measures the Doppler-shifted Dα light emitted by neutralized fast ions. For a favorable viewing geometry, the bright interferences from beam neutrals, halo neutrals, and edge neutrals span a small wavelength range around the Dα rest wavelength and are blocked by a vertical bar at the exit focal plane of the spectrometer. Background subtraction and fitting techniques eliminate various contaminants in the spectrum. Fast-ion data are acquired with a time resolution of ~1 ms, spatial resolution of ~5 cm, and energy resolution of ~10 keV. A weighted Monte Carlo simulation code models the fast-ion Dα spectra based on the fast-ion distribution function from other sources. In quiet plasmas, the measured spectral shape is in excellent agreement with the simulation, and the absolute magnitude is in reasonable agreement. The fast-ion Dα signal has the expected dependencies on plasma and neutral beam parameters. The neutral particle diagnostic and neutron diagnostic corroborate the fast-ion Dα measurements. The relative spatial profile is in agreement with the simulated profile based on the fast-ion distribution function from the TRANSP analysis code. During ion cyclotron heating, fast ions with high perpendicular energy are accelerated, while those with low perpendicular energy are barely affected. The spatial profile is compared with the simulated profiles based on the fast-ion distribution functions from the CQL Fokker-Planck code. In discharges with Alfvén instabilities, both the spatial profile and the spectral shape suggest that fast ions are redistributed. The flattened fast-ion Dα profile is in agreement with the fast-ion pressure profile.
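    As a rough check on the physics, the maximum Doppler shift of Dα light from a full-energy fast ion can be estimated from first principles; the 80 keV beam energy and a sightline parallel to the ion velocity are illustrative assumptions, not parameters quoted in the abstract:

```python
import math

# Illustrative estimate of the largest Doppler shift of D-alpha light emitted
# by a full-energy beam ion, assuming the sightline is along the ion velocity.
E_J = 80e3 * 1.602e-19        # assumed 80 keV beam energy, in joules
m_D = 3.344e-27               # deuteron mass [kg]
c = 2.998e8                   # speed of light [m/s]
lambda0 = 656.1e-9            # D-alpha rest wavelength [m]

v = math.sqrt(2 * E_J / m_D)  # non-relativistic ion speed
dlambda = lambda0 * v / c     # first-order Doppler shift
```

    The resulting shift is of order several nanometres, far outside the narrow region around the rest wavelength that the vertical bar blocks, which is why the fast-ion wings of the spectrum survive while the bright cold-neutral lines are removed.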

  1. A Requirements Analysis Model for Selection of Personal Computer (PC) software in Air Force Organizations

    DTIC Science & Technology

    1988-09-01

    Institute of Technology Air University In Partial Fulfillment of the Requirements for the Degree of Master of Science in Systems Management Dexter R... management system software Diag/Prob Diagnosis and problem solving or problem finding GR Graphics software Int/Transp Interoperability and...language software Plan/D.S. Planning and decision support or decision making PM Program management software SC Systems for Command, Control, Communications

  2. Incorporating Value Systems in Strategic Force Analysis

    DTIC Science & Technology

    1991-03-01

    produce Typhoon and Delta IV class submarines. Other indications of the questionable intentions of the Soviet military machine are in allegations...Communications, their energy systems, transportation, defense, oil, chemical, electronic, machine building and instruments industries, their reserves and...Transp...

  3. JPRS Report Science & Technology Europe Twenty-Fourth Isata International Symposium on Automotive Technology and Automation.

    DTIC Science & Technology

    1991-09-05

    "Learning from Learning: Principles for Supporting Drivers" J A Groeger, MRC Applied Psychology Unit, UK "Argos: A Driver Behaviour Analysis System...Technology (CEST), UK MISCELLANEOUS "Modular Sensor System for Guiding Handling Machines" J Geit and J Heinrich, TZN Forschungs, FRG "Flexible...PUBLIC TRANSP. RESEARCH..."Implementation Strategies" Systems engineering PART III Validation through Pilot

  4. A Systems Analysis View of the Vietnam War: 1965-1972 Volume 7. Republic of Vietnam Armed Forces (RVNAF)

    DTIC Science & Technology

    1975-02-18

    received frequent rebuttals or comments on our analyses which sharpened our studies and stimulated better analysis by other agencies. Second, it was...CONFIDENTIAL average rates in Table 2 show that 5 of the 15 units studied will have over half of their personnel desert in 1969. Only one unit, the...directed RVNAF units to assist servicemen going on leave with transportation. In two special "test cases" US units are assisting South Vietnamese

  5. A plasma rotation control scheme for NSTX and NSTX-U

    NASA Astrophysics Data System (ADS)

    Goumiri, Imene

    2016-10-01

    Plasma rotation has been proven to play a key role in stabilizing large-scale instabilities and improving plasma confinement by suppressing micro-turbulence. A model-based feedback system which controls the plasma rotation profile on the National Spherical Torus Experiment (NSTX) and its upgrade (NSTX-U) is presented. The first part of this work uses experimental measurements from NSTX as a starting point and models the control of plasma rotation using two different types of actuation: momentum from injected neutral beams and neoclassical toroidal viscosity generated by three-dimensional applied magnetic fields. Whether based on the data-driven model for NSTX or on purely predictive modeling for NSTX-U, a feedback controller based on a reduced-order model was designed. Predictive simulations using the TRANSP plasma transport code with the actuator input determined by the controller (controller-in-the-loop) show that the controller drives the plasma rotation to the desired profiles in less than 100 ms given practical constraints on the actuators and the available real-time rotation measurements. This is the first time that TRANSP has been used as a plasma simulator in a closed-loop feedback test. Another approach, to simultaneously control the toroidal rotation profile and βN, is then shown for NSTX-U. For this case, the neutral beams (actuators) have been augmented in the modeling to match the upgrade, which spread the injection throughout the edge of the plasma. Control robustness in stability and performance has then been tested and used to predict the limits of the resulting controllers when the energy confinement time (τE) and the momentum diffusivity coefficient (χϕ) vary.

  6. Interaction between high harmonic fast waves and fast ions in NSTX/NSTX-U plasmas

    NASA Astrophysics Data System (ADS)

    Bertelli, N.; Valeo, E. J.; Gorelenkova, M.; Green, D. L.; RF SciDAC Team

    2016-10-01

    Fast wave (FW) heating in the ion cyclotron range of frequencies (ICRF) has been successfully used to sustain and control fusion plasma performance, and it will likely play an important role in the ITER experiment. As demonstrated in the NSTX and DIII-D experiments, the interactions between fast waves and fast ions can be so strong as to significantly modify the fast-ion population from neutral beam injection. In fact, it has recently been found in NSTX that FWs can modify and, under certain conditions, even suppress energetic-particle-driven instabilities such as toroidal Alfvén eigenmodes, global Alfvén eigenmodes and fishbones. This paper examines such interactions in NSTX/NSTX-U plasmas by using the recent extension of the RF full-wave code TORIC to include non-Maxwellian ion distribution functions. Particular attention is given to the evolution of the fast-ion distribution function with and without RF. Tests of the RF kick operator implemented in the Monte Carlo particle code NUBEAM are also discussed, as a step towards a self-consistent evaluation of the RF wave field and the ion distribution functions in the TRANSP code. Work supported by US DOE Contract DE-AC02-09CH11466.

  7. Electron Profile Stiffness and Critical Gradient Length Studies in the Alcator C-Mod Tokamak

    NASA Astrophysics Data System (ADS)

    Houshmandyar, Saeid; Hatch, David R.; Liao, Kenneth T.; Zhao, Bingzhe; Phillips, Perry E.; Rowan, William L.; Cao, Norman; Ernst, Darin R.; Rice, John E.

    2017-10-01

    Electron temperature profile stiffness was investigated in Alcator C-Mod L-mode discharges. Electrons were heated by waves in the ion cyclotron range of frequencies (ICRF) through minority heating. The heating was used to vary the heat flux while simultaneously and gradually changing the local gradient. The electron temperature gradient scale length (LTe^-1 = |∇Te|/Te) was accurately measured through a novel technique using the high-resolution ECE radiometer diagnostic. The TRANSP power balance analysis (Q/QGB) combined with the measured scale length (a/LTe) yields critical scale length measurements at all major-radius locations. These measurements suggest that the profiles are already at the critical values. Furthermore, the dependence of the stiffness on plasma rotation and magnetic shear will be discussed. In order to understand the underlying turbulence mechanism for these discharges, simulations using the gyrokinetic code GENE were carried out. For linear runs at electron scales, it was found that the largest growth rates are very sensitive to variation of a/LTe, which suggests the presence of ETG modes, while sensitivity studies at ion scales indicate ITG/TEM modes. Supported by USDoE awards DE-FG03-96ER54373 and DE-FC02-99ER54512.
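    The inverse gradient scale length defined above, |∇Te|/Te, can be evaluated numerically from a measured profile; the exponential test profile below is an illustrative choice because its scale length is known analytically:

```python
import math

def inverse_scale_length(r, Te):
    """Central-difference estimate of |dTe/dr| / Te at interior grid points."""
    out = []
    for i in range(1, len(r) - 1):
        grad = (Te[i + 1] - Te[i - 1]) / (r[i + 1] - r[i - 1])
        out.append(abs(grad) / Te[i])
    return out

# Exponential profile Te(r) = T0 exp(-r/L) has inverse scale length 1/L exactly,
# so the numerical estimate can be checked against the analytic value.
L = 0.1                                   # assumed scale length [m]
r = [0.01 * i for i in range(30)]         # radial grid [m]
Te = [2.0 * math.exp(-ri / L) for ri in r]
invL = inverse_scale_length(r, Te)        # should be close to 1/L = 10 m^-1
```

    In practice the radiometer channels provide Te on an uneven radial grid with noise, so a fit (rather than raw differences) is usually behind an "accurately measured" scale length; the sketch only illustrates the definition.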

  8. Kinetic equilibrium reconstruction for the NBI- and ICRH-heated H-mode plasma on EAST tokamak

    NASA Astrophysics Data System (ADS)

    Zhen, ZHENG; Nong, XIANG; Jiale, CHEN; Siye, DING; Hongfei, DU; Guoqiang, LI; Yifeng, WANG; Haiqing, LIU; Yingying, LI; Bo, LYU; Qing, ZANG

    2018-04-01

    Equilibrium reconstruction is important for studying tokamak plasma physical processes. To analyze the contribution of fast ions to the equilibrium, kinetic equilibria at two time slices in a typical H-mode discharge with different auxiliary heating are reconstructed using magnetic diagnostics, kinetic diagnostics and the TRANSP code. It is found that the fast-ion pressure might be up to one-third of the plasma pressure, and the contribution lies mainly in the core plasma because the neutral beam injection power is deposited primarily in the core region. The fast-ion current also contributes mainly in the core region, while contributing little to the pedestal current. A steep pressure gradient in the pedestal is observed, which gives rise to a strong edge current. This shows that fast-ion effects cannot be ignored and should be considered in future studies of EAST.

  9. Destabilization of counter-propagating TAEs by off-axis, co-current Neutral Beam Injection

    NASA Astrophysics Data System (ADS)

    Podestà, M.; Fredrickson, E.; Gorelenkova, M.

    2017-10-01

    Neutral beam injection (NBI) is a common tool to heat the plasma and drive current non-inductively in fusion devices. Energetic particles (EP) resulting from NBI can drive instabilities that are detrimental to the performance and the predictability of plasma discharges. A broad NBI deposition profile, e.g. by off-axis injection aimed near the plasma mid-radius, is often assumed to limit those undesired effects by reducing the radial gradient of the EP density, thus reducing the "universal" drive for instabilities. However, this work presents new evidence that off-axis NBI can also lead to undesired effects such as the destabilization of Alfvénic instabilities, as observed in NSTX-U plasmas. Experimental observations indicate that counter-propagating toroidal AEs are destabilized as the radial EP density profile becomes hollow as a result of off-axis NBI. Time-dependent analysis with the TRANSP code, augmented by a reduced fast-ion transport model (known as the kick model), indicates that the instabilities are driven by a combination of radial and energy gradients in the EP distribution. Understanding the mechanisms of wave-particle interaction, revealed by the phase-space-resolved analysis, is the basis for identifying strategies to mitigate or suppress the observed instabilities. Work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences under Contract Number DE-AC02-09CH11466.

  10. The effects of resonant magnetic perturbations on fast ion confinement in the Mega Amp Spherical Tokamak

    NASA Astrophysics Data System (ADS)

    McClements, K. G.; Akers, R. J.; Boeglin, W. U.; Cecconello, M.; Keeling, D.; Jones, O. M.; Kirk, A.; Klimek, I.; Perez, R. V.; Shinohara, K.; Tani, K.

    2015-07-01

    The effects of resonant magnetic perturbations (RMPs) on the confinement of energetic (neutral beam) ions in the Mega Amp Spherical Tokamak (MAST) are assessed experimentally using measurements of neutrons, fusion protons and fast ion Dα (FIDA) light emission. In single null-diverted (SND) MAST pulses with relatively low plasma current (400 kA), the total neutron emission dropped by approximately a factor of two when RMPs with toroidal mode number n = 3 were applied. The measured neutron rate during RMPs was much lower than that calculated using the TRANSP plasma simulation code, even when non-classical (but axisymmetric) ad hoc fast ion transport was taken into account in the latter. Sharp drops in spatially-resolved neutron rates, fusion proton rates and FIDA emission were also observed. First principles-based simulations of RMP-induced fast ion transport in MAST, using the F3D-OFMC code, show similar losses for two alternative representations of the MAST first wall, with and without full orbit effects taken into account; for n = 6 RMPs in a 600 kA plasma, the additional loss of beam power due to the RMPs was found in the simulations to be approximately 11%.

  11. The Need of an Open Data Quality Policy: The Case of the "Transparency - Health" Database in the Prevention of Conflict of Interest.

    PubMed

    Jantzen, Rodolphe; Rance, Bastien; Katsahian, Sandrine; Burgun, Anita; Looten, Vincent

    2018-01-01

    Open data, available broadly and with minimal constraints to the general public and journalists, are needed to help rebuild trust between citizens and the health system. By opening data, we can expect to increase democratic accountability and the self-empowerment of citizens. This article aims at assessing the quality and reusability of the Transparency - Health database with regard to the FAIR principles. More specifically, we examine the quality of the identity data of French medical doctors in the Transp-db. This study shows that the quality of the data in the Transp-db does not allow the identity of those who benefit from an advantage or remuneration to be confirmed with certainty, noticeably reducing the impact of the open data effort.

  12. MDOT research receives national recognition : research update.

    DOT National Transportation Integrated Search

    2016-10-01

    To maintain a high-quality transportation system for Michigan's traveling public, MDOT makes a sustained commitment to excellence in transportation research. That commitment includes both developing solutions to meet Michigan's transp...

  13. Measurements of impurity concentrations and transport in the Lithium Tokamak Experiment

    NASA Astrophysics Data System (ADS)

    Boyle, D. P.; Bell, R. E.; Kaita, R.; Lucia, M.; Schmitt, J. C.; Scotti, F.; Kubota, S.; Hansen, C.; Biewer, T. M.; Gray, T. K.

    2016-10-01

    The Lithium Tokamak Experiment (LTX) is a modest-sized spherical tokamak with all-metal plasma facing components (PFCs), uniquely capable of operating with large-area solid and/or liquid lithium coatings essentially surrounding the entire plasma. This work presents measurements of core plasma impurity concentrations and transport in LTX. In discharges with solid Li coatings, volume-averaged impurity concentrations were low but non-negligible, with 2-4% Li, 0.6-2% C, 0.4-0.7% O, and Zeff < 1.2. Transport was assessed using the TRANSP, NCLASS, and MIST codes. Collisions with the main H ions dominated the neoclassical impurity transport, and neoclassical transport coefficients calculated with NCLASS were similar across all impurity species, differing by no more than a factor of two. However, time-independent simulations with MIST indicated that neoclassical theory did not fully capture the impurity transport and that anomalous transport likely played a significant role in determining impurity profiles. Progress on additional analysis, including time-dependent impurity transport simulations and impurity measurements with liquid lithium coatings, and plans for diagnostic upgrades and future experiments in LTX-β will also be presented. This work was supported by US DOE contracts DE-AC02-09CH11466 and DE-AC05-00OR22725.
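    The quoted Zeff follows from the standard effective-charge bookkeeping over the impurity fractions. The sketch below illustrates that formula with hypothetical concentration values (not the LTX measurements), assuming fully stripped impurities and quasi-neutrality:

    ```python
    # Effective charge from impurity fractions f_s = n_s / n_e (hypothetical values),
    # assuming fully stripped impurities and a hydrogen main-ion species.
    Z = {"Li": 3, "C": 6, "O": 8}
    f = {"Li": 0.02, "C": 0.006, "O": 0.004}  # illustrative only, not LTX data

    # Quasi-neutrality fixes the hydrogen fraction: n_H/n_e = 1 - sum(f_s * Z_s).
    f_H = 1.0 - sum(f[s] * Z[s] for s in f)

    # Zeff = sum_i (n_i Z_i^2) / n_e over all ion species (hydrogen has Z = 1).
    zeff = f_H * 1**2 + sum(f[s] * Z[s] ** 2 for s in f)
    print(f"f_H = {f_H:.3f}, Zeff = {zeff:.3f}")
    ```

    Because Zeff weights each species by Z squared, even sub-percent C and O fractions contribute comparably to a few percent of Li, which is why the low-Z lithium coatings keep Zeff close to unity.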

  14. Perturbative transport modeling and comparison to cold-pulse and heat-pulse propagation experiments in Alcator C-Mod and DIII-D

    NASA Astrophysics Data System (ADS)

    Rodriguez Fernandez, P.; White, A. E.; Cao, N. M.; Creely, A. J.; Greenwald, M. J.; Howard, N. T.; Hubbard, A. E.; Hughes, J. W.; Irby, J. H.; Petty, C. C.; Rice, J. E.; Alcator C-Mod Team

    2016-10-01

    Possible ``non-local'' transport phenomena are often observed in tokamak plasmas. Different models have been proposed to explain fast responses during perturbative transport experiments, including non-diffusive effects. Specific tools to characterize the dynamic behavior and power balance analysis using TRANSP and the quasi-linear trapped gyro-Landau fluid code TGLF have been developed to analyze Alcator C-Mod experiments. Recent results from cold-pulse experiments show that fast core temperature increases following edge cold-pulse injections (peak within 10 ms, while τE ≈ 25 ms) are not correlated with the direction of intrinsic rotation; instead, the amplitude of the core response depends on density, plasma current and RF input power. The propagation of the cold pulse can be compared with the propagation of heat pulses from sawteeth, and both may be used to probe changes in temperature profile stiffness. A Laser Blow-Off (LBO) system is being developed for DIII-D that will allow further validation and cross-machine comparison of cold-pulse experiments. LBO at DIII-D will also allow for direct comparisons with ECH perturbative heat-pulse experiments. Work supported by US DOE under Grants DE-FC02-99ER54512 (C-Mod) and DE-FC02-04ER54698 (DIII-D) and a La Caixa Fellowship.

  15. Phase space effects on fast ion distribution function modeling in tokamaks

    NASA Astrophysics Data System (ADS)

    Podestà, M.; Gorelenkova, M.; Fredrickson, E. D.; Gorelenkov, N. N.; White, R. B.

    2016-05-01

    Integrated simulations of tokamak discharges typically rely on classical physics to model energetic particle (EP) dynamics. However, there are numerous cases in which energetic particles can suffer additional transport that is not classical in nature. Examples include transport by applied 3D magnetic perturbations and, more notably, by plasma instabilities. Focusing on the effects of instabilities, ad-hoc models can empirically reproduce increased transport, but the choice of transport coefficients is usually somewhat arbitrary. New approaches based on physics-based reduced models are being developed to address those issues in a simplified way, while retaining a more correct treatment of resonant wave-particle interactions. The kick model implemented in the tokamak transport code TRANSP is an example of such reduced models. It includes modifications of the EP distribution by instabilities in real and velocity space, retaining correlations between transport in energy and space typical of resonant EP transport. The relevance of EP phase space modifications by instabilities is first discussed in terms of the predicted fast ion distribution. Results are compared with those from a simple, ad-hoc diffusive model. It is then shown that the phase-space resolved model can also provide additional insight into important issues such as internal consistency of the simulations and mode stability through the analysis of the power exchanged between energetic particles and the instabilities.
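    The central idea of the kick model, that energy and canonical toroidal momentum steps are correlated through the wave-particle resonance, can be illustrated with a toy Monte Carlo update (not the actual TRANSP implementation; the mode numbers, frequency, and kick amplitude below are hypothetical):

    ```python
    import random

    def apply_kicks(particles, n_mode=3, omega=1.0e5, sigma_E=0.5, seed=1):
        """Apply correlated (E, P_zeta) kicks to a list of (E, P_zeta) pairs.

        For a single mode with toroidal mode number n and frequency omega, the
        resonance ties the two steps together: dP_zeta = (n/omega) * dE. Only the
        energy kick is sampled; the momentum kick follows deterministically.
        All quantities are in arbitrary toy units.
        """
        rng = random.Random(seed)
        out = []
        for E, P in particles:
            dE = rng.gauss(0.0, sigma_E)
            dP = (n_mode / omega) * dE  # resonance condition couples the kicks
            out.append((E + dE, P + dP))
        return out

    # Kick a small monoenergetic ensemble once; the energy and momentum steps
    # stay perfectly correlated, unlike an uncorrelated diffusive model.
    before = [(80.0, 0.1)] * 1000
    after = apply_kicks(before)
    dEs = [a[0] - b[0] for a, b in zip(after, before)]
    dPs = [a[1] - b[1] for a, b in zip(after, before)]
    ```

    An ad-hoc diffusive model would instead sample the spatial (or momentum) step independently of the energy step, losing exactly the phase-space correlation this sketch preserves.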

  19. Benchmarking variable-density flow in saturated and unsaturated porous media

    NASA Astrophysics Data System (ADS)

    Guevara Morel, Carlos Roberto; Cremer, Clemens; Graf, Thomas

    2015-04-01

    In natural environments, fluid density and viscosity can be affected by spatial and temporal variations of solute concentration and/or temperature. These variations can occur, for example, due to salt water intrusion in coastal aquifers, leachate infiltration from waste disposal sites and upconing of saline water from deep aquifers. As a consequence, potentially unstable situations may exist in which a dense fluid overlies a less dense fluid. This situation can produce instabilities that manifest as dense plume fingers moving vertically downwards, counterbalanced by vertical upward flow of the less dense fluid. The resulting free convection increases solute transport rates over large distances and times relative to constant-density flow. Therefore, the understanding of free convection is relevant for the protection of freshwater aquifer systems. The results from a laboratory experiment of saturated and unsaturated variable-density flow and solute transport (Simmons et al., Transp. Porous Media, 2002) are used as the physical basis to define a mathematical benchmark. The HydroGeoSphere code coupled with PEST is used to estimate the optimal parameter set capable of reproducing the physical model. A grid convergence analysis (in space and time) is also undertaken in order to obtain adequate spatial and temporal discretizations. The new mathematical benchmark is useful for model comparison and testing of variable-density, variably saturated flow in porous media.
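    A grid convergence analysis of the kind mentioned above is commonly done by comparing a scalar result on three systematically refined grids and extracting the observed order of accuracy. The sketch below uses hypothetical coarse/medium/fine values (not data from this benchmark):

    ```python
    import math

    def observed_order(f_coarse, f_medium, f_fine, r=2.0):
        """Observed order of accuracy p from a scalar computed on three grids
        refined by a constant ratio r (e.g. h, h/2, h/4)."""
        return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

    def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
        """Richardson estimate of the grid-independent value from the two finest grids."""
        return f_fine + (f_fine - f_medium) / (r**p - 1.0)

    # Hypothetical values of some scalar (e.g. a plume sinking rate) on h, h/2, h/4 grids.
    p = observed_order(1.48, 1.12, 1.03)
    f_exact = richardson_extrapolate(1.12, 1.03, p)
    ```

    If the observed p is close to the scheme's formal order, the discretization is in its asymptotic range and the temporal/spatial step sizes can be considered adequate.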

  20. Tennessee long-range transportation plan : financial plan

    DOT National Transportation Integrated Search

    2005-12-01

    Meeting Tennessee's transportation requirements over the next 25 years is a major challenge. The infrastructure demands associated with building and maintaining the state's aviation, bicycle and pedestrian, rail, water, highway, and public transp...

  1. Deuterium temperature, drift velocity, and density measurements in non-Maxwellian plasmas at ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Salewski, M.; Geiger, B.; Jacobsen, A. S.; Abramovic, I.; Korsholm, S. B.; Leipold, F.; Madsen, B.; Madsen, J.; McDermott, R. M.; Moseev, D.; Nielsen, S. K.; Nocente, M.; Rasmussen, J.; Stejner, M.; Weiland, M.; The EUROfusion MST1 Team; The ASDEX Upgrade Team

    2018-03-01

    We measure the deuterium density, the parallel drift velocity, and the parallel and perpendicular temperatures (T∥, T⊥) in non-Maxwellian plasmas at ASDEX Upgrade. This is done by taking moments of the ion velocity distribution function measured by tomographic inversion of five simultaneously acquired spectra of Dα light. Alternatively, we fit the spectra using a bi-Maxwellian distribution function. The measured kinetic temperatures (T∥ = 9 keV, T⊥ = 11 keV) reveal the anisotropy of the plasma and are substantially higher than the measured boron temperature (7 keV). The Maxwellian deuterium temperature computed with TRANSP (6 keV) is not uniquely measurable due to the fast ions. Nevertheless, simulated kinetic temperatures accounting for fast ions based on TRANSP (T∥ = 8.3 keV, T⊥ = 10.4 keV) are in excellent agreement with the measurements. Similarly, the Maxwellian deuterium drift velocity computed with TRANSP (300 km/s) is not uniquely measurable, but the simulated kinetic drift velocity accounting for fast ions agrees with the measurements (400 km/s) and is substantially larger than the measured boron drift velocity (270 km/s). We further find that ion cyclotron resonance heating elevates T∥ and T⊥ by 2 keV each, without evidence for preferential heating in the Dα spectra. Lastly, we derive an expression for the 1D projection of an arbitrarily drifting bi-Maxwellian onto a diagnostic line-of-sight.
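    For orientation, in the simplest case of a bi-Maxwellian drifting purely along the field, the 1D projection onto a line of sight at angle φ to the magnetic field is the textbook drifted Gaussian below; the paper's general expression for an arbitrarily drifting distribution may differ in detail:

    ```latex
    % 1D velocity distribution along a line of sight \hat{s}, with \cos\varphi = \hat{s}\cdot\hat{b}
    g(u) = \sqrt{\frac{m}{2\pi T_{\mathrm{los}}}}\,
           \exp\!\left[-\frac{m\,(u - v_\parallel\cos\varphi)^2}{2\,T_{\mathrm{los}}}\right],
    \qquad
    T_{\mathrm{los}} = T_\parallel\cos^2\varphi + T_\perp\sin^2\varphi
    ```

    The line-of-sight temperature interpolates between T∥ and T⊥, which is why several viewing geometries (here five Dα spectra) are needed to constrain both temperatures and the drift separately.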

  2. Numerical Solution of the Electron Heat Transport Equation and Physics-Constrained Modeling of the Thermal Conductivity via Sequential Quadratic Programming Optimization in Nuclear Fusion Plasmas

    NASA Astrophysics Data System (ADS)

    Paloma, Cynthia S.

    The plasma electron temperature (Te) plays a critical role in a tokamak nuclear fusion reactor, since temperatures on the order of 10^8 K are required to achieve fusion conditions. Many plasma properties in a tokamak are modeled by partial differential equations (PDEs) because they depend not only on time but also on space. In particular, the dynamics of the electron temperature is governed by a PDE referred to as the Electron Heat Transport Equation (EHTE). In this work, a numerical method is developed to solve the EHTE based on a custom finite-difference technique. The solution of the EHTE is compared to temperature profiles obtained using TRANSP, a sophisticated plasma transport code, for specific discharges from the DIII-D tokamak, located at the DIII-D National Fusion Facility in San Diego, CA. The thermal conductivity (also called thermal diffusivity) of the electrons (Xe) is a plasma parameter that plays a critical role in the EHTE, since it indicates how electron temperature diffusion varies across the minor effective radius of the tokamak. TRANSP approximates Xe through a curve-fitting technique to match experimentally measured electron temperature profiles. While complex physics-based models have been proposed for Xe, a simple mathematical model of the thermal diffusivity suitable for control design has been lacking. In this work, a model for Xe is proposed based on a scaling law involving key plasma variables such as the electron temperature (Te), the electron density (ne), and the safety factor (q). An optimization algorithm is developed based on the Sequential Quadratic Programming (SQP) technique to optimize the scaling factors appearing in the proposed model so that the predicted electron temperature and magnetic flux profiles match predefined target profiles as closely as possible. A simulation study summarizing the outcomes of the optimization procedure is presented to illustrate the potential of the proposed modeling method.
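    The scaling-law structure of such a model can be seen in a minimal sketch. Instead of the SQP optimization used in the work above, the example below fits the exponents of a hypothetical power law Xe = c0 · Te^a · ne^b · q^c by linear least squares in log space, on synthetic noise-free data (the model form, data, and exponents are illustrative assumptions, not the paper's):

    ```python
    import math

    def fit_scaling_law(samples):
        """Fit chi_e = c0 * Te^a * ne^b * q^c by least squares in log space.

        samples: list of (Te, ne, q, chi_e) tuples. Returns [log c0, a, b, c].
        Taking logs turns the power law into a linear model, solved here via the
        normal equations and Gaussian elimination (no external libraries).
        """
        rows = [(1.0, math.log(Te), math.log(ne), math.log(q)) for Te, ne, q, _ in samples]
        y = [math.log(chi) for *_, chi in samples]
        n = 4
        # Normal equations: (A^T A) x = A^T y.
        M = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
        rhs = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(n)]
        # Gaussian elimination with partial pivoting.
        for k in range(n):
            piv = max(range(k, n), key=lambda i: abs(M[i][k]))
            M[k], M[piv] = M[piv], M[k]
            rhs[k], rhs[piv] = rhs[piv], rhs[k]
            for i in range(k + 1, n):
                fac = M[i][k] / M[k][k]
                for j in range(k, n):
                    M[i][j] -= fac * M[k][j]
                rhs[i] -= fac * rhs[k]
        x = [0.0] * n
        for i in range(n - 1, -1, -1):
            x[i] = (rhs[i] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
        return x

    # Synthetic, noise-free data generated from known parameters c0 = 2, (a, b, c) = (1.5, -0.5, 0.8).
    data = [(Te, ne, q, 2.0 * Te**1.5 * ne**-0.5 * q**0.8)
            for Te in (1.0, 2.0, 4.0) for ne in (1.0, 3.0) for q in (1.0, 2.5)]
    logc0, a, b_exp, c = fit_scaling_law(data)
    ```

    With noise-free data the fit recovers the generating exponents exactly; on real profile data an SQP solver, as in the work above, additionally allows physics constraints (e.g. positivity or bounds on the exponents) that plain least squares cannot enforce.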

  3. Economic benefits of employment transportation services : final report

    DOT National Transportation Integrated Search

    2008-06-30

    This report examines the benefits that accrue from employment transportation services implemented as a result of changes in welfare policy, namely the Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) of 1996. Employment transp...

  4. 2000 Florida rail system plan

    DOT National Transportation Integrated Search

    2000-01-01

    The purpose of the Florida Rail System Plan is two-fold. First, it represents the rail component of the Florida Transportation Plan (Agency Functional Plan) which, through an annual series of policies, programs and projects, implements the Transp...

  5. Streamlined project closeout for construction at KYTC.

    DOT National Transportation Integrated Search

    2017-09-01

    Project closeout is the period between the end of construction and when a contract is finalized. During closeout, resources are held in encumbered funds intended for the project and in the contractor's bonding capacity. Although the Kentucky Transp...

  6. Federal Hazardous Materials Law

    DOT National Transportation Integrated Search

    1994-10-01

    The purpose of this chapter is to provide adequate protection against the risks to life and property inherent in the transportation of hazardous material in commerce by improving the regulatory and enforcement authority of the Secretary of Transp...

  7. Colorado Transportation Management Center (CTMC) integration project (FY01 Earmark) : local evaluation report

    DOT National Transportation Integrated Search

    2007-11-09

    The CTMC Integration Project is the result of FY01 congressionally designated earmarks to improve transportation efficiency, promote safety, increase traffic flow, reduce emissions, improve traveler information dissemination, enhance alternate transp...

  8. Long-term aging of recycled binders.

    DOT National Transportation Integrated Search

    2015-07-01

    Asphalt pavement is America's most recycled material. Eighty million tons of asphalt, nearly 80% of all milled asphalt pavement, is recycled every year [1]. To effectively maintain its 40,000 miles of paved roads, the Florida Department of Transp...

  9. Integrated modeling of temperature and rotation profiles in JET ITER-like wall discharges

    NASA Astrophysics Data System (ADS)

    Rafiq, T.; Kritz, A. H.; Kim, Hyun-Tae; Schuster, E.; Weiland, J.

    2017-10-01

    Simulations of 78 JET ITER-like wall D-D discharges and 2 D-T reference discharges are carried out using the TRANSP predictive integrated modeling code. The time-evolved temperature and rotation profiles are computed utilizing the Multi-Mode anomalous transport model. The discharges cover a broad range of conditions, including scans over gyroradius, collisionality, and values of q95. The D-T reference discharges are selected in anticipation of the D-T experimental campaign planned at JET in 2019. The simulated temperature and rotation profiles are compared with the corresponding experimental profiles in the radial range from the magnetic axis to the ρ = 0.9 flux surface. The comparison is quantified by calculating the RMS deviations and offsets. Overall, good agreement is found between the profiles produced in the simulations and the experimental data. It is planned that the simulations obtained using the Multi-Mode model will be compared with simulations using the TGLF model. Research supported in part by the US DoE Office of Science.

  10. Comparison and prediction of chirping in NSTX and DIII-D

    NASA Astrophysics Data System (ADS)

    Duarte, Vinicius; Berk, Herbert; Gorelenkov, Nikolai; Heidbrink, William; Kramer, Gerrit; Nazikian, Raffi; Pace, David; Podesta, Mario; van Zeeland, Michael

    2016-10-01

    We present an explanation of why frequency chirping of Alfvén waves is ubiquitous in NSTX and rarely observed in DIII-D. A time-delayed cubic nonlinear equation is employed for the study of the onset of nonlinear phase-space structures; its explosive solutions are chirping precursors. We employ the NOVA and NOVA-K codes to provide consistent Alfvénic eigenmodes and weighted physical contributions from all regions of phase space. In addition, TRANSP is employed to determine the diffusivity needed to fulfill power balance. Though background micro-turbulence is usually unimportant in determining the energetic particle spatial profile, it may still be important with regard to whether chirping structures are likely to form. We show that the micro-turbulence-induced scattering of energetic particles often competes with collisional pitch-angle scattering. This competition explains the tendency for NSTX, where micro-turbulence is weak, to exhibit Alfvénic chirping, whereas in DIII-D turbulent diffusion usually dominates and chirping is not observed except when micro-turbulence is markedly reduced.

  11. Multi-layered mode structure of locked-tearing-modes after unlocking

    NASA Astrophysics Data System (ADS)

    Okabayashi, Michio; Logan, N.; Tobias, B.; Wang, Z.; Budny, B.; Nazikian, R.; Strait, E.; La Haye, R.; Paz-Soldan, C. J.; Ferraro, N.; Shiraki, D.; Hanson, J.; Zanca, P.; Paccagnella, R.

    2015-11-01

    Prevention of m/n = 2/1 tearing modes (TM) by electromagnetic torque injection has been successful in DIII-D and RFX-mod, where plasma conditions and plasma shape are completely different. Understanding the internal structure in the post-unlocked phase is a prerequisite to its application to reactor-relevant plasmas such as in ITER. Ti and toroidal rotation perturbations show that there exist several radially distinct TM layers. However, the phase shift between the applied field and the plasma response is rather small from the plasma edge to the q ~ 3 domain, indicating that a kink-like response prevails. The biggest threat to sustaining an unlocked 2/1 mode is sudden distortion of the rotation profile due to internal mode reconnection. Possible TM layer structure will be discussed with numerical MHD codes and TRANSP. This work is supported in part by the US Department of Energy under DE-AC02-09CH11466, DE-FG02-99ER54531, DE-SC0003913, and DE-FC02-04ER54698.

  12. Evaluation of transportation microenvironments through assessment of cyclists' exposure to traffic related particulate matter.

    DOT National Transportation Integrated Search

    2011-03-01

    Urban residents spend a considerable amount of outdoor time in transportation microenvironments as pedestrians, bicycle commuters, public transit users, residents and workers situated along roadways, and commuters within vehicles. Within these transp...

  13. Development of a speeding-related crash typology

    DOT National Transportation Integrated Search

    2010-04-01

    Speeding, the driver behavior of exceeding the posted speed limit or driving too fast for conditions, has consistently been estimated to be a contributing factor to a significant percentage of fatal and nonfatal crashes. The U.S. Department of Transp...

  14. 2001 Federal Radionavigation Plan

    DOT National Transportation Integrated Search

    2002-03-19

    The report is the official source of radio navigation policy and planning for the Federal Government and is required by the National Defense Authorization Act for Fiscal year 1998. It is prepared jointly by the Departments of Defense (DoD) and Transp...

  15. Automatic Extraction of Highway Traffic Data From Aerial Photographs

    DOT National Transportation Integrated Search

    1997-01-01

    This is the fifth and final report provided to fulfill the statutory requirement to periodically summarize the progress of the Intelligent Transportation Systems (ITS) program administered by the U.S. Department of Transportation (DOT). In the Transp...

  16. Impact assessment of integrated dynamic transit operations : final report.

    DOT National Transportation Integrated Search

    2016-03-02

    This document details the impact assessment conducted by the Volpe Center for the Integrated Dynamic Transit Operations (IDTO) prototype demonstrations in Columbus, Ohio and Central Florida. The prototype is one result of the U.S. Department of Transp...

  17. 76 FR 22748 - Wisconsin Central Ltd.-Intra-Corporate Family Merger Exemption-Duluth, Missabe and Iron Range...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-22

    ... transaction, the WCTC family of rail carriers also included Fox Valley & Western Ltd. (FVW), Sault Ste. Marie.... Cent. Transp., Wis. Cent. Ltd. and Fox Valley & W. Ltd.--Intracorporate Family Transaction Exemption...

  18. Sustainable transportation : developing a framework for policy innovation December 14, 1993 summary of proceedings.

    DOT National Transportation Integrated Search

    1994-02-28

    Sustainable development is development that meets the needs of the present without compromising the future. How can sustainable development be linked meaningfully to transportation planning and policies? On December 14, 1993, the Department of Transp...

  19. Miami Valley ITS : early deployment plan : final user service plan

    DOT National Transportation Integrated Search

    1997-07-01

    This User Service Plan is the first major product of the process to develop an Intelligent Transportation System (ITS) Early Deployment Plan (EDP) for the Miami Valley. This User Service Plan documents the travel environment, growth trends and transp...

  20. Recycled Materials in European Highway Environments : Uses, Technologies, and Policies

    DOT National Transportation Integrated Search

    2000-10-01

    The objective of this scanning tour was to review and document innovative policies, programs, and techniques that promote the use of recycled materials in the highway environment. The U.S. delegation met with more than 100 representatives from transp...

  1. Strategic Enterprise Architecture Design and Implementation Plan for the Montana Department of Transportation

    DOT National Transportation Integrated Search

    2016-08-01

    The purpose of this research report is to develop a Strategic Enterprise Architecture (EA) Design and Implementation Plan for the Montana Department of Transportation (MDT). Information management systems are vital to maintaining the State's transp...

  2. Transportation in the United States : a review

    DOT National Transportation Integrated Search

    1997-01-01

    The United States has the largest transportation system in the world. It serves 260 million people and 6 million business establishments spread over the fourth largest country (in land area) in the world. This report provides a snapshot of the transp...

  4. A Novel Decision Support Tool to Develop Link Driving Schedules for Moves.

    DOT National Transportation Integrated Search

    2015-01-01

    A system or user level strategy that aims to reduce emissions from transportation networks requires a rigorous assessment of emissions inventory for the system to justify its effectiveness. It is important to estimate the total emissions for a transp...

  5. DEVELOPMENT AND VERIFICATION OF A SCREENING MODEL FOR SURFACE SPREADING OF PETROLEUM

    EPA Science Inventory

    Overflows and leakage from aboveground storage tanks and pipelines carrying crude oil and petroleum products occur frequently. The spilled hydrocarbons pose environmental threats by contaminating the surrounding soil and the underlying ground water. Predicting the fate and transp...

  6. Lowell National Historical Park alternative transportation system historic trolley planning study

    DOT National Transportation Integrated Search

    2002-12-01

    This report assesses opportunities for expanding Lowell National Historical Park's historic trolley line by implementing a light rail system reminiscent of late 19th/early 20th century trolley lines. This is in line with the Park Service's Transp...

  7. Development of an intermodal training program for disaster relief agencies.

    DOT National Transportation Integrated Search

    2010-08-01

    Natural disasters impact society on a broad level, often leading to both financial damage and the loss of : human life. This project seeks to improve the design and operation of disaster relief chains by providing : agencies with an intermodal transp...

  8. Electrical parameters and water permeability properties of monolayers formed by T84 cells cultured on permeable supports.

    PubMed

    Ozu, M; Toriano, R; Capurro, C; Parisi, M

    2005-01-01

    T84 is an established cell line expressing an enterocyte phenotype whose permeability properties have been widely explored. Osmotic permeability (POSM), hydraulic permeability (PHYDR) and transport-associated net water fluxes (JW-transp), as well as short-circuit current (ISC), transepithelial resistance (RT), and potential difference (ΔVT), were measured in T84 monolayers with the following results: POSM 1.3 ± 0.1 × 10⁻³ cm/s; PHYDR 0.27 ± 0.02 cm/s; RT 2426 ± 109 Ω·cm²; and ΔVT 1.31 ± 0.38 mV. The effect of 50 µM 5,6-dichloro-1-ethyl-1,3-dihydro-2H-benzimidazol-2-one (DCEBIO), a "net Cl⁻ secretory agent", on T84 cells was also studied. We confirm the reported important increase in ISC induced by DCEBIO, which was associated here with a modest secretory ΔJW-transp. The present results were compared with those reported using the same experimental approach applied to established cell lines originating from intestinal and renal epithelial cells (Caco-2, LLC-PK1 and RCCD-1). No clear association between PHYDR and RT could be demonstrated, and high PHYDR values were observed in an electrically tight epithelium, supporting the view that a "water leaky" barrier is not necessarily an "electrically leaky" one. Furthermore, the modest secretory ΔJW-transp was not consistent with previous results obtained with RCCD-1 cells stimulated with vasopressin (absorptive fluxes) or with T84 cells secreting water under the action of Escherichia coli heat-stable enterotoxin. We conclude that, while the presence of aquaporins is necessary to dissipate an external osmotic gradient, the coupling between water and ion transport cannot be explained by a simple and common underlying mechanism.

  9. EFFECTS OF THE VARIATION OF SELECT SAMPLING PARAMETERS ON SOIL VAPOR CONCENTRATIONS

    EPA Science Inventory

    Currently soil vapor surveys are commonly used as a screening technique to delineate subsurface volatile organic compound (VOC) contaminant plumes and to provide information for vapor intrusion and contaminated site evaluations. To improve our understanding of the fate and transp...

  10. Exercise handbook : what transportation security and emergency preparedness leaders need to know to improve emergency preparedness.

    DOT National Transportation Integrated Search

    2014-02-01

    The U.S. Department of Homeland Security (DHS) has provided extensive general guidance on developing training and exercise programs for public entities, but little had been done to focus that material on the transportation sector specifically. Transp...

  11. Sources of Mercury Exposure for U.S. Seafood Consumers: Implications for Policy

    EPA Science Inventory

    Recent policies attempting to reduce adverse effects of methylmercury exposure from fish consumption in the U.S. have targeted reductions in anthropogenic emissions from U.S. sources. Methods: We use models that simulate global atmospheric chemistry (GEOS-Chem); the fate, transp...

  12. Solute transport through a pine-bark based substrate under saturated and unsaturated conditions

    USDA-ARS?s Scientific Manuscript database

    An understanding of how dissolved mineral nutrient ions (solutes) move through pine bark substrates during the application of irrigation water is vital to better understand nutrient transport and leaching from containerized crops during an irrigation event. However, current theories on solute transp...

  13. Cost savings stemming from non-compliance with international environmental regulations in the maritime sector

    DOT National Transportation Integrated Search

    2003-01-01

    According to one recent study, the illegal discharge of oil into the sea through routine operations is equal to over eight times the Exxon Valdez oil spill - every year. Oil pollution is not the only environmental impact stemming from maritime transp...

  14. Spodoptera species as pests in Florida strawberries

    USDA-ARS?s Scientific Manuscript database

    Hillsborough County, Florida, produces about 15% of the nation’s strawberries on over 11,000 acres. The economic impact to the area is over $700 million. Production averages over 20 million flats of strawberries from November through March. Fields are planted in the fall with young plants (transp...

  15. Programmer's guide to the fuzzy logic ramp metering algorithm : software design, integration, testing, and evaluation

    DOT National Transportation Integrated Search

    2000-02-01

    A Fuzzy Logic Ramp Metering Algorithm was implemented on 126 ramps in the greater Seattle area. This report documents the implementation of the Fuzzy Logic Ramp Metering Algorithm at the Northwest District of the Washington State Department of Transp...

  16. COMPARISON OF MEASURED AND MODELED SURFACE FLUXES OF HEAT, MOISTURE, AND CHEMICAL DRY DEPOSITION

    EPA Science Inventory

    Realistic air quality modeling requires accurate simulation of both meteorological and chemical processes within the planetary boundary layer (PBL). In vegetated areas, the primary pathway for surface fluxes of moisture as well as many gaseous chemicals is through vegetative transp...

  17. ENVIRONMENTAL RESEARCH BRIEF: SPATIAL HETEROGENEITY OF GEOCHEMICAL AND HYDROLOGIC PARAMETERS AFFECTING METAL TRANSPORT IN GROUND WATER

    EPA Science Inventory

    Reliable assessment of the hazards or risks arising from groundwater contamination and the design of effective means of rehabilitation of contaminated sites requires the capability to predict the movement and fate of dissolved solutes in groundwater. The modeling of metal transp...

  18. Independent evaluation of light-vehicle safety applications based on vehicle-to-vehicle communications used in the 2012-2013 safety pilot model deployment

    DOT National Transportation Integrated Search

    2015-12-01

    This report presents the methodology and results of the independent evaluation of safety applications for passenger vehicles in the 2012-2013 Safety Pilot Model Deployment, part of the United States Department of Transportation's Intelligent Transp...

  19. Suppression of Alfvénic modes with off-axis NBI

    NASA Astrophysics Data System (ADS)

    Fredrickson, Eric; Bell, R.; Diallo, A.; Leblanc, B.; Podesta, M.; Levinton, F.; Yuh, H.; Liu, D.

    2016-10-01

    GAEs are seen on NSTX-U in the frequency range from 1 to 3 MHz with injection of the more perpendicular NSTX neutral beam sources. A new result is that injection of any of the new, more tangential neutral beam sources with tangency radii larger than the magnetic axis suppresses this GAE activity. Simulations of beam deposition and slowing down with the TRANSP code indicate that these new sources deposit fast ions with 0.9

  20. Consistency between real and synthetic fast-ion measurements at ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Rasmussen, J.; Nielsen, S. K.; Stejner, M.; Geiger, B.; Salewski, M.; Jacobsen, A. S.; Korsholm, S. B.; Leipold, F.; Michelsen, P. K.; Moseev, D.; Schubert, M.; Stober, J.; Tardini, G.; Wagner, D.; The ASDEX Upgrade Team

    2015-07-01

    Internally consistent characterization of the properties of the fast-ion distribution from multiple diagnostics is a prerequisite for obtaining a full understanding of fast-ion behavior in tokamak plasmas. Here we benchmark several absolutely calibrated core fast-ion diagnostics at ASDEX Upgrade by comparing fast-ion measurements from collective Thomson scattering, fast-ion Dα spectroscopy, and neutron rate detectors with numerical predictions from the TRANSP/NUBEAM transport code. We also study the sensitivity of the theoretical predictions to uncertainties in the plasma kinetic profiles. We find that theory and measurements generally agree within these uncertainties for all three diagnostics during heating phases with either one or two neutral beam injection sources. This suggests that the measurements can be described by the same model assuming classical slowing down of fast ions. Since the three diagnostics in the adopted configurations probe partially overlapping regions in fast-ion velocity space, this is also consistent with good internal agreement among the measurements themselves. Hence, our results support the feasibility of combining multiple diagnostics at ASDEX Upgrade to reconstruct the fast-ion distribution function in 2D velocity space.
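    The "agreement within uncertainties" comparison described above can be sketched as a combined-sigma consistency check between a measured quantity and a code prediction; the numerical values below are illustrative only, not from the paper:

```python
import math

def consistent(measured, sigma_meas, predicted, sigma_pred, n_sigma=2.0):
    """True when the two values agree within n_sigma combined (quadrature) errors."""
    z = abs(measured - predicted) / math.hypot(sigma_meas, sigma_pred)
    return z <= n_sigma

# Hypothetical comparison of a measured neutron rate (s^-1) against a
# TRANSP/NUBEAM-style prediction; both numbers and uncertainties are made up.
ok = consistent(1.05e15, 0.10e15, 1.18e15, 0.08e15)
```

    Here the discrepancy is about one combined standard deviation, so the two values would count as consistent at the 2-sigma level.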

  1. Collective Thomson scattering measurements of fast-ion transport due to sawtooth crashes in ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Rasmussen, J.; Nielsen, S. K.; Stejner, M.; Galdon-Quiroga, J.; Garcia-Munoz, M.; Geiger, B.; Jacobsen, A. S.; Jaulmes, F.; Korsholm, S. B.; Lazanyi, N.; Leipold, F.; Ryter, F.; Salewski, M.; Schubert, M.; Stober, J.; Wagner, D.; the ASDEX Upgrade Team; the EUROFusion MST1 Team

    2016-11-01

    Sawtooth instabilities can modify heating and current-drive profiles and potentially increase fast-ion losses. Understanding how sawteeth redistribute fast ions as a function of sawtooth parameters and of fast-ion energy and pitch is hence a subject of particular interest for future fusion devices. Here we present the first collective Thomson scattering (CTS) measurements of sawtooth-induced redistribution of fast ions at ASDEX Upgrade. These also represent the first localized fast-ion measurements on the high-field side of this device. The results indicate fast-ion losses in the phase-space measurement volume of about 50% across sawtooth crashes, in good agreement with values predicted with the Kadomtsev sawtooth model implemented in TRANSP and with the sawtooth model in the EBdyna_go code. In contrast to the case of sawteeth, we observe no fast-ion redistribution in the presence of fishbone modes. We highlight how CTS measurements can discriminate between different sawtooth models, in particular when aided by multi-diagnostic velocity-space tomography, and briefly discuss our results in light of existing measurements from other fast-ion diagnostics.

  2. Overview of the new capabilities of TORIC-v6 and comparison with TORIC-v5

    NASA Astrophysics Data System (ADS)

    Bilato, R.; Brambilla, M.; Bertelli, N.

    2016-10-01

    Since its release, version 5 (v5) of the full-wave TORIC code, characterized by an optimized parallelized solver for its routine use in the TRANSP package, has been improved in many technical respects, e.g. the plasma-vacuum transition and full-spectrum antenna modeling. For the WPCD benchmark cases, good agreement between the new version, v6, and v5 is found. The major improvement, however, has been in interfacing TORIC-v6 with the Fokker-Planck SSFPQL solver to account for the back-reaction of ICRF and NBI heating on wave propagation and absorption. Special algorithms have been developed for SSFPQL to maintain numerical precision at high pitch-angle resolution and to evaluate the generalized dispersion function directly from the numerical solution. Care has been taken in automating the non-linear loop between TORIC-v6 and SSFPQL. In v6 the description of wave absorption at high harmonics has been revised and applied to DEMO. For high-harmonic regimes there is ongoing activity on the comparison with AORSA.

  3. CGI-58, a key regulator of lipid homeostasis and signaling in plants, also regulates polyamine metabolism

    USDA-ARS?s Scientific Manuscript database

    Comparative Gene Identification-58 (CGI-58) is an alpha/beta hydrolase-type protein that regulates lipid homeostasis and signaling in eukaryotes by interacting with and stimulating the activity of several different types of proteins, including a lipase in mammalian cells and a peroxisomal ABC transp...

  4. DEVELOPMENT OF BIOAVAILABILITY AND BIOKINETICS DETERMINATION METHODS FOR ORGANIC POLLUTANTS IN SOIL TO ENHANCE IN-SITU AND ON-SITE BIOREMEDIATION

    EPA Science Inventory

    Determination of biodegradation rates of organics in soil slurry and compacted soil systems is essential for evaluating the efficacy of bioremediation for treatment of contaminated soils. In this paper, a systematic protocol has been developed for evaluating biokinetic and transp...

  5. Advanced ST plasma scenario simulations for NSTX

    NASA Astrophysics Data System (ADS)

    Kessel, C. E.; Synakowski, E. J.; Bell, M. E.; Gates, D. A.; Harvey, R. W.; Kaye, S. M.; Mau, T. K.; Menard, J.; Phillips, C. K.; Taylor, G.; Wilson, R.; NSTX Research Team

    2005-08-01

    Integrated scenario simulations are done for NSTX that address four primary objectives for developing advanced spherical torus (ST) configurations: high β and high βN inductive discharges to study all aspects of ST physics in the high β regime; non-inductively sustained discharges for flattop times greater than the skin time to study the various current drive techniques; non-inductively sustained discharges at high β for flattop times much greater than a skin time, which provide the integrated advanced ST target for NSTX; and non-solenoidal startup and plasma current rampup. The simulations done here use the tokamak simulation code and are based on discharge 109070. TRANSP analysis of the discharge provided the thermal diffusivities for electrons and ions, the neutral beam deposition profile and other characteristics. CURRAY is used to calculate the high harmonic fast wave (HHFW) heating depositions and current drive. GENRAY/CQL3D is used to establish the heating and CD deposition profiles for electron Bernstein waves (EBW). Analysis of the ideal MHD stability is done with JSOLVER, BALMSC and PEST2. The simulations indicate that the integrated advanced ST plasma is reachable, obtaining stable plasmas with βT ≈ 40% at βN values of 7.7-9, IP = 1.0 MA and BT = 0.35 T. The plasma is 100% non-inductive and has a flattop of four skin times. The resulting global energy confinement corresponds to a multiplier of H98(y,2) = 1.5. The simulations have demonstrated the importance of HHFW heating and CD, EBW off-axis CD, strong plasma shaping, density control and early heating/H-mode transition for producing and optimizing these plasma configurations.

  6. Comparing simulation of plasma turbulence with experiment

    NASA Astrophysics Data System (ADS)

    Ross, David W.; Bravenec, Ronald V.; Dorland, William; Beer, Michael A.; Hammett, G. W.; McKee, George R.; Fonck, Raymond J.; Murakami, Masanori; Burrell, Keith H.; Jackson, Gary L.; Staebler, Gary M.

    2002-01-01

    The direct quantitative correspondence between theoretical predictions and the measured plasma fluctuations and transport is tested by performing nonlinear gyro-Landau-fluid simulations with the GRYFFIN (or ITG) code [W. Dorland and G. W. Hammett, Phys. Fluids B 5, 812 (1993); M. A. Beer and G. W. Hammett, Phys. Plasmas 3, 4046 (1996)]. In an L-mode reference discharge in the DIII-D tokamak [J. L. Luxon and L. G. Davis, Fusion Technol. 8, 441 (1985)], which has relatively large fluctuations and transport, the turbulence is dominated by ion temperature gradient (ITG) modes. Trapped electron modes and impurity drift waves also play a role. Density fluctuations are measured by beam emission spectroscopy [R. J. Fonck, P. A. Duperrex, and S. F. Paul, Rev. Sci. Instrum. 61, 3487 (1990)]. Experimental fluxes and corresponding diffusivities are analyzed by the TRANSP code [R. J. Hawryluk, in Physics of Plasmas Close to Thermonuclear Conditions, edited by B. Coppi, G. G. Leotta, D. Pfirsch, R. Pozzoli, and E. Sindoni (Pergamon, Oxford, 1980), Vol. 1, p. 19]. The shape of the simulated wave number spectrum is close to the measured one. The simulated ion thermal transport, corrected for E×B flow shear, exceeds the experimental value by a factor of 1.5 to 2.0. The simulation overestimates the density fluctuation level by an even larger factor. On the other hand, the simulation underestimates the electron thermal transport, which may be accounted for by modes that are not accessible to the simulation or to the BES measurement.

  7. Screening Maritime Shipping Containers for Weapons of Mass Destruction

    DTIC Science & Technology

    2010-01-01

    for dangerous chemical, biological, radiological, nuclear, and explosives (CBRNE) materials to prevent their unlawful transportation into the United...techniques, and various sensor modalities within their respective size, weight, and power constraints. Shipping containers are natural repositories...isolators. The environmental enclosure has sufficient volume to allow for multiple sensors as well as a truth collection station (Summa canisters

  8. Development of Integrated Magnetic and Kinetic Control-oriented Transport Model for q-profile Response Prediction in EAST Discharges

    NASA Astrophysics Data System (ADS)

    Wang, Hexiang; Schuster, Eugenio; Rafiq, Tariq; Kritz, Arnold; Ding, Siye

    2016-10-01

    Extensive research has been conducted to find high-performance operating scenarios characterized by high fusion gain, good confinement, plasma stability and possible steady-state operation. A key plasma property that is related to both the stability and performance of these advanced plasma scenarios is the safety factor profile. A key component of the EAST research program is the exploration of non-inductively driven steady-state plasmas with the recently upgraded heating and current drive capabilities that include lower hybrid current drive and neutral beam injection. Anticipating the need for tight regulation of the safety factor profile in these plasma scenarios, a first-principles-driven (FPD) control-oriented model is proposed to describe the safety factor profile evolution in EAST in response to the different actuators. The TRANSP simulation code is employed to tailor the FPD model to the EAST tokamak geometry and to convert it into a form suitable for control design. The FPD control-oriented model's prediction capabilities are demonstrated by comparing predictions with experimental data from EAST. Supported by the US DOE under DE-SC0010537, DE-FG02-92ER54141 and DE-SC0013977.

  9. Core heat convection in NSTX-U via modification of electron orbits by high frequency Alfvén eigenmodes

    NASA Astrophysics Data System (ADS)

    Crocker, N. A.; Tritz, K.; White, R. B.; Fredrickson, E. D.; Gorelenkov, N. N.; NSTX-U Team

    2015-11-01

    New simulation results demonstrate that high frequency compressional (CAE) and global (GAE) Alfvén eigenmodes cause radial convection of electrons, with implications for particle and energy confinement, as well as electric field formation in NSTX-U. Simulations of electron orbits in the presence of multiple experimentally determined CAEs and GAEs, using the gyro-center code ORBIT, have revealed substantial convective transport, in addition to the expected diffusion via orbit stochastization. These results advance understanding of anomalous core energy transport expected in high performance, beam-heated NSTX-U plasmas. The simulations make use of experimentally determined density perturbation (δn) amplitudes and mode structures obtained by inverting measurements from a 16-channel reflectometer array using a synthetic diagnostic. Combined with experimentally determined mode polarizations (i.e. CAE or GAE), the δn are used to estimate the E×B displacements for use in ORBIT. Preliminary comparison of the simulation results with transport modeling by TRANSP indicates that the convection is currently underestimated. Supported by US DOE Contracts DE-SC0011810, DE-FG02-99ER54527 & DE-AC02-09CH11466.

  10. Anomalous Transport in High Beta Poloidal DIII-D Discharges

    NASA Astrophysics Data System (ADS)

    Pankin, A.; Garofalo, A.; Kritz, A.; Rafiq, T.; Weiland, J.

    2016-10-01

    Dominant instabilities that drive anomalous transport in high beta poloidal DIII-D discharges are investigated using the MMM7.1 and TGLF models in the predictive integrated modeling TRANSP code. The ion thermal transport is found to be strongly reduced in these discharges, but turbulence driven by the ITG modes along with the neoclassical transport still plays a role in determining the ion temperature profiles. The electron thermal transport driven by the ETG modes impacts the electron temperature profiles. The E × B flow shear is found to have a small effect in reducing the electron thermal transport. The Shafranov shift is found to strongly reduce the anomalous transport in the high beta poloidal DIII-D discharges. The reduction of Shafranov shift can destroy the ion internal transport barrier and can result in significantly lower core temperatures. The MMM7.1 model predicts electron and ion temperature profiles reasonably well, but it fails to accurately predict the properties of the electron internal transport barrier, which indicates that the ETG model in MMM7.1 needs to be improved in the high beta poloidal operational regime. Research supported by the Office of Science, US DOE.

  11. Ship Motions and Capsizing in Astern Seas

    DTIC Science & Technology

    1974-12-01

    result of these experiments and concurrent analytical work, a great deal has been learned about the mechanism of capsizing. This...computer time. It does not appear economically feasible using present-generation machines to numerically simulate a complete experimental...a Fast Cargo Liner in San Francisco Bay." Dept. of Naval Architecture, University of Calif., Berkeley. January 1972. (Dept. of Transp

  12. Real-Time Ada Problem Solution Study

    DTIC Science & Technology

    1989-03-24

    been performed, there is a larger base of information concerning standards and guidelines for Ada usage, as well as "lessons learned". A number of...the target machine and operate in conjunction with the application programs, they also require system resources (CPU, memory). The utilization of...Transporter-Consumer 1694 154 6. Producer-Transpt-Buffer-Transp-Consumer 2248 204 7. Relay 906 82 8. Conditional Entry - no rendezvous 170 15

  13. Seven Experiment Designs Addressing Problems of Safety and Capacity on Two-Lane Rural Highways : Volume 8. Experimental Design and Evaluate Remedial Aids for Intersections with Inadequate Sight Distance

    DOT National Transportation Integrated Search

    2007-01-01

    Americans lose 3.7 billion hours and 2.3 billion gallons of fuel every year sitting in traffic jams, and nearly 24 percent of non-recurring freeway delay, or about 482 million hours, is attributed to work zones. To combat the country's growing transp...

  14. Controlling Threats to Nuclear Security: A Holistic Model

    DTIC Science & Technology

    1997-06-01

    learned. However, it may also be critical to consider the ability of the people recruited to work together as a team--trust, loyalty, and commitment...material from container Replace container Restore to original condition (seals, etc.) Detection shield Transport medium (container) Provide cover for...is no special terminology or notation to be learned; the model uses whatever terminology and notation is appropriate to the system being analyzed

  15. Suppression of Alfvénic modes through modification of the fast ion distribution

    NASA Astrophysics Data System (ADS)

    Fredrickson, Eric

    2017-10-01

    Experiments on NSTX-U have shown for the first time that small amounts of high pitch-angle, low ρL beam ions can strongly suppress the counter-propagating Global Alfvén Eigenmodes (GAE) [1]. GAE have been implicated in the redistribution of fast ions and modification of the electron power balance in previous experiments on NSTX. The ability to predict the stability of Alfvén modes, and development of methods to control them, is important for fusion reactors like ITER, which, like NSTX, will be heated with a large population of non-thermal, super-Alfvénic ions (unlike the normal operation of conventional tokamaks). The suppression of the GAE by adding a small population of high-pitch resonant fast ions is qualitatively consistent with an analytic model of the Doppler-shifted ion-cyclotron resonance drive responsible for GAE instability [2]. The model predicts that fast ions with k⊥ρL < 1.9 are stabilizing, which is in good agreement with the experimental observations. A quantitative analysis was done using the HYM stability code [3] on one of the nearly 100 identified examples of GAE suppression. The simulations find remarkable agreement with the observed mode numbers and frequencies of the unstable GAE prior to suppression. Adding the population of high pitch-angle, low ρL beam ions to the HYM fast ion distribution function predicts complete suppression of the GAE. TRANSP/NUBEAM calculations for the example analyzed with HYM suggest that the additional beam source increases the population of resonant fast ions with k⊥ρL < 1.9 by roughly a factor of four. Work supported by U.S. DOE Contract DE-AC02-09CH11466.
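    The quoted criterion (fast ions with k⊥ρL < 1.9 are stabilizing) can be evaluated directly from a fast ion's perpendicular energy and the field strength via the gyroradius ρL = m v⊥ / (q B). A minimal sketch for deuterium beam ions; the energies, field, and wavenumber below are hypothetical, not taken from this record:

```python
import math

QE = 1.602176634e-19    # C, elementary charge
M_D = 3.3435837768e-27  # kg, deuteron mass

def larmor_radius(e_perp_keV, b_tesla):
    """Gyroradius rho_L = m*v_perp/(q*B) in meters for a deuteron."""
    v_perp = math.sqrt(2.0 * e_perp_keV * 1e3 * QE / M_D)
    return M_D * v_perp / (QE * b_tesla)

def is_stabilizing(k_perp_per_m, e_perp_keV, b_tesla, threshold=1.9):
    """Criterion quoted in the abstract: k_perp * rho_L < 1.9 is stabilizing."""
    return k_perp_per_m * larmor_radius(e_perp_keV, b_tesla) < threshold

# Hypothetical example: at B = 0.6 T and k_perp = 30 m^-1, a 20 keV
# perpendicular-energy deuteron satisfies the criterion, a 90 keV one does not.
stab_low_energy = is_stabilizing(30.0, 20.0, 0.6)
```

    Since ρL scales as the square root of the perpendicular energy, lowering pitch (reducing E⊥) at fixed total energy is what moves resonant ions into the stabilizing region.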

  16. Time-Dependent Simulations of Fast-Wave Heated High-Non-Inductive-Fraction H-Mode Plasmas in the National Spherical Torus Experiment Upgrade

    NASA Astrophysics Data System (ADS)

    Taylor, Gary; Bertelli, Nicola; Gerhardt, Stefan P.; Hosea, Joel C.; Mueller, Dennis; Perkins, Rory J.; Poli, Francesca M.; Wilson, James R.; Raman, Roger

    2017-10-01

    30 MHz fast-wave heating may be an effective tool for non-inductively ramping low-current plasmas to a level suitable for initiating up to 12 MW of neutral beam injection on the National Spherical Torus Experiment Upgrade (NSTX-U). Previously on NSTX, 30 MHz fast-wave heating was shown to efficiently and rapidly heat electrons; at the NSTX maximum axial toroidal magnetic field (BT(0)) of 0.55 T, 1.4 MW of 30 MHz heating increased the central electron temperature from 0.2 to 2 keV in 30 ms and generated an H-mode plasma with a non-inductive fraction (fNI) ≈ 0.7 at a plasma current (Ip) of 300 kA. NSTX-U will operate at BT(0) up to 1 T, with up to 4 MW of 30 MHz power (Prf). Predictive TRANSP free-boundary transport simulations, using the TORIC full wave spectral code to calculate the fast-wave heating and current drive, have been run for NSTX-U Ip = 300 kA H-mode plasmas. Favorable scaling of fNI with 30 MHz heating power is predicted, with fNI ≥ 1 for Prf ≥ 2 MW.

  17. Effects of MHD instabilities on neutral beam current drive

    NASA Astrophysics Data System (ADS)

    Podestà, M.; Gorelenkova, M.; Darrow, D. S.; Fredrickson, E. D.; Gerhardt, S. P.; White, R. B.

    2015-05-01

    Neutral beam injection (NBI) is one of the primary tools foreseen for heating, current drive (CD) and q-profile control in future fusion reactors such as ITER and a Fusion Nuclear Science Facility. However, fast ions from NBI may also provide the drive for energetic particle-driven instabilities (e.g. Alfvénic modes (AEs)), which in turn redistribute fast ions in both space and energy, thus hampering the control capabilities and overall efficiency of NB-driven current. Based on experiments on the NSTX tokamak (M. Ono et al 2000 Nucl. Fusion 40 557), the effects of AEs and other low-frequency magneto-hydrodynamic instabilities on NB-CD efficiency are investigated. A new fast ion transport model, which accounts for particle transport in phase space as required for resonant AE perturbations, is utilized to obtain consistent simulations of NB-CD through the tokamak transport code TRANSP. It is found that instabilities do indeed reduce the NB-driven current density over most of the plasma radius by up to ∼50%. Moreover, the details of the current profile evolution are sensitive to the specific model used to mimic the interaction between NB ions and instabilities. Implications for fast ion transport modeling in integrated tokamak simulations are briefly discussed.

  18. Transport properties of NSTX-U L- and H-mode plasmas

    NASA Astrophysics Data System (ADS)

    Kaye, Stanley; Guttenfelder, Walter; Bell, Ron; Diallo, Ahmed; Leblanc, Ben; Podesta, Mario

    2016-10-01

    The confinement and transport properties of L- and H-mode plasmas in NSTX-U have been studied using the TRANSP code. A dedicated series of L-mode discharges was obtained to study the dependence of confinement and transport on power level and beam aiming angle. The latter is made possible by having two beamlines with three sources each, capable of injecting with tangency radii from Rtan = 50 to 130 cm (Rgeo = 92 cm). L-mode plasmas typically have confinement enhancement factors of H98y,2 = 0.6 to 0.65, exhibiting a 25% decrease in confinement time as the beam power is raised from 1 to 3 MW. Associated with this is an increase in the electron thermal diffusivity in the core of the plasma from 3.5 to 10 m2/s. Electron thermal transport is the dominant energy loss channel in these plasmas. H-mode plasmas exhibit improved confinement, with H98y,2 = 1 or above, and core electron thermal diffusivity values <1 m2/s. Details of these studies will be presented, along with the results of the beam tangency radius scan in L-mode plasmas. This research was supported by the U.S. Department of Energy contract # DE-AC02-09CH11466.
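    The H98y,2 factors quoted above are ratios of the measured energy confinement time to the ITER IPB98(y,2) scaling. A sketch of that calculation; the NSTX-U-like parameter values below are hypothetical, chosen only for illustration:

```python
def tau_ipb98y2(ip_MA, bt_T, n19, p_MW, r_m, kappa, eps, m_amu):
    """ITER IPB98(y,2) thermal energy confinement time scaling, in seconds.
    Inputs: plasma current (MA), toroidal field (T), line-average density
    (10^19 m^-3), loss power (MW), major radius (m), elongation, inverse
    aspect ratio, and ion mass (amu)."""
    return (0.0562 * ip_MA**0.93 * bt_T**0.15 * n19**0.41 * p_MW**-0.69
            * r_m**1.97 * kappa**0.78 * eps**0.58 * m_amu**0.19)

def h98y2(tau_measured_s, **params):
    """Confinement enhancement factor H98(y,2) = tau_E(measured) / tau_98."""
    return tau_measured_s / tau_ipb98y2(**params)

# Hypothetical ST-like parameters (illustrative only, not from this record):
h = h98y2(0.05, ip_MA=0.8, bt_T=0.65, n19=4.0, p_MW=3.0,
          r_m=0.92, kappa=2.2, eps=0.7, m_amu=2.0)
```

    The negative power exponent (-0.69) is why raising beam power from 1 to 3 MW at fixed confinement time would itself raise H98y,2; the reported 25% drop in confinement time partially offsets that.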

  19. The ‘neutron deficit’ in the JET tokamak

    NASA Astrophysics Data System (ADS)

    Weisen, H.; Kim, Hyun-Tae; Strachan, J.; Scott, S.; Baranov, Y.; Buchanan, J.; Fitzgerald, M.; Keeling, D.; King, D. B.; Giacomelli, L.; Koskela, T.; Weisen, M. J.; Giroud, C.; Maslov, M.; Core, W. G.; Zastrow, K.-D.; Syme, D. B.; Popovichev, S.; Conroy, S.; Lengar, I.; Snoj, L.; Batistoni, P.; Santala, M.; Contributors, JET

    2017-07-01

    The measured D-D neutron rate of neutral beam heated JET baseline and hybrid H-modes in deuterium is found to be between approximately 50% and 100% of the neutron rate expected from the TRANSP code, depending on the plasma parameters. A number of candidate explanations for the shortfall, such as fuel dilution, errors in beam penetration and effectively available beam power have been excluded. As the neutron rate in JET is dominated by beam-plasma interactions, the ‘neutron deficit’ may be caused by a yet unidentified form of fast particle redistribution. Modelling, which assumes fast particle transport to be responsible for the deficit, indicates that such redistribution would have to happen at time scales faster than both the slowing down time and the energy confinement time. Sawteeth and edge localised modes are found to make no significant contribution to the deficit. There is also no obvious correlation with magnetohydrodynamic activity measured using magnetic probes at the tokamak vessel walls. Modelling of fast particle orbits in the 3D fields of neoclassical tearing modes shows that realistically sized islands can only contribute a few percent to the deficit. In view of these results it appears unlikely that the neutron deficit results from a single physical process in the plasma.

  20. Effects of MHD instabilities on neutral beam current drive

    DOE PAGES

    Podestà, M.; Gorelenkova, M.; Darrow, D. S.; ...

    2015-04-17

    One of the primary tools foreseen for heating, current drive (CD) and q-profile control in future fusion reactors such as ITER and a Fusion Nuclear Science Facility is neutral beam injection (NBI). However, fast ions from NBI may also provide the drive for energetic particle-driven instabilities (e.g. Alfvénic modes (AEs)), which in turn redistribute fast ions in both space and energy, thus hampering the control capabilities and overall efficiency of NB-driven current. Based on experiments on the NSTX tokamak (M. Ono et al 2000 Nucl. Fusion 40 557), the effects of AEs and other low-frequency magneto-hydrodynamic instabilities on NB-CD efficiency are investigated. A new fast ion transport model, which accounts for particle transport in phase space as required for resonant AE perturbations, is utilized to obtain consistent simulations of NB-CD through the tokamak transport code TRANSP. It is found that instabilities do indeed reduce the NB-driven current density over most of the plasma radius by up to ~50%. Moreover, the details of the current profile evolution are sensitive to the specific model used to mimic the interaction between NB ions and instabilities. Finally, implications for fast ion transport modeling in integrated tokamak simulations are briefly discussed.

  1. Model-based Optimization and Feedback Control of the Current Density Profile Evolution in NSTX-U

    NASA Astrophysics Data System (ADS)

    Ilhan, Zeki Okan

    Nuclear fusion research is a highly challenging, multidisciplinary field seeking contributions from both plasma physics and multiple engineering areas. As an application of plasma control engineering, this dissertation mainly explores methods to control the current density profile evolution in the National Spherical Torus eXperiment-Upgrade (NSTX-U), a substantial upgrade of the NSTX device at the Princeton Plasma Physics Laboratory (PPPL), Princeton, NJ. Active control of the toroidal current density profile is among the plasma control milestones that the NSTX-U program must achieve to realize its next-step operational goals, which are characterized by high-performance, long-pulse, MHD-stable plasma operation with neutral beam heating. Therefore, the aim of this work is to develop model-based feedforward and feedback controllers that can enable regulation of the current density profile in NSTX-U by actuating the total plasma current, electron density, and the powers of the individual neutral beam injectors. Motivated by the coupled, nonlinear, multivariable, distributed-parameter plasma dynamics, the first step towards control design is the development of a physics-based, control-oriented model of the current profile evolution in NSTX-U in response to non-inductive current drives and heating systems. Numerical simulations of the proposed control-oriented model show qualitative agreement with the high-fidelity physics code TRANSP. The next step is to utilize the proposed control-oriented model to design an open-loop actuator trajectory optimizer. Given a desired operating state, the optimizer produces the actuator trajectories that can steer the plasma to that state.
The objective of the feedforward control design is to provide a more systematic approach to advanced scenario planning in NSTX-U since the development of such scenarios is conventionally carried out experimentally by modifying the tokamak's actuator trajectories and analyzing the resulting plasma evolution. Finally, the proposed control-oriented model is embedded in feedback control schemes based on optimal control and Model Predictive Control (MPC) approaches. Integrators are added to the standard Linear Quadratic Gaussian (LQG) and MPC formulations to provide robustness against various modeling uncertainties and external disturbances. The effectiveness of the proposed feedback controllers in regulating the current density profile in NSTX-U is demonstrated in closed-loop nonlinear simulations. Moreover, the optimal feedback control algorithm has been implemented successfully in closed-loop control simulations within TRANSP through the recently developed Expert routine. (Abstract shortened by ProQuest.).
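    The integrator-augmented LQG/MPC idea described above can be sketched briefly. A minimal sketch: the two-state model matrices, weights, and dimensions below are illustrative stand-ins, not the NSTX-U reduced model.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy 2-state linear system standing in for the reduced current-profile
# dynamics (A, B, C are illustrative values only).
A = np.array([[-0.5, 0.1], [0.0, -1.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])  # tracked output, e.g. one profile sample point

# Augment with an integrator state z_dot = r - y so the feedback removes
# steady-state tracking error, as in the LQG-with-integrators formulation.
n, m = A.shape[0], B.shape[1]
Aa = np.block([[A, np.zeros((n, 1))], [-C, np.zeros((1, 1))]])
Ba = np.vstack([B, np.zeros((1, m))])

Q = np.diag([1.0, 1.0, 10.0])  # weight the integrator state heavily
R = np.eye(m)

# Solve the continuous-time algebraic Riccati equation for the LQR gain.
P = solve_continuous_are(Aa, Ba, Q, R)
K = np.linalg.solve(R, Ba.T @ P)  # control law: u = -K @ [x; z]

# The closed-loop matrix should be Hurwitz (all eigenvalues in the left
# half plane), i.e. the augmented loop is stabilized.
eigs = np.linalg.eigvals(Aa - Ba @ K)
print(np.all(eigs.real < 0))
```

The same augmented plant could equally be handed to an MPC formulation; the integrator trick is identical in both cases.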

  2. Design and simulation of control algorithms for stored energy and plasma current in non-inductive scenarios on NSTX-U

    NASA Astrophysics Data System (ADS)

    Boyer, Mark; Andre, Robert; Gates, David; Gerhardt, Stefan; Menard, Jonathan; Poli, Francesca

    2015-11-01

    One of the major goals of NSTX-U is to demonstrate non-inductive operation. To facilitate this and other program goals, the center stack has been upgraded and a second neutral beam line has been added with three sources aimed more tangentially to provide higher current drive efficiency and the ability to shape the current drive profile. While non-inductive start-up and ramp-up scenarios are being developed, initial non-inductive studies will likely rely on clamping the Ohmic coil current after the plasma current has been established inductively. In this work the ability to maintain control of stored energy and plasma current once the Ohmic coil has been clamped is explored. The six neutral beam sources and the mid-plane outer gap of the plasma are considered as actuators. System identification is done using TRANSP simulations in which the actuators are modulated around a reference shot. The resulting reduced model is used to design an optimal control law with anti-windup and a recently developed framework for closed loop simulations in TRANSP is used to test the control. Limitations due to actuator saturation are assessed and robustness to beam modulation, changes in the plasma density and confinement, and changes in density and temperature profile shapes are studied. Supported by US DOE contract DE-AC02-09CH11466.
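    The system-identification step described above (modulating the actuators around a reference shot and fitting a reduced model) can be illustrated with a minimal ARX least-squares fit; a first-order surrogate plant stands in for the TRANSP runs, and the coefficients, modulation waveform, and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate "modulation experiment": first-order stored-energy response to
# a modulated beam power, y[k+1] = a*y[k] + b*u[k] (a, b illustrative).
a_true, b_true = 0.9, 0.5
u = np.sign(np.sin(0.3 * np.arange(200)))  # square-wave beam modulation
y = np.zeros(201)
for k in range(200):
    y[k + 1] = a_true * y[k] + b_true * u[k] + 0.01 * rng.standard_normal()

# ARX fit: regress y[k+1] on the stacked regressors [y[k], u[k]].
Phi = np.column_stack([y[:-1], u])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
a_hat, b_hat = theta
print(round(a_hat, 2), round(b_hat, 2))
```

The recovered (a_hat, b_hat) define the reduced model on which a control law with anti-windup would then be designed.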

  3. Feasibility study of ECRH in NSTX-U startup plasma

    NASA Astrophysics Data System (ADS)

    Lopez, N. A.; Poli, F.; Taylor, G.; Harvey, R.; Petrov, Yu.

    2016-10-01

    A key mission goal of the National Spherical Torus eXperiment Upgrade (NSTX-U) is the demonstration of fully non-inductive startup and operation. In part to accomplish this, a 1MW, 28 GHz ECRH system is presently being developed for implementation on NSTX-U in 2018. Like most spherical tokamaks, NSTX-U operates in the overdense regime (fpe>fce) , which limits traditional ECRH to the early startup phase. An extensive modelling effort of the propagation and absorption of EC waves in the evolving plasma is thus required to define the most effective window of operation, and to optimize the launcher geometry for maximal heating and for current drive during this window. In fact, the ECRH system will play an important role in preparing a target plasma for subsequent injection of IC waves and NBI. Here we assess the feasibility of O1-mode ECRH in NSTX-U startup plasma at full field of 1T through time-dependent simulations performed with the transport solver TRANSP. Linear ray-tracing calculations conducted by GENRAY are coupled into the TRANSP framework, allowing the plasma equilibrium and the temperature profiles to evolve self-consistently in response to the injected microwave power. Furthermore, we investigate additional possibilities of heating and current drive made available through coupling the injected O-mode power to the electrostatic EBW via the slow X-mode as an intermediary.
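    The overdense condition fpe > fce quoted above can be checked numerically from textbook formulas; the density below is an assumed startup value, not simulation output.

```python
import math

e = 1.602176634e-19      # elementary charge, C
m_e = 9.1093837015e-31   # electron mass, kg
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

def f_ce(B):
    """Electron cyclotron frequency eB/(2*pi*m_e), in Hz."""
    return e * B / (2 * math.pi * m_e)

def f_pe(n_e):
    """Electron plasma frequency sqrt(n_e e^2 / (eps0 m_e)) / (2*pi), Hz."""
    return math.sqrt(n_e * e**2 / (eps0 * m_e)) / (2 * math.pi)

B, n_e = 1.0, 1e19  # full field 1 T; assumed startup density, m^-3
print(f"f_ce = {f_ce(B)/1e9:.1f} GHz, f_pe = {f_pe(n_e)/1e9:.1f} GHz")
print("overdense:", f_pe(n_e) > f_ce(B))
```

At 1 T the fundamental cyclotron frequency is ~28 GHz, matching the gyrotron, and even a modest 1e19 m^-3 density already pushes fpe above it, which is why O1-mode ECRH is confined to the early startup phase.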

  4. Fourth Year Status Report. Computerized Training Systems Project. Project ABACUS.

    DTIC Science & Technology

    1976-08-01

    Performing organization: US Army Training Support Center. This report documents activities that transpired during the fourth year of Project ABACUS, the Army's program for the development of a prototype Computerized Training System.

  5. Engineer Design of a Mono-Mooring System.

    DTIC Science & Technology

    1966-01-01

    swivels. When asked whether the bogie rails were machined by a large radius boring mill, Mr. Coombe indicated that these rails are rolled then welded... lifted aboard the transport vessel, the disposition of the various system components shall be as follows: 1. Buoy shall be complete, with... tugboat, equipped with towing winch or power capstan, LOA 110'-120', twin screw; BHP-1000 minimum. 7. It has been assumed that welding machines

  6. Simulation of the hybrid and steady state advanced operating modes in ITER

    NASA Astrophysics Data System (ADS)

    Kessel, C. E.; Giruzzi, G.; Sips, A. C. C.; Budny, R. V.; Artaud, J. F.; Basiuk, V.; Imbeaux, F.; Joffrin, E.; Schneider, M.; Murakami, M.; Luce, T.; St. John, Holger; Oikawa, T.; Hayashi, N.; Takizuka, T.; Ozeki, T.; Na, Y.-S.; Park, J. M.; Garcia, J.; Tucillo, A. A.

    2007-09-01

    Integrated simulations are performed to establish a physics basis, in conjunction with present tokamak experiments, for the operating modes in the International Thermonuclear Experimental Reactor (ITER). Simulations of the hybrid mode are done using both fixed and free-boundary 1.5D transport evolution codes including CRONOS, ONETWO, TSC/TRANSP, TOPICS and ASTRA. The hybrid operating mode is simulated using the GLF23 and CDBM05 energy transport models. The injected powers are limited to the negative ion neutral beam, ion cyclotron and electron cyclotron heating systems. Several plasma parameters and source parameters are specified for the hybrid cases to provide a comparison of 1.5D core transport modelling assumptions and source physics modelling assumptions, as well as numerous peripheral physics models. Initial results indicate that very strict guidelines will need to be imposed on the application of GLF23, for example, to make useful comparisons. Some of the variations among the simulations are due to source models, which vary widely among the codes used. In addition, there are a number of peripheral physics models that should be examined, some of which include fusion power production, bootstrap current, treatment of fast particles and treatment of impurities. The hybrid simulations project to fusion gains of 5.6-8.3, βN values of 2.1-2.6 and fusion powers ranging from 350 to 500 MW, under the assumptions outlined in section 3. Simulations of the steady state operating mode are done with the same 1.5D transport evolution codes cited above, except the ASTRA code. In these cases the energy transport model is more difficult to prescribe, so the energy confinement models range from theory based to empirically based. The injected powers include the same sources as used for the hybrid, with the possible addition of lower hybrid.
The simulations of the steady state mode project to fusion gains of 3.5-7, βN values of 2.3-3.0 and fusion powers of 290 to 415 MW, under the assumptions described in section 4. These simulations will be presented and compared with particular focus on the resulting temperature profiles, source profiles and peripheral physics profiles. The steady state simulations are at an early stage and are focused on developing a range of safety factor profiles with 100% non-inductive current.
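    As a quick consistency check on the quoted projections, the auxiliary heating power implied by Q = P_fus/P_aux can be backed out; pairing the low power with the low gain of each range is my assumption, since the abstract quotes the ranges independently.

```python
# Fusion gain definition: Q = P_fus / P_aux, so P_aux = P_fus / Q.
# (P_fus in MW and Q taken from the quoted ranges; the pairings are
# assumptions for illustration.)
cases = {
    "hybrid, low end": (350.0, 5.6),
    "hybrid, high end": (500.0, 8.3),
    "steady-state, low end": (290.0, 3.5),
    "steady-state, high end": (415.0, 7.0),
}
for name, (p_fus_mw, q_gain) in cases.items():
    p_aux = p_fus_mw / q_gain
    print(f"{name}: implied P_aux ≈ {p_aux:.0f} MW")
```

Both hybrid endpoints imply roughly 60 MW of injected power, consistent with restricting the sources to the NBI, IC and EC systems.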

  7. European Service Module-Structural Test Article Load onto Transp

    NASA Image and Video Library

    2017-06-21

    The Orion service module structural test article for Exploration Mission-1 (EM-1), built by the European Space Agency, is prepared for shipment to Lockheed Martin's Denver facility to undergo testing. Inside the Neil Armstrong Operations and Checkout Building high bay at NASA's Kennedy Space Center in Florida, workers secure the protective covering around the module, and a crane lifts the module, secured on a stand, for the move to the transport truck. The Orion spacecraft will launch atop the agency's Space Launch System rocket on EM-1 in 2019.

  8. Transport and Performance in DIII--D Discharges with Weak or Negative Central Magnetic Shear

    NASA Astrophysics Data System (ADS)

    Greenfield, C. M.

    1996-11-01

    The previously reported [B.W. Rice et al., Phys. Plasmas 3, 1983 (1996)] improved performance in DIII-D plasmas with weak or negative central magnetic shear has been additionally enhanced in recent experiments where controlled L-H transitions were used to further broaden the pressure profile and delay detrimental MHD activity [E.A. Lazarus et al., submitted to Phys. Rev. Letters]. These discharges exhibit the highest plasma energy (>= 4 MJ) and fusion reactivity (R_DD <= 4.8 × 10^16 s^-1, Q_DD <= 0.00146, equivalent Q_DT <= 0.32) yet realized in DIII-D. In such discharges, the core magnetic shear is reversed by tailoring the current profile through application of early, low power, neutral beam injection. These plasmas often undergo a transition to a high performance state, usually following an increase in the applied heating power. At the transition time, we observe the formation of an internal transport barrier near the location of the minimum safety factor, q_min. Formation of this barrier, which can result in central peaking of the temperature and density profiles, is consistent with suppression of turbulence by locally enhanced E×B flow shear. Beam emission spectroscopy and far infrared scattering measurements made in the vicinity of the barrier show that at the time of transition to high performance, fluctuation levels are reduced to below the threshold of detection (ñ/n <= 0.1%). Analysis with the ONETWO and TRANSP transport codes indicates concomitant reductions in the core ion thermal diffusivity to levels at or below Chang-Hinton neoclassical. Smaller reductions are indicated for the electrons. An L-H transition is programmed shortly before the plasma would become MHD unstable in order to broaden the profiles and delay the onset of instabilities. In the resulting state, the region exhibiting ion diffusivities at or below neoclassical is extended to nearly the entire plasma. Analysis to date suggests that the effect of strongly negative vs. 
weak magnetic shear on transport is negligible, although there is a significant effect on stability. A comparison of transport in strong and weakly sheared discharges will be shown, both in L- (peaked profiles) and H-mode (broadened profiles).

  9. Optimization of the extraction of the p-menthadienol isomers and aristolone contained in the essential oil from Elyonurus hensii using a 23 full factorial design.

    PubMed

    Loumouamou, Aubin Nestor; Bikindou, Kévin; Bitemou, Ernest; Chalard, Pierre; Silou, Thomas; Figueredo, Gilles

    2017-05-01

    The aim of this study was to optimize the extraction of p-menthadienol isomers and aristolone from the essential oil of Elyonurus hensii by hydrodistillation. The study of the seasonal variation in the chemical composition has shown that the plant material has been subject to a natural selection regarding the biosynthesis of the p-menthadienol isomers: during periods of water stress, the extracts are rich in cis- and trans-p-mentha-1(7),8-dien-2-ol and poor in cis- and trans-p-mentha-2,8-dien-1-ol. Regarding the modeling, eight experiments were carried out by considering three easily interpretable factors (the extraction duration, the residual water content and the state of division of the plant material). The average yield was 1.33% for the aerial part and 0.74% for the roots. The residual water content is the most important factor, which significantly influences the average yield of the essential oil and the content of the major constituents. Regarding the aerial part, a low residual water content of the plant material varies the essential oil yield (from 0.40% to 2.11%) and the content of cis- and trans-p-mentha-2,8-dien-1-ol (from 15.87% to 23.24%). At the root level, the samples that have a very low residual water content provide extracts richer in aristolone. The combined effects of the extraction duration, the state of division, and the residual water content greatly influence the extraction of aristolone (from 36.68% to 54.55%). However, these interactions are more complex and difficult to assess.
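    The main-effect arithmetic behind a 2^3 full factorial design (eight runs, three factors at two coded levels) can be sketched as follows; the coded factors mirror the three factors named above, but the yield values are synthetic, not the paper's measurements.

```python
import numpy as np
from itertools import product

# All 8 runs of a 2^3 design: coded levels ±1 for (duration, residual
# water content, state of division). Yields are synthetic percentages.
levels = np.array(list(product([-1, 1], repeat=3)))
y = np.array([0.4, 0.6, 0.9, 1.2, 1.4, 1.6, 2.0, 2.1])

# Main effect of factor j = mean(yield at +1) - mean(yield at -1).
effects = {
    f"factor_{j}": y[levels[:, j] == 1].mean() - y[levels[:, j] == -1].mean()
    for j in range(3)
}
print(effects)
```

Interaction effects are obtained the same way using products of the coded columns, which is how the combined duration/division/water effects would be assessed.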

  10. Determination of hydrogen/deuterium ratio with neutron measurements on MAST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klimek, I., E-mail: iwona.klimek@physics.uu.se; Cecconello, M.; Ericsson, G.

    2014-11-15

    On MAST, compressional Alfvén eigenmodes can be destabilized by the presence of a sufficiently large population of energetic particles in the plasma. This dependence was studied in a series of very similar discharges in which increasing amounts of hydrogen were puffed into a deuterium plasma. A simple method to estimate the isotopic ratio n_H/n_D using neutron emission measurements is here described. The inferred isotopic ratio ranged from 0.0 to 0.6, and no experimental indication of changes in the radial profile of n_H/n_D was observed. These findings are confirmed by TRANSP/NUBEAM simulations of the neutron emission.
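    A toy version of the dilution argument behind such an estimate: for a quasi-neutral H+D plasma at fixed electron density, the beam-target neutron rate scales roughly with n_D, so comparing against a pure-deuterium reference shot yields n_H/n_D. This linear scaling is a deliberate simplification; the paper relies on TRANSP/NUBEAM modelling rather than this formula.

```python
def isotopic_ratio(rate, rate_pure_d):
    """n_H/n_D, assuming neutron rate ∝ n_D at fixed electron density."""
    n_d_fraction = rate / rate_pure_d   # n_D/n_e relative to the reference
    return 1.0 / n_d_fraction - 1.0     # quasi-neutrality: n_H = n_e - n_D

# Example: a 40% drop in neutron rate relative to the pure-D shot.
print(round(isotopic_ratio(0.6, 1.0), 2))
```
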

  11. Initial transport validation studies using NSTX-U L-mode plasmas

    NASA Astrophysics Data System (ADS)

    Guttenfelder, Walter; Battaglia, D.; Bell, R. E.; Boyer, M. D.; Crocker, N.; Diallo, A.; Ferraro, N.; Gerhardt, S. P.; Kaye, S. M.; Leblanc, B. P.; Liu, D.; Menard, J. E.; Mueller, D.; Myer, C.; Podesta, M.; Raman, R.; Ren, Y.; Sabbagh, S.; Smith, D.

    2016-10-01

    A variety of stationary L-mode plasmas have been successfully developed in NSTX-U for physics validation studies. The plasmas span a range of density (1-4 ×1019 m-3) , plasma current (0.65-1.0 MA), and neutral beam heating power (<=4 MW), taking advantage of new, more tangential neutral beam sources to vary rotation profiles. Transport analysis (TRANSP) and turbulence measurements (BES, reflectometry) of these plasmas will be illustrated and compared with initial microstability and transport predictions. In particular, the normalized beta of these L-modes ranges between βN = 1-2, providing a valuable bridge in parameter space between (i) H-modes at comparable beta in conventional tokamaks (R/a ~ 3, βN ~ 2), where transport models have been largely developed and tested, and (ii) low-aspect-ratio H-modes at higher beta (R/a ~ 1.5-1.7, βN ~ 5), where transport models are less tested and challenged by stronger electromagnetic and equilibrium effects. This work is supported by US DOE contract DE-AC02-09CH11466.

  12. The role of turbulence suppression in the triggering of ITBs on C-Mod

    NASA Astrophysics Data System (ADS)

    Zhurovich, K.; Fiore, C. L.; Ernst, D. R.; Bonoli, P. T.; Greenwald, M. J.; Hubbard, A. E.; Hughes, J. W.; Marmar, E. S.; Mikkelsen, D. R.; Phillips, P.; Rice, J. E.

    2007-11-01

    Internal transport barriers can be routinely produced in C-Mod steady EDA H-mode plasmas by applying ICRF at |r/a|>= 0.5. Access to the off-axis ICRF heated ITBs may be understood within the paradigm of marginal stability. Analysis of the Te profiles shows a decrease of R/LTe in the ITB region as the RF resonance is moved off axis. Ti profiles broaden as the ICRF power deposition changes from on-axis to off-axis. TRANSP calculations of the Ti profiles support this trend. Linear GS2 calculations do not reveal any difference in ETG growth rate profiles for ITB vs. non-ITB discharges. However, they do show that the region of stability to ITG modes widens as the ICRF resonance is moved outward. Non-linear simulations show that the outward turbulent particle flux exceeds the Ware pinch by a factor of 2 in the outer plasma region. Reducing the temperature gradient significantly decreases the diffusive flux and allows the Ware pinch to peak the density profile. Details of these experiments and simulations will be presented.

  13. Alfvén cascades in JET discharges with NBI-heating

    NASA Astrophysics Data System (ADS)

    Sharapov, S. E.; Alper, B.; Baranov, Yu. F.; Berk, H. L.; Borba, D.; Boswell, C.; Breizman, B. N.; Challis, C. D.; de Baar, M.; DeLa Luna, E.; Evangelidis, E. A.; Hacquin, S.; Hawkes, N. C.; Kiptily, V. G.; Pinches, S. D.; Sandquist, P.; Voitsekhovich, I.; Young, N. P.; Contributors, JET-EFDA

    2006-10-01

    Alfvén cascade (AC) eigenmodes excited by energetic ions accelerated with ion-cyclotron resonance heating in JET reversed-shear discharges are studied experimentally in high-density plasmas fuelled by neutral beam injection (NBI) and by deuterium pellets. The recently developed O-mode interferometry technique and Mirnov coils are employed for detecting ACs. The spontaneous improvements in plasma confinement (internal transport barrier (ITB) triggering events) and grand ACs are found to correlate within 0.2 s in JET plasmas with densities up to ~5 × 1019 m-3. Measurements with high time resolution show that ITB triggering events happen before 'grand' ACs in the majority of JET discharges, indicating that this improvement in confinement is likely to be associated with the decrease in the density of rational magnetic surfaces just before qmin(t) passes an integer value. Experimentally observed ACs excited by sub-Alfvénic NBI-produced ions with parallel velocities as low as V||NBI ≈ 0.2·VA are found to be most likely associated with the geodesic acoustic effect that significantly modifies the shear-Alfvén dispersion relation at low frequency. Experiments were performed with a tritium NBI-blip (short time pulse) into JET plasmas with NBI-driven ACs. Although considerable NBI-driven AC activity was present, good agreement was found both in the radial profile and in the time evolution of DT neutrons between the neutron measurements and the TRANSP code modelling based on the Coulomb collision model, indicating the ACs have at most a small effect on fast particle confinement in this case.

  14. Modeling of potential TAE-induced beam ion loss from NSTX-U plasmas

    NASA Astrophysics Data System (ADS)

    Darrow, Douglass; Fredrickson, Eric; Podesta, Mario; White, Roscoe; Liu, Deyong

    2015-11-01

    NSTX-U will add three additional neutral beam sources, whose tangency radii of 1.1, 1.2, and 1.3 m are significantly larger than the 0.5, 0.6, and 0.7 m tangency radii of the neutral beams previously used in NSTX. These latter beams will also be used in NSTX-U. Here, we attempt to formulate an estimate of the propensity of the beam ions from all the various sources to be lost under a range of NSTX-U plasma conditions. This estimation is based upon TRANSP calculations of beam ion deposition in phase space, and the location of the FLR-corrected loss boundary in that phase space. Since TAEs were a prominent driver of beam ion loss in NSTX, we incorporate their effects through the following process: NOVA modeling of TAEs in the anticipated NSTX-U plasma conditions gives the mode numbers, frequencies, and mode structures that are likely to occur. Using this information as inputs to the guiding center ORBIT code, it is possible to find resonant surfaces in the same phase space along which beam ions would be able to diffuse under the influence of the modes. The degree to which these resonant surfaces intersect both the beam deposition volume and the orbit loss boundary should then give a sense of the propensity of that beam population to be lost from the plasma. Work supported by US DOE contracts DE-AC02-09CH11466, DE-FG02-06ER54867, and DE-FG03-02ER54681.

  15. Space-Group Symmetries Generate Chaotic Fluid Advection in Crystalline Granular Media

    NASA Astrophysics Data System (ADS)

    Turuban, R.; Lester, D. R.; Le Borgne, T.; Méheust, Y.

    2018-01-01

    The classical connection between symmetry breaking and the onset of chaos in dynamical systems harks back to the seminal theory of Noether [Transp. Theory Statist. Phys. 1, 186 (1971), 10.1080/00411457108231446]. We study the Lagrangian kinematics of steady 3D Stokes flow through simple cubic and body-centered cubic (bcc) crystalline lattices of close-packed spheres, and uncover an important exception. While breaking of point-group symmetries is a necessary condition for chaotic mixing in both lattices, a further space-group (glide) symmetry of the bcc lattice generates a transition from globally regular to globally chaotic dynamics. This finding provides new insights into chaotic mixing in porous media and has significant implications for understanding the impact of symmetries upon generic dynamical systems.

  16. OMFIT Tokamak Profile Data Fitting and Physics Analysis

    DOE PAGES

    Logan, N. C.; Grierson, B. A.; Haskey, S. R.; ...

    2018-01-22

    Here, One Modeling Framework for Integrated Tasks (OMFIT) has been used to develop a consistent tool for interfacing with, mapping, visualizing, and fitting tokamak profile measurements. OMFIT is used to integrate the many diverse diagnostics on multiple tokamak devices into a regular data structure, consistently applying spatial and temporal treatments to each channel of data. Tokamak data are fundamentally time dependent and are treated so from the start, with front-loaded and logic-based manipulations such as filtering based on the identification of edge-localized modes (ELMs) that commonly scatter data. Fitting is general in its approach, and tailorable in its application in order to address physics constraints and handle the multiple spatial and temporal scales involved. Although community standard one-dimensional fitting is supported, including scale length fitting and fitting polynomial-exponential blends to capture the H-mode pedestal, OMFITprofiles includes two-dimensional (2-D) fitting using bivariate splines or radial basis functions. These 2-D fits produce regular evolutions in time, removing jitter that has historically been smoothed ad hoc in transport applications. Profiles interface directly with a wide variety of models within the OMFIT framework, providing the inputs for TRANSP, kinetic-EFIT 2-D equilibrium, and GPEC three-dimensional equilibrium calculations. The OMFITprofiles tool's rapid and comprehensive analysis of dynamic plasma profiles thus provides the critical link between raw tokamak data and simulations necessary for physics understanding.
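    A minimal sketch of 2-D profile smoothing with radial basis functions, in the spirit of the bivariate fits described above; the (radius, time) profile data are synthetic, and SciPy's generic RBFInterpolator stands in for the OMFITprofiles implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)

# Synthetic scattered Te(rho, t) samples: a parabolic-ish profile that
# slowly steepens in time, plus measurement noise.
rho = rng.uniform(0, 1, 400)   # normalized radius of each sample
t = rng.uniform(0, 1, 400)     # (normalized) time of each sample
te = (1 - rho**2) * (1 + 0.2 * t) + 0.05 * rng.standard_normal(400)

# Fit one smooth 2-D surface over (rho, t); smoothing > 0 regularizes the
# fit so the time evolution comes out jitter-free.
pts = np.column_stack([rho, t])
fit = RBFInterpolator(pts, te, smoothing=1.0, kernel="thin_plate_spline")

# Evaluate on a regular (rho, t) grid, as a transport code would consume it.
rr, tt = np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 5))
te_fit = fit(np.column_stack([rr.ravel(), tt.ravel()])).reshape(5, 21)
print(te_fit.shape)
```

Because every time slice comes from the same smooth surface, adjacent slices vary regularly, which is the practical benefit the abstract attributes to the 2-D approach.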

  18. Non-inductive Hybrid Scenario-Transport and Turbulence at Reduced Plasma Torque

    NASA Astrophysics Data System (ADS)

    Thome, K. E.; Petty, C. C.; Pace, D. C.; Turco, F.; Rhodes, T. L.

    2016-10-01

    As the neutral beam injection (NBI) torque is lowered in steady-state hybrid plasmas via counter-beam injection, increased turbulence and thermal transport are observed, particularly in the ion channel. These discharges require Pco-NBI = 11 MW and PECH = 3 MW to achieve zero surface loop voltage. As the beam torque is reduced from 8.5 N-m to 4 N-m with βN ~ 3 and q95 ~ 6, the global confinement H98(y,2) decreases from 1.5 to 1.2. Local transport analysis using TRANSP shows that the lower torque discharges have increased ion thermal diffusivity across the whole profile and increased electron thermal diffusivity localized to the ρ = 0.7 region. Similarly, Doppler backscattering shows increased density fluctuations at intermediate wavenumbers at the lower torque. However, fast-ion transport caused by off-axis fishbones favorably decreases from 0.7 m²/s to 0.1 m²/s as the torque is lowered, partially offsetting the increased thermal transport. These measured changes in turbulence and transport are being compared to plasma simulations using TGLF/GYRO to better predict the confinement of future steady-state hybrids that will be primarily RF-heated. Work supported by the US DOE under DE-FC02-04ER54698.

  19. Observation of Alpha-Driven TAEs in TFTR

    NASA Astrophysics Data System (ADS)

    Nazikian, R.; Chang, Z.; Fu, G. Y.; Majeski, R.; Batha, S.; Bell, M.; Budny, R.; Cheng, C. Z.; Darrow, D. S.; Duong, H.; Efthimion, P. C.; Fredrickson, E.; Levinton, F.; Mazzucato, E.; Medley, S.; Taylor, S.; Zweben, G.

    1996-11-01

    Transient mode activity in the TAE range of frequencies (150-170 kHz) with toroidal mode numbers n = 2, 3 is observed in reduced magnetic shear DT discharges on TFTR with a fusion power threshold of ~1.5 MW. Mode activity appears 50-100 msec after NBI in discharges with the following machine parameters: B = 5.3 T, I = 1.6 MA, R = 260 cm, P_NBI = 25-28 MW, q(0) > 2.0 from MSE and toroidal beta β_T < 1%. The elevated q(0) and reduced central shear |s| < 0.2 are achieved using a full size plasma startup with delayed NBI. Theoretical calculations using NOVA-K indicate that the combined effect of low shear, low beta and elevated q(0) leads to a very low instability threshold for the alpha-driven TAE with β_α(0) ~ 10^-4. This appears to be consistent with experimental observations of mode activity in DT plasmas with β_α ~ 10^-4 (determined from TRANSP analysis). Thus far the modes have only been observed on Mirnov coils with fluctuation levels B̃ ~ 1 mG. Efforts to determine mode location by perturbing the edge density and inducing strong toroidal velocity shear will be reported, as will efforts to affect mode stability by systematically varying the central safety factor.

  20. Observations Regarding Use of Advanced CFD Analysis, Sensitivity Analysis, and Design Codes in MDO

    NASA Technical Reports Server (NTRS)

    Newman, Perry A.; Hou, Gene J. W.; Taylor, Arthur C., III

    1996-01-01

    Observations regarding the use of advanced computational fluid dynamics (CFD) analysis, sensitivity analysis (SA), and design codes in gradient-based multidisciplinary design optimization (MDO) reflect our perception of the interactions required of CFD and our experience in recent aerodynamic design optimization studies using CFD. Sample results from these latter studies are summarized for conventional optimization (analysis - SA codes) and simultaneous analysis and design optimization (design code) using both Euler and Navier-Stokes flow approximations. The amount of computational resources required for aerodynamic design using CFD via analysis - SA codes is greater than that required for design codes. Thus, an MDO formulation that utilizes the more efficient design codes where possible is desired. However, in the aerovehicle MDO problem, the various disciplines that are involved have different design points in the flight envelope; therefore, CFD analysis - SA codes are required at the aerodynamic 'off design' points. The suggested MDO formulation is a hybrid multilevel optimization procedure that consists of both multipoint CFD analysis - SA codes and multipoint CFD design codes that perform suboptimizations.

  1. Transportation Lines on the Atlantic, Gulf, and Pacific Coasts, Transportation Series 5, 1980.

    DTIC Science & Technology

    1982-05-01

    [OCR of a scanned directory page; the text is illegible apart from the heading of Table 1, "Alphabetical List of Transportation Lines", listing operator names and addresses.]

  2. On iterative algorithms for quantitative photoacoustic tomography in the radiative transport regime

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Zhou, Tie

    2017-11-01

    In this paper, we present a numerical reconstruction method for quantitative photoacoustic tomography (QPAT), based on the radiative transfer equation (RTE), which models light propagation more accurately than diffusion approximation (DA). We investigate the reconstruction of absorption coefficient and scattering coefficient of biological tissues. An improved fixed-point iterative method to retrieve the absorption coefficient, given the scattering coefficient, is proposed for its cheap computational cost; the convergence of this method is also proved. The Barzilai-Borwein (BB) method is applied to retrieve two coefficients simultaneously. Since the reconstruction of optical coefficients involves the solutions of original and adjoint RTEs in the framework of optimization, an efficient solver with high accuracy is developed from Gao and Zhao (2009 Transp. Theory Stat. Phys. 38 149-92). Simulation experiments illustrate that the improved fixed-point iterative method and the BB method are competitive methods for QPAT in the relevant cases.
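    The fixed-point update for the absorption coefficient can be illustrated in one dimension: the data are H = μ_a·Φ(μ_a), so one iterates μ^(k+1) = H / Φ(μ^k). Here a Beer-Lambert fluence model stands in for the RTE solve used in the paper, and all coefficient values are invented.

```python
import math

# Synthetic "photoacoustic measurement" H = mu_a * Phi(mu_a) at one point,
# with a 1-D Beer-Lambert fluence Phi(mu) = phi0 * exp(-mu * x) standing
# in for a full radiative-transfer forward solve.
x, phi0 = 0.5, 1.0
mu_true = 0.8
H = mu_true * phi0 * math.exp(-mu_true * x)

mu = 0.1  # initial guess for the absorption coefficient
for _ in range(50):
    phi = phi0 * math.exp(-mu * x)  # forward model at the current iterate
    mu = H / phi                    # fixed-point update mu <- H / Phi(mu)
print(round(mu, 6))
```

For this toy model the map contracts whenever x·μ_true < 1, so the iteration recovers μ_true; proving an analogous contraction for the RTE forward map is the content of the convergence result cited above.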

  3. Probing spherical tokamak plasmas using charged fusion products

    NASA Astrophysics Data System (ADS)

    Boeglin, Werner U.; Perez, Ramona V.; Darrow, Douglass S.; Cecconello, Marco; Klimek, Iwona; Allan, Scott Y.; Akers, Rob J.; Jones, Owen M.; Keeling, David L.; McClements, Ken G.; Scannell, Rory

    2015-11-01

    The detection of charged fusion products, such as protons and tritons resulting from D(d,p)t reactions, can be used to determine the fusion reaction rate profile in large spherical tokamak plasmas with neutral beam heating. The time resolution of a diagnostic of this type makes it possible to study the slowly-varying beam density profile, as well as rapid changes resulting from MHD instabilities. A 4-channel prototype proton detector (PD) was installed and operated on the MAST spherical tokamak in August/September 2013, and a new 6-channel system for the NSTX-U spherical tokamak is under construction. PD and neutron camera measurements obtained on MAST will be compared with TRANSP calculations, and the design of the new NSTX-U system will be presented, together with the first results from this diagnostic, if available. Supported in part by DOE DE-SC0001157.

  4. Electron critical gradient scale length measurements of ICRF heated L-mode plasmas at Alcator C-Mod tokamak

    NASA Astrophysics Data System (ADS)

    Houshmandyar, S.; Hatch, D. R.; Horton, C. W.; Liao, K. T.; Phillips, P. E.; Rowan, W. L.; Zhao, B.; Cao, N. M.; Ernst, D. R.; Greenwald, M.; Howard, N. T.; Hubbard, A. E.; Hughes, J. W.; Rice, J. E.

    2018-04-01

    A profile for the critical gradient scale length (Lc) has been measured in L-mode discharges at the Alcator C-Mod tokamak, in which electrons were heated by ion cyclotron range of frequencies (ICRF) minority heating with the intention of simultaneously varying the heat flux and changing the local gradient. The inverse electron temperature gradient scale length (1/LTe = |∇Te|/Te) profile was measured via the BT-jog technique [Houshmandyar et al., Rev. Sci. Instrum. 87, 11E101 (2016)] and compared with the electron heat flux from power balance (TRANSP) analysis. The Te profiles were found to be very stiff and already above the critical values; however, the stiffness was reduced near the q = 3/2 surface. The measured Lc profile is in agreement with electron temperature gradient (ETG) models which predict the dependence of 1/Lc on the local Zeff, Te/Ti, and the ratio of the magnetic shear to the safety factor. The results from linear Gene gyrokinetic simulations suggest ETG to be the dominant mode of turbulence at the electron scale (k⊥ρs > 1), and ion temperature gradient/trapped electron modes at the ion scale (k⊥ρs < 1). The measured Lc profile is in agreement with the profile of ETG critical gradients deduced from Gene simulations.

  5. Control and data acquisition upgrades for NSTX-U

    DOE PAGES

    Davis, W. M.; Tchilinguirian, G. J.; Carroll, T.; ...

    2016-06-06

    The extensive NSTX Upgrade (NSTX-U) Project includes major components which allow a doubling of the toroidal field strength to 1 T, of the Neutral Beam heating power to 12 MW, and of the plasma current to 2 MA, plus substantial structural enhancements to withstand the increased electromagnetic loads. The maximum pulse length will go from 1.5 to 5 s. The coils will be protected against the larger and more complex forces by a Digital Coil Protection System, which requires demanding real-time data input rates, calculations, and responses. The amount of conventional digitized data for a given pulse is expected to increase from 2.5 to 5 GB per second of pulse. 2-D fast camera data is expected to go from 2.5 to 10 GB/pulse, and another 2 GB/pulse is expected from new IR cameras. Network capacity will be increased by a factor of 10, with 10 Gb/s fibers used for the major trunks. 32-core Linux systems will be used for several functions, including between-shot data processing, MDSplus data serving, between-shot EFIT analysis, real-time processing, and, as a new capability, between-shot TRANSP. As a result, improvements to the MDSplus event subsystem will be made through the use of both UDP- and TCP/IP-based methods and the addition of a dedicated “event server”.

  6. X-Antenna: A graphical interface for antenna analysis codes

    NASA Technical Reports Server (NTRS)

    Goldstein, B. L.; Newman, E. H.; Shamansky, H. T.

    1995-01-01

    This report serves as the user's manual for the X-Antenna code. X-Antenna is intended to simplify the analysis of antennas by giving the user graphical interfaces in which to enter all relevant antenna and analysis code data. Essentially, X-Antenna creates a Motif interface to the user's antenna analysis codes. A command-file allows new antennas and codes to be added to the application. The menu system and graphical interface screens are created dynamically to conform to the data in the command-file. Antenna data can be saved and retrieved from disk. X-Antenna checks all antenna and code values to ensure they are of the correct type, writes an output file, and runs the appropriate antenna analysis code. Volumetric pattern data may be viewed in 3D space with an external viewer run directly from the application. Currently, X-Antenna includes analysis codes for thin wire antennas (dipoles, loops, and helices), rectangular microstrip antennas, and thin slot antennas.

  7. Methodology for fast detection of false sharing in threaded scientific codes

    DOEpatents

    Chung, I-Hsin; Cong, Guojing; Murata, Hiroki; Negishi, Yasushi; Wen, Hui-Fang

    2014-11-25

    A profiling tool identifies a code region with a false sharing potential. A static analysis tool classifies variables and arrays in the identified code region. A mapping detection library correlates memory access instructions in the identified code region with variables and arrays in the identified code region while a processor is running the identified code region. The mapping detection library identifies one or more instructions at risk, in the identified code region, which are subject to an analysis by a false sharing detection library. A false sharing detection library performs a run-time analysis of the one or more instructions at risk while the processor is re-running the identified code region. The false sharing detection library determines, based on the performed run-time analysis, whether two different portions of the cache memory line are accessed by the generated binary code.
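    The run-time criterion in the last sentence, two threads touching different portions of one cache line, can be illustrated with a toy trace analysis. The trace format and 64-byte line size below are assumptions for illustration, not the patent's actual data structures:

```python
# Toy illustration of the false-sharing criterion: flag cache lines that
# multiple threads access at *different* offsets (64-byte lines assumed).
LINE = 64

def find_false_sharing(trace):
    """trace: iterable of (thread_id, byte_address) access records."""
    by_line = {}
    for tid, addr in trace:
        line, offset = addr // LINE, addr % LINE
        by_line.setdefault(line, set()).add((tid, offset))
    suspects = []
    for line, accesses in by_line.items():
        threads = {tid for tid, _ in accesses}
        offsets = {off for _, off in accesses}
        if len(threads) > 1 and len(offsets) > 1:  # shared line, distinct portions
            suspects.append(line)
    return suspects

# Two threads ping-ponging adjacent counters in one line get flagged;
# the line touched by only one thread does not.
trace = [(0, 4096), (1, 4100), (0, 4096), (1, 4100), (0, 8192)]
print(find_false_sharing(trace))  # → [64]
```

    A real detector, as the abstract describes, narrows the candidate set statically first and only instruments the instructions at risk, since tracing every access at run time would be prohibitively slow.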

  8. Network analysis for the visualization and analysis of qualitative data.

    PubMed

    Pokorny, Jennifer J; Norman, Alex; Zanesco, Anthony P; Bauer-Wu, Susan; Sahdra, Baljinder K; Saron, Clifford D

    2018-03-01

    We present a novel manner in which to visualize the coding of qualitative data that enables representation and analysis of connections between codes using graph theory and network analysis. Network graphs are created from codes applied to a transcript or audio file using the code names and their chronological location. The resulting network is a representation of the coding data that characterizes the interrelations of codes. This approach enables quantification of qualitative codes using network analysis and facilitates examination of associations of network indices with other quantitative variables using common statistical procedures. Here, as a proof of concept, we applied this method to a set of interview transcripts that had been coded in 2 different ways and the resultant network graphs were examined. The creation of network graphs allows researchers an opportunity to view and share their qualitative data in an innovative way that may provide new insights and enhance transparency of the analytical process by which they reach their conclusions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
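    The construction described above, a graph whose edges connect codes by their chronological locations in a transcript, can be sketched without any graph library. The chronological-adjacency edge rule below is an assumption for illustration; the paper's exact linking criterion may differ:

```python
from collections import Counter

def code_network(code_sequence):
    """Build a weighted, undirected edge list from qualitative codes in
    chronological order: each consecutive pair of distinct codes adds
    one to the weight of that edge."""
    edges = Counter()
    for a, b in zip(code_sequence, code_sequence[1:]):
        if a != b:
            edges[tuple(sorted((a, b)))] += 1
    return edges

# Hypothetical codes applied in order to one interview transcript.
transcript_codes = ["stress", "coping", "stress", "support", "coping"]
print(code_network(transcript_codes))
```

    The resulting edge weights are ordinary numbers, which is the point of the approach: network indices computed from them can be correlated with other quantitative variables using standard statistics.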

  9. Posttest analysis of the FFTF inherent safety tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padilla, A. Jr.; Claybrook, S.W.

    Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.

  10. GRABGAM Analysis of Ultra-Low-Level HPGe Gamma Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winn, W.G.

    The GRABGAM code has been used successfully for ultra-low level HPGe gamma spectrometry analysis since its development in 1985 at Savannah River Technology Center (SRTC). Although numerous gamma analysis codes existed at that time, reviews of institutional and commercial codes indicated that none addressed all features that were desired by SRTC. Furthermore, it was recognized that development of an in-house code would better facilitate future evolution of the code to address SRTC needs based on experience with low-level spectra. GRABGAM derives its name from Gamma Ray Analysis BASIC Generated At MCA/PC.

  11. Integrated Composite Analyzer (ICAN): Users and programmers manual

    NASA Technical Reports Server (NTRS)

    Murthy, P. L. N.; Chamis, C. C.

    1986-01-01

    The use of and relevant equations programmed in a computer code designed to carry out a comprehensive linear analysis of multilayered fiber composites is described. The analysis contains the essential features required to effectively design structural components made from fiber composites. The inputs to the code are constituent material properties, factors reflecting the fabrication process, and composite geometry. The code performs micromechanics, macromechanics, and laminate analysis, including the hygrothermal response of fiber composites. The code outputs are the various ply and composite properties, composite structural response, and composite stress analysis results with details on failure. The code is in Fortran IV and can be used efficiently as a package in complex structural analysis programs. The input-output format is described extensively through the use of a sample problem. The program listing is also included. The code manual consists of two parts.

  12. Reliable absolute analog code retrieval approach for 3D measurement

    NASA Astrophysics Data System (ADS)

    Yu, Shuang; Zhang, Jing; Yu, Xiaoyang; Sun, Xiaoming; Wu, Haibin; Chen, Deyun

    2017-11-01

    The wrapped phase of phase-shifting approach can be unwrapped by using Gray code, but both the wrapped phase error and Gray code decoding error can result in period jump error, which will lead to gross measurement error. Therefore, this paper presents a reliable absolute analog code retrieval approach. The combination of unequal-period Gray code and phase shifting patterns at high frequencies are used to obtain high-frequency absolute analog code, and at low frequencies, the same unequal-period combination patterns are used to obtain the low-frequency absolute analog code. Next, the difference between the two absolute analog codes was employed to eliminate period jump errors, and a reliable unwrapped result can be obtained. Error analysis was used to determine the applicable conditions, and this approach was verified through theoretical analysis. The proposed approach was further verified experimentally. Theoretical analysis and experimental results demonstrate that the proposed approach can perform reliable analog code unwrapping.
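    The Gray-code stage underlying the period identification can be sketched. The prefix-XOR decode below is the standard Gray-to-binary conversion; the paper's unequal-period variant and analog-code differencing are not reproduced here:

```python
def binary_to_gray(n):
    """Standard reflected binary (Gray) encoding."""
    return n ^ (n >> 1)

def gray_to_binary(g):
    """Decode a Gray-coded integer by prefix-XOR: each binary bit is
    the XOR of all Gray bits at or above its position."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

# Round-trip check over one period of a 4-bit code.
codes = [binary_to_gray(i) for i in range(16)]
assert all(gray_to_binary(c) == i for i, c in enumerate(codes))
# The property that makes Gray code attractive for projected patterns:
# consecutive codewords differ in exactly one bit, so a single
# mis-thresholded stripe boundary shifts the period index by at most one.
assert all(bin(a ^ b).count("1") == 1 for a, b in zip(codes, codes[1:]))
```

    The single-bit-change property is why decoding errors near stripe edges produce at most a one-period jump, which the paper's differencing of two absolute analog codes is then designed to catch.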

  13. A Semantic Analysis Method for Scientific and Engineering Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1998-01-01

    This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.

  14. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    NASA Astrophysics Data System (ADS)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions, and other properties. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
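    One of the properties the abstract mentions, comma-freeness, has a compact test: a codon set X is comma-free iff no codon of X appears at a frame-shifted position inside any concatenation of two codons of X. The sketch below implements that standard definition, not GCAT's actual (Java) implementation:

```python
def is_comma_free(codons):
    """X is comma-free iff for all a, b in X, the two frame-shifted
    trinucleotides inside the 6-letter word a+b are not themselves in X."""
    X = set(codons)
    assert all(len(c) == 3 for c in X), "expects trinucleotide codons"
    for a in X:
        for b in X:
            w = a + b
            if w[1:4] in X or w[2:5] in X:
                return False
    return True

print(is_comma_free({"ACG", "TAC"}))  # → True
print(is_comma_free({"AAA"}))         # → False: AAA+AAA contains AAA off-frame
```

    Comma-freeness is the strongest of the frame-retrieval properties studied for codon sets; circularity, also checkable in GCAT, is a weaker condition on concatenations of arbitrarily many codons.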

  15. Gene expression variations during Drosophila metamorphosis in real and simulated gravity

    NASA Astrophysics Data System (ADS)

    Marco, R.; Leandro-García, L. J.; Benguría, A.; Herranz, R.; Zeballos, A.; Gassert, G.; van Loon, J. J.; Medina, F. J.

    Establishing the extent and significance of the effects of the exposure to microgravity of complex living organisms is a critical piece of information if the long-term exploration of near-by planets involving human beings is going to take place in the future. As a first step in this direction, we have started to look into the patterns of gene expression during Drosophila development in real and simulated microgravity, using microarray analysis of mRNA isolated from samples exposed to different environmental conditions. In these experiments we used Affymetrix chips version 1.0 containing probes for more than 14,000 genes, almost the complete Drosophila genome, 55% of which are tagged with some molecular or functional designation while 45% are still waiting to be identified in functional terms. The real microgravity exposure was imposed on the samples during the crew-exchanging Soyuz 8 Mission to the ISS in October 2003, when, after 11 days in microgravity, the Spanish-born astronaut Pedro Duque returned in the Soyuz 7 capsule carrying the experiments prepared by our team. Due to the constraints in the current ISS experiments in these Missions, we limited the stages explored in our experiment to the developmental processes occurring during Drosophila metamorphosis. As the experimental conditions at the launch site (Baikonour) were fairly limited, we prepared the experiment in Madrid/Toulouse and transported the samples at 15 °C in a temperature-controlled container to slow down the developmental process

  16. Predictive Rotation Profile Control for the DIII-D Tokamak

    NASA Astrophysics Data System (ADS)

    Wehner, W. P.; Schuster, E.; Boyer, M. D.; Walker, M. L.; Humphreys, D. A.

    2017-10-01

    Control-oriented modeling and model-based control of the rotation profile are employed to build a suitable control capability for aiding rotation-related physics studies at DIII-D. To obtain a control-oriented model, a simplified version of the momentum balance equation is combined with empirical representations of the momentum sources. The control approach is rooted in a Model Predictive Control (MPC) framework to regulate the rotation profile while satisfying constraints associated with the desired plasma stored energy and/or βN limit. Simple modifications allow for alternative control objectives, such as maximizing the plasma rotation while maintaining a specified input torque. Because the MPC approach can explicitly incorporate various types of constraints, this approach is well suited to a variety of control objectives, and therefore serves as a valuable tool for experimental physics studies. Closed-loop TRANSP simulations are presented to demonstrate the effectiveness of the control approach. Supported by the US DOE under DE-SC0010661 and DE-FC02-04ER54698.
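    The receding-horizon idea behind MPC can be sketched on a toy scalar momentum balance standing in for the control-oriented rotation model. Everything below (time constants, weights, the unconstrained least-squares solve) is illustrative, not the DIII-D controller:

```python
import numpy as np

# Toy discretized momentum balance: omega[k+1] = a*omega[k] + b*torque[k]
dt, tau, inertia = 0.01, 0.1, 1.0
a, b = 1.0 - dt / tau, dt / inertia
N, lam = 10, 1e-6                       # horizon length, input penalty

def mpc_step(omega, ref):
    """Solve the unconstrained finite-horizon LQ tracking problem and
    return only the first input (the receding-horizon principle)."""
    # Prediction model: omega_pred = F*omega + G @ u over the horizon.
    F = np.array([a ** (i + 1) for i in range(N)])
    G = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = a ** (i - j) * b
    rhs = G.T @ (np.full(N, ref) - F * omega)
    u = np.linalg.solve(G.T @ G + lam * np.eye(N), rhs)
    return u[0]

omega = 0.0
for _ in range(500):
    omega = a * omega + b * mpc_step(omega, ref=1.0)
print(round(omega, 2))  # settles near the reference value of 1.0
```

    The practical appeal noted in the abstract comes from what this sketch omits: in a constrained formulation the same finite-horizon problem is posed as a quadratic program, so stored-energy or βN limits enter directly as inequality constraints rather than ad hoc logic.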

  17. Cosmic rays and the magnetic field in the nearby starburst galaxy NGC 253

    NASA Astrophysics Data System (ADS)

    Heesen, Volker

    2008-02-01

    The transport of cosmic rays (CRs) in large-scale magnetic fields can be best investigated in edge-on galaxies with radio continuum observations including polarization. I observed the nearby starburst galaxy NGC 253, which hosts one of the brightest known radio halos, with the Effelsberg 100-m telescope and the VLA interferometer. The vertical emission profiles closely follow a two-component exponential distribution, where the scaleheight is a linear function of the synchrotron lifetime of the CR electrons. This requires a convection-dominated CR transport from the disk into the halo, while the CRs lose their energy due to synchrotron radiation (the so-called CR aging). The interaction of the "disk wind" with the magnetic field explains the "X"-shaped magnetic field structure centered on the nucleus, where the ordered magnetic field is amplified by compression in the boundaries of the expanding superbubbles of hot gas.

  18. Current profile redistribution driven by neutral beam injection in a reversed-field pinch

    NASA Astrophysics Data System (ADS)

    Parke, E.; Anderson, J. K.; Brower, D. L.; Den Hartog, D. J.; Ding, W. X.; Johnson, C. A.; Lin, L.

    2016-05-01

    Neutral beam injection in reversed-field pinch (RFP) plasmas on the Madison Symmetric Torus [Dexter et al., Fusion Sci. Technol. 19, 131 (1991)] drives current redistribution with increased on-axis current density but negligible net current drive. Internal fluctuations correlated with tearing modes are observed on multiple diagnostics; the behavior of tearing-mode-correlated structures is consistent with flattening of the safety factor profile. The first application of a parametrized model for island flattening to temperature fluctuations in an RFP allows inference of rational surface locations for multiple tearing modes. The m = 1, n = 6 mode is observed to shift inward by 1.1 ± 0.6 cm with neutral beam injection. Tearing mode rational surface measurements provide a strong constraint for equilibrium reconstruction, with an estimated reduction of q0 by 5% and an increase in on-axis current density of 8% ± 5%. The inferred on-axis current drive is consistent with estimates of fast ion density using TRANSP [Goldston et al., J. Comput. Phys. 43, 61 (1981)].

  19. FEAMAC/CARES Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Bhatt, Ramakrishna

    2016-01-01

    Reported here is a coupling of two NASA developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/ Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and an example problem (taken from the open literature) of a laminated CMC in off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  20. Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composite

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu

    2015-01-01

    Reported here is a coupling of two NASA developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/ Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and an example problem (taken from the open literature) of a laminated CMC in off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  1. Aerothermo-Structural Analysis of Low Cost Composite Nozzle/Inlet Components

    NASA Technical Reports Server (NTRS)

    Shivakumar, Kuwigai; Challa, Preeli; Sree, Dave; Reddy, D.

    1999-01-01

    This research is a cooperative effort among the Turbomachinery and Propulsion Division of NASA Glenn, the CCMR of NC A&T State University, and Tuskegee University. NC A&T is the lead center and Tuskegee University is the participating institution. The objectives of the research were to develop an integrated aerodynamic, thermal, and structural analysis code for the design of aircraft engine components, such as nozzles and inlets, made of textile composites; to conduct design studies on typical inlets for hypersonic transportation vehicles; to set up standard test examples; and finally to manufacture a scaled-down composite inlet. These objectives are accomplished through the following seven tasks: (1) identify the relevant public-domain codes for all three types of analysis; (2) evaluate the codes for accuracy of results and computational efficiency; (3) develop aero-thermal and thermal-structural mapping algorithms; (4) integrate all the codes into one single code; (5) write a graphical user interface to improve the user friendliness of the code; (6) conduct test studies for a rocket-based combined-cycle engine inlet; and finally (7) fabricate a demonstration inlet model using textile preform composites. Tasks one, two, and six are being pursued. NPARC was selected and evaluated for flow-field analysis, CSTEM for in-depth thermal analysis of inlets and nozzles, and FRAC3D for stress analysis. These codes have been independently verified for accuracy and performance. In addition, a graphical user interface based on micromechanics analysis for laminated as well as textile composites was developed. A demonstration of this code will be made at the conference. A rocket-based combined-cycle engine was selected for test studies. Flow-field analyses of various inlet geometries were performed. Integration of the codes is being continued. The codes developed are being applied to a candidate example of the trailblazer engine proposed for space transportation.
    Successful development of the code will provide a simpler, faster, and more user-friendly tool for conducting design studies of aircraft and spacecraft engines, applicable to high-speed civil transport and space missions.

  2. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
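    The calibration idea in this record, fitting resource-use coefficients once per machine/code pair and then predicting cost for new problems, can be sketched as a least-squares fit. The quadratic cost model and all numbers below are assumptions for illustration, not SCOPE's actual form:

```python
import numpy as np

# Hypothetical calibration runs: (problem size, measured CPU seconds).
sizes = np.array([100.0, 200.0, 400.0, 800.0, 1600.0])
cpu = np.array([0.8, 1.5, 4.1, 14.1, 53.3])

# Fit cpu ≈ c2*n^2 + c1*n + c0 once per (analysis code, hardware) pair.
c2, c1, c0 = np.polyfit(sizes, cpu, deg=2)

def predict_cpu(n):
    """Predict CPU seconds for a new problem of size n on the
    calibrated (code, hardware) combination."""
    return c2 * n * n + c1 * n + c0

print(f"predicted CPU for n=1000: {predict_cpu(1000):.1f} s")  # → 21.5 s
```

    As the record notes for SCOPE itself, the prediction is only as good as the calibration data: the fitted coefficients must be redone for each new machine or analysis code.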

  3. Wake coupling to full potential rotor analysis code

    NASA Technical Reports Server (NTRS)

    Torres, Francisco J.; Chang, I-Chung; Oh, Byung K.

    1990-01-01

    The wake information from a helicopter forward flight code is coupled with two transonic potential rotor codes. The induced velocities for the near-, mid-, and far-wake geometries are extracted from a nonlinear rigid wake of a standard performance and analysis code. These, together with the corresponding inflow angles, computation points, and azimuth angles, are then incorporated into the transonic potential codes. The coupled codes can then provide an improved prediction of rotor blade loading at transonic speeds.

  4. Comparative analysis of design codes for timber bridges in Canada, the United States, and Europe

    Treesearch

    James Wacker; James (Scott) Groenier

    2010-01-01

    The United States recently completed its transition from the allowable stress design code to the load and resistance factor design (LRFD) reliability-based code for the design of most highway bridges. For an international perspective on the LRFD-based bridge codes, a comparative analysis is presented: a study addressed national codes of the United States, Canada, and...

  5. Comparison of two computer codes for crack growth analysis: NASCRAC Versus NASA/FLAGRO

    NASA Technical Reports Server (NTRS)

    Stallworth, R.; Meyers, C. A.; Stinson, H. C.

    1989-01-01

    Results are presented from a comparison study of two computer codes for crack growth analysis: NASCRAC and NASA/FLAGRO. The two codes gave compatible, conservative results when the part-through-crack solutions were analyzed against experimental test data. Results showed good correlation between the codes for the through-crack-at-a-lug solution, for which NASA/FLAGRO gave the more conservative results.

  6. Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frambati, S.; Frignani, M.

    2012-07-01

    We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open-source file formats. These tools are aimed at bridging the gap between trusted, widely used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes, and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool in computer-aided design for radiation transport code users of the nuclear world, and in particular in the fields of core design and radiation analysis. (authors)

  7. Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory

    PubMed Central

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-01-01

    Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625

  8. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    PubMed

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  9. Measurements of impurity concentrations and transport in the Lithium Tokamak Experiment

    NASA Astrophysics Data System (ADS)

    Boyle, Dennis Patrick

    This thesis presents new measurements of core impurity concentrations and transport in plasmas with lithium coatings on all-metal plasma facing components (PFCs) in the Lithium Tokamak Experiment (LTX). LTX is a modest-sized spherical tokamak uniquely capable of operating with large area solid and/or liquid lithium coatings essentially surrounding the entire plasma (as opposed to just the divertor or limiter region in other devices). Lithium (Li) wall-coatings have improved plasma performance and confinement in several tokamaks with carbon (C) PFCs, including the National Spherical Torus Experiment (NSTX). In NSTX, contamination of the core plasma with Li impurities was very low (<0.1%) despite extensive divertor coatings. Low Li levels in NSTX were found to be largely due to neoclassical forces from the high level of C impurities. Studying impurity levels and transport with Li coatings on stainless steel surfaces in LTX is relevant to future devices (including future enhancements to NSTX-Upgrade) with all-metal PFCs. The new measurements in this thesis were enabled by a refurbished Thomson scattering system and improved impurity spectroscopy, primarily using a novel visible spectrometer monitoring several Li, C, and oxygen (O) emission lines. A simple model was used to account for impurities in unmeasured charge states, assuming constant density in the plasma core and constant concentration in the edge. In discharges with solid Li coatings, volume averaged impurity concentrations were low but non-negligible, with 2-4% Li, 0.6-2% C, 0.4-0.7% O, and Z eff<1.2. Transport was assessed using the TRANSP, NCLASS, and MIST codes. Collisions with the main H ions dominated the neoclassical impurity transport, unlike in NSTX, where collisions with C dominated. 
Furthermore, neoclassical transport coefficients calculated with NCLASS were similar across all impurity species and differed no more than a factor of two, in contrast to NSTX where they differed by an order of magnitude. However, time-independent simulations with MIST indicated that unlike NSTX, neoclassical theory did not fully capture the impurity transport and anomalous transport likely played a significant role in determining impurity profiles.
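    The volume-averaged effective charge quoted above follows from the measured concentrations in a standard way. The sketch below is a minimal illustration, not the thesis's actual analysis: it assumes fully stripped impurities (an oversimplification; partially ionized edge states would lower the contribution) and concentrations defined relative to the electron density, with illustrative values.

```python
# Hedged sketch: volume-averaged Z_eff from impurity concentration fractions.
# Z_eff = sum_i f_i * Z_i^2, with f_i = n_i / n_e, and the main-ion (hydrogen,
# Z = 1) fraction fixed by quasineutrality: f_H = 1 - sum_imp f_imp * Z_imp.

def z_eff(impurities):
    """impurities: list of (fraction n_i/n_e, charge Z_i) pairs."""
    f_h = 1.0 - sum(f * z for f, z in impurities)  # quasineutrality for H
    if f_h < 0:
        raise ValueError("impurity charge exceeds electron density")
    return f_h * 1.0 ** 2 + sum(f * z ** 2 for f, z in impurities)

# Example: fully stripped Li (Z=3), C (Z=6), O (Z=8) at illustrative fractions.
zeff = z_eff([(0.02, 3), (0.006, 6), (0.004, 8)])
```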


  11. Combustion: Structural interaction in a viscoelastic material

    NASA Technical Reports Server (NTRS)

    Chang, T. Y.; Chang, J. P.; Kumar, M.; Kuo, K. K.

    1980-01-01

    The effect of interaction between combustion processes and structural deformation of solid propellant was considered. The combustion analysis was performed on the basis of deformed crack geometry, which was determined from the structural analysis. On the other hand, input data for the structural analysis, such as pressure distribution along the crack boundary and ablation velocity of the crack, were determined from the combustion analysis. The interaction analysis was conducted by combining two computer codes, a combustion analysis code and a general purpose finite element structural analysis code.

  12. Development of advanced structural analysis methodologies for predicting widespread fatigue damage in aircraft structures

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.

    1995-01-01

    NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.

  13. FERRET data analysis code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmittroth, F.

    1979-09-01

    Documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and on providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples.
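    Combining measurements and calculations with full covariance treatment, as described above, can be sketched as a standard covariance-weighted least-squares update. FERRET's actual formulation and input formats are not reproduced here; the matrices and numbers below are purely illustrative.

```python
import numpy as np

def gls_update(x0, P, A, y, C):
    """Combine a prior estimate x0 (covariance P) with measurements
    y = A x + noise (covariance C); return posterior mean and covariance.
    This is the generic generalized-least-squares update, not FERRET itself."""
    P_inv = np.linalg.inv(P)
    C_inv = np.linalg.inv(C)
    post_cov = np.linalg.inv(P_inv + A.T @ C_inv @ A)
    post_mean = post_cov @ (P_inv @ x0 + A.T @ C_inv @ y)
    return post_mean, post_cov

# Scalar sanity check: two estimates of the same quantity combine by
# inverse-variance weighting, and the posterior variance shrinks.
mean, cov = gls_update(np.array([1.0]), np.array([[4.0]]),   # prior 1.0 +/- 2.0
                       np.array([[1.0]]), np.array([3.0]),
                       np.array([[1.0]]))                    # meas. 3.0 +/- 1.0
```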

  14. Current and anticipated uses of thermalhydraulic and neutronic codes at PSI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aksan, S.N.; Zimmermann, M.A.; Yadigaroglu, G.

    1997-07-01

    The thermalhydraulic and/or neutronic codes in use at PSI mainly provide the capability to perform deterministic safety analysis for Swiss NPPs and also serve as analysis tools for experimental facilities for LWR and ALWR simulations. In relation to these applications, physical model development and improvements, and assessment of the codes are also essential components of the activities. In this paper, a brief overview is provided on the thermalhydraulic and/or neutronic codes used for safety analysis of LWRs, at PSI, and also of some experiences and applications with these codes. Based on these experiences, additional assessment needs are indicated, together with some model improvement needs. The future needs that could be used to specify both the development of a new code and also improvement of available codes are summarized.

  15. Content Analysis Coding Schemes for Online Asynchronous Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa

    2011-01-01

    Purpose: Researchers commonly utilize coding-based analysis of classroom asynchronous discussion contributions as part of studies of online learning and instruction. However, this analysis is inconsistent from study to study with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis…

  16. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.

    2016-01-01

    The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.

  17. ASTROP2-LE: A Mistuned Aeroelastic Analysis System Based on a Two Dimensional Linearized Euler Solver

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Srivastava, R.; Mehmed, Oral

    2002-01-01

    An aeroelastic analysis system for flutter and forced response analysis of turbomachines based on a two-dimensional linearized unsteady Euler solver has been developed. The ASTROP2 code, an aeroelastic stability analysis program for turbomachinery, was used as a basis for this development. The ASTROP2 code uses strip theory to couple a two dimensional aerodynamic model with a three dimensional structural model. The code was modified to include forced response capability. The formulation was also modified to include aeroelastic analysis with mistuning. A linearized unsteady Euler solver, LINFLX2D is added to model the unsteady aerodynamics in ASTROP2. By calculating the unsteady aerodynamic loads using LINFLX2D, it is possible to include the effects of transonic flow on flutter and forced response in the analysis. The stability is inferred from an eigenvalue analysis. The revised code, ASTROP2-LE for ASTROP2 code using Linearized Euler aerodynamics, is validated by comparing the predictions with those obtained using linear unsteady aerodynamic solutions.

  18. Guidelines for Coding and Entering Ground-Water Data into the Ground-Water Site Inventory Data Base, Version 4.6, U.S. Geological Survey, Washington Water Science Center

    DTIC Science & Technology

    2006-01-01

    collected, code both. Analysis-type codes include: A, physical properties; B, common ions; I, common ions/trace elements; J, sanitary analysis... (1) A ground-water site is coded as if it is a single point, not a geographic area or property. (2) Latitude and longitude should be determined at a... terrace from an adjacent upland on one side, and a lowland coast or valley on the other. Due to the effects of erosion, the terrace surface may not be as

  19. EAC: A program for the error analysis of STAGS results for plates

    NASA Technical Reports Server (NTRS)

    Sistla, Rajaram; Thurston, Gaylen A.; Bains, Nancy Jane C.

    1989-01-01

    A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example of application of the code is presented and instructions on its usage on the Cyber and the VAX machines have been provided.

  20. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
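    The six-step procedure culminates in Monte Carlo propagation of the input-parameter distributions through the deterministic dose calculation. The sketch below is a minimal illustration with a placeholder dose function and invented distributions; it is not RESRAD's actual pathway model or default parameter set.

```python
import random
import statistics

def dose_model(soil_conc, ingestion_rate, transfer_factor):
    """Placeholder deterministic dose calculation (hypothetical, not the
    actual RESRAD pathways): dose ~ concentration * intake * transfer."""
    return soil_conc * ingestion_rate * transfer_factor

def probabilistic_dose(n_samples=10000, seed=42):
    """Sample each uncertain input from its assigned distribution (step 3),
    run the deterministic model per sample, and summarize the dose spread."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n_samples):
        soil = rng.triangular(50.0, 150.0, 100.0)  # pCi/g, illustrative
        intake = rng.lognormvariate(0.0, 0.5)      # g/day, illustrative
        tf = rng.uniform(1e-4, 5e-4)               # (mrem/yr)/(pCi/day)
        doses.append(dose_model(soil, intake, tf))
    doses.sort()
    return {"mean": statistics.fmean(doses),
            "p95": doses[int(0.95 * n_samples)]}
```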

  1. Energy Savings Analysis of the Proposed NYStretch-Energy Code 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Bing; Zhang, Jian; Chen, Yan

    This study was conducted by the Pacific Northwest National Laboratory (PNNL) in support of the stretch energy code development led by the New York State Energy Research and Development Authority (NYSERDA). In 2017 NYSERDA developed its 2016 Stretch Code Supplement to the 2016 New York State Energy Conservation Construction Code (hereinafter referred to as “NYStretch-Energy”). NYStretch-Energy is intended as a model energy code for statewide voluntary adoption that anticipates other code advancements culminating in the goal of a statewide Net Zero Energy Code by 2028. Since then, NYSERDA continues to develop the NYStretch-Energy Code 2018 edition. To support the effort, PNNL conducted energy simulation analysis to quantify the energy savings of proposed commercial provisions of the NYStretch-Energy Code (2018) in New York. The focus of this project is the 20% improvement over existing commercial model energy codes. A key requirement of the proposed stretch code is that it be ‘adoptable’ as an energy code, meaning that it must align with current code scope and limitations, and primarily impact building components that are currently regulated by local building departments. It is largely limited to prescriptive measures, which are what most building departments and design projects are most familiar with. This report describes a set of energy-efficiency measures (EEMs) that demonstrate 20% energy savings over ANSI/ASHRAE/IES Standard 90.1-2013 (ASHRAE 2013) across a broad range of commercial building types and all three climate zones in New York. In collaboration with New Building Institute, the EEMs were developed from national model codes and standards, high-performance building codes and standards, regional energy codes, and measures being proposed as part of the on-going code development process. 
PNNL analyzed these measures using whole building energy models for selected prototype commercial buildings and multifamily buildings representing buildings in New York. Section 2 of this report describes the analysis methodology, including the building types and construction area weights update for this analysis, the baseline, and the method to conduct the energy saving analysis. Section 3 provides detailed specifications of the EEMs and bundles. Section 4 summarizes the results of individual EEMs and EEM bundles by building type, energy end-use and climate zone. Appendix A documents detailed descriptions of the selected prototype buildings. Appendix B provides energy end-use breakdown results by building type for both the baseline code and stretch code in all climate zones.

  2. New technologies for advanced three-dimensional optimum shape design in aeronautics

    NASA Astrophysics Data System (ADS)

    Dervieux, Alain; Lanteri, Stéphane; Malé, Jean-Michel; Marco, Nathalie; Rostaing-Schmidt, Nicole; Stoufflet, Bruno

    1999-05-01

    The analysis of complex flows around realistic aircraft geometries is becoming more and more predictive. In order to obtain this result, the complexity of flow analysis codes has been constantly increasing, involving more refined fluid models and sophisticated numerical methods. These codes can only run on top computers, exhausting their memory and CPU capabilities. It is, therefore, difficult to introduce the best analysis codes in a shape optimization loop: most previous works in the optimum shape design field used only simplified analysis codes. Moreover, as the most popular optimization methods are the gradient-based ones, the more complex the flow solver, the more difficult it is to compute the sensitivity code. However, emerging technologies are contributing to make such an ambitious project, of including a state-of-the-art flow analysis code into an optimization loop, feasible. Among those technologies, there are three important issues that this paper wishes to address: shape parametrization, automated differentiation and parallel computing. Shape parametrization allows faster optimization by reducing the number of design variables; in this work, it relies on a hierarchical multilevel approach. The sensitivity code can be obtained using automated differentiation. The automated approach is based on software manipulation tools, which allow the differentiation to be quick and the resulting differentiated code to be rather fast and reliable. In addition, the parallel algorithms implemented in this work allow the resulting optimization software to run on increasingly larger geometries.

  3. Intrasystem Analysis Program (IAP) code summaries

    NASA Astrophysics Data System (ADS)

    Dobmeier, J. J.; Drozd, A. L. S.; Surace, J. A.

    1983-05-01

    This report contains detailed descriptions and capabilities of the codes that comprise the Intrasystem Analysis Program. The four codes are: Intrasystem Electromagnetic Compatibility Analysis Program (IEMCAP), General Electromagnetic Model for the Analysis of Complex Systems (GEMACS), Nonlinear Circuit Analysis Program (NCAP), and Wire Coupling Prediction Models (WIRE). IEMCAP is used for computer-aided evaluation of electromagnetic compatibility (EMC) at all stages of an Air Force system's life cycle, applicable to aircraft, space/missile, and ground-based systems. GEMACS utilizes a Method of Moments (MOM) formalism with the Electric Field Integral Equation (EFIE) for the solution of electromagnetic radiation and scattering problems. The code employs both full matrix decomposition and Banded Matrix Iteration solution techniques and is expressly designed for large problems. NCAP is a circuit analysis code which uses the Volterra approach to solve for the transfer functions and node voltages of weakly nonlinear circuits. The WIRE programs apply multiconductor transmission line theory to the prediction of cable coupling for specific classes of problems.

  4. Industrial Code Development

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1991-01-01

    The industrial codes will consist of modules of 2-D and simplified 2-D or 1-D codes, intended for expeditious parametric studies, analysis, and design of a wide variety of seals. Integration into a unified system is accomplished by the industrial Knowledge Based System (KBS), which will also provide user friendly interaction, context sensitive and hypertext help, design guidance, and an expandable database. The types of analysis to be included with the industrial codes are interfacial performance (leakage, load, stiffness, friction losses, etc.), thermoelastic distortions, and dynamic response to rotor excursions. The first codes to be completed, which are presently being incorporated into the KBS, are the incompressible cylindrical code, ICYL, and the compressible cylindrical code, GCYL.

  5. The STAGS computer code

    NASA Technical Reports Server (NTRS)

    Almroth, B. O.; Brogan, F. A.

    1978-01-01

    Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of springs and beam elements. The formulation is based on a variational approach in combination with local two dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.

  6. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.

  7. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. Work during the first two years of the Hypercube Matrix Computation effort is summarized. It includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  8. Evaluation of the DRAGON code for VHTR design analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division

    2006-01-12

    This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code performance when used in the analysis of the Very High Temperature Reactor (VHTR) designs. These activities include: (1) Use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs. Results were compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP). (2) The preliminary assessment of the nuclear data library currently used with the code and libraries that have been provided by the IAEA WIMS-D4 Library Update Project (WLUP). (3) DRAGON workshop held to discuss the code capabilities for modeling the VHTR.

  9. A Bayesian network coding scheme for annotating biomedical information presented to genetic counseling clients.

    PubMed

    Green, Nancy

    2005-04-01

    We developed a Bayesian network coding scheme for annotating biomedical content in layperson-oriented clinical genetics documents. The coding scheme supports the representation of probabilistic and causal relationships among concepts in this domain, at a high enough level of abstraction to capture commonalities among genetic processes and their relationship to health. We are using the coding scheme to annotate a corpus of genetic counseling patient letters as part of the requirements analysis and knowledge acquisition phase of a natural language generation project. This paper describes the coding scheme and presents an evaluation of intercoder reliability for its tag set. In addition to giving examples of use of the coding scheme for analysis of discourse and linguistic features in this genre, we suggest other uses for it in analysis of layperson-oriented text and dialogue in medical communication.
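    Intercoder reliability for a tag set like the one evaluated above is commonly quantified with Cohen's kappa, which corrects observed agreement for the agreement expected by chance. The abstract does not state which statistic was used, so the following is a generic sketch with invented labels.

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two coders' labels over the same items:
    kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n    # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[t] * cb[t] for t in ca) / n ** 2  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical tags from two annotators coding the same six text segments.
coder1 = ["cause", "cause", "state", "state", "cause", "state"]
coder2 = ["cause", "state", "state", "state", "cause", "state"]
kappa = cohens_kappa(coder1, coder2)
```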

  10. Sports Stars: Analyzing the Performance of Astronomers at Visualization-based Discovery

    NASA Astrophysics Data System (ADS)

    Fluke, C. J.; Parrington, L.; Hegarty, S.; MacMahon, C.; Morgan, S.; Hassan, A. H.; Kilborn, V. A.

    2017-05-01

    In this data-rich era of astronomy, there is a growing reliance on automated techniques to discover new knowledge. The role of the astronomer may change from being a discoverer to being a confirmer. But what do astronomers actually look at when they distinguish between “sources” and “noise?” What are the differences between novice and expert astronomers when it comes to visual-based discovery? Can we identify elite talent or coach astronomers to maximize their potential for discovery? By looking to the field of sports performance analysis, we consider an established, domain-wide approach, where the expertise of the viewer (i.e., a member of the coaching team) plays a crucial role in identifying and determining the subtle features of gameplay that provide a winning advantage. As an initial case study, we investigate whether the SportsCode performance analysis software can be used to understand and document how an experienced H I astronomer makes discoveries in spectral data cubes. We find that the process of timeline-based coding can be applied to spectral cube data by mapping spectral channels to frames within a movie. SportsCode provides a range of easy to use methods for annotation, including feature-based codes and labels, text annotations associated with codes, and image-based drawing. The outputs, including instance movies that are uniquely associated with coded events, provide the basis for a training program or team-based analysis that could be used in unison with discipline specific analysis software. In this coordinated approach to visualization and analysis, SportsCode can act as a visual notebook, recording the insight and decisions in partnership with established analysis methods. Alternatively, in situ annotation and coding of features would be a valuable addition to existing and future visualization and analysis packages.

  11. Mal-Xtract: Hidden Code Extraction using Memory Analysis

    NASA Astrophysics Data System (ADS)

    Lim, Charles; Syailendra Kotualubun, Yohanes; Suryadi; Ramli, Kalamullah

    2017-01-01

    Software packers have been used effectively to hide the original code inside a binary executable, making it more difficult for existing signature-based anti-malware software to detect malicious code inside the executable. A new method based on tracking written and rewritten memory sections is introduced to detect the exact end time of the unpacking routine and extract the original code from a packed binary executable using memory analysis in a software-emulated environment. Our experimental results show that at least 97% of the original code from various binary executables packed with different software packers could be extracted. The proposed method has also successfully extracted hidden code from recent malware family samples.

  12. Interactive Finite Elements for General Engine Dynamics Analysis

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1984-01-01

    General nonlinear finite element codes were adapted for the purpose of analyzing the dynamics of gas turbine engines. In particular, this adaptation required the development of a squeeze-film damper element software package and its implantation into a representative current generation code. The ADINA code was selected because of prior use of it and familiarity with its internal structure and logic. This objective was met, and the results indicate that such use of general purpose codes is a viable alternative to specialized codes for general dynamics analysis of engines.

  13. Dispersive transport and symmetry of the dispersion tensor in porous media

    NASA Astrophysics Data System (ADS)

    Pride, Steven R.; Vasco, Donald W.; Flekkoy, Eirik G.; Holtzman, Ran

    2017-04-01

    The macroscopic laws controlling the advection and diffusion of solute at the scale of the porous continuum are derived in a general manner that does not place limitations on the geometry and time evolution of the pore space. Special focus is given to the definition and symmetry of the dispersion tensor that is controlling how a solute plume spreads out. We show that the dispersion tensor is not symmetric and that the asymmetry derives from the advective derivative in the pore-scale advection-diffusion equation. When flow is spatially variable across a voxel, such as in the presence of a permeability gradient, the amount of asymmetry can be large. As first shown by Auriault [J.-L. Auriault et al. Transp. Porous Med. 85, 771 (2010), 10.1007/s11242-010-9591-y] in the limit of low Péclet number, we show that at any Péclet number, the dispersion tensor D_ij satisfies the flow-reversal symmetry D_ij(+q) = D_ji(-q), where q is the mean flow in the voxel under analysis; however, the Reynolds number must be sufficiently small that the flow is reversible when the force driving the flow changes sign. We also demonstrate these symmetries using lattice-Boltzmann simulations and discuss some subtle aspects of how to measure the dispersion tensor numerically. In particular, the numerical experiments demonstrate that the off-diagonal components of the dispersion tensor are antisymmetric, which is consistent with the analytical dependence on the average flow gradients that we propose for these off-diagonal components.
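    The flow-reversal symmetry D_ij(+q) = D_ji(-q) holds for any tensor built from a symmetric part that is even in the mean flow q plus an antisymmetric part that is odd in q, and it is easy to verify numerically. The toy 2-D construction below is illustrative only, not the paper's pore-scale derivation; its functional forms and coefficients are invented.

```python
import numpy as np

def dispersion_tensor(q, d_m=1.0, a=0.5, b=0.2):
    """Toy dispersion tensor: molecular diffusion plus a mechanical part
    (symmetric, even in q) plus an antisymmetric part odd in q, mimicking
    the advective contribution to the asymmetry. Invented closure."""
    qn = np.linalg.norm(q)
    sym = d_m * np.eye(2) + a * np.outer(q, q) / qn        # even in q
    anti = b * q[0] * np.array([[0.0, 1.0], [-1.0, 0.0]])  # odd in q
    return sym + anti

q = np.array([1.0, 2.0])
D_fwd = dispersion_tensor(q)
D_rev = dispersion_tensor(-q)
# Flow-reversal symmetry: D_ij(+q) equals D_ji(-q).
assert np.allclose(D_rev, D_fwd.T)
```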

  14. Development of an integrated thermal-hydraulics capability incorporating RELAP5 and PANTHER neutronics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, R.; Jones, J.R.

    1997-07-01

    Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation, where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code TALINK (Transient Analysis code LINKage program) used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and the SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell B loss-of-offsite-power fault transient.

  15. Error control techniques for satellite and space communications

    NASA Technical Reports Server (NTRS)

    Costello, Daniel J., Jr.

    1988-01-01

    During the period December 1, 1987 through May 31, 1988, progress was made in the following areas: construction of Multi-Dimensional Bandwidth Efficient Trellis Codes with MPSK modulation; performance analysis of Bandwidth Efficient Trellis Coded Modulation schemes; and performance analysis of Bandwidth Efficient Trellis Codes on Fading Channels.

  16. Modeling of rolling element bearing mechanics. Computer program user's manual

    NASA Technical Reports Server (NTRS)

    Greenhill, Lyn M.; Merchant, David H.

    1994-01-01

    This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the defects of bearing ring and support structure flexibility. It is comprised of two main programs: the Preprocessor for Bearing Analysis (PREBAN) which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  17. Sandia Engineering Analysis Code Access System v. 2.0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjaardema, Gregory D.

The Sandia Engineering Analysis Code Access System (SEACAS) is a suite of preprocessing, postprocessing, translation, visualization, and utility applications supporting finite element analysis software using the Exodus database file format.

  18. FEAMAC-CARES Software Coupling Development Effort for CMC Stochastic-Strength-Based Damage Simulation

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Bednarcyk, Brett A.; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Walton, Owen

    2015-01-01

Reported here is a coupling of two NASA developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC composite material analysis code. The resulting code is called FEAMAC-CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC-CARES code and show an example problem (taken from the open literature) of a laminated CMC in off-axis loading. FEAMAC-CARES performs stochastic-strength-based damage simulation of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  19. Local Laplacian Coding From Theoretical Analysis of Local Coding Schemes for Locally Linear Classification.

    PubMed

    Pang, Junbiao; Qin, Lei; Zhang, Chunjie; Zhang, Weigang; Huang, Qingming; Yin, Baocai

    2015-12-01

Local coordinate coding (LCC) is a framework to approximate a Lipschitz smooth function by combining linear functions into a nonlinear one. For locally linear classification, LCC requires a coding scheme that heavily determines the nonlinear approximation ability, posing two main challenges: 1) locality, so that faraway anchors have smaller influence on the current datum, and 2) flexibility, balancing the reconstruction of the current datum against that locality. In this paper, we address the problem from a theoretical analysis of the simplest local coding schemes, i.e., local Gaussian coding and local Student coding, and propose local Laplacian coding (LPC) to achieve both locality and flexibility. We apply LPC to locally linear classifiers to solve diverse classification tasks. Performance comparable to or exceeding that of state-of-the-art methods demonstrates the effectiveness of the proposed method.
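The locality idea can be made concrete with a generic local-coding weight assignment: a data point is coded against a set of anchors, with a kernel that decays with distance. The sketch below only illustrates the Gaussian-versus-Laplacian kernel contrast schematically; the exact LPC formulation is defined in the paper, and all numbers here are invented.

```python
import math

# Hedged sketch: generic local coding weights for a data point against a set
# of anchors. The Gaussian and Laplacian kernels stand in schematically for
# "local Gaussian coding" vs. "local Laplacian coding"; the paper's actual
# LPC objective is more elaborate.

def local_weights(x, anchors, kernel="laplacian", sigma=1.0):
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    if kernel == "gaussian":
        raw = [math.exp(-(dist(x, c) ** 2) / (2 * sigma ** 2)) for c in anchors]
    else:  # laplacian: exponential decay in distance, heavier tails near zero
        raw = [math.exp(-dist(x, c) / sigma) for c in anchors]
    total = sum(raw)
    return [v / total for v in raw]  # normalized coding coefficients

anchors = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]
w = local_weights((0.1, 0.0), anchors)
```

The faraway anchor at (5, 5) receives a near-zero coefficient, which is exactly the locality requirement: distant anchors should barely influence the code of the current datum.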

  20. Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing

    DTIC Science & Technology

    2008-01-01

complements of one another and the DNA duplex formed is a Watson-Crick (WC) duplex. However, there are many instances when the formation of non-WC...that the user's requirements for probe selection are met based on the Watson-Crick probe locality within a target. The second type, called...AFRL-RI-RS-TR-2007-288 Final Technical Report January 2008 SUPERIMPOSED CODE THEORETIC ANALYSIS OF DNA CODES AND DNA COMPUTING

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uchibori, Akihiro; Kurihara, Akikazu; Ohshima, Hiroyuki

A multiphysics analysis system for sodium-water reaction phenomena in a steam generator of sodium-cooled fast reactors was newly developed. The analysis system consists of the mechanistic numerical analysis codes SERAPHIM, TACT, and RELAP5. The SERAPHIM code calculates the multicomponent multiphase flow and sodium-water chemical reaction caused by discharging of pressurized water vapor. Applicability of the SERAPHIM code was confirmed through analyses of an experiment on water vapor discharging into liquid sodium. The TACT code was developed to calculate heat transfer from the reacting jet to the adjacent tube and to predict tube failure occurrence. The numerical models integrated into the TACT code were verified through related experiments. The RELAP5 code evaluates the thermal-hydraulic behavior of water inside the tube. The original heat transfer correlations were corrected for a tube rapidly heated by the reacting jet. The developed system enables evaluation of the wastage environment and the possibility of failure propagation.

  2. Improvements to a method for the geometrically nonlinear analysis of compressively loaded stiffened composite panels

    NASA Technical Reports Server (NTRS)

    Stoll, Frederick

    1993-01-01

    The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.

  3. CRITICA: coding region identification tool invoking comparative analysis

    NASA Technical Reports Server (NTRS)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu, in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
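The noncomparative dicodon-bias signal can be sketched as a log-likelihood score: count hexanucleotide frequencies in known coding frames and in background sequence, then score a candidate frame by how much its codon-boundary hexamers favor the coding model. This is a hedged, minimal illustration; CRITICA's actual statistics, smoothing, and iteration scheme are more elaborate, and the training strings below are invented.

```python
import math
from collections import Counter

# Hedged sketch of a dicodon-bias score: hexanucleotide frequencies in coding
# frames vs. background, combined into a log-likelihood ratio for a candidate
# reading frame. Training sequences are toy examples.

def hexamer_counts(seq, step=3):
    # step=3 counts hexamers at codon boundaries; step=1 counts all offsets.
    return Counter(seq[i:i + 6] for i in range(0, len(seq) - 5, step))

def dicodon_score(candidate, coding_counts, background_counts):
    score = 0.0
    coding_total = sum(coding_counts.values())
    bg_total = sum(background_counts.values())
    for i in range(0, len(candidate) - 5, 3):
        hexamer = candidate[i:i + 6]
        # Add-one smoothing (4^6 = 4096 possible hexamers) avoids log(0).
        p_coding = (coding_counts[hexamer] + 1) / (coding_total + 4096)
        p_bg = (background_counts[hexamer] + 1) / (bg_total + 4096)
        score += math.log(p_coding / p_bg)
    return score  # > 0 favors "coding"

coding_training = "ATGGCTGCTGCTGGTGGTGCTGCTGGTGGTGCT" * 10
background = "TTTTAAATTTTAAATTTTAAATTTTAAATTTTA" * 10
coding_counts = hexamer_counts(coding_training)
bg_counts = hexamer_counts(background, step=1)
score_like_coding = dicodon_score("ATGGCTGCTGCTGGTGGT", coding_counts, bg_counts)
score_like_bg = dicodon_score("TTTTAAATTTTAAATTTT", coding_counts, bg_counts)
```

Because the dicodon tables can be re-estimated from the program's own confident predictions, a score of this shape supports the iterative, annotation-independent training the abstract describes.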

  4. Current profile redistribution driven by neutral beam injection in a reversed-field pinch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parke, E.; Department of Physics, University of Wisconsin-Madison 1150 University Ave., Madison, Wisconsin 53706; Anderson, J. K.

    2016-05-15

Neutral beam injection in reversed-field pinch (RFP) plasmas on the Madison Symmetric Torus [Dexter et al., Fusion Sci. Technol. 19, 131 (1991)] drives current redistribution with increased on-axis current density but negligible net current drive. Internal fluctuations correlated with tearing modes are observed on multiple diagnostics; the behavior of tearing mode correlated structures is consistent with flattening of the safety factor profile. The first application of a parametrized model for island flattening to temperature fluctuations in an RFP allows inference of rational surface locations for multiple tearing modes. The m = 1, n = 6 mode is observed to shift inward by 1.1 ± 0.6 cm with neutral beam injection. Tearing mode rational surface measurements provide a strong constraint for equilibrium reconstruction, with an estimated reduction of q0 by 5% and an increase in on-axis current density of 8% ± 5%. The inferred on-axis current drive is consistent with estimates of fast ion density using TRANSP [Goldston et al., J. Comput. Phys. 43, 61 (1981)].

  5. Time Dependent Predictive Modeling of DIII-D ITER Baseline Scenario using Predictive TRANSP

    NASA Astrophysics Data System (ADS)

    Grierson, B. A.; Andre, R. G.; Budny, R. V.; Solomon, W. M.; Yuan, X.; Candy, J.; Pinsker, R. I.; Staebler, G. M.; Holland, C.; Rafiq, T.

    2015-11-01

ITER baseline scenario discharges on DIII-D are modeled with TGLF and MMM, transitioning from combined ECH (3.3 MW) + NBI (2.8 MW) heating to NBI-only (3.0 MW) heating while maintaining βN = 2.0, predicting temperature, density and rotation for comparison to experimental measurements. These models capture the reduction of confinement associated with direct electron heating (H98y2 = 0.89 vs. 1.0), consistent with stiff electron transport. Reasonable agreement between experimental and modeled temperature profiles is achieved for both heating methods, whereas density and momentum predictions differ significantly. Transport fluxes from TGLF indicate that on DIII-D the electron energy flux has reached a transition from low-k to high-k turbulence, with stiffer high-k transport that inhibits an increase in core electron stored energy with additional electron heating. Projections to ITER also indicate high electron stiffness. Supported by US DOE DE-AC02-09CH11466, DE-FC02-04ER54698, DE-FG02-07ER54917, DE-FG02-92-ER54141.

  6. Self-regulation of turbulence in low rotation DIII-D QH-mode with an oscillating transport barrier

    NASA Astrophysics Data System (ADS)

    Barada, Kshitish; Rhodes, T. L.; Burrell, K. H.; Zeng, L.; Chen, Xi

    2016-10-01

We present observations of turbulence and flow shear limit cycle oscillations (LCOs) in wide pedestal QH-mode DIII-D tokamak plasmas that are consistent with turbulence self-regulation. In this low input torque regime, both edge harmonic oscillations (EHOs) and ELMs are absent. LCOs of E×B velocity shear and ñ exhibit predator-prey-like behavior in these fully developed QH-mode plasmas. During these limit cycle oscillations, the E×B poloidal flows possess a long-range toroidal correlation consistent with turbulence generated zonal flow activity. Further, these limit cycle oscillations are observed in a broad range of edge parameters including ne, Te, floor Langmuir probe ion saturation current, and radial electric field Er. TRANSP calculations of transport indicate little change between the EHO and LCO wide pedestal phases. These observations are consistent with LCO driven transport that may play a role in maintaining the profiles below the ELM threshold in the EHO-free steady state wide pedestal QH-mode regime. Work supported by the US DOE under DE-FG02-08ER54984 and DE-FC02-04ER54698.
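Predator-prey behavior of this kind is often described with a minimal two-field model in which turbulence intensity is the "prey" and sheared flow the "predator". The sketch below is schematic only, in the spirit of Lotka-Volterra-style turbulence/zonal-flow models; it is not the analysis performed in the paper, and every coefficient is an invented placeholder.

```python
# Hedged sketch: a minimal predator-prey model of turbulence/flow-shear limit
# cycles, with turbulence intensity N as "prey" and flow shear V as "predator".
# Schematic illustration only; all rates are arbitrary toy values.

def step(N, V, dt=0.001, growth=1.0, suppress=1.0, drive=1.0, damp=0.5):
    dN = (growth - suppress * V) * N   # turbulence grows, sheared away by V
    dV = (drive * N - damp) * V        # shear driven by turbulence, damped
    return N + dt * dN, V + dt * dV

N, V = 0.5, 0.5
traj = []
for _ in range(20000):                 # integrate to t = 20 (arbitrary units)
    N, V = step(N, V)
    traj.append((N, V))
```

Rather than settling to a steady state, the trajectory orbits the fixed point (N, V) = (damp/drive, growth/suppress), i.e. the turbulence level and the shearing flow oscillate out of phase, which is the limit-cycle signature the measurements show.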

  7. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
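For context, the Method of Manufactured Solutions that EVA is compared against can be demonstrated end to end on a toy problem: pick an exact solution, derive the forcing term analytically, solve numerically on two grids, and confirm the observed order of accuracy. The sketch below is a hedged illustration on a 1D Poisson problem, not part of the EVA or CAA codes.

```python
import math

# Hedged MMS illustration on -u'' = f with u_exact = sin(pi x) on [0, 1],
# u(0) = u(1) = 0. f = pi^2 sin(pi x) is derived analytically from u_exact;
# a second-order finite-difference solve should show observed order ~2.

def solve_poisson(n):
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [math.pi ** 2 * math.sin(math.pi * xi) for xi in x]  # -u'' of sin(pi x)
    # Thomas algorithm for the tridiagonal system (-1, 2, -1) u = h^2 f.
    a, b, c = -1.0, 2.0, -1.0
    n_int = n - 1
    cp = [0.0] * n_int
    dp = [0.0] * n_int
    cp[0] = c / b
    dp[0] = h * h * f[1] / b
    for i in range(1, n_int):
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (h * h * f[i + 1] - a * dp[i - 1]) / m
    u = [0.0] * (n + 1)                     # boundary values stay zero
    for i in range(n_int - 1, -1, -1):
        u[i + 1] = dp[i] - cp[i] * u[i + 2]
    return max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n + 1))

e1, e2 = solve_poisson(32), solve_poisson(64)
order = math.log(e1 / e2) / math.log(2.0)   # observed order of accuracy
```

EVA's selling point, per the abstract, is reaching a comparable verification conclusion without injecting MMS-style source terms into the code under test.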

  9. Dependency graph for code analysis on emerging architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shashkov, Mikhail Jurievich; Lipnikov, Konstantin

A directed acyclic dependency graph (DAG) is becoming the standard for modern multi-physics codes. The ideal DAG is the true block-scheme of a multi-physics code. It is therefore a convenient object for in situ analysis of the cost of computations and of algorithmic bottlenecks related to statistically frequent data motion and the dynamical machine state.
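One simple form of the cost analysis such a DAG enables is a critical-path computation: order the tasks topologically, then propagate accumulated cost along dependencies to find the most expensive chain. The sketch below is generic; the task names and costs are invented for illustration and have no connection to any particular multi-physics code.

```python
from collections import defaultdict

# Hedged sketch: a multi-physics code's tasks as a DAG, with a topological
# order used to find the critical (most expensive) path. Tasks and costs
# are invented placeholders.

edges = {                      # task -> downstream tasks
    "hydro": ["transport"],
    "transport": ["update"],
    "eos": ["transport", "update"],
    "update": [],
}
cost = {"hydro": 5.0, "eos": 2.0, "transport": 7.0, "update": 1.0}

def topo_order(edges):
    indeg = defaultdict(int)
    for u, vs in edges.items():
        indeg[u] += 0
        for v in vs:
            indeg[v] += 1
    ready = [u for u, d in indeg.items() if d == 0]
    order = []
    while ready:
        u = ready.pop()
        order.append(u)
        for v in edges[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return order

def critical_path_cost(edges, cost):
    finish = {}
    for u in topo_order(edges):          # predecessors finish before u
        preds = [p for p, vs in edges.items() if u in vs]
        finish[u] = cost[u] + max((finish[p] for p in preds), default=0.0)
    return max(finish.values())

total = critical_path_cost(edges, cost)  # hydro -> transport -> update
```

In this toy graph the critical path is hydro, transport, update at a cost of 13.0; tasks off that path (here, eos) are candidates for overlap, which is the bottleneck information the abstract alludes to.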

  10. Automatic Coding of Dialogue Acts in Collaboration Protocols

    ERIC Educational Resources Information Center

    Erkens, Gijsbert; Janssen, Jeroen

    2008-01-01

    Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…

  11. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1992-01-01

    Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
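The multigrid concept the study implements can be illustrated on a small model problem: smooth the error with a few relaxation sweeps, correct the remaining smooth error on a coarser grid, and recurse. The sketch below is a hedged illustration on the 1D Poisson equation with weighted Jacobi smoothing; it is not the Proteus implementation, which targets the compressible Euler and Navier-Stokes equations.

```python
# Hedged multigrid illustration (not Proteus): a recursive V-cycle for
# -u'' = f on [0, 1] with u(0) = u(1) = 0, using weighted-Jacobi smoothing,
# full-weighting restriction, and linear-interpolation prolongation.

def residual(u, f, h):
    n = len(u) - 1
    return [0.0] + [f[i] - (2*u[i] - u[i-1] - u[i+1]) / h**2
                    for i in range(1, n)] + [0.0]

def jacobi(u, f, h, sweeps, w=2/3):
    n = len(u) - 1
    for _ in range(sweeps):
        u = [0.0] + [(1 - w)*u[i] + w*0.5*(u[i-1] + u[i+1] + h*h*f[i])
                     for i in range(1, n)] + [0.0]
    return u

def v_cycle(u, f, h, pre=3, post=3):
    n = len(u) - 1
    if n == 2:  # coarsest grid: one unknown, solve 2 u1 / h^2 = f1 directly
        return [0.0, f[1] * h * h / 2.0, 0.0]
    u = jacobi(u, f, h, pre)                        # pre-smoothing
    r = residual(u, f, h)
    rc = [0.0] + [0.25*(r[2*i-1] + 2*r[2*i] + r[2*i+1])
                  for i in range(1, n//2)] + [0.0]  # full weighting
    ec = v_cycle([0.0]*(n//2 + 1), rc, 2*h, pre, post)
    e = [0.0]*(n + 1)                               # prolongate correction
    for i in range(1, n//2):
        e[2*i] = ec[i]
    for i in range(1, n, 2):
        e[i] = 0.5*(e[i-1] + e[i+1])
    u = [u[i] + e[i] for i in range(n + 1)]
    return jacobi(u, f, h, post)                    # post-smoothing

n = 64
h = 1.0 / n
f = [1.0] * (n + 1)
u = [0.0] * (n + 1)
r0 = max(abs(x) for x in residual(u, f, h))
for _ in range(5):
    u = v_cycle(u, f, h)
r5 = max(abs(x) for x in residual(u, f, h))
```

The residual drops by orders of magnitude in a handful of cycles, independent of grid size in the ideal case, which is the convergence acceleration the study seeks for the Proteus code.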

  12. Finite element analysis of inviscid subsonic boattail flow

    NASA Technical Reports Server (NTRS)

    Chima, R. V.; Gerhart, P. M.

    1981-01-01

A finite element code for analysis of inviscid subsonic flows over arbitrary nonlifting planar or axisymmetric bodies is described. The code solves a novel primitive variable formulation of the coupled irrotationality and compressible continuity equations. Results for flow over a cylinder, a sphere, and a NACA 0012 airfoil verify the code. Computed subcritical flows over an axisymmetric boattailed afterbody compare well with finite difference results and experimental data. Iterative coupling with an integral turbulent boundary layer code shows strong viscous effects on the inviscid flow. Improvements in code efficiency and extensions to transonic flows are discussed.

  13. PASCO: Structural panel analysis and sizing code: Users manual - Revised

    NASA Technical Reports Server (NTRS)

    Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.

    1981-01-01

    A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.

  14. Identification of novel mRNAs and lncRNAs associated with mouse experimental colitis and human inflammatory bowel disease.

    PubMed

    Rankin, Carl Robert; Theodorou, Evangelos; Law, Ivy Ka Man; Rowe, Lorraine; Kokkotou, Efi; Pekow, Joel; Wang, Jiafang; Martin, Martin G; Pothoulakis, Charalabos; Padua, David Miguel

    2018-06-28

Inflammatory bowel disease (IBD) is a complex disorder that is associated with significant morbidity. While many recent advances have been made with new diagnostic and therapeutic tools, a deeper understanding of its basic pathophysiology is needed to continue this trend towards improving treatments. By utilizing an unbiased, high-throughput transcriptomic analysis of two well-established mouse models of colitis, we set out to uncover novel coding and non-coding RNAs that are differentially expressed in the setting of colonic inflammation. RNA-seq analysis was performed using colonic tissue from two mouse models of colitis, a dextran sodium sulfate-induced model and a genetically induced model in mice lacking IL-10. We identified 81 coding RNAs that were commonly altered in both experimental models. Of these coding RNAs, 12 of the human orthologs were differentially expressed in a transcriptomic analysis of IBD patients. Interestingly, 5 of the 12 human differentially expressed genes have not been previously identified as IBD-associated genes, including ubiquitin D. Our analysis also identified 15 non-coding RNAs that were differentially expressed in either mouse model. Surprisingly, only three non-coding RNAs were commonly dysregulated in both of these models. The discovery of these new coding and non-coding RNAs expands our transcriptional knowledge of mouse models of IBD and offers additional targets to deepen our understanding of the pathophysiology of IBD.

  15. A CFD/CSD Interaction Methodology for Aircraft Wings

    NASA Technical Reports Server (NTRS)

    Bhardwaj, Manoj K.

    1997-01-01

With advanced subsonic transports and military aircraft operating in the transonic regime, it is becoming important to determine the effects of the coupling between aerodynamic loads and elastic forces. Since aeroelastic effects can contribute significantly to the design of these aircraft, there is a strong need in the aerospace industry to predict these aero-structure interactions computationally. To perform static aeroelastic analysis in the transonic regime, high fidelity computational fluid dynamics (CFD) analysis tools must be used in conjunction with high fidelity computational structural dynamics (CSD) analysis tools due to the nonlinear behavior of the aerodynamics in the transonic regime. There is also a need to be able to use a wide variety of CFD and CSD tools to predict these aeroelastic effects in the transonic regime. Because source codes are not always available, it is necessary to couple the CFD and CSD codes without alteration of the source codes. In this study, an aeroelastic coupling procedure is developed which will perform static aeroelastic analysis using any CFD and CSD code with little code integration. The aeroelastic coupling procedure is demonstrated on an F/A-18 Stabilator using NASTD (an in-house McDonnell Douglas CFD code) and NASTRAN. In addition, the Aeroelastic Research Wing (ARW-2) is used for demonstration of the aeroelastic coupling procedure by using ENSAERO (NASA Ames Research Center CFD code) and a finite element wing-box code (developed as part of this research).

  16. Development of Safety Analysis Code System of Beam Transport and Core for Accelerator Driven System

    NASA Astrophysics Data System (ADS)

    Aizawa, Naoto; Iwasaki, Tomohiko

    2014-06-01

A safety analysis code system of beam transport and core for accelerator driven systems (ADS) is developed for the analysis of beam transients such as changes in the shape and position of the incident beam. The code system consists of a beam transport analysis part and a core analysis part. TRACE 3-D is employed in the beam transport analysis part, and the shape and incident position of the beam at the target are calculated. In the core analysis part, the neutronics, thermal-hydraulics and cladding failure analyses are performed with the ADS dynamic calculation code ADSE, on the basis of the external source database calculated by PHITS and the cross section database calculated by SRAC, together with programs for cladding failure analysis covering thermoelastic deformation and creep. Using the code system, beam transient analyses are performed for the ADS proposed by the Japan Atomic Energy Agency. As a result, the cladding temperature rises rapidly and plastic deformation occurs within several seconds. In addition, the cladding is evaluated to fail by creep within a hundred seconds. These results show that the beam transients can cause cladding failure.
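A creep-failure check of the kind such a module performs can be sketched with a life-fraction rule: accumulate damage D as the sum of dt divided by the rupture time at the instantaneous temperature, and flag failure when D reaches 1. The rupture-time correlation and every constant below are invented for illustration; real cladding analyses use material-specific correlations.

```python
import math

# Hedged sketch of a life-fraction creep-damage check: D = sum(dt / t_r(T))
# along a temperature transient, failure when D >= 1. The rupture-time
# correlation and all constants are invented placeholders, not cladding data.

def rupture_time(T):
    # Hypothetical Arrhenius-like rupture-time correlation (seconds).
    return 1e-6 * math.exp(2.0e4 / T)

def creep_damage(temps, dt):
    D = 0.0
    for T in temps:
        D += dt / rupture_time(T)
        if D >= 1.0:
            return D, True      # failure flagged
    return D, False

# Toy transient: temperature ramps from 800 K to 1400 K over 100 s.
dt = 0.1
temps = [800.0 + 6.0 * (i * dt) for i in range(1000)]
D, failed = creep_damage(temps, dt)
```

Because the rupture time shortens exponentially as temperature rises, nearly all the damage accumulates late in the ramp, which is why a transient that heats the cladding for tens of seconds can produce creep failure within roughly a hundred seconds.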

  17. EBT reactor systems analysis and cost code: description and users guide (Version 1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santoro, R.T.; Uckan, N.A.; Barnes, J.M.

    1984-06-01

An ELMO Bumpy Torus (EBT) reactor systems analysis and cost code that incorporates the most recent advances in EBT physics has been written. The code determines a set of reactors that fall within an allowed operating window determined from the coupling of ring and core plasma properties and the self-consistent treatment of the coupled ring-core stability and power balance requirements. The essential elements of the systems analysis and cost code are described, along with the calculational sequences leading to the specification of the reactor options and their associated costs. The input parameters, the constraints imposed upon them, and the operating range over which the code provides valid results are discussed. A sample problem and the interpretation of the results are also presented.
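The operating-window idea common to systems/cost codes of this kind can be sketched as a scan: enumerate candidate design points, discard those that violate the physics constraints, and cost the survivors. Everything below is an invented placeholder (the parameters, the constraints, and the cost scaling), not the EBT code's actual models.

```python
# Hedged sketch of an operating-window scan: filter a design space by
# placeholder constraints, then cost the feasible points. All ranges,
# constraints, and the cost model are invented for illustration.

def feasible(point):
    B, R, beta = point["B_T"], point["R_m"], point["beta"]
    # Placeholder constraints standing in for the coupled ring-core
    # stability and power-balance requirements.
    return 0.01 <= beta <= 0.05 and B * R >= 20.0

def cost(point):
    # Placeholder capital-cost scaling with field strength and size.
    return point["B_T"] ** 2 * point["R_m"] ** 1.5

candidates = [
    {"B_T": B, "R_m": R, "beta": beta}
    for B in (3.0, 4.0, 5.0)          # toroidal field, T
    for R in (4.0, 6.0, 8.0)          # major radius, m
    for beta in (0.005, 0.02, 0.04)   # core beta
]
window = [p for p in candidates if feasible(p)]
best = min(window, key=cost)
```

The surviving points constitute the "allowed operating window", and the cheapest survivor is the kind of reactor option the code reports with its associated cost.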

  18. Research and Trends in the Field of Technology-Enhanced Learning from 2006 to 2011: A Content Analysis of Quick Response Code (QR-Code) and Its Application in Selected Studies

    ERIC Educational Resources Information Center

    Hau, Goh Bak; Siraj, Saedah; Alias, Norlidah; Rauf, Rose Amnah Abd.; Zakaria, Abd. Razak; Darusalam, Ghazali

    2013-01-01

    This study provides a content analysis of selected articles in the field of QR code and its application in educational context that were published in journals and proceedings of international conferences and workshops from 2006 to 2011. These articles were cross analysed by published years, journal, and research topics. Further analysis was…

  19. Computer codes developed and under development at Lewis

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1992-01-01

    The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.

  20. Improvements in the MGA Code Provide Flexibility and Better Error Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruhter, W D; Kerr, J

    2005-05-26

The Multi-Group Analysis (MGA) code is widely used to determine nondestructively the relative isotopic abundances of plutonium by gamma-ray spectrometry. MGA users have expressed concern about the lack of flexibility and transparency in the code. Users often have to ask the code developers for modifications to the code to accommodate new measurement situations, such as additional peaks being present in the plutonium spectrum or expected peaks being absent. We are testing several new improvements to a prototype, general gamma-ray isotopic analysis tool with the intent of either revising or replacing the MGA code. These improvements will give the user the ability to modify, add, or delete the gamma- and x-ray energies and branching intensities used by the code in determining a more precise gain and in the determination of the relative detection efficiency. We have also fully integrated the determination of the relative isotopic abundances with the determination of the relative detection efficiency to provide a more accurate determination of the errors in the relative isotopic abundances. We provide details in this paper on these improvements and a comparison of results obtained with current versions of the MGA code.

  1. A Computer Program for Flow-Log Analysis of Single Holes (FLASH)

    USGS Publications Warehouse

    Day-Lewis, F. D.; Johnson, C.D.; Paillet, Frederick L.; Halford, K.J.

    2011-01-01

A new computer program, FLASH (Flow-Log Analysis of Single Holes), is presented for the analysis of borehole vertical flow logs. The code is based on an analytical solution for steady-state multilayer radial flow to a borehole. The code includes options for (1) discrete fractures and (2) multilayer aquifers. Given vertical flow profiles collected under both ambient and stressed (pumping or injection) conditions, the user can estimate fracture (or layer) transmissivities and far-field hydraulic heads. FLASH is coded in Microsoft Excel with Visual Basic for Applications routines. The code supports manual and automated model calibration. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.
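The inference FLASH performs can be sketched in greatly simplified form: for steady radial flow to a borehole with a common far-field head, the Thiem equation makes each layer's inflow proportional to its transmissivity, so measured layer inflows give relative transmissivities directly. The numbers below are invented, and the real code also handles ambient flow and layer-specific far-field heads.

```python
import math

# Hedged sketch of FLASH-style inference (greatly simplified): under a common
# far-field head, Thiem gives Q_i = 2*pi*T_i*(h_far - h_well)/ln(r0/rw), so
# layer inflows are proportional to layer transmissivities. Values invented.

def layer_transmissivities(layer_inflows, total_T):
    # Apportion a known total transmissivity by measured inflow fractions.
    total_q = sum(layer_inflows)
    return [total_T * q / total_q for q in layer_inflows]

def layer_inflow(T, h_far, h_well, r0=100.0, rw=0.1):
    # Thiem equation for a single layer (consistent units assumed).
    return 2 * math.pi * T * (h_far - h_well) / math.log(r0 / rw)

# Measured inflows (m^3/d) to three fractures under pumping, plus an
# independently known total transmissivity (m^2/d):
inflows = [6.0, 3.0, 1.0]
Ts = layer_transmissivities(inflows, total_T=50.0)
```

Comparing ambient and stressed profiles, as the abstract describes, additionally lets the far-field heads be separated from the transmissivities; the sketch above shows only the stressed-profile apportionment.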

  2. Barriers to Early Detection of Breast Cancer Among African American Females Over Age of 55

    DTIC Science & Technology

    2005-02-01

used for data analysis. NUDIST, software for qualitative data analysis, will be used for systematic coding. All transcripts, as well as interviewer notes...will be coded in NUDIST. Dr. Smith and Mr. Worts will jointly develop the NUDIST coding system. Each of them will separately code each transcript and...already provided training in NUDIST to Dr. Smith and Mr. Worts. All interviews will be conducted by the Principal Investigator for this study who is

  3. Development and application of structural dynamics analysis capabilities

    NASA Technical Reports Server (NTRS)

    Heinemann, Klaus W.; Hozaki, Shig

    1994-01-01

    Extensive research activities were performed in the area of multidisciplinary modeling and simulation of aerospace vehicles that are relevant to NASA Dryden Flight Research Facility. The efforts involved theoretical development, computer coding, and debugging of the STARS code. New solution procedures were developed in such areas as structures, CFD, and graphics, among others. Furthermore, systems-oriented codes were developed for rendering the code truly multidisciplinary and rather automated in nature. Also, work was performed in pre- and post-processing of engineering analysis data.

  4. Simplified diagnostic coding sheet for computerized data storage and analysis in ophthalmology.

    PubMed

    Tauber, J; Lahav, M

    1987-11-01

A review of currently available diagnostic coding systems revealed that most are either too abbreviated or too detailed. We have compiled a simplified diagnostic coding sheet based on the International Classification of Diseases (ICD-9), which is both complete and easy to use in a general practice. The information is transferred to a computer, which uses the relevant ICD-9 diagnoses as a database; the data can be retrieved later for display of patients' problems or analysis of clinical data.

  5. Teaching, Morality, and Responsibility: A Structuralist Analysis of a Teachers' Code of Conduct

    ERIC Educational Resources Information Center

    Shortt, Damien; Hallett, Fiona; Spendlove, David; Hardy, Graham; Barton, Amanda

    2012-01-01

    In this paper we conduct a Structuralist analysis of the General Teaching Council for England's "Code of Conduct and Practice for Registered Teachers" in order to reveal how teachers are required to fulfil an apparently impossible social role. The GTCE's "Code," we argue, may be seen as an attempt by a government agency to…

  6. New Tool Released for Engine-Airframe Blade-Out Structural Simulations

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles

    2004-01-01

Researchers at the NASA Glenn Research Center have enhanced a general-purpose finite element code, NASTRAN, for engine-airframe structural simulations during steady-state and transient operating conditions. For steady-state simulations, the code can predict critical operating speeds, natural modes of vibration, and forced response (e.g., cabin noise and component fatigue). The code can be used to perform static analysis to predict engine-airframe response and component stresses due to maneuver loads. For transient response, the simulation code can be used to predict response due to blade-out events and subsequent engine shutdown and windmilling conditions. In addition, the code can be used as a pretest analysis tool to predict the results of the blade-out test required for FAA certification of new and derivative aircraft engines. Before the present analysis code was developed, all the major aircraft engine and airframe manufacturers in the United States and overseas were performing similar types of analyses to ensure the structural integrity of engine-airframe systems. Although there were many similarities among the analysis procedures, each manufacturer was developing and maintaining its own structural analysis capabilities independently. This situation led to high software development and maintenance costs, complications with manufacturers exchanging models and results, and limitations in predicting the structural response to the desired degree of accuracy. An industry-NASA team was formed to overcome these problems by developing a common analysis tool that would satisfy all the structural analysis needs of the industry and that would be available and supported by a commercial software vendor so that the team members would be relieved of maintenance and development responsibilities. Input from all the team members was used to ensure that everyone's requirements were satisfied and that the best technology was incorporated into the code. Furthermore, because the code would be distributed by a commercial software vendor, it would be more readily available to engine and airframe manufacturers, as well as to nonaircraft companies that did not previously have access to this capability.

  7. National Combustion Code Parallel Performance Enhancements

    NASA Technical Reports Server (NTRS)

    Quealy, Angela; Benyo, Theresa (Technical Monitor)

    2002-01-01

    The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. The unstructured grid, reacting flow code uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC code to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This report describes recent parallel processing modifications to NCC that have improved the parallel scalability of the code, enabling a two hour turnaround for a 1.3 million element fully reacting combustion simulation on an SGI Origin 2000.

  8. Performance enhancement of optical code-division multiple-access systems using transposed modified Walsh code

    NASA Astrophysics Data System (ADS)

    Sikder, Somali; Ghosh, Shila

    2018-02-01

    This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.
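The abstract does not give the TMWC construction itself, but the code family it modifies starts from Walsh-Hadamard rows mapped to unipolar chip patterns; a minimal sketch of that starting point (the `hadamard` helper and the transposition step are generic illustrations, not the paper's algorithm):

```python
import numpy as np

def hadamard(n):
    # Sylvester construction: n must be a power of two
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# Unipolar codes from Walsh rows: map {+1, -1} -> {1, 0} and drop the
# all-ones row, which carries no spreading information
H = hadamard(8)
walsh_unipolar = (H[1:] + 1) // 2

# Transposing the code matrix reassigns chips across users; the exact
# "transposed modified" construction of the paper is not specified here
T = walsh_unipolar.T

# In-phase cross-correlation between two users (dot product of chip patterns)
cc = int(walsh_unipolar[0] @ walsh_unipolar[1])
```

Low, fixed cross-correlation between distinct rows is what limits multiple-access interference in a SAC/wavelength-hopping system.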

  9. Analysis of Phenix end-of-life natural convection test with the MARS-LMR code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeong, H. Y.; Ha, K. S.; Lee, K. L.

    The end-of-life test of Phenix reactor performed by the CEA provided an opportunity to have reliable and valuable test data for the validation and verification of a SFR system analysis code. KAERI joined this international program for the analysis of Phenix end-of-life natural circulation test coordinated by the IAEA from 2008. The main objectives of this study were to evaluate the capability of existing SFR system analysis code MARS-LMR and to identify any limitation of the code. The analysis was performed in three stages: pre-test analysis, blind posttest analysis, and final post-test analysis. In the pre-test analysis, the design conditionsmore » provided by the CEA were used to obtain a prediction of the test. The blind post-test analysis was based on the test conditions measured during the tests but the test results were not provided from the CEA. The final post-test analysis was performed to predict the test results as accurate as possible by improving the previous modeling of the test. Based on the pre-test analysis and blind test analysis, the modeling for heat structures in the hot pool and cold pool, steel structures in the core, heat loss from roof and vessel, and the flow path at core outlet were reinforced in the final analysis. The results of the final post-test analysis could be characterized into three different phases. In the early phase, the MARS-LMR simulated the heat-up process correctly due to the enhanced heat structure modeling. In the mid phase before the opening of SG casing, the code reproduced the decrease of core outlet temperature successfully. Finally, in the later phase the increase of heat removal by the opening of the SG opening was well predicted with the MARS-LMR code. (authors)« less

  10. Performance Analysis of Hybrid ARQ Protocols in a Slotted Code Division Multiple-Access Network

    DTIC Science & Technology

    1989-08-01

    …Convolutional Codes," in Proc. Int. Conf. Commun., 21.4.1-21.4.5, 1987. [27] J. Hagenauer, "Rate Compatible Punctured Convolutional Codes," in Proc. Int. Conf… achieved by using a low-rate (r = 0.5), high-constraint-length (e.g., 32) punctured convolutional code. Code puncturing provides for a variable-rate code… investigated the use of convolutional codes in Type II Hybrid ARQ protocols. The error

  11. Development of the Off-line Analysis Code for GODDESS

    NASA Astrophysics Data System (ADS)

    Garland, Heather; Cizewski, Jolie; Lepailleur, Alex; Walters, David; Pain, Steve; Smith, Karl

    2016-09-01

    Determining (n, γ) cross sections on unstable nuclei is important for understanding the r-process that is theorized to occur in supernovae and neutron-star mergers. However, (n, γ) reactions are difficult to measure directly because of the short lifetimes of the neutron-rich nuclei involved. A possible surrogate for the (n, γ) reaction is the (d,p γ) reaction; the measurement of these reactions in inverse kinematics is part of the scope of GODDESS - Gammasphere ORRUBA (Oak Ridge Rutgers University Barrel Array): Dual Detectors for Experimental Structure Studies. The development of an accurate and efficient off-line analysis code for GODDESS experiments is not only essential, but also provides a unique opportunity to create an analysis code designed specifically for transfer reaction experiments. The off-line analysis code has been developed to produce histograms from the binary data file to determine how best to sort events. Recent developments in the off-line analysis code will be presented, as well as details on the energy and position calibrations for the ORRUBA detectors. This work is supported in part by the U.S. Department of Energy and National Science Foundation.

  12. Color Coding of Circuit Quantities in Introductory Circuit Analysis Instruction

    ERIC Educational Resources Information Center

    Reisslein, Jana; Johnson, Amy M.; Reisslein, Martin

    2015-01-01

    Learning the analysis of electrical circuits represented by circuit diagrams is often challenging for novice students. An open research question in electrical circuit analysis instruction is whether color coding of the mathematical symbols (variables) that denote electrical quantities can improve circuit analysis learning. The present study…

  13. Maximum likelihood decoding analysis of accumulate-repeat-accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, A.; Divsalar, D.; Yao, K.

    2004-01-01

    In this paper, the performance of repeat-accumulate codes with maximum-likelihood (ML) decoding is analyzed and compared to that of random codes by means of very tight bounds. Some simple codes are shown to perform very close to the Shannon limit with maximum-likelihood decoding.
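The "very tight bounds" of the paper are not reproduced in the abstract; the baseline such bounds improve upon is the standard ML union bound for a linear code on the AWGN channel (a generic textbook form shown here for orientation, not the paper's bound):

```latex
P_e \;\le\; \sum_{d = d_{\min}}^{n} A_d \, Q\!\left(\sqrt{\frac{2\, d\, R\, E_b}{N_0}}\right),
```

where \(A_d\) is the number of codewords of Hamming weight \(d\), \(R\) the code rate, and \(Q(\cdot)\) the Gaussian tail function. Tight bounds replace the per-codeword union over pairwise error events with sharper averaging arguments.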

  14. Response surface method in geotechnical/structural analysis, phase 1

    NASA Astrophysics Data System (ADS)

    Wong, F. S.

    1981-02-01

    In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of the response surface development and the feasibility of the method are shown using a sample problem in slope stability which is based on data from centrifuge experiments of model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
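The workflow described above (a few expensive code runs, a fitted surface, then cheap repeated evaluation) can be sketched as follows; `expensive_code` is a hypothetical stand-in for the long-running analysis code, and the quadratic fit is only one possible choice of surface:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_code(x):
    # Stand-in for a long-running analysis code (hypothetical response)
    return 1.5 + 2.0 * x + 0.5 * x**2

# Step 1: a handful of code runs at chosen design points
design = np.array([-1.0, 0.0, 1.0, 2.0])
responses = expensive_code(design)

# Step 2: fit a quadratic response surface by least squares
coeffs = np.polyfit(design, responses, deg=2)
surface = np.poly1d(coeffs)

# Step 3: replace the code with the surface in a Monte Carlo analysis
# (10,000 surrogate evaluations cost far less than one real code run)
samples = rng.normal(0.0, 0.5, size=10_000)
mean_estimate = surface(samples).mean()
```

The surrogate is queried thousands of times at negligible cost, which is the source of the efficiency gain the abstract reports.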

  15. Moderate Deviation Analysis for Classical Communication over Quantum Channels

    NASA Astrophysics Data System (ADS)

    Chubb, Christopher T.; Tan, Vincent Y. F.; Tomamichel, Marco

    2017-11-01

    We analyse families of codes for classical data transmission over quantum channels that have both a vanishing probability of error and a code rate approaching capacity as the code length increases. To characterise the fundamental tradeoff between decoding error, code rate and code length for such codes we introduce a quantum generalisation of the moderate deviation analysis proposed by Altuğ and Wagner as well as Polyanskiy and Verdú. We derive such a tradeoff for classical-quantum (as well as image-additive) channels in terms of the channel capacity and the channel dispersion, giving further evidence that the latter quantity characterises the necessary backoff from capacity when transmitting finite blocks of classical data. To derive these results we also study asymmetric binary quantum hypothesis testing in the moderate deviations regime. Due to the central importance of the latter task, we expect that our techniques will find further applications in the analysis of other quantum information processing tasks.
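Schematically, the moderate-deviation tradeoff takes the following form (a sketch of the scaling only, under the standard moderate-deviation assumptions; the paper's precise statement is not reproduced here): for any sequence \(x_n \to 0\) with \(n x_n^2 \to \infty\), there exist code families with

```latex
R_n = C - x_n
\qquad\text{and}\qquad
\epsilon_n = \exp\!\left(-\frac{n\, x_n^{2}}{2V}\bigl(1 + o(1)\bigr)\right),
```

where \(C\) is the channel capacity and \(V\) the channel dispersion. The regime interpolates between the small-deviation (normal approximation) and large-deviation (error exponent) analyses.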

  16. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arndt, S.A.

    1997-07-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to include in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability, and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real time to validate PRA results for specific configurations and plant system unavailabilities.

  17. Colour cyclic code for Brillouin distributed sensors

    NASA Astrophysics Data System (ADS)

    Le Floch, Sébastien; Sauser, Florian; Llera, Miguel; Rochat, Etienne

    2015-09-01

    For the first time, colour cyclic coding (CCC) is theoretically and experimentally demonstrated for Brillouin optical time-domain analysis (BOTDA) distributed sensors. Compared to traditional intensity-modulated cyclic codes, the code provides an additional gain of √2 while keeping the same number of sequences as for colour coding. A comparison with a standard BOTDA sensor is realized and validates the theoretical coding gain.

  18. Design Analysis of SNS Target Station Biological Shielding Monolith with Proton Power Uprate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bekar, Kursat B.; Ibrahim, Ahmad M.

    2017-05-01

    This report documents the analysis of the dose rate in the experiment area outside the Spallation Neutron Source (SNS) target station shielding monolith with a proton beam energy of 1.3 GeV. The analysis implemented a coupled three-dimensional (3D)/two-dimensional (2D) approach that used both the Monte Carlo N-Particle Extended (MCNPX) 3D Monte Carlo code and the Discrete Ordinates Transport (DORT) 2D deterministic code. The analysis with a proton beam energy of 1.3 GeV showed that the dose rate in continuously occupied areas on the lateral surface outside the SNS target station shielding monolith is less than 0.25 mrem/h, which complies with the SNS facility design objective. However, the methods and codes used in this analysis are out of date and unsupported, and the 2D approximation of the target shielding monolith does not accurately represent the geometry. We recommend that this analysis be updated with modern codes and libraries such as ADVANTG or SHIFT. These codes have demonstrated very high efficiency in performing full 3D radiation shielding analyses of similar and even more difficult problems.

  19. Airfoil Vibration Dampers program

    NASA Technical Reports Server (NTRS)

    Cook, Robert M.

    1991-01-01

    The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.

  20. The Use and Effectiveness of Triple Multiplex System for Coding Region Single Nucleotide Polymorphism in Mitochondrial DNA Typing of Archaeologically Obtained Human Skeletons from Premodern Joseon Tombs of Korea

    PubMed Central

    Oh, Chang Seok; Lee, Soong Deok; Kim, Yi-Suk; Shin, Dong Hoon

    2015-01-01

    A previous study showed that East Asian mtDNA haplogroups, especially those of Koreans, could be successfully assigned by the coupled use of analyses of coding-region SNP markers and control-region mutation motifs. In this study, we examined whether the same triple multiplex analysis for coding-region SNPs could also be applied to ancient samples from East Asia as a complement to sequence analysis of the mtDNA control region. From the study of Joseon skeleton samples, we know that the mtDNA haplogroup determined by coding-region SNP markers falls within the same haplogroup that sequence analysis of the control region assigns. Considering that ancient samples in previous studies produced a considerable number of errors in control-region mtDNA sequencing, coding-region SNP analysis can serve as a good complement to conventional haplogroup determination, especially for archaeological human bone samples buried underground over long periods. PMID:26345190

  1. Visual Computing Environment Workshop

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles (Compiler)

    1998-01-01

    The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.

  2. Mads.jl

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vesselinov, Velimir; O'Malley, Daniel; Lin, Youzuo

    2016-07-01

    Mads.jl (Model analysis and decision support in Julia) is a code that streamlines the process of using data and models for analysis and decision support. It is based on another open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). Mads.jl can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. It enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. The code can also use a series of alternative adaptive computational techniques for Bayesian sampling, Monte Carlo, and Bayesian Information-Gap Decision Theory. The code is implemented in the Julia programming language and has high-performance (parallel) and memory management capabilities. The code uses a series of third-party modules developed by others. The code development will also include contributions to the existing third-party modules written in Julia; these contributions will be important for the efficient implementation of the algorithms used by Mads.jl. The code also uses a series of LANL-developed modules written by Dan O'Malley; these modules will also be part of the Mads.jl release. Mads.jl will be released under the GPL v3 license. The code will be distributed as a Git repo at gitlab.com and github.com. The Mads.jl manual and documentation will be posted at madsjulia.lanl.gov.

  3. Development of Web Interfaces for Analysis Codes

    NASA Astrophysics Data System (ADS)

    Emoto, M.; Watanabe, T.; Funaba, H.; Murakami, S.; Nagayama, Y.; Kawahata, K.

    Several codes have been developed to analyze plasma physics. However, most of them were developed to run on supercomputers. Therefore, users who typically use personal computers (PCs) find it difficult to use these codes. In order to facilitate the widespread use of these codes, a user-friendly interface is required. The authors propose Web interfaces for these codes. To demonstrate the usefulness of this approach, the authors developed Web interfaces for two analysis codes. The first is for FIT, developed by Murakami. This code is used to analyze the NBI heat deposition, etc. Because it requires electron density profiles, electron temperatures, and ion temperatures as polynomial expressions, users unfamiliar with the experiments, especially visitors from other institutes, find it difficult to use this code. The second is for visualizing the lines of force in the LHD (large helical device), developed by Watanabe. This code is used to analyze the interference caused by the lines of force resulting from the various structures installed in the vacuum vessel of the LHD. This code runs on PCs; however, it requires that the necessary parameters be edited manually. Using these Web interfaces, users can execute these codes interactively.

  4. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  5. TOOKUIL: A case study in user interface development for safety code application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G.

    1997-07-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.

  6. Performance Analysis and Optimization on the UCLA Parallel Atmospheric General Circulation Model Code

    NASA Technical Reports Server (NTRS)

    Lou, John; Ferraro, Robert; Farrara, John; Mechoso, Carlos

    1996-01-01

    An analysis is presented of several factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on massively parallel computer systems. Several modifications to the original parallel AGCM code, aimed at improving its numerical efficiency, interprocessor communication cost, load balance, and single-node code performance, are discussed.

  7. "SEN's Completely Different Now": Critical Discourse Analysis of Three "Codes of Practice for Special Educational Needs" (1994, 2001, 2015)

    ERIC Educational Resources Information Center

    Lehane, Teresa

    2017-01-01

    Regardless of the differing shades of neo-liberalism, successive governments have claimed to champion the cause of "special educational needs and/or disability" (SEND) through official Codes of Practice in 1994, 2001 and 2015. This analysis and comparison of the three Codes of Practice aims to contribute to the debate by exploring…

  8. The Evolution of a Coding Schema in a Paced Program of Research

    ERIC Educational Resources Information Center

    Winters, Charlene A.; Cudney, Shirley; Sullivan, Therese

    2010-01-01

    A major task involved in the management, analysis, and integration of qualitative data is the development of a coding schema to facilitate the analytic process. Described in this paper is the evolution of a coding schema that was used in the analysis of qualitative data generated from online forums of middle-aged women with chronic conditions who…

  9. Java Source Code Analysis for API Migration to Embedded Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winter, Victor; McCoy, James A.; Guerrero, Jonathan

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.
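The core idea above, extending an incomplete code base so that every element reference resolves to some declaration, can be illustrated with a toy resolver; the type names, the flat string representation, and the stub-synthesis step are illustrative assumptions, not the article's actual analysis:

```python
# Toy illustration: extend an incomplete code base with stub declarations
# so that every element reference resolves, removing the ambiguity that
# complicates analysis of partial library subsets.
declared = {"java.lang.String", "myapp.Sensor"}  # types present in the subset
references = ["java.lang.String", "myapp.Sensor", "javax.crypto.Cipher"]

stubs = set()
resolution = {}
for ref in references:
    if ref in declared:
        resolution[ref] = "declared"
    else:
        stubs.add(ref)            # synthesize a stub for the omitted type
        resolution[ref] = "stub"
```

After this pass, every reference is bound either to a real declaration or to an explicit stub, so downstream analysis never encounters an unresolved name.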

  10. Iterative categorization (IC): a systematic technique for analysing qualitative data

    PubMed Central

    2016-01-01

    Abstract The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155

  11. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    ERIC Educational Resources Information Center

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  12. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1993-01-01

    The first year's effort on NASA Grant NAG5-2006 was an investigation to characterize typical errors resulting from the EOS downlink. The analysis methods developed for this effort were used on test data from a March 1992 White Sands Terminal Test. The effectiveness of a concatenated coding scheme of a Reed-Solomon outer code and a convolutional inner code versus a Reed-Solomon-only code scheme has been investigated, as well as the effectiveness of a Periodic Convolutional Interleaver in dispersing errors of certain types. The work effort consisted of development of software that allows simulation studies with the appropriate coding schemes plus either simulated data with errors or actual data with errors. The software program is entitled Communication Link Error Analysis (CLEAN) and models downlink errors, forward error correcting schemes, and interleavers.
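CLEAN models a Periodic Convolutional Interleaver; a simpler block interleaver illustrates the same burst-dispersion principle that motivates interleaving before an outer Reed-Solomon code (this sketch is not the CLEAN implementation):

```python
import numpy as np

def interleave(symbols, rows, cols):
    # Write row-wise, read column-wise: a classic block interleaver
    return np.asarray(symbols).reshape(rows, cols).T.ravel()

def deinterleave(symbols, rows, cols):
    # Inverse operation: write column-wise, read row-wise
    return np.asarray(symbols).reshape(cols, rows).T.ravel()

n_rows, n_cols = 4, 6
data = np.arange(n_rows * n_cols)

tx = interleave(data, n_rows, n_cols)
# A burst of 4 consecutive channel errors...
corrupted = tx.copy()
corrupted[8:12] = -1
rx = deinterleave(corrupted, n_rows, n_cols)
# ...lands on widely separated positions after deinterleaving, so an
# outer code sees isolated symbol errors instead of one long burst.
error_positions = np.where(rx == -1)[0]
```

With a 4x6 block, a 4-symbol burst is spread to positions 6 apart, within the correction capability of a modest outer code.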

  13. Incorporating Manual and Autonomous Code Generation

    NASA Technical Reports Server (NTRS)

    McComas, David

    1998-01-01

    Code can be generated manually or using code-generation software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with automatic code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an automatic code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the manual-to-automatic code interface; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.

  14. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi-diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of SAC-OCDMA systems by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). Performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytical and simulation analysis by referring to bit error rate (BER), signal-to-noise ratio (SNR), and eye patterns at the receiving end. It is shown that the EMD code, when used with the SDD technique, provides high transmission capacity, reduces receiver complexity, and provides better performance than the complementary subtraction detection (CSD) technique. Furthermore, analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.
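The paper's SNR/BER analysis is not reproduced in the abstract, but a Gaussian-approximation relation commonly used in SAC-OCDMA performance studies maps SNR to BER as follows (the noise model is an assumption here, not the paper's exact expression):

```python
import math

def ber_from_snr(snr):
    # BER = (1/2) * erfc(sqrt(SNR / 8)): Gaussian approximation often
    # used in SAC-OCDMA analyses (assumed here, not taken from the paper)
    return 0.5 * math.erfc(math.sqrt(snr / 8.0))

# Scan the SNR (in dB) needed to approach a 1e-9 acceptable-BER floor
for snr_db in (15, 20, 25, 30):
    snr = 10 ** (snr_db / 10)
    print(f"{snr_db} dB -> BER {ber_from_snr(snr):.2e}")
```

Under this relation the BER falls off steeply with SNR, which is why a fixed BER floor such as 10^-9 translates into a hard cap on the number of simultaneous subscribers.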

  15. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Bhat, Kabekode Ghanasham

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.
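The boundary effect the report highlights is easy to demonstrate: routine interior sampling of the input region can miss extremes that sit on the boundary. A toy sketch (the model function and sampling bounds are purely illustrative):

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

def model(x, y):
    # Hypothetical stand-in output: monotone in both inputs,
    # so its extremes sit at corners of the input region
    return x * np.exp(y)

# Interior random sampling (what a routine SA study might do)
pts = rng.uniform(0.05, 0.95, size=(1000, 2))
interior_max = model(pts[:, 0], pts[:, 1]).max()

# Explicitly checking the boundary corners of [0, 1]^2
corner_max = max(model(x, y) for x, y in itertools.product([0.0, 1.0], repeat=2))

# The corner value exceeds anything seen in the interior sample,
# mirroring the boundary extremes that standard SA methods overlook
```

For a monotone response the true maximum is at the corner (1, 1), which no interior sample can reach regardless of sample size.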

  16. Space shuttle main engine numerical modeling code modifications and analysis

    NASA Technical Reports Server (NTRS)

    Ziebarth, John P.

    1988-01-01

    The user of computational fluid dynamics (CFD) codes must be concerned with the accuracy and efficiency of the codes if they are to be used for timely design and analysis of complicated three-dimensional fluid flow configurations. A brief discussion of how accuracy and efficiency affect the CFD solution process is given. A more detailed discussion of how efficiency can be enhanced by using a few Cray Research Inc. utilities to address vectorization is presented, and these utilities are applied to a three-dimensional Navier-Stokes CFD code (INS3D).

  17. General practitioners’ justifications for therapeutic inertia in cardiovascular prevention: an empirically grounded typology

    PubMed Central

    Lebeau, Jean-Pierre; Cadwallader, Jean-Sébastien; Vaillant-Roussel, Hélène; Pouchain, Denis; Yaouanc, Virginie; Aubin-Auger, Isabelle; Mercier, Alain; Rusch, Emmanuel; Remmen, Roy; Vermeire, Etienne; Hendrickx, Kristin

    2016-01-01

    Objective To construct a typology of general practitioners’ (GPs) responses regarding their justification of therapeutic inertia in cardiovascular primary prevention for high-risk patients with hypertension. Design Empirically grounded construction of typology. Types were defined by attributes derived from the qualitative analysis of GPs’ reported reasons for inaction. Participants 256 GPs randomised in the intervention group of a cluster randomised controlled trial. Setting GPs members of 23 French Regional Colleges of Teachers in General Practice, included in the EffectS of a multifaceted intervention on CArdiovascular risk factors in high-risk hyPErtensive patients (ESCAPE) trial. Data collection and analysis The database consisted of 2638 written responses given by the GPs to an open-ended question asking for the reasons why drug treatment was not changed as suggested by the national guidelines. All answers were coded using constant comparison analysis. A matrix analysis of codes per GP allowed the construction of a response typology, where types were defined by codes as attributes. Initial coding and definition of types were performed independently by two teams. Results Initial coding resulted in a list of 69 codes in the final codebook, representing 4764 coded references in the question responses. A typology including seven types was constructed. 100 GPs were allocated to one and only one of these types, while 25 GPs did not provide enough data to allow classification. Types (numbers of GPs allocated) were: ‘optimists’ (28), ‘negotiators’ (20), ‘checkers’ (15), ‘contextualisers’ (13), ‘cautious’ (11), ‘rounders’ (8) and ‘scientists’ (5). For the 36 GPs that provided 50 or more coded references, analysis of the code evolution over time and across patients showed a consistent belonging to the initial type for any given GP. 
Conclusion This typology could provide GPs with some insight into their general ways of considering changes in the treatment/management of cardiovascular risk factors and guide design of specific physician-centred interventions to reduce inappropriate inaction. Trial registration number NCT00348855. PMID:27178974

  18. Vibration Response Models of a Stiffened Aluminum Plate Excited by a Shaker

    NASA Technical Reports Server (NTRS)

    Cabell, Randolph H.

    2008-01-01

    Numerical models of structural-acoustic interactions are of interest to aircraft designers and the space program. This paper describes a comparison between two energy finite element codes, a statistical energy analysis code, a structural finite element code, and the experimentally measured response of a stiffened aluminum plate excited by a shaker. Different methods for modeling the stiffeners and the power input from the shaker are discussed. The results show that the energy codes (energy finite element and statistical energy analysis) accurately predicted the measured mean square velocity of the plate. In addition, predictions from an energy finite element code had the best spatial correlation with measured velocities. However, predictions from a considerably simpler, single subsystem, statistical energy analysis model also correlated well with the spatial velocity distribution. The results highlight a need for further work to understand the relationship between modeling assumptions and the prediction results.

  19. Statistical properties of DNA sequences

    NASA Technical Reports Server (NTRS)

    Peng, C. K.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Mantegna, R. N.; Simons, M.; Stanley, H. E.

    1995-01-01

    We review evidence supporting the idea that the DNA sequence in genes containing non-coding regions is correlated, and that the correlation is remarkably long range--indeed, nucleotides thousands of base pairs distant are correlated. We do not find such a long-range correlation in the coding regions of the gene. We resolve the problem of the "non-stationarity" feature of the sequence of base pairs by applying a new algorithm called detrended fluctuation analysis (DFA). We address the claim of Voss that there is no difference in the statistical properties of coding and non-coding regions of DNA by systematically applying the DFA algorithm, as well as standard FFT analysis, to every DNA sequence (33301 coding and 29453 non-coding) in the entire GenBank database. Finally, we describe briefly some recent work showing that the non-coding sequences have certain statistical features in common with natural and artificial languages. Specifically, we adapt to DNA the Zipf approach to analyzing linguistic texts. These statistical properties of non-coding sequences support the possibility that non-coding regions of DNA may carry biological information.
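    The DFA algorithm described above — integrate the sequence, remove a least-squares linear trend inside each box of size n, and measure the RMS fluctuation F(n) — can be sketched in a few lines. This is a minimal generic illustration on a synthetic uncorrelated ±1 sequence (for which F(n) should scale roughly as n^0.5), not the authors' implementation:

```python
import random

def dfa(series, box_sizes):
    """Detrended fluctuation analysis: integrate the series, remove a
    least-squares linear trend inside each non-overlapping box of size n,
    and return the RMS fluctuation F(n) for each n."""
    mean = sum(series) / len(series)
    profile, total = [], 0.0
    for x in series:                        # integrated profile y(k)
        total += x - mean
        profile.append(total)
    result = {}
    for n in box_sizes:
        sq, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            box = profile[start:start + n]
            xs = range(n)
            sx, sxx = sum(xs), sum(i * i for i in xs)
            sy = sum(box)
            sxy = sum(i * b for i, b in zip(xs, box))
            # least-squares line fit inside the box
            slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
            intercept = (sy - slope * sx) / n
            for i, b in zip(xs, box):
                resid = b - (slope * i + intercept)
                sq += resid * resid
                count += 1
        result[n] = (sq / count) ** 0.5
    return result

rng = random.Random(1)
steps = [rng.choice([-1, 1]) for _ in range(4096)]   # uncorrelated "sequence"
F = dfa(steps, [8, 128])
```

    A long-range-correlated sequence of the kind the authors report in non-coding DNA would show F(n) growing faster than n^0.5 (scaling exponent above 1/2).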

  20. Transient dynamics capability at Sandia National Laboratories

    NASA Technical Reports Server (NTRS)

    Attaway, Steven W.; Biffle, Johnny H.; Sjaardema, G. D.; Heinstein, M. W.; Schoof, L. A.

    1993-01-01

    A brief overview of the transient dynamics capabilities at Sandia National Laboratories, with an emphasis on recent new developments and current research, is presented. In addition, the Sandia National Laboratories (SNL) Engineering Analysis Code Access System (SEACAS), which is a collection of structural and thermal codes and utilities used by analysts at SNL, is described. The SEACAS system includes pre- and post-processing codes, analysis codes, database translation codes, support libraries, Unix shell scripts for execution, and an installation system. SEACAS is used at SNL on a daily basis as a production, research, and development system for the engineering analysts and code developers. Over the past year, approximately 190 days of CPU time were used by SEACAS codes on jobs running from a few seconds up to two and one-half days of CPU time. SEACAS is running on several different systems at SNL including Cray Unicos, Hewlett Packard HP-UX, Digital Equipment Ultrix, and Sun SunOS. An overview of SEACAS, including a short description of the codes in the system, is presented. Abstracts and references for the codes are listed at the end of the report.

  1. Efficient genome-wide association in biobanks using topic modeling identifies multiple novel disease loci.

    PubMed

    McCoy, Thomas H; Castro, Victor M; Snapper, Leslie A; Hart, Kamber L; Perlis, Roy H

    2017-08-31

    Biobanks and national registries represent a powerful tool for genomic discovery, but rely on diagnostic codes that may be unreliable and fail to capture the relationship between related diagnoses. We developed an efficient means of conducting genome-wide association studies using combinations of diagnostic codes from electronic health records (EHR) for 10,845 participants in a biobanking program at two large academic medical centers. Specifically, we applied latent Dirichlet allocation to fit 50 disease topics based on diagnostic codes, then conducted genome-wide common-variant association for each topic. In sensitivity analysis, these results were contrasted with those obtained from traditional single-diagnosis phenome-wide association analysis, as well as those in which only a subset of diagnostic codes are included per topic. In meta-analysis across three biobank cohorts, we identified 23 disease-associated loci with p<1e-15, including previously associated autoimmune disease loci. In all cases, observed significant associations were of greater magnitude than for single phenome-wide diagnostic codes, and incorporation of less strongly-loading diagnostic codes enhanced association. This strategy provides a more efficient means of phenome-wide association in biobanks with coded clinical data.
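    The topic-model step described here — treating each participant's set of diagnostic codes as a document and fitting latent disease topics — can be sketched with a tiny collapsed-Gibbs LDA. The ICD-style codes, document sizes, and hyperparameters below are made up for illustration; the study's actual pipeline (50 topics over EHR data) is far larger:

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics, n_iter=200, alpha=0.1, beta=0.1, seed=0):
    """Tiny collapsed-Gibbs LDA over documents of diagnostic-code tokens;
    returns the per-topic code-count tables."""
    rng = random.Random(seed)
    vocab_size = len({w for d in docs for w in d})
    ndk = [[0] * n_topics for _ in docs]                # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]   # topic-word counts
    nk = [0] * n_topics                                 # tokens per topic
    z = []
    for d, doc in enumerate(docs):                      # random initialization
        zd = []
        for w in doc:
            t = rng.randrange(n_topics)
            zd.append(t); ndk[d][t] += 1; nkw[t][w] += 1; nk[t] += 1
        z.append(zd)
    for _ in range(n_iter):                             # Gibbs sweeps
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                ndk[d][t] -= 1; nkw[t][w] -= 1; nk[t] -= 1
                weights = [(ndk[d][k] + alpha) * (nkw[k][w] + beta)
                           / (nk[k] + vocab_size * beta) for k in range(n_topics)]
                t = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = t
                ndk[d][t] += 1; nkw[t][w] += 1; nk[t] += 1
    return nkw

def make_doc(codes, seed, length=20):
    r = random.Random(seed)
    return [r.choice(codes) for _ in range(length)]

# Hypothetical ICD-style codes: one cardiac and one respiratory "disease topic".
cardiac, resp = ["I10", "I25", "I50"], ["J44", "J45", "J18"]
docs = [make_doc(cardiac, d) for d in range(5)] + [make_doc(resp, d + 99) for d in range(5)]
topics = lda_gibbs(docs, n_topics=2)
```

    Each fitted topic's loading over participants (the ndk table, normalized) would then serve as the quantitative phenotype in the per-topic GWAS.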

  2. Efficient Genome-wide Association in Biobanks Using Topic Modeling Identifies Multiple Novel Disease Loci

    PubMed Central

    McCoy, Thomas H; Castro, Victor M; Snapper, Leslie A; Hart, Kamber L; Perlis, Roy H

    2017-01-01

    Biobanks and national registries represent a powerful tool for genomic discovery, but rely on diagnostic codes that can be unreliable and fail to capture relationships between related diagnoses. We developed an efficient means of conducting genome-wide association studies using combinations of diagnostic codes from electronic health records for 10,845 participants in a biobanking program at two large academic medical centers. Specifically, we applied latent Dirichlet allocation to fit 50 disease topics based on diagnostic codes, then conducted a genome-wide common-variant association for each topic. In sensitivity analysis, these results were contrasted with those obtained from traditional single-diagnosis phenome-wide association analysis, as well as those in which only a subset of diagnostic codes were included per topic. In meta-analysis across three biobank cohorts, we identified 23 disease-associated loci with p < 1e-15, including previously associated autoimmune disease loci. In all cases, observed significant associations were of greater magnitude than single phenome-wide diagnostic codes, and incorporation of less strongly loading diagnostic codes enhanced association. This strategy provides a more efficient means of identifying phenome-wide associations in biobanks with coded clinical data. PMID:28861588

  3. DRG coding practice: a nationwide hospital survey in Thailand.

    PubMed

    Pongpirul, Krit; Walker, Damian G; Rahman, Hafizur; Robinson, Courtland

    2011-10-31

    Diagnosis Related Group (DRG) payment is preferred by healthcare reform in various countries, but its implementation in resource-limited countries has not been fully explored. This study aimed (1) to compare the characteristics of hospitals in Thailand that were audited with those that were not and (2) to develop a simplified scale to measure hospital coding practice. A questionnaire survey was conducted of 920 hospitals in the Summary and Coding Audit Database (SCAD hospitals, all of which were audited in 2008 because of suspicious reports of possible DRG miscoding); the survey also included 390 non-SCAD hospitals. The questionnaire asked about general demographics of the hospitals and hospital coding structure and process, and also included a set of 63 opinion-oriented items on current hospital coding practice. Descriptive statistics and exploratory factor analysis (EFA) were used for data analysis. SCAD and non-SCAD hospitals differed in many aspects, especially the number of medical statisticians, the experience of medical statisticians and physicians, and the number of certified coders. Factor analysis revealed a simplified 3-factor, 20-item model to assess hospital coding practice and classify hospital intention. Hospital providers should not be assumed capable of producing high quality DRG codes, especially in resource-limited settings.

  4. Upgrades of Two Computer Codes for Analysis of Turbomachinery

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Liou, Meng-Sing

    2005-01-01

    Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are: Swift -- a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.

  5. Computer Code for Transportation Network Design and Analysis

    DOT National Transportation Integrated Search

    1977-01-01

    This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...

  6. Automated Detection and Analysis of Interplanetary Shocks with Real-Time Application

    NASA Astrophysics Data System (ADS)

    Vorotnikov, V.; Smith, C. W.; Hu, Q.; Szabo, A.; Skoug, R. M.; Cohen, C. M.

    2006-12-01

    The ACE real-time data stream provides web-based nowcasting capabilities for solar wind conditions upstream of Earth. Our goal is to provide an automated code that finds and analyzes interplanetary shocks as they occur for possible real-time application to space weather nowcasting. Shock analysis algorithms based on the Rankine-Hugoniot jump conditions exist and are in widespread use today for the interactive analysis of interplanetary shocks, yielding parameters such as shock speed, propagation direction, and shock strength in the form of compression ratios. Although these codes can be automated in a reasonable manner to yield solutions not far from those obtained by user-directed interactive analysis, event detection presents an added obstacle and is the first step in a fully automated analysis. We present a fully automated Rankine-Hugoniot analysis code that can scan the ACE science data, find shock candidates, analyze the events, obtain solutions in good agreement with those derived from interactive applications, and dismiss false-positive shock candidates on the basis of the conservation equations. The intent is to make this code available to NOAA for use in real-time space weather applications. The code has the added advantage of being able to scan spacecraft data sets to provide shock solutions for use outside real-time applications and can easily be applied to science-quality data sets from other missions. Use of the code for this purpose will also be explored.
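    The candidate-detection step that precedes a full Rankine-Hugoniot fit can be illustrated with a simple windowed jump test on plasma moments. The window size, thresholds, and variable names below are hypothetical, and a real detector would also test the magnetic field and conservation relations:

```python
def find_shock_candidates(time, density, speed, window=5, r_min=1.2):
    """Flag samples where the downstream/upstream window averages of
    density and flow speed both jump upward; report each candidate's
    density compression ratio r = n_down / n_up."""
    hits = []
    for i in range(window, len(time) - window + 1):
        n_up = sum(density[i - window:i]) / window
        n_dn = sum(density[i:i + window]) / window
        v_up = sum(speed[i - window:i]) / window
        v_dn = sum(speed[i:i + window]) / window
        r = n_dn / n_up
        if r >= r_min and v_dn > v_up:
            hits.append((time[i], r))
    return hits

# Synthetic fast forward shock: density and speed both jump at t = 50.
time = list(range(100))
density = [5.0] * 50 + [15.0] * 50    # e.g. protons / cm^3
speed = [400.0] * 50 + [500.0] * 50   # e.g. km / s
candidates = find_shock_candidates(time, density, speed)
```

    The full analysis would then fit the Rankine-Hugoniot conservation equations across each flagged interval, which is the stage that rejects false positives.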

  7. GLSENS: A Generalized Extension of LSENS Including Global Reactions and Added Sensitivity Analysis for the Perfectly Stirred Reactor

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1996-01-01

    A generalized version of the NASA Lewis general kinetics code, LSENS, is described. The new code allows the use of global reactions as well as molecular processes in a chemical mechanism. The code also incorporates the capability of performing sensitivity analysis calculations for a perfectly stirred reactor rapidly and conveniently at the same time that the main kinetics calculations are being done. The GLSENS code has been extensively tested and has been found to be accurate and efficient. Nine example problems are presented and complete user instructions are given for the new capabilities. This report is to be used in conjunction with the documentation for the original LSENS code.

  8. Combined coding and delay-throughput analysis for fading channels of mobile satellite communications

    NASA Technical Reports Server (NTRS)

    Wang, C. C.; Yan, Tsun-Yee

    1986-01-01

    This paper presents the analysis of using the punctured convolutional code with Viterbi decoding to improve communications reliability. The punctured code rate is optimized so that the average delay is minimized. The coding gain in terms of the message delay is also defined. Since using punctured convolutional code with interleaving is still inadequate to combat the severe fading for short packets, the use of multiple copies of assignment and acknowledgment packets is suggested. The performance on the average end-to-end delay of this protocol is analyzed. It is shown that a replication of three copies for both assignment packets and acknowledgment packets is optimum for the cases considered.
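    Puncturing a rate-1/2 convolutional code, as analyzed above, simply deletes encoder output bits according to a periodic pattern. Below is a generic sketch using the textbook (7, 5) constraint-length-3 encoder and a rate-2/3 puncturing pattern; these are common illustrative choices, not necessarily the code parameters of the paper:

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder, constraint length 3, with the
    textbook (7, 5) octal generator polynomials."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b111
        out.append(bin(state & g1).count("1") % 2)   # parity from generator g1
        out.append(bin(state & g2).count("1") % 2)   # parity from generator g2
    return out

def puncture(coded, pattern=(1, 1, 1, 0)):
    """Drop coded bits where the cyclic pattern is 0; keeping 3 of
    every 4 output bits turns rate 1/2 into rate 2/3."""
    return [c for i, c in enumerate(coded) if pattern[i % len(pattern)]]

coded = conv_encode([1, 0, 1, 1, 0, 0])
thinned = puncture(coded)
```

    The decoder reinserts erasures at the punctured positions before Viterbi decoding, which is what lets one encoder/decoder pair serve a family of rates, with the rate chosen to minimize average delay as in the analysis above.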

  9. Application of a personal computer for the uncoupled vibration analysis of wind turbine blade and counterweight assemblies

    NASA Technical Reports Server (NTRS)

    White, P. R.; Little, R. R.

    1985-01-01

    A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.
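    A lumped-mass formulation like the one described reduces the free-vibration problem to a small matrix eigenproblem; for two degrees of freedom the natural frequencies follow from a quadratic in ω². This is a generic two-mass sketch, not the report's code (which handles many lumped masses along the blade):

```python
import math

def two_dof_frequencies(m1, m2, k1, k2):
    """Natural frequencies (rad/s) of a lumped two-mass chain:
    wall --k1-- m1 --k2-- m2 (free end).  Solves det(K - w^2 M) = 0,
    with K = [[k1+k2, -k2], [-k2, k2]] and M = diag(m1, m2),
    analytically via the characteristic quadratic in w^2."""
    a = m1 * m2
    b = -(m1 * k2 + m2 * (k1 + k2))
    c = k1 * k2
    disc = math.sqrt(b * b - 4 * a * c)
    w2 = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
    return [math.sqrt(w) for w in w2]
```

    For equal masses and springs (m1 = m2 = m, k1 = k2 = k) this gives the classic result ω² = (3 ∓ √5)k/2m, i.e. frequencies of about 0.618 and 1.618 √(k/m).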

  10. Analysis of the Length of Braille Texts in English Braille American Edition, the Nemeth Code, and Computer Braille Code versus the Unified English Braille Code

    ERIC Educational Resources Information Center

    Knowlton, Marie; Wetzel, Robin

    2006-01-01

    This study compared the length of text in English Braille American Edition, the Nemeth code, and the computer braille code with the Unified English Braille Code (UEBC)--also known as Unified English Braille (UEB). The findings indicate that differences in the length of text are dependent on the type of material that is transcribed and the grade…

  11. Performance analysis of a concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.; Lin, S.; Kasami, T.

    1983-01-01

    A concatenated coding scheme for error control in data communications is analyzed. In this scheme, the inner code is used for both error correction and detection, whereas the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after inner-code decoding. The probability of undetected error is derived and bounded. A particular example, proposed for the planetary program, is analyzed.

  12. Digital microarray analysis for digital artifact genomics

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger; Handley, James; Williams, Deborah

    2013-06-01

    We implement a Spatial Voting (SV) based analogy of microarray analysis for digital gene-marker identification in malware code sections. We examine a well-known set of malware formally analyzed by Mandiant and code named Advanced Persistent Threat 1 (APT1). APT1 is a Chinese organization formed with the specific intent to infiltrate and exploit US resources. Mandiant provided a detailed behavior and string analysis report for the 288 malware samples available. We performed an independent analysis using a new alternative to traditional dynamic analysis and static analysis that we call Spatial Analysis (SA). We perform unsupervised SA on the APT1-originating malware code sections and report our findings. We also show the results of SA performed on some members of the families associated by Mandiant. We conclude that SV-based SA is a practical, fast alternative to dynamic analysis and static analysis.

  13. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
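    The metamodeling idea — fit a cheap statistical surrogate to a handful of runs of an expensive analysis code, then query the surrogate instead — can be shown in its simplest form: a quadratic response surface fit by least squares. This is a generic one-variable sketch (real applications use the kriging, RSM, and neural-network techniques the paper surveys), with a stand-in function playing the role of the expensive code:

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a + b*x + c*x^2 (a minimal response
    surface) via the 3x3 normal equations."""
    n = len(xs)
    s = [sum(x ** k for x in xs) for k in range(5)]     # power sums
    A = [[s[0], s[1], s[2]],
         [s[1], s[2], s[3]],
         [s[2], s[3], s[4]]]
    v = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    coeffs = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                                  # back-substitution
        coeffs[r] = (v[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, 3))) / A[r][r]
    return coeffs  # [a, b, c]

def expensive_code(x):
    """Stand-in for a slow analysis code (here secretly a quadratic)."""
    return 2.0 + 0.5 * x - 0.25 * x * x

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
a, b, c = fit_quadratic(xs, [expensive_code(x) for x in xs])
```

    Once fitted, evaluating a + b·x + c·x² is orders of magnitude cheaper than re-running the analysis code, which is exactly the trade the paper examines; the dangers it discusses arise because deterministic codes have no random error for the statistical model to average over.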

  14. Committed to the Honor Code: An Investment Model Analysis of Academic Integrity

    ERIC Educational Resources Information Center

    Dix, Emily L.; Emery, Lydia F.; Le, Benjamin

    2014-01-01

    Educators worldwide face challenges surrounding academic integrity. The development of honor codes can promote academic integrity, but understanding how and why honor codes affect behavior is critical to their successful implementation. To date, research has not examined how students' "relationship" to an honor code predicts…

  15. One Speaker, Two Languages. Cross-Disciplinary Perspectives on Code-Switching.

    ERIC Educational Resources Information Center

    Milroy, Lesley, Ed.; Muysken, Pieter, Ed.

    Fifteen articles review code-switching in the four major areas: policy implications in specific institutional and community settings; perspectives of social theory of code-switching as a form of speech behavior in particular social contexts; the grammatical analysis of code-switching, including factors that constrain switching even within a…

  16. User's Guide for MSAP2D: A Program for Unsteady Aerodynamic and Aeroelastic (Flutter and Forced Response) Analysis of Multistage Compressors and Turbines. 1.0

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Srivastava, R.

    1996-01-01

    This guide describes the input data required for using the MSAP2D (Multi Stage Aeroelastic analysis Program - Two Dimensional) computer code. MSAP2D can be used for steady and unsteady aerodynamic, and aeroelastic (flutter and forced response), analysis of bladed disks arranged in multiple blade rows such as those found in compressors, turbines, counter-rotating propellers or propfans. The code can also be run for a single blade row. The MSAP2D code is an extension of the original NPHASE code for multi-blade-row aerodynamic and aeroelastic analysis. Euler equations are used to obtain aerodynamic forces. The structural dynamic equations are written for a rigid typical section undergoing pitching (torsion) and plunging (bending) motion. The aeroelastic equations are solved in the time domain. For single-blade-row analysis, frequency domain analysis is also provided to obtain the unsteady aerodynamic coefficients required in an eigenanalysis for flutter. In this manual, sample input and output are provided for a single blade row example and for two blade row examples with equal and unequal numbers of blades in the blade rows.

  17. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  18. Optical Surface Analysis Code (OSAC). 7.0

    NASA Technical Reports Server (NTRS)

    Glenn, P.

    1998-01-01

    The purpose of this modification to the Optical Surface Analysis Code (OSAC) is to upgrade the PSF program to allow the user to get proper diffracted energy normalization even when deliberately obscuring rays with internal obscurations.

  19. Regarding the optimization of O1-mode ECRH and the feasibility of EBW startup on NSTX-U

    NASA Astrophysics Data System (ADS)

    Lopez, N. A.; Poli, F. M.

    2018-06-01

    Recently published scenarios for fully non-inductive startup and operation on the National Spherical Torus eXperiment Upgrade (NSTX-U) (Menard et al 2012 Nucl. Fusion 52 083015) show Electron Cyclotron Resonance Heating (ECRH) as an important component in preparing a target plasma for efficient High Harmonic Fast Wave and Neutral Beam heating. The modeling of the propagation and absorption of EC waves in the evolving plasma is required to define the most effective window of operation, and to optimize the launcher geometry for maximal heating and current drive during this window. Here, we extend a previous optimization of O1-mode ECRH on NSTX-U to account for the full time-dependent performance of the ECRH using simulations performed with TRANSP. We find that the evolution of the density profile has a prominent role in the optimization by defining the time window of operation, which in certain cases may be a more important metric to compare launcher performance than the average power absorption. This feature cannot be captured by analysis on static profiles, and should be accounted for when optimizing ECRH on any device that operates near the cutoff density. Additionally, the utility of the electron Bernstein wave (EBW) in driving current and generating closed flux surfaces in the early startup phase has been demonstrated on a number of devices. Using standalone GENRAY simulations, we find that efficient EBW current drive is possible on NSTX-U if the injection angle is shifted below the midplane and aimed towards the top half of the vacuum vessel. However, collisional damping of the EBW is projected to be significant, in some cases accounting for up to 97% of the absorbed EBW power.

  20. Regarding the optimization of O1-mode ECRH and the feasibility of EBW startup on NSTX-U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez, Nicolas; Poli, Francesca M.

    Recently published scenarios for fully non-inductive startup and operation on the National Spherical Torus eXperiment Upgrade (NSTX-U) [Menard J et al 2012 Nucl. Fusion 52 083015] show Electron Cyclotron Resonance Heating (ECRH) as an important component in preparing a target plasma for efficient High Harmonic Fast Wave and Neutral Beam heating. The modelling of the propagation and absorption of EC waves in the evolving plasma is required to define the most effective window of operation, and to optimize the launcher geometry for maximal heating and current drive during this window. Here we extend a previous optimization of O1-mode ECRH on NSTX-U to account for the full time-dependent performance of the ECRH using simulations performed with TRANSP. We find that the evolution of the density profile has a prominent role in the optimization by defining the time window of operation, which in certain cases may be a more important metric to compare launcher performance than the average power absorption. This feature cannot be captured by analysis on static profiles, and should be accounted for when optimizing ECRH on any device that operates near the cutoff density. Additionally, the utility of the electron Bernstein wave (EBW) in driving current and generating closed flux surfaces in the early startup phase has been demonstrated on a number of devices. Using standalone GENRAY simulations, we find that efficient EBW current drive is possible on NSTX-U if the injection angle is shifted below the midplane and aimed towards the top half of the vacuum vessel. However, collisional damping of the EBW is projected to be significant, in some cases accounting for up to 97% of the absorbed EBW power.

  1. Regarding the optimization of O1-mode ECRH and the feasibility of EBW startup on NSTX-U

    DOE PAGES

    Lopez, Nicolas; Poli, Francesca M.

    2018-03-29

    Recently published scenarios for fully non-inductive startup and operation on the National Spherical Torus eXperiment Upgrade (NSTX-U) [Menard J et al 2012 Nucl. Fusion 52 083015] show Electron Cyclotron Resonance Heating (ECRH) as an important component in preparing a target plasma for efficient High Harmonic Fast Wave and Neutral Beam heating. The modelling of the propagation and absorption of EC waves in the evolving plasma is required to define the most effective window of operation, and to optimize the launcher geometry for maximal heating and current drive during this window. Here we extend a previous optimization of O1-mode ECRH on NSTX-U to account for the full time-dependent performance of the ECRH using simulations performed with TRANSP. We find that the evolution of the density profile has a prominent role in the optimization by defining the time window of operation, which in certain cases may be a more important metric to compare launcher performance than the average power absorption. This feature cannot be captured by analysis on static profiles, and should be accounted for when optimizing ECRH on any device that operates near the cutoff density. Additionally, the utility of the electron Bernstein wave (EBW) in driving current and generating closed flux surfaces in the early startup phase has been demonstrated on a number of devices. Using standalone GENRAY simulations, we find that efficient EBW current drive is possible on NSTX-U if the injection angle is shifted below the midplane and aimed towards the top half of the vacuum vessel. However, collisional damping of the EBW is projected to be significant, in some cases accounting for up to 97% of the absorbed EBW power.

  2. LSENS, The NASA Lewis Kinetics and Sensitivity Analysis Code

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, K.

    2000-01-01

    A general chemical kinetics and sensitivity analysis code for complex, homogeneous, gas-phase reactions is described. The main features of the code, LSENS (the NASA Lewis kinetics and sensitivity analysis code), are its flexibility, efficiency and convenience in treating many different chemical reaction models. The models include: static system; steady, one-dimensional, inviscid flow; incident-shock initiated reaction in a shock tube; and a perfectly stirred reactor. In addition, equilibrium computations can be performed for several assigned states. An implicit numerical integration method (LSODE, the Livermore Solver for Ordinary Differential Equations), which works efficiently for the extremes of very fast and very slow reactions, is used to solve the "stiff" ordinary differential equation systems that arise in chemical kinetics. For static reactions, the code uses the decoupled direct method to calculate sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of dependent variables and/or the rate coefficient parameters. Solution methods for the equilibrium and post-shock conditions and for perfectly stirred reactor problems are either adapted from or based on the procedures built into the NASA code CEA (Chemical Equilibrium and Applications).
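    The "stiff" systems mentioned above arise when rate constants span many orders of magnitude; implicit integrators such as LSODE remain stable at step sizes where explicit methods diverge. A minimal pure-Python sketch of that idea on a two-species linear kinetics system (the rate constants are illustrative, and a backward Euler step stands in for the far more sophisticated variable-order machinery inside LSODE):

```python
import math

K1, K2 = 1000.0, 1.0  # fast and slow rate constants (stiffness ratio 1000)
H, STEPS = 0.05, 20   # step size far above the explicit stability limit 2/K1

# Backward (implicit) Euler for y1' = -K1*y1, y2' = K1*y1 - K2*y2
y1, y2 = 1.0, 0.0
for _ in range(STEPS):
    y1 = y1 / (1.0 + H * K1)                  # solve (1 + h*K1) * y1_new = y1_old
    y2 = (y2 + H * K1 * y1) / (1.0 + H * K2)  # implicit update using y1_new

# Explicit Euler on the fast species diverges at the same step size
y1_explicit = 1.0
for _ in range(STEPS):
    y1_explicit += H * (-K1 * y1_explicit)

# Exact solution of the slow species at t = H*STEPS for comparison
y2_exact = (K1 / (K1 - K2)) * (math.exp(-K2 * H * STEPS) - math.exp(-K1 * H * STEPS))
print(f"implicit y2(1.0) = {y2:.4f}  (exact {y2_exact:.4f})")
print(f"explicit y1(1.0) = {y1_explicit:.3e}  (diverged)")
```

    The implicit step stays accurate for the slowly varying species even though the step size is 25 times the explicit stability limit, which is exactly the property that makes solvers of the LSODE family practical for combustion kinetics.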

  3. Main steam line break accident simulation of APR1400 using the model of ATLAS facility

    NASA Astrophysics Data System (ADS)

    Ekariansyah, A. S.; Deswandri; Sunaryo, Geni R.

    2018-02-01

    A main steam line break simulation for APR1400, an advanced PWR design, has been performed using the RELAP5 code. The simulation was conducted on a model of a thermal-hydraulic test facility called ATLAS, which represents a scaled-down facility of the APR1400 design. The main steam line break event is described in an open-access safety report document, whose initial conditions and assumptions for the analysis were utilized in performing the simulation and analysis of the selected parameters. The objective of this work was to conduct a benchmark activity by comparing the simulation results of the CESEC-III code, a conservative-approach code, with the results of RELAP5 as a best-estimate code. Based on the simulation results, a general similarity in the behavior of the selected parameters was observed between the two codes. However, the degree of accuracy still needs further research and analysis by comparison with another best-estimate code. Uncertainties arising from the ATLAS model should be minimized by taking into account more specific data in developing the APR1400 model.

  4. Full core analysis of IRIS reactor by using MCNPX.

    PubMed

    Amin, E A; Bashter, I I; Hassan, Nabil M; Mustafa, S S

    2016-07-01

    This paper describes a neutronic analysis of the freshly fuelled IRIS (International Reactor Innovative and Secure) reactor performed with the MCNPX code. The analysis included criticality calculations, radial and axial power distributions, the nuclear peaking factor, and the axial offset percent at the beginning of the fuel cycle. The effective multiplication factor obtained by the MCNPX code is compared with previous calculations by the HELIOS/NESTLE, CASMO/SIMULATE, modified CORD-2 nodal, and SAS2H/KENO-V code systems. It is found that the k-eff value obtained by MCNPX is closest to the CORD-2 value. The radial and axial powers are compared with other published results obtained using the SAS2H/KENO-V code. Moreover, the WIMS-D5 code is used to study the effect of enriched boron in the form of ZrB2 on the effective multiplication factor (k-eff) of the fuel pin. In this part of the calculation, k-eff is calculated at different concentrations of Boron-10 in mg/cm at different stages of burnup of the unit cell. The results of this part are compared with published results obtained with the HELIOS code. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Analysis of SMA Hybrid Composite Structures using Commercial Codes

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Patel, Hemant D.

    2004-01-01

    A thermomechanical model for shape memory alloy (SMA) actuators and SMA hybrid composite (SMAHC) structures has been recently implemented in the commercial finite element codes MSC.Nastran and ABAQUS. The model may be easily implemented in any code that has the capability for analysis of laminated composite structures with temperature dependent material properties. The model is also relatively easy to use and requires input of only fundamental engineering properties. A brief description of the model is presented, followed by discussion of implementation and usage in the commercial codes. Results are presented from static and dynamic analyses of SMAHC beams of two types: a beam clamped at each end and a cantilevered beam. Nonlinear static (post-buckling) and random response analyses are demonstrated for the first specimen. Static deflection (shape) control is demonstrated for the cantilevered beam. Approaches for modeling SMAHC material systems with embedded SMA in ribbon and small round wire product forms are demonstrated and compared. The results from the commercial codes are compared to those from a research code as validation of the commercial implementations; excellent correlation is achieved in all cases.

  6. Spectral fitting, shock layer modeling, and production of nitrogen oxides and excited nitrogen

    NASA Technical Reports Server (NTRS)

    Blackwell, H. E.

    1991-01-01

    An analysis was made of N2 emission from an 8.72 MJ/kg shock layer at the 2.54, 1.91, and 1.27 cm positions, and vibrational state distributions, temperatures, and relative electronic state populations were obtained from the data sets. Other recorded arc jet N2 and air spectral data were reviewed, and NO emission characteristics were studied. A review of the operational procedures of the DSMC code was made. Information on other appropriate codes and modifications, including ionization, was compiled, and the applicability of the reviewed codes to the task requirements was determined. A review was also made of the computational procedures used in the CFD codes of Li and other codes on JSC computers. An analysis was made of the problems associated with integrating the task-specific chemical kinetics into CFD codes.

  7. Thermodynamic Analysis of the Combustion of Metallic Materials

    NASA Technical Reports Server (NTRS)

    Wilson, D. Bruce; Stoltzfus, Joel M.

    2000-01-01

    Two types of computer codes are available to assist in the thermodynamic analysis of metallic materials combustion. One type calculates phase equilibrium data and is represented by CALPHAD. The other type calculates chemical reaction equilibria and is represented by the Gordon-McBride code. The first has seen significant application to alloy-phase diagrams, but only recently has it been considered for oxidation systems. The Gordon-McBride code has been applied to the combustion of metallic materials. Both codes are limited by their treatment of non-ideal solutions and by the fact that they treat volatile and gaseous species as ideal. This paper examines the significance of these limitations for the combustion of metallic materials. In addition, the applicability of linear free-energy relationships to solid-phase oxidation, and their possible extension to liquid-phase systems, is examined.

  8. [Differentiation of coding quality in orthopaedics by special, illustration-oriented case group analysis in the G-DRG System 2005].

    PubMed

    Schütz, U; Reichel, H; Dreinhöfer, K

    2007-01-01

    We introduce a grouping system for clinical practice which allows the separation of DRG coding in specific orthopaedic groups based on anatomic regions, operative procedures, therapeutic interventions and morbidity equivalent diagnosis groups. With this, a differentiated aim-oriented analysis of illustrated internal DRG data becomes possible. The group-specific difference of the coding quality between the DRG groups following primary coding by the orthopaedic surgeon and final coding by the medical controlling is analysed. In a consecutive series of 1600 patients parallel documentation and group-specific comparison of the relevant DRG parameters were carried out in every case after primary and final coding. Analysing the group-specific share in the additional CaseMix coding, the group "spine surgery" dominated, closely followed by the groups "arthroplasty" and "surgery due to infection, tumours, diabetes". Altogether, additional cost-weight-relevant coding was necessary most frequently in the latter group (84%), followed by group "spine surgery" (65%). In DRGs representing conservative orthopaedic treatment documented procedures had nearly no influence on the cost weight. The introduced system of case group analysis in internal DRG documentation can lead to the detection of specific problems in primary coding and cost-weight relevant changes of the case mix. As an instrument for internal process control in the orthopaedic field, it can serve as a communicative interface between an economically oriented classification of the hospital performance and a specific problem solution of the medical staff involved in the department management.

  9. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3, Part 4.

    DTIC Science & Technology

    1983-09-01

    General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3), The BDM Corporation; final technical report, February 1981 - July 1983. METHOD: the electric field at a segment observation point due to a source patch j is computed from the current components in the t1 and t2 directions on the source patch.

  10. Transient Ejector Analysis (TEA) code user's guide

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.

    1993-01-01

    A FORTRAN computer program for the semi-analytic prediction of unsteady thrust-augmenting ejector performance has been developed, based on a theoretical analysis of ejectors. That analysis blends classic self-similar turbulent jet descriptions with control-volume mixing region elements. Division of the ejector into an inlet, diffuser, and mixing region allowed flexibility in the modeling of the physics for each region. In particular, the inlet and diffuser analyses are simplified by a quasi-steady analysis, justified by the assumption that pressure is the forcing function in those regions. Only the mixing region is assumed to be dominated by viscous effects. The present work provides an overview of the code structure, a description of the required input and output data file formats, and the results for a test case. Since there are limitations to the code for applications outside the bounds of the test case, the user should consider TEA as a research code (not as a production code), designed specifically as an implementation of the proposed ejector theory. Program error flags are discussed, and some diagnostic routines are presented.

  11. Classification of breast tissue in mammograms using efficient coding.

    PubMed

    Costa, Daniel D; Campos, Lúcio F; Barros, Allan K

    2011-06-24

    Female breast cancer is the major cause of death by cancer in western countries. Efforts in computer vision have been made to improve the diagnostic accuracy of radiologists. Some methods for lesion diagnosis in mammogram images were developed based on principal component analysis, which has been used for efficient coding of signals, and on 2D Gabor wavelets, used in computer vision applications and in modeling biological vision. In this work, we present a methodology that uses efficient coding along with linear discriminant analysis to distinguish between mass and non-mass in 5090 regions of interest from mammograms. The results show that the best success rates reached with Gabor wavelets and principal component analysis were 85.28% and 87.28%, respectively. In comparison, the efficient-coding model presented here reached up to 90.07%. Altogether, the results demonstrate that independent component analysis successfully performed the efficient coding needed to discriminate mass from non-mass tissues. In addition, we observed that LDA with ICA bases showed high predictive performance for some datasets, providing significant support for a more detailed clinical investigation.
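    The pipeline this abstract describes, dimensionality reduction followed by a linear discriminant, can be sketched with PCA plus a two-class Fisher LDA on synthetic stand-in data. Everything below (the data, dimensions, component count) is an illustrative assumption; the paper's actual features come from mammogram regions of interest, and its efficient-coding bases come from ICA and Gabor wavelets rather than plain PCA.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "non-mass" vs "mass" feature vectors (hypothetical stand-in for ROI pixels)
n, d = 200, 50
X0 = rng.normal(0.0, 1.0, (n, d))  # non-mass class
X1 = rng.normal(0.6, 1.0, (n, d))  # mass class: shifted mean
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n), np.ones(n)]

# PCA: project onto the top-k principal components of the centered data
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 10
Z = Xc @ Vt[:k].T

# Fisher LDA on the PCA scores (two-class case): w = Sw^-1 (m1 - m0)
m0, m1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0], rowvar=False) + np.cov(Z[y == 1], rowvar=False)
w = np.linalg.solve(Sw, m1 - m0)
scores = Z @ w
thresh = 0.5 * (scores[y == 0].mean() + scores[y == 1].mean())
pred = (scores > thresh).astype(float)
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2%}")
```

    The design point worth noting is that the discriminant is fit in the reduced space, which keeps the within-class scatter matrix well conditioned even when the raw feature dimension is large relative to the sample count.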

  12. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.
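    The flavor of such probabilistic structural analysis can be sketched by Monte Carlo propagation of random material and geometric properties through a closed-form limit state, here the Euler buckling load P_cr = π²EI/L² of a pin-ended column. All distributions and dimensions below are illustrative assumptions, and PSAM itself uses far more efficient probabilistic methods than brute-force sampling.

```python
import math
import random

random.seed(1)
# Illustrative random inputs: Young's modulus E and wall thickness t scatter
E_mean, E_cov = 200e9, 0.05          # steel-like modulus, 5% coefficient of variation
L = 2.0                              # column length, m
b, t_mean, t_cov = 0.05, 0.005, 0.08 # rectangular cross-section width/thickness, m

samples = []
for _ in range(20000):
    E = random.gauss(E_mean, E_cov * E_mean)
    t = random.gauss(t_mean, t_cov * t_mean)
    I = b * t ** 3 / 12.0                       # second moment of area
    samples.append(math.pi ** 2 * E * I / L ** 2)

samples.sort()
mean = sum(samples) / len(samples)
p01 = samples[int(0.01 * len(samples))]  # 1st percentile: a probabilistic design value
print(f"mean buckling load {mean:.1f} N, 1% lower bound {p01:.1f} N")
```

    Because the load depends on t cubed, the thickness scatter dominates the spread, so the 1% lower bound sits well below the mean, which is precisely the kind of tail information a deterministic analysis with nominal properties cannot provide.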

  13. HART-II: Prediction of Blade-Vortex Interaction Loading

    DTIC Science & Technology

    2003-09-01

    Improvement of DLR Rotor Aeroacoustic Code (APSIM) and its Validation with Analytic Solution, J. Yin, J. Delfs; Aeroelastic Stability Analysis of …

  14. Safe, Multiphase Bounds Check Elimination in Java

    DTIC Science & Technology

    2010-01-28

    … production of mobile code from source code, JIT compilation in the virtual machine, and application code execution. The code producer uses … invariants, and inequality constraint analysis) to identify and prove redundancy of bounds checks. During class-loading and JIT compilation, the virtual … unoptimized code if the speculated invariants do not hold. The combined effect of the multiple phases is to shift the effort associated with bounds …

  15. CELFE/NASTRAN Code for the Analysis of Structures Subjected to High Velocity Impact

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1978-01-01

    The CELFE (Coupled Eulerian Lagrangian Finite Element)/NASTRAN code is a three-dimensional finite element code with the capability for analyzing structures subjected to high velocity impact. The local response is predicted by CELFE and, for large problems, the far-field impact response is predicted by NASTRAN. The coupling of the CELFE code with NASTRAN (the CELFE/NASTRAN code) and the application of the code to selected three-dimensional high velocity impact problems are described.

  16. Structural design, analysis, and code evaluation of an odd-shaped pressure vessel

    NASA Astrophysics Data System (ADS)

    Rezvani, M. A.; Ziada, H. H.

    1992-12-01

    An effort to design, analyze, and evaluate a rectangular pressure vessel is described. Normally, pressure vessels are designed in circular or spherical shapes to prevent stress concentrations. In this case, because of operational limitations, the choice of vessels was limited to a rectangular pressure box with a removable cover plate. The American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code is used as a guideline for pressure containments whose width or depth exceeds 15.24 cm (6.0 in.) and where pressures will exceed 103.4 kPa (15.0 lbf/in²). This evaluation used Section 8 of this Code, hereafter referred to as the Code. The dimensions and working pressure of the subject vessel fall within the pressure vessel category of the Code. The Code design guidelines and rules do not directly apply to this vessel. Therefore, finite-element methodology was used to analyze the pressure vessel, and the Code then was used in qualifying the vessel to be stamped to the Code. Section 8, Division 1 of the Code was used for evaluation. This action was justified by selecting a material for which fatigue damage would not be a concern. The stress analysis results were then checked against the Code, and the thicknesses adjusted to satisfy Code requirements. Although not directly applicable, the Code design formulas for rectangular vessels were also considered and presented.

  17. ICC-CLASS: isotopically-coded cleavable crosslinking analysis software suite

    PubMed Central

    2010-01-01

    Background Successful application of crosslinking combined with mass spectrometry for studying proteins and protein complexes requires specifically-designed crosslinking reagents, experimental techniques, and data analysis software. Using isotopically-coded ("heavy and light") versions of the crosslinker and cleavable crosslinking reagents is analytically advantageous for mass spectrometric applications and provides a "handle" that can be used to distinguish crosslinked peptides of different types, and to increase the confidence of the identification of the crosslinks. Results Here, we describe a program suite designed for the analysis of mass spectrometric data obtained with isotopically-coded cleavable crosslinkers. The suite contains three programs called: DX, DXDX, and DXMSMS. DX searches the mass spectra for the presence of ion signal doublets resulting from the light and heavy isotopic forms of the isotopically-coded crosslinking reagent used. DXDX searches for possible mass matches between cleaved and uncleaved isotopically-coded crosslinks based on the established chemistry of the cleavage reaction for a given crosslinking reagent. DXMSMS assigns the crosslinks to the known protein sequences, based on the isotopically-coded and un-coded MS/MS fragmentation data of uncleaved and cleaved peptide crosslinks. Conclusion The combination of these three programs, which are tailored to the analytical features of the specific isotopically-coded cleavable crosslinking reagents used, represents a powerful software tool for automated high-accuracy peptide crosslink identification. See: http://www.creativemolecules.com/CM_Software.htm PMID:20109223
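    The doublet search performed by DX can be illustrated with a simple sketch: scan a sorted peak list for pairs of signals separated by the fixed light/heavy mass shift of the crosslinker, within a ppm tolerance. The shift value, peak list, and charge handling below are illustrative assumptions, not ICC-CLASS's actual logic.

```python
def find_doublets(mz_peaks, delta_mass, charge=1, tol_ppm=10.0):
    """Return (light, heavy) m/z pairs separated by delta_mass/charge within tol_ppm."""
    shift = delta_mass / charge
    peaks = sorted(mz_peaks)
    pairs = []
    for i, light in enumerate(peaks):
        target = light + shift
        tol = target * tol_ppm * 1e-6
        for heavy in peaks[i + 1:]:
            if heavy > target + tol:
                break  # peaks are sorted, so no later peak can match this light peak
            if abs(heavy - target) <= tol:
                pairs.append((light, heavy))
    return pairs

# Synthetic peak list: two planted 4.0 Da doublets plus unrelated signals
peaks = [500.25, 504.25, 611.30, 615.3005, 700.40]
doublets = find_doublets(peaks, delta_mass=4.0)
print(doublets)  # [(500.25, 504.25), (611.3, 615.3005)]
```

    Sorting the peak list lets the inner scan terminate early, which matters when full-scan spectra contain thousands of candidate signals.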

  18. Evaluation of the finite element fuel rod analysis code (FRANCO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, K.; Feltus, M.A.

    1994-12-31

    Knowledge of the temperature distribution in a nuclear fuel rod is required to predict the behavior of fuel elements during operating conditions. The thermal and mechanical properties and performance characteristics are strongly dependent on the temperature, which can vary greatly inside the fuel rod. A detailed model of fuel rod behavior can be described by various numerical methods, including the finite element approach. The finite element method has been successfully used in many engineering applications, including nuclear piping and reactor component analysis. However, fuel pin analysis has traditionally been carried out with finite difference codes, with the exception of the Electric Power Research Institute's FREY code, which was developed for mainframe execution. This report describes FRANCO, a finite element fuel rod analysis code capable of computing the temperature distribution and mechanical deformation of a single light water reactor fuel rod.

  19. Computational electronics and electromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang, C. C.

    The Computational Electronics and Electromagnetics thrust area at Lawrence Livermore National Laboratory serves as the focal point for engineering R&D activities in developing computer-based design, analysis, and theory tools. Key representative applications include design of particle accelerator cells and beamline components; engineering analysis and design of high-power components; photonics and optoelectronics circuit design; EMI susceptibility analysis; and antenna synthesis. The FY-96 technology-base effort focused code development on (1) accelerator design codes; (2) 3-D massively parallel, object-oriented time-domain EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; (5) 3-D spectral-domain CEM tools; and (6) enhancement of laser drilling codes. Joint efforts with the Power Conversion Technologies thrust area include development of antenna systems for compact, high-performance radar, in addition to novel, compact Marx generators. 18 refs., 25 figs., 1 tab.

  20. Parallel-vector computation for linear structural analysis and non-linear unconstrained optimization problems

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Al-Nasra, M.; Zhang, Y.; Baddourah, M. A.; Agarwal, T. K.; Storaasli, O. O.; Carmona, E. A.

    1991-01-01

    Several parallel-vector computational improvements to the unconstrained optimization procedure are described which speed up the structural analysis-synthesis process. A fast parallel-vector Choleski-based equation solver, pvsolve, is incorporated into the well-known SAP-4 general-purpose finite-element code. The new code, denoted PV-SAP, is tested for static structural analysis. Initial results on a four processor CRAY 2 show that using pvsolve reduces the equation solution time by a factor of 14-16 over the original SAP-4 code. In addition, parallel-vector procedures for the Golden Block Search technique and the BFGS method are developed and tested for nonlinear unconstrained optimization. A parallel version of an iterative solver and the pvsolve direct solver are incorporated into the BFGS method. Preliminary results on nonlinear unconstrained optimization test problems, using pvsolve in the analysis, show excellent parallel-vector performance indicating that these parallel-vector algorithms can be used in a new generation of finite-element based structural design/analysis-synthesis codes.
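    The core operation a Choleski-based solver such as pvsolve performs, factoring the symmetric positive-definite stiffness matrix once and then forward/back substituting, can be sketched as below. This is a serial toy version for illustration; pvsolve's contribution lies in its parallel-vector implementation, which this sketch does not attempt.

```python
import numpy as np

def cholesky_solve(K, F):
    """Solve K u = F for a symmetric positive-definite stiffness matrix K by
    factoring K = L L^T, then forward substitution (L y = F) and back
    substitution (L^T u = y)."""
    L = np.linalg.cholesky(K)
    n = K.shape[0]
    y = np.empty(n)
    for i in range(n):            # forward substitution
        y[i] = (F[i] - L[i, :i] @ y[:i]) / L[i, i]
    u = np.empty(n)
    for i in reversed(range(n)):  # back substitution
        u[i] = (y[i] - L[i + 1:, i] @ u[i + 1:]) / L[i, i]
    return u

# SPD test system: tridiagonal matrix typical of a 1-D bar finite-element model
n = 6
K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
F = np.ones(n)
u = cholesky_solve(K, F)
print(np.allclose(K @ u, F))  # True
```

    Factoring once and reusing L across many load vectors is what makes a direct solver attractive inside a design/analysis-synthesis loop, where the stiffness matrix changes less often than the right-hand sides.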

  1. Toward Intelligent Software Defect Detection

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.

  2. Multiphysics Code Demonstrated for Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Melis, Matthew E.

    1998-01-01

    The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.

  3. Learning to Analyze and Code Accounting Transactions in Interactive Mode.

    ERIC Educational Resources Information Center

    Bentz, William F.; Ambler, Eric E.

    An interactive computer-assisted instructional (CAI) system, called CODE, is used to teach transactional analysis, or coding, in elementary accounting. The first major component of CODE is TEACH, a program which controls student input and output. Following the statement of a financial position on a cathode ray tube, TEACH describes an event to…

  4. Porcupine: A visual pipeline tool for neuroimaging analysis

    PubMed Central

    Snoek, Lukas; Knapen, Tomas

    2018-01-01

    The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one’s analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one’s analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0. PMID:29746461

  5. Recent improvements of reactor physics codes in MHI

    NASA Astrophysics Data System (ADS)

    Kosaka, Shinya; Yamaji, Kazuya; Kirimura, Kazuki; Kamiyama, Yohei; Matsumoto, Hideki

    2015-12-01

    This paper introduces recent improvements to reactor physics codes at Mitsubishi Heavy Industries, Ltd. (MHI). MHI has developed a new neutronics design code system, Galaxy/Cosmo-S (GCS), for PWR core analysis. After TEPCO's Fukushima Daiichi accident, it became necessary to consider design extension conditions that had not been covered explicitly by the former safety licensing analyses. Under these circumstances, MHI made several improvements to the GCS code system. A new resonance calculation model for the lattice physics code and a homogeneous cross section representative model for the core simulator have been developed to cover a wider range of core conditions, corresponding to severe accident states such as anticipated transient without scram (ATWS) and criticality evaluation of a dried-up spent fuel pit. As a result of these improvements, the GCS code system is applicable to a very wide range of calculations with good accuracy for any core condition, as long as the fuel is not damaged. In this paper, the outline of the GCS code system is described briefly and recent relevant development activities are presented.

  6. Radiant Energy Measurements from a Scaled Jet Engine Axisymmetric Exhaust Nozzle for a Baseline Code Validation Case

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    1994-01-01

    A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.

  7. Recent improvements of reactor physics codes in MHI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kosaka, Shinya, E-mail: shinya-kosaka@mhi.co.jp; Yamaji, Kazuya; Kirimura, Kazuki

    2015-12-31

    This paper introduces recent improvements to reactor physics codes at Mitsubishi Heavy Industries, Ltd. (MHI). MHI has developed a new neutronics design code system, Galaxy/Cosmo-S (GCS), for PWR core analysis. After TEPCO's Fukushima Daiichi accident, it became necessary to consider design extension conditions that had not been covered explicitly by the former safety licensing analyses. Under these circumstances, MHI made several improvements to the GCS code system. A new resonance calculation model for the lattice physics code and a homogeneous cross section representative model for the core simulator have been developed to cover a wider range of core conditions, corresponding to severe accident states such as anticipated transient without scram (ATWS) and criticality evaluation of a dried-up spent fuel pit. As a result of these improvements, the GCS code system is applicable to a very wide range of calculations with good accuracy for any core condition, as long as the fuel is not damaged. In this paper, the outline of the GCS code system is described briefly and recent relevant development activities are presented.

  8. NESSUS/NASTRAN Interface

    NASA Technical Reports Server (NTRS)

    Millwater, Harry; Riha, David

    1996-01-01

    The NESSUS probabilistic analysis computer program has been developed with a built-in finite element analysis program, NESSUS/FEM. However, the NESSUS/FEM program is specialized for engine structures and may not contain sufficient features for other applications. In addition, users often become well acquainted with a particular finite element code and want to use that code for probabilistic structural analysis. For these reasons, this work was undertaken to develop an interface between NESSUS and NASTRAN such that NASTRAN can be used for the finite element analysis and NESSUS can be used for the probabilistic analysis. In addition, NESSUS was restructured such that other finite element codes could be more easily coupled with NESSUS. NESSUS has been enhanced such that NESSUS will modify the NASTRAN input deck for a given set of random variables, run NASTRAN, and read the NASTRAN results. The coordination between the two codes is handled automatically. The work described here was implemented within NESSUS 6.2, which was delivered to NASA in September 1995. The code runs on Unix machines: Cray, HP, Sun, SGI and IBM. The new capabilities have been implemented such that a user familiar with NESSUS using NESSUS/FEM and NASTRAN can immediately use NESSUS with NASTRAN. In other words, the interface with NASTRAN has been implemented in an analogous manner to the interface with NESSUS/FEM. Only finite element specific input has been changed. This manual is written as an addendum to the existing NESSUS 6.2 manuals. We assume users have access to the NESSUS manuals and are familiar with the operation of NESSUS, including probabilistic finite element analysis. Update pages to the NESSUS PFEM manual are contained in Appendix E. The finite element features of the code and the probabilistic analysis capabilities are summarized.

  9. A study of transonic aerodynamic analysis methods for use with a hypersonic aircraft synthesis code

    NASA Technical Reports Server (NTRS)

    Sandlin, Doral R.; Davis, Paul Christopher

    1992-01-01

    A means of performing routine transonic lift, drag, and moment analyses on hypersonic all-body and wing-body configurations was studied. The analysis method is to be used in conjunction with the Hypersonic Vehicle Optimization Code (HAVOC). A review of existing techniques is presented, after which three methods, chosen to represent a spectrum of capabilities, are tested and the results are compared with experimental data. The three methods consist of a wave drag code, a full potential code, and a Navier-Stokes code. The wave drag code, representing the empirical approach, has very fast CPU times but very limited and sporadic results. The full potential code provides results that compare favorably with the wind tunnel data, but with a dramatic increase in computational time. Even more extreme is the Navier-Stokes code, which provides the most favorable and complete results, but with a very large turnaround time. The full potential code, TRANAIR, is used for additional analyses because of the superior results it can provide over empirical and semi-empirical methods, and because of its automated grid generation. TRANAIR analyses include an all-body hypersonic cruise configuration and an oblique flying wing supersonic transport.

  10. Fuel burnup analysis for IRIS reactor using MCNPX and WIMS-D5 codes

    NASA Astrophysics Data System (ADS)

    Amin, E. A.; Bashter, I. I.; Hassan, Nabil M.; Mustafa, S. S.

    2017-02-01

    The International Reactor Innovative and Secure (IRIS) reactor is a compact power reactor designed with special features. It contains Integral Fuel Burnable Absorber (IFBA). The core is heterogeneous both axially and radially. This work provides a full-core burnup analysis for the IRIS reactor using the MCNPX and WIMS-D5 codes. Criticality calculations, radial and axial power distributions, and the nuclear peaking factor at different stages of burnup were studied. Effective multiplication factor values for the core were estimated by coupling the MCNPX code with the WIMS-D5 code and compared with SAS2H/KENO-V values at different stages of burnup. The two calculation codes show good agreement and correlation. The radial and axial power values for the full core were also compared with published results from the SAS2H/KENO-V code (at the beginning and end of reactor operation). The behavior of both the radial and axial power distributions is quite similar to the data published for SAS2H/KENO-V. The peaking factor values estimated in the present work are close to those calculated by SAS2H/KENO-V.

  11. Demonstration of emulator-based Bayesian calibration of safety analysis codes: Theory and formulation

    DOE PAGES

    Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert

    2015-05-28

    System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
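    As a minimal sketch of the emulator-based calibration idea (a plain GP emulator, not the FFGP model itself): fit a GP to a handful of runs of a cheap stand-in "code", then calibrate its input parameter with Metropolis MCMC against an observation. The friction-factor-like response and all numbers are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def expensive_code(theta):
        # Stand-in for the system code: a friction-factor-like response.
        return 0.316 * theta ** -0.25

    # --- Build a cheap GP emulator from a handful of code runs ---
    X = np.linspace(1.0, 5.0, 8)           # design points (code inputs)
    y = expensive_code(X)                   # code outputs

    def rbf(a, b, ell=1.0, sf=1.0):
        return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

    K = rbf(X, X) + 1e-8 * np.eye(len(X))
    alpha = np.linalg.solve(K, y)

    def emulate(theta):
        return rbf(np.atleast_1d(theta), X) @ alpha  # GP posterior mean

    # --- Metropolis MCMC against a "measurement" ---
    theta_true = 3.0
    obs = expensive_code(theta_true)        # noise-free for illustration
    sigma = 0.002                           # assumed measurement noise

    def log_post(theta):
        if not 1.0 <= theta <= 5.0:         # uniform prior on [1, 5]
            return -np.inf
        r = obs - emulate(theta)[0]
        return -0.5 * (r / sigma) ** 2

    chain, cur = [], 2.0
    lp = log_post(cur)
    for _ in range(4000):
        prop = cur + 0.2 * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            cur, lp = prop, lp_prop
        chain.append(cur)
    post = np.array(chain[1000:])
    print(post.mean())    # posterior concentrates near theta_true
    ```

    The point of the emulator is visible in the loop: each MCMC step queries the cheap GP mean rather than rerunning the expensive code, which is what makes MCMC calibration feasible.
    
    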

  12. DRG coding practice: a nationwide hospital survey in Thailand

    PubMed Central

    2011-01-01

    Background Diagnosis Related Group (DRG) payment is preferred by healthcare reform in various countries, but its implementation in resource-limited countries has not been fully explored. Objectives This study aimed (1) to compare the characteristics of hospitals in Thailand that were audited with those that were not and (2) to develop a simplified scale to measure hospital coding practice. Methods A questionnaire survey was conducted of 920 hospitals in the Summary and Coding Audit Database (SCAD hospitals, all of which were audited in 2008 because of suspicious reports of possible DRG miscoding); the survey also included 390 non-SCAD hospitals. The questionnaire asked about general demographics of the hospitals and hospital coding structure and process, and also included a set of 63 opinion-oriented items on current hospital coding practice. Descriptive statistics and exploratory factor analysis (EFA) were used for data analysis. Results SCAD and non-SCAD hospitals differed in many aspects, especially the number of medical statisticians, the experience of medical statisticians and physicians, and the number of certified coders. Factor analysis revealed a simplified 3-factor, 20-item model to assess hospital coding practice and classify hospital intention. Conclusion Hospital providers should not be assumed capable of producing high-quality DRG codes, especially in resource-limited settings. PMID:22040256

  13. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas Downar; E. Lewis

    2005-08-31

    Methods were developed for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code PARCS.

  14. Aerodynamic Analysis of the M33 Projectile Using the CFX Code

    DTIC Science & Technology

    2011-12-01

    The M33 projectile has been analyzed using the ANSYS CFX code, which is based on the numerical solution of the full Navier-Stokes equations. Simulation data were obtained using the CFX code. The ANSYS CFX code is a commercial CFD program used to simulate fluid flow in a variety of applications such as gas turbines.

  15. VIC: A Computer Analysis of Verbal Interaction Category Systems.

    ERIC Educational Resources Information Center

    Kline, John A.; And Others

    VIC is a computer program for the analysis of verbal interaction category systems, especially the Flanders interaction analysis system. The observer codes verbal behavior on coding sheets for later machine scoring. A matrix is produced by the program showing the number and percentages of times that a particular cell describes classroom behavior.…

  16. Iterative categorization (IC): a systematic technique for analysing qualitative data.

    PubMed

    Neale, Joanne

    2016-06-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  17. Efficiency of International Classification of Diseases, Ninth Revision, Billing Code Searches to Identify Emergency Department Visits for Blood or Body Fluid Exposures through a Statewide Multicenter Database

    PubMed Central

    Rosen, Lisa M.; Liu, Tao; Merchant, Roland C.

    2016-01-01

    BACKGROUND Blood and body fluid exposures are frequently evaluated in emergency departments (EDs). However, efficient and effective methods for estimating their incidence are not yet established. OBJECTIVE Evaluate the efficiency and accuracy of estimating statewide ED visits for blood or body fluid exposures using International Classification of Diseases, Ninth Revision (ICD-9), code searches. DESIGN Secondary analysis of a database of ED visits for blood or body fluid exposure. SETTING EDs of 11 civilian hospitals throughout Rhode Island from January 1, 1995, through June 30, 2001. PATIENTS Patients presenting to the ED for possible blood or body fluid exposure were included, as determined by prespecified ICD-9 codes. METHODS Positive predictive values (PPVs) were estimated to determine the ability of 10 ICD-9 codes to distinguish ED visits for blood or body fluid exposure from ED visits that were not for blood or body fluid exposure. Recursive partitioning was used to identify an optimal subset of ICD-9 codes for this purpose. Random-effects logistic regression modeling was used to examine variations in ICD-9 coding practices and styles across hospitals. Cluster analysis was used to assess whether the choice of ICD-9 codes was similar across hospitals. RESULTS The PPV for the original 10 ICD-9 codes was 74.4% (95% confidence interval [CI], 73.2%–75.7%), whereas the recursive partitioning analysis identified a subset of 5 ICD-9 codes with a PPV of 89.9% (95% CI, 88.9%–90.8%) and a misclassification rate of 10.1%. The ability, efficiency, and use of the ICD-9 codes to distinguish types of ED visits varied across hospitals. CONCLUSIONS Although an accurate subset of ICD-9 codes could be identified, variations across hospitals related to hospital coding style, efficiency, and accuracy greatly affected estimates of the number of ED visits for blood or body fluid exposure. PMID:22561713
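    The core computation in such a study, the positive predictive value of a candidate billing-code subset, is simple to state in code. The visit records and audit labels below are invented for illustration; the codes shown are merely plausible examples, not the study's actual list:

    ```python
    # Hypothetical audit results: (ICD-9 code assigned, truly an exposure?).
    visits = [
        ("E920.5", True), ("E920.5", True), ("V15.85", True),
        ("E906.5", False), ("V15.85", True), ("E920.5", False),
        ("879.8", False), ("V15.85", True),
    ]

    def ppv(records, codes):
        """Positive predictive value of a code subset: of the visits retrieved
        by these codes, the fraction that truly were exposure visits."""
        hits = [truth for code, truth in records if code in codes]
        return sum(hits) / len(hits) if hits else 0.0

    all_codes = {c for c, _ in visits}
    print(ppv(visits, all_codes))            # PPV of the full code list
    print(ppv(visits, {"V15.85"}))           # PPV of a single high-yield code
    ```

    Recursive partitioning, as used in the study, then amounts to searching for the subset of codes that trades retrieval breadth against a higher PPV.
    
    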

  18. An Analysis of Language Code Used by the Cross-Married Couples, Banjarese-Javanese Ethnics: A Case Study in South Kalimantan Province, Indonesia

    ERIC Educational Resources Information Center

    Supiani

    2016-01-01

    This research aims to describe the use of language code applied by the participants and to find out the factors influencing the choice of language codes. This research is qualitative research that describe the use of language code in the cross married couples. The data are taken from the discourses about language code phenomena dealing with the…

  19. A Flexible and Non-instrusive Approach for Computing Complex Structural Coverage Metrics

    NASA Technical Reports Server (NTRS)

    Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela

    2015-01-01

    Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
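    For intuition about why MC/DC is costly to track, the obligation itself is easy to state: for every condition there must be a pair of tests that differ only in that condition and flip the decision outcome. A brute-force sketch over a decision's truth table (fine for a handful of conditions, unrelated to the paper's static preprocessing):

    ```python
    from itertools import product

    def mcdc_pairs(decision, n):
        """For each of n conditions, find a pair of test vectors that differ
        only in that condition and flip the decision (an MC/DC witness)."""
        table = {v: decision(*v) for v in product((False, True), repeat=n)}
        pairs = {}
        for i in range(n):
            for v, out in table.items():
                w = v[:i] + (not v[i],) + v[i + 1:]
                if table[w] != out:
                    pairs.setdefault(i, (v, w))
        return pairs

    # Example decision with three conditions: a and (b or c)
    pairs = mcdc_pairs(lambda a, b, c: a and (b or c), 3)
    print(pairs)   # a witness pair per condition index
    ```

    A condition missing from the returned dictionary could never independently affect the outcome, i.e. the decision would be untestable to MC/DC for that condition.
    
    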

  20. Techniques for the analysis of data from coded-mask X-ray telescopes

    NASA Technical Reports Server (NTRS)

    Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.

    1987-01-01

    Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed and ways of using FFT techniques to do the deconvolution considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
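    A one-dimensional sketch of FFT-based correlation decoding for an optimally coded (URA-like) mask, using a quadratic-residue sequence; the geometry is simplified and the prime p = 59 is an arbitrary illustrative choice:

    ```python
    import numpy as np

    p = 59  # prime with p % 4 == 3: quadratic residues give a URA-like 1D mask
    mask = np.array([1] + [1 if pow(i, (p - 1) // 2, p) == 1 else 0
                           for i in range(1, p)], dtype=float)

    def ccorr(a, b):
        # Circular cross-correlation via FFT.
        return np.real(np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))))

    # Point source at offset 17: the shadowgram is a shifted copy of the mask.
    sky = np.zeros(p); sky[17] = 1.0
    detector = np.real(np.fft.ifft(np.fft.fft(sky) * np.fft.fft(mask)))

    decoded = ccorr(detector, mask)   # correlation (balanced) decoding
    print(int(np.argmax(decoded)))    # prints 17: the source position
    ```

    The near-delta autocorrelation of the quadratic-residue mask is what makes the decoded peak stand cleanly above flat sidelobes; the paper's FFT deconvolution exploits the same property in two dimensions.
    
    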

  1. Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code

    DTIC Science & Technology

    1979-06-01

    A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. The computed dose rate was integrated to give a number that could be compared with measurements made using thermoluminescent dosimeters (TLDs).

  2. Three-Dimensional Numerical Analyses of Earth Penetration Dynamics

    DTIC Science & Technology

    1979-01-31

    The code uses a Lagrangian formulation based on the HEMP method and has been adapted and validated for treatment of normal-incidence (axisymmetric) impact. Also produced, using this code, is a detailed analysis of the structural response of the EPW. This analysis is generated using a nonlinear dynamic, elastic-plastic finite element formulation based on the HEMP scheme. Thus, the code has the same material modeling capabilities and ability to track large-scale motion found in the WAVE-L code.

  3. Identification and Analysis of Critical Gaps in Nuclear Fuel Cycle Codes Required by the SINEMA Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adrian Miron; Joshua Valentine; John Christenson

    2009-10-01

    The current state of the art in nuclear fuel cycle (NFC) modeling is an eclectic mixture of codes with various levels of applicability, flexibility, and availability. In support of advanced fuel cycle systems analyses, especially those by the Advanced Fuel Cycle Initiative (AFCI), the University of Cincinnati, in collaboration with Idaho State University, carried out a detailed review of the existing codes describing various aspects of the nuclear fuel cycle and identified the research and development needs required for a comprehensive model of the global nuclear energy infrastructure and the associated nuclear fuel cycles. Relevant information obtained on the NFC codes was compiled into a relational database that allows easy access to the various codes' properties. Additionally, the research analyzed the gaps in the NFC computer codes with respect to their potential integration into programs that perform comprehensive NFC analysis.

  4. Exclusively visual analysis of classroom group interactions

    NASA Astrophysics Data System (ADS)

    Tucker, Laura; Scherr, Rachel E.; Zickler, Todd; Mazur, Eric

    2016-12-01

    Large-scale audiovisual data that measure group learning are time consuming to collect and analyze. As an initial step towards scaling qualitative classroom observation, we qualitatively coded classroom video using an established coding scheme with and without its audio cues. We find that interrater reliability is as high when using visual data only—without audio—as when using both visual and audio data to code. Also, interrater reliability is high when comparing use of visual and audio data to visual-only data. We see a small bias to code interactions as group discussion when visual and audio data are used compared with video-only data. This work establishes that meaningful educational observation can be made through visual information alone. Further, it suggests that after initial work to create a coding scheme and validate it in each environment, computer-automated visual coding could drastically increase the breadth of qualitative studies and allow for meaningful educational analysis on a far greater scale.
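    Interrater reliability in coding studies like this is commonly reported as Cohen's kappa; a self-contained sketch with invented labels (the study's actual coding scheme and reliability statistic may differ):

    ```python
    from collections import Counter

    def cohens_kappa(r1, r2):
        """Chance-corrected agreement between two coders' label sequences."""
        assert len(r1) == len(r2)
        n = len(r1)
        po = sum(a == b for a, b in zip(r1, r2)) / n        # observed agreement
        c1, c2 = Counter(r1), Counter(r2)
        pe = sum(c1[k] * c2[k] for k in c1) / n**2          # chance agreement
        return (po - pe) / (1 - pe)

    # Hypothetical codes assigned to the same clips in a video-only pass
    # and a video+audio pass.
    video_only  = ["group", "group", "lecture", "group", "other", "lecture"]
    video_audio = ["group", "group", "lecture", "group", "group", "lecture"]
    print(round(cohens_kappa(video_only, video_audio), 3))   # prints 0.7
    ```

    Chance correction matters here: raw percent agreement would overstate reliability whenever one code (like "group") dominates the clips.
    
    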

  5. Adaptive distributed source coding.

    PubMed

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.

  6. Application of a Two-dimensional Unsteady Viscous Analysis Code to a Supersonic Throughflow Fan Stage

    NASA Technical Reports Server (NTRS)

    Steinke, Ronald J.

    1989-01-01

    The Rai ROTOR1 code for two-dimensional, unsteady viscous flow analysis was applied to a supersonic throughflow fan stage design. The axial Mach number for this fan design increases from 2.0 at the inlet to 2.9 at the outlet. The Rai code uses overlapped O- and H-grids that are appropriately packed. The Rai code was run on a Cray XMP computer; then data postprocessing and graphics were performed to obtain detailed insight into the stage flow. The large rotor wakes uniformly traversed the rotor-stator interface and dispersed as they passed through the stator passage. Only weak blade shock losses were computed, which supports the design goals. High viscous effects caused large blade wakes and a low fan efficiency. Rai code flow predictions were essentially steady for the rotor, and they compared well with Chima rotor viscous code predictions based on a C-grid of similar density.

  7. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  8. Frequency- and Time-Domain Methods in Soil-Structure Interaction Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolisetti, Chandrakanth; Whittaker, Andrew S.; Coleman, Justin L.

    2015-06-01

    Soil-structure interaction (SSI) analysis in the nuclear industry is currently performed using linear codes that function in the frequency domain. There is a consensus that these frequency-domain codes give reasonably accurate results for low-intensity ground motions that result in almost linear response. For higher-intensity ground motions, which may result in nonlinear response in the soil, the structure, or the vicinity of the foundation, the adequacy of frequency-domain codes is unproven. Nonlinear analysis, which is only possible in the time domain, is theoretically more appropriate in such cases. These methods are available but are rarely used due to the large computational requirements and a lack of experience among analysts and regulators. This paper presents an assessment of the linear frequency-domain code SASSI, which is widely used in the nuclear industry, and the time-domain commercial finite-element code LS-DYNA, for SSI analysis. The assessment involves benchmarking the SSI analysis procedure in LS-DYNA against SASSI for linearly elastic models. After affirming that SASSI and LS-DYNA result in almost identical responses for these models, they are used to perform nonlinear SSI analyses of two structures founded on soft soil. An examination of the results shows that, in spite of using identical material properties, the predictions of frequency- and time-domain codes are significantly different in the presence of nonlinear behavior such as gapping and sliding of the foundation.

  9. Hearing the voices of service user researchers in collaborative qualitative data analysis: the case for multiple coding.

    PubMed

    Sweeney, Angela; Greenwood, Kathryn E; Williams, Sally; Wykes, Til; Rose, Diana S

    2013-12-01

    Health research is frequently conducted in multi-disciplinary teams, with these teams increasingly including service user researchers. Whilst it is common for service user researchers to be involved in data collection--most typically interviewing other service users--it is less common for service user researchers to be involved in data analysis and interpretation. This means that a unique and significant perspective on the data is absent. This study uses an empirical report of a study on Cognitive Behavioural Therapy for psychosis (CBTp) to demonstrate the value of multiple coding in enabling service users' voices to be heard in team-based qualitative data analysis. The CBTp study employed multiple coding to analyse service users' discussions of CBTp from the perspectives of a service user researcher, a clinical researcher and a psychology assistant. Multiple coding was selected to enable multiple perspectives to analyse and interpret data, to understand and explore differences and to build multi-disciplinary consensus. Multiple coding enabled the team to understand where our views were commensurate and incommensurate and to discuss and debate differences. Through the process of multiple coding, we were able to build strong consensus about the data from multiple perspectives, including that of the service user researcher. Multiple coding is an important method for understanding and exploring multiple perspectives on data and building team consensus. This can be contrasted with inter-rater reliability, which is only appropriate in limited circumstances. We conclude that multiple coding is an appropriate and important means of hearing service users' voices in qualitative data analysis. © 2012 John Wiley & Sons Ltd.

  10. Modeling of rolling element bearing mechanics. Theoretical manual

    NASA Technical Reports Server (NTRS)

    Merchant, David H.; Greenhill, Lyn M.

    1994-01-01

    This report documents the theoretical basis for the Rolling Element Bearing Analysis System (REBANS) analysis code which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings; duplex angular contact ball bearings; and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It is comprised of two main programs: the Preprocessor for Bearing Analysis (PREBAN) which creates the input files for the main analysis program; and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. A companion report addresses the input instructions for and features of the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  11. RTE: A computer code for Rocket Thermal Evaluation

    NASA Technical Reports Server (NTRS)

    Naraghi, Mohammad H. N.

    1995-01-01

    The numerical model for a rocket thermal analysis code (RTE) is discussed. RTE is a comprehensive thermal analysis code for thermal analysis of regeneratively cooled rocket engines. The input to the code consists of the composition of the fuel/oxidant mixture and flow rates, chamber pressure, coolant temperature and pressure, dimensions of the engine, materials, and the number of nodes in different parts of the engine. The code allows for temperature variation in the axial, radial, and circumferential directions. By implementing an iterative scheme, it provides the nodal temperature distribution, rates of heat transfer, and hot-gas and coolant thermal and transport properties. The fuel/oxidant mixture ratio can be varied along the thrust chamber. This feature allows the user to incorporate a non-equilibrium model or an energy release model for the hot-gas side. The user has the option of bypassing the hot-gas-side calculations and directly inputting the gas-side fluxes. This feature is used to link RTE to a boundary layer module for the hot-gas-side heat flux calculations.
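    The iterative scheme for nodal temperatures can be illustrated with a much-reduced analogue: Gauss-Seidel relaxation of steady 1D conduction between a fixed hot-gas-side and coolant-side temperature. This is a toy sketch under stated simplifications (constant conductivity, one dimension), not RTE's actual multi-directional model:

    ```python
    import numpy as np

    # 1D steady conduction through a wall slice: hot-gas side fixed at 900 K,
    # coolant side at 300 K, interior nodes relaxed iteratively.
    n = 21
    T = np.full(n, 300.0)
    T[0], T[-1] = 900.0, 300.0

    for sweep in range(5000):            # Gauss-Seidel sweeps
        delta = 0.0
        for i in range(1, n - 1):
            new = 0.5 * (T[i - 1] + T[i + 1])
            delta = max(delta, abs(new - T[i]))
            T[i] = new
        if delta < 1e-8:                 # converged
            break

    # With constant conductivity the converged profile is linear.
    print(T[n // 2])                     # midpoint temperature, about 600 K
    ```

    RTE's real iteration additionally updates temperature-dependent properties and heat fluxes each pass, but the fixed-point structure is the same.
    
    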

  12. Spatial transform coding of color images.

    NASA Technical Reports Server (NTRS)

    Pratt, W. K.

    1971-01-01

    The application of the transform-coding concept to the coding of color images represented by three primary color planes of data is discussed. The principles of spatial transform coding are reviewed and the merits of various methods of color-image representation are examined. A performance analysis is presented for the color-image transform-coding system. Results of a computer simulation of the coding system are also given. It is shown that, by transform coding, the chrominance content of a color image can be coded with an average of 1.0 bits per element or less without serious degradation. If luminance coding is also employed, the average rate reduces to about 2.0 bits per element or less.
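    A minimal numeric sketch of the transform-coding idea for chrominance: take an 8x8 DCT (one common choice of spatial transform), discard all but the largest coefficients as a crude stand-in for coarse bit allocation, and invert. The synthetic patch and the keep-8 choice are illustrative assumptions:

    ```python
    import numpy as np

    def dct_matrix(n=8):
        # Orthonormal DCT-II basis as an n x n matrix.
        k = np.arange(n)
        C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
        C[0] /= np.sqrt(2)
        return C * np.sqrt(2 / n)

    C = dct_matrix()

    def code_block(block, keep=8):
        """Transform an 8x8 chrominance block, keep only the `keep` largest
        coefficients (a crude stand-in for ~1 bit/element allocation)."""
        F = C @ block @ C.T
        thresh = np.sort(np.abs(F).ravel())[-keep]
        F[np.abs(F) < thresh] = 0.0
        return C.T @ F @ C

    # Smooth synthetic chrominance patch: low-pass, so few coefficients suffice.
    x, y = np.meshgrid(np.arange(8), np.arange(8))
    patch = 0.5 + 0.2 * np.sin(x / 4.0) + 0.1 * np.cos(y / 4.0)
    err = np.abs(code_block(patch) - patch).max()
    print(err)   # small: chrominance survives aggressive coefficient pruning
    ```

    Because chrominance planes are smoother than luminance, their transform energy concentrates in few coefficients, which is why the abstract's ~1 bit/element figure is plausible.
    
    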

  13. Electron Temperature Gradient Scale Measurements in ICRF Heated Plasmas at Alcator C-Mod

    NASA Astrophysics Data System (ADS)

    Houshmandyar, Saeid; Phillips, Perry E.; Rowan, William L.; Howard, Nathaniel T.; Greenwald, Martin

    2016-10-01

    It is generally believed that the temperature gradient is a driving mechanism for turbulent transport in hot, magnetically confined plasmas. A feature of many anomalous transport models is a critical threshold value (LC) for the gradient scale length, above which both the turbulence and the heat transport increase. This threshold is also predicted by recent multi-scale gyrokinetic simulations, which are focused on addressing electron (and ion) heat transport in tokamaks. Recently, we have established an accurate technique (BT-jog) to directly measure the electron temperature gradient scale length (L_Te = T_e/∇T_e) profile, using a high-spatial-resolution radiometer-based electron cyclotron emission (ECE) diagnostic. For the work presented here, electrons are heated by ion cyclotron range of frequencies (ICRF) waves through minority heating in L-mode plasmas at different power levels; TRANSP runs determine the electron heat fluxes, and the scale lengths are measured through the BT-jog technique. Furthermore, the experiment is extended to different plasma currents and electron densities, by which the parametric dependence of LC on magnetic shear, safety factor and density will be investigated. This work is supported by U.S. DoE OFES, under Award No. DE-FG03-96ER-54373.
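    Given a measured Te(r) profile, extracting the gradient scale length is a small numerical step. A sketch with a synthetic exponential profile whose scale length is known; the values are illustrative, and the sign convention L_Te = -Te/(dTe/dr) is used so that L_Te is positive for a decaying profile:

    ```python
    import numpy as np

    def scale_length(r, te):
        """Gradient scale length L_Te = -Te/(dTe/dr) along a profile."""
        dte = np.gradient(te, r)       # central differences on the grid
        return -te / dte

    # Synthetic profile with known scale length: Te = Te0 * exp(-r/L).
    L = 0.08                           # 8 cm, an illustrative value
    r = np.linspace(0.01, 0.2, 200)    # minor-radius-like coordinate (m)
    te = 2.0 * np.exp(-r / L)          # keV
    lte = scale_length(r, te)
    print(lte[100])                    # close to 0.08 everywhere
    ```

    In practice the numerator and denominator come from fitted ECE profiles rather than raw point-wise differences, since differentiation amplifies measurement noise.
    
    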

  14. Energy Confinement Recovery in Low Collisionality ITER Shape Plasmas with Applied Resonant Magnetic Perturbations (RMPs)

    NASA Astrophysics Data System (ADS)

    Cui, L.; Grierson, B.; Logan, N.; Nazikian, R.

    2016-10-01

    Application of RMPs to low collisionality (ν*e < 0.4) ITER shape plasmas on DIII-D leads to a rapid reduction in stored energy due to density pumpout that is sometimes followed by a gradual recovery in the plasma stored energy. Understanding this confinement recovery is essential to optimize the confinement of RMP plasmas in present and future devices such as ITER. Transport modeling using TRANSP+TGLF indicates that the core a/LTi is stiff in these plasmas while the ion temperature gradient is much less stiff in the pedestal region. The reduction in the edge density during pumpout leads to an increase in the core ion temperature predicted by TGLF based on experimental data. This is correlated to the increase in the normalized ion heat flux. Transport stiffness in the core combined with an increase in the edge a/LTi results in an increase of the plasma stored energy, consistent with experimental observations. For plasmas where the edge density is controlled using deuterium gas puffs, the effect of the RMP on ion thermal confinement is significantly reduced. Work supported by US DOE Grant DE-FC02-04ER54698 and DE-AC02-09CH11466.

  15. Beam ion susceptibility to loss in NSTX-U plasmas

    NASA Astrophysics Data System (ADS)

    Darrow, Douglass; Fredrickson, Eric; Podesta, Mario; Liu, Deyong; White, Roscoe

    2016-10-01

    NSTX-U has operated with three additional neutral beam sources whose tangency radii of 1.1, 1.2, and 1.3 m are significantly larger than the 0.5, 0.6, and 0.7 m tangency radii of the neutral beams previously used in NSTX. These latter beams have also been retained for NSTX-U. Here, we present an estimate of the susceptibility of the beam ions from all the various sources to loss under a range of NSTX-U plasma conditions. This estimation is based upon TRANSP calculations of beam ion deposition in phase space, and the location of the FLR-corrected loss boundary in that phase space. Since losses are often observed at the injection energy, a simple measure of loss susceptibility is the change in canonical toroidal momentum required to move beam ions from their deposition point to the loss boundary, as a function of magnetic moment. To augment this simple estimate, we intend to report some associated transport coefficients of beam ions due to AE activity. Work supported by U.S. DOE DE-AC0209CH11466, DE-FG02-06ER54867, and DE-FG03-02ER54681.

  16. Development of a Novel Method for Determination of Momentum Transport Parameters

    NASA Astrophysics Data System (ADS)

    Peters, Michael J.; Guttenfelder, Walter; Scotti, Filippo; Kaye, Stanley M.; Solomon, Wayne M.

    2015-11-01

    The toroidal momentum pinch velocity Vφ and diffusivity χφ in NSTX were previously determined from the transient response of the toroidal rotation Ω following applied n = 3 magnetic perturbations that brake the plasma. Assuming Π = nmR²(−χφ ∇Ω + Vφ Ω), where the momentum flux Π is determined using TRANSP, these local analyses used fits to Ω and ∇Ω to obtain χφ and Vφ one flux surface at a time. This work attempts to improve the accuracy of the inferred χφ(r) and Vφ(r) profiles by utilizing many flux surfaces simultaneously. We employ a nonlinear least-squares minimization that compares the entire perturbed rotation profile evolution Ω(r,t) against the profile evolution generated by solving the momentum transport equation. We compare the local and integrated approaches and discuss their limitations. We also apply the integrated approach to determine whether an additional residual stress contribution (independent of Ω or ∇Ω) can be inferred given experimental uncertainties. This work was supported by the U.S. Department of Energy SULI program and contract DE-AC02-09CH11466, as well as LLNL contract DE-AC52-07NA27344.
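
    The integrated fitting approach can be sketched in a few lines: evolve a simplified 1D momentum transport equation forward with trial (χφ, Vφ), and least-squares fit the whole profile evolution at once. This is a toy sketch, not the NSTX analysis: geometry factors (nmR²) are set to one, the data are synthetic, and the grid and boundary condition are invented.

```python
import numpy as np
from scipy.optimize import least_squares

def evolve_rotation(chi, V, omega0, r, dt, nsteps):
    """Explicit evolution of dOmega/dt = -(1/r) d(r Pi)/dr with
    Pi = -chi dOmega/dr + V Omega (nmR^2 factors set to 1)."""
    omega = omega0.copy()
    dr = r[1] - r[0]
    for _ in range(nsteps):
        flux = -chi * np.gradient(omega, dr) + V * omega
        div = np.gradient(r * flux, dr) / np.maximum(r, dr)  # avoid r ~ 0
        omega = omega - dt * div
        omega[-1] = 0.0  # braked edge boundary condition
    return omega

# synthetic "measured" evolution generated with known parameters
r = np.linspace(0.01, 1.0, 50)
omega0 = 1.0 - r**2
target = evolve_rotation(chi=0.5, V=-0.2, omega0=omega0, r=r, dt=1e-4, nsteps=200)

def residual(p):
    return evolve_rotation(p[0], p[1], omega0, r, 1e-4, 200) - target

# bounds keep the explicit scheme numerically stable during the search
fit = least_squares(residual, x0=[1.0, 0.0], bounds=([0.0, -1.0], [2.0, 1.0]))
print(fit.x)  # fitted (chi, V)
```

    Because every radius and time slice enters one residual vector, the fit uses all flux surfaces simultaneously, which is the essential difference from the surface-by-surface local analysis.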

  17. Measurement and Simulation of First-Orbit Fast-Ion D-Alpha Emission and the Application to Fast-Ion Loss Detection in the DIII-D Tokamak

    NASA Astrophysics Data System (ADS)

    Bolte, Nathan; Heidbrink, W. W.; Pace, D. C.; van Zeeland, M. A.; Chen, X.

    2015-11-01

    A new fast-ion diagnostic method uses passive emission of D-alpha radiation to determine fast-ion losses quantitatively. The passive fast-ion D-alpha simulation (P-FIDAsim) forward models the Doppler-shifted spectra of first-orbit fast ions that charge exchange with edge neutrals. Simulated spectra are up to 80% correlated with experimental spectra. Calibrated spectra are used to estimate the 2D neutral density profile by inverting simulated spectra. The inferred neutral density shows the expected increase toward each x-point, with an average value of 8 × 10⁹ cm⁻³ at the plasma boundary and 1 × 10¹¹ cm⁻³ near the wall. Measuring and simulating first-orbit spectra effectively "calibrates" the system, allowing for the quantification of more general fast-ion losses. Sawtooth crashes are estimated to eject 1.2% of the fast-ion inventory, in good agreement with a 1.7% loss estimate made by TRANSP. Sightlines sensitive to passing ions observe larger sawtooth losses than sightlines sensitive to trapped ions. Supported by US DOE under SC-G903402, DE-FC02-04ER54698.

  18. Enhancement of Edge Stability with Lithium Wall Coatings in NSTX

    NASA Astrophysics Data System (ADS)

    Maingi, R.; Bell, R. E.; Leblanc, B. P.; Kaita, R.; Kaye, S. M.; Kugel, H. W.; Mansfield, D. K.; Osborne, T. H.

    2008-11-01

    ELM reduction or elimination while maintaining high confinement is essential for ITER, which has been designed for H-mode operation. Large ELMs are thought to be triggered by exceeding either edge current density and/or pressure gradient limits (peeling, ballooning modes). Stability calculations show that spherical tori should have access to higher pressure gradients and pedestal heights than higher-R/a tokamaks, owing to access to second stability regimes [1]. An ELM-free regime was recently observed in NSTX following the application of lithium onto the graphite plasma facing components [2]. ELMs were eliminated in phases [3], with the resulting pressure gradients and pedestal widths increasing substantially. Calculations with TRANSP have shown that the edge bootstrap current increased substantially, consistent with second stability access. These ELM-free discharges have a substantial improvement in energy confinement, up to the global βN ≈ 5.5 limit. * Supported by US DOE DE-FG02-04ER54520, DE-AC-76CH03073, and DE-FC02-04ER54698. [1] P. B. Snyder, et al., Plasma Phys. Contr. Fusion 46 (2004) A131. [2] H. W. Kugel, et al., Phys. Plasmas 15 (2008) 056118. [3] D. M. Mansfield, et al., J. Nucl. Materials (2009) submitted.

  19. Proline isomerization in the C-terminal region of HSP27.

    PubMed

    Alderson, T Reid; Benesch, Justin L P; Baldwin, Andrew J

    2017-07-01

    In mammals, small heat-shock proteins (sHSPs) typically assemble into interconverting, polydisperse oligomers. The dynamic exchange of sHSP oligomers is regulated, at least in part, by molecular interactions between the α-crystallin domain and the C-terminal region (CTR). Here we report solution-state nuclear magnetic resonance (NMR) spectroscopy investigations of the conformation and dynamics of the disordered and flexible CTR of human HSP27, a systemically expressed sHSP. We observed multiple NMR signals for residues in the vicinity of proline 194, and we determined that, while all observed forms are highly disordered, the extra resonances arise from cis-trans peptidyl-prolyl isomerization about the G193-P194 peptide bond. The cis-P194 state is populated to near 15% at physiological temperatures, and, although both cis- and trans-P194 forms of the CTR are flexible and dynamic, both states show a residual but differing tendency to adopt β-strand conformations. In NMR spectra of an isolated CTR peptide, we observed similar evidence for isomerization involving proline 182, found within the IPI/V motif. Collectively, these data indicate a potential role for cis-trans proline isomerization in regulating the oligomerization of sHSPs.

  20. A transonic-small-disturbance wing design methodology

    NASA Technical Reports Server (NTRS)

    Phillips, Pamela S.; Waggoner, Edgar G.; Campbell, Richard L.

    1988-01-01

    An automated transonic design code has been developed which modifies an initial airfoil or wing in order to generate a specified pressure distribution. The design method uses an iterative approach that alternates between a potential-flow analysis and a design algorithm that relates changes in surface pressure to changes in geometry. The analysis code solves an extended small-disturbance potential-flow equation and can model a fuselage, pylons, nacelles, and a winglet in addition to the wing. A two-dimensional option is available for airfoil analysis and design. Several two- and three-dimensional test cases illustrate the capabilities of the design code.

  1. Modeling and Analysis of Actinide Diffusion Behavior in Irradiated Metal Fuel

    NASA Astrophysics Data System (ADS)

    Edelmann, Paul G.

    There have been numerous attempts to model fast reactor fuel behavior in the last 40 years. The US currently does not have a fully reliable tool to simulate the behavior of metal fuels in fast reactors. The experimental database necessary to validate the codes is also very limited. The DOE-sponsored Advanced Fuels Campaign (AFC) has performed various experiments that are ready for analysis. Current metal fuel performance codes are either not available to the AFC or have limitations and deficiencies in predicting AFC fuel performance. A modified version of a new fuel performance code, FEAST-Metal, was employed in this investigation with useful results. This work explores the modeling and analysis of AFC metallic fuels using FEAST-Metal, particularly in the area of constituent actinide diffusion behavior. The FEAST-Metal code calculations for this work were conducted at Los Alamos National Laboratory (LANL) in support of on-going activities related to sensitivity analysis of fuel performance codes. A sensitivity analysis of FEAST-Metal was completed to identify important macroscopic parameters of interest to modeling and simulation of metallic fuel performance. A modification was made to the FEAST-Metal constituent redistribution model to enable accommodation of newer AFC metal fuel compositions with verified results. Applicability of this modified model for sodium fast reactor metal fuel design is demonstrated.

  2. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach with symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems, which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model, which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.
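
    As an illustration of the kind of program described above, a minimal hand-written EM loop for a two-component Gaussian mixture (the model underlying the segmentation step) looks as follows. This is a generic sketch, not AutoBayes output, and the pixel intensities are synthetic rather than Hubble data.

```python
import numpy as np

# synthetic pixel intensities: a dim background and a brighter nebula component
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(10, 2, 500),    # background
                         rng.normal(40, 5, 200)])   # nebula

# crude initialization from quantiles of the data
mu = np.percentile(pixels, [25, 75]).astype(float)
sigma = np.full(2, pixels.std())
w = np.array([0.5, 0.5])

for _ in range(50):
    # E-step: responsibilities (normalization constants cancel)
    pdf = np.exp(-0.5 * ((pixels[:, None] - mu) / sigma) ** 2) / sigma
    resp = w * pdf
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update mixture weights, means, and standard deviations
    nk = resp.sum(axis=0)
    w = nk / len(pixels)
    mu = (resp * pixels[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (pixels[:, None] - mu) ** 2).sum(axis=0) / nk)

print(np.round(mu, 1))  # component means approach the generating values
```

    The per-pixel responsibilities from the final E-step give the soft segmentation: a pixel is assigned to whichever component claims the larger responsibility.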

  3. Unfiltered Talk--A Challenge to Categories.

    ERIC Educational Resources Information Center

    McCormick, Kay

    A study investigated how and why code switching and mixing occurs between English and Afrikaans in a region of South Africa. In District Six, non-standard Afrikaans seems to be a mixed code, and it is unclear whether non-standard English is a mixed code. Consequently, it is unclear when codes are being switched or mixed. The analysis looks at…

  4. Paired Comparison Survey Analyses Utilizing Rasch Methodology of the Relative Difficulty and Estimated Work Relative Value Units of CPT® Code 27279.

    PubMed

    Lorio, Morgan; Martinson, Melissa; Ferrara, Lisa

    2016-01-01

    Minimally invasive sacroiliac joint arthrodesis ("MI SIJ fusion") received a Category I CPT® code (27279) effective January 1, 2015 and was assigned a work relative value unit ("RVU") of 9.03. The International Society for the Advancement of Spine Surgery ("ISASS") conducted a study consisting of a Rasch analysis of two separate surveys of surgeons to assess the accuracy of the assigned work RVU. A survey was developed and sent to ninety-three ISASS surgeon committee members. Respondents were asked to compare CPT® 27279 to ten other comparator CPT® codes reflective of common spine surgeries. The survey presented each comparator CPT® code with its code descriptor as well as the description of CPT® 27279 and asked respondents to indicate whether CPT® 27279 was greater, equal, or less in terms of work effort than the comparator code. A second survey was sent to 557 U.S.-based spine surgeon members of ISASS and 241 spine surgeon members of the Society for Minimally Invasive Spine Surgery ("SMISS"). The design of the second survey mirrored that of the first survey except for the use of a broader set of comparator CPT® codes (27 vs. 10). Using the work RVUs of the comparator codes, a Rasch analysis was performed to estimate the relative difficulty of CPT® 27279, after which the work RVU of CPT® 27279 was estimated by regression analysis. Twenty surgeons responded to the first survey and thirty-four surgeons responded to the second survey. The regression analysis of the first survey indicates a work RVU for CPT® 27279 of 14.36, and the regression analysis of the second survey indicates a work RVU of 14.1. The Rasch analysis indicates that the current work RVU of 9.03 assigned to CPT® 27279 is undervalued. Averaging the results of the regression analyses of the two surveys indicates a work RVU for CPT® 27279 of 14.23.
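
    The paired-comparison-to-RVU pipeline can be sketched numerically. The sketch below is a deliberate simplification of a full Rasch fit: it reduces the model to a single logit regression, and the comparator RVUs and response fractions are invented, not the survey data.

```python
import numpy as np

# hypothetical survey summary: for each comparator code, its assigned work
# RVU and the fraction of respondents judging the target code to be MORE work
comparator_rvu = np.array([10.0, 12.5, 14.0, 16.0, 18.5, 20.0])
frac_harder = np.array([0.85, 0.70, 0.55, 0.40, 0.25, 0.15])

# Rasch-style logit: logit(p_j) = theta - b_j, with comparator difficulty
# b_j assumed (for illustration) to scale linearly with its RVU
logits = np.log(frac_harder / (1 - frac_harder))

# regress logit on comparator RVU; logit = 0 marks perceived equal work,
# so the implied RVU of the target code is the zero crossing of the line
slope, intercept = np.polyfit(comparator_rvu, logits, 1)
implied_rvu = -intercept / slope
print(round(implied_rvu, 2))
```

    A full Rasch analysis would estimate each comparator difficulty jointly from the response matrix instead of assuming linearity in RVU, but the zero-crossing logic (equal perceived work at logit zero) is the same.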

  5. Progressive Failure And Life Prediction of Ceramic and Textile Composites

    NASA Technical Reports Server (NTRS)

    Xue, David Y.; Shi, Yucheng; Katikala, Madhu; Johnston, William M., Jr.; Card, Michael F.

    1998-01-01

    An engineering approach to predict the fatigue life and progressive failure of multilayered composite and textile laminates is presented. Analytical models which account for matrix cracking, statistical fiber failures and nonlinear stress-strain behavior have been developed for both composites and textiles. The analysis method is based on a combined micromechanics, fracture mechanics and failure statistics analysis. Experimentally derived empirical coefficients are used to account for the interface of fiber and matrix, fiber strength, and fiber-matrix stiffness reductions. Similar approaches were applied to textiles using Repeating Unit Cells. In composite fatigue analysis, Walker's equation is applied for matrix fatigue cracking and Heywood's formulation is used for fiber strength fatigue degradation. The analysis has been compared with experiment with good agreement. Comparisons were made with graphite-epoxy, C/SiC and Nicalon/CAS composite materials. For textile materials, comparisons were made with triaxial braided and plain weave materials under biaxial or uniaxial tension. Fatigue predictions were compared with test data obtained from plain weave C/SiC materials tested at AS&M. Computer codes were developed to perform the analysis: composite progressive failure analysis for laminates is contained in the code CPFail, and micromechanics analysis for textile composites is contained in the code MicroTex. Both codes were adapted to run as subroutines for the finite element code ABAQUS, as CPFail-ABAQUS and MicroTex-ABAQUS. A graphical user interface (GUI) was developed to connect CPFail and MicroTex with ABAQUS.

  6. Genome-wide screening and identification of long noncoding RNAs and their interaction with protein coding RNAs in bladder urothelial cell carcinoma.

    PubMed

    Wang, Longxin; Fu, Dian; Qiu, Yongbin; Xing, Xiaoxiao; Xu, Feng; Han, Conghui; Xu, Xiaofeng; Wei, Zhifeng; Zhang, Zhengyu; Ge, Jingping; Cheng, Wen; Xie, Hai-Long

    2014-07-10

    To understand lncRNA expression profiles and their potential functions in bladder cancer, we investigated lncRNA and coding RNA expression in human bladder cancer and normal bladder tissues. Bioinformatic analysis revealed thousands of significantly differentially expressed lncRNAs and coding mRNAs in bladder cancer relative to normal bladder tissue. Co-expression analysis revealed that 50% of lncRNAs and coding RNAs were expressed in the same direction. A subset of lncRNAs might be involved in mTOR signaling, p53 signaling, and cancer pathways. Our study provides a large-scale map of co-expression between lncRNAs and coding RNAs in bladder cancer cells and lays a biological basis for further investigation. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Utilization of recently developed codes for high power Brayton and Rankine cycle power systems

    NASA Technical Reports Server (NTRS)

    Doherty, Michael P.

    1993-01-01

    Two recently developed FORTRAN computer codes for high power Brayton and Rankine thermodynamic cycle analysis for space power applications are presented. The codes were written in support of an effort to develop a series of subsystem models for multimegawatt Nuclear Electric Propulsion, but their use is not limited to nuclear heat sources or to electric propulsion. Code development background, a description of the codes, sample input/output from one of the codes, and future plans and implications for the use of these codes by NASA's Lewis Research Center are provided.

  8. PCC Framework for Program-Generators

    NASA Technical Reports Server (NTRS)

    Kong, Soonho; Choi, Wontae; Yi, Kwangkeun

    2009-01-01

    In this paper, we propose a proof-carrying code framework for program-generators. The enabling technique is abstract parsing, a static string analysis technique, which is used as a component for generating and validating certificates. Our framework provides an efficient solution for certifying program-generators whose safety properties are expressed in terms of the grammar representing the generated program. The fixed-point solution of the analysis is generated and attached with the program-generator on the code producer side. The consumer receives the code with a fixed-point solution and validates that the received fixed point is indeed a fixed point of the received code. This validation can be done in a single pass.

  9. Complete analysis of steady and transient missile aerodynamic/propulsive/plume flowfield interactions

    NASA Astrophysics Data System (ADS)

    York, B. J.; Sinha, N.; Dash, S. M.; Hosangadi, A.; Kenzakowski, D. C.; Lee, R. A.

    1992-07-01

    The analysis of steady and transient aerodynamic/propulsive/plume flowfield interactions utilizing several state-of-the-art computer codes (PARCH, CRAFT, and SCHAFT) is discussed. These codes have been extended to include advanced turbulence models, generalized thermochemistry, and multiphase nonequilibrium capabilities. Several specialized versions of these codes have been developed for specific applications. This paper presents a brief overview of these codes, followed by selected cases demonstrating steady and transient analyses of conventional as well as advanced missile systems. Areas requiring upgrades include turbulence modeling in a highly compressible environment and the treatment of particulates in general. Recent progress in these areas is highlighted.

  10. Development of a CFD Code for Analysis of Fluid Dynamic Forces in Seals

    NASA Technical Reports Server (NTRS)

    Athavale, Mahesh M.; Przekwas, Andrzej J.; Singhal, Ashok K.

    1991-01-01

    The aim is to develop a 3-D computational fluid dynamics (CFD) code for the analysis of fluid flow in cylindrical seals and evaluation of the dynamic forces on the seals. This code is expected to serve as a scientific tool for detailed flow analysis as well as a check for the accuracy of the 2D industrial codes. The features necessary in the CFD code are outlined. The initial focus was to develop or modify and implement new techniques and physical models. These include collocated grid formulation, rotating coordinate frames and moving grid formulation. Other advanced numerical techniques include higher order spatial and temporal differencing and an efficient linear equation solver. These techniques were implemented in a 2D flow solver for initial testing. Several benchmark test cases were computed using the 2D code, and the results of these were compared to analytical solutions or experimental data to check the accuracy. Tests presented here include planar wedge flow, flow due to an enclosed rotor, and flow in a 2D seal with a whirling rotor. Comparisons between numerical and experimental results for an annular seal and a 7-cavity labyrinth seal are also included.

  11. CAVE: A computer code for two-dimensional transient heating analysis of conceptual thermal protection systems for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Rathjen, K. A.

    1977-01-01

    A digital computer code CAVE (Conduction Analysis Via Eigenvalues), which finds application in the analysis of two dimensional transient heating of hypersonic vehicles is described. The CAVE is written in FORTRAN 4 and is operational on both IBM 360-67 and CDC 6600 computers. The method of solution is a hybrid analytical numerical technique that is inherently stable permitting large time steps even with the best of conductors having the finest of mesh size. The aerodynamic heating boundary conditions are calculated by the code based on the input flight trajectory or can optionally be calculated external to the code and then entered as input data. The code computes the network conduction and convection links, as well as capacitance values, given basic geometrical and mesh sizes, for four generations (leading edges, cooled panels, X-24C structure and slabs). Input and output formats are presented and explained. Sample problems are included. A brief summary of the hybrid analytical-numerical technique, which utilizes eigenvalues (thermal frequencies) and eigenvectors (thermal mode vectors) is given along with aerodynamic heating equations that have been incorporated in the code and flow charts.
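
    The hybrid analytical-numerical idea behind CAVE can be illustrated in a few lines: diagonalize the semi-discrete conduction network dT/dt = A T once, then evaluate the exact modal solution at any time, which is stable for arbitrarily large steps even on fine meshes. This is a generic sketch of the eigenvalue method, not CAVE's FORTRAN implementation; the 1D geometry and material values are invented.

```python
import numpy as np

n = 20
dx = 0.01      # node spacing [m]
alpha = 1e-4   # thermal diffusivity [m^2/s]

# 1D conduction matrix: second-difference Laplacian with fixed (zero) ends
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = -2.0
    if i > 0:
        A[i, i - 1] = 1.0
    if i < n - 1:
        A[i, i + 1] = 1.0
A *= alpha / dx**2

# thermal frequencies (eigenvalues) and thermal mode vectors (eigenvectors)
lam, V = np.linalg.eigh(A)
T0 = np.ones(n)  # uniform initial temperature

def T_at(t):
    """Modal solution T(t) = V exp(lam t) V^T T0 -- exact in time."""
    return V @ (np.exp(lam * t) * (V.T @ T0))

print(T_at(10.0).max())  # decays toward the fixed boundaries
```

    Because all eigenvalues of the conduction matrix are negative, every mode decays, and the step size never has to satisfy an explicit-marching stability limit; this is the property the abstract refers to as "inherently stable".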

  12. RELAP-7 Closure Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Berry, R. A.; Martineau, R. C.

    The RELAP-7 code is the next-generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's and TRACE's capabilities and extends their analysis capabilities to all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and to provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.

  13. A Proposal of Monitoring and Forecasting Method for Crustal Activity in and around Japan with 3-dimensional Heterogeneous Medium Using a Large-scale High-fidelity Finite Element Simulation

    NASA Astrophysics Data System (ADS)

    Hori, T.; Agata, R.; Ichimura, T.; Fujita, K.; Yamaguchi, T.; Takahashi, N.

    2017-12-01

    Although we can now obtain continuous, dense surface deformation data on land and partly on the sea floor, the obtained data are not fully utilized for monitoring and forecasting of crustal activity, such as spatio-temporal variation in slip velocity on the plate interface (including earthquakes), seismic wave propagation, and crustal deformation. To construct a system for monitoring and forecasting, it is necessary to develop a physics-based data analysis system including (1) a structural model with the 3D geometry of the plate interface and material properties such as elasticity and viscosity, (2) a calculation code for crustal deformation and seismic wave propagation using (1), and (3) an inverse analysis or data assimilation code for both structure and fault slip using (1) and (2). To accomplish this, it is at least necessary to develop highly reliable large-scale simulation codes to calculate crustal deformation and seismic wave propagation for a 3D heterogeneous structure. An unstructured FE non-linear seismic wave simulation code has been developed, which achieved physics-based urban earthquake simulation at 1.08 T DOF and 6.6 K time steps. A high-fidelity FEM simulation code with a mesh generator has also been developed to calculate crustal deformation in and around Japan with complicated surface topography and subducting plate geometry at 1 km mesh resolution. The crustal deformation code has been further improved and achieved 2.05 T DOF with 45 m resolution on the plate interface. This high-resolution analysis enables computation of the change of stress acting on the plate interface. Further, for inverse analyses, a waveform inversion code for modeling 3D crustal structure has been developed, and the high-fidelity FEM code has been extended with an adjoint method for estimating fault slip and asthenosphere viscosity. Hence, we have large-scale simulation and analysis tools for monitoring.
We are developing methods for forecasting the slip velocity variation on the plate interface. Although the prototype is for an elastic half-space model, we are applying it to a 3D heterogeneous structure with the high-fidelity FE model. Furthermore, the large-scale simulation codes for monitoring are being implemented on GPU clusters, and analysis tools are being developed to include other functions such as examination of model errors.

  14. Concurrent electromagnetic scattering analysis

    NASA Technical Reports Server (NTRS)

    Patterson, Jean E.; Cwik, Tom; Ferraro, Robert D.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Parker, Jay

    1989-01-01

    The computational power of the hypercube parallel computing architecture is applied to the solution of large-scale electromagnetic scattering and radiation problems. Three analysis codes have been implemented. A Hypercube Electromagnetic Interactive Analysis Workstation was developed to aid in the design and analysis of metallic structures such as antennas and to facilitate the use of these analysis codes. The workstation provides a general user environment for specification of the structure to be analyzed and graphical representations of the results.

  15. An Object Oriented Analysis Method for Ada and Embedded Systems

    DTIC Science & Technology

    1989-12-01

    expansion of the paradigm from the coding and design activities into the earlier activity of requirements analysis. This chapter begins by discussing the application of...response time: 0.1 seconds. Step 1e: Identify Known Restrictions on the Software. The cruise control system object code must fit within 16K of memory...application of object-oriented techniques to the coding and design phases of the life cycle, as well as various approaches to requirements analysis.

  16. Dynamic Forces in Spur Gears - Measurement, Prediction, and Code Validation

    NASA Technical Reports Server (NTRS)

    Oswald, Fred B.; Townsend, Dennis P.; Rebbechi, Brian; Lin, Hsiang Hsi

    1996-01-01

    Measured and computed values for dynamic loads in spur gears were compared to validate a new version of the NASA gear dynamics code DANST-PC. Strain gage data from six gear sets with different tooth profiles were processed to determine the dynamic forces acting between the gear teeth. Results demonstrate that the analysis code successfully simulates the dynamic behavior of the gears. Differences between analysis and experiment were less than 10 percent under most conditions.

  17. Developing and Implementing the Data Mining Algorithms in RAVEN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea

    The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. To post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation, while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e., recognizing patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.

  18. Peptide code-on-a-microplate for protease activity analysis via MALDI-TOF mass spectrometric quantitation.

    PubMed

    Hu, Junjie; Liu, Fei; Ju, Huangxian

    2015-04-21

    A peptide-encoded microplate was proposed for MALDI-TOF mass spectrometric (MS) analysis of protease activity. The peptide codes were designed to contain a coding region and the substrate of protease for enzymatic cleavage, respectively, and an internal standard method was proposed for the MS quantitation of the cleavage products of these peptide codes. Upon the cleavage reaction in the presence of target proteases, the coding regions were released from the microplate, which were directly quantitated by using corresponding peptides with one-amino acid difference as the internal standards. The coding region could be used as the unique "Protease ID" for the identification of corresponding protease, and the amount of the cleavage product was used for protease activity analysis. Using trypsin and chymotrypsin as the model proteases to verify the multiplex protease assay, the designed "Trypsin ID" and "Chymotrypsin ID" occurred at m/z 761.6 and 711.6. The logarithm value of the intensity ratio of "Protease ID" to internal standard was proportional to trypsin and chymotrypsin concentration in a range from 5.0 to 500 and 10 to 500 nM, respectively. The detection limits for trypsin and chymotrypsin were 2.3 and 5.2 nM, respectively. The peptide-encoded microplate showed good selectivity. This proposed method provided a powerful tool for convenient identification and activity analysis of multiplex proteases.
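
    The internal-standard quantitation described above reduces to a linear calibration, since the log intensity ratio of the "Protease ID" peak to its internal standard is stated to be proportional to protease concentration. The sketch below uses invented intensity ratios and concentrations, not the paper's data.

```python
import numpy as np

# calibration standards: trypsin concentration [nM] vs. log intensity ratio
# of the "Trypsin ID" peak to its internal standard (values invented)
conc_nM = np.array([5.0, 25.0, 100.0, 250.0, 500.0])
log_ratio = np.array([-0.49, -0.45, -0.30, 0.00, 0.50])

# linear fit of log(ratio) against concentration
slope, intercept = np.polyfit(conc_nM, log_ratio, 1)

def conc_from_ratio(lr):
    """Invert the calibration line to estimate protease concentration [nM]."""
    return (lr - intercept) / slope

print(round(conc_from_ratio(0.00), 1))  # -> 250.0
```

    In a multiplexed assay, one such calibration line would be maintained per "Protease ID" peak (e.g., m/z 761.6 for trypsin, 711.6 for chymotrypsin), each against its own one-amino-acid-shifted internal standard.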

  19. RY-Coding and Non-Homogeneous Models Can Ameliorate the Maximum-Likelihood Inferences From Nucleotide Sequence Data with Parallel Compositional Heterogeneity.

    PubMed

    Ishikawa, Sohta A; Inagaki, Yuji; Hashimoto, Tetsuo

    2012-01-01

    In phylogenetic analyses of nucleotide sequences, 'homogeneous' substitution models, which assume the stationarity of base composition across a tree, are widely used, although individual sequences may bear distinctive base frequencies. In the worst-case scenario, a homogeneous model-based analysis can yield an artifactual union of two distantly related sequences that achieved similar base frequencies in parallel. This potential difficulty can be countered by two approaches, 'RY-coding' and 'non-homogeneous' models. The former approach converts the four bases into purine and pyrimidine states to normalize base frequencies across a tree, while the latter explicitly incorporates the heterogeneity in base frequency. The two approaches have been applied to real-world sequence data; however, their basic properties have not been fully examined in simulation studies. Here, we assessed the performance of maximum-likelihood analyses incorporating RY-coding and a non-homogeneous model (RY-coding and non-homogeneous analyses) on simulated data with parallel convergence to similar base composition. Both RY-coding and non-homogeneous analyses showed superior performance compared with homogeneous model-based analyses. Curiously, the performance of the RY-coding analysis appeared to be significantly affected by the setting of the substitution process used for sequence simulation, relative to that of the non-homogeneous analysis. The performance of the non-homogeneous analysis was also validated by analyzing a real-world sequence data set with significant base heterogeneity.
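    The RY-coding step itself is a simple recoding of the four bases into two-state purine/pyrimidine characters, which can be sketched as:

```python
# Purines (A, G) map to R; pyrimidines (C, T/U) map to Y.
RY_MAP = {"A": "R", "G": "R", "C": "Y", "T": "Y", "U": "Y"}

def ry_encode(seq):
    """Recode a nucleotide sequence as purine (R) / pyrimidine (Y) states."""
    return "".join(RY_MAP[base] for base in seq.upper())
```

    After recoding, any compositional difference between purine-rich and pyrimidine-rich sequences collapses into the single R/Y frequency, which is what normalizes base composition across the tree.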

  20. Code Pulse: Software Assurance (SWA) Visual Analytics for Dynamic Analysis of Code

    DTIC Science & Technology

    2014-09-01

    4.5.1 Market Analysis ... competitive market analysis to assess the tool's potential. The final transition targets were selected and expressed along with our research on the topic ... public release milestones. Details of our testing methodology are in our Software Test Plan deliverable, CP-STP-0001. A summary of this approach is ...

  1. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging.
The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most of existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. The accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that the users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements. Based on the gap analysis results, we have made the following recommendations for the code selection and code development for the NEAMS waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and build a modular code architecture and key code modules for performance assessments. 
The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.

  2. Automatic generation of user material subroutines for biomechanical growth analysis.

    PubMed

    Young, Jonathan M; Yao, Jiang; Ramasubramanian, Ashok; Taber, Larry A; Perucchio, Renato

    2010-10-01

    The analysis of the biomechanics of growth and remodeling in soft tissues requires the formulation of specialized pseudoelastic constitutive relations. The nonlinear finite element analysis package ABAQUS allows the user to implement such specialized material responses through the coding of a user material subroutine called UMAT. However, hand coding UMAT subroutines is a challenge even for simple pseudoelastic materials and requires substantial time to debug and test the code. To resolve this issue, we develop an automatic UMAT code generation procedure for pseudoelastic materials using the symbolic mathematics package MATHEMATICA and extend the UMAT generator to include continuum growth. The performance of the automatically coded UMAT is tested by simulating the stress-stretch response of a material defined by a Fung-orthotropic strain energy function, subject to uniaxial stretching, equibiaxial stretching, and simple shear in ABAQUS. The MATHEMATICA UMAT generator is then extended to include continuum growth by adding a growth subroutine to the automatically generated UMAT. The MATHEMATICA UMAT generator correctly derives the variables required in the UMAT code, quickly providing a ready-to-use UMAT. In turn, the UMAT accurately simulates the pseudoelastic response. In order to test the growth UMAT, we simulate the growth-based bending of a bilayered bar with differing fiber directions in a nongrowing passive layer. The anisotropic passive layer, being topologically tied to the growing isotropic layer, causes the bending bar to twist laterally. The results of simulations demonstrate the validity of the automatically coded UMAT, used in both standardized tests of hyperelastic materials and for a biomechanical growth analysis.
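    The core of the UMAT generator is deriving the stress response automatically from a strain-energy function. As a rough numerical stand-in for the symbolic MATHEMATICA derivation, the sketch below differentiates an arbitrary 1D energy function by central differences; the incompressible uniaxial setting, the neo-Hookean energy, and all names are assumptions for illustration:

```python
def make_stress_function(energy, h=1e-6):
    """Automatically derive a 1D stress-stretch response from a strain-energy
    function W(stretch) via central-difference differentiation. This mimics,
    numerically, what a symbolic UMAT generator derives in closed form."""
    def stress(stretch):
        # Cauchy stress for incompressible uniaxial loading: sigma = lam * dW/dlam
        dW = (energy(stretch + h) - energy(stretch - h)) / (2.0 * h)
        return stretch * dW
    return stress

def neo_hookean(lam, mu=1.0):
    """Simple incompressible neo-Hookean energy for uniaxial stretch lam."""
    return 0.5 * mu * (lam ** 2 + 2.0 / lam - 3.0)

sigma = make_stress_function(neo_hookean)
```

    For this energy the analytic result is sigma = mu * (lam^2 - 1/lam), so the generated function can be checked against a closed form, which is exactly the kind of standardized test the paper applies to its generated UMATs.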

  3. NOAA/DOE CWP structural analysis package. [CWPFLY, CWPEXT, COTEC, and XOTEC codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pompa, J.A.; Lunz, D.F.

    1979-09-01

    The theoretical development and computer code user's manual for analysis of the Ocean Thermal Energy Conversion (OTEC) plant cold water pipe (CWP) are presented. The analysis of the CWP includes coupled platform/CWP loadings and dynamic responses. This report, with the exception of the Introduction and Appendix F, was originally published as Hydronautics, Inc., Technical Report No. 7825-2 (by Barr, Chang, and Thasanatorn) in November 1978. A detailed theoretical development of the equations describing the coupled platform/CWP system and preliminary validation efforts are described. The appendices encompass a complete user's manual, describing the inputs, outputs, and operation of the four component programs, and detail the changes and updates implemented since the original release of the code by Hydronautics. The code itself is available through NOAA's Office of Ocean Technology and Engineering Services.

  4. Free wake analysis of hover performance using a new influence coefficient method

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.; Ong, Ching Cho

    1990-01-01

    A new approach to the prediction of helicopter rotor performance using a free wake analysis was developed. This new method uses a relaxation process that does not suffer from the convergence problems associated with previous time marching simulations. This wake relaxation procedure was coupled to a vortex-lattice, lifting surface loads analysis to produce a novel, self-contained performance prediction code: EHPIC (Evaluation of Helicopter Performance using Influence Coefficients). The major technical features of the EHPIC code are described and a substantial amount of background information on the capabilities and proper operation of the code is supplied. Sample problems were undertaken to demonstrate the robustness and flexibility of the basic approach. Also, a performance correlation study was carried out to establish the breadth of applicability of the code, with very favorable results.

  5. [Convergent origin of repeats in genes coding for globular proteins. An analysis of the factors determining the presence of inverted and symmetrical repeats].

    PubMed

    Solov'ev, V V; Kel', A E; Kolchanov, N A

    1989-01-01

    The factors determining the presence of inverted and symmetrical repeats in genes coding for globular proteins have been analysed. The analysis of symmetrical repeats revealed an interesting property of the genetic code: pairs of symmetrical codons correspond to pairs of amino acids with largely similar physicochemical parameters. This property may explain why symmetrical repeats and palindromes occur only in genes coding for beta-structural proteins, polypeptides in which amino acids with similar physicochemical properties occupy symmetrical positions. A stochastic model of the evolution of polynucleotide sequences was used for the analysis of inverted repeats. The modelling demonstrated that compositional constraints on sequences (uneven codon usage frequencies) alone are enough for nonrandom inverted repeats to arise in genes.
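    As an illustration of the kind of quantity such a stochastic model evaluates, here is a minimal sketch that counts inverted repeats (pairs of k-mers related by reverse complementation) in a sequence; the counting convention is an assumption for illustration, not the authors' exact measure:

```python
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    return "".join(COMPLEMENT[b] for b in reversed(seq))

def count_inverted_repeats(seq, k):
    """Count ordered pairs of k-mers (i < j) where the later k-mer is the
    reverse complement of the earlier one, a simple inverted-repeat measure."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    count = 0
    for i, a in enumerate(kmers):
        rc = revcomp(a)
        for b in kmers[i + 1:]:
            if b == rc:
                count += 1
    return count
```

    Comparing such counts between sequences generated with uniform versus skewed codon frequencies is the kind of comparison the stochastic model in the abstract makes.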

  6. User's manual for the Heat Pipe Space Radiator design and analysis Code (HEPSPARC)

    NASA Technical Reports Server (NTRS)

    Hainley, Donald C.

    1991-01-01

    A heat pipe space radiator code (HEPSPARC) was written for the NASA Lewis Research Center and is used for the design and analysis of a radiator constructed from a pumped fluid loop that transfers heat to the evaporator section of heat pipes. This manual is designed to familiarize the user with this new code and to serve as a reference for its use. This manual documents the completed work and is intended to be the first step towards verification of the HEPSPARC code. Details are furnished to provide a description of all the requirements and variables used in the design and analysis of a combined pumped loop/heat pipe radiator system. A description of the subroutines used in the program is furnished for those interested in understanding its detailed workings.

  7. Experiences and wisdom behind the numbers: qualitative analysis of the National Action Alliance for Suicide Prevention's Research Prioritization Task Force stakeholder survey.

    PubMed

    Booth, Chelsea L

    2014-09-01

    The Research Prioritization Task Force of the National Action Alliance for Suicide Prevention conducted a stakeholder survey including 716 respondents from 49 U.S. states and 18 foreign countries. To conduct a qualitative analysis on responses from individuals representing four main stakeholder groups: attempt and loss survivors, researchers, providers, and policy/administrators. This article focuses on a qualitative analysis of the early-round, open-ended responses collected in a modified online Delphi process, and, as an illustration of the research method, focuses on analysis of respondents' views of the role of life and emotional skills in suicide prevention. Content analysis was performed using both inductive and deductive code and category development and systematic qualitative methods. After the inductive coding was completed, the same data set was re-coded using the 12 Aspirational Goals (AGs) identified by the Delphi process. Codes and thematic categories produced from the inductive coding process were, in some cases, very similar or identical to the 12 AGs (i.e., those dealing with risk and protective factors, provider training, preventing reattempts, and stigma). Other codes highlighted areas that were not identified as important in the Delphi process (e.g., cultural/social factors of suicide, substance use). Qualitative and mixed-methods research are essential to the future of suicide prevention work. By design, qualitative research is explorative and appropriate for complex, culturally embedded social issues such as suicide. Such research can be used to generate hypotheses for testing and, as in this analysis, illuminate areas that would be missed in an approach that imposed predetermined categories on data. Published by Elsevier Inc.

  8. Maximum likelihood decoding analysis of Accumulate-Repeat-Accumulate Codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    Repeat-Accumulate (RA) codes are the simplest turbo-like codes that achieve good performance. However, they cannot compete with turbo codes or low-density parity-check (LDPC) codes as far as performance is concerned. The Accumulate-Repeat-Accumulate (ARA) codes, a subclass of LDPC codes, are obtained by adding a precoder in front of RA codes with puncturing, where an accumulator is chosen as the precoder. These codes not only are very simple, but also achieve excellent performance with iterative decoding. In this paper, the performance of these codes under maximum likelihood (ML) decoding is analyzed and compared to random codes by means of very tight bounds. The weight distribution of some simple ARA codes is obtained, and using the existing tightest bounds we show that the ML SNR threshold of ARA codes approaches the performance of random codes very closely. We also show that the use of the precoder improves the SNR threshold, while the interleaving gain remains unchanged with respect to the punctured RA code.
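    The kind of bound used in such an analysis can be illustrated with the standard union bound on ML word-error probability over an AWGN channel, P_e <= sum_d A_d * Q(sqrt(2 d R Eb/N0)), evaluated from a code's weight distribution. The weight distribution below is invented for illustration and is not that of an actual ARA code:

```python
import math

def q_function(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def ml_union_bound(weight_dist, rate, ebno_db):
    """Union bound on ML word-error probability for a linear code on an
    AWGN channel: P_e <= sum over d of A_d * Q(sqrt(2 * d * R * Eb/N0))."""
    ebno = 10.0 ** (ebno_db / 10.0)
    return sum(a_d * q_function(math.sqrt(2.0 * d * rate * ebno))
               for d, a_d in weight_dist.items())

# Hypothetical weight distribution {distance: multiplicity} of a toy code.
wd = {6: 10, 8: 45, 10: 120}
```

    Sweeping Eb/N0 and finding where the bound drops sharply is one crude way to read off an SNR threshold, which the paper does with much tighter bounds than the plain union bound.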

  9. FPCAS3D User's guide: A three dimensional full potential aeroelastic program, version 1

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.

    1995-01-01

    The FPCAS3D computer code has been developed for aeroelastic stability analysis of bladed disks such as those in fans, compressors, turbines, propellers, or propfans. The aerodynamic analysis used in this code is based on the unsteady three-dimensional full potential equation which is solved for a blade row. The structural analysis is based on a finite-element model for each blade. Detailed explanations of the aerodynamic analysis, the numerical algorithms, and the aeroelastic analysis are not given in this report. This guide can be used to assist in the preparation of the input data required by the FPCAS3D code. A complete description of the input data is provided in this report. In addition, six examples, including inputs and outputs, are provided.

  10. FPCAS2D user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.

    1994-01-01

    The FPCAS2D computer code has been developed for aeroelastic stability analysis of bladed disks such as those in fans, compressors, turbines, propellers, or propfans. The aerodynamic analysis used in this code is based on the unsteady two-dimensional full potential equation which is solved for a cascade of blades. The structural analysis is based on a two degree-of-freedom rigid typical section model for each blade. Detailed explanations of the aerodynamic analysis, the numerical algorithms, and the aeroelastic analysis are not given in this report. This guide can be used to assist in the preparation of the input data required by the FPCAS2D code. A complete description of the input data is provided in this report. In addition, four test cases, including inputs and outputs, are provided.

  11. TRANSURANUS: a fuel rod analysis code ready for use

    NASA Astrophysics Data System (ADS)

    Lassmann, K.

    1992-06-01

    TRANSURANUS is a computer program for the thermal and mechanical analysis of fuel rods in nuclear reactors and was developed at the European Institute for Transuranium Elements (TUI). The TRANSURANUS code consists of a clearly defined mechanical-mathematical framework into which physical models can easily be incorporated. Besides its flexibility for different fuel rod designs, the TRANSURANUS code can deal with very different situations, as given for instance in an experiment, under normal, off-normal and accident conditions. The time scale of the problems to be treated may range from milliseconds to years. The code has a comprehensive material data bank for oxide, mixed oxide, carbide and nitride fuels, Zircaloy and steel claddings and different coolants. During its development great effort was spent on obtaining an extremely flexible tool which is easy to handle and exhibits very fast running times. The total development effort is approximately 40 man-years. In recent years interest in using this code has grown, and it is now in use in several organisations, in both research and private industry. The code is available to all interested parties. The paper outlines the main features and capabilities of the TRANSURANUS code and its validation, and also treats some practical aspects.

  12. New quantum codes derived from a family of antiprimitive BCH codes

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin

    The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication systems and quantum information theory. In this paper, we study the construction of quantum codes from a family of q^2-ary BCH codes with length n = q^(2m) + 1 (also called antiprimitive BCH codes in the literature), where q >= 4 is a power of 2 and m >= 2. By a detailed analysis of some useful properties of q^2-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q^2-ary primitive BCH codes. Consequently, via the Hermitian construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.
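    The cyclotomic-coset computation at the heart of such dual-containing analyses is easy to sketch; the parameter choice below (q = 4, m = 2, so the q^2-ary cosets are taken modulo n = q^(2m) + 1 = 257) is one instance of the paper's setting:

```python
def cyclotomic_cosets(q, n):
    """q-ary cyclotomic cosets modulo n: orbits of each residue s under
    repeated multiplication by q (assumes gcd(q, n) = 1)."""
    seen = set()
    cosets = []
    for s in range(n):
        if s in seen:
            continue
        coset, x = [], s
        while x not in seen:
            coset.append(x)
            seen.add(x)
            x = (x * q) % n
        cosets.append(sorted(coset))
    return cosets

# q = 4, m = 2: q^2 = 16, n = 4^4 + 1 = 257.
cosets = cyclotomic_cosets(16, 257)
```

    Checking which cosets a designed set of consecutive roots touches is then what establishes (or refutes) the Hermitian dual-containing condition for a given designed distance.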

  13. An Analysis of the Changes in Communication Techniques in the Italian Codes of Medical Deontology.

    PubMed

    Conti, Andrea Alberto

    2017-04-28

    The code of deontology of the Italian National Federation of the Colleges of Physicians, Surgeons and Dentists (FNOMCeO) contains the principles and rules to which the professional medical practitioner must adhere. This work identifies and analyzes the medical-linguistic choices and the expressive techniques present in the different editions of the code, and evaluates their purpose and function, focusing on the first appearance and the subsequent frequency of key terms. Various aspects of the formal and expressive revisions of the eight editions of the Codes of Medical Deontology published after the Second World War (from 1947/48 to 2014) are here presented, starting from a brief comparison with the first edition of 1903. Formal characteristics, choices of medical terminology and the introduction of new concepts and communicative attitudes are here identified and evaluated. This paper, in presenting a quantitative and epistemological analysis of variations, modifications and confirmations in the different editions of the Italian code of medical deontology over the last century, enucleates and demonstrates the dynamic paradigm of changing attitudes in the medical profession. This analysis shows the evolution in medical-scientific communication as embodied in the Italian code of medical deontology. This code, in its adoption, changes and adaptations, as evidenced in its successive editions, bears witness to the expressions and attitudes pertinent to and characteristic of the deontological stance of the medical profession during the twentieth century.

  14. Empirical Analysis of Using Erasure Coding in Outsourcing Data Storage With Provable Security

    DTIC Science & Technology

    2016-06-01

    ... the fastest encoding performance among the four tested schemes. We expected to observe that Cauchy Reed-Solomon would be faster than Reed-Solomon for all ... providing recoverability for POR. We survey MDS codes and select the Reed-Solomon and Cauchy Reed-Solomon MDS codes to be implemented into a prototype POR tool providing recoverability for POR.
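    The recoverability property that MDS codes provide can be illustrated with the simplest possible erasure code: a single XOR parity block, from which any one erased block can be rebuilt (Reed-Solomon generalizes this to tolerate many erasures). This sketch is purely illustrative and unrelated to the report's prototype:

```python
def encode_with_parity(blocks):
    """Append one XOR parity block so any single erased block is recoverable.
    (A single-parity code; Reed-Solomon generalizes this to many erasures.)"""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            parity[i] ^= byte
    return list(blocks) + [bytes(parity)]

def recover(stored, missing_index):
    """Rebuild the erased block as the XOR of all surviving blocks."""
    size = len(next(b for b in stored if b is not None))
    out = bytearray(size)
    for j, block in enumerate(stored):
        if j == missing_index:
            continue
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

data = [b"abcd", b"efgh", b"ijkl"]
coded = encode_with_parity(data)
```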

  15. Coupled Analysis of an Inlet and Fan for a Quiet Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Conners, Timothy R.; Wayman, Thomas R.

    2009-01-01

    A computational analysis of a Gulfstream isentropic external compression supersonic inlet coupled to a Rolls-Royce fan was completed. The inlet was designed for a small, low sonic boom supersonic vehicle with a design cruise condition of M = 1.6 at 45,000 feet. The inlet design included an annular bypass duct that routed flow subsonically around an engine-mounted gearbox and diverted flow with high shock losses away from the fan tip. Two Reynolds-averaged Navier-Stokes codes were used for the analysis: an axisymmetric code called AVCS for the inlet and a 3-D code called SWIFT for the fan. The codes were coupled at a mixing plane boundary using a separate code for data exchange. The codes were used to determine the performance of the inlet/fan system at the design point and to predict the performance and operability of the system over the flight profile. At the design point the core inlet had a recovery of 96 percent, and the fan operated near its peak efficiency and pressure ratio. A large hub radial distortion generated in the inlet was not eliminated by the fan and could pose a challenge for subsequent booster stages. The system operated stably at all points along the flight profile. Reduced stall margin was seen at low altitude and Mach number where flow separated on the interior lips of the cowl and bypass ducts. The coupled analysis gave consistent solutions at all points on the flight profile that would be difficult or impossible to predict by analysis of isolated components.
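    The mixing-plane exchange described above passes circumferentially averaged quantities between the two codes at the interface. A minimal sketch of a flux-weighted average; the sample values and weighting choice are invented for illustration:

```python
def mixing_plane_average(values, weights):
    """Flux-weighted circumferential average of a flow quantity at the
    interface; the averaged value is what the neighboring code receives."""
    total_w = sum(weights)
    return sum(v * w for v, w in zip(values, weights)) / total_w

# Hypothetical circumferential samples of total pressure at the interface,
# weighted by local mass flux.
p_total = [101.0, 103.5, 99.8, 102.2]
mass_flux = [1.0, 1.2, 0.9, 1.1]
p_avg = mixing_plane_average(p_total, mass_flux)
```

    Averaging discards circumferential nonuniformity by construction, which is why the hub radial distortion noted in the abstract survives the interface while blade-to-blade variations do not.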

  16. Coupled Analysis of an Inlet and Fan for a Quiet Supersonic Jet

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Conners, Timothy R.; Wayman, Thomas R.

    2010-01-01

    A computational analysis of a Gulfstream isentropic external compression supersonic inlet coupled to a Rolls-Royce fan has been completed. The inlet was designed for a small, low sonic boom supersonic vehicle with a design cruise condition of M = 1.6 at 45,000 ft. The inlet design included an annular bypass duct that routed flow subsonically around an engine-mounted gearbox and diverted flow with high shock losses away from the fan tip. Two Reynolds-averaged Navier-Stokes codes were used for the analysis: an axisymmetric code called AVCS for the inlet and a three dimensional (3-D) code called SWIFT for the fan. The codes were coupled at a mixing plane boundary using a separate code for data exchange. The codes were used to determine the performance of the inlet/fan system at the design point and to predict the performance and operability of the system over the flight profile. At the design point the core inlet had a recovery of 96 percent, and the fan operated near its peak efficiency and pressure ratio. A large hub radial distortion generated in the inlet was not eliminated by the fan and could pose a challenge for subsequent booster stages. The system operated stably at all points along the flight profile. Reduced stall margin was seen at low altitude and Mach number where flow separated on the interior lips of the cowl and bypass ducts. The coupled analysis gave consistent solutions at all points on the flight profile that would be difficult or impossible to predict by analysis of isolated components.

  17. Glenn-HT: The NASA Glenn Research Center General Multi-Block Navier-Stokes Heat Transfer Code

    NASA Technical Reports Server (NTRS)

    Gaugler, Raymond E.; Lee, Chi-Miag (Technical Monitor)

    2001-01-01

    For the last several years, Glenn-HT, a three-dimensional (3D) Computational Fluid Dynamics (CFD) computer code for the analysis of gas turbine flow and convective heat transfer has been evolving at the NASA Glenn Research Center. The code is unique in the ability to give a highly detailed representation of the flow field very close to solid surfaces in order to get accurate representation of fluid heat transfer and viscous shear stresses. The code has been validated and used extensively for both internal cooling passage flow and for hot gas path flows, including detailed film cooling calculations and complex tip clearance gap flow and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool, but it can be useful for detailed design analysis. In this paper, the code is described and examples of its validation and use for complex flow calculations are presented, emphasizing the applicability to turbomachinery for space launch vehicle propulsion systems.

  18. Glenn-HT: The NASA Glenn Research Center General Multi-Block Navier-Stokes Heat Transfer Code

    NASA Technical Reports Server (NTRS)

    Gaugler, Raymond E.

    2002-01-01

    For the last several years, Glenn-HT, a three-dimensional (3D) Computational Fluid Dynamics (CFD) computer code for the analysis of gas turbine flow and convective heat transfer has been evolving at the NASA Glenn Research Center. The code is unique in the ability to give a highly detailed representation of the flow field very close to solid surfaces in order to get accurate representation of fluid heat transfer and viscous shear stresses. The code has been validated and used extensively for both internal cooling passage flow and for hot gas path flows, including detailed film cooling calculations and complex tip clearance gap flow and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool, but it can be useful for detailed design analysis. In this presentation, the code is described and examples of its validation and use for complex flow calculations are presented, emphasizing the applicability to turbomachinery.

  19. Glenn-HT: The NASA Glenn Research Center General Multi-Block Navier Stokes Heat Transfer Code

    NASA Technical Reports Server (NTRS)

    Gaugler, Raymond E.

    2002-01-01

    For the last several years, Glenn-HT, a three-dimensional (3D) Computational Fluid Dynamics (CFD) computer code for the analysis of gas turbine flow and convective heat transfer has been evolving at the NASA Glenn Research Center. The code is unique in the ability to give a highly detailed representation of the flow field very close to solid surfaces in order to get accurate representation of fluid heat transfer and viscous shear stresses. The code has been validated and used extensively for both internal cooling passage flow and for hot gas path flows, including detailed film cooling calculations and complex tip clearance gap flow and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool, but it can be useful for detailed design analysis. In this presentation, the code is described and examples of its validation and use for complex flow calculations are presented, emphasizing the applicability to turbomachinery.

  20. Codes and morals: is there a missing link? (The Nuremberg Code revisited).

    PubMed

    Hick, C

    1998-01-01

    Codes are a well known and popular but weak form of ethical regulation in medical practice. There is, however, a lack of research on the relations between moral judgments and ethical Codes, or on the possibility of morally justifying these Codes. Our analysis begins by showing, given the Nuremberg Code, how a typical reference to natural law has historically served as moral justification. We then indicate, following the analyses of H. T. Engelhardt, Jr., and A. MacIntyre, why such general moral justifications of codes must necessarily fail in a society of "moral strangers." Going beyond Engelhardt we argue, that after the genealogical suspicion in morals raised by Nietzsche, not even Engelhardt's "principle of permission" can be rationally justified in a strong sense--a problem of transcendental argumentation in morals already realized by I. Kant. Therefore, we propose to abandon the project of providing general justifications for moral judgements and to replace it with a hermeneutical analysis of ethical meanings in real-world situations, starting with the archetypal ethical situation, the encounter with the Other (E. Levinas).

  1. Statistical Analysis of the AIAA Drag Prediction Workshop CFD Solutions

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Hemsch, Michael J.

    2007-01-01

    The first AIAA Drag Prediction Workshop (DPW), held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third AIAA Drag Prediction Workshop, held in June 2006, focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This report compares the transonic cruise prediction results of the second and third workshops using statistical analysis.
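    The code-to-code scatter discussed above is essentially a dispersion statistic over the N-version results. A minimal sketch with hypothetical drag values (the numbers are invented, not workshop data):

```python
import statistics

def code_to_code_scatter(drag_counts):
    """Summarize N-version results: the median and the sample standard
    deviation, a simple measure of code-to-code scatter."""
    return statistics.median(drag_counts), statistics.stdev(drag_counts)

# Hypothetical drag coefficients (in drag counts) reported by different codes.
cd = [285.0, 291.0, 288.0, 305.0, 282.0, 290.0]
med, scatter = code_to_code_scatter(cd)
```

    The workshops' statistical analysis is considerably more careful (outlier handling, medians per case, grid-family effects), but reducing each case to a center and a spread is the underlying idea.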

  2. Progressive fracture of fiber composites

    NASA Technical Reports Server (NTRS)

    Irvin, T. B.; Ginty, C. A.

    1983-01-01

    Refined models and procedures are described for determining progressive composite fracture in graphite/epoxy angleplied laminates. Lewis Research Center capabilities are utilized including the Real Time Ultrasonic C Scan (RUSCAN) experimental facility and the Composite Durability Structural Analysis (CODSTRAN) computer code. The CODSTRAN computer code is used to predict the fracture progression based on composite mechanics, finite element stress analysis, and fracture criteria modules. The RUSCAN facility, CODSTRAN computer code, and scanning electron microscope are used to determine durability and identify failure mechanisms in graphite/epoxy composites.

  3. Methodology, status and plans for development and assessment of HEXTRAN, TRAB and APROS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vanttola, T.; Rajamaeki, M.; Tiihonen, O.

    1997-07-01

    A number of transient and accident analysis codes have been developed in Finland during the past twenty years, mainly for the needs of the country's own power plants, but some of the codes have also been utilized elsewhere. Continuous validation, simultaneous development, and experience gained in commercial applications have considerably improved the performance and range of application of the codes. At present, the methods allow fairly comprehensive accident analysis of the Finnish nuclear power plants.

  4. Burner liner thermal-structural load modeling

    NASA Technical Reports Server (NTRS)

    Maffeo, R.

    1986-01-01

    The software package Transfer Analysis Code to Interface Thermal/Structural Problems (TRANCITS) was developed. The TRANCITS code is used to interface temperature data between thermal and structural analytical models. The use of this transfer module allows the heat transfer analyst to select the thermal mesh density and thermal analysis code best suited to solve the thermal problem and gives the same freedoms to the stress analyst, without the efficiency penalties associated with common meshes and the accuracy penalties associated with the manual transfer of thermal data.

  5. 3D measurement using combined Gray code and dual-frequency phase-shifting approach

    NASA Astrophysics Data System (ADS)

    Yu, Shuang; Zhang, Jing; Yu, Xiaoyang; Sun, Xiaoming; Wu, Haibin; Liu, Xin

    2018-04-01

    The combined Gray code and phase-shifting approach is a commonly used 3D measurement technique. In this technique, an error that equals integer multiples of the phase-shifted fringe period, i.e. period jump error, often exists in the absolute analog code, which can lead to gross measurement errors. To overcome this problem, the present paper proposes 3D measurement using a combined Gray code and dual-frequency phase-shifting approach. Based on 3D measurement using the combined Gray code and phase-shifting approach, one set of low-frequency phase-shifted fringe patterns with an odd-numbered multiple of the original phase-shifted fringe period is added. Thus, the absolute analog code measured value can be obtained by the combined Gray code and phase-shifting approach, and the low-frequency absolute analog code measured value can also be obtained by adding low-frequency phase-shifted fringe patterns. Then, the corrected absolute analog code measured value can be obtained by correcting the former by the latter, and the period jump errors can be eliminated, resulting in reliable analog code unwrapping. For the proposed approach, we established its measurement model, analyzed its measurement principle, expounded the mechanism of eliminating period jump errors by error analysis, and determined its applicable conditions. Theoretical analysis and experimental results show that the proposed approach can effectively eliminate period jump errors, reliably perform analog code unwrapping, and improve the measurement accuracy.
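
    The period-jump correction described above can be sketched in a few lines. This is a hedged illustration, not the authors' implementation: a coarse, jump-free low-frequency absolute phase is used to recompute the fringe order of the high-frequency wrapped phase. The function name and the toy numbers are assumptions for illustration.

```python
import numpy as np

def correct_period_jumps(phi_wrapped, a_low, period):
    """Recompute the fringe order of a wrapped high-frequency phase from
    a coarse, jump-free low-frequency absolute phase estimate."""
    k = np.round((a_low - phi_wrapped) / period)  # recovered fringe order
    return phi_wrapped + k * period

# toy example (numbers are illustrative): true absolute phase of 7.3 periods
period = 2.0 * np.pi
true_abs = 7.3 * period
phi = true_abs % period             # wrapped high-frequency measurement
a_low = true_abs + 0.4 * period     # coarse estimate, error under half a period
corrected = correct_period_jumps(phi, a_low, period)
```

    As long as the coarse estimate's error stays below half of the high-frequency period, the rounding recovers the exact fringe order, which is the mechanism by which the low-frequency measurement eliminates period jump errors.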

  6. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles including both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation well combines cooperation gain and channel coding gain, and outperforms the coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
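
    The girth-4 cancellation described above can be checked with the standard circulant cycle condition for QC-LDPC codes: for an exponent matrix E over circulants of size L, a girth-4 cycle exists iff E[i1][j1] - E[i1][j2] + E[i2][j2] - E[i2][j1] ≡ 0 (mod L) for some pair of rows and pair of columns. The sketch below applies that condition; it is illustrative only and is not the paper's joint construction.

```python
from itertools import combinations

def has_girth4(E, L):
    """Check a QC-LDPC exponent matrix E (entries mod L, -1 = zero block)
    for girth-4 cycles using the circulant cycle condition."""
    rows, cols = len(E), len(E[0])
    for i1, i2 in combinations(range(rows), 2):
        for j1, j2 in combinations(range(cols), 2):
            a, b, c, d = E[i1][j1], E[i1][j2], E[i2][j2], E[i2][j1]
            if -1 in (a, b, c, d):
                continue  # a zero block breaks the candidate cycle
            if (a - b + c - d) % L == 0:
                return True
    return False

E_bad = [[0, 0], [0, 0]]    # all-zero exponents -> girth-4 cycle exists
E_good = [[0, 0], [0, 1]]   # one shifted exponent breaks the cycle (L = 3)
```

    A joint design would apply this check to the equivalent parity-check matrix combining the source and relay codes, so that both type I and type II girth-4 cycles are cancelled.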

  7. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
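
    As a rough illustration of two of the unsupervised reductions named above, the sketch below projects a synthetic feature matrix with PCA and with a simplified variance-based stand-in for sorted-covariance selection. The data and function names are hypothetical, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical "object code feature" matrix: 100 samples x 20 features,
# with one feature carrying an injected high-variance trend
X = rng.normal(size=(100, 20))
X[:, 0] += np.linspace(0.0, 5.0, 100)

def pca_reduce(X, k):
    """Project X onto its top-k principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def variance_select(X, k):
    """Simplified stand-in for sorted-covariance selection: keep the
    k features with the largest variance."""
    order = np.argsort(X.var(axis=0))[::-1]
    return X[:, order[:k]]

Z = pca_reduce(X, 3)       # 100 x 3 projection
F = variance_select(X, 3)  # 100 x 3 feature subset
```

    Either reduced matrix would then feed the downstream classifier, with accuracy tracked as k grows.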

  8. Computer code for single-point thermodynamic analysis of hydrogen/oxygen expander-cycle rocket engines

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.; Jones, Scott M.

    1991-01-01

    This analysis and this computer code apply to full, split, and dual expander cycles. Heat regeneration from the turbine exhaust to the pump exhaust is allowed. The combustion process is modeled as one of chemical equilibrium in an infinite-area or a finite-area combustor. Gas composition in the nozzle may be either equilibrium or frozen during expansion. This report, which serves as a users guide for the computer code, describes the system, the analysis methodology, and the program input and output. Sample calculations are included to show effects of key variables such as nozzle area ratio and oxidizer-to-fuel mass ratio.

  9. Combining Thermal And Structural Analyses

    NASA Technical Reports Server (NTRS)

    Winegar, Steven R.

    1990-01-01

    Computer code makes programs compatible so that stresses and deformations can be calculated. Paper describes computer code combining thermal analysis with structural analysis. Called SNIP (for SINDA-NASTRAN Interfacing Program), code provides interface between finite-difference thermal model of system and finite-element structural model when there is no node-to-element correlation between models. Eliminates much manual work in converting temperature results of SINDA (Systems Improved Numerical Differencing Analyzer) program into thermal loads for NASTRAN (NASA Structural Analysis) program. Used to analyze concentrating reflectors for solar generation of electric power. Large thermal and structural models needed to predict distortion of surface shapes, and SNIP saves considerable time and effort in combining models.
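
    The interfacing problem SNIP addresses, mapping temperatures between thermal and structural meshes that share no node-to-element correlation, can be illustrated with a simple inverse-distance-weighted interpolation. This is a generic sketch under assumed conventions, not SNIP's actual algorithm.

```python
import numpy as np

def map_temperatures(thermal_xyz, thermal_T, struct_xyz, k=3):
    """Inverse-distance-weighted mapping of thermal-model node
    temperatures onto structural-model node locations, using the k
    nearest thermal nodes for each structural node."""
    out = np.empty(len(struct_xyz))
    for n, p in enumerate(struct_xyz):
        d = np.linalg.norm(thermal_xyz - p, axis=1)
        idx = np.argsort(d)[:k]
        w = 1.0 / np.maximum(d[idx], 1e-12)  # guard coincident nodes
        out[n] = np.dot(w, thermal_T[idx]) / w.sum()
    return out

# three thermal nodes on a line carrying a linear temperature field
nodes = np.array([[0.0, 0, 0], [1.0, 0, 0], [2.0, 0, 0]])
temps = np.array([10.0, 20.0, 30.0])
mapped = map_temperatures(nodes, temps, np.array([[1.0, 0, 0]]))
```

    The mapped values become nodal thermal loads for the structural model, which is the manual conversion step such an interface automates.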

  10. Potential flow analysis of glaze ice accretions on an airfoil

    NASA Technical Reports Server (NTRS)

    Zaguli, R. J.

    1984-01-01

    The results of an analytical/experimental study of the flow fields about an airfoil with leading edge glaze ice accretion shapes are presented. Tests were conducted in the Icing Research Tunnel to measure surface pressure distributions and boundary layer separation reattachment characteristics on a general aviation wing section to which was affixed wooden ice shapes which approximated typical glaze ice accretions. Comparisons were made with predicted pressure distributions using current airfoil analysis codes as well as the Bristow mixed analysis/design airfoil panel code. The Bristow code was also used to predict the separation reattachment dividing streamline by inputting the appropriate experimental surface pressure distribution.

  11. Recent applications of the transonic wing analysis computer code, TWING

    NASA Technical Reports Server (NTRS)

    Subramanian, N. R.; Holst, T. L.; Thomas, S. D.

    1982-01-01

    An evaluation of the transonic-wing-analysis computer code TWING is given. TWING utilizes a fully implicit approximate factorization iteration scheme to solve the full potential equation in conservative form. A numerical elliptic-solver grid-generation scheme is used to generate the required finite-difference mesh. Several wing configurations were analyzed, and the limits of applicability of this code were evaluated. Comparisons of computed results were made with available experimental data. Results indicate that the code is robust, accurate (when significant viscous effects are not present), and efficient. TWING generally produces solutions an order of magnitude faster than other conservative full potential codes using successive-line overrelaxation. The present method is applicable to a wide range of isolated wing configurations, including high-aspect-ratio transport wings and low-aspect-ratio, high-sweep fighter configurations.

  12. A test of the validity of the motivational interviewing treatment integrity code.

    PubMed

    Forsberg, Lars; Berman, Anne H; Kallmén, Håkan; Hermansson, Ulric; Helgason, Asgeir R

    2008-01-01

    To evaluate the Swedish version of the Motivational Interviewing Treatment Code (MITI), MITI coding was applied to tape-recorded counseling sessions. Construct validity was assessed using factor analysis on 120 MITI-coded sessions. Discriminant validity was assessed by comparing MITI coding of motivational interviewing (MI) sessions with information- and advice-giving sessions as well as by comparing MI-trained practitioners with untrained practitioners. A principal-axis factoring analysis yielded some evidence for MITI construct validity. MITI differentiated between practitioners with different levels of MI training as well as between MI practitioners and advice-giving counselors, thus supporting discriminant validity. MITI may be used as a training tool together with supervision to confirm and enhance MI practice in clinical settings. MITI can also serve as a tool for evaluating MI integrity in clinical research.

  13. NASA. Marshall Space Flight Center Hydrostatic Bearing Activities

    NASA Technical Reports Server (NTRS)

    Benjamin, Theodore G.

    1991-01-01

    The basic approach for analyzing hydrostatic bearing flows at the Marshall Space Flight Center (MSFC) is briefly discussed. The Hydrostatic Bearing Team has responsibility for assessing and evaluating flow codes; evaluating friction, ignition, and galling effects; evaluating wear; and performing tests. The Office of Aerospace and Exploration Technology Turbomachinery Seals Tasks consist of tests and analysis. The MSFC in-house analyses utilize one-dimensional bulk-flow codes. Computational fluid dynamics (CFD) analysis is used to enhance understanding of bearing flow physics or to perform parametric analyses that are outside the bulk-flow database. As long as the bulk-flow codes are accurate enough for most needs, they will be utilized accordingly and will be supported by CFD analysis on an as-needed basis.

  14. Soft-decision decoding techniques for linear block codes and their error performance analysis

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1996-01-01

    The first paper presents a new minimum-weight trellis-based soft-decision iterative decoding algorithm for binary linear block codes. The second paper derives an upper bound on the probability of block error for multilevel concatenated codes (MLCC); the bound evaluates the difference in performance for different decompositions of some codes. The third paper investigates the bit error probability of maximum likelihood decoding of binary linear codes. The fourth and final paper included in this report concerns the construction of multilevel concatenated block modulation codes using a multilevel concatenation scheme for the frequency non-selective Rayleigh fading channel.
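
    For context on what soft-decision maximum likelihood decoding means, it can be done by brute-force correlation over the codebook for a small binary linear block code. The sketch below uses the systematic (7,4) Hamming code as a stand-in example; the report's trellis-based algorithms are far more efficient than this exhaustive search.

```python
import numpy as np
from itertools import product

# generator matrix of the systematic (7,4) Hamming code (example code)
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 1],
              [0, 0, 0, 1, 1, 0, 1]])

codebook = np.array([(np.array(m) @ G) % 2 for m in product([0, 1], repeat=4)])

def ml_decode(r):
    """Brute-force soft-decision ML decoding: return the codeword whose
    BPSK image (0 -> +1, 1 -> -1) best correlates with the received r."""
    bpsk = 1 - 2 * codebook
    return codebook[np.argmax(bpsk @ np.asarray(r, dtype=float))]

# received vector for the all-zero codeword with one badly faded sample
r = np.array([0.9, 1.1, 0.8, -0.2, 1.0, 0.7, 1.2])
decoded = ml_decode(r)
```

    Because the decision uses the real-valued samples rather than hard bit decisions, the single negative sample is outweighed by the reliable ones and the all-zero codeword is still chosen.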

  15. SPAR improved structure-fluid dynamic analysis capability, phase 2

    NASA Technical Reports Server (NTRS)

    Pearson, M. L.

    1984-01-01

    An efficient and general method of analyzing a coupled dynamic system of fluid flow and elastic structures is investigated. The improvement of Structural Performance Analysis and Redesign (SPAR) code is summarized. All error codes are documented and the SPAR processor/subroutine cross reference is included.

  16. Novel microscopy-based screening method reveals regulators of contact-dependent intercellular transfer

    PubMed Central

    Michael Frei, Dominik; Hodneland, Erlend; Rios-Mondragon, Ivan; Burtey, Anne; Neumann, Beate; Bulkescher, Jutta; Schölermann, Julia; Pepperkok, Rainer; Gerdes, Hans-Hermann; Kögel, Tanja

    2015-01-01

    Contact-dependent intercellular transfer (codeIT) of cellular constituents can have functional consequences for recipient cells, such as enhanced survival and drug resistance. Pathogenic viruses, prions and bacteria can also utilize this mechanism to spread to adjacent cells and potentially evade immune detection. However, little is known about the molecular mechanism underlying this intercellular transfer process. Here, we present a novel microscopy-based screening method to identify regulators and cargo of codeIT. Single donor cells, carrying fluorescently labelled endocytic organelles or proteins, are co-cultured with excess acceptor cells. CodeIT is quantified by confocal microscopy and image analysis in 3D, preserving spatial information. An siRNA-based screening using this method revealed the involvement of several myosins and small GTPases as codeIT regulators. Our data indicates that cellular protrusions and tubular recycling endosomes are important for codeIT. We automated image acquisition and analysis to facilitate large-scale chemical and genetic screening efforts to identify key regulators of codeIT. PMID:26271723

  17. [Comparative study of three Western models of deontological codes for dentists].

    PubMed

    Macpherson Mayol, Ignacio; Roqué Sánchez, María Victoria; Gonzalvo-Cirac, Margarita; de Ribot, Eduard

    2013-01-01

    We performed a comparative analysis of the codes of ethics of three official organizations in Dentistry professional ethics: Code of Ethics for Dentists in the European Union, drawn up by the Council of European Dentists (CED); Código Español de Ética y Deontología Dental, published by the Consejo General de Colegios de Odontólogos y Estomatólogos de España (CGCOE); and Principles of Ethics and Code of Professional Conduct, of the American Dental Association (ADA). The analysis of the structure of the codes allows the discovery of different approaches governing professional ethics according to the ethical and legislative tradition from which they derive. While there are common elements inherent in Western culture, there are nuances in the grounds, layout, and wording of the articles that allow one to deduce the ethical foundations underlying each code and that reflect the real problems encountered by dentists in the practice of their profession.

  18. The performance of trellis coded multilevel DPSK on a fading mobile satellite channel

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K.; Divsalar, Dariush

    1987-01-01

    The performance of trellis coded multilevel differential phase-shift-keying (MDPSK) over Rician and Rayleigh fading channels is discussed. For operation at L-Band, this signalling technique leads to a more robust system than the coherent system with dual pilot tone calibration previously proposed for UHF. The results are obtained using a combination of analysis and simulation. The analysis shows that the design criterion for trellis codes to be operated on fading channels with interleaving/deinterleaving is no longer free Euclidean distance. The correct design criterion for optimizing bit error probability of trellis coded MDPSK over fading channels will be presented along with examples illustrating its application.

  19. A qualitative content analysis of global health engagements in Peacekeeping and Stability Operations Institute's stability operations lessons learned and information management system.

    PubMed

    Nang, Roberto N; Monahan, Felicia; Diehl, Glendon B; French, Daniel

    2015-04-01

    Many institutions collect reports in databases to make important lessons learned available to their members. The Uniformed Services University of the Health Sciences collaborated with the Peacekeeping and Stability Operations Institute to conduct a descriptive and qualitative analysis of global health engagements (GHEs) contained in the Stability Operations Lessons Learned and Information Management System (SOLLIMS). This study used a summative qualitative content analysis approach involving six steps: (1) a comprehensive search; (2) a two-stage reading and screening process to identify first-hand, health-related records; (3) qualitative and quantitative data analysis using MAXQDA, a software program; (4) a word cloud to illustrate word frequencies and interrelationships; (5) coding of individual themes and validation of the coding scheme; and (6) identification of relationships in the data and overarching lessons learned. The individual codes with the greatest number of coded text segments included planning, personnel, interorganizational coordination, communication/information sharing, and resources/supplies. When compared to the Department of Defense's (DoD's) evolving GHE principles and capabilities, the SOLLIMS coding scheme appeared to align well with the list of GHE capabilities developed by the Department of Defense Global Health Working Group. The results of this study will inform practitioners of global health and encourage additional qualitative analysis of other lessons-learned databases. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  20. Computer codes for thermal analysis of a solid rocket motor nozzle

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1988-01-01

    A number of computer codes are available for performing thermal analysis of solid rocket motor nozzles. The Aerotherm Chemical Equilibrium (ACE) computer program can be used to perform one-dimensional gas expansion to determine the state of the gas at each location of a nozzle. The ACE outputs can be used as input to a computer program called Momentum/Energy Integral Technique (MEIT) for predicting boundary layer development, shear, and heating on the surface of the nozzle. The output from MEIT can be used as input to another computer program called the Aerotherm Charring Material Thermal Response and Ablation Program (CMA). This program is used to calculate ablation or decomposition response of the nozzle material. A code called Failure Analysis Nonlinear Thermal and Structural Integrated Code (FANTASTIC) is also likely to be used for performing thermal analysis of solid rocket motor nozzles after the program is duly verified. A part of the verification work on FANTASTIC was done by using one- and two-dimensional heat transfer examples with known answers. An attempt was made to prepare input for performing thermal analysis of the CCT nozzle using the FANTASTIC computer code. The CCT nozzle problem will first be solved by using ACE, MEIT, and CMA. The same problem will then be solved using FANTASTIC. These results will then be compared for verification of FANTASTIC.

  1. CASL VMA FY16 Milestone Report (L3:VMA.VUQ.P13.07) Westinghouse Mixing with COBRA-TF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon, Natalie

    2016-09-30

    COBRA-TF (CTF) is a low-resolution code currently maintained as CASL's subchannel analysis tool. CTF operates as a two-phase, compressible code over a mesh comprised of subchannels and axially discretized nodes. In part because CTF is a low-resolution code, simulation run time is not computationally expensive, only on the order of minutes. High-resolution codes such as STAR-CCM+ can be used to train lower-fidelity codes such as CTF. Unlike STAR-CCM+, CTF has no turbulence model, only a two-phase turbulent mixing coefficient, β. β can be set to a constant value or calculated in terms of Reynolds number using an empirical correlation. Results from STAR-CCM+ can be used to inform the appropriate value of β. Once β is calibrated, CTF runs can be an inexpensive alternative to costly STAR-CCM+ runs for scoping analyses. Based on the results of CTF runs, STAR-CCM+ can be run for specific parameters of interest. CASL areas of application are CIPS for single-phase analysis and DNB-CTF for two-phase analysis.
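
    The calibration idea, using high-fidelity results to inform β as a function of Reynolds number, might look like the following least-squares fit of a power-law correlation β = a·Re^b on log-log axes. The data points and the correlation form here are purely hypothetical, not CASL's actual correlation.

```python
import numpy as np

# hypothetical (Re, beta) pairs that a set of STAR-CCM+ runs might yield;
# generated here from an assumed power law beta = 0.05 * Re**-0.1
Re = np.array([1e4, 3e4, 1e5, 3e5, 1e6])
beta = 0.05 * Re**-0.1

# least-squares fit of log(beta) = log(a) + b * log(Re)
A = np.vstack([np.ones_like(Re), np.log(Re)]).T
coef, *_ = np.linalg.lstsq(A, np.log(beta), rcond=None)
a, b = np.exp(coef[0]), coef[1]
```

    The fitted (a, b) pair would then be supplied to the subchannel code in place of a constant β, reserving further high-fidelity runs for the parameters of interest the scoping study identifies.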

  2. Computational Predictions of the Performance of Wright 'Bent End' Propellers

    NASA Technical Reports Server (NTRS)

    Wang, Xiang-Yu; Ash, Robert L.; Bobbitt, Percy J.; Prior, Edwin (Technical Monitor)

    2002-01-01

    Computational analysis of two 1911 Wright brothers 'Bent End' wooden propeller reproductions has been performed and compared with experimental test results from the Langley Full Scale Wind Tunnel. The purpose of the analysis was to check the consistency of the experimental results and to validate the reliability of the tests. This report is one part of a project on the propeller performance of the Wright 'Bent End' propellers, intended to document the Wright brothers' pioneering propeller design contributions. Two computer codes were used in the computational predictions. The FLO-MG Navier-Stokes code is a CFD (Computational Fluid Dynamics) code based on the Navier-Stokes equations. It is mainly used to compute the lift coefficient and the drag coefficient at specified angles of attack at different radii. Those calculated data are intermediate results of the computation and a part of the necessary input for the Propeller Design Analysis Code (based on the Adkins and Liebeck method), which is a propeller design code used to compute the propeller thrust coefficient, the propeller power coefficient, and the propeller propulsive efficiency.

  3. Analysis of SMA Hybrid Composite Structures in MSC.Nastran and ABAQUS

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Patel, Hemant D.

    2005-01-01

    A thermoelastic constitutive model for shape memory alloy (SMA) actuators and SMA hybrid composite (SMAHC) structures was recently implemented in the commercial finite element codes MSC.Nastran and ABAQUS. The model may be easily implemented in any code that has the capability for analysis of laminated composite structures with temperature dependent material properties. The model is also relatively easy to use and requires input of only fundamental engineering properties. A brief description of the model is presented, followed by discussion of implementation and usage in the commercial codes. Results are presented from static and dynamic analysis of SMAHC beams of two types; a beam clamped at each end and a cantilever beam. Nonlinear static (post-buckling) and random response analyses are demonstrated for the first specimen. Static deflection (shape) control is demonstrated for the cantilever beam. Approaches for modeling SMAHC material systems with embedded SMA in ribbon and small round wire product forms are demonstrated and compared. The results from the commercial codes are compared to those from a research code as validation of the commercial implementations; excellent correlation is achieved in all cases.

  4. Users manual and modeling improvements for axial turbine design and performance computer code TD2-2

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.

  5. An Object-Oriented Approach to Writing Computational Electromagnetics Codes

    NASA Technical Reports Server (NTRS)

    Zimmerman, Martin; Mallasch, Paul G.

    1996-01-01

    Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.
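
    A minimal example of the contrast the paper draws: in an OOP design, the grid state and the time-domain update live together in one class rather than in common blocks plus free subroutines. The tiny 1D FDTD updater below is an illustrative sketch with assumed normalizations, not code from the paper.

```python
import numpy as np

class Yee1D:
    """Minimal 1D FDTD (time-domain CEM) updater: the class bundles the
    field arrays and the leapfrog update that a structured Fortran code
    would keep in common blocks and free subroutines."""

    def __init__(self, n, courant=0.5):
        self.ez = np.zeros(n)      # electric field at integer nodes
        self.hy = np.zeros(n - 1)  # magnetic field at half nodes
        self.c = courant           # Courant number (<= 1 for stability)

    def step(self):
        self.hy += self.c * np.diff(self.ez)
        self.ez[1:-1] += self.c * np.diff(self.hy)

sim = Yee1D(200)
sim.ez[100] = 1.0  # initial pulse
for _ in range(50):
    sim.step()
```

    Encapsulating state this way is what eases maintenance and porting; the speed comparison the paper makes is about whether such abstraction costs runtime relative to the structured Fortran baseline.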

  6. Flowfield computer graphics

    NASA Technical Reports Server (NTRS)

    Desautel, Richard

    1993-01-01

    The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and flow field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flowfield Analysis Software Tool kit (FAST), for both the adaptive grid code (SAGE) and the flow field ray tracing code (CISS).

  7. Analysis of Defenses Against Code Reuse Attacks on Modern and New Architectures

    DTIC Science & Technology

    2015-09-01

    ...soundness or completeness. An incomplete analysis will produce extra edges in the CFG that might allow an attacker to slip through. An unsound analysis... Thesis by Isaac Noah Evans, submitted to the Department of Electrical Engineering and Computer Science in partial fulfillment of the requirements for the degree of Master of Engineering in Electrical Engineering and Computer Science.

  8. The EUCLID/V1 Integrated Code for Safety Assessment of Liquid Metal Cooled Fast Reactors. Part 1: Basic Models

    NASA Astrophysics Data System (ADS)

    Mosunova, N. A.

    2018-05-01

    The article describes the basic models included in the EUCLID/V1 integrated code intended for safety analysis of liquid metal (sodium, lead, and lead-bismuth) cooled fast reactors using fuel rods with a gas gap and pellet dioxide, mixed oxide or nitride uranium-plutonium fuel under normal operation, under anticipated operational occurrences and accident conditions by carrying out interconnected thermal-hydraulic, neutronics, and thermal-mechanical calculations. Information about the Russian and foreign analogs of the EUCLID/V1 integrated code is given. Modeled objects, equation systems in differential form solved in each module of the EUCLID/V1 integrated code (the thermal-hydraulic, neutronics, fuel rod analysis module, and the burnup and decay heat calculation modules), the main calculated quantities, and also the limitations on application of the code are presented. The article also gives data on the scope of functions performed by the integrated code's thermal-hydraulic module, using which it is possible to describe both one- and two-phase processes occurring in the coolant. It is shown that, owing to the availability of the fuel rod analysis module in the integrated code, it becomes possible to estimate the performance of fuel rods in different regimes of the reactor operation. It is also shown that the models implemented in the code for calculating neutron-physical processes make it possible to take into account the neutron field distribution over the fuel assembly cross section as well as other features important for the safety assessment of fast reactors.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Blas, Alfredo; Tapia, Carlos; Riego, Albert

    pGamma is a code developed by the NERG group of the Technical University of Catalonia - Barcelona Tech for the analysis of gamma spectra generated by the Equipment for the Continuous Measurement and Identification of Gamma Radioactivity on Aerosols with Paper Filter, developed by our group and the Raditel Services company. The code is currently being adapted for the monitors of the Environmental Radiological Surveillance Network of the Local Government of Catalonia (Generalitat of Catalonia), Spain. The code is a Spectrum Analysis System: it identifies the gamma emitters in the spectrum, determines their Concentration of Activity, generates alarms depending on the Activity of the emitters, and generates a report. The Spectrum Analysis System includes a library with emitters of interest, both NORM and artificial. The code is being used at the three stations of the Network with the aerosol monitor (Asco and Vandellos, near both Nuclear Power Plants, and Barcelona). (authors)

  10. An Overview of the XGAM Code and Related Software for Gamma-ray Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Younes, W.

    2014-11-13

    The XGAM spectrum-fitting code and associated software were developed specifically to analyze the complex gamma-ray spectra that can result from neutron-induced reactions. The XGAM code is designed to fit a spectrum over the entire available gamma-ray energy range as a single entity, in contrast to the more traditional piecewise approaches. This global-fit philosophy enforces background continuity as well as consistency between local and global behavior throughout the spectrum, and in a natural way. This report presents XGAM and the suite of programs built around it with an emphasis on how they fit into an overall analysis methodology for complex gamma-ray data. An application to the analysis of time-dependent delayed gamma-ray yields from 235U fission is shown in order to showcase the codes and how they interact.

  11. Computational fluid dynamics of airfoils and wings

    NASA Technical Reports Server (NTRS)

    Garabedian, P.; Mcfadden, G.

    1982-01-01

    It is pointed out that transonic flow is one of the fields where computational fluid dynamics turns out to be most effective. Codes for the design and analysis of supercritical airfoils and wings have become standard tools of the aircraft industry. The present investigation is concerned with mathematical models and theorems which account for some of the progress that has been made. The most successful aerodynamics codes are those for the analysis of flow at off-design conditions where weak shock waves appear. A major breakthrough was achieved by Murman and Cole (1971), who conceived of a retarded difference scheme which incorporates artificial viscosity to capture shocks in the supersonic zone. This concept has been used to develop codes for the analysis of transonic flow past a swept wing. Attention is given to the trailing edge and the boundary layer, entropy inequalities and wave drag, shockless airfoils, and the inverse swept wing code.
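
    The Murman-Cole idea can be sketched as a type-dependent second-difference operator: central differencing at locally subsonic points and fully upwind (retarded) differencing at supersonic points, where the one-sided stencil's truncation error supplies the artificial viscosity that captures shocks. The sketch below shows only the stencil switch, not a full transonic potential solver.

```python
import numpy as np

def switched_phi_xx(phi, dx, supersonic):
    """Type-dependent second difference on a 1D array: central where a
    point is flagged subsonic, fully upwind (retarded) where supersonic.
    Boundary points (first two and last) are left at zero."""
    n = len(phi)
    d2 = np.zeros(n)
    for i in range(2, n - 1):
        central = (phi[i + 1] - 2.0 * phi[i] + phi[i - 1]) / dx**2
        upwind = (phi[i] - 2.0 * phi[i - 1] + phi[i - 2]) / dx**2
        d2[i] = upwind if supersonic[i] else central
    return d2

# smoke test: both stencils are exact for a quadratic potential
dx = 0.1
x = np.arange(10) * dx
d2 = switched_phi_xx(x**2, dx, np.arange(10) % 2 == 0)
```

    In a relaxation solver, the supersonic flag is re-evaluated each sweep from the local flow type, so the scheme automatically retards the differencing wherever a supersonic zone appears.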

  12. Application of the DART Code for the Assessment of Advanced Fuel Behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rest, J.; Totev, T.

    2007-07-01

    The Dispersion Analysis Research Tool (DART) code is a dispersion fuel analysis code that contains mechanistically-based fuel and reaction-product swelling models, a one dimensional heat transfer analysis, and mechanical deformation models. DART has been used to simulate the irradiation behavior of uranium oxide, uranium silicide, and uranium molybdenum aluminum dispersion fuels, as well as their monolithic counterparts. The thermal-mechanical DART code has been validated against RERTR tests performed in the ATR for irradiation data on interaction thickness, fuel, matrix, and reaction product volume fractions, and plate thickness changes. The DART fission gas behavior model has been validated against UO2 fission gas release data as well as measured fission gas-bubble size distributions. Here DART is utilized to analyze various aspects of the observed bubble growth in U-Mo/Al interaction product. (authors)

  13. Performance Analysis of New Binary User Codes for DS-CDMA Communication

    NASA Astrophysics Data System (ADS)

    Usha, Kamle; Jaya Sankar, Kottareddygari

    2016-03-01

    This paper analyzes new binary spreading codes through their correlation properties and also presents their performance over an additive white Gaussian noise (AWGN) channel. The proposed codes are constructed using Gray and inverse Gray codes: an n-bit Gray code is appended with its n-bit inverse Gray code to construct binary user codes of length 2n. Like Walsh codes, these binary user codes are available in sizes that are powers of two; additionally, code sets of length 6 and its even multiples are available. The simple construction technique and the generation of code sets of different sizes are the salient features of the proposed codes. Walsh codes and Gold codes are considered for comparison, as these are popularly used for synchronous and asynchronous multiuser communications, respectively. The auto- and cross-correlation properties of the proposed codes are compared with those of Walsh codes and Gold codes. Performance of the proposed binary user codes for both synchronous and asynchronous direct-sequence CDMA communication over the AWGN channel is also discussed. The proposed binary user codes are found to be suitable for both synchronous and asynchronous DS-CDMA communication.
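    The basic power-of-two construction can be sketched as follows. Since the abstract does not fully specify the mapping, the use of the standard binary-reflected Gray code and its Gray-to-binary inverse, indexed by the same integer, is an assumption; the length-6 family is not covered here.

```python
def gray(b):
    """Binary-reflected Gray code of integer b."""
    return b ^ (b >> 1)

def inverse_gray(g):
    """Gray-to-binary decoding, used here as the 'inverse Gray code'
    of the index (an assumption about the construction)."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def to_bits(v, n):
    """Most-significant-bit-first n-bit expansion of v."""
    return [(v >> (n - 1 - k)) & 1 for k in range(n)]

def user_codes(n):
    """2n-chip binary user codes: the n-bit Gray word of each index
    appended by its n-bit inverse Gray word, mapped to +/-1 chips."""
    codes = []
    for i in range(2 ** n):
        bits = to_bits(gray(i), n) + to_bits(inverse_gray(i), n)
        codes.append([1 if b else -1 for b in bits])
    return codes
```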

  14. Visual Computing Environment

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Putt, Charles W.

    1997-01-01

    The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. 
Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.
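    Objective (2), interchanging boundary conditions between codes with arbitrary grid matching, reduces in its simplest one-dimensional form to interpolating interface data between non-matching node sets. A minimal sketch; the function name and the clamping behavior at the interval ends are illustrative assumptions, not VCE's actual coupling API.

```python
def interchange_boundary(x_src, f_src, x_dst):
    """Piecewise-linear interpolation of boundary-condition data from
    one component's interface nodes (x_src, f_src) onto another's
    arbitrary node locations x_dst. Values outside the source range
    are clamped to the end values."""
    out = []
    for x in x_dst:
        if x <= x_src[0]:
            out.append(f_src[0])
            continue
        if x >= x_src[-1]:
            out.append(f_src[-1])
            continue
        # index of the source interval containing x
        j = max(i for i in range(len(x_src)) if x_src[i] <= x)
        t = (x - x_src[j]) / (x_src[j + 1] - x_src[j])
        out.append((1.0 - t) * f_src[j] + t * f_src[j + 1])
    return out
```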

  15. [Bioethical analysis of the Brazilian Dentistry Code of Ethics].

    PubMed

    Pyrrho, Monique; do Prado, Mauro Machado; Cordón, Jorge; Garrafa, Volnei

    2009-01-01

    The Brazilian Dentistry Code of Ethics (DCE), Resolution CFO-71 from May 2006, is an instrument created to guide dentists' behavior in relation to the ethical aspects of professional practice. The purpose of the study is to analyze the above-mentioned code, comparing the deontological and bioethical focuses. To do so, an interpretative analysis of the code and of twelve selected texts was made. Six of the texts were about bioethics and six on deontology, and the analysis was made through the methodological classification of the context units, textual paragraphs and items from the code in the following categories: the referentials of bioethical principlism (autonomy, beneficence, nonmaleficence and justice), technical aspects, and moral virtues related to the profession. Together the four principles represented 22.9%, 39.8% and 54.2% of the content of the DCE, of the deontological texts and of the bioethical texts, respectively. In the DCE, 42% of the items referred to virtues, 40.2% were associated with technical aspects and just 22.9% referred to principles. The virtues related to the professionals and the technical aspects together amounted to 70.1% of the code. Instead of focusing on the patient as the subject of the process of oral health care, the DCE focuses on the professional, and it is predominantly turned toward legalistic and corporate aspects.

  16. Utilization of genetic tests: analysis of gene-specific billing in Medicare claims data.

    PubMed

    Lynch, Julie A; Berse, Brygida; Dotson, W David; Khoury, Muin J; Coomer, Nicole; Kautter, John

    2017-08-01

    We examined the utilization of precision medicine tests among Medicare beneficiaries through analysis of gene-specific tier 1 and 2 billing codes developed by the American Medical Association in 2012. We conducted a retrospective cross-sectional study. The primary source of data was 2013 Medicare 100% fee-for-service claims. We identified claims billed for each laboratory test, the number of patients tested, expenditures, and the diagnostic codes indicated for testing. We analyzed variations in testing by patient demographics and region of the country. Pharmacogenetic tests were billed most frequently, accounting for 48% of the expenditures for new codes. The most common indications for testing were breast cancer, long-term use of medications, and disorders of lipid metabolism. There was underutilization of guideline-recommended tumor mutation tests (e.g., epidermal growth factor receptor) and substantial overutilization of a test discouraged by guidelines (methylenetetrahydrofolate reductase). Methodology-based tier 2 codes represented 15% of all claims billed with the new codes. The highest rate of testing per beneficiary was in Mississippi and the lowest rate was in Alaska. Gene-specific billing codes significantly improved our ability to conduct population-level research of precision medicine. Analysis of these data in conjunction with clinical records should be conducted to validate findings. Genet Med advance online publication, 26 January 2017.

  17. Performance Analysis of a De-correlated Modified Code Tracking Loop for Synchronous DS-CDMA System under Multiuser Environment

    NASA Astrophysics Data System (ADS)

    Wu, Ya-Ting; Wong, Wai-Ki; Leung, Shu-Hung; Zhu, Yue-Sheng

    This paper presents the performance analysis of a De-correlated Modified Code Tracking Loop (D-MCTL) for synchronous direct-sequence code-division multiple-access (DS-CDMA) systems under a multiuser environment. Previous studies have shown that the imbalance of multiple access interference (MAI) in the time-lead and time-lag portions of the signal causes tracking bias or instability problems in traditional correlating tracking loops like the delay lock loop (DLL) or the modified code tracking loop (MCTL). In this paper, we exploit the de-correlating technique to combat the MAI at the on-time code position of the MCTL. Unlike applying the same technique to the DLL, which requires an extensive search algorithm to compensate for the noise imbalance and may introduce a small tracking bias under low signal-to-noise ratio (SNR), the proposed D-MCTL has much lower computational complexity and exhibits zero tracking bias over the whole range of SNR, regardless of the number of interfering users. Furthermore, performance analysis and simulations based on Gold codes show that the proposed scheme has better mean square tracking error, mean time to lose lock and near-far resistance than the other tracking schemes, including the traditional DLL (T-DLL), the traditional MCTL (T-MCTL) and the modified de-correlated DLL (MD-DLL).
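    For context, the classical noncoherent early-late discriminator whose MAI-induced bias the paper addresses can be sketched as follows. This is a toy single-user illustration using a short m-sequence, not the proposed D-MCTL; the code length and chip spacing are illustrative choices.

```python
def lfsr_mseq():
    """Length-7 m-sequence from the primitive polynomial
    x^3 + x^2 + 1, mapped to +/-1 chips."""
    state = [1, 0, 0]
    seq = []
    for _ in range(7):
        seq.append(1 if state[2] else -1)
        fb = state[2] ^ state[1]
        state = [fb] + state[:2]
    return seq

def correlate(code, signal, offset):
    """Normalized cyclic correlation of the local code, shifted by
    'offset' chips, against the received signal."""
    n = len(code)
    return sum(code[(k + offset) % n] * signal[k] for k in range(n)) / n

def discriminator(code, signal, delta=1):
    """Early-minus-late power discriminator of a classical code
    tracking loop: zero at perfect alignment, nonzero (with a sign
    indicating the correction direction) when the signal is offset."""
    early = correlate(code, signal, +delta)
    late = correlate(code, signal, -delta)
    return early ** 2 - late ** 2
```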

  18. Edge-diffraction effects in RCS predictions and their importance in systems analysis

    NASA Astrophysics Data System (ADS)

    Friess, W. F.; Klement, D.; Ruppel, M.; Stein, Volker

    1996-06-01

    In developing RCS prediction codes, a variety of physical effects, such as edge diffraction, have to be considered, with the consequence that the computational effort increases considerably. This fact limits the field of application of such codes, especially if the RCS data serve as input parameters for system simulators, which very often need these data for a large number of observation angles and/or frequencies. Conversely, the results of a system analysis can be used to estimate the relevance of physical effects from a system viewpoint and to rank them according to their magnitude. This paper evaluates the importance of including an edge-diffracted field in RCS predictions for systems analysis. A double dihedral with strongly depolarizing behavior and a generic airplane design containing many arbitrarily oriented edges are used as test structures. Data of the scattered field are generated by the RCS computer code SIGMA with and without edge diffraction effects. These data are submitted to the code DORA to determine radar range and radar detectability, and to a SAR simulator code to generate SAR imagery. In both cases special scenarios are assumed. The essential features of the computer codes in their current state are described, and the results are presented and discussed from a systems viewpoint.

  19. 76 FR 57982 - Building Energy Codes Cost Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-19

    ... DEPARTMENT OF ENERGY Office of Energy Efficiency and Renewable Energy [Docket No. EERE-2011-BT-BC-0046] Building Energy Codes Cost Analysis Correction In notice document 2011-23236 beginning on page... heading ``Table 1. Cash flow components'' should read ``Table 7. Cash flow components''. [FR Doc. C1-2011...

  20. Beyond Molecular Codes: Simple Rules to Wire Complex Brains

    PubMed Central

    Hassan, Bassem A.; Hiesinger, P. Robin

    2015-01-01

    Summary Molecular codes, like postal zip codes, are generally considered a robust way to ensure the specificity of neuronal target selection. However, a code capable of unambiguously generating complex neural circuits is difficult to conceive. Here, we re-examine the notion of molecular codes in the light of developmental algorithms. We explore how molecules and mechanisms that have been considered part of a code may alternatively implement simple pattern formation rules sufficient to ensure wiring specificity in neural circuits. This analysis delineates a pattern-based framework for circuit construction that may contribute to our understanding of brain wiring. PMID:26451480

  1. Categorical Variables in Multiple Regression: Some Cautions.

    ERIC Educational Resources Information Center

    O'Grady, Kevin E.; Medoff, Deborah R.

    1988-01-01

    Limitations of dummy coding and nonsense coding as methods of coding categorical variables for use as predictors in multiple regression analysis are discussed. The combination of these approaches often yields estimates and tests of significance that are not intended by researchers for inclusion in their models. (SLD)
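    As a concrete illustration of dummy coding (a generic sketch, not the authors' analysis): with a single categorical predictor, the reference level is coded as all zeros, the OLS intercept estimates the reference-group mean, and each slope estimates a group-mean difference from that reference.

```python
def dummy_code(values, reference):
    """Dummy-code a categorical variable: one 0/1 column per
    non-reference level; the reference level is coded all zeros."""
    levels = [lev for lev in sorted(set(values)) if lev != reference]
    rows = [[1 if v == lev else 0 for lev in levels] for v in values]
    return rows, levels

def _mean(xs):
    return sum(xs) / len(xs)

def oneway_dummy_fit(values, y, reference):
    """With a single dummy-coded predictor, OLS reduces to group means:
    intercept = mean of the reference group; the slope for level j
    equals mean(group j) - mean(reference group)."""
    groups = {}
    for v, yi in zip(values, y):
        groups.setdefault(v, []).append(yi)
    intercept = _mean(groups[reference])
    slopes = {lev: _mean(ys) - intercept
              for lev, ys in groups.items() if lev != reference}
    return intercept, slopes
```

The coefficient interpretation above is exactly what becomes muddled when dummy and nonsense coding are combined, which is the caution the paper raises.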

  2. Natural Language Interface for Safety Certification of Safety-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2011-01-01

    Model-based design and automated code generation are being used increasingly at NASA. The trend is to move beyond simulation and prototyping to actual flight code, particularly in the guidance, navigation, and control domain. However, there are substantial obstacles to more widespread adoption of code generators in such safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. The AutoCert generator plug-in supports the certification of automatically generated code by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews.

  3. A trend analysis of surgical operations under a global payment system in Tehran, Iran (2005–2015)

    PubMed Central

    Goudari, Faranak Behzadi; Rashidian, Arash; Arab, Mohammad; Mahmoudi, Mahmood

    2018-01-01

    Background Global payment system is a first example of a per-case payment system that contains 60 commonly used surgical operations for which payment is based on the average cost per case in Iran. Objective The aim of the study was to determine the amount of reduction, increase or no change in the trend of global operations. Methods In this retrospective longitudinal study, data on the 60 primary global surgery codes were gathered from Tehran Health Insurance Organization within the ten-year period of 2005–2015 separately, for each month. Out of 60 surgery codes, only acceptable data for 46 codes were available based on the insurance documents sent by medical centers. A quantitative analysis of time series through a Regression Analysis Model using STATA software v.11 was performed. Results Some global surgery codes had an upward trend and some were downwards. Of N Codes, N83, N20, N28, N63, and N93 had an upward trend (p<0.05) and N32, N43, N81 and N90 showed a significant downward trend (p<0.05). Similarly, all H Codes except for H18 had a significant upward trend (p<0.000). As such, K Codes including K45, K56 and K81 had an increasing movement. S Codes also experienced both increasing and decreasing trends. However, none of the O Codes changed according to time. Other global surgical codes like C61, E07, M51, L60, J98 (p<0.000), I84 (p<0.031) and I86 (p<0.000) showed upward and downward trends. The total global surgeries trend was significantly upward (B=24.26109, p<0.000). Conclusion The varying trend of global surgeries can partly reflect the behavior of service providers in order to increase their profits and minimize their costs. PMID:29765576
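    The underlying trend estimate is ordinary least squares on a monthly time index. A minimal sketch of the slope/intercept computation (the actual study fitted a regression time-series model in STATA; significance testing is omitted here):

```python
def linear_trend(y):
    """Least-squares slope and intercept of a series y[t] observed at
    t = 0, 1, ..., n-1 (e.g. monthly counts of a surgery code)."""
    n = len(y)
    t_mean = (n - 1) / 2.0
    y_mean = sum(y) / n
    sxy = sum((t - t_mean) * (yi - y_mean) for t, yi in enumerate(y))
    sxx = sum((t - t_mean) ** 2 for t in range(n))
    slope = sxy / sxx
    return slope, y_mean - slope * t_mean
```

A positive slope corresponds to the upward trends reported (e.g. the total-surgeries coefficient B = 24.26109), a negative slope to the downward ones.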

  4. Improved Helicopter Rotor Performance Prediction through Loose and Tight CFD/CSD Coupling

    NASA Astrophysics Data System (ADS)

    Ickes, Jacob C.

    Helicopters and other Vertical Take-Off or Landing (VTOL) vehicles exhibit an interesting combination of structural dynamic and aerodynamic phenomena which together drive the rotor performance. The combination of factors involved make simulating the rotor a challenging and multidisciplinary effort, and one which is still an active area of interest in the industry because of the money and time it could save during design. Modern tools allow the prediction of rotorcraft physics from first principles. Analysis of the rotor system with this level of accuracy provides the understanding necessary to improve its performance. There has historically been a divide between the comprehensive codes which perform aeroelastic rotor simulations using simplified aerodynamic models, and the very computationally intensive Navier-Stokes Computational Fluid Dynamics (CFD) solvers. As computer resources become more available, efforts have been made to replace the simplified aerodynamics of the comprehensive codes with the more accurate results from a CFD code. The objective of this work is to perform aeroelastic rotorcraft analysis using first-principles simulations for both fluids and structural predictions using tools available at the University of Toledo. Two separate codes are coupled together in both loose coupling (data exchange on a periodic interval) and tight coupling (data exchange each time step) schemes. To allow the coupling to be carried out in a reliable and efficient way, a Fluid-Structure Interaction code was developed which automatically performs primary functions of loose and tight coupling procedures. Flow phenomena such as transonics, dynamic stall, locally reversed flow on a blade, and Blade-Vortex Interaction (BVI) were simulated in this work. Results of the analysis show aerodynamic load improvement due to the inclusion of the CFD-based airloads in the structural dynamics analysis of the Computational Structural Dynamics (CSD) code. 
Improvements came in the form of improved peak/trough magnitude prediction, better phase prediction of these locations, and a predicted signal with a frequency content more like the flight test data than the CSD code acting alone. Additionally, a tight coupling analysis was performed as a demonstration of the capability and unique aspects of such an analysis. This work shows that away from the center of the flight envelope, the aerodynamic modeling of the CSD code can be replaced with a more accurate set of predictions from a CFD code with an improvement in the aerodynamic results. The better predictions come at substantially increased computational costs between 1,000 and 10,000 processor-hours.
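    The loose versus tight coupling distinction can be caricatured with two toy "solvers" that exchange data either every time step or only periodically. This is an illustrative fixed-point iteration under made-up relaxation constants, not the CFD/CSD codes of the thesis.

```python
def run_coupled(steps, exchange_every):
    """Toy fluid/structure pair: the 'fluid' load depends on the
    structural deflection, and the structure relaxes toward the load.
    exchange_every=1 mimics tight coupling (data exchanged each time
    step); larger values mimic loose coupling, where the structure
    sees a stale load between exchanges."""
    deflection, load_seen = 0.0, 0.0
    for step in range(steps):
        if step % exchange_every == 0:
            load_seen = 1.0 - 0.5 * deflection   # fresh aerodynamic load
        deflection += 0.1 * (load_seen - deflection)  # structural update
    return deflection
```

Both variants converge to the same coupled equilibrium here (deflection 2/3); the difference shows up in how the solution evolves between exchanges, which is why tight coupling matters for transient phenomena like dynamic stall and BVI.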

  5. Fundamental differences between optimization code test problems in engineering applications

    NASA Technical Reports Server (NTRS)

    Eason, E. D.

    1984-01-01

    The purpose here is to suggest that there is at least one fundamental difference between the problems used for testing optimization codes and the problems that engineers often need to solve; in particular, the level of precision that can be practically achieved in the numerical evaluation of the objective function, derivatives, and constraints. This difference affects the performance of optimization codes, as illustrated by two examples. Two classes of optimization problem were defined. Class One functions and constraints can be evaluated to a high precision that depends primarily on the word length of the computer. Class Two functions and/or constraints can only be evaluated to a moderate or a low level of precision for economic or modeling reasons, regardless of the computer word length. Optimization codes have not been adequately tested on Class Two problems. There are very few Class Two test problems in the literature, while there are literally hundreds of Class One test problems. The relative performance of two codes may be markedly different for Class One and Class Two problems. Less sophisticated direct search type codes may be less likely to be confused or to waste many function evaluations on Class Two problems. The analysis accuracy and minimization performance are related in a complex way that probably varies from code to code. On a problem where the analysis precision was varied over a range, the simple Hooke and Jeeves code was more efficient at low precision while the Powell code was more efficient at high precision.
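    The point that direct-search codes tolerate low-precision (Class Two) objectives can be illustrated with a minimal Hooke-and-Jeeves-style pattern search run on an objective deliberately rounded to two decimals. This is a sketch of the general technique, not the codes tested in the paper.

```python
def pattern_search(f, x0, step=1.0, tol=1e-3, max_iter=1000):
    """Minimal Hooke-and-Jeeves-style direct search: poll each
    coordinate direction for improvement; halve the step size when
    no poll improves. Uses only function comparisons, so it does not
    rely on accurate derivative estimates."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= 0.5
            if step < tol:
                break
    return x, fx
```

Because progress requires only that a trial point compare favorably with the incumbent, the search still converges when the objective is quantized, whereas finite-difference gradients computed from such an objective are dominated by rounding noise.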

  6. Coding visual features extracted from video sequences.

    PubMed

    Baroffio, Luca; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2014-05-01

    Visual features are successfully exploited in several applications (e.g., visual search, object recognition and tracking, etc.) due to their ability to efficiently represent image content. Several visual analysis tasks require features to be transmitted over a bandwidth-limited network, thus calling for coding techniques to reduce the required bit budget, while attaining a target level of efficiency. In this paper, we propose, for the first time, a coding architecture designed for local features (e.g., SIFT, SURF) extracted from video sequences. To achieve high coding efficiency, we exploit both spatial and temporal redundancy by means of intraframe and interframe coding modes. In addition, we propose a coding mode decision based on rate-distortion optimization. The proposed coding scheme can be conveniently adopted to implement the analyze-then-compress (ATC) paradigm in the context of visual sensor networks. That is, sets of visual features are extracted from video frames, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast to the traditional compress-then-analyze (CTA) paradigm, in which video sequences acquired at a node are compressed and then sent to a central unit for further processing. In this paper, we compare these coding paradigms using metrics that are routinely adopted to evaluate the suitability of visual features in the context of content-based retrieval, object recognition, and tracking. Experimental results demonstrate that, thanks to the significant coding gains achieved by the proposed coding scheme, ATC outperforms CTA with respect to all evaluation metrics.
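    The rate-distortion mode decision reduces to minimizing the Lagrangian cost J = D + λR over the candidate coding modes. A minimal sketch; the candidate tuples and λ values are illustrative, not the paper's actual codec parameters.

```python
def choose_mode(candidates, lam):
    """Rate-distortion mode decision: given (mode, distortion, rate)
    candidates, pick the mode minimizing J = D + lambda * R."""
    return min(candidates, key=lambda m: m[1] + lam * m[2])[0]
```

Small λ favors the low-distortion mode regardless of its rate; raising λ penalizes rate and can flip the decision, e.g. from intraframe to interframe coding of a feature descriptor.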

  7. Empirical validation of the triple-code model of numerical processing for complex math operations using functional MRI and group Independent Component Analysis of the mental addition and subtraction of fractions.

    PubMed

    Schmithorst, Vincent J; Brown, Rhonda Douglas

    2004-07-01

    The suitability of a previously hypothesized triple-code model of numerical processing, involving analog magnitude, auditory verbal, and visual Arabic codes of representation, was investigated for the complex mathematical task of the mental addition and subtraction of fractions. Functional magnetic resonance imaging (fMRI) data from 15 normal adult subjects were processed using exploratory group Independent Component Analysis (ICA). Separate task-related components were found with activation in bilateral inferior parietal, left perisylvian, and ventral occipitotemporal areas. These results support the hypothesized triple-code model corresponding to the activated regions found in the individual components and indicate that the triple-code model may be a suitable framework for analyzing the neuropsychological bases of the performance of complex mathematical tasks. Copyright 2004 Elsevier Inc.

  8. Benchmarking of Improved DPAC Transient Deflagration Analysis Code

    DOE PAGES

    Laurinat, James E.; Hensel, Steve J.

    2017-09-27

    The deflagration pressure analysis code (DPAC) has been upgraded for use in modeling hydrogen deflagration transients. The upgraded code is benchmarked using data from vented hydrogen deflagration tests conducted at the HYDRO-SC Test Facility at the University of Pisa. DPAC originally was written to calculate peak pressures for deflagrations in radioactive waste storage tanks and process facilities at the Savannah River Site. Upgrades include the addition of a laminar flame speed correlation for hydrogen deflagrations and a mechanistic model for turbulent flame propagation, incorporation of inertial effects during venting, and inclusion of the effect of water vapor condensation on vessel walls. In addition, DPAC has been coupled with chemical equilibrium with applications (CEA), a NASA combustion chemistry code. The deflagration tests are modeled as end-to-end deflagrations. As a result, the improved DPAC code successfully predicts both the peak pressures during the deflagration tests and the times at which the pressure peaks.

  9. Wing Weight Optimization Under Aeroelastic Loads Subject to Stress Constraints

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.; Issac, J.; Macmurdy, D.; Guruswamy, Guru P.

    1997-01-01

    A minimum weight optimization of the wing under aeroelastic loads subject to stress constraints is carried out. The loads for the optimization are based on aeroelastic trim. The design variables are the thickness of the wing skins and planform variables. The composite plate structural model incorporates first-order shear deformation theory, the wing deflections are expressed using Chebyshev polynomials and a Rayleigh-Ritz procedure is adopted for the structural formulation. The aerodynamic pressures provided by the aerodynamic code at a discrete number of grid points is represented as a bilinear distribution on the composite plate code to solve for the deflections and stresses in the wing. The lifting-surface aerodynamic code FAST is presently being used to generate the pressure distribution over the wing. The envisioned ENSAERO/Plate is an aeroelastic analysis code which combines ENSAERO version 3.0 (for analysis of wing-body configurations) with the composite plate code.

  10. Lidar performance analysis

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.

    1994-01-01

    Section 1 details the theory used to build the lidar model, provides results of using the model to evaluate AEOLUS instrument designs, and provides snapshots of the visual appearance of the coded model. Appendix A contains a Fortran program to calculate various forms of the refractive index structure function. This program was used to determine the refractive index structure function used in the main lidar simulation code. Appendix B contains a memo on the optimization of the lidar telescope geometry for a line-scan geometry. Appendix C contains the code for the main lidar simulation and brief instructions on running the code. Appendix D contains a Fortran code to calculate the maximum permissible exposure for the eye from the ANSI Z136.1-1992 eye safety standards. Appendix E contains a paper on the eye safety analysis of a space-based coherent lidar presented at the 7th Coherent Laser Radar Applications and Technology Conference, Paris, France, 19-23 July 1993.
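    The report's own structure-function code is not reproduced here, but a widely used refractive index structure parameter profile is the Hufnagel-Valley model; the HV-5/7 parameterization below is a standard choice and an assumption for illustration, not necessarily the form implemented in Appendix A.

```python
import math

def hv57_cn2(h, v=21.0, A=1.7e-14):
    """Hufnagel-Valley refractive index structure parameter Cn^2(h)
    in m^(-2/3) at altitude h (meters). Defaults v (rms upper-air
    wind, m/s) and A (ground-level Cn^2) give the common HV-5/7
    profile."""
    return (0.00594 * (v / 27.0) ** 2 * (h * 1e-5) ** 10 * math.exp(-h / 1000.0)
            + 2.7e-16 * math.exp(-h / 1500.0)
            + A * math.exp(-h / 100.0))
```

Profiles of this kind feed directly into coherent-lidar performance estimates, since turbulence along the path degrades the heterodyne mixing efficiency.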

  12. NORTICA—a new code for cyclotron analysis

    NASA Astrophysics Data System (ADS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1], developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computer platform for the package is an Alpha Station with the UNIX operating system and an X-Windows graphical interface. A multiple-programming-language approach was used in order to combine the reliability of the numerical algorithms developed in the laboratory over a long period of time with the friendliness of a modern-style user interface. This paper describes the capabilities and features of the codes in their present state.
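    Numerical orbit tracking of the kind NORTICA performs reduces, in its simplest form, to integrating the Lorentz-force equations of motion. A toy fourth-order Runge-Kutta integrator for planar motion in a uniform axial field is sketched below; it is illustrative only and includes none of the field imperfections or extraction-channel effects the code handles.

```python
def track_orbit(q, m, B, v0, dt, steps):
    """RK4 integration of planar motion in a uniform axial field
    B = B * z_hat: dv/dt = (q/m) * v x B, i.e.
    dvx/dt = (qB/m) * vy, dvy/dt = -(qB/m) * vx.
    State is [x, y, vx, vy], starting at the origin with speed v0
    along x."""
    a = q * B / m  # cyclotron (angular) frequency

    def deriv(s):
        x, y, vx, vy = s
        return [vx, vy, a * vy, -a * vx]

    s = [0.0, 0.0, v0, 0.0]
    for _ in range(steps):
        k1 = deriv(s)
        k2 = deriv([s[i] + 0.5 * dt * k1[i] for i in range(4)])
        k3 = deriv([s[i] + 0.5 * dt * k2[i] for i in range(4)])
        k4 = deriv([s[i] + dt * k3[i] for i in range(4)])
        s = [s[i] + dt / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
             for i in range(4)]
    return s
```

For q = m = B = 1 and unit initial speed the exact orbit is a circle of unit radius centered at (0, -1), a convenient check on the integrator.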

  13. Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS

    NASA Astrophysics Data System (ADS)

    Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.

    2017-04-01

    The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. In particular, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.
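    The contrast between a purely diffusion-based release model and one extended with a burst term can be sketched with the classical Booth equivalent-sphere approximation. The burst treatment below is a deliberately crude illustration of the concept, not the micro-cracking model implemented in BISON or TRANSURANUS.

```python
import math

def booth_release_fraction(D, a, t):
    """Booth equivalent-sphere diffusional release fraction for short
    times: f ~ 6*sqrt(D't/pi) - 3*D't, with D' = D/a^2 the reduced
    diffusion coefficient (a = equivalent sphere radius)."""
    tau = D * t / a ** 2
    return min(1.0, 6.0 * math.sqrt(tau / math.pi) - 3.0 * tau)

def fgr_with_burst(D, a, t, burst_fraction=0.0):
    """Diffusion-based fission gas release plus a non-diffusional
    burst term: a fraction of the retained gas is released
    instantaneously (e.g. by fuel micro-cracking during a transient).
    The burst_fraction parameter is a hypothetical illustration."""
    f_diff = booth_release_fraction(D, a, t)
    return min(1.0, f_diff + (1.0 - f_diff) * burst_fraction)
```

Purely diffusional kinetics cannot reproduce the rapid step in release seen experimentally during transients; the additive burst term captures that step, which is the motivation for the transient model assessed in the paper.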

  14. Prevalence of Chronic Hypoparathyroidism in a Mediterranean Region as Estimated by the Analysis of Anonymous Healthcare Database.

    PubMed

    Cianferotti, Luisella; Parri, Simone; Gronchi, Giorgio; Marcucci, Gemma; Cipriani, Cristiana; Pepe, Jessica; Raglianti, Marco; Minisola, Salvatore; Brandi, Maria Luisa

    2018-03-08

    Epidemiological data on the prevalence and incidence of chronic hypoparathyroidism are still scarce. This study aimed to establish the prevalence of chronic hypoparathyroidism and the incidence of surgical hypoparathyroidism through the analysis of an electronic, anonymous public healthcare database. Data referring to a 5-year period (2009-2013, Region of Tuscany, Italy, as a sample representative of the whole Mediterranean/European population, estimated mean population: 3,750,000 inhabitants) were retrieved by analysis of the pharmaceutical distribution dataset, containing data related to drugs reimbursed by the public health system, hospital discharge and procedure codes, and ICD9 exemption codes for chronic diseases. A specific algorithm was applied to indirectly identify people with chronic hypoparathyroidism as those on chronic therapy with active vitamin D metabolites (AVDM). The number of people taking AVDM for a period equal to or longer than 6 months until the end of the study period, with an ICD9 exemption code for hypoparathyroidism, and with a disease-related discharge code were identified. Within this restricted group, patients with chronic kidney disease and osteoporosis were excluded. The indirect estimate of chronic hypoparathyroidism in a European Mediterranean subpopulation by means of the analysis of chronic therapy with AVDM was 27/100,000 inhabitants (female:male ratio = 2.2:1), with a mean age of 63.5 ± 16.7 years. The risk of developing hypoparathyroidism after neck surgery was 1.5%. While epidemiological approaches based on disease codes and hospital discharge codes greatly underestimate the prevalence of hypoparathyroidism, the indirect estimate of this disease through the analysis of prescriptions of AVDM in a European region is in line with the results of studies performed in other regions of the world.
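    The cohort-selection algorithm described in the abstract can be sketched as a simple filter over linked records: keep patients on AVDM therapy for at least six months, then exclude chronic kidney disease and osteoporosis. The field names below are illustrative, not those of the actual dataset.

```python
def chronic_hypopara_cohort(patients, min_months=6):
    """Sketch of the indirect case-finding algorithm: flag patients on
    chronic active vitamin D metabolite (AVDM) therapy for at least
    min_months, then exclude competing indications (chronic kidney
    disease, osteoporosis). Each patient is a dict with illustrative
    keys: id, avdm_months, has_ckd, has_osteoporosis."""
    cohort = []
    for p in patients:
        if (p["avdm_months"] >= min_months
                and not p["has_ckd"]
                and not p["has_osteoporosis"]):
            cohort.append(p["id"])
    return cohort
```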

  15. The impact of conventional dietary intake data coding methods on foods typically consumed by low-income African-American and White urban populations.

    PubMed

    Mason, Marc A; Fanelli Kuczmarski, Marie; Allegro, Deanne; Zonderman, Alan B; Evans, Michele K

    2015-08-01

    Analysing dietary data to capture how individuals typically consume foods is dependent on the coding variables used. Individual foods consumed simultaneously, like coffee with milk, are given codes to identify these combinations. Our literature review revealed a lack of discussion about using combination codes in analysis. The present study identified foods consumed at mealtimes and by race when combination codes were or were not utilized. Duplicate analysis methods were performed on separate data sets. The original data set consisted of all foods reported; each food was coded as if it was consumed individually. The revised data set was derived from the original data set by first isolating coded foods consumed as individual items from those foods consumed simultaneously and assigning a code to designate a combination. Foods assigned a combination code, like pancakes with syrup, were aggregated and associated with a food group, defined by the major food component (i.e. pancakes), and then appended to the isolated coded foods. Healthy Aging in Neighborhoods of Diversity across the Life Span study. African-American and White adults with two dietary recalls (n 2177). Differences existed in lists of foods most frequently consumed by mealtime and race when comparing results based on original and revised data sets. African Americans reported consumption of sausage/luncheon meat and poultry, while ready-to-eat cereals and cakes/doughnuts/pastries were reported by Whites on recalls. Use of combination codes provided more accurate representation of how foods were consumed by populations. This information is beneficial when creating interventions and exploring diet-health relationships.
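The effect of combination codes on frequency counts can be sketched with a toy recall dataset. The food names, combo labels, and the "first listed food is the major component" convention are illustrative assumptions, not the study's actual coding scheme.

```python
# Sketch: foods reported as eaten simultaneously share a combination
# label; under combination coding the combo collapses to one record
# keyed by its major component (assumed here to be the first food).
from collections import Counter

recalls = [
    [("pancakes", "combo1"), ("syrup", "combo1"), ("coffee", None)],
    [("pancakes", "combo1"), ("syrup", "combo1")],
]

def count_foods(recalls, use_combination_codes):
    counts = Counter()
    for recall in recalls:
        seen_combos = set()
        for food, combo in recall:
            if not use_combination_codes or combo is None:
                counts[food] += 1            # every food coded individually
            elif combo not in seen_combos:
                counts[food] += 1            # major component only
                seen_combos.add(combo)
    return counts

print(count_foods(recalls, False))  # syrup counted as its own food
print(count_foods(recalls, True))   # combos collapsed to 'pancakes'
```

With individual coding, syrup appears as often as pancakes; with combination coding, the pancakes-with-syrup combo is represented once per recall by its major component, which is closer to how the food was actually consumed.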

  16. Leveraging Code Comments to Improve Software Reliability

    ERIC Educational Resources Information Center

    Tan, Lin

    2009-01-01

    Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…

  17. "Hour of Code": A Case Study

    ERIC Educational Resources Information Center

    Du, Jie; Wimmer, Hayden; Rada, Roy

    2018-01-01

    This study investigates the delivery of the "Hour of Code" tutorials to college students. The college students who participated in this study were surveyed about their opinion of the Hour of Code. First, the students' comments were discussed. Next, a content analysis of the offered tutorials highlights their reliance on visual…

  18. Finite-SNR analysis for partial relaying cooperation with channel coding and opportunistic relay selection

    NASA Astrophysics Data System (ADS)

    Vu, Thang X.; Duhamel, Pierre; Chatzinotas, Symeon; Ottersten, Bjorn

    2017-12-01

This work studies the performance of a cooperative network which consists of two channel-coded sources, multiple relays, and one destination. To achieve high spectral efficiency, we assume that a single time slot is dedicated to relaying. Conventional network-coding-based cooperation (NCC) selects the best relay, which uses network coding to serve the two sources simultaneously. The bit error rate (BER) performance of NCC with channel coding, however, is still unknown. In this paper, we first study the BER of NCC via a closed-form expression and analytically show that NCC only achieves diversity of order two, regardless of the number of available relays and the channel code. Second, we propose a novel partial relaying-based cooperation (PARC) scheme to improve the system diversity in the finite signal-to-noise ratio (SNR) regime. In particular, closed-form expressions for the system BER and diversity order of PARC are derived as a function of the operating SNR value and the minimum distance of the channel code. We analytically show that the proposed PARC achieves full (instantaneous) diversity order in the finite-SNR regime, given that an appropriate channel code is used. Finally, numerical results verify our analysis and demonstrate a large SNR gain of PARC over NCC in the SNR region of interest.

  19. Protograph-Based Raptor-Like Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Chen, Tsung-Yi; Wang, Jiadong; Wesel, Richard D.

    2014-01-01

Theoretical analysis has long indicated that feedback improves the error exponent but not the capacity of point-to-point memoryless channels. Analytic and empirical results indicate that, in the short-blocklength regime, practical rate-compatible punctured convolutional (RCPC) codes achieve low latency with the use of noiseless feedback. In 3GPP, standard rate-compatible turbo (RCPT) codes did not outperform convolutional codes in the short-blocklength regime. The reason is that convolutional codes with a small number of states can be decoded optimally using the Viterbi decoder. Despite the excellent performance of convolutional codes at very short blocklengths, their strength does not scale with blocklength for a fixed number of trellis states.

  20. Performance analysis of optical wireless communication system based on two-fold turbo code

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Huang, Dexiu; Yuan, Xiuhua

    2005-11-01

Optical wireless communication (OWC) is beginning to emerge in the telecommunications market as a strategy to meet last-mile demand, owing to its unique combination of features. Turbo codes have an impressive near-Shannon-limit error-correcting performance, and twofold turbo codes have recently been introduced as the least complex member of the multifold turbo code family. In this paper, we first present the mathematical model of the signal and of the optical wireless channel with fading, together with a bit error rate model that accounts for scintillation; we then apply twofold turbo coding to the OWC system and obtain a better BER curve than with a common turbo code.

  1. General practitioners' justifications for therapeutic inertia in cardiovascular prevention: an empirically grounded typology.

    PubMed

    Lebeau, Jean-Pierre; Cadwallader, Jean-Sébastien; Vaillant-Roussel, Hélène; Pouchain, Denis; Yaouanc, Virginie; Aubin-Auger, Isabelle; Mercier, Alain; Rusch, Emmanuel; Remmen, Roy; Vermeire, Etienne; Hendrickx, Kristin

    2016-05-13

    To construct a typology of general practitioners' (GPs) responses regarding their justification of therapeutic inertia in cardiovascular primary prevention for high-risk patients with hypertension. Empirically grounded construction of typology. Types were defined by attributes derived from the qualitative analysis of GPs' reported reasons for inaction. 256 GPs randomised in the intervention group of a cluster randomised controlled trial. GPs members of 23 French Regional Colleges of Teachers in General Practice, included in the EffectS of a multifaceted intervention on CArdiovascular risk factors in high-risk hyPErtensive patients (ESCAPE) trial. The database consisted of 2638 written responses given by the GPs to an open-ended question asking for the reasons why drug treatment was not changed as suggested by the national guidelines. All answers were coded using constant comparison analysis. A matrix analysis of codes per GP allowed the construction of a response typology, where types were defined by codes as attributes. Initial coding and definition of types were performed independently by two teams. Initial coding resulted in a list of 69 codes in the final codebook, representing 4764 coded references in the question responses. A typology including seven types was constructed. 100 GPs were allocated to one and only one of these types, while 25 GPs did not provide enough data to allow classification. Types (numbers of GPs allocated) were: 'optimists' (28), 'negotiators' (20), 'checkers' (15), 'contextualisers' (13), 'cautious' (11), 'rounders' (8) and 'scientists' (5). For the 36 GPs that provided 50 or more coded references, analysis of the code evolution over time and across patients showed a consistent belonging to the initial type for any given GP. 
This typology could provide GPs with some insight into their general ways of considering changes in the treatment/management of cardiovascular risk factors and guide design of specific physician-centred interventions to reduce inappropriate inaction. NCT00348855.

  2. AphasiaBank: a resource for clinicians.

    PubMed

    Forbes, Margaret M; Fromm, Davida; MacWhinney, Brian

    2012-08-01

    AphasiaBank is a shared, multimedia database containing videos and transcriptions of ~180 aphasic individuals and 140 nonaphasic controls performing a uniform set of discourse tasks. The language in the videos is transcribed in Codes for the Human Analysis of Transcripts (CHAT) format and coded for analysis with Computerized Language ANalysis (CLAN) programs, which can perform a wide variety of language analyses. The database and the CLAN programs are freely available to aphasia researchers and clinicians for educational, clinical, and scholarly uses. This article describes the database, suggests some ways in which clinicians and clinician researchers might find these materials useful, and introduces a new language analysis program, EVAL, designed to streamline the transcription and coding processes, while still producing an extensive and useful language profile.

  3. COBRA-SFS thermal-hydraulic analysis code for spent fuel storage and transportation casks: Models and methods

    DOE PAGES

    Michener, Thomas E.; Rector, David R.; Cuta, Judith M.

    2017-09-01

    COBRA-SFS, a thermal-hydraulics code developed for steady-state and transient analysis of multi-assembly spent-fuel storage and transportation systems, has been incorporated into the Used Nuclear Fuel-Storage, Transportation and Disposal Analysis Resource and Data System tool as a module devoted to spent fuel package thermal analysis. This paper summarizes the basic formulation of the equations and models used in the COBRA-SFS code, showing that COBRA-SFS fully captures the important physical behavior governing the thermal performance of spent fuel storage systems, with internal and external natural convection flow patterns, and heat transfer by convection, conduction, and thermal radiation. Of particular significance is the capability for detailed thermal radiation modeling within the fuel rod array.

  5. Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1989-01-01

    An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.

  6. Statistical inference of static analysis rules

    NASA Technical Reports Server (NTRS)

    Engler, Dawson Richards (Inventor)

    2009-01-01

    Various apparatus and methods are disclosed for identifying errors in program code. Respective numbers of observances of at least one correctness rule by different code instances that relate to the at least one correctness rule are counted in the program code. Each code instance has an associated counted number of observances of the correctness rule by the code instance. Also counted are respective numbers of violations of the correctness rule by different code instances that relate to the correctness rule. Each code instance has an associated counted number of violations of the correctness rule by the code instance. A respective likelihood of the validity is determined for each code instance as a function of the counted number of observances and counted number of violations. The likelihood of validity indicates a relative likelihood that a related code instance is required to observe the correctness rule. The violations may be output in order of the likelihood of validity of a violated correctness rule.
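The counting-and-ranking idea in the patent abstract above can be sketched briefly: tally observances and violations of a candidate rule per code instance, score how strongly the evidence suggests the rule is real, and report violations of the highest-scoring rules first. The z-score ranking below is a common statistical choice assumed for illustration; the patent does not prescribe a specific formula, and the instance names are made up.

```python
import math

def rank_violations(stats):
    """stats: {instance: (observances, violations)} for one candidate rule
    per instance. Returns violations ranked by likelihood of validity."""
    ranked = []
    for inst, (obs, viol) in stats.items():
        n = obs + viol
        if n == 0 or viol == 0:
            continue                      # nothing to report
        p = obs / n                       # observed rate of rule-following
        # z-score against the null that observance is coin-flip chance
        z = (p - 0.5) / math.sqrt(0.25 / n)
        ranked.append((z, inst, viol))
    ranked.sort(reverse=True)             # most likely real rules first
    return ranked

# "lock_a" is followed 98 times and violated twice: probably a real rule,
# so its 2 violations are likely bugs. "lock_b" shows no clear pattern.
stats = {"lock_a": (98, 2), "lock_b": (3, 3)}
for z, inst, viol in rank_violations(stats):
    print(f"{inst}: {viol} violation(s), z={z:.1f}")
```

Instances with many observances and few violations float to the top, which is exactly where a reviewer's attention is best spent.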

  7. NSEG, a segmented mission analysis program for low and high speed aircraft. Volume 1: Theoretical development

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Rozendaal, H. L.

    1977-01-01

    A rapid mission analysis code based on the use of approximate flight-path equations of motion is presented. The equation form varies with the segment type: for example, accelerations, climbs, cruises, descents, and decelerations. Realistic and detailed characteristics were specified in tabular form. The code also contains extensive flight envelope performance mapping capabilities. Approximate takeoff and landing analyses were performed. At high speeds, centrifugal lift effects were accounted for. Extensive turbojet and ramjet engine scaling procedures were incorporated in the code.

  8. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

    An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high-power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing are covered, such as signal statistics, power spectrum and autocovariance analysis, and outlier-removal techniques.
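One of the frequently used pulse codes in this radar context can be illustrated concretely: the 13-bit Barker code, whose aperiodic autocorrelation has a peak of 13 while every sidelobe has magnitude at most 1, the property that makes coded pulses decodable with low range sidelobes. (The Barker code itself is standard; its use here is an illustrative example, not a claim about any specific radar in the survey.)

```python
# 13-bit Barker code: + + + + + - - + + - + - +
barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]

def autocorr(code, lag):
    """Aperiodic autocorrelation of a +/-1 sequence at a given lag."""
    return sum(code[i] * code[i + lag] for i in range(len(code) - lag))

for lag in range(len(barker13)):
    print(lag, autocorr(barker13, lag))   # peak 13 at lag 0, sidelobes <= 1
```

After matched filtering (correlating the received signal with the code), the 13:1 peak-to-sidelobe ratio concentrates the transmitted energy into a single range gate.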

  9. The study on dynamic cadastral coding rules based on kinship relationship

    NASA Astrophysics Data System (ADS)

    Xu, Huan; Liu, Nan; Liu, Renyi; Lu, Jingfeng

    2007-06-01

    Cadastral coding rules are an important supplement to the existing national and local standard specifications for building cadastral database. After analyzing the course of cadastral change, especially the parcel change with the method of object-oriented analysis, a set of dynamic cadastral coding rules based on kinship relationship corresponding to the cadastral change is put forward and a coding format composed of street code, block code, father parcel code, child parcel code and grandchild parcel code is worked out within the county administrative area. The coding rule has been applied to the development of an urban cadastral information system called "ReGIS", which is not only able to figure out the cadastral code automatically according to both the type of parcel change and the coding rules, but also capable of checking out whether the code is spatiotemporally unique before the parcel is stored in the database. The system has been used in several cities of Zhejiang Province and got a favorable response. This verifies the feasibility and effectiveness of the coding rules to some extent.
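The hierarchical code format described above (street, block, father parcel, child parcel, grandchild parcel) can be sketched as simple compose/parse helpers. The field widths below are illustrative assumptions; the actual specification used in ReGIS may differ.

```python
# Assumed (hypothetical) fixed field widths for each code component.
WIDTHS = {"street": 3, "block": 3, "father": 4, "child": 3, "grandchild": 3}

def make_code(street, block, father, child=0, grandchild=0):
    """Compose a zero-padded hierarchical parcel code."""
    parts = [street, block, father, child, grandchild]
    return "".join(str(v).zfill(w) for v, w in zip(parts, WIDTHS.values()))

def parse_code(code):
    """Split a parcel code back into its named components."""
    fields, pos = {}, 0
    for name, w in WIDTHS.items():
        fields[name] = int(code[pos:pos + w])
        pos += w
    return fields

# When a parcel is split, each new parcel keeps the father parcel code
# and gets a fresh child code, so kinship stays visible in the code itself.
c = make_code(street=12, block=7, father=45, child=2)
print(c)  # -> 0120070045002000
```

Because descendant codes embed their ancestor codes, uniqueness over the parcel's history can be checked by a plain string lookup before a new parcel is stored in the database.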

  10. Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom

    2010-06-01

    The need for a defendable and systematic Uncertainty and Sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.

  11. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  12. Nonlinear static and dynamic finite element analysis of an eccentrically loaded graphite-epoxy beam

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Jackson, Karen E.; Jones, Lisa E.

    1991-01-01

    The Dynamic Crash Analysis of Structures (DYCAST) and NIKE3D nonlinear finite element codes were used to model the static and impulsive response of an eccentrically loaded graphite-epoxy beam. A 48-ply unidirectional composite beam was tested under an eccentric axial compressive load until failure. This loading configuration was chosen to highlight the capabilities of two finite element codes for modeling a highly nonlinear, large deflection structural problem which has an exact solution. These codes are currently used to perform dynamic analyses of aircraft structures under impact loads to study crashworthiness and energy absorbing capabilities. Both beam and plate element models were developed to compare with the experimental data using the DYCAST and NIKE3D codes.

  13. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

    An open cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e., thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. This code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes the weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.
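The kind of cycle state-point calculation such a code performs can be sketched for an ideal open Brayton cycle. Constant specific heats, isentropic components, and the chosen numbers are simplifying assumptions for illustration only; a production code adds component efficiencies, pressure losses, and variable gas properties.

```python
# Ideal open Brayton cycle state points and thermal efficiency.
gamma, cp = 1.4, 1005.0        # air: ratio of specific heats, J/(kg*K)
T1, P_ratio = 288.0, 8.0       # compressor inlet temperature (K), pressure ratio
T3 = 1200.0                    # turbine inlet temperature (K)

k = (gamma - 1) / gamma
T2 = T1 * P_ratio ** k         # after isentropic compression
T4 = T3 / P_ratio ** k         # after isentropic expansion

w_net = cp * ((T3 - T4) - (T2 - T1))   # net specific work, J/kg
q_in = cp * (T3 - T2)                  # heat added in the combustor, J/kg
eta = w_net / q_in                     # cycle thermal efficiency
print(f"eta = {eta:.3f}")              # -> eta = 0.448
```

For the ideal cycle this reduces to the textbook result eta = 1 - P_ratio^(-(gamma-1)/gamma), a useful sanity check on the state-point arithmetic.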

  14. Translating an AI application from Lisp to Ada: A case study

    NASA Technical Reports Server (NTRS)

    Davis, Gloria J.

    1991-01-01

    A set of benchmarks was developed to test the performance of a newly designed computer executing both Lisp and Ada. Among these was AutoClassII, a large Artificial Intelligence (AI) application written in Common Lisp. The extraction of a representative subset of this complex application was aided by a Lisp Code Analyzer (LCA). The LCA enabled rapid analysis of the code, putting it in a concise and functionally readable form. An equivalent benchmark was created in Ada through manual translation of the Lisp version. A comparison of the execution results of both programs across a variety of compiler-machine combinations indicates that line-by-line translation, coupled with analysis of the initial code, can produce relatively efficient and reusable target code.

  15. RELAP-7 Code Assessment Plan and Requirement Traceability Matrix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.

    2016-10-01

    RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods, and physical models over the last decades. Recently, INL has also been putting effort into establishing a code assessment plan, which aims to ensure an improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements. To this end, we first survey the literature (i.e., international/domestic reports and research articles) addressing the desirable features generally required for advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts as well as the legacy issues of several recently developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and identify the code development process and its present capability.

  16. AlleleCoder: a PERL script for coding codominant polymorphism data for PCA analysis

    USDA-ARS?s Scientific Manuscript database

    A useful biological interpretation of diploid heterozygotes is in terms of the dose of the common allele (0, 1 or 2 copies). We have developed a PERL script that converts FASTA files into coded spreadsheets suitable for Principal Component Analysis (PCA). In combination with R and R Commander, two- ...
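The allele-dose recoding described above can be sketched directly: each diploid genotype becomes the count of the common allele (0, 1, or 2), yielding a numeric matrix ready for PCA. The genotype data and helper names below are made up for illustration; AlleleCoder itself is a PERL script operating on FASTA input.

```python
def common_allele(genotypes):
    """Return the most frequent allele at a site, counting both copies."""
    counts = {}
    for g in genotypes:
        for allele in g:
            counts[allele] = counts.get(allele, 0) + 1
    return max(counts, key=counts.get)

def dose_code(genotypes):
    """Code each diploid genotype as the dose (0, 1, 2) of the common allele."""
    ref = common_allele(genotypes)
    return [g.count(ref) for g in genotypes]

site = ["AA", "AG", "GG", "AA", "AG"]   # one polymorphic site, 5 individuals
print(dose_code(site))  # -> [2, 1, 0, 2, 1]
```

Applying this per site produces an individuals-by-sites integer matrix, exactly the kind of spreadsheet that PCA routines in R / R Commander accept.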

  17. Development of PRIME for irradiation performance analysis of U-Mo/Al dispersion fuel

    NASA Astrophysics Data System (ADS)

    Jeong, Gwan Yoon; Kim, Yeon Soo; Jeong, Yong Jin; Park, Jong Man; Sohn, Dong-Seong

    2018-04-01

    A prediction code for the thermo-mechanical performance of research reactor fuel (PRIME) has been developed, implementing newly developed models to analyze the irradiation behavior of U-Mo dispersion fuel. The code is capable of predicting the two-dimensional thermal and mechanical performance of U-Mo dispersion fuel during irradiation. A finite element method was employed to solve the governing equations for thermal and mechanical equilibria. Temperature- and burnup-dependent material properties of the fuel meat constituents and cladding were used. The numerical solution schemes in PRIME were verified by benchmarking against solutions obtained using a commercial finite element analysis program (ABAQUS). The code was validated using irradiation data from the RERTR, HAMP-1, and E-FUTURE tests. The measured irradiation data used in the validation were interaction layer (IL) thickness and volume fractions of fuel meat constituents for the thermal analysis, and profiles of the plate thickness changes and fuel meat swelling for the mechanical analysis. The prediction results were in good agreement with the measurement data for both thermal and mechanical analyses, confirming the validity of the code.

  18. Stress Analysis and Fracture in Nanolaminate Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A stress analysis is performed on a nanolaminate subjected to bending. A composite mechanics computer code that is based on constituent properties and nanoelement formulation is used to evaluate the nanolaminate stresses. The results indicate that the computer code is sufficient for the analysis. The results also show that when a stress concentration is present, the nanolaminate stresses exceed their corresponding matrix-dominated strengths and the nanofiber fracture strength.

  19. Structural Damage Prediction and Analysis for Hypervelocity Impact. BUMPERII Suggestion and Problem Reports

    NASA Technical Reports Server (NTRS)

    1995-01-01

    In the course of preparing the SD_SURF space debris analysis code, several problems and possibilities for improvement of the BUMPERII code were documented and sent to MSFC. These suggestions and problem reports are included here as part of the contract final report. They cover reducing BUMPERII memory requirements, problems compiling BUMPERII, a FORTRAN-lint analysis of BUMPERII, and an error in the function PRV in BUMPERII.

  20. Thermal-hydraulic interfacing code modules for CANDU reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, W.S.; Gold, M.; Sills, H.

    1997-07-01

    The approach for CANDU reactor safety analysis in Ontario Hydro Nuclear (OHN) and Atomic Energy of Canada Limited (AECL) is presented. Reflecting the unique characteristics of CANDU reactors, the procedure of coupling the thermal-hydraulics, reactor physics and fuel channel/element codes in the safety analysis is described. The experience generated in the Canadian nuclear industry may be useful to other types of reactors in the areas of reactor safety analysis.

  1. Automated Discovery of Machine-Specific Code Improvements

    DTIC Science & Technology

    1984-12-01

    operation of the source language. Additional analysis may reveal special features of the target architecture that may be exploited to generate efficient code. Such analysis is optional. Early phases incorporate knowledge of the source language, but do not refer to features of the target machine. These early phases are sometimes referred to as the

  2. Guide to AERO2S and WINGDES Computer Codes for Prediction and Minimization of Drag Due to Lift

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Chu, Julio; Ozoroski, Lori P.; McCullers, L. Arnold

    1997-01-01

    The computer codes AERO2S and WINGDES are now widely used for the analysis and design of airplane lifting surfaces under conditions that tend to induce flow separation. These codes have undergone continued development to provide additional capabilities since the introduction of the original versions over a decade ago. This code development has been reported in a variety of publications (NASA technical papers, NASA contractor reports, and society journals); some modifications have not been publicized at all. Users of these codes have suggested the desirability of combining in a single document the descriptions of the code development, an outline of the features of each code, and suggestions for effective code usage. This report is intended to supply that need.

  3. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
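A tiny flavor of the semantic checking described above, not the paper's actual parsers, can be given with dimensional analysis: once primitive variables carry declared physical dimensions, a checker can flag dimensionally inconsistent assignments. The dimension table and the single statement checked here are illustrative assumptions.

```python
# Declared dimensions for some primitive variables (unit -> exponent).
DIMS = {"velocity": {"m": 1, "s": -1}, "time": {"s": 1}, "length": {"m": 1}}

def dim_mul(a, b):
    """Dimension of a product: add unit exponents, drop zeros."""
    out = dict(a)
    for unit, power in b.items():
        out[unit] = out.get(unit, 0) + power
    return {u: p for u, p in out.items() if p}

# Check the annotated statement: length_var = velocity_var * time_var
rhs = dim_mul(DIMS["velocity"], DIMS["time"])
print(rhs == DIMS["length"])  # -> True: dimensionally consistent
```

A real semantic parser would walk the program's expression trees and apply rules like this one at every assignment, reporting mismatches as probable physics errors.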

  4. Implicit time-integration method for simultaneous solution of a coupled non-linear system

    NASA Astrophysics Data System (ADS)

    Watson, Justin Kyle

    Historically, large physical problems have been divided into smaller problems based on the physics involved. This is no different in reactor safety analysis. The problem of analyzing a nuclear reactor for design basis accidents is performed by a handful of computer codes, each solving a portion of the problem. The reactor thermal-hydraulic response to an event is determined using a system code like the TRAC/RELAP Advanced Computational Engine (TRACE). The core power response to the same accident scenario is determined using a core physics code like the Purdue Advanced Core Simulator (PARCS). Containment response to the reactor depressurization in a Loss Of Coolant Accident (LOCA) type event is calculated by a separate code, and sub-channel analysis is performed with yet another computer code. This is just a sample of the computer codes used to solve the overall problem of nuclear reactor design basis accidents. Traditionally, each of these codes operates independently from the others, using only the global results from one calculation as boundary conditions to another. Industry's drive to uprate reactor power has motivated analysts to move from a conservative approach to design basis accidents towards a best-estimate method. To achieve a best-estimate calculation, efforts have been aimed at coupling the individual physics models to improve the accuracy of the analysis and reduce margins. The current coupling techniques are sequential in nature: during a calculation time-step, data are passed between the two codes, and the individual codes solve their portion of the calculation and converge to a solution before the calculation is allowed to proceed to the next time-step. This thesis presents a fully implicit method for simultaneously solving the neutron balance equations, heat conduction equations, and constitutive fluid dynamics equations.
    It discusses the problems involved in coupling different physics phenomena within multi-physics codes and presents a solution to these problems. The thesis also outlines the basic concepts behind the nodal balance equations, heat transfer equations, and thermal-hydraulic equations, which are coupled to form a fully implicit nonlinear system of equations. Coupling separate physics models to solve a larger problem and improve the accuracy and efficiency of a calculation is not a new idea; implementing the coupling implicitly and solving the system simultaneously is. The application to reactor safety codes is also new: it has not been done with thermal-hydraulics and neutronics codes on realistic applications in the past. The coupling technique described in this thesis is applicable to other similar coupled thermal-hydraulic and core physics reactor safety codes. The technique is demonstrated using coupled input decks to show that the system is solved correctly, and then verified using two derivative test problems based on international benchmarks: the OECD/NRC Three Mile Island (TMI) Main Steam Line Break (MSLB) problem (representative of pressurized water reactor analysis) and the OECD/NRC Peach Bottom (PB) Turbine Trip (TT) benchmark (representative of boiling water reactor analysis).
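
The difference between sequential coupling and the fully implicit simultaneous solve described above can be sketched on a toy two-field system. Everything below (the feedback law, the coefficients, the variable names) is an illustrative assumption, not the thesis's actual neutronics or thermal-hydraulics equations; a Newton iteration on the stacked residual updates both fields at once:

```python
import numpy as np

# Toy stand-in for a coupled nonlinear system (illustrative assumptions only):
#   f1: P - exp(-alpha * T) = 0   "power" falls as "temperature" rises
#   f2: T - T0 - k * P      = 0   "temperature" rises with "power"
alpha, k, T0 = 0.01, 50.0, 300.0

def residual(x):
    P, T = x
    return np.array([P - np.exp(-alpha * T),
                     T - T0 - k * P])

def jacobian(x):
    P, T = x
    # the full Jacobian couples both fields, so the feedback between them
    # is resolved inside a single linear solve
    return np.array([[1.0, alpha * np.exp(-alpha * T)],
                     [-k, 1.0]])

# Newton iteration on the combined system: both unknowns advance together,
# with no outer "pass data between codes" loop.
x = np.array([1.0, 300.0])
for _ in range(50):
    F = residual(x)
    if np.linalg.norm(F) < 1e-12:
        break
    x = x - np.linalg.solve(jacobian(x), F)
P, T = x
```

At convergence both equations are satisfied simultaneously (here T settles near 302.4); in a sequential scheme the same consistency requires repeated data exchange within every time step.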

  5. Chromaticity calculations and code comparisons for x-ray lithography source XLS and SXLS rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsa, Z.

    1988-06-16

    This note presents the chromaticity calculations and code comparison results for the (x-ray lithography source) XLS (Chasman Green, XUV Cosy lattice) and (2 magnet 4T) SXLS lattices, with the standard beam optics codes, including the programs SYNCH88.5, MAD6, PATRICIA88.4, PATPET88.2, DIMAD, BETA, and MARYLIE. This analysis is part of our ongoing accelerator physics code studies. 4 figs., 10 tabs.

  6. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife-to-knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia but maintains the narrow-groove theory. The KTK labyrinth seal code handles straight or stepped seals, and DYSEAL provides dynamics for the seal geometry.

  7. TRAC-PD2 posttest analysis of the CCTF Evaluation-Model Test C1-19 (Run 38). [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Motley, F.

    The results of a Transient Reactor Analysis Code (TRAC) posttest analysis of the Cylindrical Core Test Facility Evaluation-Model Test agree very well with the results of the experiment. The good agreement obtained verifies the multidimensional analysis capability of the TRAC code. Because of the steep radial power profile, the importance of using fine noding in the core region was demonstrated (as compared with the poorer results obtained from an earlier pretest prediction that used a coarsely noded model).

  8. Current Research on Non-Coding Ribonucleic Acid (RNA).

    PubMed

    Wang, Jing; Samuels, David C; Zhao, Shilin; Xiang, Yu; Zhao, Ying-Yong; Guo, Yan

    2017-12-05

    Non-coding ribonucleic acid (RNA) has without a doubt captured the interest of biomedical researchers. The ability to screen the entire human genome with high-throughput sequencing technology has greatly enhanced the identification, annotation and prediction of the functionality of non-coding RNAs. In this review, we discuss the current landscape of non-coding RNA research and quantitative analysis. Non-coding RNA will be categorized into two major groups by size: long non-coding RNAs and small RNAs. In long non-coding RNA, we discuss regular long non-coding RNA, pseudogenes and circular RNA. In small RNA, we discuss miRNA, transfer RNA, piwi-interacting RNA, small nucleolar RNA, small nuclear RNA, Y RNA, signal recognition particle RNA, and 7SK RNA. We elaborate on the origin, detection method, and potential association with disease, putative functional mechanisms, and public resources for these non-coding RNAs. We aim to provide readers with a complete overview of non-coding RNAs and incite additional interest in non-coding RNA research.

  9. Systematic analysis of coding and noncoding DNA sequences using methods of statistical linguistics

    NASA Technical Reports Server (NTRS)

    Mantegna, R. N.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Peng, C. K.; Simons, M.; Stanley, H. E.

    1995-01-01

    We compare the statistical properties of coding and noncoding regions in eukaryotic and viral DNA sequences by adapting two tests developed for the analysis of natural languages and symbolic sequences. The data set comprises all 30 sequences of length above 50 000 base pairs in GenBank Release No. 81.0, as well as the recently published sequences of C. elegans chromosome III (2.2 Mbp) and yeast chromosome XI (661 Kbp). We find that for the three chromosomes we studied the statistical properties of noncoding regions appear to be closer to those observed in natural languages than those of coding regions. In particular, (i) an n-tuple Zipf analysis of noncoding regions reveals a regime close to power-law behavior, while the coding regions show logarithmic behavior over a wide interval, and (ii) an n-gram entropy measurement shows that the noncoding regions have a lower n-gram entropy (and hence a larger "n-gram redundancy") than the coding regions. In contrast to the three chromosomes, we find that for vertebrates such as primates and rodents and for viral DNA, the difference between the statistical properties of coding and noncoding regions is not pronounced, and therefore the results of the analyses of the investigated sequences are less conclusive. After noting the intrinsic limitations of the n-gram redundancy analysis, we also briefly discuss the failure of zeroth- and first-order Markovian models or simple nucleotide repeats to account fully for these "linguistic" features of DNA. Finally, we emphasize that our results by no means prove the existence of a "language" in noncoding DNA.
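
The two linguistic measures named above can be sketched directly. The short strings below are illustrative stand-ins, not GenBank sequences, and a real analysis would sweep n over much longer windows:

```python
from collections import Counter
import math

def ngram_counts(seq, n):
    """Count all overlapping n-tuples in a symbol sequence."""
    return Counter(seq[i:i + n] for i in range(len(seq) - n + 1))

def zipf_ranks(seq, n):
    """n-gram frequencies in decreasing order, as plotted in a Zipf
    rank-frequency analysis."""
    return sorted(ngram_counts(seq, n).values(), reverse=True)

def ngram_entropy(seq, n):
    """Shannon entropy (bits) of the n-gram distribution; lower entropy
    means larger n-gram redundancy."""
    counts = ngram_counts(seq, n)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Illustrative sequences (not real data): a repetitive string is more
# "redundant" (lower n-gram entropy) than a varied one.
repetitive = "ATATATATATATATATATAT"
varied = "ATCGGCTAGCATCCGATGCA"
```

For the repetitive string the 2-gram rank-frequency list collapses to just two entries (AT and TA), while the varied string spreads its counts over many 2-grams and so has a higher 2-gram entropy.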

  10. Long-range correlation properties of coding and noncoding DNA sequences: GenBank analysis.

    PubMed

    Buldyrev, S V; Goldberger, A L; Havlin, S; Mantegna, R N; Matsa, M E; Peng, C K; Simons, M; Stanley, H E

    1995-05-01

    An open question in computational molecular biology is whether long-range correlations are present in both coding and noncoding DNA or only in the latter. To answer this question, we consider all 33301 coding and all 29453 noncoding eukaryotic sequences--each of length larger than 512 base pairs (bp)--in the present release of the GenBank to determine whether there is any statistically significant distinction in their long-range correlation properties. Standard fast Fourier transform (FFT) analysis indicates that coding sequences have practically no correlations in the range from 10 bp to 100 bp (spectral exponent beta=0.00 +/- 0.04, where the uncertainty is two standard deviations). In contrast, for noncoding sequences, the average value of the spectral exponent beta is positive (0.16 +/- 0.05) which unambiguously shows the presence of long-range correlations. We also separately analyze the 874 coding and the 1157 noncoding sequences that have more than 4096 bp and find a larger region of power-law behavior. We calculate the probability that these two data sets (coding and noncoding) were drawn from the same distribution and we find that it is less than 10(-10). We obtain independent confirmation of these findings using the method of detrended fluctuation analysis (DFA), which is designed to treat sequences with statistical heterogeneity, such as DNA's known mosaic structure ("patchiness") arising from the nonstationarity of nucleotide concentration. The near-perfect agreement between the two independent analysis methods, FFT and DFA, increases the confidence in the reliability of our conclusion.
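
The DFA method mentioned above can be sketched in a few lines. The input here is a random control sequence (an illustrative assumption, not GenBank data): its increments are uncorrelated, so the fitted scaling exponent should come out near 0.5, whereas a long-range correlated sequence yields a larger value:

```python
import numpy as np

def walk_increments(seq):
    """Map nucleotides to +/-1 steps (purine/pyrimidine rule of the DNA walk)."""
    return np.array([1.0 if b in "AG" else -1.0 for b in seq])

def dfa(x, box_sizes):
    """Detrended fluctuation analysis: rms fluctuation F(n) of the
    integrated sequence after removing a linear trend in each box of size n."""
    y = np.cumsum(x - np.mean(x))                # integrated profile
    F = []
    for n in box_sizes:
        msq = []
        for b in range(len(y) // n):
            seg = y[b * n:(b + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear fit
            msq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    return np.array(F)

# Control: uncorrelated +/-1 steps, for which F(n) ~ n^0.5 is expected.
rng = np.random.default_rng(0)
white = rng.choice([-1.0, 1.0], size=4096)
sizes = np.array([8, 16, 32, 64, 128])
F = dfa(white, sizes)
alpha = np.polyfit(np.log(sizes), np.log(F), 1)[0]   # scaling exponent
```

For a sequence with positive spectral exponent beta (as reported above for noncoding DNA), the same fit yields alpha > 0.5, via the relation alpha = (1 + beta)/2.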

  11. Long-range correlation properties of coding and noncoding DNA sequences: GenBank analysis

    NASA Technical Reports Server (NTRS)

    Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Mantegna, R. N.; Matsa, M. E.; Peng, C. K.; Simons, M.; Stanley, H. E.

    1995-01-01

    An open question in computational molecular biology is whether long-range correlations are present in both coding and noncoding DNA or only in the latter. To answer this question, we consider all 33301 coding and all 29453 noncoding eukaryotic sequences--each of length larger than 512 base pairs (bp)--in the present release of the GenBank to determine whether there is any statistically significant distinction in their long-range correlation properties. Standard fast Fourier transform (FFT) analysis indicates that coding sequences have practically no correlations in the range from 10 bp to 100 bp (spectral exponent beta=0.00 +/- 0.04, where the uncertainty is two standard deviations). In contrast, for noncoding sequences, the average value of the spectral exponent beta is positive (0.16 +/- 0.05) which unambiguously shows the presence of long-range correlations. We also separately analyze the 874 coding and the 1157 noncoding sequences that have more than 4096 bp and find a larger region of power-law behavior. We calculate the probability that these two data sets (coding and noncoding) were drawn from the same distribution and we find that it is less than 10(-10). We obtain independent confirmation of these findings using the method of detrended fluctuation analysis (DFA), which is designed to treat sequences with statistical heterogeneity, such as DNA's known mosaic structure ("patchiness") arising from the nonstationarity of nucleotide concentration. The near-perfect agreement between the two independent analysis methods, FFT and DFA, increases the confidence in the reliability of our conclusion.

  12. LSENS, a general chemical kinetics and sensitivity analysis code for homogeneous gas-phase reactions. 2: Code description and usage

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; Bittker, David A.

    1994-01-01

    LSENS, the Lewis General Chemical Kinetics Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 2 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 2 describes the code, how to modify it, and its usage, including preparation of the problem data file required to execute LSENS. Code usage is illustrated by several example problems, which further explain preparation of the problem data file and show how to obtain desired accuracy in the computed results. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static system; steady, one-dimensional, inviscid flow; reaction behind incident shock wave, including boundary layer correction; and perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions. Part 1 (NASA RP-1328) derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved by LSENS. Part 3 (NASA RP-1330) explains the kinetics and kinetics-plus-sensitivity-analysis problems supplied with LSENS and presents sample results.
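
The kinetics-plus-sensitivity idea can be illustrated on the simplest possible problem. This one-reaction sketch is an assumption for illustration (forward Euler on a non-stiff ODE); LSENS itself handles stiff multi-reaction systems with implicit methods:

```python
import math

# Toy problem (illustrative assumption): first-order reaction A -> B,
#   dy/dt = -k*y,  y(0) = y0.
# Differentiating the ODE with respect to the rate coefficient k gives the
# sensitivity equation for s = dy/dk:
#   ds/dt = -k*s - y,  s(0) = 0.

def integrate(k, y0, dt, nsteps):
    """March the state y and its sensitivity s = dy/dk forward together."""
    y, s = y0, 0.0
    for _ in range(nsteps):
        dy = -k * y
        ds = -k * s - y      # evaluate both rates before updating either state
        y += dt * dy
        s += dt * ds
    return y, s

# Exact answers at t = 1: y = y0*exp(-k) and s = dy/dk = -t*y0*exp(-k).
y, s = integrate(k=2.0, y0=1.0, dt=1e-4, nsteps=10000)
```

The negative sensitivity says a larger rate coefficient lowers the remaining reactant; this is the kind of coefficient LSENS reports for each dependent variable with respect to each rate parameter.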

  13. Factors Affecting Christian Parents' School Choice Decision Processes: A Grounded Theory Study

    ERIC Educational Resources Information Center

    Prichard, Tami G.; Swezey, James A.

    2016-01-01

    This study identifies factors affecting the decision processes for school choice by Christian parents. Grounded theory design incorporated interview transcripts, field notes, and a reflective journal to analyze themes. Comparative analysis, including open, axial, and selective coding, was used to reduce the coded statements to five code families:…

  14. Modified NASA-Lewis chemical equilibrium code for MHD applications

    NASA Technical Reports Server (NTRS)

    Sacks, R. A.; Geyer, H. K.; Grammel, S. J.; Doss, E. D.

    1979-01-01

    A substantially modified version of the NASA-Lewis Chemical Equilibrium Code was recently developed. The modifications were designed to extend the power and convenience of the Code as a tool for performing combustor analysis for MHD systems studies. The effect of the programming details is described from a user point of view.

  15. Coupling of TRAC-PF1/MOD2, Version 5.4.25, with NESTLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knepper, P.L.; Hochreiter, L.E.; Ivanov, K.N.

    1999-09-01

    A three-dimensional (3-D) spatial kinetics capability within a thermal-hydraulics system code provides a more correct description of the core physics during reactor transients that involve significant variations in the neutron flux distribution. Coupled codes provide the ability to forecast safety margins in a best-estimate manner. The behavior of a reactor core and the feedback to the plant dynamics can be accurately simulated. For each time step, coupled codes are capable of resolving system interaction effects on neutronics feedback and are capable of describing local neutronics effects caused by the thermal hydraulics and neutronics coupling. With the improvements in computational technology, modeling complex reactor behaviors with coupled thermal hydraulics and spatial kinetics is feasible. Previously, reactor analysis codes were limited to either a detailed thermal-hydraulics model with simplified kinetics or multidimensional neutron kinetics with a simplified thermal-hydraulics model. The authors discuss the coupling of the Transient Reactor Analysis Code (TRAC)-PF1/MOD2, Version 5.4.25, with the NESTLE code.
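
The per-time-step exchange-and-converge pattern described above can be sketched with two hypothetical stand-in modules; the feedback law and coefficients are illustrative assumptions, not TRAC or NESTLE physics:

```python
import math

def core_power(T):
    """Stand-in 'neutronics': core power with a negative temperature feedback."""
    return math.exp(-0.005 * (T - 300.0))

# Stand-in 'thermal hydraulics': implicit update of a lumped fuel
# temperature, heated by the core power and relaxing toward 300.
T, dt = 350.0, 0.1
for step in range(300):                    # outer time-step loop
    T_new = T
    for _ in range(50):                    # inner loop: exchange data between
        P = core_power(T_new)              # the modules until they agree
        T_next = T + dt * (20.0 * P - 0.2 * (T_new - 300.0))
        if abs(T_next - T_new) < 1e-12:
            break
        T_new = T_next
    T = T_new                              # advance only after convergence
```

The inner loop is the "converge before proceeding to the next time-step" behavior the abstract describes; at steady state the heating and cooling terms balance (here T settles near 370.3).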

  16. Regulating alcohol advertising: content analysis of the adequacy of federal and self-regulation of magazine advertisements, 2008-2010.

    PubMed

    Smith, Katherine C; Cukier, Samantha; Jernigan, David H

    2014-10-01

    We analyzed beer, spirits, and alcopop magazine advertisements to determine adherence to federal and voluntary advertising standards. We assessed the efficacy of these standards in curtailing potentially damaging content and protecting public health. We obtained data from a content analysis of a census of 1795 unique advertising creatives for beer, spirits, and alcopops placed in nationally available magazines between 2008 and 2010. We coded creatives for manifest content and adherence to federal regulations and industry codes. Advertisements largely adhered to existing regulations and codes. We assessed only 23 ads as noncompliant with federal regulations and 38 with industry codes. Content consistent with the codes was, however, often culturally positive in terms of aspirational depictions. In addition, creatives included degrading and sexualized images, promoted risky behavior, and made health claims associated with low-calorie content. Existing codes and regulations are largely followed regarding content but do not adequately protect against content that promotes unhealthy and irresponsible consumption and degrades potentially vulnerable populations in its depictions. Our findings suggest further limitations and enhanced federal oversight may be necessary to protect public health.

  17. Monte Carlo-based validation of neutronic methodology for EBR-II analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liaw, J.R.; Finck, P.J.

    1993-01-01

    The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC²-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.

  18. Clean Energy in City Codes: A Baseline Analysis of Municipal Codification across the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Jeffrey J.; Aznar, Alexandra; Dane, Alexander

    Municipal governments in the United States are well positioned to influence clean energy (energy efficiency and alternative energy) and transportation technology and strategy implementation within their jurisdictions through planning, programs, and codification. Municipal governments are leveraging planning processes and programs to shape their energy futures. There is limited understanding in the literature related to codification, the primary way that municipal governments enact enforceable policies. The authors fill the gap in the literature by documenting the status of municipal codification of clean energy and transportation across the United States. More directly, we leverage online databases of municipal codes to develop national and state-specific representative samples of municipal governments by population size. Our analysis finds that municipal governments with the authority to set residential building energy codes within their jurisdictions frequently do so. In some cases, communities set codes higher than their respective state governments. Examination of codes across the nation indicates that municipal governments are employing their code as a policy mechanism to address clean energy and transportation.

  19. Analysis of internal flows relative to the space shuttle main engine

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Cooperative efforts between the Lockheed-Huntsville Computational Mechanics Group and the NASA-MSFC Computational Fluid Dynamics staff have resulted in improved capabilities for numerically simulating incompressible flows generic to the Space Shuttle Main Engine (SSME). A well established and documented CFD code was obtained, modified, and applied to laminar and turbulent flows of the type occurring in the SSME Hot Gas Manifold. The INS3D code was installed on the NASA-MSFC CRAY-XMP computer system and is currently being used by NASA engineers. Studies to perform a transient analysis of the fuel preburner (FPB) were conducted. The COBRA/TRAC code is recommended for simulating the transient flow of oxygen into the LOX manifold. Property data for modifying the code to represent LOX/GOX flow were collected. The ALFA code was developed and recommended for representing the transient combustion in the preburner. These two codes will couple through the transient boundary conditions to simulate the startup and/or shutdown of the fuel preburner. A study, NAS8-37461, is currently being conducted to implement this modeling effort.

  20. Comparative analysis of core heat transport of JET high density H-mode plasmas in carbon wall and ITER-like wall

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Tae; Romanelli, M.; Voitsekhovitch, I.; Koskela, T.; Conboy, J.; Giroud, C.; Maddison, G.; Joffrin, E.; JET Contributors

    2015-06-01

    A consistent deterioration of global confinement in H-mode experiments has been observed in JET [1] following the replacement of all carbon plasma facing components (PFCs) with an all metal (‘ITER-like’) wall (ILW). This has been correlated to the observed degradation of the pedestal confinement, as lower electron temperature (Te) values are routinely measured at the top of the edge barrier region. A comparative investigation of core heat transport in JET-ILW and JET-CW (carbon wall) discharges has been performed, to assess whether core confinement has also been affected by the wall change. The results presented here have been obtained by analysing a set of discharges consisting of high density JET-ILW H-mode plasmas and comparing them against their counterpart discharges in JET-CW having similar global operational parameters. The set contains 10 baseline (β_N = 1.5 to 2) discharge pairs with 2.7 T toroidal magnetic field, 2.5 MA plasma current, and 14 to 17 MW of neutral beam injection (NBI) heating. Based on a Te profile analysis using high resolution Thomson scattering (HRTS) data, the Te profile peaking (i.e. core Te (ρ = 0.3) / edge Te (ρ = 0.7)) is found to be similar, and weakly dependent on edge Te, for both JET-ILW and JET-CW discharges. When ILW discharges are seeded with N2, core and edge Te both increase to maintain a similar peaking factor. The change in core confinement is addressed with interpretative TRANSP simulations. It is found that JET-ILW H-mode plasmas have higher NBI power deposition to electrons and lower NBI power deposition to ions as compared to the JET-CW counterparts. This is an effect of the lower electron temperature at the top of the pedestal. As a result, the core electron energy confinement time is reduced in JET-ILW discharges, but the core ion energy confinement time is not decreased. Overall, the core energy confinement is found to be the same in the JET-ILW discharges compared to the JET-CW counterparts.
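
The peaking factor used in this comparison is a simple ratio of two points on the Te profile. The profile below is a hypothetical stand-in (illustrative shape and numbers, not JET HRTS measurements):

```python
import numpy as np

# Hypothetical H-mode-like electron temperature profile on a normalized
# radius grid (illustrative numbers only, not HRTS data).
rho = np.linspace(0.0, 1.0, 51)
te_keV = 4.0 * (1.0 - rho**2) ** 1.5 + 0.1

# Peaking factor as defined in the abstract: Te(rho = 0.3) / Te(rho = 0.7).
peaking = np.interp(0.3, rho, te_keV) / np.interp(0.7, rho, te_keV)
```

A peaking factor that stays the same between ILW and CW discharges, as reported above, points to a degraded pedestal rather than degraded core transport.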
