Sample records for vessel simulator benchmark

  1. Excore Modeling with VERAShift

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandya, Tara M.; Evans, Thomas M.

    It is important to be able to accurately predict the neutron flux outside the immediate reactor core for a variety of safety and material analyses. Monte Carlo radiation transport calculations are required to produce high-fidelity excore responses. Under this milestone, VERA (specifically the VERAShift package) has been extended to perform excore calculations by running radiation transport calculations with Shift. This package couples VERA-CS with Shift to perform excore tallies for multiple state points concurrently, with each component capable of parallel execution on independent domains. Specifically, this package performs fluence calculations in the core barrel and vessel, or performs the requested tallies in any user-defined excore regions. VERAShift takes advantage of the general geometry package in Shift. This gives VERAShift the flexibility to explicitly model features outside the core barrel, including detailed vessel models, detectors, and power plant details. A very limited set of experimental and numerical benchmarks is available for excore simulation comparison. The Consortium for Advanced Simulation of Light Water Reactors (CASL) has developed a set of excore benchmark problems to include as part of the VERA-CS verification and validation (V&V) problems. The excore capability in VERAShift has been tested on small representative assembly problems, multiassembly problems, and quarter-core problems. VERAView has also been extended to visualize these vessel fluence results from VERAShift. Preliminary vessel fluence results for quarter-core multistate calculations look very promising. Further development is needed to determine the details relevant to excore simulations. Validation of VERA for fluence and excore detectors still needs to be performed against experimental and numerical results.

  2. Benchmarking MARS (accident management software) with the Browns Ferry fire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, S.M.; Liu, L.Y.; Raines, J.C.

    1992-01-01

    The MAAP Accident Response System (MARS) is user-friendly computer software developed to provide management and engineering staff with the most needed insights into the current and future conditions of the plant, during actual or simulated accidents, based on current plant data and its trends. To demonstrate the reliability of the MARS code in simulating a plant transient, MARS is being benchmarked with the available reactor pressure vessel (RPV) pressure and level data from the Browns Ferry fire. The MARS software uses the Modular Accident Analysis Program (MAAP) code as its basis to calculate plant response under accident conditions. MARS uses a limited set of plant data to initialize and track the accident progression. To perform this benchmark, a simulated set of plant data was constructed based on actual report data containing the information necessary to initialize MARS and keep track of plant system status throughout the accident progression. The initial Browns Ferry fire data were produced by performing a MAAP run to simulate the accident. The remaining accident simulation used actual plant data.

  3. An imaging-based computational model for simulating angiogenesis and tumour oxygenation dynamics

    NASA Astrophysics Data System (ADS)

    Adhikarla, Vikram; Jeraj, Robert

    2016-05-01

    Tumour growth, angiogenesis and oxygenation vary substantially among tumours and significantly impact their treatment outcome. Imaging provides a unique means of investigating these tumour-specific characteristics. Here we propose a computational model to simulate tumour-specific oxygenation changes based on molecular imaging data. Tumour oxygenation in the model is reflected by the perfused vessel density. Tumour growth depends on its doubling time (Td) and the imaged proliferation. The perfused vessel density recruitment rate depends on the perfused vessel density around the tumour (sMVDtissue) and the maximum VEGF concentration for complete vessel dysfunctionality (VEGFmax). The model parameters were benchmarked to reproduce the dynamics of tumour oxygenation over its entire lifecycle, which is the most challenging test. Tumour oxygenation dynamics were quantified using the peak pO2 (pO2peak) and the time to peak pO2 (tpeak). Sensitivity of tumour oxygenation to the model parameters was assessed by changing each parameter by 20%. tpeak was found to be more sensitive to the tumour-cell-line-related doubling time (~30%) than to the tissue vasculature density (~10%). On the other hand, pO2peak was found to be similarly influenced by the above tumour- and vasculature-associated parameters (~30-40%). Interestingly, both pO2peak and tpeak were only marginally affected by VEGFmax (~5%). The development of a poorly oxygenated (hypoxic) core with tumour growth increased VEGF accumulation, thus disrupting vessel perfusion as well as further increasing hypoxia with time. The model, with its benchmarked parameters, is applied to hypoxia imaging data obtained using a [64Cu]Cu-ATSM PET scan of a mouse tumour, and the temporal development of the vasculature and hypoxia maps is shown. The work underscores the importance of using tumour-specific input for analysing tumour evolution. An extended model incorporating therapeutic effects can serve as a powerful tool for analysing tumour response to anti-angiogenic therapies.
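
    The 20% sensitivity analysis described above reduces to a loop over singly perturbed parameters. The following sketch illustrates only that bookkeeping; the surrogate oxygenation model, its functional forms and every numerical value are invented placeholders, not the published model.

      import numpy as np

      def simulate_oxygenation(Td, sMVD_tissue, VEGF_max):
          # Toy surrogate: tumour burden doubles every Td days, vessels are
          # recruited over time but become dysfunctional as VEGF (proxied by
          # burden) approaches VEGF_max. Returns (peak pO2, time of peak).
          t = np.linspace(0.0, 60.0, 6001)               # days
          burden = 2.0 ** (t / Td)
          recruit = 1.0 - np.exp(-t / 20.0)              # vessel recruitment
          crowd = 1.0 / (1.0 + burden / VEGF_max)        # VEGF-driven dysfunction
          pO2 = 40.0 * sMVD_tissue * recruit * crowd     # arbitrary units
          return pO2.max(), t[np.argmax(pO2)]

      baseline = dict(Td=5.0, sMVD_tissue=1.0, VEGF_max=10.0)
      p0, t0 = simulate_oxygenation(**baseline)
      for name in baseline:
          p1, t1 = simulate_oxygenation(**dict(baseline, **{name: 1.2 * baseline[name]}))
          print(f"{name:12s} d(pO2peak) = {100 * (p1 - p0) / p0:+6.1f} %   "
                f"d(tpeak) = {100 * (t1 - t0) / t0:+6.1f} %")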

  4. Neutron and photon shielding benchmark calculations by MCNP on the LR-0 experimental facility.

    PubMed

    Hordósy, G

    2005-01-01

    In the framework of the REDOS project, the space-energy distribution of the neutron and photon flux has been calculated over the pressure vessel simulator thickness of the LR-0 experimental reactor, Rez, Czech Republic. The results calculated by the Monte Carlo code MCNP4C are compared with the measurements performed at the Nuclear Research Institute, Rez. The spectra have been measured at the barrel, and in front of, inside and behind the pressure vessel in different configurations. The neutron measurements were performed in the energy range 0.1-10 MeV. This work was done within the 5th Framework Programme of the European Community (1998-2002).

  5. Study on the shipboard radar reconnaissance equipment azimuth benchmark method

    NASA Astrophysics Data System (ADS)

    Liu, Zhenxing; Jiang, Ning; Ma, Qian; Liu, Songtao; Wang, Longtao

    2015-10-01

    The future naval battle will take place in a complex electromagnetic environment, so seizing electromagnetic superiority has become a major objective of the navy. Radar reconnaissance equipment is an important part of the system used to obtain and master battlefield electromagnetic radiation source information, and azimuth measurement is one of its main functions. Whether direction-finding accuracy meets requirements determines the success of active jamming, passive jamming, guided missile attack and other combat missions, and thus bears directly on a vessel's combat capability. How to test the performance of radar reconnaissance equipment while interfering with operational tasks as little as possible is a practical problem. Based on a radar signal simulator and GPS positioning equipment, this paper researches and experimentally tests a new method that provides the azimuth benchmark required by direction-finding precision tests anytime and anywhere, allowing ships at the jetty to test the direction-finding performance of their radar reconnaissance equipment. It provides a powerful means for the daily maintenance and repair of naval radar reconnaissance equipment [1].

  6. Multi-Constituent Simulation of Thrombus Deposition

    NASA Astrophysics Data System (ADS)

    Wu, Wei-Tao; Jamiolkowski, Megan A.; Wagner, William R.; Aubry, Nadine; Massoudi, Mehrdad; Antaki, James F.

    2017-02-01

    In this paper, we present a spatio-temporal mathematical model for simulating the formation and growth of a thrombus. Blood is treated as a multi-constituent mixture comprised of a linear fluid phase and a thrombus (solid) phase. The transport and reactions of 10 chemical and biological species are incorporated using a system of coupled convection-reaction-diffusion (CRD) equations to represent three processes in thrombus formation: initiation, propagation and stabilization. Computational fluid dynamics (CFD) simulations using the libraries of OpenFOAM were performed for two illustrative benchmark problems: in vivo thrombus growth in an injured blood vessel and in vitro thrombus deposition in micro-channels (1.5 mm × 1.6 mm × 0.1 mm) with small crevices (125 μm × 75 μm and 125 μm × 137 μm). For both problems, the simulated thrombus deposition agreed very well with experimental observations, both spatially and temporally. Based on the success with these two benchmark problems, which have very different flow conditions and biological environments, we believe that the current model will provide useful insight into the genesis of thrombosis in blood-wetted devices, and provide a tool for the design of less thrombogenic devices.
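
    For each of the 10 transported species, the coupled convection-reaction-diffusion system referred to above takes the generic form below; this is a sketch of the equation class only, and the paper's actual source terms, coefficients and thrombus-phase coupling are not reproduced here.

      \frac{\partial c_i}{\partial t} + \nabla \cdot (\mathbf{u}\, c_i)
          = \nabla \cdot \big( D_i \nabla c_i \big) + R_i(c_1, \dots, c_{10}),
          \qquad i = 1, \dots, 10,

    where c_i is the concentration of species i, u the blood velocity, D_i a diffusivity and R_i the reaction term that couples the species through the initiation, propagation and stabilization kinetics.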

  7. Multi-Constituent Simulation of Thrombus Deposition

    PubMed Central

    Wu, Wei-Tao; Jamiolkowski, Megan A.; Wagner, William R.; Aubry, Nadine; Massoudi, Mehrdad; Antaki, James F.

    2017-01-01

    In this paper, we present a spatio-temporal mathematical model for simulating the formation and growth of a thrombus. Blood is treated as a multi-constituent mixture comprised of a linear fluid phase and a thrombus (solid) phase. The transport and reactions of 10 chemical and biological species are incorporated using a system of coupled convection-reaction-diffusion (CRD) equations to represent three processes in thrombus formation: initiation, propagation and stabilization. Computational fluid dynamics (CFD) simulations using the libraries of OpenFOAM were performed for two illustrative benchmark problems: in vivo thrombus growth in an injured blood vessel and in vitro thrombus deposition in micro-channels (1.5 mm × 1.6 mm × 0.1 mm) with small crevices (125 μm × 75 μm and 125 μm × 137 μm). For both problems, the simulated thrombus deposition agreed very well with experimental observations, both spatially and temporally. Based on the success with these two benchmark problems, which have very different flow conditions and biological environments, we believe that the current model will provide useful insight into the genesis of thrombosis in blood-wetted devices, and provide a tool for the design of less thrombogenic devices. PMID:28218279

  8. Multi-Constituent Simulation of Thrombus Deposition.

    PubMed

    Wu, Wei-Tao; Jamiolkowski, Megan A; Wagner, William R; Aubry, Nadine; Massoudi, Mehrdad; Antaki, James F

    2017-02-20

    In this paper, we present a spatio-temporal mathematical model for simulating the formation and growth of a thrombus. Blood is treated as a multi-constituent mixture comprised of a linear fluid phase and a thrombus (solid) phase. The transport and reactions of 10 chemical and biological species are incorporated using a system of coupled convection-reaction-diffusion (CRD) equations to represent three processes in thrombus formation: initiation, propagation and stabilization. Computational fluid dynamics (CFD) simulations using the libraries of OpenFOAM were performed for two illustrative benchmark problems: in vivo thrombus growth in an injured blood vessel and in vitro thrombus deposition in micro-channels (1.5 mm × 1.6 mm × 0.1 mm) with small crevices (125 μm × 75 μm and 125 μm × 137 μm). For both problems, the simulated thrombus deposition agreed very well with experimental observations, both spatially and temporally. Based on the success with these two benchmark problems, which have very different flow conditions and biological environments, we believe that the current model will provide useful insight into the genesis of thrombosis in blood-wetted devices, and provide a tool for the design of less thrombogenic devices.

  9. Seismic analysis of the Mirror Fusion Test Facility: soil structure interaction analyses of the Axicell vacuum vessel. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maslenikov, O.R.; Mraz, M.J.; Johnson, J.J.

    1986-03-01

    This report documents the seismic analyses performed by SMA for the MFTF-B Axicell vacuum vessel. In the course of this study we performed response spectrum analyses, CLASSI fixed-base analyses, and SSI analyses that included interaction effects between the vessel and vault. The response spectrum analysis served to benchmark certain modeling differences between the LLNL and SMA versions of the vessel model. The fixed-base analysis benchmarked the differences between analysis techniques. The SSI analyses provided our best estimate of vessel response to the postulated seismic excitation for the MFTF-B facility, and included consideration of uncertainties in soil properties by calculating response for a range of soil shear moduli. Our results are presented in this report as tables comparing specific member forces from our analyses and the analyses performed by LLNL. Also presented are tables of maximum accelerations and relative displacements and plots of response spectra at various selected locations.

  10. Study of blood flow in several benchmark micro-channels using a two-fluid approach.

    PubMed

    Wu, Wei-Tao; Yang, Fang; Antaki, James F; Aubry, Nadine; Massoudi, Mehrdad

    2015-10-01

    It is known that in a vessel whose characteristic dimension (e.g., its diameter) is in the range of 20 to 500 microns, blood behaves as a non-Newtonian fluid, exhibiting complex phenomena such as shear-thinning and stress relaxation, as well as multi-component behaviors such as the Fahraeus effect, plasma-skimming, etc. To describe these non-Newtonian and multi-component characteristics of blood, a two-fluid model is applied within the framework of mixture theory, where the plasma is treated as a Newtonian fluid and the red blood cells (RBCs) are treated as a shear-thinning fluid. A computational fluid dynamics (CFD) simulation incorporating the constitutive model was implemented using OpenFOAM®, in which benchmark problems including a sudden expansion and various driven slots and crevices were studied numerically. The numerical results exhibited good agreement with the experimental observations with respect to both the velocity field and the volume fraction distribution of RBCs.
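
    A common way to give the RBC phase a shear-thinning viscosity is a Carreau-type law. The record does not state the paper's exact constitutive relation, so the form and all coefficients below are illustrative assumptions only.

      import numpy as np

      def rbc_viscosity(shear_rate, mu_inf=0.0035, mu_0=0.16, lam=8.2, n=0.64):
          # Carreau shear-thinning viscosity (Pa*s): tends to mu_0 at low
          # shear and mu_inf at high shear. Placeholder coefficients, not
          # the parameters fitted in the paper.
          return mu_inf + (mu_0 - mu_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

      for g in (0.1, 1.0, 10.0, 100.0):
          print(f"shear rate {g:7.1f} 1/s -> viscosity {rbc_viscosity(g):.4f} Pa*s")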

  11. Characterization of fish hold effluent discharged from commercial fishing vessels into harbor waters.

    PubMed

    Albert, Ryan J; McLaughlin, Christine; Falatko, Debra

    2014-10-15

    Fish hold effluent and the effluent produced from the cleaning of fish holds may contain organic material resulting from the degradation of seafood and cleaning products (e.g., soaps and detergents). This effluent is often discharged by vessels into near-shore waters and therefore has the potential to contribute to water pollution in bays and estuaries. We characterized effluent from commercial fishing vessels with holds containing refrigerated seawater, ice slurry, or chipped ice. Concentrations of trace heavy metals, wet chemistry parameters, and nutrients in the effluent were compared to screening benchmarks to determine whether there is a reasonable potential for effluent discharge to contribute to nonattainment of water quality standards. Most analytes (67%) exceeded their benchmark concentration and therefore may have the potential to pose a risk to human health or the environment if discharges occur in significant quantities or many vessels discharge in the same areas. Published by Elsevier Ltd.

  12. Analysis of dosimetry from the H.B. Robinson unit 2 pressure vessel benchmark using RAPTOR-M3G and ALPAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischer, G.A.

    2011-07-01

    Document available in abstract form only; the abstract follows: The dosimetry from the H. B. Robinson Unit 2 Pressure Vessel Benchmark is analyzed with a suite of Westinghouse-developed codes and data libraries. The radiation transport from the reactor core to the surveillance capsule and ex-vessel locations is performed by RAPTOR-M3G, a parallel deterministic radiation transport code that calculates high-resolution neutron flux information in three dimensions. The cross-section library used in this analysis is the ALPAN library, an Evaluated Nuclear Data File (ENDF)/B-VII.0-based library designed for reactor dosimetry and fluence analysis applications. Dosimetry is evaluated with the industry-standard SNLRML reactor dosimetry cross-section data library. (authors)

  13. Study of blood flow in several benchmark micro-channels using a two-fluid approach

    PubMed Central

    Wu, Wei-Tao; Yang, Fang; Antaki, James F.; Aubry, Nadine; Massoudi, Mehrdad

    2015-01-01

    It is known that in a vessel whose characteristic dimension (e.g., its diameter) is in the range of 20 to 500 microns, blood behaves as a non-Newtonian fluid, exhibiting complex phenomena such as shear-thinning and stress relaxation, as well as multi-component behaviors such as the Fahraeus effect, plasma-skimming, etc. To describe these non-Newtonian and multi-component characteristics of blood, a two-fluid model is applied within the framework of mixture theory, where the plasma is treated as a Newtonian fluid and the red blood cells (RBCs) are treated as a shear-thinning fluid. A computational fluid dynamics (CFD) simulation incorporating the constitutive model was implemented using OpenFOAM®, in which benchmark problems including a sudden expansion and various driven slots and crevices were studied numerically. The numerical results exhibited good agreement with the experimental observations with respect to both the velocity field and the volume fraction distribution of RBCs. PMID:26240438

  14. Inner and outer coronary vessel wall segmentation from CCTA using an active contour model with machine learning-based 3D voxel context-aware image force

    NASA Astrophysics Data System (ADS)

    Sivalingam, Udhayaraj; Wels, Michael; Rempfler, Markus; Grosskopf, Stefan; Suehling, Michael; Menze, Bjoern H.

    2016-03-01

    In this paper, we present a fully automated approach to coronary vessel segmentation, which involves calcification or soft plaque delineation in addition to accurate lumen delineation, from 3D Cardiac Computed Tomography Angiography data. Adequately virtualizing the coronary lumen plays a crucial role in simulating blood flow by means of fluid dynamics, while additionally identifying the outer vessel wall in the case of arteriosclerosis is a prerequisite for further plaque compartment analysis. Our method is a hybrid approach complementing Active Contour Model-based segmentation with an external image force that relies on a Random Forest regression model generated off-line. The regression model provides a strong estimate of the distance to the true vessel surface for every surface candidate point, taking into account 3D wavelet-encoded contextual image features aligned with the current surface hypothesis. The associated external image force is integrated in the objective function of the active contour model, such that the overall segmentation approach benefits both from the advantages associated with snakes and from those associated with machine learning-based regression. This yields an integrated approach achieving competitive results on a publicly available benchmark data collection (Rotterdam segmentation challenge).
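
    Schematically, the regression output enters the usual snake energy as an additional data term; the weighting below is a hedged reconstruction for orientation, not the paper's exact objective.

      E(S) = \int \big( \alpha \,\| S'(s) \|^2 + \beta \,\| S''(s) \|^2 \big)\, ds
             \;+\; \gamma \sum_{p \in S} \hat{d}(p)

    The integral terms penalize stretching and bending of the contour S, while \hat{d}(p) is the Random Forest estimate of the distance from candidate point p to the true vessel surface, so minimizing E pulls the contour toward the regressed wall position.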

  15. Separation of Evans and Hiro currents in VDE of tokamak plasma

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Svidzinski, V. A.; Zakharov, L. E.

    2014-10-01

    Progress on the development and benchmarking of the Disruption Simulation Code (DSC-3D) will be presented. DSC-3D is a one-fluid, nonlinear, time-dependent MHD code that utilizes fully 3D toroidal geometry for the first wall, the vacuum region and the plasma itself, with adaptation to the moving plasma boundary and accurate resolution of the plasma surface current. Suppression of the fast magnetosonic time scale by neglecting plasma inertia will be demonstrated. Owing to the code's adaptive nature, self-consistent modeling of the plasma surface current during the nonlinear dynamics of the Vertical Displacement Event (VDE) is accurately provided. Separation of the plasma surface current into Evans and Hiro currents during simulation of a fully developed VDE, when the plasma touches the in-vessel tiles, will be discussed. Work is supported by the US DOE SBIR Grant # DE-SC0004487.

  16. The NAS parallel benchmarks

    NASA Technical Reports Server (NTRS)

    Bailey, D. H.; Barszcz, E.; Barton, J. T.; Carter, R. L.; Lasinski, T. A.; Browning, D. S.; Dagum, L.; Fatoohi, R. A.; Frederickson, P. O.; Schreiber, R. S.

    1991-01-01

    A new set of benchmarks has been developed for the performance evaluation of highly parallel supercomputers in the framework of the NASA Ames Numerical Aerodynamic Simulation (NAS) Program. These consist of five 'parallel kernel' benchmarks and three 'simulated application' benchmarks. Together they mimic the computation and data movement characteristics of large-scale computational fluid dynamics applications. The principal distinguishing feature of these benchmarks is their 'pencil and paper' specification: all details of these benchmarks are specified only algorithmically. In this way many of the difficulties associated with conventional benchmarking approaches on highly parallel systems are avoided.

  17. Validation of the BUGJEFF311.BOLIB, BUGENDF70.BOLIB and BUGLE-B7 broad-group libraries on the PCA-Replica (H2O/Fe) neutron shielding benchmark experiment

    NASA Astrophysics Data System (ADS)

    Pescarini, Massimo; Orsi, Roberto; Frisoni, Manuela

    2016-03-01

    The PCA-Replica 12/13 (H2O/Fe) neutron shielding benchmark experiment was analysed using the TORT-3.2 3D SN code. PCA-Replica reproduces a PWR ex-core radial geometry with alternate layers of water and steel, including a pressure vessel simulator. Three broad-group coupled neutron/photon working cross-section libraries in FIDO-ANISN format with the same energy group structure (47 n + 20 γ) and based on different nuclear data were alternatively used: the ENEA BUGJEFF311.BOLIB (JEFF-3.1.1) and BUGENDF70.BOLIB (ENDF/B-VII.0) libraries and the ORNL BUGLE-B7 (ENDF/B-VII.0) library. Dosimeter cross sections derived from the IAEA IRDF-2002 dosimetry file were employed. The calculated reaction rates for the Rh-103(n,n')Rh-103m, In-115(n,n')In-115m and S-32(n,p)P-32 threshold activation dosimeters and the calculated neutron spectra are compared with the corresponding experimental results.

  18. SeSBench - An initiative to benchmark reactive transport models for environmental subsurface processes

    NASA Astrophysics Data System (ADS)

    Jacques, Diederik

    2017-04-01

    As soil functions are governed by a multitude of interacting hydrological, geochemical and biological processes, simulation tools coupling mathematical models for interacting processes are needed. Coupled reactive transport models are a typical example of such coupled tools, mainly focusing on hydrological and geochemical coupling (see e.g. Steefel et al., 2015). Mathematical and numerical complexity, both of the tool itself and of the specific conceptual model, can increase rapidly. Therefore, numerical verification of such models is a prerequisite for guaranteeing reliability and confidence and for qualifying simulation tools and approaches for any further model application. In 2011, a first SeSBench (Subsurface Environmental Simulation Benchmarking) workshop was held in Berkeley (USA), followed by four more. The objective is to benchmark subsurface environmental simulation models and methods, with a current focus on reactive transport processes. The final outcome was a special issue in Computational Geosciences (2015, issue 3 - Reactive transport benchmarks for subsurface environmental simulation) with a collection of 11 benchmarks. Benchmarks, proposed by the participants of the workshops, should be relevant for environmental or geo-engineering applications; the latter were mostly related to radioactive waste disposal issues - excluding benchmarks defined for purely mathematical reasons. Another important feature is the tiered approach within a benchmark, with the definition of a single principal problem and different sub-problems. The latter typically benchmark individual or simplified processes (e.g. inert solute transport, simplified geochemical conceptual model) or geometries (e.g. batch or one-dimensional, homogeneous). Finally, three codes should be involved in a benchmark. The SeSBench initiative contributes to confidence building for applying reactive transport codes. Furthermore, it illustrates the use of these types of models for different environmental and geo-engineering applications. SeSBench will organize new workshops to add new benchmarks in a new special issue. Steefel, C. I., et al. (2015). "Reactive transport codes for subsurface environmental simulation." Computational Geosciences 19: 445-478.
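
    To give a flavour of the simplified sub-problems such tiers contain, the sketch below solves one-dimensional advection-dispersion of a solute with first-order decay by explicit upwind finite differences. Geometry, parameters and boundary conditions are invented for illustration and correspond to no specific SeSBench case.

      import numpy as np

      L, nx = 1.0, 101                          # column length (m), grid points
      v, D, k = 1e-5, 1e-8, 1e-6                # velocity (m/s), dispersion (m2/s), decay (1/s)
      dx = L / (nx - 1)
      dt = 0.4 * min(dx / v, dx**2 / (2 * D))   # stable explicit step
      steps = 125
      c = np.zeros(nx)
      c[0] = 1.0                                # fixed inlet concentration

      for _ in range(steps):
          adv = -v * np.diff(c, prepend=c[0]) / dx                  # upwind advection
          dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
          c[1:-1] += dt * (adv[1:-1] + dif[1:-1] - k * c[1:-1])
          c[-1] = c[-2]                         # free outflow boundary

      print(f"front position ~ {np.argmax(c < 0.5) * dx:.2f} m after {steps * dt:.0f} s")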

  19. Analysis of the influence of the heat transfer phenomena on the late phase of the ThAI Iod-12 test

    NASA Astrophysics Data System (ADS)

    Gonfiotti, B.; Paci, S.

    2014-11-01

    Iodine is one of the major contributors to the source term during a severe accident in a Nuclear Power Plant, owing to its volatility and high radiological consequences. Therefore, large efforts have been made to describe the iodine behaviour during an accident, especially in the containment system. Due to the lack of experimental data, in recent years many attempts have been made to fill the gaps in the knowledge of iodine behaviour. In this framework, two tests (ThAI Iod-11 and Iod-12) were carried out inside a multi-compartment steel vessel. A quite complex transient characterizes these two tests; therefore, they are also suitable for thermal-hydraulic benchmarks. The two tests were originally released for a benchmark exercise during the SARNET2 EU Project. At the end of this benchmark a report covering the main findings was issued, stating that the common codes employed in severe accident studies were able to simulate the tests, but with large discrepancies. The present work applies the new versions of the ASTEC and MELCOR codes with the aim of carrying out a new code-to-code comparison against the ThAI Iod-12 experimental data, focusing on the influence of the heat exchanges with the outer environment, which seems to be one of the most challenging issues to cope with.

  20. Benchmarking nitrogen removal suspended-carrier biofilm systems using dynamic simulation.

    PubMed

    Vanhooren, H; Yuan, Z; Vanrolleghem, P A

    2002-01-01

    We are witnessing enormous growth in biological nitrogen removal from wastewater. It presents specific challenges beyond traditional COD (carbon) removal. One possibility for optimised process design is the use of biomass-supporting media. In this paper, attached growth processes (AGP) are evaluated using dynamic simulations. The advantages of these systems, qualitatively described elsewhere, are validated quantitatively based on a simulation benchmark for activated sludge treatment systems. This simulation benchmark is extended with a biofilm model that allows for fast and accurate simulation of the conversion of different substrates in a biofilm. The economic feasibility of the system is evaluated using the data generated with the benchmark simulations. Capital savings due to volume reduction and reduced sludge production are weighed against increased aeration costs. In this evaluation, effluent quality is integrated as well.

  21. A broad-group cross-section library based on ENDF/B-VII.0 for fast neutron dosimetry applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alpan, F.A.

    2011-07-01

    A new ENDF/B-VII.0-based coupled 44-neutron, 20-gamma-ray-group cross-section library was developed to investigate the latest evaluated nuclear data file (ENDF), in comparison to the ENDF/B-VI.3 data used in BUGLE-96, as well as to generate an objective-specific library. The objectives selected for this work consisted of dosimetry calculations for in-vessel and ex-vessel reactor locations, iron atom displacement calculations for the reactor internals and pressure vessel, and the 58Ni(n,γ) calculation that is important for gas generation in the baffle plate. The new library was generated based on the contribution- and point-wise cross-section-driven (CPXSD) methodology and was applied to one of the most widely used benchmarks, the Oak Ridge National Laboratory Pool Critical Assembly benchmark problem. In addition to the new library, BUGLE-96 and an ENDF/B-VII.0-based coupled 47-neutron, 20-gamma-ray-group cross-section library were used, with both SNLRML and IRDF dosimetry cross sections, to compute reaction rates. All reaction rates computed by the multigroup libraries are within ±20% of the measurement data and meet the U.S. Nuclear Regulatory Commission acceptance criterion for reactor vessel neutron exposure evaluations specified in Regulatory Guide 1.190. (authors)
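
    The ±20% acceptance test cited from Regulatory Guide 1.190 amounts to checking calculated-to-experimental (C/E) reaction-rate ratios; the reaction names and values below are fabricated purely to illustrate the check.

      # Hypothetical C/E screening against the +/-20 % acceptance band.
      measured   = {"Np-237(n,f)": 1.00e-12, "U-238(n,f)": 3.10e-13, "Ni-58(n,p)": 8.40e-14}
      calculated = {"Np-237(n,f)": 1.07e-12, "U-238(n,f)": 2.88e-13, "Ni-58(n,p)": 9.10e-14}

      for reaction, E in measured.items():
          ce = calculated[reaction] / E
          status = "PASS" if abs(ce - 1.0) <= 0.20 else "FAIL"
          print(f"{reaction:13s} C/E = {ce:5.3f}  {status}")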

  22. WE-E-17A-01: Characterization of An Imaging-Based Model of Tumor Angiogenesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adhikarla, V; Jeraj, R

    2014-06-15

    Purpose: Understanding the transient dynamics of tumor oxygenation is important when evaluating tumor-vasculature response to anti-angiogenic therapies. An imaging-based tumor-vasculature model was used to elucidate factors that affect these dynamics. Methods: Tumor growth depends on its doubling time (Td). Hypoxia increases the pro-angiogenic factor (VEGF) concentration, which is modeled to reduce vessel perfusion, reflecting its effect of increasing vascular permeability. Perfused vessel recruitment depends on the existing perfused vasculature, the VEGF concentration and the maximum VEGF concentration (VEGFmax) for vessel dysfunction. A convolution-based algorithm couples the tumor to the normal tissue vessel density (VD-nt). The parameters are benchmarked to published pre-clinical data, and a sensitivity study evaluating the changes in the peak and time to peak tumor oxygenation characterizes them. The model is used to simulate changes in hypoxia and proliferation PET imaging data obtained using [Cu-61]Cu-ATSM and [F-18]FLT, respectively. Results: Td and VD-nt were found to be the most influential on peak tumor pO2, while VEGFmax was marginally influential. A +20% change in Td, VD-nt and VEGFmax resulted in +50%, +25% and +5% increases in peak pO2, respectively. In contrast, Td was the most influential on the time to peak oxygenation, with VD-nt and VEGFmax playing marginal roles. A +20% change in Td, VD-nt and VEGFmax increased the time to peak pO2 by +50%, +5% and +0%, respectively. A -20% change in the above parameters resulted in comparable decreases in the peak and time to peak pO2. Model application to the PET data was able to demonstrate the voxel-specific changes in hypoxia of the imaged tumor. Conclusion: Tumor-specific doubling time and vessel density are important parameters to be considered when evaluating hypoxia transients. While the current model simulates the oxygen dynamics of an untreated tumor, incorporation of therapeutic effects can make the model a potent tool for analyzing anti-angiogenic therapies.

  23. Pulse Jet Mixing Tests With Noncohesive Solids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Perry A.; Bamberger, Judith A.; Enderlin, Carl W.

    2012-02-17

    This report summarizes results from pulse jet mixing (PJM) tests with noncohesive solids in Newtonian liquid. The tests were conducted during FY 2007 and 2008 to support the design of mixing systems for the Hanford Waste Treatment and Immobilization Plant (WTP). Tests were conducted at three geometric scales using noncohesive simulants, and the test data were used to develop models predicting two measures of mixing performance for full-scale WTP vessels. The models predict the cloud height (the height to which solids will be lifted by the PJM action) and the critical suspension velocity (the minimum velocity needed to ensure all solids are suspended off the floor, though not fully mixed). From the cloud height, the concentration of solids at the pump inlet can be estimated. The predicted critical suspension velocity for lifting all solids is not precisely the same as the mixing requirement for 'disturbing' a sufficient volume of solids, but the values will be similar and closely related. These predictive models were successfully benchmarked against larger-scale tests and compared well with results from computational fluid dynamics simulations. The application of the models to assess mixing in WTP vessels is illustrated in examples for 13 distinct designs and selected operational conditions. The values selected for these examples are not final; thus, the estimates of performance should not be interpreted as final conclusions of design adequacy or inadequacy. However, this work does reveal that several vessels may require adjustments to design, operating features, or waste feed properties to ensure confidence in operation. The models described in this report will prove to be valuable engineering tools for evaluating options as designs are finalized for the WTP. Revision 1 refines the data sets used for model development and summarizes models developed since the completion of Revision 0.

  24. Benchmark calculation for radioactivity inventory using MAXS library based on JENDL-4.0 and JEFF-3.0/A for decommissioning BWR plants

    NASA Astrophysics Data System (ADS)

    Tanaka, Ken-ichi

    2016-06-01

    We performed a benchmark calculation for radioactivity activated in the Primary Containment Vessel (PCV) of a Boiling Water Reactor (BWR) using the MAXS library, which was developed by collapsing cross sections with neutron energy spectra in the PCV of the BWR. Radioactivities due to neutron irradiation were measured using activation foil detectors of gold (Au) and nickel (Ni) at thirty locations in the PCV. As the benchmark calculation, we performed activation calculations of the foils with the SCALE5.1/ORIGEN-S code using the irradiation conditions at each foil location. We compared calculations and measurements to estimate the effectiveness of the MAXS library.

  25. Shift Verification and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to results from other Monte Carlo radiation transport codes, and found very good agreement across a variety of comparison measures. These include prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident that Shift can provide reference results for CASL benchmarking.

  26. Documentation of probabilistic fracture mechanics codes used for reactor pressure vessels subjected to pressurized thermal shock loading: Parts 1 and 2. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balkey, K.; Witt, F.J.; Bishop, B.A.

    1995-06-01

    Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine the conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark the probabilistic fracture mechanics computer codes currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work originally performed in 1982 and 1983 to compare the U.S. NRC VISA and Westinghouse (W) PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.

  27. Competency-based training in robotic surgery: benchmark scores for virtual reality robotic simulation.

    PubMed

    Raison, Nicholas; Ahmed, Kamran; Fossati, Nicola; Buffi, Nicolò; Mottrie, Alexandre; Dasgupta, Prokar; Van Der Poel, Henk

    2017-05-01

    To develop benchmark scores of competency for use within a competency-based virtual reality (VR) robotic training curriculum. This longitudinal, observational study analysed results from nine European Association of Urology hands-on training courses in VR simulation. In all, 223 participants ranging from novice to expert robotic surgeons completed 1565 exercises. Competency was set at 75% of the mean expert score. Benchmark scores for all general performance metrics generated by the simulator were calculated. Assessment exercises were selected by expert consensus and through learning-curve analysis. Three basic-skill and two advanced-skill exercises were identified. Benchmark scores based on expert performance offered viable targets for novice and intermediate trainees in robotic surgery. Novice participants met the competency standards for most basic-skill exercises; however, advanced exercises were significantly more challenging. Intermediate participants performed better across the seven metrics but still did not achieve the benchmark standard in the more difficult exercises. Benchmark scores derived from expert performance offer relevant and challenging targets for trainees to achieve during VR simulation training. Objective feedback allows both participants and trainers to monitor educational progress and ensures that training remains effective. Furthermore, the well-defined goals set through benchmarking offer clear targets for trainees and enable training to move to a more efficient competency-based curriculum. © 2016 The Authors. BJU International © 2016 BJU International. Published by John Wiley & Sons Ltd.
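
    The benchmarking rule itself is simple: for each simulator metric, the competency threshold is 75% of the mean expert score. A minimal sketch, with metric names and scores invented rather than taken from the study:

      # Competency benchmark = 0.75 * mean expert score, per metric.
      expert_scores = {"economy_of_motion": [82, 91, 88], "overall_score": [75, 80, 78]}
      trainee = {"economy_of_motion": 70, "overall_score": 61}

      for metric, scores in expert_scores.items():
          cutoff = 0.75 * sum(scores) / len(scores)
          verdict = "met" if trainee[metric] >= cutoff else "not met"
          print(f"{metric}: benchmark {cutoff:.1f}, trainee {trainee[metric]} -> {verdict}")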

  28. Closed-Loop Neuromorphic Benchmarks

    PubMed Central

    Stewart, Terrence C.; DeWolf, Travis; Kleinhans, Ashley; Eliasmith, Chris

    2015-01-01

    Evaluating the effectiveness and performance of neuromorphic hardware is difficult. It is even more difficult when the task of interest is a closed-loop task; that is, a task where the output from the neuromorphic hardware affects some environment, which then in turn affects the hardware's future input. However, closed-loop situations are one of the primary potential uses of neuromorphic hardware. To address this, we present a methodology for generating closed-loop benchmarks that makes use of a hybrid of real physical embodiment and a type of “minimal” simulation. Minimal simulation has been shown to lead to robust real-world performance, while still maintaining the practical advantages of simulation, such as making it easy for the same benchmark to be used by many researchers. This method is flexible enough to allow researchers to explicitly modify the benchmarks to identify specific task domains where particular hardware excels. To demonstrate the method, we present a set of novel benchmarks that focus on motor control for an arbitrary system with unknown external forces. Using these benchmarks, we show that an error-driven learning rule can consistently improve motor control performance across a randomly generated family of closed-loop simulations, even when there are up to 15 interacting joints to be controlled. PMID:26696820
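
    A toy version of the error-driven learning idea evaluated by these benchmarks, applied to a one-dimensional motor-control loop; the plant, features and gains are invented and far simpler than the paper's multi-joint arm simulations.

      import numpy as np

      # Adaptive controller u = w . feats whose weights are nudged against
      # the tracking error each step (a delta rule). The 1-D plant below
      # stands in for the unknown system with external forces.
      w = np.zeros(2)
      x, target, lr = 0.0, 1.0, 0.05

      for _ in range(400):
          err = target - x
          feats = np.array([err, 1.0])      # error and bias features
          u = w @ feats                     # control signal
          x += 0.1 * (u - 0.5 * x)          # unknown-plant dynamics
          w += lr * err * feats             # error-driven weight update

      print(f"state after learning: {x:.3f} (target {target})")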

  29. Validation of tsunami inundation model TUNA-RP using OAR-PMEL-135 benchmark problem set

    NASA Astrophysics Data System (ADS)

    Koh, H. L.; Teh, S. Y.; Tan, W. K.; Kh'ng, X. Y.

    2017-05-01

    A standard set of benchmark problems, known as OAR-PMEL-135, was developed by the US National Tsunami Hazard Mitigation Program for tsunami inundation model validation. Any tsunami inundation model must be tested for its accuracy and capability using this standard set of benchmark problems before it can be gainfully used for inundation simulation. The authors have previously developed an in-house tsunami inundation model known as TUNA-RP. This inundation model solves the two-dimensional nonlinear shallow water equations coupled with a wet-dry moving-boundary algorithm. This paper presents the validation of TUNA-RP against the solutions provided in the OAR-PMEL-135 benchmark problem set. This benchmark validation shows that TUNA-RP can indeed perform inundation simulation with accuracy consistent with that of the tested benchmark problem set.
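
    For reference, one common non-conservative form of the two-dimensional nonlinear shallow water equations that such models solve is shown below (friction and source terms omitted; the record does not give TUNA-RP's exact formulation):

      \frac{\partial \eta}{\partial t}
          + \frac{\partial}{\partial x}\big[(h+\eta)\,u\big]
          + \frac{\partial}{\partial y}\big[(h+\eta)\,v\big] = 0,
      \qquad
      \frac{\partial u}{\partial t} + u\frac{\partial u}{\partial x}
          + v\frac{\partial u}{\partial y} + g\frac{\partial \eta}{\partial x} = 0,
      \qquad
      \frac{\partial v}{\partial t} + u\frac{\partial v}{\partial x}
          + v\frac{\partial v}{\partial y} + g\frac{\partial \eta}{\partial y} = 0,

    where η is the free-surface displacement, h the still-water depth, (u, v) the depth-averaged velocities and g the gravitational acceleration; the wet-dry algorithm moves the computational boundary as (h + η) approaches zero.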

  30. The NAS parallel benchmarks

    NASA Technical Reports Server (NTRS)

    Bailey, David (Editor); Barton, John (Editor); Lasinski, Thomas (Editor); Simon, Horst (Editor)

    1993-01-01

    A new set of benchmarks was developed for the performance evaluation of highly parallel supercomputers. These benchmarks consist of a set of kernels, the 'Parallel Kernels,' and a simulated application benchmark. Together they mimic the computation and data movement characteristics of large-scale computational fluid dynamics (CFD) applications. The principal distinguishing feature of these benchmarks is their 'pencil and paper' specification: all details of these benchmarks are specified only algorithmically. In this way many of the difficulties associated with conventional benchmarking approaches on highly parallel systems are avoided.

  31. WWTP dynamic disturbance modelling - an essential module for long-term benchmarking development.

    PubMed

    Gernaey, K V; Rosen, C; Jeppsson, U

    2006-01-01

    Intensive use of the benchmark simulation model No. 1 (BSM1), a protocol for objective comparison of the effectiveness of control strategies in biological nitrogen removal activated sludge plants, has also revealed a number of limitations. Preliminary definitions of the long-term benchmark simulation model No. 1 (BSM1_LT) and the benchmark simulation model No. 2 (BSM2) have been made to extend BSM1 for evaluation of process monitoring methods and plant-wide control strategies, respectively. Influent-related disturbances for BSM1_LT/BSM2 are to be generated with a model, and this paper provides a general overview of the modelling methods used. Typical influent dynamic phenomena generated with the BSM1_LT/BSM2 influent disturbance model, including diurnal, weekend, seasonal and holiday effects, as well as rainfall, are illustrated with simulation results. As a result of the work described in this paper, a proposed influent model/file has been released to the benchmark developers for evaluation purposes. Pending this evaluation, a final BSM1_LT/BSM2 influent disturbance model definition is foreseen. Preliminary simulations with dynamic influent data generated by the influent disturbance model indicate that default BSM1 activated sludge plant control strategies will need extensions for BSM1_LT/BSM2 to efficiently handle 1 year of influent dynamics.
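
    The influent phenomena listed above (diurnal, weekend, seasonal and stochastic effects) can be mimicked by modulating a mean dry-weather flow with a few multiplicative factors. The sketch below is a minimal stand-in with invented coefficients, not the BSM1_LT/BSM2 influent model itself.

      import numpy as np

      t = np.arange(0.0, 365 * 24, 1.0)                  # hourly samples, one year
      q_mean = 20000.0                                   # m3/d, invented dry-weather flow

      diurnal  = 1.0 + 0.15 * np.sin(2 * np.pi * (t % 24) / 24 - np.pi / 2)
      weekend  = np.where((t // 24) % 7 >= 5, 0.9, 1.0)  # reduced weekend loading
      seasonal = 1.0 + 0.10 * np.sin(2 * np.pi * t / (365 * 24))
      noise    = 1.0 + 0.02 * np.random.default_rng(1).standard_normal(t.size)

      q_in = q_mean * diurnal * weekend * seasonal * noise
      print(f"influent flow: min {q_in.min():.0f}, mean {q_in.mean():.0f}, max {q_in.max():.0f} m3/d")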

  32. Simulation of Benchmark Cases with the Terminal Area Simulation System (TASS)

    NASA Technical Reports Server (NTRS)

    Ahmad, Nashat N.; Proctor, Fred H.

    2011-01-01

    The hydrodynamic core of the Terminal Area Simulation System (TASS) is evaluated against different benchmark cases. In the absence of closed-form solutions for the equations governing atmospheric flows, such models are usually evaluated against idealized test cases. Over the years, various authors have suggested a suite of these idealized cases, which have become standards for testing and evaluating the dynamics and thermodynamics of atmospheric flow models. In this paper, simulations of three such cases are described. In addition, the TASS model is evaluated against a test case that uses an exact solution of the Navier-Stokes equations. The TASS results are compared against previously reported simulations of these benchmark cases in the literature. It is demonstrated that the TASS model is highly accurate, stable and robust.

  33. Assessment of Static Delamination Propagation Capabilities in Commercial Finite Element Codes Using Benchmark Analysis

    NASA Technical Reports Server (NTRS)

    Orifici, Adrian C.; Krueger, Ronald

    2010-01-01

    With capabilities for simulating delamination growth in composite materials becoming available, the need for benchmarking and assessing these capabilities is critical. In this study, benchmark analyses were performed to assess the delamination propagation simulation capabilities of the VCCT implementations in Marc and MD Nastran. Benchmark delamination growth results for Double Cantilever Beam, Single Leg Bending and End Notched Flexure specimens were generated using a numerical approach. This numerical approach was developed previously, and involves comparing results from a series of analyses at different delamination lengths to a single analysis with automatic crack propagation. Specimens were analyzed with three-dimensional and two-dimensional models, and compared with previous analyses using Abaqus. The results demonstrated that the VCCT implementation in Marc and MD Nastran was capable of accurately replicating the benchmark delamination growth results, and that the use of the numerical benchmarks offers advantages over benchmarking using experimental and analytical results.

  34. Validating Cellular Automata Lava Flow Emplacement Algorithms with Standard Benchmarks

    NASA Astrophysics Data System (ADS)

    Richardson, J. A.; Connor, L.; Charbonnier, S. J.; Connor, C.; Gallant, E.

    2015-12-01

    A major existing need in assessing lava flow simulators is a common set of validation benchmark tests. We propose three levels of benchmarks which test model output against increasingly complex standards. First, simulated lava flows should be morphologically identical given changes in parameter space that should be inconsequential, such as slope direction. Second, lava flows simulated in simple parameter spaces can be tested against analytical solutions or empirical relationships seen in Bingham fluids. For instance, a lava flow simulated on a flat surface should produce a circular outline. Third, lava flows simulated over real-world topography can be compared to recent real-world lava flows, such as those at Tolbachik, Russia, and Fogo, Cape Verde. Success or failure of emplacement algorithms in these validation benchmarks can be determined using a Bayesian approach, which directly tests the ability of an emplacement algorithm to correctly forecast lava inundation. Here we focus on two posterior metrics, P(A|B) and P(¬A|¬B), which describe the positive and negative predictive value of flow algorithms. This is an improvement on less direct statistics such as model sensitivity and the Jaccard fitness coefficient. We have performed these validation benchmarks on a new, modular lava flow emplacement simulator that we have developed. This simulator, which we call MOLASSES, follows a Cellular Automata (CA) method. The code is developed in several interchangeable modules, which enables quick modification of the distribution algorithm from cell locations to their neighbors. By assessing several different distribution schemes with the benchmark tests, we have improved the performance of MOLASSES to correctly match early stages of the 2012-13 Tolbachik flow, Kamchatka, Russia, to 80%. We also evaluate model performance given uncertain input parameters using a Monte Carlo setup. This illuminates sensitivity to model uncertainty.
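
    The two posterior metrics are the positive and negative predictive values computed cell-by-cell over the inundation grids. A minimal sketch with toy 4x4 footprints (both flow maps are invented):

      import numpy as np

      def predictive_values(simulated, observed):
          # P(A|B): fraction of cells the model inundates that were truly
          # inundated; P(notA|notB): fraction of cells left dry that were
          # truly dry. Boolean arrays of equal shape.
          sim, obs = np.asarray(simulated, bool), np.asarray(observed, bool)
          ppv = (sim & obs).sum() / max(sim.sum(), 1)
          npv = (~sim & ~obs).sum() / max((~sim).sum(), 1)
          return ppv, npv

      obs = np.array([[1,1,0,0],[1,1,0,0],[0,1,1,0],[0,0,0,0]])
      sim = np.array([[1,1,0,0],[1,0,0,0],[0,1,1,1],[0,0,0,0]])
      print("P(A|B) = %.2f, P(~A|~B) = %.2f" % predictive_values(sim, obs))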

  35. Simulator for SUPO, a Benchmark Aqueous Homogeneous Reactor (AHR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Steven Karl; Determan, John C.

    2015-10-14

    A simulator has been developed for SUPO (Super Power), an aqueous homogeneous reactor (AHR) that operated at Los Alamos National Laboratory (LANL) from 1951 to 1974. During that period SUPO accumulated approximately 600,000 kWh of operation. It is considered the benchmark for steady-state operation of an AHR. The SUPO simulator was developed using the process that resulted in a simulator for an accelerator-driven subcritical system, which has been previously reported.

  36. First Observation of Coseismic Seafloor Crustal Deformation due to M7 Class Earthquakes in the Philippine Sea Plate

    NASA Astrophysics Data System (ADS)

    Tadokoro, K.; Ikuta, R.; Ando, M.; Okuda, T.; Sugimoto, S.; Besana, G. M.; Kuno, M.

    2005-12-01

    The Mw7.3 and 7.5 earthquakes (Off Kii-Peninsula Earthquakes) occurred close to the source region of the anticipated Tonankai earthquake on September 5, 2004. The focal mechanisms of the two earthquakes have no low-angle nodal planes, which shows that the earthquakes are intraplate earthquakes in the Philippine Sea Plate. We observed coseismic horizontal displacement due to the Off Kii-Peninsula Earthquakes by means of a system for observing seafloor crustal deformation, the first observation of coseismic seafloor displacement in the world. The observation system is composed of 1) acoustic measurement between a ship transducer and sea-bottom transponders, and 2) kinematic GPS positioning of the observation vessel. We have installed a seafloor benchmark close to the epicenters of the Off Kii-Peninsula Earthquakes. The benchmark is composed of three sea-bottom transponders, and its location is defined as the weight center of the three transponders. We can determine the location of the benchmark with an accuracy of about 5 cm at each observation. We have measured the seafloor benchmark six times to date: 1) July 12-16 and 21-22, 2004; 2) November 9-10; 3) January 19, 2005; 4) May 18-20; 5) July 19-20; and 6) August 18-19 and 29-30. The Off Kii-Peninsula Earthquakes occurred during this monitoring period. A coseismic horizontal displacement of about 21 cm toward SSE was observed at our seafloor benchmark. The displacement is 3.5 times as large as the maximum displacement observed by the on-land GPS network in Japan, GEONET. The monitoring of seafloor crustal deformation is effective for detecting deformations associated with earthquakes occurring in ocean areas. This study is promoted by "Research Revolution 2002" of the Ministry of Education, Culture, Sports, Science and Technology, Japan. We are grateful to the captain and crews of the Research Vessel Asama of the Mie Prefectural Science and Technology Promotion Center, Japan.
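
    The benchmark position described above is simply the centroid ("weight center") of the three transponder positions; the coordinates below are fabricated local east-north-up values in metres.

      # Benchmark = centroid of three sea-bottom transponder positions.
      transponders = [(0.0, 0.0, -2045.1), (612.3, 41.7, -2050.8), (305.0, 530.2, -2048.3)]

      benchmark = tuple(sum(c) / len(transponders) for c in zip(*transponders))
      print("benchmark ENU position (m):", benchmark)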

  37. Results of the GABLS3 diurnal-cycle benchmark for wind energy applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodrigo, J. Sanz; Allaerts, D.; Avila, M.

    We present results of the GABLS3 model intercomparison benchmark revisited for wind energy applications. The case consists of a diurnal cycle, measured at the 200-m tall Cabauw tower in the Netherlands, including a nocturnal low-level jet. The benchmark includes a sensitivity analysis of WRF simulations using two input meteorological databases and five planetary boundary-layer schemes. A reference set of mesoscale tendencies is used to drive microscale simulations using RANS k-ϵ and LES turbulence models. The validation is based on rotor-based quantities of interest. Cycle-integrated mean absolute errors are used to quantify model performance. The results of the benchmark are used to discuss input uncertainties from mesoscale modelling, different meso-micro coupling strategies (online vs offline) and consistency between RANS and LES codes when dealing with boundary-layer mean flow quantities. Altogether, all the microscale simulations produce a consistent coupling with mesoscale forcings.
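
    For uniformly sampled series, the cycle-integrated mean absolute error used here reduces to the mean of the absolute simulated-minus-observed differences over the diurnal cycle; both series below are synthetic stand-ins for a rotor-height quantity of interest.

      import numpy as np

      t = np.linspace(0.0, 24.0, 145)                        # hours, 10-min samples
      observed  = 8.0 + 3.0 * np.sin(2 * np.pi * t / 24.0)   # stand-in measurements
      simulated = 8.3 + 2.6 * np.sin(2 * np.pi * (t - 0.5) / 24.0)

      mae = np.abs(simulated - observed).mean()
      print(f"cycle-integrated MAE = {mae:.2f} (same units as the series)")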

  38. Results of the GABLS3 diurnal-cycle benchmark for wind energy applications

    DOE PAGES

    Rodrigo, J. Sanz; Allaerts, D.; Avila, M.; ...

    2017-06-13

    We present results of the GABLS3 model intercomparison benchmark revisited for wind energy applications. The case consists of a diurnal cycle, measured at the 200-m tall Cabauw tower in the Netherlands, including a nocturnal low-level jet. The benchmark includes a sensitivity analysis of WRF simulations using two input meteorological databases and five planetary boundary-layer schemes. A reference set of mesoscale tendencies is used to drive microscale simulations using RANS k-ϵ and LES turbulence models. The validation is based on rotor-based quantities of interest. Cycle-integrated mean absolute errors are used to quantify model performance. The results of the benchmark are used to discuss input uncertainties from mesoscale modelling, different meso-micro coupling strategies (online vs offline) and consistency between RANS and LES codes when dealing with boundary-layer mean flow quantities. Altogether, all the microscale simulations produce a consistent coupling with mesoscale forcings.

  39. Benchmark Simulation Model No 2: finalisation of plant layout and default control strategy.

    PubMed

    Nopens, I; Benedetti, L; Jeppsson, U; Pons, M-N; Alex, J; Copp, J B; Gernaey, K V; Rosen, C; Steyer, J-P; Vanrolleghem, P A

    2010-01-01

    The COST/IWA Benchmark Simulation Model No 1 (BSM1) has been available for almost a decade. Its primary purpose has been to create a platform for control strategy benchmarking of activated sludge processes. The fact that the research work related to the benchmark simulation models has resulted in more than 300 publications worldwide demonstrates the interest in and need for such tools within the research community. Recent efforts within the IWA Task Group on "Benchmarking of control strategies for WWTPs" have focused on an extension of the benchmark simulation model. This extension aims at facilitating control strategy development and performance evaluation at a plant-wide level and, consequently, includes both pretreatment of wastewater as well as the processes describing sludge treatment. The motivation for the extension is the increasing interest in and need to operate and control wastewater treatment systems not only at an individual process level but also on a plant-wide basis. To facilitate the changes, the evaluation period has been extended to one year. A prolonged evaluation period allows for long-term control strategies to be assessed and enables the use of control handles that cannot be evaluated in a realistic fashion in the one-week BSM1 evaluation period. In this paper, the finalised plant layout is summarised and, as was done for BSM1, a default control strategy is proposed. A demonstration of how BSM2 can be used to evaluate control strategies is also given.

  20. Surgical stent planning: simulation parameter study for models based on DICOM standards.

    PubMed

    Scherer, S; Treichel, T; Ritter, N; Triebel, G; Drossel, W G; Burgert, O

    2011-05-01

    Endovascular Aneurysm Repair (EVAR) can be facilitated by a realistic simulation model of stent-vessel interaction. Therefore, numerical feasibility and integrability in the clinical environment were evaluated. The finite element method was used to determine the necessary simulation parameters for stent-vessel interaction in EVAR. Input variables and result data of the simulation model were examined for their standardization using DICOM supplements. The study identified four essential parameters for the stent-vessel simulation: blood pressure, intima constitution, plaque occurrence and the material properties of vessel and plaque. Output quantities such as the radial force of the stent and the contact pressure between stent and vessel can help the surgeon to evaluate implant fixation and sealing. The model geometry can be saved with DICOM "Surface Segmentation" objects and the upcoming "Implant Templates" supplement. Simulation results can be stored using the "Structured Report". A standards-based general simulation model for optimizing stent-graft selection may be feasible. At present, there are limitations due to the specification of individual vessel material parameters and for simulating the proximal fixation of stent-grafts with hooks. Simulation data with clinical relevance for documentation and presentation can be stored using existing or new DICOM extensions.

  1. Phase field benchmark problems for dendritic growth and linear elasticity

    DOE PAGES

    Jokisaari, Andrea M.; Voorhees, P. W.; Guyer, Jonathan E.; ...

    2018-03-26

    We present the second set of benchmark problems for phase field models that are being jointly developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST) along with input from other members in the phase field community. As the integrated computational materials engineering (ICME) approach to materials design has gained traction, there is an increasing need for quantitative phase field results. New algorithms and numerical implementations increase computational capabilities, necessitating standard problems to evaluate their impact on simulated microstructure evolution as well as their computational performance. We propose one benchmark problem for solidification and dendritic growth in a single-component system, and one problem for linear elasticity via the shape evolution of an elastically constrained precipitate. We demonstrate the utility and sensitivity of the benchmark problems by comparing the results of 1) dendritic growth simulations performed with different time integrators and 2) elastically constrained precipitate simulations with different precipitate sizes, initial conditions, and elastic moduli. As a result, these numerical benchmark problems will provide a consistent basis for evaluating different algorithms, both existing and those to be developed in the future, for accuracy and computational efficiency when applied to simulate physics often incorporated in phase field models.

  2. Phase field benchmark problems for dendritic growth and linear elasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jokisaari, Andrea M.; Voorhees, P. W.; Guyer, Jonathan E.

    We present the second set of benchmark problems for phase field models that are being jointly developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST) along with input from other members in the phase field community. As the integrated computational materials engineering (ICME) approach to materials design has gained traction, there is an increasing need for quantitative phase field results. New algorithms and numerical implementations increase computational capabilities, necessitating standard problems to evaluate their impact on simulated microstructure evolution as well as their computational performance. We propose one benchmark problem for solidification and dendritic growth in a single-component system, and one problem for linear elasticity via the shape evolution of an elastically constrained precipitate. We demonstrate the utility and sensitivity of the benchmark problems by comparing the results of 1) dendritic growth simulations performed with different time integrators and 2) elastically constrained precipitate simulations with different precipitate sizes, initial conditions, and elastic moduli. As a result, these numerical benchmark problems will provide a consistent basis for evaluating different algorithms, both existing and those to be developed in the future, for accuracy and computational efficiency when applied to simulate physics often incorporated in phase field models.

  3. Patch-based automatic retinal vessel segmentation in global and local structural context.

    PubMed

    Cao, Shuoying; Bharath, Anil A; Parker, Kim H; Ng, Jeffrey

    2012-01-01

    In this paper, we extend our published work [1] and propose an automated system to segment the retinal vessel bed in digital fundus images, with enough adaptability to analyze images from fluorescein angiography. This approach takes into account both the global and local context and enables both vessel segmentation and microvascular centreline extraction. These tools should allow researchers and clinicians to estimate and assess vessel diameter, capillary blood volume and microvascular topology for early-stage disease detection, monitoring and treatment. Global vessel bed segmentation is achieved by combining phase-invariant orientation fields with neighbourhood pixel intensities in a patch-based feature vector for supervised learning. This approach is evaluated against benchmarks on the DRIVE database [2]. Local microvascular centrelines within Regions-of-Interest (ROIs) are segmented by linking the phase-invariant orientation measures with phase-selective local structure features. Our global and local structural segmentation can be used to assess both pathological structural alterations and microemboli occurrence in non-invasive clinical settings in a longitudinal study.
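
    The patch-based supervised learning step has a simple skeleton: build a feature vector per pixel from its neighbourhood and train a classifier against ground-truth vessel labels. The sketch below uses raw patch intensities and a logistic-regression stand-in; the paper's phase-invariant orientation features and its classifier are not reproduced here, and the tiny synthetic image is purely illustrative.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def patch_features(img, labels, half=2):
            # one feature vector per interior pixel: the flattened
            # (2*half+1)^2 intensity neighbourhood around it
            H, W = img.shape
            X, y = [], []
            for i in range(half, H - half):
                for j in range(half, W - half):
                    X.append(img[i - half:i + half + 1, j - half:j + half + 1].ravel())
                    y.append(labels[i, j])
            return np.array(X), np.array(y)

        rng = np.random.default_rng(2)
        img = rng.random((32, 32))
        img[:, 15:17] += 0.8                    # a bright vertical "vessel"
        truth = np.zeros((32, 32), dtype=int)
        truth[:, 15:17] = 1
        X, y = patch_features(img, truth)
        clf = LogisticRegression(max_iter=1000).fit(X, y)
        print("training accuracy:", clf.score(X, y))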

  4. Fossil fuel furnace reactor

    DOEpatents

    Parkinson, William J.

    1987-01-01

    A fossil fuel furnace reactor is provided for simulating a continuous processing plant with a batch reactor. An internal reaction vessel contains a batch of shale oil, with the vessel having a relatively thin wall thickness for a heat transfer rate effective to simulate a process temperature history in the selected continuous processing plant. A heater jacket is disposed about the reactor vessel and defines a number of independent controllable temperature zones axially spaced along the reaction vessel. Each temperature zone can be energized to simulate a time-temperature history of process material through the continuous plant. A pressure vessel contains both the heater jacket and the reaction vessel at an operating pressure functionally selected to simulate the continuous processing plant. The process yield from the oil shale may be used as feedback information to software simulating operation of the continuous plant to provide operating parameters, i.e., temperature profiles, ambient atmosphere, operating pressure, material feed rates, etc., for simulation in the batch reactor.

  5. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and code verification assesses whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
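
    The core of such a QA test is a comparison of a numerical solution against a closed-form one, with a stated accuracy criterion. Here is a minimal, self-contained example in that spirit (not a PFLOTRAN test): an explicit finite-difference solution of 1-D diffusion with a step boundary condition is checked against the erfc similarity solution, with an illustrative tolerance.

        import numpy as np
        from math import erfc, sqrt

        D, L, nx = 1.0e-2, 1.0, 101             # diffusivity, domain, grid
        dx = L / (nx - 1)
        dt = 0.4 * dx * dx / D                  # stable explicit step
        nsteps = 200
        u = np.zeros(nx)
        u[0] = 1.0                              # step boundary condition
        for _ in range(nsteps):
            u[1:-1] += D * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        t = nsteps * dt
        x = np.linspace(0.0, L, nx)
        exact = np.array([erfc(xi / (2.0 * sqrt(D * t))) for xi in x])
        err = float(np.max(np.abs(u - exact)))
        print("max abs error:", err, "-> pass" if err < 0.02 else "-> fail")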

  6. Simulation Studies for Inspection of the Benchmark Test with PATRASH

    NASA Astrophysics Data System (ADS)

    Shimosaki, Y.; Igarashi, S.; Machida, S.; Shirakata, M.; Takayama, K.; Noda, F.; Shigaki, K.

    2002-12-01

    In order to delineate the halo-formation mechanisms in a typical FODO lattice, a 2-D simulation code PATRASH (PArticle TRAcking in a Synchrotron for Halo analysis) has been developed. The electric field originating from the space charge is calculated by the Hybrid Tree code method. Benchmark tests utilizing the three simulation codes ACCSIM, PATRASH and SIMPSONS were carried out, and the results were confirmed to be in fair agreement with each other. The details of the PATRASH simulation are discussed with some examples.

  7. Verification of cardiac mechanics software: benchmark problems and solutions for testing active and passive material behaviour.

    PubMed

    Land, Sander; Gurev, Viatcheslav; Arens, Sander; Augustin, Christoph M; Baron, Lukas; Blake, Robert; Bradley, Chris; Castro, Sebastian; Crozier, Andrew; Favino, Marco; Fastl, Thomas E; Fritz, Thomas; Gao, Hao; Gizzi, Alessio; Griffith, Boyce E; Hurtado, Daniel E; Krause, Rolf; Luo, Xiaoyu; Nash, Martyn P; Pezzuto, Simone; Plank, Gernot; Rossi, Simone; Ruprecht, Daniel; Seemann, Gunnar; Smith, Nicolas P; Sundnes, Joakim; Rice, J Jeremy; Trayanova, Natalia; Wang, Dafang; Jenny Wang, Zhinuo; Niederer, Steven A

    2015-12-08

    Models of cardiac mechanics are increasingly used to investigate cardiac physiology. These models are characterized by a high level of complexity, including the particular anisotropic material properties of biological tissue and the actively contracting material. A large number of independent simulation codes have been developed, but a consistent way of verifying the accuracy and replicability of simulations is lacking. To aid in the verification of current and future cardiac mechanics solvers, this study provides three benchmark problems for cardiac mechanics. These benchmark problems test the ability to accurately simulate pressure-type forces that depend on the deformed object's geometry, anisotropic and spatially varying material properties similar to those seen in the left ventricle, and active contractile forces. The benchmark was solved by 11 different groups to generate consensus solutions, with typical differences in higher-resolution solutions at approximately 0.5%, and consistent results between linear, quadratic and cubic finite elements as well as different approaches to simulating incompressible materials. Online tools and solutions are made available to allow these tests to be effectively used in verification of future cardiac mechanics software.
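
    The consensus-building step lends itself to a small illustration: each code's solution at a set of monitoring points is compared against the all-code mean. The numbers below are hypothetical deflections, not values from the study.

        import numpy as np

        # hypothetical solutions (mm) from three codes at three monitoring points
        sols = {
            "codeA": np.array([10.02, 4.98, 7.51]),
            "codeB": np.array([10.05, 5.01, 7.49]),
            "codeC": np.array([9.97, 4.99, 7.53]),
        }
        consensus = np.mean(list(sols.values()), axis=0)
        for name, s in sols.items():
            rel = np.max(np.abs(s - consensus) / np.abs(consensus))
            print(f"{name}: max relative difference {100.0 * rel:.2f}%")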

  8. Simulation-based comprehensive benchmarking of RNA-seq aligners

    PubMed Central

    Baruzzo, Giacomo; Hayer, Katharina E; Kim, Eun Ji; Di Camillo, Barbara; FitzGerald, Garret A; Grant, Gregory R

    2018-01-01

    Alignment is the first step in most RNA-seq analysis pipelines, and the accuracy of downstream analyses depends heavily on it. Unlike most steps in the pipeline, alignment is particularly amenable to benchmarking with simulated data. We performed a comprehensive benchmarking of 14 common splice-aware aligners for base, read, and exon junction-level accuracy and compared default with optimized parameters. We found that performance varied by genome complexity, and accuracy and popularity were poorly correlated. The most widely cited tool underperforms for most metrics, particularly when using default settings. PMID:27941783
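
    With simulated reads, the true origin of every read is known, so accuracy metrics reduce to bookkeeping. A minimal sketch of read-level accuracy follows; the dictionary layout and the tolerance are assumptions for illustration, not the paper's evaluation code.

        def read_accuracy(aligned, truth, tol=5):
            # fraction of simulated reads an aligner placed within `tol`
            # bases of their true origin; both maps: read_id -> (chrom, pos)
            correct = 0
            for rid, (chrom, pos) in truth.items():
                hit = aligned.get(rid)
                if hit is not None and hit[0] == chrom and abs(hit[1] - pos) <= tol:
                    correct += 1
            return correct / len(truth)

        truth = {"r1": ("chr1", 100), "r2": ("chr1", 500), "r3": ("chr2", 42)}
        aligned = {"r1": ("chr1", 102), "r2": ("chr2", 500)}   # r3 unaligned
        print(read_accuracy(aligned, truth))                   # 1/3 correct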

  9. Three-dimensional benchmark for variable-density flow and transport simulation: matching semi-analytic stability modes for steady unstable convection in an inclined porous box

    USGS Publications Warehouse

    Voss, Clifford I.; Simmons, Craig T.; Robinson, Neville I.

    2010-01-01

    This benchmark for three-dimensional (3D) numerical simulators of variable-density groundwater flow and solute or energy transport consists of matching simulation results with the semi-analytical solution for the transition from one steady-state convective mode to another in a porous box. Previous experimental and analytical studies of natural convective flow in an inclined porous layer have shown that there are a variety of convective modes possible depending on system parameters, geometry and inclination. In particular, there is a well-defined transition from the helicoidal mode consisting of downslope longitudinal rolls superimposed upon an upslope unicellular roll to a mode consisting of purely an upslope unicellular roll. Three-dimensional benchmarks for variable-density simulators are currently (2009) lacking and comparison of simulation results with this transition locus provides an unambiguous means to test the ability of such simulators to represent steady-state unstable 3D variable-density physics.

  10. A Simulation Environment for Benchmarking Sensor Fusion-Based Pose Estimators.

    PubMed

    Ligorio, Gabriele; Sabatini, Angelo Maria

    2015-12-19

    In-depth analysis and performance evaluation of sensor fusion-based estimators may be critical when performed using real-world sensor data. For this reason, simulation is widely recognized as one of the most powerful tools for algorithm benchmarking. In this paper, we present a simulation framework suitable for assessing the performance of sensor fusion-based pose estimators. The systems used for implementing the framework were magnetic/inertial measurement units (MIMUs) and a camera, although the addition of further sensing modalities is straightforward. Typical nuisance factors were also included for each sensor. The proposed simulation environment was validated using real-life sensor data employed for motion tracking. The highest mismatch between real and simulated sensors was about 5% of the measured quantity (for the camera simulation), whereas the lowest correlation was found for one axis of the gyroscope (0.90). In addition, a real benchmarking example of an extended Kalman filter for pose estimation from MIMU and camera data is presented.
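
    The validation idea, generating a sensor stream from a known ground-truth signal plus typical nuisance factors and then comparing it with the recorded one, can be sketched in a few lines. The bias and noise levels below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(3)
        t = np.linspace(0.0, 10.0, 1000)
        omega = 0.5 * np.sin(2.0 * np.pi * 0.4 * t)         # true angular rate
        real = omega + rng.normal(0.0, 0.02, t.size)        # "recorded" gyro axis
        sim = omega + 0.01 + rng.normal(0.0, 0.03, t.size)  # simulated: bias + noise
        corr = np.corrcoef(real, sim)[0, 1]
        print("correlation real vs simulated:", round(float(corr), 3))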

  11. Determining the sample size required to establish whether a medical device is non-inferior to an external benchmark.

    PubMed

    Sayers, Adrian; Crowther, Michael J; Judge, Andrew; Whitehouse, Michael R; Blom, Ashley W

    2017-08-28

    The use of benchmarks to assess the performance of implants such as those used in arthroplasty surgery is a widespread practice. It provides surgeons, patients and regulatory authorities with the reassurance that implants used are safe and effective. However, it is not currently clear how, or how many, implants should be statistically compared with a benchmark to assess whether that implant is superior, equivalent, non-inferior or inferior to the performance benchmark of interest. We aim to describe the methods and sample size required to conduct a one-sample non-inferiority study of a medical device for the purposes of benchmarking. We conducted a simulation study of a national register of medical devices. We simulated data, with and without a non-informative competing risk, to represent an arthroplasty population and describe three methods of analysis (z-test, 1-Kaplan-Meier and competing risks) commonly used in surgical research. We evaluate the performance of each method using power, bias, root-mean-square error, coverage and CI width. 1-Kaplan-Meier provides an unbiased estimate of implant net failure, which can be used to assess if a surgical device is non-inferior to an external benchmark. Small non-inferiority margins require significantly more individuals to be at risk compared with current benchmarking standards. A non-inferiority testing paradigm provides a useful framework for determining if an implant meets the required performance defined by an external benchmark. Current contemporary benchmarking standards have limited power to detect non-inferiority, and substantially larger sample sizes, in excess of 3200 procedures, are required to achieve a power greater than 60%. When benchmarking implant performance, net failure estimated using 1-Kaplan-Meier is preferable to crude failure estimated by competing risk models. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
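
    The sample-size message can be reproduced with a simulation of the testing procedure itself. The sketch below assumes full follow-up (so the 1-Kaplan-Meier estimate reduces to a crude proportion) and uses illustrative benchmark and margin values, not those of the paper.

        import numpy as np

        rng = np.random.default_rng(4)

        def power(n, true_p=0.05, benchmark=0.05, margin=1.2, reps=2000):
            # one-sided non-inferiority: declare success when the upper 95%
            # bound on the failure proportion sits below benchmark * margin
            z, wins = 1.645, 0
            for _ in range(reps):
                p_hat = rng.binomial(n, true_p) / n
                se = np.sqrt(max(p_hat * (1.0 - p_hat), 1e-12) / n)
                if p_hat + z * se < benchmark * margin:
                    wins += 1
            return wins / reps

        for n in (800, 1600, 3200):
            print(n, "procedures -> power", power(n))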

  12. The NAS kernel benchmark program

    NASA Technical Reports Server (NTRS)

    Bailey, D. H.; Barton, J. T.

    1985-01-01

    A collection of benchmark test kernels that measure supercomputer performance has been developed for the use of the NAS (Numerical Aerodynamic Simulation) program at the NASA Ames Research Center. This benchmark program is described in detail and the specific ground rules are given for running the program as a performance test.

  13. 75 FR 19882 - Safety Zone; Benchmark Destination Corporate Party, Fireworks Display, San Francisco, CA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-16

    ... participants and spectators from the dangers associated with the pyrotechnics. Unauthorized persons or vessels... by the pyrotechnics used in these fireworks displays, it would be contrary to the public interest to... delay in the effective date of this rule would expose mariners to the dangers posed by the pyrotechnics...

  14. Benchmarking of MCNP for calculating dose rates at an interim storage facility for nuclear waste.

    PubMed

    Heuel-Fabianek, Burkhard; Hille, Ralf

    2005-01-01

    During the operation of research facilities at Research Centre Jülich, Germany, nuclear waste is stored in drums and other vessels in an interim storage building on-site, which has a concrete shielding at the side walls. Owing to the lack of a well-defined source, measured gamma spectra were unfolded to determine the photon flux on the surface of the containers. The dose rate simulation, including the effects of skyshine, using the Monte Carlo transport code MCNP is compared with the measured dosimetric data at some locations in the vicinity of the interim storage building. The MCNP data for direct radiation confirm the data calculated using a point-kernel method. However, a comparison of the modelled dose rates for direct radiation and skyshine with the measured data demonstrates the need for a more precise definition of the source. Both the measured and the modelled dose rates verified the fact that the legal limits (<1 mSv a⁻¹) are met in the area outside the perimeter fence of the storage building to which members of the public have access. Using container surface data (gamma spectra) to define the source may be a useful tool for practical calculations and additionally for benchmarking of computer codes if the discussed critical aspects with respect to the source can be addressed adequately.

  15. VVER-440 and VVER-1000 reactor dosimetry benchmark - BUGLE-96 versus ALPAN VII.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duo, J. I.

    2011-07-01

    Analytical results of the vodo-vodyanoi energetichesky reactor (VVER)-440 and VVER-1000 reactor dosimetry benchmarks developed from engineering mockups at the Nuclear Research Inst. Rez LR-0 reactor are discussed. These benchmarks provide accurate determination of radiation field parameters in the vicinity and over the thickness of the reactor pressure vessel. Measurements are compared to calculated results obtained with two sets of tools: the TORT discrete ordinates code with the BUGLE-96 cross-section library versus the newly Westinghouse-developed RAPTOR-M3G with ALPAN VII.0. The parallel code RAPTOR-M3G enables detailed neutron distributions in energy and space in reduced computational time. The ALPAN VII.0 cross-section library is based on ENDF/B-VII.0 and is designed for reactor dosimetry applications. It uses a unique broad group structure to enhance resolution in the thermal-neutron-energy range compared to other analogous libraries. The comparison of fast neutron (E > 0.5 MeV) results shows good agreement (within 10%) between the BUGLE-96 and ALPAN VII.0 libraries. Furthermore, the results compare well with analogous results of participants of the REDOS program (2005). Finally, the analytical results for fast neutrons agree within 15% with the measurements for most locations in all three mockups. In general, however, the analytical results underestimate the attenuation through the reactor pressure vessel thickness compared to the measurements. (authors)

  16. CFD Analyses of Air-Ingress Accident for VHTRs

    NASA Astrophysics Data System (ADS)

    Ham, Tae Kyu

    The Very High Temperature Reactor (VHTR) is one of six proposed Generation-IV concepts for the next generation of nuclear power plants. The VHTR is advantageous because it is able to operate at very high temperatures, thus producing highly efficient electrical generation and hydrogen production. A critical safety event of the VHTR is a loss-of-coolant accident. This accident is initiated, in its worst-case scenario, by a double-ended guillotine break of the cross vessel that connects the reactor vessel and the power conversion unit. Following the depressurization process, the air (i.e., the air and helium mixture) in the reactor cavity could enter the reactor core, causing an air-ingress event. In the event of air ingress into the reactor core, the high-temperature in-core graphite structures will chemically react with the air and could lose their structural integrity. We designed a 1/8th scaled-down test facility to develop an experimental database for studying the mechanisms involved in the air-ingress phenomenon. The current research focuses on the analysis of the air-ingress phenomenon using the computational fluid dynamics (CFD) tool ANSYS FLUENT for better understanding of the air-ingress phenomenon. The anticipated key steps in the air-ingress scenario for a guillotine break of the VHTR cross vessel are: 1) depressurization; 2) density-driven stratified flow; 3) local hot plenum natural circulation; 4) diffusion into the reactor core; and 5) global natural circulation. However, the OSU air-ingress test facility covers the time from depressurization to local hot plenum natural circulation. Prior to beginning the CFD simulations for the OSU air-ingress test facility, benchmark studies of the mechanisms related to the air-ingress accident were performed to select the appropriate physical models for the accident analysis. In addition, preliminary experiments were performed with a simplified 1/30th scaled-down acrylic set-up to understand the air-ingress mechanism and to utilize the CFD simulation in the analysis of the phenomenon. Previous air-ingress studies simulated the depressurization process using simple assumptions or 1-D system code results. However, recent studies found flow oscillations near the end of the depressurization which could influence the next stage of the air-ingress accident. Therefore, CFD simulations were performed to examine the air-ingress mechanisms from the depressurization through the establishment of local natural circulation. In addition to the double-guillotine break scenario, there are other scenarios that can lead to an air-ingress event, such as a partial break in the cross vessel with various break locations, orientations, and shapes. These additional situations were also investigated. The simulation results for the OSU test facility showed that the discharged helium coolant from a reactor vessel during the depressurization process will be mixed with the air in the containment. This process makes the density of the gas mixture in the containment lower and the density-driven air-ingress flow slower, because the density-driven flow is established by the density difference of the gas species between the reactor vessel and the containment. In addition, for the simulations with various initial and boundary conditions, the results showed that the total accumulated air in the containment collapsed to within a 10% standard deviation when scaled by: (1) the density ratio and viscosity ratio of the gas species between the containment and the reactor vessel, and (2) the ratio of the air mole fraction and gas temperature to the reference value. By replacing the gas mixture in the reactor cavity with a gas heavier than air, the air-ingress speed slowed down. Based on the understanding of the air-ingress phenomena for the GT-MHR air-ingress scenario, several mitigation measures for the air-ingress accident are proposed. The CFD results are utilized to plan the experimental strategy and apparatus installation to obtain the best results when conducting an experiment. The validation of the generated CFD solutions will be performed with the OSU air-ingress experimental results. (Abstract shortened by UMI.)

  17. Establishing objective benchmarks in robotic virtual reality simulation at the level of a competent surgeon using the RobotiX Mentor simulator.

    PubMed

    Watkinson, William; Raison, Nicholas; Abe, Takashige; Harrison, Patrick; Khan, Shamim; Van der Poel, Henk; Dasgupta, Prokar; Ahmed, Kamran

    2018-05-01

    To establish objective benchmarks at the level of a competent robotic surgeon across different exercises and metrics for the RobotiX Mentor virtual reality (VR) simulator, suitable for use within a robotic surgical training curriculum. This retrospective observational study, conducted at King's College London, analysed results from multiple data sources, all of which used the RobotiX Mentor VR simulator. 123 participants with experience varying from novice to expert completed the exercises: 84 novices, 26 beginner intermediates, 9 advanced intermediates and 4 experts. Three basic skill exercises and two advanced skill exercises were used. Competency was established as the 25th centile of the mean advanced-intermediate score. Objective benchmarks derived from the 25th centile of the mean scores of the advanced intermediates provided suitably challenging yet achievable targets for training surgeons. The disparity in scores was greatest for the advanced exercises. Novice surgeons were able to achieve the benchmarks in the majority of metrics across all exercises. We have successfully created this proof-of-concept study, which requires validation in a larger cohort. Objective benchmarks obtained from the 25th centile of the mean scores of advanced intermediates provide clinically relevant benchmarks at the standard of a competent robotic surgeon that are challenging yet attainable. These can be used within a VR training curriculum, allowing participants to track and monitor their progress in a structured and progressive manner through five exercises, providing clearly defined targets and ensuring that a universal training standard is achieved across training surgeons. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
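
    Deriving such a benchmark is computationally trivial once the scores are collected; the sketch below shows the 25th-centile rule with made-up mean scores for the nine advanced intermediates on one hypothetical metric.

        import numpy as np

        # hypothetical mean exercise scores for the 9 advanced intermediates
        advanced_intermediate = [62.0, 71.5, 58.0, 80.2, 66.3, 74.1, 69.8, 77.0, 61.2]
        benchmark = np.percentile(advanced_intermediate, 25)
        trainee_score = 64.0
        print(f"benchmark = {benchmark:.1f}, trainee passed: {trainee_score >= benchmark}")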

  18. Access to a simulator is not enough: the benefits of virtual reality training based on peer-group-derived benchmarks--a randomized controlled trial.

    PubMed

    von Websky, Martin W; Raptis, Dimitri A; Vitz, Martina; Rosenthal, Rachel; Clavien, P A; Hahnloser, Dieter

    2013-11-01

    Virtual reality (VR) simulators are widely used to familiarize surgical novices with laparoscopy, but VR training methods differ in efficacy. In the present trial, self-controlled basic VR training (SC-training) was tested against training based on peer-group-derived benchmarks (PGD-training). Novice laparoscopic residents were randomized into an SC group (n = 34) and a group using PGD benchmarks (n = 34) for basic laparoscopic training. After completing basic training, both groups performed 60 VR laparoscopic cholecystectomies for performance analysis. Primary endpoints were simulator metrics; secondary endpoints were program adherence, trainee motivation, and training efficacy. Altogether, 66 residents completed basic training, and 3,837 of 3,960 (96.8 %) cholecystectomies were available for analysis. Course adherence was good, with only two dropouts, both in the SC group. The PGD group spent more time and repetitions in basic training until the benchmarks were reached and subsequently showed better performance in the readout cholecystectomies: median time to gallbladder extraction differed significantly, at 520 s (IQR 354-738 s) with SC-training versus 390 s (IQR 278-536 s) in the PGD group (p < 0.001), compared with 215 s (IQR 175-276 s) for experts. Path length of the right instrument also showed significant differences, again with the PGD group being more efficient. Basic VR laparoscopic training based on PGD benchmarks with external assessment is superior to SC training, resulting in higher trainee motivation and better performance in simulated laparoscopic cholecystectomies. We recommend such a basic course based on PGD benchmarks before advancing to more elaborate VR training.

  19. A new deadlock resolution protocol and message matching algorithm for the extreme-scale simulator

    DOE PAGES

    Engelmann, Christian; Naughton, III, Thomas J.

    2016-03-22

    Investigating the performance of parallel applications at scale on future high-performance computing (HPC) architectures and the performance impact of different HPC architecture choices is an important component of HPC hardware/software co-design. The Extreme-scale Simulator (xSim) is a simulation toolkit for investigating the performance of parallel applications at scale. xSim scales to millions of simulated Message Passing Interface (MPI) processes. The overhead introduced by a simulation tool is an important performance and productivity aspect. This paper documents two improvements to xSim: (1) a new deadlock resolution protocol to reduce the parallel discrete event simulation overhead and (2) a new simulated MPI message matching algorithm to reduce the oversubscription management overhead. The results clearly show a significant performance improvement. The simulation overhead for running the NAS Parallel Benchmark suite was reduced from 102% to 0% for the embarrassingly parallel (EP) benchmark and from 1,020% to 238% for the conjugate gradient (CG) benchmark. xSim also offers a highly accurate simulation mode for better tracking of injected MPI process failures; in this mode, the overhead was reduced from 3,332% to 204% for EP and from 37,511% to 13,808% for CG.

  20. Benchmark problems for numerical implementations of phase field models

    DOE PAGES

    Jokisaari, A. M.; Voorhees, P. W.; Guyer, J. E.; ...

    2016-10-01

    Here, we present the first set of benchmark problems for phase field models that are being developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST). While many scientific research areas use a limited set of well-established software, the growing phase field community continues to develop a wide variety of codes and lacks benchmark problems to consistently evaluate the numerical performance of new implementations. Phase field modeling has become significantly more popular as computational power has increased and is now becoming mainstream, driving the need for benchmark problems to validate and verify new implementations. We follow the example set by the micromagnetics community to develop an evolving set of benchmark problems that test the usability, computational resources, numerical capabilities and physical scope of phase field simulation codes. In this paper, we propose two benchmark problems that cover the physics of solute diffusion and growth and coarsening of a second phase via a simple spinodal decomposition model and a more complex Ostwald ripening model. We demonstrate the utility of benchmark problems by comparing the results of simulations performed with two different adaptive time stepping techniques, and we discuss the needs of future benchmark problems. The development of benchmark problems will enable the results of quantitative phase field models to be confidently incorporated into integrated computational materials science and engineering (ICME), an important goal of the Materials Genome Initiative.
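
    To make the spinodal-decomposition physics concrete, here is a minimal 1-D Cahn-Hilliard solver with a semi-implicit Fourier-spectral step. It is not one of the CHiMaD/NIST problem specifications; the gradient-energy coefficient, time step and initial condition are illustrative.

        import numpy as np

        # c_t = laplacian(c**3 - c - gamma * laplacian(c)), periodic domain
        N, gamma, dt, steps = 256, 4.0, 0.1, 4000
        rng = np.random.default_rng(5)
        c = 0.1 * rng.standard_normal(N)            # near-critical quench
        k = 2.0 * np.pi * np.fft.fftfreq(N, d=1.0)  # dx = 1
        k2, k4 = k**2, k**4
        for _ in range(steps):
            nl_hat = np.fft.fft(c**3 - c)           # explicit nonlinear term
            c_hat = (np.fft.fft(c) - dt * k2 * nl_hat) / (1.0 + dt * gamma * k4)
            c = np.fft.ifft(c_hat).real             # stiff term treated implicitly
        print("mean composition drift:", abs(float(c.mean())))     # conserved
        print("phase compositions:", float(c.min()), float(c.max()))  # -> ~ +/- 1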

  1. BACT Simulation User Guide (Version 7.0)

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.

    1997-01-01

    This report documents the structure and operation of a simulation model of the Benchmark Active Control Technology (BACT) Wind-Tunnel Model. The BACT system was designed, built, and tested at NASA Langley Research Center as part of the Benchmark Models Program and was developed to perform wind-tunnel experiments to obtain benchmark quality data to validate computational fluid dynamics and computational aeroelasticity codes, to verify the accuracy of current aeroservoelasticity design and analysis tools, and to provide an active controls testbed for evaluating new and innovative control algorithms for flutter suppression and gust load alleviation. The BACT system has been especially valuable as a control system testbed.

  2. An overview of the ENEA activities in the field of coupled codes NPP simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parisi, C.; Negrenti, E.; Sepielli, M.

    2012-07-01

    In the framework of the nuclear research activities in the fields of safety, training and education, ENEA (the Italian National Agency for New Technologies, Energy and Sustainable Development) is in charge of defining and pursuing all the necessary steps for the development of a NPP engineering simulator at the 'Casaccia' Research Center near Rome. A summary of the activities in the field of nuclear power plant simulation by coupled codes is presented here, together with the long-term strategy for the engineering simulator development. Specifically, results from the participation in international benchmarking activities like the OECD/NEA 'Kalinin-3' benchmark and the 'AER-DYN-002' benchmark, together with simulations of relevant events like the Fukushima accident, are reported. The ultimate goal of such activities performed using state-of-the-art technology is the re-establishment of top-level competencies in the NPP simulation field in order to facilitate the development of Enhanced Engineering Simulators and to upgrade competencies for supporting national energy strategy decisions, the nuclear national safety authority, and the R&D activities on NPP designs. (authors)

  3. [Benchmark experiment to verify radiation transport calculations for dosimetry in radiation therapy].

    PubMed

    Renner, Franziska

    2016-09-01

    Monte Carlo simulations are regarded as the most accurate method of solving complex problems in the field of dosimetry and radiation transport. In (external) radiation therapy they are increasingly used for the calculation of dose distributions during treatment planning. In comparison to other algorithms for the calculation of dose distributions, Monte Carlo methods have the capability of improving the accuracy of dose calculations - especially under complex circumstances (e.g. consideration of inhomogeneities). However, there is a lack of knowledge of how accurate the results of Monte Carlo calculations are on an absolute basis. A practical verification of the calculations can be performed by direct comparison with the results of a benchmark experiment. This work presents such a benchmark experiment and compares its results (with detailed consideration of measurement uncertainty) with the results of Monte Carlo calculations using the well-established Monte Carlo code EGSnrc. The experiment was designed to have parallels to external beam radiation therapy with respect to the type and energy of the radiation, the materials used and the kind of dose measurement. Because the properties of the beam have to be well known in order to compare the results of the experiment and the simulation on an absolute basis, the benchmark experiment was performed using the research electron accelerator of the Physikalisch-Technische Bundesanstalt (PTB), whose beam was accurately characterized in advance. The benchmark experiment and the corresponding Monte Carlo simulations were carried out for two different types of ionization chambers and the results were compared. Considering the uncertainty, which is about 0.7 % for the experimental values and about 1.0 % for the Monte Carlo simulation, the results of the simulation and the experiment coincide. Copyright © 2015. Published by Elsevier GmbH.
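
    The closing comparison amounts to checking agreement within combined uncertainty, which is easy to illustrate. The dose values below are invented; only the quoted uncertainty levels (about 0.7 % experimental, about 1.0 % Monte Carlo) echo the abstract.

        import math

        meas, u_meas = 1.000, 0.007     # relative dose, 0.7 % standard uncertainty
        sim, u_sim = 0.995, 0.010       # Monte Carlo result, 1.0 %
        u_c = math.hypot(u_meas, u_sim)      # combined standard uncertainty
        en = abs(sim - meas) / (2.0 * u_c)   # E_n with k = 2 expanded uncertainty
        print(f"E_n = {en:.2f} ->", "agreement" if en <= 1.0 else "disagreement")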

  4. Generalizable open source urban water portfolio simulation framework demonstrated using a multi-objective risk-based planning benchmark problem.

    NASA Astrophysics Data System (ADS)

    Trindade, B. C.; Reed, P. M.

    2017-12-01

    The growing access to and reduced cost of computing power in recent years has promoted rapid development and application of multi-objective water supply portfolio planning. As this trend continues, there is a pressing need for flexible risk-based simulation frameworks and improved algorithm benchmarking for emerging classes of water supply planning and management problems. This work contributes the Water Utilities Management and Planning (WUMP) model: a generalizable and open source simulation framework designed to capture how water utilities can minimize operational and financial risks by regionally coordinating planning and management choices, i.e. making more efficient and coordinated use of restrictions, water transfers and financial hedging combined with possible construction of new infrastructure. We introduce the WUMP simulation framework as part of a new multi-objective benchmark problem for planning and management of regionally integrated water utility companies. In this problem, a group of fictitious water utilities seek to balance the use of the mentioned reliability-driven actions (e.g., restrictions, water transfers and infrastructure pathways) against their inherent financial risks. Several traits make this an ideal benchmark problem, namely the presence of (1) strong non-linearities and discontinuities in the Pareto front caused by the step-wise nature of the decision-making formulation and by the abrupt addition of storage through infrastructure construction, (2) noise due to the stochastic nature of the streamflows and water demands, and (3) non-separability resulting from the cooperative formulation of the problem, in which decisions made by one stakeholder may substantially impact others. Both the open source WUMP simulation framework and its demonstration in a challenging benchmarking example hold value for promoting broader advances in urban water supply portfolio planning for regions confronting change.

  5. Analysis of artery blood flow before and after angioplasty

    NASA Astrophysics Data System (ADS)

    Tomaszewski, Michał; Baranowski, Paweł; Małachowski, Jerzy; Damaziak, Krzysztof; Bukała, Jakub

    2018-01-01

    The study presents a comparison of results obtained from numerical simulations of blood flow in two different arteries. One of them was considered to be narrowed in order to simulate arteriosclerosis obstructing the blood flow in the vessel, whereas the second simulated the vessel after angioplasty treatment. During the treatment, a biodegradable stent is inserted into the artery, which prevents the vessel walls from collapsing. The treatment was simulated by means of a numerical simulation using the finite element method. The final mesh geometry obtained from the analysis was exported to dedicated software, in which a flow domain inside the stented artery was created. The flow analysis was conducted in ANSYS Fluent software with non-deformable vessel walls.

  6. Comprehensive Benchmark Suite for Simulation of Particle Laden Flows Using the Discrete Element Method with Performance Profiles from the Multiphase Flow with Interface eXchanges (MFiX) Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Peiyuan; Brown, Timothy; Fullmer, William D.

    Five benchmark problems are developed and simulated with the computational fluid dynamics and discrete element model code MFiX. The benchmark problems span dilute and dense regimes, consider statistically homogeneous and inhomogeneous (both clusters and bubbles) particle concentrations and a range of particle and fluid dynamic computational loads. Several variations of the benchmark problems are also discussed to extend the computational phase space to cover granular (particles only), bidisperse and heat transfer cases. A weak scaling analysis is performed for each benchmark problem and, in most cases, the scalability of the code appears reasonable up to approximately 10³ cores. Profiling of the benchmark problems indicates that the most substantial computational time is being spent on particle-particle force calculations, drag force calculations and interpolating between discrete particle and continuum fields. Hardware performance analysis was also carried out, showing significant Level 2 cache miss ratios and a rather low degree of vectorization. These results are intended to serve as a baseline for future developments to the code as well as a preliminary indicator of where to best focus performance optimizations.
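
    Weak-scaling bookkeeping of this kind is simple: the problem size grows with the core count, so the ideal wall-clock time stays flat and efficiency is the baseline time divided by the measured time. The timings below are hypothetical.

        # wall-clock seconds for a problem whose size grows with core count
        timings = {1: 100.0, 8: 104.0, 64: 115.0, 512: 140.0, 1024: 190.0}
        base = timings[1]
        for cores, t in sorted(timings.items()):
            print(f"{cores:5d} cores: weak-scaling efficiency = {base / t:.2f}")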

  7. A chemical EOR benchmark study of different reservoir simulators

    NASA Astrophysics Data System (ADS)

    Goudarzi, Ali; Delshad, Mojdeh; Sepehrnoori, Kamy

    2016-09-01

    Interest in chemical EOR processes has intensified in recent years due to the advancements in chemical formulations and injection techniques. Injecting Polymer (P), surfactant/polymer (SP), and alkaline/surfactant/polymer (ASP) are techniques for improving sweep and displacement efficiencies with the aim of improving oil production in both secondary and tertiary floods. There has been great interest in chemical flooding recently for different challenging situations. These include high temperature reservoirs, formations with extreme salinity and hardness, naturally fractured carbonates, and sandstone reservoirs with heavy and viscous crude oils. More oil reservoirs are reaching maturity where secondary polymer floods and tertiary surfactant methods have become increasingly important. This significance has added to the industry's interest in using reservoir simulators as tools for reservoir evaluation and management to minimize costs and increase the process efficiency. Reservoir simulators with special features are needed to represent coupled chemical and physical processes present in chemical EOR processes. The simulators need to be first validated against well controlled lab and pilot scale experiments to reliably predict the full field implementations. The available data from laboratory scale include 1) phase behavior and rheological data; and 2) results of secondary and tertiary coreflood experiments for P, SP, and ASP floods under reservoir conditions, i.e. chemical retentions, pressure drop, and oil recovery. Data collected from corefloods are used as benchmark tests comparing numerical reservoir simulators with chemical EOR modeling capabilities such as STARS of CMG, ECLIPSE-100 of Schlumberger, REVEAL of Petroleum Experts. The research UTCHEM simulator from The University of Texas at Austin is also included since it has been the benchmark for chemical flooding simulation for over 25 years. The results of this benchmark comparison will be utilized to improve chemical design for field-scale studies using commercial simulators. The benchmark tests illustrate the potential of commercial simulators for chemical flooding projects and provide a comprehensive table of strengths and limitations of each simulator for a given chemical EOR process. Mechanistic simulations of chemical EOR processes will provide predictive capability and can aid in optimization of the field injection projects. The objective of this paper is not to compare the computational efficiency and solution algorithms; it only focuses on the process modeling comparison.

  8. A suite of exercises for verifying dynamic earthquake rupture codes

    USGS Publications Warehouse

    Harris, Ruth A.; Barall, Michael; Aagaard, Brad T.; Ma, Shuo; Roten, Daniel; Olsen, Kim B.; Duan, Benchun; Liu, Dunyu; Luo, Bin; Bai, Kangchen; Ampuero, Jean-Paul; Kaneko, Yoshihiro; Gabriel, Alice-Agnes; Duru, Kenneth; Ulrich, Thomas; Wollherr, Stephanie; Shi, Zheqiang; Dunham, Eric; Bydlon, Sam; Zhang, Zhenguo; Chen, Xiaofei; Somala, Surendra N.; Pelties, Christian; Tago, Josue; Cruz-Atienza, Victor Manuel; Kozdon, Jeremy; Daub, Eric; Aslam, Khurram; Kase, Yuko; Withers, Kyle; Dalguer, Luis

    2018-01-01

    We describe a set of benchmark exercises that are designed to test if computer codes that simulate dynamic earthquake rupture are working as intended. These types of computer codes are often used to understand how earthquakes operate, and they produce simulation results that include earthquake size, amounts of fault slip, and the patterns of ground shaking and crustal deformation. The benchmark exercises examine a range of features that scientists incorporate in their dynamic earthquake rupture simulations. These include implementations of simple or complex fault geometry, off‐fault rock response to an earthquake, stress conditions, and a variety of formulations for fault friction. Many of the benchmarks were designed to investigate scientific problems at the forefronts of earthquake physics and strong ground motions research. The exercises are freely available on our website for use by the scientific community.

  9. Benchmarking of dynamic simulation predictions in two software platforms using an upper limb musculoskeletal model

    PubMed Central

    Saul, Katherine R.; Hu, Xiao; Goehler, Craig M.; Vidt, Meghan E.; Daly, Melissa; Velisar, Anca; Murray, Wendy M.

    2014-01-01

    Several open-source or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single and multijoint movements, using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using the SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle model and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms. PMID:24995410

  10. Benchmarking of dynamic simulation predictions in two software platforms using an upper limb musculoskeletal model.

    PubMed

    Saul, Katherine R; Hu, Xiao; Goehler, Craig M; Vidt, Meghan E; Daly, Melissa; Velisar, Anca; Murray, Wendy M

    2015-01-01

    Several open-source or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single and multijoint movements, using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using the SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle model and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms.

  11. Numerical investigations on pressurized AL-composite vessel response to hypervelocity impacts: Comparison between experimental works and a numerical code

    NASA Astrophysics Data System (ADS)

    Mespoulet, Jérôme; Plassard, Fabien; Hereil, Pierre-Louis

    2015-09-01

    The response of pressurized composite-Al vessels to the hypervelocity impact of aluminum spheres has been numerically investigated to evaluate the influence of initial pressure on the vulnerability of these vessels. The investigated tanks are carbon-fiber overwrapped prestressed Al vessels. The explored internal air pressure ranges from 1 bar to 300 bar, and impact velocities are around 4400 m/s. Data obtained from experiments (X-ray radiographs, particle velocity measurements and post-mortem vessels) have been compared to numerical results from LS-DYNA ALE-Lagrange-SPH full coupling models. The simulations underestimate the debris cloud evolution and shock wave propagation in pressurized air, but the main modes of damage/rupture of the vessels given by the simulations are consistent with the post-mortem vessels recovered from the experiments. The first results of this numerical work are promising, and further simulation investigations with additional experimental data will be performed to increase the reliability of the simulation model. The final aim of this combined work is to numerically explore a wide range of impact conditions (impact angle, projectile weight, impact velocity, initial pressure) that cannot be explored experimentally. These results will define rules of thumb for an analytical vulnerability model for a given pressurized vessel.

  12. A suite of benchmark and challenge problems for enhanced geothermal systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Mark; Fu, Pengcheng; McClure, Mark

    A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study represented U.S. national laboratories, universities, and industry, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995, covering two phases of research (stimulation, development, and circulation) in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners. We present the suite of benchmark and challenge problems developed for the GTO-CCS, providing problem descriptions and sample solutions.

  13. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  14. Performance of Multi-chaotic PSO on a shifted benchmark functions set

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pluhacek, Michal; Senkerik, Roman; Zelinka, Ivan

    2015-03-10

    In this paper the performance of the Multi-chaotic PSO algorithm is investigated using two shifted benchmark functions. The purpose of shifted benchmark functions is to simulate time-variant real-world problems. The results of the chaotic PSO are compared with the canonical version of the algorithm. It is concluded that the multi-chaotic approach can lead to better results in the optimization of shifted functions.
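
    One simple reading of a "multi-chaotic" PSO, with chaotic maps substituted for the uniform random draws, can be sketched against a shifted sphere objective. The map choice (logistic, r = 4), the coefficients and the objective below are assumptions for illustration, not the paper's exact configuration.

        import numpy as np

        rng = np.random.default_rng(6)
        dim, n_part, iters = 10, 20, 300
        shift = rng.uniform(-20.0, 20.0, dim)

        def sphere(x):
            # shifted sphere: optimum moved away from the origin
            return np.sum((x - shift) ** 2, axis=-1)

        x = rng.uniform(-50.0, 50.0, (n_part, dim))
        v = np.zeros_like(x)
        ca = rng.uniform(0.1, 0.9, (n_part, dim))    # logistic-map states
        cb = rng.uniform(0.1, 0.9, (n_part, dim))
        pbest, pval = x.copy(), sphere(x)
        gbest = pbest[np.argmin(pval)].copy()
        w, c1, c2 = 0.72, 1.49, 1.49
        for _ in range(iters):
            ca = 4.0 * ca * (1.0 - ca)               # chaotic draws replace
            cb = 4.0 * cb * (1.0 - cb)               # uniform random numbers
            v = w * v + c1 * ca * (pbest - x) + c2 * cb * (gbest - x)
            x = x + v
            val = sphere(x)
            better = val < pval
            pbest[better], pval[better] = x[better], val[better]
            gbest = pbest[np.argmin(pval)].copy()
        print("best objective found:", float(pval.min()))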

  15. Lattice Boltzmann method for simulating the viscous flow in large distensible blood vessels

    NASA Astrophysics Data System (ADS)

    Fang, Haiping; Wang, Zuowei; Lin, Zhifang; Liu, Muren

    2002-05-01

    A lattice Boltzmann method for simulating the viscous flow in large distensible blood vessels is presented by introducing a boundary condition for elastic and moving boundaries. The mass conservation for the boundary condition is tested in detail. The viscous flow in elastic vessels is simulated with a pressure-radius relationship similar to that of the pulmonary blood vessels. The numerical results for steady flow agree with the analytical prediction to very high accuracy, and the simulation results for pulsatile flow are comparable with those of the aortic flows observed experimentally. The model is expected to find many applications for studying blood flows in large distensible arteries, especially in those suffering from atherosclerosis, stenosis, aneurysm, etc.
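
    For orientation, the sketch below is a bare-bones D2Q9 lattice Boltzmann channel flow with rigid bounce-back walls and a body force standing in for the pressure gradient; the paper's elastic, moving boundary condition and pressure-radius coupling are not reproduced, and all parameter values are illustrative. The simulated peak velocity should approach the analytic parabolic-profile value.

        import numpy as np

        nx, ny, tau = 16, 17, 0.8                   # grid and relaxation time
        w = np.array([4/9] + [1/9]*4 + [1/36]*4)    # D2Q9 weights
        cxs = np.array([0, 1, 0, -1, 0, 1, -1, -1, 1])
        cys = np.array([0, 0, 1, 0, -1, 1, 1, -1, -1])
        opp = np.array([0, 3, 4, 1, 2, 7, 8, 5, 6]) # opposite directions
        F = 1e-6                                    # body force along x
        solid = np.zeros((ny, nx), bool)
        solid[0, :] = solid[-1, :] = True           # rigid channel walls
        fluid = ~solid
        f = np.tile(w[:, None, None], (1, ny, nx))
        for _ in range(8000):
            rho = f.sum(axis=0)
            ux = (f * cxs[:, None, None]).sum(axis=0) / rho
            uy = (f * cys[:, None, None]).sum(axis=0) / rho
            uxf = ux + tau * F / rho                # velocity-shift forcing
            feq = np.empty_like(f)
            for i in range(9):
                cu = 3.0 * (cxs[i] * uxf + cys[i] * uy)
                feq[i] = rho * w[i] * (1.0 + cu + 0.5 * cu**2
                                       - 1.5 * (uxf**2 + uy**2))
            f[:, fluid] -= (f[:, fluid] - feq[:, fluid]) / tau   # BGK collision
            for i in range(9):                      # streaming (periodic in x)
                f[i] = np.roll(np.roll(f[i], cys[i], axis=0), cxs[i], axis=1)
            f[:, solid] = f[opp][:, solid]          # full-way bounce-back
        ux = (f * cxs[:, None, None]).sum(axis=0) / f.sum(axis=0)
        nu, H = (tau - 0.5) / 3.0, ny - 2
        print("simulated u_max:", float(ux[1:-1, nx // 2].max()))
        print("analytic  u_max:", F * H**2 / (8.0 * nu))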

  16. Benchmarking of Neutron Production of Heavy-Ion Transport Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  17. Benchmarking of Heavy Ion Transport Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  18. Generating Shifting Workloads to Benchmark Adaptability in Relational Database Systems

    NASA Astrophysics Data System (ADS)

    Rabl, Tilmann; Lang, Andreas; Hackl, Thomas; Sick, Bernhard; Kosch, Harald

    A large body of research concerns the adaptability of database systems. Many commercial systems already contain autonomic processes that adapt configurations as well as data structures and data organization. Yet there is virtually no way to fairly measure the quality of such optimizations. While standard benchmarks have been developed that simulate real-world database applications very precisely, none of them considers variations in workloads produced by human factors. Today’s benchmarks test the performance of database systems by measuring peak performance on homogeneous request streams. Nevertheless, in systems with user interaction, access patterns are constantly shifting. We present a benchmark that simulates a web information system with interaction of large user groups. It is based on the analysis of a real online eLearning management system with 15,000 users. The benchmark considers the temporal dependency of user interaction. Its main focus is to measure the adaptability of a database management system according to shifting workloads. We give details on our design approach, which uses sophisticated pattern analysis and data mining techniques.
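
    A hypothetical sketch of what a shifting (rather than homogeneous) request stream looks like; the request types and drift schedule are invented for illustration:

    ```python
    # The mix of request types drifts over time, unlike the homogeneous
    # streams used by standard benchmarks.
    import random
    from collections import Counter

    TYPES = ["browse", "search", "submit"]

    def request_stream(n, rng=None):
        rng = rng or random.Random(0)
        for t in range(n):
            drift = t / n                        # workload shifts with time
            weights = [1.0 - drift, 0.5, drift]  # browsing fades, submitting grows
            yield rng.choices(TYPES, weights)[0]

    print(Counter(request_stream(10000)))        # overall mix of the stream
    ```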

  19. Predictive simulation of bidirectional Glenn shunt using a hybrid blood vessel model.

    PubMed

    Li, Hao; Leow, Wee Kheng; Chiu, Ing-Sh

    2009-01-01

    This paper proposes a method for performing predictive simulation of cardiac surgery. It applies a hybrid approach to model the deformation of blood vessels. The hybrid blood vessel model consists of a reference Cosserat rod and a surface mesh. The reference Cosserat rod models the blood vessel's global bending, stretching, twisting and shearing in a physically correct manner, and the surface mesh models the surface details of the blood vessel. In this way, the deformation of blood vessels can be computed efficiently and accurately. Our predictive simulation system can produce complex surgical results given a small amount of user inputs. It allows the surgeon to easily explore various surgical options and evaluate them. Tests of the system using the bidirectional Glenn shunt (BDG) as an application example show that the results produced by the system are similar to real surgical results.

  20. Linear-regression convolutional neural network for fully automated coronary lumen segmentation in intravascular optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Yong, Yan Ling; Tan, Li Kuo; McLaughlin, Robert A.; Chee, Kok Han; Liew, Yih Miin

    2017-12-01

    Intravascular optical coherence tomography (OCT) is an optical imaging modality commonly used in the assessment of coronary artery diseases during percutaneous coronary intervention. Manual segmentation to assess luminal stenosis from OCT pullback scans is challenging and time consuming. We propose a linear-regression convolutional neural network to automatically perform vessel lumen segmentation, parameterized in terms of radial distances from the catheter centroid in polar space. Benchmarked against gold-standard manual segmentation, our proposed algorithm achieves average locational accuracy of the vessel wall of 22 microns, and 0.985 and 0.970 in Dice coefficient and Jaccard similarity index, respectively. The average absolute error of luminal area estimation is 1.38%. The processing rate is 40.6 ms per image, suggesting the potential to be incorporated into a clinical workflow and to provide quantitative assessment of vessel lumen in an intraoperative time frame.
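
    The two overlap scores quoted above have standard definitions; a small NumPy sketch for binary segmentation masks (illustrative, not the authors' evaluation pipeline):

    ```python
    import numpy as np

    def dice(a, b):
        """Dice coefficient: 2|A∩B| / (|A| + |B|)."""
        a, b = np.asarray(a, bool), np.asarray(b, bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    def jaccard(a, b):
        """Jaccard index: |A∩B| / |A∪B|."""
        a, b = np.asarray(a, bool), np.asarray(b, bool)
        return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

    pred  = np.array([[0, 1, 1], [0, 1, 0]])         # predicted lumen mask
    truth = np.array([[0, 1, 0], [0, 1, 1]])         # manual gold standard
    print(dice(pred, truth), jaccard(pred, truth))   # 0.666..., 0.5
    ```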

  1. Photosynthetic productivity and its efficiencies in ISIMIP2a biome models: benchmarking for impact assessment studies

    NASA Astrophysics Data System (ADS)

    Ito, Akihiko; Nishina, Kazuya; Reyer, Christopher P. O.; François, Louis; Henrot, Alexandra-Jane; Munhoven, Guy; Jacquemin, Ingrid; Tian, Hanqin; Yang, Jia; Pan, Shufen; Morfopoulos, Catherine; Betts, Richard; Hickler, Thomas; Steinkamp, Jörg; Ostberg, Sebastian; Schaphoff, Sibyll; Ciais, Philippe; Chang, Jinfeng; Rafique, Rashid; Zeng, Ning; Zhao, Fang

    2017-08-01

    Simulating vegetation photosynthetic productivity (or gross primary production, GPP) is a critical feature of the biome models used for impact assessments of climate change. We conducted a benchmarking of global GPP simulated by eight biome models participating in the second phase of the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP2a) with four meteorological forcing datasets (30 simulations), using independent GPP estimates and recent satellite data of solar-induced chlorophyll fluorescence as a proxy of GPP. The simulated global terrestrial GPP ranged from 98 to 141 Pg C yr-1 (1981-2000 mean); considerable inter-model and inter-data differences were found. Major features of spatial distribution and seasonal change of GPP were captured by each model, showing good agreement with the benchmarking data. All simulations showed incremental trends of annual GPP, seasonal-cycle amplitude, radiation-use efficiency, and water-use efficiency, mainly caused by the CO2 fertilization effect. The incremental slopes were higher than those obtained by remote sensing studies, but comparable with those by recent atmospheric observation. Apparent differences were found in the relationship between GPP and incoming solar radiation, for which forcing data differed considerably. The simulated GPP trends co-varied with a vegetation structural parameter, leaf area index, at model-dependent strengths, implying the importance of constraining canopy properties. In terms of extreme events, GPP anomalies associated with a historical El Niño event and large volcanic eruption were not consistently simulated in the model experiments due to deficiencies in both forcing data and parameterized environmental responsiveness. Although the benchmarking demonstrated the overall advancement of contemporary biome models, further refinements are required, for example, for solar radiation data and vegetation canopy schemes.

  2. Terrestrial Microgravity Model and Threshold Gravity Simulation using Magnetic Levitation

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    What is the threshold gravity (minimum gravity level) required for the nominal functioning of the human system? What dosage is required? Do human cell lines behave differently in microgravity in response to an external stimulus? The critical need for such a gravity simulator is emphasized by recent experiments on human epithelial cells and lymphocytes on the Space Shuttle clearly showing that cell growth and function are markedly different from those observed terrestrially. Those differences are also dramatic between cells grown in space and those in Rotating Wall Vessels (RWV), or NASA bioreactor often used to simulate microgravity, indicating that although morphological growth patterns (three dimensional growth) can be successfully simulated using RWVs, cell function performance is not reproduced - a critical difference. If cell function is dramatically affected by gravity off-loading, then cell response to stimuli such as radiation, stress, etc. can be very different from terrestrial cell lines. Yet, we have no good gravity simulator for use in study of these phenomena. This represents a profound shortcoming for countermeasures research. We postulate that we can use magnetic levitation of cells and tissue, through the use of strong magnetic fields and field gradients, as a terrestrial microgravity model to study human cells. Specific objectives of the research are: 1. To develop a tried, tested and benchmarked terrestrial microgravity model for cell culture studies; 2. Gravity threshold determination; 3. Dosage (magnitude and duration) of g-level required for nominal functioning of cells; 4. Comparisons of magnetic levitation model to other models such as RWV, hind limb suspension, etc. and 5. Cellular response to reduced gravity levels of Moon and Mars.

  3. Benchmark Calibration Tests Completed for Stirling Convertor Heater Head Life Assessment

    NASA Technical Reports Server (NTRS)

    Krause, David L.; Halford, Gary R.; Bowman, Randy R.

    2005-01-01

    A major phase of benchmark testing has been completed at the NASA Glenn Research Center (http://www.nasa.gov/glenn/), where a critical component of the Stirling Radioisotope Generator (SRG) is undergoing extensive experimentation to aid the development of an analytical life-prediction methodology. Two special-purpose test rigs subjected SRG heater-head pressure-vessel test articles to accelerated creep conditions, using the standard design temperatures to stay within the wall material's operating creep-response regime, but increasing wall stresses up to 7 times over the design point. This resulted in well-controlled "ballooning" of the heater-head hot end. The test plan was developed to provide critical input to analytical parameters in a reasonable period of time.

  4. An analytical benchmark and a Mathematica program for MD codes: Testing LAMMPS on the 2nd generation Brenner potential

    NASA Astrophysics Data System (ADS)

    Favata, Antonino; Micheletti, Andrea; Ryu, Seunghwa; Pugno, Nicola M.

    2016-10-01

    An analytical benchmark and a simple consistent Mathematica program are proposed for graphene and carbon nanotubes, which may serve to test any molecular dynamics code implemented with REBO potentials. By exploiting the benchmark, we checked results produced by LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator) when adopting the second-generation Brenner potential, made evident that this code in its current implementation produces results which are offset from those of the benchmark by a significant amount, and provide evidence of the reason.

  5. Simulant Basis for the Standard High Solids Vessel Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Reid A.; Fiskum, Sandra K.; Suffield, Sarah R.

    The Waste Treatment and Immobilization Plant (WTP) is working to develop a Standard High Solids Vessel Design (SHSVD) process vessel. To support testing of this new design, WTP engineering staff requested that a Newtonian simulant and a non-Newtonian simulant be developed that would represent the Most Adverse Design Conditions (in development) with respect to mixing performance as specified by WTP. The majority of the simulant requirements are specified in 24590-PTF-RPT-PE-16-001, Rev. 0. The first step in this process is to develop the basis for these simulants. This document describes the basis for the properties of these two simulant types. The simulant recipes that meet this basis will be provided in a subsequent document.

  6. Experimental benchmarking of a Monte Carlo dose simulation code for pediatric CT

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Samei, Ehsan; Yoshizumi, Terry; Colsher, James G.; Jones, Robert P.; Frush, Donald P.

    2007-03-01

    In recent years, there has been a desire to reduce CT radiation dose to children because of their susceptibility and prolonged risk for cancer induction. Concerns arise, however, as to the impact of dose reduction on image quality and thus potentially on diagnostic accuracy. To study the dose and image quality relationship, we are developing a simulation code to calculate organ dose in pediatric CT patients. To benchmark this code, a cylindrical phantom was built to represent a pediatric torso, which allows measurements of dose distributions from its center to its periphery. Dose distributions for axial CT scans were measured on a 64-slice multidetector CT (MDCT) scanner (GE Healthcare, Chalfont St. Giles, UK). The same measurements were simulated using a Monte Carlo code (PENELOPE, Universitat de Barcelona) with the applicable CT geometry including bowtie filter. The deviations between simulated and measured dose values were generally within 5%. To our knowledge, this work is one of the first attempts to compare measured radial dose distributions on a cylindrical phantom with Monte Carlo simulated results. It provides a simple and effective method for benchmarking organ dose simulation codes and demonstrates the potential of Monte Carlo simulation for investigating the relationship between dose and image quality for pediatric CT patients.

  7. IgSimulator: a versatile immunosequencing simulator.

    PubMed

    Safonova, Yana; Lapidus, Alla; Lill, Jennie

    2015-10-01

    The recent introduction of next-generation sequencing technologies to antibody studies has resulted in a growing number of immunoinformatics tools for antibody repertoire analysis. However, benchmarking these newly emerging tools remains problematic since the gold standard datasets that are needed to validate these tools are typically not available. Since simulating antibody repertoires is often the only feasible way to benchmark new immunoinformatics tools, we developed the IgSimulator tool that addresses various complications in generating realistic antibody repertoires. IgSimulator's code has a modular structure and can be easily adapted to new simulation requirements. IgSimulator is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from yana-safonova.github.io/ig_simulator.

  8. Benchmarks for target tracking

    NASA Astrophysics Data System (ADS)

    Dunham, Darin T.; West, Philip D.

    2011-09-01

    The term benchmark originates from the chiseled horizontal marks that surveyors made, into which an angle-iron could be placed to bracket ("bench") a leveling rod, thus ensuring that the leveling rod can be repositioned in exactly the same place in the future. A benchmark in computer terms is the result of running a computer program, or a set of programs, in order to assess the relative performance of an object by running a number of standard tests and trials against it. This paper will discuss the history of simulation benchmarks that are being used by multiple branches of the military and agencies of the US government. These benchmarks range from missile defense applications to chemical biological situations. Typically, a benchmark is used with Monte Carlo runs in order to tease out how algorithms deal with variability and the range of possible inputs. We will also describe problems that can be solved by a benchmark.
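
    A generic sketch of the Monte Carlo usage described, where `run_trial` is a hypothetical stand-in for one randomized run of an algorithm against a benchmark scenario:

    ```python
    import random
    import statistics

    def run_trial(seed):
        """Stand-in for one benchmark run; returns a per-run error metric."""
        rng = random.Random(seed)
        return rng.gauss(1.0, 0.1)

    # Repeated randomized trials expose how an algorithm handles variability
    # across the range of possible inputs.
    scores = [run_trial(s) for s in range(100)]
    print("mean:", statistics.mean(scores), "stdev:", statistics.stdev(scores))
    ```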

  9. Benchmarking of neutron production of heavy-ion transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, I.; Ronningen, R. M.; Heilbronn, L.

    Document available in abstract form only, full text of document follows: Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required. (authors)

  10. Measurement of stray EC radiation on W7-AS

    NASA Astrophysics Data System (ADS)

    Gandini, F.; Hirsch, M.; Cirant, S.; Erckmann, V.; Granucci, G.; Kasparek, W.; Laqua, H. P.; Muzzini, V.; Nowak, S.; Radau, S.

    2001-10-01

    In the framework of a collaboration between IFP-CNR Milano, IPP Garching/Greifswald, and IPF Stuttgart, a set of four millimeter-wave probes has been installed in the W7-AS stellarator at selected positions on the inner vessel wall. Their purpose is to observe RF stray radiation during operation in the presence of strong levels of electron cyclotron (EC) waves, used for plasma start-up, heating, and current drive. The aim of these measurements is to benchmark two complementary theoretical models for the distribution of the stray radiation in the vessel. From these codes, quantitative predictions are expected for the spatial distribution of the RF wall load and the RF impact on in-vessel components in large future devices such as W7-X and, possibly, ITER. This input is important for optimizing the wall armour and selecting RF-compatible in-vessel materials. We present first measurements from different heating and start-up scenarios, with up to 800 kW of injected power at 140 GHz and different launching geometries. An analysis of measurements performed on FTU using a previous version of the sniffer probe is also presented.

  11. Combining Phase Identification and Statistic Modeling for Automated Parallel Benchmark Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Ye; Ma, Xiaosong; Liu, Qing Gary

    2015-01-01

    Parallel application benchmarks are indispensable for evaluating/optimizing HPC software and hardware. However, it is very challenging and costly to obtain high-fidelity benchmarks reflecting the scale and complexity of state-of-the-art parallel applications. Hand-extracted synthetic benchmarks are time- and labor-intensive to create. Real applications themselves, while offering the most accurate performance evaluation, are expensive to compile, port, and reconfigure, and are often plainly inaccessible due to security or ownership concerns. This work contributes APPRIME, a novel tool for trace-based automatic parallel benchmark generation. Taking as input standard communication-I/O traces of an application's execution, it couples accurate automatic phase identification with statistical regeneration of event parameters to create compact, portable, and to some degree reconfigurable parallel application benchmarks. Experiments with four NAS Parallel Benchmarks (NPB) and three real scientific simulation codes confirm the fidelity of APPRIME benchmarks. They retain the original applications' performance characteristics, in particular the relative performance across platforms.

  12. Benchmarking Heavy Ion Transport Codes FLUKA, HETC-HEDS, MARS15, MCNPX, and PHITS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronningen, Reginald Martin; Remec, Igor; Heilbronn, Lawrence H.

    Powerful accelerators such as spallation neutron sources, muon-collider/neutrino facilities, and rare isotope beam facilities must be designed with the consideration that they handle the beam power reliably and safely, and they must be optimized to yield maximum performance relative to their design requirements. The simulation codes used for design purposes must produce reliable results. If not, component and facility designs can become costly, have limited lifetime and usefulness, and could even be unsafe. The objective of this proposal is to assess the performance of the currently available codes PHITS, FLUKA, MARS15, MCNPX, and HETC-HEDS that could be used for design simulations involving heavy ion transport. We plan to assess their performance by performing simulations and comparing results against experimental data of benchmark quality. Quantitative knowledge of the biases and the uncertainties of the simulations is essential as this potentially impacts the safe, reliable and cost effective design of any future radioactive ion beam facility. Further benchmarking of heavy-ion transport codes was one of the actions recommended in the Report of the 2003 RIA R&D Workshop.

  13. Humidification of Blow-By Oxygen During Recovery of Postoperative Pediatric Patients: One Unit's Journey.

    PubMed

    Donahue, Suzanne; DiBlasi, Robert M; Thomas, Karen

    2018-02-02

    To examine the practice of nebulizer cool mist blow-by oxygen administered to spontaneously breathing postanesthesia care unit (PACU) pediatric patients during Phase one recovery. Existing evidence was evaluated. Informal benchmarking documented practices in peer organizations. An in vitro study was then conducted to simulate clinical practice and determine depth and amount of airway humidity delivery with blow-by oxygen. Informal benchmarking information was obtained by telephone interview. Using a three-dimensional printed simulation model of the head connected to a breathing lung simulator, depth and amount of moisture delivery in the respiratory tree were measured. Evidence specific to PACU administration of cool mist blow-by oxygen was limited. Informal benchmarking revealed that routine cool mist oxygenated blow-by administration was not widely practiced. The laboratory experiment revealed minimal moisture reaching the mid-tracheal area of the simulated airway model. Routine use of oxygenated cool mist in spontaneously breathing pediatric PACU patients is not supported.

  14. Properties important to mixing and simulant recommendations for WTP full-scale vessel testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poirier, M. R.; Martino, C. J.

    2015-12-01

    Full Scale Vessel Testing (FSVT) is being planned by Bechtel National, Inc., to demonstrate the ability of the standard high solids vessel design (SHSVD) to meet mixing requirements over the range of fluid properties planned for processing in the Pretreatment Facility (PTF) of the Hanford Waste Treatment and Immobilization Plant (WTP). Testing will use simulated waste rather than actual Hanford waste. Therefore, the use of suitable simulants is critical to achieving the goals of the test program. WTP personnel requested the Savannah River National Laboratory (SRNL) to assist with development of simulants for use in FSVT. Among the tasks assigned to SRNL was to develop a list of waste properties that are important to pulse-jet mixer (PJM) performance in WTP vessels with elevated concentrations of solids.

  15. Opto-Electronic and Interconnects Hierarchical Design Automation System (OE-IDEAS)

    DTIC Science & Technology

    2004-05-01

    Benchmarks from the DaVinci Netbook website (simulation of critical path from the Mayo “10G” system MCM board): In May 2002, CFDRC downloaded all the materials from the DaVinci Netbook website containing the benchmark…

  16. ZPPR-20 phase D: a cylindrical assembly of polyethylene moderated U metal reflected by beryllium oxide and polyethylene.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lell, R.; Grimm, K.; McKnight, R.

    The Zero Power Physics Reactor (ZPPR) fast critical facility was built at the Argonne National Laboratory-West (ANL-W) site in Idaho in 1969 to obtain neutron physics information necessary for the design of fast breeder reactors. The ZPPR-20D Benchmark Assembly was part of a series of cores built in Assembly 20 (References 1 through 3) of the ZPPR facility to provide data for developing a nuclear power source for space applications (SP-100). The assemblies were beryllium oxide reflected and had core fuel compositions containing enriched uranium fuel, niobium and rhenium. ZPPR-20 Phase C (HEU-MET-FAST-075) was built as the reference flight configuration. Two other configurations, Phases D and E, simulated accident scenarios. Phase D modeled the water immersion scenario during a launch accident, and Phase E (SUB-HEU-MET-FAST-001) modeled the earth burial scenario during a launch accident. Two configurations were recorded for the simulated water immersion accident scenario (Phase D); the critical configuration, documented here, and the subcritical configuration (SUB-HEU-MET-MIXED-001). Experiments in Assembly 20 Phases 20A through 20F were performed in 1988. The reference water immersion configuration for the ZPPR-20D assembly was obtained as reactor loading 129 on October 7, 1988 with a fissile mass of 167.477 kg and a reactivity of -4.626 ± 0.044 ¢ (k ≈ 0.9997). The SP-100 core was to be constructed of highly enriched uranium nitride, niobium, rhenium and depleted lithium. The core design called for two enrichment zones with niobium-1% zirconium alloy fuel cladding and core structure. Rhenium was to be used as a fuel pin liner to provide shutdown in the event of water immersion and flooding. The core coolant was to be depleted lithium metal (⁷Li). The core was to be surrounded radially with a niobium reactor vessel and bypass which would carry the lithium coolant to the forward inlet plenum. Immediately inside the reactor vessel was a rhenium baffle which would act as a neutron curtain in the event of water immersion. A fission gas plenum and coolant inlet plenum were located axially forward of the core. Some material substitutions had to be made in mocking up the SP-100 design. The ZPPR-20 critical assemblies were fueled by 93% enriched uranium metal because uranium nitride, which was the SP-100 fuel type, was not available. ZPPR Assembly 20D was designed to simulate a water immersion accident. The water was simulated by polyethylene (CH₂), which contains a similar amount of hydrogen and has a similar density. A very accurate transformation to a simplified model is needed to make any of the ZPPR assemblies a practical criticality-safety benchmark. There is simply too much geometric detail in an exact model of a ZPPR assembly, particularly as complicated an assembly as ZPPR-20D. The transformation must reduce the detail to a practical level without masking any of the important features of the critical experiment. And it must do this without increasing the total uncertainty far beyond that of the original experiment. Such a transformation will be described in a later section. First, Assembly 20D was modeled in full detail: every plate, drawer, matrix tube, and air gap was modeled explicitly. Then the regionwise compositions and volumes from this model were converted to an RZ model. ZPPR Assembly 20D has been determined to be an acceptable criticality-safety benchmark experiment.

  17. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    NASA Astrophysics Data System (ADS)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
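
    For flavor, a generic explicit finite-difference sketch of the one-dimensional advection-dispersion-reaction equation that such tools solve; the paper's code is Visual Basic inside EXCEL, and this Python analogue with all of its parameters is illustrative only:

    ```python
    # 1-D advection-dispersion plus first-order decay, explicit upwind scheme.
    import numpy as np

    L, n = 1.0, 101              # column length [m], grid points
    dx = L / (n - 1)
    v, D, k = 1e-3, 1e-5, 1e-4   # velocity, dispersion, decay rate (hypothetical)
    dt = 0.4 * dx * dx / D       # conservative explicit stability limit
    c = np.zeros(n)
    c[0] = 1.0                   # constant-concentration inlet

    for _ in range(2000):
        adv = -v * (c[1:-1] - c[:-2]) / dx                 # upwind advection
        dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2   # dispersion
        c[1:-1] += dt * (adv + dif - k * c[1:-1])          # reaction sink
        c[-1] = c[-2]                                      # zero-gradient outlet
    print(c[::20])               # coarse snapshot of the concentration profile
    ```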

  18. Benchmarking FEniCS for mantle convection simulations

    NASA Astrophysics Data System (ADS)

    Vynnytska, L.; Rognes, M. E.; Clark, S. R.

    2013-01-01

    This paper evaluates the usability of the FEniCS Project for mantle convection simulations by numerical comparison to three established benchmarks. The benchmark problems all concern convection processes in an incompressible fluid induced by temperature or composition variations, and cover three cases: (i) steady-state convection with depth- and temperature-dependent viscosity, (ii) time-dependent convection with constant viscosity and internal heating, and (iii) a Rayleigh-Taylor instability. These problems are modeled by the Stokes equations for the fluid and advection-diffusion equations for the temperature and composition. The FEniCS Project provides a novel platform for the automated solution of differential equations by finite element methods. In particular, it offers a significant flexibility with regard to modeling and numerical discretization choices; we have here used a discontinuous Galerkin method for the numerical solution of the advection-diffusion equations. Our numerical results are in agreement with the benchmarks, and demonstrate the applicability of both the discontinuous Galerkin method and FEniCS for such applications.

  19. Evaluation of Neutron Radiography Reactor LEU-Core Start-Up Measurements

    DOE PAGES

    Bess, John D.; Maddock, Thomas L.; Smolinski, Andrew T.; ...

    2014-11-04

    Benchmark models were developed to evaluate the cold-critical start-up measurements performed during the fresh core reload of the Neutron Radiography (NRAD) reactor with Low Enriched Uranium (LEU) fuel. Experiments include criticality, control-rod worth measurements, shutdown margin, and excess reactivity for four core loadings with 56, 60, 62, and 64 fuel elements. The worth of four graphite reflector block assemblies and an empty dry tube used for experiment irradiations were also measured and evaluated for the 60-fuel-element core configuration. Dominant uncertainties in the experimental keff come from uncertainties in the manganese content and impurities in the stainless steel fuel cladding as well as the ²³⁶U and erbium poison content in the fuel matrix. Calculations with MCNP5 and ENDF/B-VII.0 neutron nuclear data are approximately 1.4% (9σ) greater than the benchmark model eigenvalues, which is commonly seen in Monte Carlo simulations of other TRIGA reactors. Simulations of the worth measurements are within the 2σ uncertainty for most of the benchmark experiment worth values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.

  20. Evaluation of Neutron Radiography Reactor LEU-Core Start-Up Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bess, John D.; Maddock, Thomas L.; Smolinski, Andrew T.

    Benchmark models were developed to evaluate the cold-critical start-up measurements performed during the fresh core reload of the Neutron Radiography (NRAD) reactor with Low Enriched Uranium (LEU) fuel. Experiments include criticality, control-rod worth measurements, shutdown margin, and excess reactivity for four core loadings with 56, 60, 62, and 64 fuel elements. The worth of four graphite reflector block assemblies and an empty dry tube used for experiment irradiations were also measured and evaluated for the 60-fuel-element core configuration. Dominant uncertainties in the experimental keff come from uncertainties in the manganese content and impurities in the stainless steel fuel cladding as well as the ²³⁶U and erbium poison content in the fuel matrix. Calculations with MCNP5 and ENDF/B-VII.0 neutron nuclear data are approximately 1.4% (9σ) greater than the benchmark model eigenvalues, which is commonly seen in Monte Carlo simulations of other TRIGA reactors. Simulations of the worth measurements are within the 2σ uncertainty for most of the benchmark experiment worth values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.

  1. Simulation of Benchmark Cases with the Terminal Area Simulation System (TASS)

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at; Proctor, Fred

    2011-01-01

    The hydrodynamic core of the Terminal Area Simulation System (TASS) is evaluated against different benchmark cases. In the absence of closed-form solutions for the equations governing atmospheric flows, the models are usually evaluated against idealized test cases. Over the years, various authors have suggested a suite of these idealized cases, which have become standards for testing and evaluating the dynamics and thermodynamics of atmospheric flow models. In this paper, simulations of three such cases are described. In addition, the TASS model is evaluated against a test case that uses an exact solution of the Navier-Stokes equations. The TASS results are compared against previously reported simulations of these benchmark cases in the literature. It is demonstrated that the TASS model is highly accurate, stable, and robust.

  2. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    NASA Astrophysics Data System (ADS)

    Izzuddin, Nur; Sunarsih; Priyanto, Agoes

    2015-05-01

    A marine diesel engine simulator, whose engine rotation is controlled and transmitted through the propeller shaft as a vessel operates in the open seas, offers a new methodology for self-propulsion tests that track fuel savings in real time. Considering this circumstance, this paper presents a real-time marine diesel engine simulator system that tracks the actual performance of a ship through a computer-simulated model. Mathematical models of the marine diesel engine and the propeller are used in the simulation to estimate the fuel rate, engine rotating speed, and the thrust and torque of the propeller, and thus achieve the target vessel speed. The inputs and outputs form a real-time control system for the fuel-saving rate and propeller rotating speed, representing the marine diesel engine characteristics. Self-propulsion tests in calm waters were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate fuel savings by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will help users analyze different vessel-speed conditions to obtain better characteristics and hence optimize the fuel-saving rate.

  3. Evaluation of the Pool Critical Assembly Benchmark with Explicitly-Modeled Geometry using MCNP6

    DOE PAGES

    Kulesza, Joel A.; Martz, Roger Lee

    2017-03-01

    Despite being one of the most widely used benchmarks for qualifying light water reactor (LWR) radiation transport methods and data, no benchmark calculation of the Oak Ridge National Laboratory (ORNL) Pool Critical Assembly (PCA) pressure vessel wall benchmark facility (PVWBF) using MCNP6 with explicitly modeled core geometry exists. As such, this paper provides results for such an analysis. First, a criticality calculation is used to construct the fixed source term. Next, ADVANTG-generated variance reduction parameters are used within the final MCNP6 fixed source calculations. These calculations provide unadjusted dosimetry results using three sets of dosimetry reaction cross sections of varying ages (those packaged with MCNP6, from the IRDF-2002 multi-group library, and from the ACE-formatted IRDFF v1.05 library). These results are then compared to two different sets of measured reaction rates. The comparison agrees within 2% in an overall sense and within 5% on a specific reaction- and dosimetry-location basis. Except for the neptunium dosimetry, the individual foil raw calculation-to-experiment comparisons usually agree within 10% but are typically greater than unity. Finally, in the course of developing these calculations, geometry that has previously not been completely specified is provided herein for the convenience of future analysts.

  4. Benchmark simulation Model no 2 in Matlab-simulink: towards plant-wide WWTP control strategy evaluation.

    PubMed

    Vreck, D; Gernaey, K V; Rosen, C; Jeppsson, U

    2006-01-01

    In this paper, implementation of the Benchmark Simulation Model No 2 (BSM2) within Matlab-Simulink is presented. The BSM2 is developed for plant-wide WWTP control strategy evaluation on a long-term basis. It consists of a pre-treatment process, an activated sludge process, and sludge treatment processes. Extended evaluation criteria are proposed for plant-wide control strategy assessment. Default open-loop and closed-loop strategies are also proposed to be used as references with which to compare other control strategies. Simulations indicate that the BSM2 is an appropriate tool for plant-wide control strategy evaluation.

  5. A study of workstation computational performance for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Maddalon, Jeffrey M.; Cleveland, Jeff I., II

    1995-01-01

    With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on different machines from several companies including: CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.

  6. Proficiency performance benchmarks for removal of simulated brain tumors using a virtual reality simulator NeuroTouch.

    PubMed

    AlZhrani, Gmaan; Alotaibi, Fahad; Azarnoush, Hamed; Winkler-Schwartz, Alexander; Sabbagh, Abdulrahman; Bajunaid, Khalid; Lajoie, Susanne P; Del Maestro, Rolando F

    2015-01-01

    Assessment of neurosurgical technical skills involved in the resection of cerebral tumors in operative environments is complex. Educators emphasize the need to develop and use objective and meaningful assessment tools that are reliable and valid for assessing trainees' progress in acquiring surgical skills. The purpose of this study was to develop proficiency performance benchmarks for a newly proposed set of objective measures (metrics) of neurosurgical technical skills performance during simulated brain tumor resection using a new virtual reality simulator (NeuroTouch). Each participant performed the resection of 18 simulated brain tumors of different complexity using the NeuroTouch platform. Surgical performance was computed using Tier 1 and Tier 2 metrics derived from NeuroTouch simulator data consisting of (1) safety metrics, including (a) volume of surrounding simulated normal brain tissue removed, (b) sum of forces utilized, and (c) maximum force applied during tumor resection; (2) quality of operation metric, which involved the percentage of tumor removed; and (3) efficiency metrics, including (a) instrument total tip path lengths and (b) frequency of pedal activation. All studies were conducted in the Neurosurgical Simulation Research Centre, Montreal Neurological Institute and Hospital, McGill University, Montreal, Canada. A total of 33 participants were recruited, including 17 experts (board-certified neurosurgeons) and 16 novices (7 senior and 9 junior neurosurgery residents). The results demonstrated that "expert" neurosurgeons resected less surrounding simulated normal brain tissue and less tumor tissue than residents. These data are consistent with the concept that "experts" focused more on safety of the surgical procedure compared with novices. By analyzing experts' neurosurgical technical skills performance on these different metrics, we were able to establish benchmarks for goal proficiency performance training of neurosurgery residents. This study furthers our understanding of expert neurosurgical performance during the resection of simulated virtual reality tumors and provides neurosurgical trainees with predefined proficiency performance benchmarks designed to maximize the learning of specific surgical technical skills.
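
    To make the metric definitions concrete, here is a hypothetical sketch of how the efficiency and safety metrics reduce to simple aggregates over sampled instrument data; the arrays and sampling are invented, and this is not the NeuroTouch implementation:

    ```python
    import numpy as np

    tip   = np.array([[0, 0, 0], [1, 0, 0], [1, 2, 0], [1, 2, 3.0]])  # tip positions (mm)
    force = np.array([0.2, 0.5, 0.4, 0.1])                            # tool forces (N)

    path_length = np.linalg.norm(np.diff(tip, axis=0), axis=1).sum()  # efficiency metric
    sum_force, max_force = force.sum(), force.max()                   # safety metrics
    print(path_length, sum_force, max_force)                          # 6.0, 1.2, 0.5
    ```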

  7. Development and Experimental Benchmark of Simulations to Predict Used Nuclear Fuel Cladding Temperatures during Drying and Transfer Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greiner, Miles

    Radial hydride formation in high-burnup used fuel cladding has the potential to radically reduce its ductility and suitability for long-term storage and eventual transport. To avoid this formation, the maximum post-reactor temperature must remain sufficiently low to limit the cladding hoop stress, so that hydrogen from the existing circumferential hydrides will not dissolve and become available to re-precipitate into radial hydrides under the slow cooling conditions during drying, transfer, and early dry-cask storage. The objective of this research is to develop and experimentally benchmark computational fluid dynamics simulations of heat transfer in post-pool-storage drying operations, when high-burnup fuel cladding is likely to experience its highest temperature. These benchmarked tools can play a key role in evaluating dry cask storage systems for extended storage of high-burnup fuels and post-storage transportation, including fuel retrievability. The benchmarked tools will be used to aid the design of efficient drying processes, as well as to estimate variations of surface temperatures as a means of inferring helium integrity inside the canister or cask. This work will be conducted effectively because the principal investigator has experience developing these types of simulations and has constructed a test facility that can be used to benchmark them.

  8. PMLB: a large benchmark suite for machine learning evaluation and comparison.

    PubMed

    Olson, Randal S; La Cava, William; Orzechowski, Patryk; Urbanowicz, Ryan J; Moore, Jason H

    2017-01-01

    The selection, development, or comparison of machine learning methods in data mining can be a difficult task based on the target problem and goals of a particular study. Numerous publicly available real-world and simulated benchmark datasets have emerged from different sources, but their organization and adoption as standards have been inconsistent. As such, selecting and curating specific benchmarks remains an unnecessary burden on machine learning practitioners and data scientists. The present study introduces an accessible, curated, and developing public benchmark resource to facilitate identification of the strengths and weaknesses of different machine learning methodologies. We compare meta-features among the current set of benchmark datasets in this resource to characterize the diversity of available data. Finally, we apply a number of established machine learning methods to the entire benchmark suite and analyze how datasets and algorithms cluster in terms of performance. From this study, we find that existing benchmarks lack the diversity to properly benchmark machine learning algorithms, and there are several gaps in benchmarking problems that still need to be considered. This work represents another important step towards understanding the limitations of popular benchmarking suites and developing a resource that connects existing benchmarking standards to more diverse and efficient standards in the future.
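
    A minimal usage sketch, assuming the `pmlb` Python package's documented `fetch_data` interface and scikit-learn are installed:

    ```python
    from pmlb import fetch_data
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Pull one benchmark dataset from the suite and evaluate one method on it.
    X, y = fetch_data('mushroom', return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(clf.score(X_te, y_te))   # held-out accuracy on this benchmark
    ```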

  9. Terrestrial Microgravity Model and Threshold Gravity Simulation using Magnetic Levitation

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    What is the threshold gravity (minimum gravity level) required for the nominal functioning of the human system? What dosage is required? Do human cell lines behave differently in microgravity in response to an external stimulus? The critical need for such a gravity simulator is emphasized by recent experiments on human epithelial cells and lymphocytes on the Space Shuttle clearly showing that cell growth and function are markedly different from those observed terrestrially. Those differences are also dramatic between cells grown in space and those in Rotating Wall Vessels (RWV), or NASA bioreactor often used to simulate microgravity, indicating that although morphological growth patterns (three dimensional growth) can be successfully simulated using RWVs, cell function performance is not reproduced - a critical difference. If cell function is dramatically affected by gravity off-loading, then cell response to stimuli such as radiation, stress, etc. can be very different from terrestrial cell lines. Yet, we have no good gravity simulator for use in study of these phenomena. This represents a profound shortcoming for countermeasures research. We postulate that we can use magnetic levitation of cells and tissue, through the use of strong magnetic fields and field gradients, as a terrestrial microgravity model to study human cells. Specific objectives of the research are: 1. To develop a tried, tested and benchmarked terrestrial microgravity model for cell culture studies; 2. Gravity threshold determination; 3. Dosage (magnitude and duration) of g-level required for nominal functioning of cells; 4. Comparisons of magnetic levitation model to other models such as RWV, hind limb suspension, etc. and 5. Cellular response to reduced gravity levels of Moon and Mars. The paper will discuss experiments and modeling work to date in support of this project.

  10. Evaluation of control strategies using an oxidation ditch benchmark.

    PubMed

    Abusam, A; Keesman, K J; Spanjers, H; van Straten, G; Meinema, K

    2002-01-01

    This paper presents validation and implementation results of a benchmark developed for a specific full-scale oxidation ditch wastewater treatment plant. A benchmark is a standard simulation procedure that can be used as a tool in evaluating various control strategies proposed for wastewater treatment plants. It is based on model and performance criteria development. Testing of this benchmark, by comparing benchmark predictions to real measurements of the electrical energy consumptions and amounts of disposed sludge for a specific oxidation ditch WWTP, has shown that it can (reasonably) be used for evaluating the performance of this WWTP. Subsequently, the validated benchmark was then used in evaluating some basic and advanced control strategies. Some of the interesting results obtained are the following: (i) influent flow splitting ratio, between the first and the fourth aerated compartments of the ditch, has no significant effect on the TN concentrations in the effluent, and (ii) for evaluation of long-term control strategies, future benchmarks need to be able to assess settlers' performance.

  11. Comparison of mapping algorithms used in high-throughput sequencing: application to Ion Torrent data

    PubMed Central

    2014-01-01

    Background: The rapid evolution in high-throughput sequencing (HTS) technologies has opened up new perspectives in several research fields and led to the production of large volumes of sequence data. A fundamental step in HTS data analysis is the mapping of reads onto reference sequences. Choosing a suitable mapper for a given technology and a given application is a subtle task because of the difficulty of evaluating mapping algorithms. Results: In this paper, we present a benchmark procedure to compare mapping algorithms used in HTS using both real and simulated datasets and considering four evaluation criteria: computational resource and time requirements, robustness of mapping, ability to report positions for reads in repetitive regions, and ability to retrieve true genetic variation positions. To measure robustness, we introduced a new definition for a correctly mapped read taking into account not only the expected start position of the read but also the end position and the number of indels and substitutions. We developed CuReSim, a new read simulator, that is able to generate customized benchmark data for any kind of HTS technology by adjusting parameters to the error types. CuReSim and CuReSimEval, a tool to evaluate the mapping quality of the CuReSim simulated reads, are freely available. We applied our benchmark procedure to evaluate 14 mappers in the context of whole genome sequencing of small genomes with Ion Torrent data for which such a comparison has not yet been established. Conclusions: A benchmark procedure to compare HTS data mappers is introduced with a new definition for the mapping correctness as well as tools to generate simulated reads and evaluate mapping quality. The application of this procedure to Ion Torrent data from the whole genome sequencing of small genomes has allowed us to validate our benchmark procedure and demonstrate that it is helpful for selecting a mapper based on the intended application, questions to be addressed, and the technology used. This benchmark procedure can be used to evaluate existing or in-development mappers as well as to optimize parameters of a chosen mapper for any application and any sequencing platform. PMID:24708189
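
    A hypothetical sketch of the extended correctness predicate described above; the field names and tolerance parameter are illustrative, not CuReSimEval's actual interface:

    ```python
    def correctly_mapped(aln, truth, pos_tol=0):
        """A read counts as correct only if start AND end positions match the
        truth (within a tolerance) and the edit operations agree."""
        return (abs(aln["start"] - truth["start"]) <= pos_tol
                and abs(aln["end"] - truth["end"]) <= pos_tol
                and aln["indels"] == truth["indels"]
                and aln["substitutions"] == truth["substitutions"])

    truth = {"start": 100, "end": 199, "indels": 1, "substitutions": 2}
    aln   = {"start": 100, "end": 199, "indels": 1, "substitutions": 2}
    print(correctly_mapped(aln, truth))   # True
    ```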

  12. Benchmarking of measurement and simulation of transverse rms-emittance growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Dong-O

    2008-01-01

    Transverse emittance growth along the Alvarez DTL section is a major concern with respect to the preservation of beam quality of high-current beams at the GSI UNILAC. In order to define measures to reduce this growth, appropriate tools to simulate the beam dynamics are indispensable. This paper concerns the benchmarking of three beam dynamics simulation codes, i.e., DYNAMION, PARMILA, and PARTRAN, against systematic measurements of beam emittances for different machine settings. Experimental set-ups, data reduction, the preparation of the simulations, and the evaluation of the simulations are described. It was found that the measured 100%-rms-emittances behind the DTL exceed the simulated values. Comparing measured 90%-rms-emittances to the simulated 95%-rms-emittances gives fair to good agreement instead. The sum of horizontal and vertical emittances is even described well by the codes as long as experimental 90%-rms-emittances are compared to simulated 95%-rms-emittances. Finally, the successful reduction of transverse emittance growth by systematic beam matching is reported.
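
    For reference, the transverse rms emittance compared in such benchmarks has the standard statistical definition, with x' = dx/ds and angle brackets denoting averages over the beam distribution:

    ```latex
    \varepsilon_{\mathrm{rms}} =
      \sqrt{\langle x^{2}\rangle\,\langle x'^{2}\rangle - \langle x\,x'\rangle^{2}}
    ```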

  13. Linear-regression convolutional neural network for fully automated coronary lumen segmentation in intravascular optical coherence tomography.

    PubMed

    Yong, Yan Ling; Tan, Li Kuo; McLaughlin, Robert A; Chee, Kok Han; Liew, Yih Miin

    2017-12-01

    Intravascular optical coherence tomography (OCT) is an optical imaging modality commonly used in the assessment of coronary artery diseases during percutaneous coronary intervention. Manual segmentation to assess luminal stenosis from OCT pullback scans is challenging and time consuming. We propose a linear-regression convolutional neural network to automatically perform vessel lumen segmentation, parameterized in terms of radial distances from the catheter centroid in polar space. Benchmarked against gold-standard manual segmentation, our proposed algorithm achieves average locational accuracy of the vessel wall of 22 microns, and 0.985 and 0.970 in Dice coefficient and Jaccard similarity index, respectively. The average absolute error of luminal area estimation is 1.38%. The processing rate is 40.6 ms per image, suggesting the potential to be incorporated into a clinical workflow and to provide quantitative assessment of vessel lumen in an intraoperative time frame.

  14. Engine dynamic analysis with general nonlinear finite element codes. Part 2: Bearing element implementation, overall numerical characteristics and benchmarking

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Adams, M.; Fertis, J.; Zeid, I.; Lam, P.

    1982-01-01

    Finite element codes are used to model the rotor-bearing-stator structures common in the turbine industry. Strategies are developed that enable engine dynamic simulation using available finite element codes. The bearing elements developed are benchmarked by incorporation into a general-purpose code (ADINA), and the numerical characteristics of finite-element rotor-bearing-stator simulations are evaluated through the use of various types of explicit/implicit numerical integration operators. The overall numerical efficiency of the procedure is also improved.

  15. Lattice Boltzmann Simulation of Blood Flow in Blood Vessels with the Rolling Massage

    NASA Astrophysics Data System (ADS)

    Yi, Hou-Hui; Xu, Shi-Xiong; Qian, Yue-Hong; Fang, Hai-Ping

    2005-12-01

    The rolling massage manipulation is a classic Chinese massage, which is expected to improve the circulation by pushing, pulling, and kneading the muscle. A model for the rolling massage manipulation is proposed, and the lattice Boltzmann method is applied to study the blood flow in the blood vessels. The simulation results show that the blood flux is considerably modified by the rolling massage, and that the magnitude of the effect depends on the rolling frequency, the rolling depth, and the diameter of the vessel. The smaller the diameter of the blood vessel, the larger the enhancement of the blood flux by the rolling massage. The model, together with the simulation results, is expected to be helpful for understanding the mechanism and further development of rolling massage techniques.

  16. Preliminary Results for the OECD/NEA Time Dependent Benchmark using Rattlesnake, Rattlesnake-IQS and TDKENO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark D.; Mausolff, Zander; Weems, Zach

    2016-08-01

    One goal of the MAMMOTH M&S project is to validate the analysis capabilities within MAMMOTH. Historical data has shown limited value for validation of full three-dimensional (3D) multi-physics methods. Initial analysis considered the TREAT startup minimum critical core and one of the startup transient tests. At present, validation is focusing on measurements taken during the M8CAL test calibration series. These exercises will be valuable in a preliminary assessment of the ability of MAMMOTH to perform coupled multi-physics calculations; calculations performed to date are being used to validate the neutron transport solver Rattlesnake and the fuels performance code BISON. Other validation projects outside of TREAT are available for single-physics benchmarking. Because the transient solution capability of Rattlesnake is one of the key attributes that makes it unique for TREAT transient simulations, validation of the transient solution of Rattlesnake using other time-dependent kinetics benchmarks has considerable value. The Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) has recently developed a computational benchmark for transient simulations. This benchmark considers both two-dimensional (2D) and 3D configurations for a total of 26 different transients. All are negative reactivity insertions, typically returning to the critical state after some time.

  17. Targeting the affordability of cigarettes: a new benchmark for taxation policy in low-income and middle-income countries.

    PubMed

    Blecher, Evan

    2010-08-01

    To investigate the appropriateness of tax incidence (the percentage of the retail price occupied by taxes) benchmarking in low-income and middle-income countries (LMICs) with rapidly growing economies, and to explore the viability of an alternative tax policy rule based on the affordability of cigarettes. The paper outlines criticisms of tax incidence benchmarking, particularly in the context of LMICs. It then considers an affordability-based benchmark using relative income price (RIP) as a measure of affordability. The RIP measures the percentage of annual per capita GDP required to purchase 100 packs of cigarettes. Using South Africa as a case study of an LMIC, future consumption is simulated under both tax incidence benchmarks and affordability benchmarks. I show that a tax incidence benchmark is not an optimal policy tool in South Africa and that an affordability benchmark could be a more effective means of reducing tobacco consumption in the future. Although a tax incidence benchmark was successful in increasing prices and reducing tobacco consumption in South Africa in the past, this approach has drawbacks, particularly in the context of a rapidly growing LMIC economy. An affordability benchmark represents an appropriate alternative that would be more effective in reducing future cigarette consumption.
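
    As a concrete illustration of the two policy rules, here is a short sketch contrasting tax incidence with the RIP affordability measure defined above. All numbers are illustrative placeholders, not South African data.

    ```python
    # Relative income price (RIP): share of annual per-capita GDP needed to
    # buy 100 packs of cigarettes. Illustrative numbers only, not real data.
    def rip(price_per_pack: float, gdp_per_capita: float) -> float:
        return 100.0 * price_per_pack / gdp_per_capita * 100.0   # percent

    # Tax incidence, by contrast, only tracks the tax share of retail price.
    def tax_incidence(tax_per_pack: float, price_per_pack: float) -> float:
        return 100.0 * tax_per_pack / price_per_pack             # percent

    price, tax, gdp = 2.50, 1.30, 6000.0   # hypothetical LMIC values (USD)
    print(f"tax incidence = {tax_incidence(tax, price):.1f}%")
    print(f"RIP           = {rip(price, gdp):.2f}% of per-capita GDP")

    # If GDP grows faster than price, RIP falls (cigarettes become MORE
    # affordable) even while tax incidence stays at its benchmark level.
    gdp_next, price_next = gdp * 1.08, price * 1.04
    print(f"RIP next year = {rip(price_next, gdp_next):.2f}%")
    ```

    The last two lines capture the paper's core criticism: in a fast-growing economy, holding tax incidence constant can still let affordability rise.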

  18. An Enriched Shell Element for Delamination Simulation in Composite Laminates

    NASA Technical Reports Server (NTRS)

    McElroy, Mark

    2015-01-01

    A formulation is presented for an enriched shell finite element capable of delamination simulation in composite laminates. The element uses an adaptive splitting approach for damage characterization that allows for straightforward low-fidelity model creation and a numerically efficient solution. The Floating Node Method is used in conjunction with the Virtual Crack Closure Technique to predict delamination growth and represent it discretely at an arbitrary ply interface. The enriched element is verified for Mode I delamination simulation using numerical benchmark data. After important mesh configuration guidelines for the vicinity of the delamination front were determined, good correlation was found between the enriched shell element model results and the benchmark data set.
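
    For reference, the Virtual Crack Closure Technique mentioned above computes the Mode I energy release rate from the crack-tip nodal force and the opening displacement one element behind the tip. A minimal sketch of the generic VCCT formula follows; it is not the paper's enriched-element implementation, and all values are hypothetical.

    ```python
    # Mode I energy release rate by VCCT: G_I = (F_z * dw) / (2 * da * b),
    # where F_z is the crack-tip nodal force normal to the crack plane,
    # dw the relative opening displacement one element behind the tip,
    # da the element length at the front, and b the element width.
    def vcct_mode1(F_z: float, dw: float, da: float, b: float) -> float:
        return (F_z * dw) / (2.0 * da * b)

    # Hypothetical values in N and mm -> G in N/mm (equivalently kJ/m^2)
    G_I = vcct_mode1(F_z=12.0, dw=0.004, da=0.5, b=1.0)
    G_Ic = 0.24   # assumed critical energy release rate for the interface
    print(f"G_I = {G_I:.4f} N/mm -> growth predicted: {G_I >= G_Ic}")
    ```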

  19. First benchmark of the Unstructured Grid Adaptation Working Group

    NASA Technical Reports Server (NTRS)

    Ibanez, Daniel; Barral, Nicolas; Krakos, Joshua; Loseille, Adrien; Michal, Todd; Park, Mike

    2017-01-01

    Unstructured grid adaptation is a technology that holds the potential to improve the automation and accuracy of computational fluid dynamics and other computational disciplines. Difficulty producing the highly anisotropic elements necessary for simulation on complex curved geometries that satisfy a resolution request has limited this technology's widespread adoption. The Unstructured Grid Adaptation Working Group is an open gathering of researchers working on adapting simplicial meshes to conform to a metric field. Current members span a wide range of institutions including academia, industry, and national laboratories. The purpose of this group is to create a common basis for understanding and improving mesh adaptation. We present our first major contribution: a common set of benchmark cases, including input meshes and analytic metric specifications, that are publicly available to be used for evaluating any mesh adaptation code. We also present the results of several existing codes on these benchmark cases, to illustrate their utility in identifying key challenges common to all codes and important differences between available codes. Future directions are defined to expand this benchmark to mature the technology necessary to impact practical simulation workflows.
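
    To make "conforming to a metric field" concrete, the sketch below evaluates an edge's length under an anisotropic analytic metric, L_M(e) = sqrt(e^T M e); adaptation drives every edge toward unit metric length. The metric and the midpoint evaluation rule are illustrative assumptions, not the working group's benchmark specification.

    ```python
    import numpy as np

    # An anisotropic analytic metric field M(x): unit edge length in M means
    # the edge is "correctly sized" for the requested resolution.
    def metric(x: np.ndarray) -> np.ndarray:
        hx, hy = 0.1, 0.01 + 0.1 * abs(x[1])    # fine resolution near y = 0
        return np.diag([1.0 / hx**2, 1.0 / hy**2])

    def metric_edge_length(p: np.ndarray, q: np.ndarray) -> float:
        e = q - p
        M = metric(0.5 * (p + q))               # midpoint metric (one common rule)
        return float(np.sqrt(e @ M @ e))

    p, q = np.array([0.0, 0.0]), np.array([0.05, 0.02])
    L = metric_edge_length(p, q)
    print(f"L_M = {L:.2f}")  # adaptation targets L_M ~ 1: split if >> 1, collapse if << 1
    ```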

  20. Comparing Hospital Processes and Outcomes in California Medicare Beneficiaries: Simulation Prompts Reconsideration.

    PubMed

    Escobar, Gabriel J; Baker, Jennifer M; Turk, Benjamin J; Draper, David; Liu, Vincent; Kipnis, Patricia

    2017-01-01

    This article is not a traditional research report. It describes how conducting a specific set of benchmarking analyses led us to broader reflections on hospital benchmarking. We reexamined an issue that has received far less attention from researchers than in the past: how variations in the hospital admission threshold might affect hospital rankings. Considering this threshold made us reconsider what benchmarking is and what future benchmarking studies might be like. Although we recognize that some of our assertions are speculative, they are based on our reading of the literature and previous and ongoing data analyses being conducted in our research unit. We describe the benchmarking analyses that led to these reflections. The Centers for Medicare and Medicaid Services' Hospital Compare Web site includes data on fee-for-service Medicare beneficiaries but does not control for severity of illness, which requires physiologic data now available in most electronic medical records. To address this limitation, we compared hospital processes and outcomes among Kaiser Permanente Northern California's (KPNC) Medicare Advantage beneficiaries and non-KPNC California Medicare beneficiaries between 2009 and 2010. We assigned a simulated severity of illness measure to each record and explored the effect of having the additional information on outcomes. We found that if the admission severity of illness in non-KPNC hospitals increased, KPNC hospitals' mortality performance would appear worse; conversely, if admission severity at non-KPNC hospitals decreased, KPNC hospitals' performance would appear better. Future hospital benchmarking should consider the impact of variation in admission thresholds.

  1. INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom; Javier Ortensi; Sonat Sen

    2013-09-01

    The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of problems to simulate accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes. A final OECD/NEA comparison report will compare the Phase I and III results of all other international participants in 2014, while the remaining Phase II transient case results will be reported in 2015.

  2. Benchmark Evaluation of Start-Up and Zero-Power Measurements at the High-Temperature Engineering Test Reactor

    DOE PAGES

    Bess, John D.; Fujimoto, Nozomu

    2014-10-09

    Benchmark models were developed to evaluate six cold-critical and two warm-critical, zero-power measurements of the HTTR. Additional measurements of a fully-loaded subcritical configuration, core excess reactivity, shutdown margins, six isothermal temperature coefficients, and axial reaction-rate distributions were also evaluated as acceptable benchmark experiments. Insufficient information is publicly available to develop finely-detailed models of the HTTR as much of the design information is still proprietary. However, the uncertainties in the benchmark models are judged to be of sufficient magnitude to encompass any biases and bias uncertainties incurred through the simplification process used to develop the benchmark models. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the impurity content of the various graphite blocks that comprise the HTTR. Monte Carlo calculations of keff are between approximately 0.9% and 2.7% greater than the benchmark values. Reevaluation of the HTTR models as additional information becomes available could improve the quality of this benchmark and possibly reduce the computational biases. High-quality characterization of graphite impurities would significantly improve the quality of the HTTR benchmark assessment. Simulations of the other reactor physics measurements are in good agreement with the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.

  3. Ultracool dwarf benchmarks with Gaia primaries

    NASA Astrophysics Data System (ADS)

    Marocco, F.; Pinfield, D. J.; Cook, N. J.; Zapatero Osorio, M. R.; Montes, D.; Caballero, J. A.; Gálvez-Ortiz, M. C.; Gromadzki, M.; Jones, H. R. A.; Kurtev, R.; Smart, R. L.; Zhang, Z.; Cabrera Lavers, A. L.; García Álvarez, D.; Qi, Z. X.; Rickard, M. J.; Dover, L.

    2017-10-01

    We explore the potential of Gaia for the field of benchmark ultracool/brown dwarf companions, and present the results of an initial search for metal-rich/metal-poor systems. A simulated population of resolved ultracool dwarf companions to Gaia primary stars is generated and assessed. Of the order of ~24 000 companions should be identifiable outside of the Galactic plane (|b| > 10 deg) with large-scale ground- and space-based surveys including late M, L, T and Y types. Our simulated companion parameter space covers 0.02 ≤ M/M⊙ ≤ 0.1, 0.1 ≤ age/Gyr ≤ 14 and -2.5 ≤ [Fe/H] ≤ 0.5, with systems required to have a false alarm probability <10^-4, based on projected separation and expected constraints on common distance, common proper motion and/or common radial velocity. Within this bulk population, we identify smaller target subsets of rarer systems whose collective properties still span the full parameter space of the population, as well as systems containing primary stars that are good age calibrators. Our simulation analysis leads to a series of recommendations for candidate selection and observational follow-up that could identify ~500 diverse Gaia benchmarks. As a test of the veracity of our methodology and simulations, our initial search uses UKIRT Infrared Deep Sky Survey and Sloan Digital Sky Survey to select secondaries, with the parameters of primaries taken from Tycho-2, Radial Velocity Experiment, Large sky Area Multi-Object fibre Spectroscopic Telescope and Tycho-Gaia Astrometric Solution. We identify and follow up 13 new benchmarks. These include M8-L2 companions, with metallicity constraints ranging in quality, but robust in the range -0.39 ≤ [Fe/H] ≤ +0.36, and with projected physical separation in the range 0.6 < s/kau < 76. Going forward, Gaia offers a very high yield of benchmark systems, from which diverse subsamples may be able to calibrate a range of foundational ultracool/sub-stellar theory and observation.

  4. Numerical simulation and analysis of accurate blood oxygenation measurement by using optical resolution photoacoustic microscopy

    NASA Astrophysics Data System (ADS)

    Yu, Tianhao; Li, Qian; Li, Lin; Zhou, Chuanqing

    2016-10-01

    Accuracy of the photoacoustic signal is crucial for measuring oxygen saturation in functional photoacoustic imaging, and it is influenced by factors such as defocus of the laser beam, the curved shape of large vessels, and the nonlinear saturation effect of optical absorption in biological tissues. We apply a Monte Carlo model to simulate energy deposition in tissues and obtain photoacoustic signals reaching a simulated focused surface detector, in order to investigate the corresponding influence of these factors. We also apply compensation to photoacoustic imaging of in vivo cat cerebral cortex blood vessels, in which signals from different lateral positions of vessels are corrected based on the simulation results. This correction process can improve the smoothness and accuracy of oxygen saturation results.
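
    The oxygen-saturation estimate that such compensation protects is typically obtained by linear spectral unmixing of the absorption coefficients recovered at two (or more) wavelengths. A minimal sketch follows; the extinction coefficients are placeholders, not tabulated hemoglobin values.

    ```python
    import numpy as np

    # Two-wavelength linear unmixing for blood oxygen saturation (sO2):
    #   mu_a(lambda_k) = eps_Hb(lambda_k)*C_Hb + eps_HbO2(lambda_k)*C_HbO2
    # The extinction coefficients below are placeholders, NOT tabulated values.
    eps = np.array([[0.29, 0.46],    # lambda_1: [eps_Hb, eps_HbO2]
                    [0.78, 0.39]])   # lambda_2: [eps_Hb, eps_HbO2]
    mu_a = np.array([0.40, 0.55])    # absorption recovered from PA signals

    C_Hb, C_HbO2 = np.linalg.solve(eps, mu_a)
    sO2 = C_HbO2 / (C_Hb + C_HbO2)
    print(f"sO2 = {sO2:.1%}")
    # Any bias in mu_a (defocus, vessel curvature, absorption saturation)
    # propagates directly into sO2, which is why the compensation matters.
    ```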

  5. Direct numerical simulation of cellular-scale blood flow in microvascular networks

    NASA Astrophysics Data System (ADS)

    Balogh, Peter; Bagchi, Prosenjit

    2017-11-01

    A direct numerical simulation method is developed to study cellular-scale blood flow in physiologically realistic microvascular networks that are constructed in silico following published in vivo images and data, and are comprised of bifurcating, merging, and winding vessels. The model resolves large deformation of individual red blood cells (RBC) flowing in such complex networks. The vascular walls and deformable interfaces of the RBCs are modeled using the immersed-boundary methods. Time-averaged hemodynamic quantities obtained from the simulations agree quite well with published in vivo data. Our simulations reveal that in several vessels the flow rates and pressure drops could be negatively correlated. The flow resistance and hematocrit are also found to be negatively correlated in some vessels. These observations suggest a deviation from the classical Poiseuille's law in such vessels. The cells are observed to frequently jam at vascular bifurcations resulting in reductions in hematocrit and flow rate in the daughter and mother vessels. We find that RBC jamming results in several orders of magnitude increase in hemodynamic resistance, and thus provides an additional mechanism of increased in vivo blood viscosity as compared to that determined in vitro. Funded by NSF CBET 1604308.
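
    The classical Poiseuille relationship from which the reported deviations are measured is easily computed; the sketch below, with a hypothetical capillary geometry and viscosity, gives the baseline resistance and flow rate for a single vessel segment.

    ```python
    import math

    def poiseuille_resistance(mu_Pa_s: float, L_m: float, d_m: float) -> float:
        """Hydraulic resistance R = 128*mu*L / (pi*d^4) of a cylindrical vessel."""
        return 128.0 * mu_Pa_s * L_m / (math.pi * d_m**4)

    # Hypothetical 10-um-diameter, 100-um-long capillary, plasma-like viscosity
    mu, L, d = 1.2e-3, 100e-6, 10e-6
    R = poiseuille_resistance(mu, L, d)
    dP = 1000.0                  # assumed pressure drop, Pa (~7.5 mmHg)
    Q = dP / R                   # Poiseuille flow rate, m^3/s
    print(f"R = {R:.3e} Pa*s/m^3, Q = {Q:.3e} m^3/s")
    # The simulations in this record report vessels where flow and pressure
    # drop are negatively correlated (Q deviates from this dP/R relation),
    # and RBC jamming can raise the effective R by orders of magnitude.
    ```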

  6. Bio-Adaption between Magnesium Alloy Stent and the Blood Vessel: A Review.

    PubMed

    Ma, Jun; Zhao, Nan; Betts, Lexxus; Zhu, Donghui

    2016-09-01

    Biodegradable magnesium (Mg) alloy stents are the most promising next generation of bio-absorbable stents. In this article, we summarize the progress of in vitro studies, animal testing, and clinical trials of biodegradable Mg alloy stents over the past decades. These exciting findings led us to propose the importance of the concept of "bio-adaption" between the Mg alloy stent and the local tissue microenvironment after implantation. The healing responses of a stented blood vessel can be generally described in three overlapping phases: inflammation, granulation, and remodeling. The ideal bio-adaption of the Mg alloy stent, once implanted into the blood vessel, needs to be a reasonable function of time and space/dimension. First, a very slow loss of mechanical support is expected in the initial four months in order to provide sufficient mechanical support to the injured vessels. Although it is still arguable whether full mechanical support in stented lesions is mandatory during the first four months after implantation, it would certainly be a safety design parameter and a benchmark for regulatory evaluations, based on the fact that there are insufficient human in vivo data available, especially on vessel wall mechanical properties during the healing/remodeling phase. Second, as the Mg alloy stent degrades, the void space will be filled by the regenerated blood vessel tissues. The degradation of the Mg alloy stent should be 100% complete with no residues, and the degradation products (e.g., ions and hydrogen) should be helpful for the tissue reconstruction of the blood vessel. Toward this target, some future research perspectives are also discussed.

  7. Bio-Adaption between Magnesium Alloy Stent and the Blood Vessel: A Review

    PubMed Central

    Ma, Jun; Zhao, Nan; Betts, Lexxus; Zhu, Donghui

    2016-01-01

    Biodegradable magnesium (Mg) alloy stents are the most promising next generation of bio-absorbable stents. In this article, we summarize the progress of in vitro studies, animal testing, and clinical trials of biodegradable Mg alloy stents over the past decades. These exciting findings led us to propose the importance of the concept of “bio-adaption” between the Mg alloy stent and the local tissue microenvironment after implantation. The healing responses of a stented blood vessel can be generally described in three overlapping phases: inflammation, granulation, and remodeling. The ideal bio-adaption of the Mg alloy stent, once implanted into the blood vessel, needs to be a reasonable function of time and space/dimension. First, a very slow loss of mechanical support is expected in the initial four months in order to provide sufficient mechanical support to the injured vessels. Although it is still arguable whether full mechanical support in stented lesions is mandatory during the first four months after implantation, it would certainly be a safety design parameter and a benchmark for regulatory evaluations, based on the fact that there are insufficient human in vivo data available, especially on vessel wall mechanical properties during the healing/remodeling phase. Second, as the Mg alloy stent degrades, the void space will be filled by the regenerated blood vessel tissues. The degradation of the Mg alloy stent should be 100% complete with no residues, and the degradation products (e.g., ions and hydrogen) should be helpful for the tissue reconstruction of the blood vessel. Toward this target, some future research perspectives are also discussed. PMID:27698548

  8. Simulation of Coast Guard Vessel Traffic Service Operations by Model and Experiment

    DOT National Transportation Integrated Search

    1980-09-01

    A technique for computer simulation of operations of U.S. Coast Guard Vessel Traffic Services is described and verified with data obtained in four field studies. Uses of the technique are discussed and illustrated. A field experiment is described in ...

  9. Experimental benchmark of kinetic simulations of capacitively coupled plasmas in molecular gases

    NASA Astrophysics Data System (ADS)

    Donkó, Z.; Derzsi, A.; Korolov, I.; Hartmann, P.; Brandt, S.; Schulze, J.; Berger, B.; Koepke, M.; Bruneau, B.; Johnson, E.; Lafleur, T.; Booth, J.-P.; Gibson, A. R.; O'Connell, D.; Gans, T.

    2018-01-01

    We discuss the origin of uncertainties in the results of numerical simulations of low-temperature plasma sources, focusing on capacitively coupled plasmas. These sources can be operated in various gases/gas mixtures, over a wide domain of excitation frequency, voltage, and gas pressure. At low pressures, the non-equilibrium character of the charged particle transport prevails and particle-based simulations become the primary tools for their numerical description. The particle-in-cell method, complemented with a Monte Carlo type description of collision processes, is a well-established approach for this purpose. Codes based on this technique have been developed by several authors/groups, and have been benchmarked with each other in some cases. Such benchmarking demonstrates the correctness of the codes, but the underlying physical model remains unvalidated. This is a key point, as this model should ideally account for all important plasma chemical reactions as well as for the plasma-surface interaction via including specific surface reaction coefficients (electron yields, sticking coefficients, etc). In order to test the models rigorously, comparison with experimental ‘benchmark data’ is necessary. Examples will be given regarding the studies of electron power absorption modes in O2 and CF4-Ar discharges, as well as on the effect of modifications of the parameters of certain elementary processes on the computed discharge characteristics in O2 capacitively coupled plasmas.
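
    As an illustration of the Monte Carlo collision step that such particle-in-cell codes append to the field solve, the sketch below evaluates per-time-step collision probabilities P = 1 - exp(-n*sigma(E)*v*dt). The cross section and plasma parameters are made up for demonstration, not real gas data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def sigma_total(E_eV: np.ndarray) -> np.ndarray:
        """Toy total electron-neutral cross section (m^2); NOT real gas data."""
        return 1e-20 * (1.0 + E_eV / 10.0)

    n_gas = 3.3e21      # neutral density (m^-3) at a few Pa (assumed)
    dt = 1e-10          # simulation time step (s)
    E = rng.uniform(1.0, 20.0, size=100_000)         # electron energies (eV)
    v = np.sqrt(2 * E * 1.602e-19 / 9.109e-31)       # electron speeds (m/s)

    # Per-step collision probability for each particle
    P = 1.0 - np.exp(-n_gas * sigma_total(E) * v * dt)
    collides = rng.random(E.size) < P
    print(f"colliding fraction this step: {collides.mean():.4f}")
    # A full PIC/MCC code would then split collided particles among elastic,
    # excitation, and ionization channels using the partial cross sections --
    # exactly the coefficients whose uncertainty this record discusses.
    ```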

  10. SU-E-I-25: Quantification of Coronary Artery Cross-Sectional Area in CT Angiography Using Integrated Density: A Simulation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, T; Ding, H; Lipinski, J

    2015-06-15

    Purpose: To develop a physics-based model for accurate quantification of the cross-sectional area (CSA) of coronary arteries in CT angiography by measuring the integrated density to account for the partial volume effect. Methods: In this technique the integrated density of the object as compared with its local background is measured to account for the partial volume effect. Normal vessels were simulated as circles with diameters in the range of 0.1–3 mm. Diseased vessels were simulated as 2, 3, and 4 mm diameter vessels with 10–90% area stenosis, created by inserting circular plaques. A simplified two-material model was used, with the lumen as 8 mg/ml iodine and the background as lipid. The contrast-to-noise ratio between lumen and background was approximately 26. Linear fits to the known CSA were calculated. The precision and accuracy of the measurement were quantified using the root-mean-square fit deviations (RMSD) and errors to the known CSA (RMSE). Results were compared with manual segmentation of the vessel lumen. To assess the impact of random variations, coefficients of variation (CV) from 10 simulations for each vessel were computed to determine reliability. Measurements with CVs less than 10% were considered reliable. Results: For normal vessels, the precision and accuracy of the integrated density technique were 0.12 mm² and 0.28 mm², respectively. The corresponding results for manual segmentation were 0.27 mm² and 0.43 mm². For diseased vessels, the precision and accuracy of the integrated density technique were 0.14 mm² and 0.19 mm². Corresponding results for manual segmentation were 0.42 mm² and 0.71 mm². Reliable CSAs were obtained for normal vessels with diameters larger than 1 mm and for diseased vessels with area as low as 1.26 mm². Conclusion: The CSA based on integrated density showed improved precision and accuracy as compared with manual segmentation in simulation. These results indicate the potential of using integrated density to quantify CSA of coronary arteries in CT angiography.
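
    A minimal sketch of the integrated-density idea on a synthetic blurred vessel (assumed intensities and pixel size; not the authors' implementation): summing the background-subtracted intensity over an ROI and dividing by the lumen-background contrast recovers sub-pixel CSA, because partial-volume pixels contribute fractionally.

    ```python
    import numpy as np

    def csa_integrated_density(img, roi, I_lumen, I_bg, pixel_area):
        """CSA = sum(I - I_bg)*pixel_area / (I_lumen - I_bg) over an ROI that
        encloses the vessel; partial-volume pixels contribute fractionally."""
        return (img[roi] - I_bg).sum() * pixel_area / (I_lumen - I_bg)

    px = 0.4                                  # coarse pixel size (mm), on purpose
    y, x = np.mgrid[-16:16, -16:16] * px
    r = np.hypot(x, y)
    I_lumen, I_bg = 400.0, 40.0               # assumed HU-like intensities
    img = np.where(r <= 0.5, I_lumen, I_bg)   # 1.0 mm diameter vessel

    # crude partial-volume surrogate: separable 3x3 mean filter (sum-preserving)
    for ax in (0, 1):
        img = (img + np.roll(img, 1, axis=ax) + np.roll(img, -1, axis=ax)) / 3.0

    roi = r <= 2.0                            # ROI comfortably enclosing the vessel
    csa = csa_integrated_density(img, roi, I_lumen, I_bg, px * px)
    print(f"estimated CSA = {csa:.3f} mm^2 (true = {np.pi * 0.25:.3f} mm^2)")
    ```

    Direct pixel counting at this resolution would be far coarser; the integrated density survives the blur because the filter preserves the total background-subtracted signal.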

  11. Issues in benchmarking human reliability analysis methods: a literature review.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lois, Erasmia; Forester, John Alan; Tran, Tuan Q.

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  12. Issues in Benchmarking Human Reliability Analysis Methods: A Literature Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; Stacey M. L. Hendrickson; John A. Forester

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  13. The philosophy of benchmark testing a standards-based picture archiving and communications system.

    PubMed

    Richardson, N E; Thomas, J A; Lyche, D K; Romlein, J; Norton, G S; Dolecek, Q E

    1999-05-01

    The Department of Defense issued its requirements for a Digital Imaging Network-Picture Archiving and Communications System (DIN-PACS) in a Request for Proposals (RFP) to industry in January 1997, with subsequent contracts being awarded in November 1997 to the Agfa Division of Bayer and IBM Global Government Industry. The Government's technical evaluation process consisted of evaluating a written technical proposal as well as conducting a benchmark test of each proposed system at the vendor's test facility. The purpose of benchmark testing was to evaluate the performance of the fully integrated system in a simulated operational environment. The benchmark test procedures and test equipment were developed through a joint effort between the Government, academic institutions, and private consultants. Herein the authors discuss the resources required and the methods used to benchmark test a standards-based PACS.

  14. Human Vision-Motivated Algorithm Allows Consistent Retinal Vessel Classification Based on Local Color Contrast for Advancing General Diagnostic Exams.

    PubMed

    Ivanov, Iliya V; Leitritz, Martin A; Norrenberg, Lars A; Völker, Michael; Dynowski, Marek; Ueffing, Marius; Dietter, Johannes

    2016-02-01

    Abnormalities of blood vessel anatomy, morphology, and ratio can serve as important diagnostic markers for retinal diseases such as AMD or diabetic retinopathy. Large cohort studies demand automated and quantitative image analysis of vascular abnormalities. Therefore, we developed an analytical software tool to enable automated standardized classification of blood vessels supporting clinical reading. A dataset of 61 images was collected from a total of 33 women and 8 men with a median age of 38 years. The pupils were not dilated, and images were taken after dark adaptation. In contrast to current methods in which classification is based on vessel profile intensity averages, and similar to human vision, local color contrast was chosen as a discriminator to allow artery-vein discrimination and arterial-venous ratio (AVR) calculation without vessel tracking. With 83% ± 1% (standard error of the mean) on our dataset, the best classification was achieved using weighted lightness information from a combination of the red, green, and blue channels. Tested on an independent dataset, our method reached 89% correct classification, which, when benchmarked against conventional ophthalmologic classification, shows significantly improved classification scores. Our study demonstrates that vessel classification based on local color contrast can cope with inter- or intraimage lightness variability and allows consistent AVR calculation. We offer an open-source implementation of this method upon request, which can be integrated into existing tool sets and applied to general diagnostic exams.

  15. Intercomparison of Monte Carlo radiation transport codes to model TEPC response in low-energy neutron and gamma-ray fields.

    PubMed

    Ali, F; Waker, A J; Waller, E J

    2014-10-01

    Tissue-equivalent proportional counters (TEPC) can potentially be used as a portable and personal dosemeter in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities.
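
    For reference, the frequency- and dose-mean lineal energies compared above are the first two moments of the measured lineal-energy spectrum. A short sketch with a hypothetical spectrum:

    ```python
    import numpy as np

    def lineal_energy_means(y, f):
        """Frequency mean y_F and dose mean y_D of a lineal energy spectrum.
        y: bin centers (keV/um); f: frequency distribution f(y), need not be
        normalized. Standard microdosimetric moments:
          y_F = sum(y*f)/sum(f),   y_D = sum(y^2*f)/sum(y*f)."""
        y = np.asarray(y, dtype=float)
        f = np.asarray(f, dtype=float)
        yF = (y * f).sum() / f.sum()
        yD = (y**2 * f).sum() / (y * f).sum()
        return yF, yD

    # Hypothetical measured spectrum (illustrative bins and counts)
    y_bins = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
    counts = np.array([120, 300, 240, 90, 30, 5])
    yF, yD = lineal_energy_means(y_bins, counts)
    print(f"y_F = {yF:.2f} keV/um, y_D = {yD:.2f} keV/um")
    ```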

  16. Development and testing of the VITAMIN-B7/BUGLE-B7 coupled neutron-gamma multigroup cross-section libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Risner, J.M.; Wiarda, D.; Miller, T.M.

    2011-07-01

    The U.S. Nuclear Regulatory Commission's Regulatory Guide 1.190 states that calculational methods used to estimate reactor pressure vessel (RPV) fluence should use the latest version of the evaluated nuclear data file (ENDF). The VITAMIN-B6 fine-group library and BUGLE-96 broad-group library, which are widely used for RPV fluence calculations, were generated using ENDF/B-VI.3 data, which was the most current data when Regulatory Guide 1.190 was issued. We have developed new fine-group (VITAMIN-B7) and broad-group (BUGLE-B7) libraries based on ENDF/B-VII.0. These new libraries, which were processed using the AMPX code system, maintain the same group structures as the VITAMIN-B6 and BUGLE-96 libraries. Verification and validation of the new libraries were accomplished using diagnostic checks in AMPX, 'unit tests' for each element in VITAMIN-B7, and a diverse set of benchmark experiments including critical evaluations for fast and thermal systems, a set of experimental benchmarks that are used for SCALE regression tests, and three RPV fluence benchmarks. The benchmark evaluation results demonstrate that VITAMIN-B7 and BUGLE-B7 are appropriate for use in RPV fluence calculations and meet the calculational uncertainty criterion in Regulatory Guide 1.190. (authors)

  17. Development and Testing of the VITAMIN-B7/BUGLE-B7 Coupled Neutron-Gamma Multigroup Cross-Section Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Risner, Joel M; Wiarda, Dorothea; Miller, Thomas Martin

    2011-01-01

    The U.S. Nuclear Regulatory Commission's Regulatory Guide 1.190 states that calculational methods used to estimate reactor pressure vessel (RPV) fluence should use the latest version of the Evaluated Nuclear Data File (ENDF). The VITAMIN-B6 fine-group library and BUGLE-96 broad-group library, which are widely used for RPV fluence calculations, were generated using ENDF/B-VI data, which was the most current data when Regulatory Guide 1.190 was issued. We have developed new fine-group (VITAMIN-B7) and broad-group (BUGLE-B7) libraries based on ENDF/B-VII. These new libraries, which were processed using the AMPX code system, maintain the same group structures as the VITAMIN-B6 and BUGLE-96 libraries. Verification and validation of the new libraries were accomplished using diagnostic checks in AMPX, unit tests for each element in VITAMIN-B7, and a diverse set of benchmark experiments including critical evaluations for fast and thermal systems, a set of experimental benchmarks that are used for SCALE regression tests, and three RPV fluence benchmarks. The benchmark evaluation results demonstrate that VITAMIN-B7 and BUGLE-B7 are appropriate for use in LWR shielding applications, and meet the calculational uncertainty criterion in Regulatory Guide 1.190.

  18. Simulated Space Radiation and Weightlessness: Vascular-Bone Coupling Mechanisms to Preserve Skeletal Health

    NASA Technical Reports Server (NTRS)

    Globus, R. K.; Alwood, J.; Tahimic, C.; Schreurs, A.-S.; Shirazi-Fard, Y.; Terada, M.; Zaragoza, J.; Truong, T.; Bruns, K.; Castillo, A.; hide

    2018-01-01

    We examined experimentally the effects of radiation and/or simulated weightlessness by hindlimb unloading on bone and blood vessel function, either after a short period or at a later time after transient exposures, in adult male C57BL/6J mice. In sum, recent findings from our studies show that in the short term, ionizing radiation and simulated weightlessness cause greater deficits in blood vessels when combined than either challenge alone. In the long term, heavy ion radiation, but not unloading, can lead to persistent, adverse consequences for bone and vessel function, possibly due to oxidative stress-related pathways.

  19. Technical Report: Benchmarking for Quasispecies Abundance Inference with Confidence Intervals from Metagenomic Sequence Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLoughlin, K.

    2016-01-22

    The software application “MetaQuant” was developed by our group at Lawrence Livermore National Laboratory (LLNL). It is designed to profile microbial populations in a sample using data from whole-genome shotgun (WGS) metagenomic DNA sequencing. Several other metagenomic profiling applications have been described in the literature. We ran a series of benchmark tests to compare the performance of MetaQuant against that of a few existing profiling tools, using real and simulated sequence datasets. This report describes our benchmarking procedure and results.

  20. MoMaS reactive transport benchmark using PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Park, H.

    2017-12-01

    The MoMaS benchmark was developed to enhance numerical simulation capability for reactive transport modeling in porous media. The benchmark was published in late September of 2009; it is not taken from a real chemical system, but it provides realistic and numerically challenging tests. PFLOTRAN is a state-of-the-art, massively parallel subsurface flow and reactive transport code that is being used in multiple nuclear waste repository projects at Sandia National Laboratories, including the Waste Isolation Pilot Plant and Used Fuel Disposition. The MoMaS benchmark has three independent tests with easy, medium, and hard chemical complexity. This paper demonstrates how PFLOTRAN is applied to this benchmark exercise and shows results for the easy benchmark test case, which includes mixing of aqueous components and surface complexation. The surface complexations consist of monodentate and bidentate reactions, which introduces difficulty in defining the selectivity coefficient if the reaction applies to a bulk reference volume. The selectivity coefficient becomes porosity dependent for the bidentate reaction in heterogeneous porous media. The benchmark was solved with PFLOTRAN with minimal modification to address this issue, and unit conversions were made to suit PFLOTRAN.

  1. Dependence of light scattering profile in tissue on blood vessel diameter and distribution: a computer simulation study.

    PubMed

    Duadi, Hamootal; Fixler, Dror; Popovtzer, Rachela

    2013-11-01

    Most methods for measuring light-tissue interactions focus on the volume reflectance, while very few measure the transmission. We investigate both diffuse reflection and diffuse transmission at all exit angles to obtain the full scattering profile. We also investigate the influence of blood vessel diameter on the scattering profile of a circular tissue. The photon propagation path at a wavelength of 850 nm is calculated from the absorption and scattering constants via Monte Carlo simulation. Several simulations are performed in which a different vessel diameter and location were chosen while the blood volume was kept constant. The fraction of photons exiting the tissue at several central angles is presented for each vessel diameter. The main result is that there is a central angle below which photon transmission decreases for smaller vessel diameters, while above this angle the opposite occurs. We find this central angle to be 135 deg for a two-dimensional 10-mm diameter circular tissue cross-section containing blood vessels. These findings can be useful for monitoring blood perfusion and oxygen delivery in the ear lobe and pinched tissues.
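
    A minimal 2D Monte Carlo sketch of the kind of full-scattering-profile calculation described above: photons random-walk through a circular tissue cross-section (isotropic scattering, absorption handled by photon weight), and exit angles are histogrammed. The optical properties and geometry are illustrative assumptions, much simpler than the paper's model (no anisotropy factor, no embedded vessels).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    R = 5.0                     # tissue radius (mm): 10 mm diameter cross-section
    mu_s, mu_a = 2.0, 0.05      # assumed scattering/absorption coefficients (1/mm)
    mu_t = mu_s + mu_a

    angles, weights = [], []
    for _ in range(5000):
        p = np.array([-R, 0.0])            # photon enters at the boundary...
        d = np.array([1.0, 0.0])           # ...heading into the tissue
        wgt = 1.0
        while True:
            p = p - np.log(rng.random()) / mu_t * d    # exponential free path
            if p @ p >= R * R:                         # photon exits the tissue
                angles.append(np.degrees(np.arctan2(p[1], p[0])))
                weights.append(wgt)
                break
            wgt *= mu_s / mu_t                         # absorb at each event
            phi = 2 * np.pi * rng.random()             # isotropic scattering (g=0)
            d = np.array([np.cos(phi), np.sin(phi)])

    # weighted histogram of exit angles ~ the full scattering profile;
    # 0 deg is straight-through transmission, +/-180 deg is backscatter
    profile, edges = np.histogram(angles, bins=36, range=(-180, 180), weights=weights)
    print(profile.round(2))
    ```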

  2. Computer Simulations of the Tumor Vasculature: Applications to Interstitial Fluid Flow, Drug Delivery, and Oxygen Supply.

    PubMed

    Welter, Michael; Rieger, Heiko

    2016-01-01

    Tumor vasculature, the blood vessel network supplying a growing tumor with nutrients such as oxygen or glucose, is in many respects different from the hierarchically organized arterio-venous blood vessel network in normal tissues. Angiogenesis (the formation of new blood vessels), vessel cooption (the integration of existing blood vessels into the tumor vasculature), and vessel regression remodel the healthy vascular network into a tumor-specific vasculature. Integrative models, based on detailed experimental data and physical laws, implement, in silico, the complex interplay of molecular pathways, cell proliferation, migration, and death, tissue microenvironment, mechanical and hydrodynamic forces, and the fine structure of the host tissue vasculature. With the help of computer simulations, high-precision information about blood flow patterns, interstitial fluid flow, drug distribution, and oxygen and nutrient distribution can be obtained, and a plethora of therapeutic protocols can be tested before clinical trials. This chapter provides an overview of the current status of computer simulations of vascular remodeling during tumor growth, including interstitial fluid flow, drug delivery, and oxygen supply within the tumor. The model predictions are compared with experimental and clinical data, and a number of longstanding physiological paradigms about tumor vasculature and intratumoral solute transport are critically scrutinized.

  3. Simulation of Targets Feeding Pipe Rupture in Wendelstein 7-X Facility Using RELAP5 and COCOSYS Codes

    NASA Astrophysics Data System (ADS)

    Kaliatka, T.; Povilaitis, M.; Kaliatka, A.; Urbonavicius, E.

    2012-10-01

    The Wendelstein 7-X (W7-X) nuclear fusion device is a stellarator-type experimental device developed by the Max Planck Institute of Plasma Physics. Rupture of one of the 40 mm inner diameter coolant pipes providing water for the divertor targets during the "baking" regime of facility operation is considered to be the most severe accident in terms of plasma vessel pressurization. The "baking" regime is the operating regime during which the plasma vessel structures are heated to a temperature acceptable for plasma ignition in the vessel. This paper presents a model of the W7-X cooling system (pumps, valves, pipes, hydro-accumulators, and heat exchangers), developed using the state-of-the-art thermal-hydraulic code RELAP5 Mod3.3, and a model of the plasma vessel, developed by employing the lumped-parameter code COCOSYS. Using both models, a numerical simulation of the processes in the W7-X cooling system and plasma vessel has been performed. The simulation results showed that an automatic valve closure time of 1 s is the most acceptable (no water hammer effect occurs) and that the selected burst disk area is sufficient to prevent overpressure in the plasma vessel.

  4. Impact of carotid stent cell design on vessel scaffolding: a case study comparing experimental investigation and numerical simulations.

    PubMed

    Conti, Michele; Van Loo, Denis; Auricchio, Ferdinando; De Beule, Matthieu; De Santis, Gianluca; Verhegghe, Benedict; Pirrelli, Stefano; Odero, Attilio

    2011-06-01

    To quantitatively evaluate the impact of carotid stent cell design on vessel scaffolding by using patient-specific finite element analysis of carotid artery stenting (CAS). The study was organized in 2 parts: (1) validation of a patient-specific finite element analysis of CAS and (2) evaluation of vessel scaffolding. Micro-computed tomography (CT) images of an open-cell stent deployed in a patient-specific silicone mock artery were compared with the corresponding finite element analysis results. This simulation was repeated for the closed-cell counterpart. In the second part, the stent strut distribution, as reflected by the inter-strut angles, was evaluated for both cell types in different vessel cross sections as a measure of scaffolding. The results of the patient-specific finite element analysis of CAS matched well with experimental stent deployment both qualitatively and quantitatively, demonstrating the reliability of the numerical approach. The measured inter-strut angles suggested that the closed-cell design provided superior vessel scaffolding compared to the open-cell counterpart. However, the full strut interconnection of the closed-cell design reduced the stent's ability to accommodate to the irregular eccentric profile of the vessel cross section, leading to a gap between the stent surface and the vessel wall. Even though this study was limited to a single stent design and one vascular anatomy, the study confirmed the capability of dedicated computer simulations to predict differences in scaffolding by open- and closed-cell carotid artery stents. These simulations have the potential to be used in the design of novel carotid stents or for procedure planning.

  5. Integral Full Core Multi-Physics PWR Benchmark with Measured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forget, Benoit; Smith, Kord; Kumar, Shikhar

    In recent years, the importance of modeling and simulation has been highlighted extensively in the DOE research portfolio, with concrete examples in nuclear engineering in the CASL and NEAMS programs. These research efforts and similar efforts worldwide aim at the development of high-fidelity multi-physics analysis tools for the simulation of current and next-generation nuclear power reactors. Like all analysis tools, verification and validation is essential to guarantee proper functioning of the software and methods employed. The current approach relies mainly on the validation of single-physics phenomena (e.g., critical experiments, flow loops), and there is a lack of relevant multiphysics benchmark measurements that are necessary to validate the high-fidelity methods being developed today. This work introduces a new multi-cycle full-core Pressurized Water Reactor (PWR) depletion benchmark based on two operational cycles of a commercial nuclear power plant that provides a detailed description of fuel assemblies, burnable absorbers, in-core fission detectors, core loading and re-loading patterns. This benchmark enables analysts to develop extremely detailed reactor core models that can be used for testing and validation of coupled neutron transport, thermal-hydraulics, and fuel isotopic depletion. The benchmark also provides measured reactor data for Hot Zero Power (HZP) physics tests, boron letdown curves, and three-dimensional in-core flux maps from 58 instrumented assemblies. The benchmark description is now available online and has been used by many groups. However, much work remains to be done on the quantification of uncertainties and modeling sensitivities. This work aims to address these deficiencies and make this benchmark a true non-proprietary international benchmark for the validation of high-fidelity tools. This report details the BEAVRS uncertainty quantification for the first two cycles of operation and serves as the final report of the project.

  6. Studies on in-vessel debris coolability in ALPHA program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maruyama, Yu; Yamano, Norihiro; Moriyama, Kiyofumi

    1997-02-01

    In-vessel debris coolability experiments have been performed in the ALPHA Program at JAERI. Aluminum oxide (Al₂O₃) produced by a thermite reaction was applied as a debris simulant. Two scoping experiments using approximately 30 kg or 50 kg of Al₂O₃ were conducted. In addition to post-test observations, temperature histories of the debris simulant and the lower head experimental vessel were evaluated. Rapid temperature reduction observed on the outer surface of the experimental vessel may imply that water penetration into a gap between the solidified debris and the experimental vessel occurred, resulting in effective cooling of the once-heated vessel wall. Preliminary measurement of the gap width was made with an ultrasonic device. Signals indicating the existence of gaps, ranging from 0.7 mm to 1.4 mm, were detected at several locations.

  7. A Navigation Safety Support Model for the Strait of Istanbul

    NASA Astrophysics Data System (ADS)

    Yazici, M. Anil; Otay, Emre N.

    In this study, a real-time maritime traffic support model is developed for safe navigation in the Strait of Istanbul, also known as the Bosporus. The present model simulates vessel trajectories corresponding to possible headings, using channel geometry, counter traffic, and surface currents as input. A new MATLAB code is developed for the simulation, and the Marine GNC Toolbox (Fossen and Perez, 2004) is used for the vessel hydrodynamics and the auto-pilot model. After computing the trajectory tree of the vessel by forward-mapping its position distribution with respect to the initial position vector, the casualty probabilities of each trajectory are found. Within certain restrictions on vessel geometry, the proposed model predicts the safest possible intended course for transit vessels based on navigational parameters including the position, speed, and course of the vessel. The model is tested on the Strait of Istanbul for validation. Without loss of generality, the model can be used for any narrow channel with a vessel traffic system providing the necessary input.

  8. Incomplete Spontaneous Recovery from Airway Obstruction During Inhaled Anesthesia Induction: A Computational Simulation.

    PubMed

    Kuo, Alexander S; Vijjeswarapu, Mary A; Philip, James H

    2016-03-01

    Inhaled induction with spontaneous respiration is a technique used for difficult airways. One of the proposed advantages is that, if airway patency is lost, the anesthetic agent will spontaneously redistribute until anesthetic depth is reduced and airway patency can be recovered. There are few, and conflicting, clinical or experimental data regarding the kinetics of this anesthetic technique. We used computer simulation to investigate this situation. We used GasMan, a computer simulation of inhaled anesthetic kinetics. For each simulation, alveolar ventilation was initiated with a set anesthetic induction concentration. When the vessel-rich group level reached the simulation-specified airway obstruction threshold, alveolar ventilation was set to 0 to simulate complete airway obstruction. The time until the vessel-rich group anesthetic level decreased below the airway obstruction threshold was designated the time to spontaneous recovery. We varied the parameters for each simulation, exploring the use of sevoflurane and halothane, airway obstruction thresholds from 0.5 to 2 minimum alveolar concentration (MAC), anesthetic induction concentrations of 2 to 4 MAC sevoflurane and 4 to 6 MAC halothane, cardiac output 2.5 to 10 L/min, functional residual capacity 1.5 to 3.5 L, and relative vessel-rich group perfusion 67% to 85%. In each simulation, there were 3 general phases: anesthetic wash-in, obstruction and overshoot, and then slow redistribution. During the first 2 phases, there was a large gradient between the alveolar and vessel-rich group levels. Alveolar levels do not reflect vessel-rich group anesthetic levels until the late third phase. Time to spontaneous recovery varied between 35 and 749 seconds for sevoflurane and 13 and 222 seconds for halothane, depending on the simulation parameters. Halothane had a faster time to spontaneous recovery because of the lower alveolar gradient and less overshoot of the vessel-rich group, not faster redistribution. Higher airway obstruction thresholds, lower anesthetic induction concentrations, and higher cardiac output reduced time to spontaneous recovery. To a lesser extent, decreased functional residual capacity and decreased relative vessel-rich group perfusion also reduced the time to spontaneous recovery. Spontaneous recovery after complete airway obstruction during inhaled induction is plausible, but the recovery time is highly variable and depends on the clinical and physiologic situation. These results emphasize that induction is a non-steady-state situation; thus, effect-site anesthetic levels should be modeled in future research, not alveolar concentration. Finally, this study provides an example of using computer simulation to explore situations that are difficult to investigate clinically.
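
    The qualitative behavior described above can be reproduced with a toy compartment model: an alveolar gas store exchanging agent with a vessel-rich group (VRG) and a slower muscle-group sink, with ventilation switched off once the VRG crosses the obstruction threshold. This is a deliberately crude sketch with hypothetical parameters, far simpler than GasMan's model; tensions are expressed in MAC units.

    ```python
    # Toy three-compartment model (alveolar gas, VRG, muscle-group sink) of the
    # obstruction scenario. All parameters are hypothetical and simplified.
    V_A   = 2.0     # functional residual capacity, L
    VAdot = 4.0     # alveolar ventilation, L/min (0 while obstructed)
    Q     = 5.0     # cardiac output, L/min
    f_vrg = 0.76    # fraction of Q perfusing the VRG (assumed)
    lam   = 0.65    # blood/gas partition coefficient (sevoflurane-like)
    C_vrg, C_mg = 6.0, 40.0    # effective tissue capacities, L gas-equivalent
    P_insp, thresh = 3.0, 1.0  # inspired tension and obstruction threshold, MAC

    dt, t = 1e-3, 0.0          # minutes
    P_A = P_v = P_m = 0.0
    obstructed, t_obs = False, None
    while t < 60.0:
        up_v = f_vrg * Q * lam * (P_A - P_v)          # lung -> VRG transfer
        up_m = (1 - f_vrg) * Q * lam * (P_A - P_m)    # lung -> muscle sink
        vent = 0.0 if obstructed else VAdot
        P_A += dt * (vent * (P_insp - P_A) - up_v - up_m) / V_A
        P_v += dt * up_v / C_vrg
        P_m += dt * up_m / C_mg
        t += dt
        if not obstructed and P_v >= thresh:
            obstructed, t_obs = True, t               # airway patency lost
        elif obstructed and P_v < thresh:
            print(f"time to spontaneous recovery ~ {(t - t_obs) * 60:.0f} s")
            break
    ```

    Even this crude sketch shows the three phases the record describes: a wash-in with a large alveolar-to-VRG gradient, an overshoot frozen in at obstruction, and a slow redistribution into the muscle sink that eventually brings the VRG back below threshold.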

  9. Performance of exchange-correlation functionals in density functional theory calculations for liquid metal: A benchmark test for sodium.

    PubMed

    Han, Jeong-Hwan; Oda, Takuji

    2018-04-14

    The performance of exchange-correlation functionals in density-functional theory (DFT) calculations for liquid metal has not been sufficiently examined. In the present study, benchmark tests of Perdew-Burke-Ernzerhof (PBE), Armiento-Mattsson 2005 (AM05), PBE re-parameterized for solids, and local density approximation (LDA) functionals are conducted for liquid sodium. The pair correlation function, equilibrium atomic volume, bulk modulus, and relative enthalpy are evaluated at 600 K and 1000 K. Compared with the available experimental data, the errors range from -11.2% to 0.0% for the atomic volume, from -5.2% to 22.0% for the bulk modulus, and from -3.5% to 2.5% for the relative enthalpy depending on the DFT functional. The generalized gradient approximation functionals are superior to the LDA functional, and the PBE and AM05 functionals exhibit the best performance. In addition, we assess whether the error tendency in liquid simulations is comparable to that in solid simulations, which would suggest that the atomic volume and relative enthalpy performances are comparable between solid and liquid states but that the bulk modulus performance is not. These benchmark test results indicate that the results of liquid simulations are significantly dependent on the exchange-correlation functional and that the DFT functional performance in solid simulations can be used to roughly estimate the performance in liquid simulations.
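
    For reference, the pair correlation function g(r) evaluated in this benchmark is computed from simulation snapshots by histogramming pair distances and normalizing by the ideal-gas shell count. A generic sketch for a cubic periodic box follows; random coordinates stand in for an actual MD configuration of liquid sodium.

    ```python
    import numpy as np

    def pair_correlation(pos, box, nbins=100, rmax=None):
        """g(r) for N atoms at positions pos (N, 3) in a cubic periodic box."""
        N = len(pos)
        rmax = rmax or box / 2.0
        dr = rmax / nbins
        hist = np.zeros(nbins)
        for i in range(N - 1):
            d = pos[i + 1:] - pos[i]
            d -= box * np.round(d / box)          # minimum-image convention
            r = np.linalg.norm(d, axis=1)
            hist += np.histogram(r[r < rmax], bins=nbins, range=(0.0, rmax))[0]
        rho = N / box**3
        r_mid = (np.arange(nbins) + 0.5) * dr
        shell = 4.0 * np.pi * r_mid**2 * dr       # ideal-gas shell volume
        # each pair counted once -> factor 2/N for the per-atom average
        return r_mid, 2.0 * hist / (N * rho * shell)

    rng = np.random.default_rng(0)
    r, g = pair_correlation(rng.uniform(0.0, 20.0, (400, 3)), box=20.0)
    print(g[:5].round(2))   # ~1 everywhere for an uncorrelated (ideal) gas
    ```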

  10. Performance of exchange-correlation functionals in density functional theory calculations for liquid metal: A benchmark test for sodium

    NASA Astrophysics Data System (ADS)

    Han, Jeong-Hwan; Oda, Takuji

    2018-04-01

    The performance of exchange-correlation functionals in density-functional theory (DFT) calculations for liquid metal has not been sufficiently examined. In the present study, benchmark tests of Perdew-Burke-Ernzerhof (PBE), Armiento-Mattsson 2005 (AM05), PBE re-parameterized for solids, and local density approximation (LDA) functionals are conducted for liquid sodium. The pair correlation function, equilibrium atomic volume, bulk modulus, and relative enthalpy are evaluated at 600 K and 1000 K. Compared with the available experimental data, the errors range from -11.2% to 0.0% for the atomic volume, from -5.2% to 22.0% for the bulk modulus, and from -3.5% to 2.5% for the relative enthalpy depending on the DFT functional. The generalized gradient approximation functionals are superior to the LDA functional, and the PBE and AM05 functionals exhibit the best performance. In addition, we assess whether the error tendency in liquid simulations is comparable to that in solid simulations, which would suggest that the atomic volume and relative enthalpy performances are comparable between solid and liquid states but that the bulk modulus performance is not. These benchmark test results indicate that the results of liquid simulations are significantly dependent on the exchange-correlation functional and that the DFT functional performance in solid simulations can be used to roughly estimate the performance in liquid simulations.

  11. An Integrated Development Environment for Adiabatic Quantum Programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S; McCaskey, Alex; Bennink, Ryan S

    2014-01-01

    Adiabatic quantum computing is a promising route to the computational power afforded by quantum information processing. The recent availability of adiabatic hardware raises the question of how well quantum programs perform. Benchmarking behavior is challenging since the multiple steps to synthesize an adiabatic quantum program are highly tunable. We present an adiabatic quantum programming environment called JADE that provides control over all the steps taken during program development. JADE captures the workflow needed to rigorously benchmark performance while also allowing a variety of problem types, programming techniques, and processor configurations. We have also integrated JADE with a quantum simulation engine that enables program profiling using numerical calculation. The computational engine supports plug-ins for simulation methodologies tailored to various metrics and computing resources. We present the design, integration, and deployment of JADE and discuss its use for benchmarking adiabatic quantum programs.

  12. The PPP Simulator: User’s Manual and Report

    DTIC Science & Technology

    1986-11-01

    [OCR-garbled terminal transcript from the user's manual: a script session showing the ppp simulator loading and running parallel benchmark programs (e.g., Benchmarks/Par/ccon6.w) under /a/hprg/fagin/PPPl/Benchmarks, followed by fragments of the simulator's C source. No further content is recoverable.]

  13. Summary of the Tandem Cylinder Solutions from the Benchmark Problems for Airframe Noise Computations-I Workshop

    NASA Technical Reports Server (NTRS)

    Lockard, David P.

    2011-01-01

    Fifteen submissions in the tandem cylinders category of the First Workshop on Benchmark Problems for Airframe Noise Computations are summarized. Although the geometry is relatively simple, the problem involves complex physics. Researchers employed various block-structured, overset, unstructured, and embedded Cartesian grid techniques and considerable computational resources to simulate the flow. The solutions are compared against each other and against experimental data from two facilities. Overall, the simulations captured the gross features of the flow, but resolving all the details that would be necessary to compute the noise remains challenging. In particular, how best to simulate the effects of the experimental transition strip, and the associated high Reynolds number effects, was unclear. Furthermore, capturing the spanwise variation proved difficult.

  14. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    NASA Technical Reports Server (NTRS)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.

  15. Calculation and benchmarking of an azimuthal pressure vessel neutron fluence distribution using the BOXER code and scraping experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holzgrewe, F.; Hegedues, F.; Paratte, J.M.

    1995-03-01

    The light water reactor code BOXER was used to determine the fast azimuthal neutron fluence distribution at the inner surface of the reactor pressure vessel after the tenth cycle of a pressurized water reactor (PWR). Using a cross-section library in 45 groups, fixed-source calculations in transport theory and x-y geometry were carried out to determine the fast azimuthal neutron flux distribution at the inner surface of the pressure vessel for four different cycles. From these results, the fast azimuthal neutron fluence after the tenth cycle was estimated and compared with the results obtained from scraping test experiments. In these experiments, small samples of material were taken from the inner surface of the pressure vessel, and the fast neutron fluence was then determined from the measured activity of the samples. The BOXER and scraping test results show maximal differences of 15%, which is very good considering the factor of 10³ neutron attenuation between the reactor core and the pressure vessel. To compare the BOXER results with an independent code, the 21st cycle of the PWR was also calculated with the TWODANT two-dimensional transport code, using the same group structure and cross-section library. Deviations in the fast azimuthal flux distribution were found to be <3%, which verifies the accuracy of the BOXER results.

  16. Commissioning and experimental validation of SST-1 plasma facing components

    NASA Astrophysics Data System (ADS)

    Paravastu, Yuvakiran; Raval, Dilip; Khan, Ziauddin; Patel, Hitesh; Biswas, Prabal; Parekh, Tejas; George, Siju; Santra, Prosenjit; Ramesh, Gattu; ArunPrakash, A.; Thankey, Prashant; Semwal, Pratibha; Dhanani, Kalpeshkumar R.; Jaiswal, Snehal; Chauhan, Pradeep; Pradhan, Subrata

    2017-04-01

    The plasma facing components (PFCs) of SST-1 are designed to withstand an input heat load of 1.0 MW/m². They protect the vacuum vessel, the auxiliary heating sources (RF antennas and NBI), and other in-vessel diagnostics from plasma particles and high radiative heat loads. The PFCs are positioned symmetrically about the mid-plane to accommodate circular, single-null, and double-null configurations. Graphite is used as the plasma facing material, backed by copper alloy plates; stainless steel cooling/baking tubes are brazed onto the copper alloy back plates for efficient removal of the incident heat flux. Benchmarking of the PFC assembly was first carried out in the prototype vacuum vessel of SST-1 to develop an understanding and methodology for the coordinate measurements. Based on this hands-on experience, the final assembly of the PFCs in the vacuum vessel of SST-1 was carried out. The PFCs are first baked at 250 °C for wall conditioning and subsequently cooled to remove the incident heat flux during long pulse plasma operation. For this purpose, the supply and return headers are designed and installed inside the vacuum vessel such that they can carry water as well as hot nitrogen gas, depending upon the cycle. This paper discusses the successful installation of the PFCs and their plasma operation respecting all design criteria.

  17. Threshold Gravity Determination and Artificial Gravity Studies Using Magnetic Levitation

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.; Leslie, F.

    2005-01-01

    What is the threshold gravity (minimum gravity level) required for the nominal functioning of the human system? What dosage is required (magnitude and duration)? Do human cell lines behave differently in microgravity in response to an external stimulus? The critical need for a variable gravity simulator is emphasized by recent experiments on human epithelial cells and lymphocytes on the Space Shuttle clearly showing that cell growth and function are markedly different from those observed terrestrially. Those differences are also dramatic between cells grown in space and those grown in Rotating Wall Vessels (RWVs), the NASA bioreactor often used to simulate microgravity, indicating that although morphological growth patterns (three-dimensional growth) can be successfully simulated using RWVs, cell function performance is not reproduced - a critical difference. If cell function is dramatically affected by gravity off-loading, then cell response to stimuli such as radiation, stress, etc. can be very different from that of terrestrial cell lines. Yet we have no good gravity simulator for use in the study of these phenomena. This represents a profound shortcoming for countermeasures research. We postulate that magnetic levitation of cells and tissue, through the use of strong magnetic fields and field gradients, can serve as a terrestrial microgravity model for studying human cells. Specific objectives of the research are: 1. to develop a tried, tested, and benchmarked terrestrial microgravity model for cell culture studies; 2. to determine the gravity threshold; 3. to determine the dosage (magnitude and duration) of g-level required for nominal functioning of cells; 4. to compare the magnetic levitation model to other models such as RWVs, hind limb suspension, etc.; and 5. to characterize cellular response to the reduced gravity levels of the Moon and Mars.

  18. Comparing Hospital Processes and Outcomes in California Medicare Beneficiaries: Simulation Prompts Reconsideration

    PubMed Central

    Escobar, Gabriel J; Baker, Jennifer M; Turk, Benjamin J; Draper, David; Liu, Vincent; Kipnis, Patricia

    2017-01-01

    Introduction This article is not a traditional research report. It describes how conducting a specific set of benchmarking analyses led us to broader reflections on hospital benchmarking. We reexamined an issue that has received far less attention from researchers than in the past: how variations in the hospital admission threshold might affect hospital rankings. Considering this threshold made us reconsider what benchmarking is and what future benchmarking studies might be like. Although we recognize that some of our assertions are speculative, they are based on our reading of the literature and on previous and ongoing data analyses conducted in our research unit. We describe the benchmarking analyses that led to these reflections. Objectives The Centers for Medicare and Medicaid Services’ Hospital Compare Web site includes data on fee-for-service Medicare beneficiaries but does not control for severity of illness, which requires physiologic data now available in most electronic medical records. To address this limitation, we compared hospital processes and outcomes among Kaiser Permanente Northern California’s (KPNC) Medicare Advantage beneficiaries and non-KPNC California Medicare beneficiaries between 2009 and 2010. Methods We assigned a simulated severity of illness measure to each record and explored the effect of having the additional information on outcomes. Results We found that if the admission severity of illness in non-KPNC hospitals increased, KPNC hospitals’ mortality performance would appear worse; conversely, if admission severity at non-KPNC hospitals decreased, KPNC hospitals’ performance would appear better. Conclusion Future hospital benchmarking should consider the impact of variation in admission thresholds. PMID:29035176

  19. Structural response of 1/20-scale models of the Clinch River Breeder Reactor to a simulated hypothetical core disruptive accident. Technical report 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romander, C. M.; Cagliostro, D. J.

    Five experiments were performed to help evaluate the structural integrity of the reactor vessel and head design and to verify code predictions. In the first experiment (SM 1), a detailed model of the head was loaded statically to determine its stiffness. In the remaining four experiments (SM 2 to SM 5), models of the vessel and head were loaded dynamically under a simulated 661 MW-sec hypothetical core disruptive accident (HCDA). Models SM 2 to SM 4, each of increasing complexity, systematically showed the effects of upper internals structures, a thermal liner, core support platform, and torospherical bottom on vessel response. Model SM 5, identical to SM 4 but more heavily instrumented, demonstrated experimental reproducibility and provided more comprehensive data. The models consisted of a Ni 200 vessel and core barrel, a head with shielding and simulated component masses, an upper internals structure (UIS), and, in the more complex models SM 4 and SM 5, a Ni 200 thermal liner and core support structure. Water simulated the liquid sodium coolant and a low-density explosive simulated the HCDA loads.

  20. Benchmark Problems of the Geothermal Technologies Office Code Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Mark D.; Podgorney, Robert; Kelkar, Sharad M.

    A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study were those representing U.S. national laboratories, universities, and industries, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications whereas some others were designed for different applications but can simulate processes similar to those in EGS. Solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems involved two phases of research, covering stimulation, development, and circulation in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners.

  1. Full dimensional computer simulations to study pulsatile blood flow in vessels, aortic arch and bifurcated veins: Investigation of blood viscosity and turbulent effects.

    PubMed

    Sultanov, Renat A; Guster, Dennis

    2009-01-01

    We report computational results on blood flow through a model of the human aortic arch and a vessel of actual diameter and length. A realistic pulsatile flow is used in all simulations. Calculations for bifurcation-type vessels are also carried out and presented. Different mathematical methods for the numerical solution of the fluid dynamics equations have been considered. The non-Newtonian behaviour of human blood is investigated together with turbulence effects. A detailed time-dependent mathematical convergence test has been carried out. The results of computer simulations of blood flow in vessels of three different geometries are presented: for the pressure, strain rate, and velocity component distributions, we found significant disagreement between results obtained with a realistic non-Newtonian treatment of human blood and those from the simple Newtonian approximation widely used in the literature. A significant increase of the strain rate and, as a result, of the wall shear stress distribution is found in the region of the aortic arch. Turbulent effects are found to be important, particularly in the case of bifurcation vessels.
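
    The abstract does not name its constitutive model for blood; a common choice for the non-Newtonian treatment it contrasts with the Newtonian approximation is the Carreau model. The sketch below uses literature parameter values often quoted for blood (Cho and Kensey), which should be read as placeholders rather than the paper's settings.

      import numpy as np

      # Carreau viscosity law for blood: shear-thinning between a zero-shear and
      # an infinite-shear plateau. Assumed here for illustration; the paper does
      # not state which non-Newtonian model it uses.

      MU_0 = 0.056      # zero-shear viscosity, Pa*s (literature value, assumed)
      MU_INF = 0.00345  # infinite-shear viscosity, Pa*s
      LAM = 3.313       # relaxation time, s
      N_EXP = 0.3568    # power-law index

      def carreau_viscosity(shear_rate):
          """Effective viscosity (Pa*s) as a function of strain rate (1/s)."""
          return MU_INF + (MU_0 - MU_INF) * (1.0 + (LAM * shear_rate) ** 2) ** ((N_EXP - 1) / 2)

      for g in (0.1, 1.0, 10.0, 100.0, 1000.0):
          print(f"shear rate {g:8.1f} 1/s -> viscosity {carreau_viscosity(g):.5f} Pa*s")

    At high strain rates, such as those reported in the aortic arch, the effective viscosity approaches the infinite-shear plateau, which is why the Newtonian and non-Newtonian treatments diverge most in low-shear recirculation regions.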

  2. Large-scale testing of in-vessel debris cooling through external flooding of the reactor pressure vessel in the CYBL facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, T.Y.; Bentz, J.H.; Bergeron, K.D.

    1994-04-01

    The possibility of achieving in-vessel core retention by flooding the reactor cavity, the "flooded cavity" concept, is an accident management strategy currently under consideration for advanced light water reactors (ALWRs), as well as for existing light water reactors (LWRs). The CYBL (CYlindrical BoiLing) facility is specifically designed to perform large-scale confirmatory testing of the flooded cavity concept. CYBL has a tank-within-a-tank design: the inner 3.7 m diameter tank simulates the reactor vessel, and the outer tank simulates the reactor cavity. The energy deposition on the bottom head is simulated with an array of radiant heaters, which can deliver a tailored heat flux distribution corresponding to that resulting from core melt convection. The present paper provides a detailed description of the capabilities of the facility, as well as results of recent experiments with heat fluxes in the range required for in-vessel retention in typical ALWRs. The paper concludes with a discussion of other experiments for flooded cavity applications.

  3. Hollow fiber clinostat for simulating microgravity in cell culture

    NASA Technical Reports Server (NTRS)

    Rhodes, Percy H. (Inventor); Miller, Teresa Y. (Inventor); Snyder, Robert S. (Inventor)

    1992-01-01

    A clinostat for simulating microgravity on cell systems carried in a fiber fixedly mounted in a rotatable culture vessel is disclosed. The clinostat is rotated horizontally along its longitudinal axis to simulate microgravity, or vertically as a control response. Cells are injected into the fiber, and the ends of the fiber are sealed and secured to spaced end pieces of a fiber holder assembly, which consists of the end pieces, a hollow fiber, a culture vessel, and a tension spring with three alignment pins. The tension spring is positioned around the culture vessel with its ends abutting the end pieces for alignment of the spring. After the fiber is secured, the spring is decompressed to maintain tension on the fiber while it is being rotated. This assures that the fiber remains aligned along the axis of rotation. The fiber assembly is placed in the culture vessel and culture medium is added. The culture vessel is then inserted into the rotatable portion of the clinostat and rotated at selected speeds (rpm). The internal diameter of the hollow fiber determines the distance of the cells from the axis of rotation.

  4. Effect of Rolling Massage on the Vortex Flow in Blood Vessels with Lattice Boltzmann Simulation

    NASA Astrophysics Data System (ADS)

    Yi, Hou Hui

    Rolling massage manipulation is a classic Chinese medical massage, a natural therapy used to treat many diseases. Here, the effect of rolling massage on cavity flow in a blood vessel under the rolling manipulation is studied by lattice Boltzmann simulation. The simulation results show that the vortex flows are fully disturbed by the rolling massage. The flow behavior depends on the rolling velocity and the rolling depth. Rolling massage has a stronger effect on the flow in the cavity than on the flow in a planar blood vessel. The result is helpful for understanding the mechanism of the massage and developing the rolling techniques.
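
    For readers unfamiliar with the method, the core of a lattice Boltzmann simulation like the one in this record is a collide-and-stream update over a small set of discrete velocities. The following is a minimal D2Q9 BGK sketch in Python; the vessel geometry, the moving roller boundary, and physical units are all omitted.

      import numpy as np

      # Minimal D2Q9 lattice Boltzmann (BGK) core: collide and stream on a
      # periodic box. Boundary conditions and the rolling manipulation are
      # deliberately left out of this sketch.

      W = np.array([4/9] + [1/9]*4 + [1/36]*4)                 # D2Q9 weights
      E = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],
                    [1,1],[-1,1],[-1,-1],[1,-1]])              # lattice velocities
      TAU = 0.6                                                # relaxation time

      def equilibrium(rho, u):
          eu = np.einsum('qd,xyd->qxy', E, u)
          usq = np.einsum('xyd,xyd->xy', u, u)
          return W[:, None, None] * rho * (1 + 3*eu + 4.5*eu**2 - 1.5*usq)

      nx, ny = 64, 32
      rho = np.ones((nx, ny))
      u = np.zeros((nx, ny, 2))
      f = equilibrium(rho, u)

      for step in range(100):
          rho = f.sum(axis=0)                                  # density moment
          u = np.einsum('qd,qxy->xyd', E.astype(float), f) / rho[..., None]
          f += -(f - equilibrium(rho, u)) / TAU                # BGK collision
          for q in range(9):                                   # periodic streaming
              f[q] = np.roll(f[q], shift=tuple(E[q]), axis=(0, 1))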

  5. Cyber-Based Turbulent Combustion Simulation

    DTIC Science & Technology

    2012-02-28

    [Fragmentary OCR excerpt] The recoverable text describes numerical improvements that suppress oscillatory behavior and a validation of computed flame thickness and species fraction against benchmark results of AFRL/RZ; the validating base was generated by the UNICORN program on the finest available mesh, using shared kinematic and thermodynamic data from UNICORN.

  6. METC CFD simulations of hot gas filtration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Brien, T.J.

    1995-06-01

    Computational Fluid Dynamic (CFD) simulations of the fluid/particle flow in several hot gas filtration vessels will be presented. These simulations have been useful in designing filtration vessels and in diagnosing problems with filter operation. The simulations were performed using the commercial code FLUENT and the METC-developed code MFIX. Simulations of the initial configuration of the Karhula facility indicated that the dirty gas flow over the filter assemblage was very non-uniform. The force of the dirty gas inlet flow was inducing a large circulation pattern that caused flow around the candles to be in opposite directions on opposite sides of the vessel. By introducing a system of baffles, a more uniform flow pattern was developed. This modification may have contributed to the success of the project. Several simulations of configurations proposed by Industrial Filter and Pump were performed, varying the position of the inlet. A detailed resolution of the geometry of the candles allowed determination of the flow between the individual candles. Recent simulations in support of the METC/CeraMem Cooperative Research and Development Agreement have analyzed the flow in the vessel during the cleaning back-pulse. Visualization of experiments at the CeraMem cold-flow facility provided confidence in the use of CFD. Extensive simulations were then performed to assist in the design of the hot test facility being built by Ahlstrom/Pyropower. These tests are intended to demonstrate the CeraMem technology.

  7. The future of simulation technologies for complex cardiovascular procedures.

    PubMed

    Cates, Christopher U; Gallagher, Anthony G

    2012-09-01

    Changing work practices and the evolution of more complex interventions in cardiovascular medicine are forcing a paradigm shift in the way doctors are trained. Implantable cardioverter defibrillator (ICD), transcatheter aortic valve implantation (TAVI), carotid artery stenting (CAS), and acute stroke intervention procedures are forcing these changes at a faster pace than in other disciplines. As a consequence, cardiovascular medicine has had to develop a sophisticated understanding of precisely what is meant by 'training' and 'skill'. An evolving conclusion is that procedure training on a virtual reality (VR) simulator presents a viable current solution. These simulations should characterize the important performance characteristics of procedural skill, with metrics derived and defined from, and then benchmarked to, experienced operators (i.e. a level of proficiency). Simulation training is optimal with metric-based feedback, particularly formative trainee error assessments delivered proximate to performance. In prospective, randomized studies, learners who trained to a benchmarked proficiency level on the simulator performed significantly better than traditionally trained learners. In addition, cardiovascular medicine now has available the most sophisticated virtual reality simulators in medicine, and these have been used for the roll-out of interventions such as CAS in the USA and globally, with cardiovascular society and industry partnered training programmes. The Food and Drug Administration has advocated the use of VR simulation as part of the approval of new devices, and the American Board of Internal Medicine has adopted simulation as part of its maintenance of certification. Simulation is rapidly becoming a mainstay of cardiovascular education, training, certification, and the safe adoption of new technology. If cardiovascular medicine is to continue to lead in the adoption and integration of simulation, it must take a proactive position in the development of metric-based simulation curricula, adopt proficiency benchmarking definitions, and commit the resources needed to continue to lead this revolution in physician training.

  8. Avoiding the parametric roll

    NASA Astrophysics Data System (ADS)

    Acomi, Nicoleta; Ancuţa, Cristian; Andrei, Cristian; Boştinǎ, Alina; Boştinǎ, Aurel

    2016-12-01

    Ships are built mainly to sail and transport cargo at sea. Environmental conditions and the state of the sea are communicated to vessels through periodic weather forecasts. Although officers are aware of the sea state, their sea-time experience is a decisive factor when the vessel encounters severe environmental conditions. Another important factor is the loading condition of the vessel, which triggers different behaviour under similar marine environmental conditions. This paper aims to analyse the behaviour of a port container vessel in severe environmental conditions and to estimate the potential conditions for parametric roll resonance. Octopus software is employed to simulate vessel motions under given sea conditions, making it possible to analyse the behaviour of ships and the impact of high waves due to specific wave encounter situations. The study should be regarded as a supporting tool for the decision-making process.

  9. Importance of inlet boundary conditions for numerical simulation of combustor flows

    NASA Technical Reports Server (NTRS)

    Sturgess, G. J.; Syed, S. A.; Mcmanus, K. R.

    1983-01-01

    Fluid dynamic computer codes for the mathematical simulation of problems in gas turbine engine combustion systems are required as design and diagnostic tools. To eventually achieve a performance standard of more than qualitative accuracy with these codes, it is desirable to use benchmark experiments for validation studies. Typical of the fluid dynamic computer codes being developed for combustor simulations is the TEACH (Teaching Elliptic Axisymmetric Characteristics Heuristically) solution procedure. It is difficult to find suitable experiments that satisfy the present definition of benchmark quality; for the majority of the available experiments there is a lack of information concerning the boundary conditions. A standard TEACH-type numerical technique is applied to a number of test-case experiments. It is found that numerical simulations of gas turbine combustor-relevant flows can be sensitive to the plane at which the calculations start and, for swirling flows, to the spatial distributions of inlet quantities.

  10. Studies of aggregated nanoparticles steering during magnetic-guided drug delivery in the blood vessels

    NASA Astrophysics Data System (ADS)

    Hoshiar, Ali Kafash; Le, Tuan-Anh; Amin, Faiz Ul; Kim, Myeong Ok; Yoon, Jungwon

    2017-04-01

    Magnetic-guided targeted drug delivery (TDD) systems can enhance the treatment of diverse diseases. Despite the potential and promising results of nanoparticles, aggregation prevents precise particle guidance in the vasculature. In this study, we developed a simulation platform to investigate aggregation during the steering of nanoparticles using a magnetic field function (MFF). The MFF comprises positive and negative pulsed magnetic fields generated by electromagnetic coils, which prevent adherence of particles to the vessel wall during magnetic guidance. A commonly used Y-shaped vessel was simulated and the performance of the MFF analyzed; the experimental data were in agreement with the simulation results. Moreover, the effects of various parameters on magnetic guidance were evaluated and the most influential ones identified. The simulation results presented herein will facilitate more precise guidance of nanoparticles in vivo.
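
    The record describes the magnetic field function only qualitatively, as a positive and negative pulsed field. A minimal sketch of such an alternating pulse train follows; the amplitude and period are placeholders, not values from the study.

      import numpy as np

      # Square-wave sketch of a magnetic field function (MFF): the field sign
      # reverses every half-period so the magnetic force on aggregates
      # alternates, discouraging adherence to the vessel wall during steering.
      # B_AMP and PERIOD are placeholders, not parameters from the paper.

      B_AMP = 0.05      # field amplitude, T (placeholder)
      PERIOD = 0.5      # pulse period, s (placeholder)

      def mff(t):
          """Pulsed field: +B for the first half-period, -B for the second."""
          return B_AMP * np.where((t % PERIOD) < PERIOD / 2, 1.0, -1.0)

      t = np.linspace(0.0, 2.0, 9)
      print(np.column_stack([t, mff(t)]))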

  11. Development of an Implantable WBAN Path-Loss Model for Capsule Endoscopy

    NASA Astrophysics Data System (ADS)

    Aoyagi, Takahiro; Takizawa, Kenichi; Kobayashi, Takehiko; Takada, Jun-Ichi; Hamaguchi, Kiyoshi; Kohno, Ryuji

    An implantable WBAN path-loss model for capsule endoscopy, which is used for examining the digestive organs, is developed by conducting simulations and experiments. First, we performed FDTD simulations of implant WBAN propagation using a numerical human model. Second, we performed FDTD simulations on a vessel that represents the human body. Third, we performed experiments using a vessel of the same dimensions as that used in the simulations. On the basis of the results of these simulations and experiments, we propose the gradient and intercept parameters of a simple in-body path-loss propagation model.
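
    The "gradient and intercept" parameters refer to a log-distance path-loss model; its standard form is assumed here, with placeholder numbers rather than the paper's fitted values.

      import numpy as np

      # Log-distance path-loss model PL(d) = PL(d0) + 10*n*log10(d/d0), where the
      # intercept PL(d0) and the gradient n are the fitted parameters. The
      # numeric values below are placeholders, not the paper's results.

      PL_D0 = 50.0   # intercept: path loss at reference distance d0, dB (placeholder)
      N_GRAD = 4.0   # gradient: path-loss exponent in lossy tissue (placeholder)
      D0 = 0.05      # reference distance, m

      def path_loss_db(d):
          """Path loss in dB at in-body distance d (m)."""
          return PL_D0 + 10.0 * N_GRAD * np.log10(d / D0)

      for d in (0.05, 0.10, 0.15, 0.20):
          print(f"d = {d:.2f} m -> {path_loss_db(d):.1f} dB")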

  12. Gramicidin S production by Bacillus brevis in simulated microgravity

    NASA Technical Reports Server (NTRS)

    Fang, A.; Pierson, D. L.; Mishra, S. K.; Koenig, D. W.; Demain, A. L.

    1997-01-01

    In a continuing study of microbial secondary metabolism in simulated microgravity, we have examined gramicidin S (GS) production by Bacillus brevis strain Nagano in NASA High Aspect Rotating Vessels (HARVs), which are designed to simulate some aspects of microgravity. Growth and GS production were found to occur under simulated microgravity. When performance under simulated microgravity was compared with that under normal gravity conditions in the bioreactors, GS production was found to be unaffected by simulated microgravity. The repressive effect of glycerol in flask fermentations was not observed in the HARV. Thus the negative effect of glycerol on specific GS formation is dependent on shear and/or vessel geometry, not gravity.

  13. Towards Systematic Benchmarking of Climate Model Performance

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2014-12-01

    The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research. Making the results from routine performance tests readily accessible will help advance a more transparent model evaluation process.

  14. Benchmark Evaluation of HTR-PROTEUS Pebble Bed Experimental Program

    DOE PAGES

    Bess, John D.; Montierth, Leland; Köberl, Oliver; ...

    2014-10-09

    Benchmark models were developed to evaluate 11 critical core configurations of the HTR-PROTEUS pebble bed experimental program. Various additional reactor physics measurements were performed as part of this program; currently only a total of 37 absorber rod worth measurements have been evaluated as acceptable benchmark experiments for Cores 4, 9, and 10. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the ²³⁵U enrichment of the fuel, impurities in the moderator pebbles, and the density and impurity content of the radial reflector. Calculations of keff with MCNP5 and ENDF/B-VII.0 neutron nuclear data are greater than the benchmark values but within 1% and also within the 3σ uncertainty, except for Core 4, which is the only randomly packed pebble configuration. Repeated calculations of keff with MCNP6.1 and ENDF/B-VII.1 are lower than the benchmark values and within 1% (~3σ), except for Cores 5 and 9, which calculate lower than the benchmark eigenvalues but within 4σ. The primary difference between the two nuclear data libraries is the adjustment of the absorption cross section of graphite. Simulations of the absorber rod worth measurements are within 3σ of the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.

  15. Progressive Fracture and Damage Tolerance of Composite Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Gotsis, Pascal K.; Minnetyan, Levon

    1997-01-01

    Structural performance (integrity, durability, and damage tolerance) of fiber reinforced composite pressure vessels, designed as pressurized shelters for planetary exploration, is investigated via computational simulation. An integrated computer code is utilized for the simulation of damage initiation, growth, and propagation under pressure. Aramid fibers are considered in a rubbery polymer matrix for the composite system. Effects of fiber orientation and of fabrication defects/accidental damage are investigated with regard to the safety and durability of the shelter. Results show the viability of fiber reinforced pressure vessels as damage tolerant shelters for planetary colonization.

  16. Could the heat sink effect of blood flow inside large vessels protect the vessel wall from thermal damage during RF-assisted surgical resection?

    PubMed

    González-Suárez, Ana; Trujillo, Macarena; Burdío, Fernando; Andaluz, Anna; Berjano, Enrique

    2014-08-01

    The aim was to assess, by means of computer simulations, whether the heat sink effect inside a large vessel (portal vein) could protect the vessel wall from thermal damage close to an internally cooled electrode during radiofrequency (RF)-assisted resection. First, in vivo experiments were conducted to validate the computational model by comparing the experimental and computational thermal lesion shapes created around the vessels. Computer simulations were then carried out to study the effect of different factors such as device-tissue contact, vessel position, and vessel-device distance on temperature distributions and thermal lesion shapes near a large vessel, specifically the portal vein. The geometries of thermal lesions around the vessels in the in vivo experiments were in agreement with the computer results. The thermal lesion shape created around the portal vein was significantly modified by the heat sink effect in all the cases considered. Thermal damage to the portal vein wall was inversely related to the vessel-device distance. It was also more pronounced when the device-tissue contact surface was reduced or when the vessel was parallel to the device or perpendicular to its distal end (blade zone), the vessel wall being damaged at distances less than 4.25 mm. The computational findings suggest that the heat sink effect could protect the portal vein wall at distances equal to or greater than 5 mm, regardless of its position and distance with respect to the RF-based device.

  17. Implementing ADM1 for plant-wide benchmark simulations in Matlab/Simulink.

    PubMed

    Rosen, C; Vrecko, D; Gernaey, K V; Pons, M N; Jeppsson, U

    2006-01-01

    The IWA Anaerobic Digestion Model No. 1 (ADM1) was presented in 2002 and is expected to represent the state-of-the-art model within this field in the future. Due to its complexity, the implementation of the model is not a simple task, and several computational aspects need to be considered, in particular if the ADM1 is to be included in dynamic simulations of plant-wide or even integrated systems. In this paper, the experiences gained from a Matlab/Simulink implementation of ADM1 into the extended COST/IWA Benchmark Simulation Model (BSM2) are presented. Aspects related to system stiffness, model interfacing with the ASM family, mass balances, acid-base equilibrium and algebraic solvers for pH and other troublesome state variables, numerical solvers, and simulation time are discussed. The main conclusion is that, if implemented properly, the ADM1 will also produce high-quality results in dynamic plant-wide simulations including noise, discrete sub-systems, etc., without imposing any major restrictions due to extensive computational efforts.
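
    One of the troublesome aspects the paper names is the algebraic pH solve. A minimal sketch of the idea follows, assuming a single acid/base pair plus lumped cations and anions rather than the full ADM1 charge balance: the hydrogen ion concentration is obtained from the charge balance with a scalar root-finder instead of being integrated as a stiff state variable.

      import math
      from scipy.optimize import brentq

      # Algebraic pH solve: find s_h such that the net charge is zero. The
      # species set here (acetate plus net cations/anions) is illustrative,
      # not the full ADM1 charge balance.

      K_W = 1.0e-14
      K_A_AC = 10 ** -4.76          # acetic acid dissociation constant

      def charge_balance(s_h, s_cat, s_an, s_ac_total):
          """Net charge (eq/L) as a function of H+ concentration s_h (mol/L)."""
          s_ac_minus = s_ac_total * K_A_AC / (K_A_AC + s_h)   # dissociated acetate
          return s_cat + s_h - s_an - s_ac_minus - K_W / s_h

      # The balance is strictly increasing in s_h, so a wide bracket is safe.
      s_h = brentq(charge_balance, 1e-14, 1.0, args=(0.015, 0.010, 0.010))
      print("pH =", -math.log10(s_h))

    Solving pH algebraically at each step, rather than integrating the hydrogen ion as a state, is one standard way to tame the stiffness the paper discusses.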

  18. Benchmarking computational fluid dynamics models of lava flow simulation for hazard assessment, forecasting, and risk management

    USGS Publications Warehouse

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.

    2017-01-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.
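
    As one example of the analytical theory such a benchmarking study can compare against, consider Huppert's (1982) similarity solution for an axisymmetric, constant-volume viscous gravity current spreading on a horizontal plane. This particular solution is assumed here for illustration, not necessarily one of the study's cases, and the parameter values are placeholders.

      # Huppert (1982) similarity solution for the front position of a
      # constant-volume, axisymmetric viscous gravity current:
      #   r_N(t) = 0.894 * (g * V**3 * t / (3 * nu)) ** (1/8)
      # Values below are placeholders for a syrup-like analogue fluid.

      G = 9.81          # gravity, m/s^2
      NU = 1.0          # kinematic viscosity, m^2/s (placeholder)
      VOL = 0.001       # released volume, m^3 (placeholder)

      def front_radius(t):
          """Front radius r_N (m) at time t (s)."""
          return 0.894 * (G * VOL**3 * t / (3.0 * NU)) ** 0.125

      for t in (1.0, 10.0, 100.0):
          print(f"t = {t:6.1f} s -> r_N = {front_radius(t):.3f} m")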

  19. Finite Element Modeling of the World Federation's Second MFL Benchmark Problem

    NASA Astrophysics Data System (ADS)

    Zeng, Zhiwei; Tian, Yong; Udpa, Satish; Udpa, Lalita

    2004-02-01

    This paper presents results obtained by simulating the second magnetic flux leakage benchmark problem proposed by the World Federation of NDE Centers. The geometry consists of notches machined on the internal and external surfaces of a rotating steel pipe that is placed between two yokes that are part of a magnetic circuit energized by an electromagnet. The model calculates the radial component of the leaked field at specific positions. The nonlinear material property of the ferromagnetic pipe is taken into account in simulating the problem. The velocity effect caused by the rotation of the pipe is, however, ignored for reasons of simplicity.

  20. Two-dimensional free-surface flow under gravity: A new benchmark case for SPH method

    NASA Astrophysics Data System (ADS)

    Wu, J. Z.; Fang, L.

    2018-02-01

    Currently there are few free-surface benchmark cases with analytical results for Smoothed Particle Hydrodynamics (SPH) simulation. In the present contribution we introduce a two-dimensional free-surface flow under gravity, and obtain an analytical expression for the surface height difference and a theoretical estimate of the surface fractal dimension. These are preliminarily validated and supported by SPH calculations.

  1. Using GTO-Velo to Facilitate Communication and Sharing of Simulation Results in Support of the Geothermal Technologies Office Code Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Signe K.; Purohit, Sumit; Boyd, Lauren W.

    The Geothermal Technologies Office Code Comparison Study (GTO-CCS) aims to support the DOE Geothermal Technologies Office in organizing and executing a model comparison activity. This project is directed at testing, diagnosing differences, and demonstrating modeling capabilities of a worldwide collection of numerical simulators for evaluating geothermal technologies. Teams of researchers are collaborating in this code comparison effort, and it is important to be able to share results in a forum where technical discussions can easily take place without requiring teams to travel to a common location. Pacific Northwest National Laboratory has developed an open-source, flexible framework called Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for a variety of types of projects in a number of scientific domains. GTO-Velo is a customized version of the Velo Framework that is being used as the collaborative tool in support of the GTO-CCS project. Velo is designed around a novel integration of a collaborative Web-based environment and a scalable enterprise Content Management System (CMS). The underlying framework provides a flexible and unstructured data storage system that allows for easy upload of files in any format. Data files are organized in hierarchical folders, and each folder and each file has a corresponding wiki page for metadata. The user interacts with Velo through web-browser-based wiki technology, providing the benefit of familiarity and ease of use. High-level folders have been defined in GTO-Velo for the benchmark problem descriptions, descriptions of simulator/code capabilities, a project notebook, and folders for participating teams. Each team has a subfolder with write access limited only to the team members, where they can upload their simulation results. The GTO-CCS participants are charged with defining the benchmark problems for the study, and as each benchmark problem is defined, the problem creator can provide a description using a template on the metadata page corresponding to the benchmark problem folder. Project documents, references, and videos of the weekly online meetings are shared via GTO-Velo. A results comparison tool allows users to plot their uploaded simulation results on the fly, along with those of other teams, to facilitate weekly discussions of the benchmark problem results being generated by the teams. GTO-Velo is an invaluable tool providing the project coordinators and team members with a framework for collaboration among geographically dispersed organizations.

  2. ViSAPy: a Python tool for biophysics-based generation of virtual spiking activity for evaluation of spike-sorting algorithms.

    PubMed

    Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T

    2015-04-30

    New, silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility of recording spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable automated methods for spike sorting are developed, in turn requiring benchmarking data sets with known ground-truth spike times. Here we present a general simulation tool for computing benchmarking data for evaluation of spike-sorting algorithms, entitled ViSAPy (Virtual Spiking Activity in Python). The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and geometries of recording multielectrodes. Three example benchmarking data sets are generated, i.e., tetrode and polytrode data mimicking in vivo cortical recordings and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized example benchmarking data mimic salient features of typical experimental recordings, for example, spike waveforms depending on interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, and finite-sized electrode contacts, and it allows for inhomogeneous electrical conductivities. ViSAPy is optimized to allow for generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool as it can be generalized to produce benchmarking data for arbitrary recording-electrode geometries and with various levels of complexity.

  3. A Benchmarking Initiative for Reactive Transport Modeling Applied to Subsurface Environmental Applications

    NASA Astrophysics Data System (ADS)

    Steefel, C. I.

    2015-12-01

    Over the last 20 years, we have seen the evolution of multicomponent reactive transport modeling and the expanding range and increasing complexity of the subsurface environmental applications it is being used to address. Reactive transport modeling is being asked to provide accurate assessments of engineering performance and risk for important issues with far-reaching consequences. As a result, the complexity and detail of subsurface processes, properties, and conditions that can be simulated have significantly expanded. Closed-form solutions are necessary and useful, but limited to situations that are far simpler than typical applications, which combine many physical and chemical processes, in many cases in coupled form. In the absence of closed-form and yet realistic solutions for complex applications, numerical benchmark problems with an accepted set of results will be indispensable for qualifying codes for various environmental applications. The intent of this benchmarking exercise, now underway for more than five years, is to develop and publish a set of well-described benchmark problems that can be used to demonstrate simulator conformance with norms established by the subsurface science and engineering community. The objective is not to verify this or that specific code (the reactive transport codes play a supporting role in this regard) but rather to use the codes to verify that a common solution of the problem can be achieved. Thus, the objective of each of the manuscripts is to present an environmentally relevant benchmark problem that tests the conceptual model capabilities, numerical implementation, process coupling, and accuracy. The benchmark problems developed to date include 1) microbially mediated reactions, 2) isotopes, 3) multi-component diffusion, 4) uranium fate and transport, 5) metal mobility in mining-affected systems, and 6) waste repositories and related aspects.

  4. Pressurized thermal shock: TEMPEST computer code simulation of thermal mixing in the cold leg and downcomer of a pressurized water reactor. [Creare 61 and 64

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L.L.; Trent, D.S.

    The TEMPEST computer program was used to simulate fluid and thermal mixing in the cold leg and downcomer of a pressurized water reactor under emergency core cooling high-pressure injection (HPI), which is of concern to the pressurized thermal shock (PTS) problem. The code was applied in an analysis simulation of a full-scale Westinghouse three-loop plant design cold leg and downcomer. Verification/assessment of the code was performed and analysis procedures were developed using data from Creare 1/5-scale experimental tests. Results of three simulations are presented. The first is a no-loop-flow case with high-velocity, low-negative-buoyancy HPI in a 1/5-scale model of a cold leg and downcomer. The second is a no-loop-flow case with low-velocity, high-negative-density (modeled with salt water) injection in a 1/5-scale model. Comparison of TEMPEST code predictions with experimental data for these two cases shows good agreement. The third simulation is a three-dimensional model of one loop of a full-size Westinghouse three-loop plant design. Included in this latter simulation are loop components extending from the steam generator to the reactor vessel and a one-third sector of the vessel downcomer and lower plenum. No data were available for this case. For the Westinghouse plant simulation, thermally coupled conduction heat transfer in structural materials is included. The cold leg pipe and the fluid mixing volumes of the primary pump, the stillwell, and the riser to the steam generator are included in the model. In the reactor vessel, the thermal shield, pressure vessel cladding, and pressure vessel wall are thermally coupled to the fluid and thermal mixing in the downcomer. The inlet plenum mixing volume is included in the model. A 10-min (real time) transient beginning at the initiation of HPI is computed to determine temperatures at the beltline of the pressure vessel wall.

  5. A vessel length-based method to compute coronary fractional flow reserve from optical coherence tomography images.

    PubMed

    Lee, Kyung Eun; Lee, Seo Ho; Shin, Eun-Seok; Shim, Eun Bo

    2017-06-26

    Hemodynamic simulation for quantifying fractional flow reserve (FFR) is often performed in a patient-specific geometry of the coronary arteries reconstructed from images from various imaging modalities. Because optical coherence tomography (OCT) images provide more precise vascular lumen geometry, regardless of stenotic severity, hemodynamic simulation based on OCT images may be effective. The aim of this study is to perform OCT-based FFR (OCT-FFR) simulations by coupling a three-dimensional (3D) computational fluid dynamics (CFD) model built from geometrically correct OCT images with a lumped parameter model (LPM) based on vessel lengths extracted from coronary X-ray angiography (CAG) data, and to validate the method clinically. To simulate coronary hemodynamics, we developed a fast and accurate method that combines a CFD model of an OCT-based region of interest (ROI) with an LPM of the coronary microvasculature and veins, the LPM being parameterized by vessel lengths extracted from CAG images. Based on this vessel length-based approach, we describe a theoretical formulation for the total resistance of the LPM from the 3D CFD model of the ROI, and we present calculated examples of FFR from OCT images. A novel formulation for the total resistance of the LPM is introduced to accurately simulate the 3D CFD model of the ROI. To validate the OCT-FFR calculation clinically, we compared the computed OCT-FFR values for 17 vessels of 13 patients with clinically measured FFR (M-FFR) values. The simulated FFR values compared well with clinically measured ones, showing the accuracy of the method. Moreover, the method is computationally fast, enabling solutions to be obtained within the hospital.
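
    The essence of a vessel length-based approach can be sketched in a few lines: microvascular outlet resistances are apportioned from downstream vessel lengths (a longer supplied length implies a larger distal bed and hence lower resistance), the stenosis resistance comes from the OCT-based CFD model, and FFR is the ratio of distal to aortic pressure. The apportioning rule and all numbers below are illustrative assumptions, not the paper's formulation or patient data.

      # Length-based apportioning of microvascular resistance and a series
      # stenosis-plus-outlet FFR estimate. Hypothetical branches and values.

      P_AORTIC = 100.0        # mean aortic pressure, mmHg (placeholder)
      P_VENOUS = 0.0          # venous pressure, mmHg
      R_MICRO_EQ = 80.0       # equivalent resistance of the whole bed (placeholder)
      R_STENOSIS = 15.0       # ROI resistance from the OCT-based CFD model (placeholder)

      lengths_mm = {"LAD": 120.0, "LCx": 90.0, "RCA": 110.0}   # hypothetical lengths
      total_len = sum(lengths_mm.values())

      # R_i = R_eq * L_total / L_i keeps the parallel combination equal to R_eq,
      # while giving longer (larger-bed) branches proportionally lower resistance.
      r_outlet = {k: R_MICRO_EQ * total_len / L for k, L in lengths_mm.items()}

      # FFR in the branch containing the stenosis (stenosis and outlet in series).
      q = (P_AORTIC - P_VENOUS) / (R_STENOSIS + r_outlet["LAD"])
      p_distal = P_AORTIC - q * R_STENOSIS
      print("FFR =", p_distal / P_AORTIC)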

  6. A new numerical benchmark for variably saturated variable-density flow and transport in porous media

    NASA Astrophysics Data System (ADS)

    Guevara, Carlos; Graf, Thomas

    2016-04-01

    In subsurface hydrological systems, spatial and temporal variations in solute concentration and/or temperature may affect fluid density and viscosity. These variations can lead to potentially unstable situations in which a dense fluid overlies a less dense fluid. Such situations can produce instabilities that appear as dense plume fingers migrating downwards, counteracted by vertical upwards flow of freshwater (Simmons et al., Transp. Porous Medium, 2002). As a result of unstable variable-density flow, solute transport rates are increased over large distances and times as compared to constant-density flow. The numerical simulation of variable-density flow in saturated and unsaturated media requires corresponding benchmark problems against which a computer model can be validated (Diersch and Kolditz, Adv. Water Resour., 2002). Recorded data from a laboratory-scale experiment of variable-density flow and solute transport in saturated and unsaturated porous media (Simmons et al., Transp. Porous Medium, 2002) are used to define a new numerical benchmark. The HydroGeoSphere code (Therrien et al., 2004), coupled with PEST (www.pesthomepage.org), is used to obtain an optimized parameter set capable of adequately representing the data set of Simmons et al. (2002). Fingering in the numerical model is triggered using random hydraulic conductivity fields. Due to the inherent randomness, a large number of simulations were conducted in this study. The optimized benchmark model adequately predicts the plume behavior and the fate of solutes. This benchmark is useful for model verification of variable-density flow problems in saturated and/or unsaturated media.
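
    The instability trigger described here, a random hydraulic conductivity field, is simple to reproduce. The sketch below assumes an uncorrelated lognormal field; the study's fields may use a different variance or correlation structure.

      import numpy as np

      # Lognormal random hydraulic conductivity field of the kind used to
      # trigger density-driven fingering. Grid size, mean, and variance are
      # placeholders, not the study's settings.

      rng = np.random.default_rng(seed=42)

      nx, nz = 100, 50
      K_MEAN = 1.0e-4        # geometric-mean hydraulic conductivity, m/s (placeholder)
      SIGMA_LNK = 0.5        # standard deviation of ln(K) (placeholder)

      ln_k = np.log(K_MEAN) + SIGMA_LNK * rng.standard_normal((nz, nx))
      k_field = np.exp(ln_k)

      print("K range: %.2e to %.2e m/s" % (k_field.min(), k_field.max()))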

  7. Construct validity and expert benchmarking of the haptic virtual reality dental simulator.

    PubMed

    Suebnukarn, Siriwan; Chaisombat, Monthalee; Kongpunwijit, Thanapohn; Rhienmora, Phattanapon

    2014-10-01

    The aim of this study was to demonstrate construct validity of the haptic virtual reality (VR) dental simulator and to define expert benchmarking criteria for skills assessment. Thirty-four self-selected participants (fourteen novices, fourteen intermediates, and six experts in endodontics) at one dental school performed ten repetitions of endodontic cavity preparation in each of three task modes: easy (mandibular premolar with one canal), medium (maxillary premolar with two canals), and hard (mandibular molar with three canals). The virtual instrument's path length was registered by the simulator, and the outcomes were assessed by an expert. The error scores in the easy and medium modes accurately distinguished the experts from the novices and intermediates at the onset of training, when there was a significant difference between groups (ANOVA, p<0.05). The trend was consistent until trial 5; from trial 6 on, the three groups achieved similar scores, and no significant difference was found between groups at the end of training. Error score analysis was not able to distinguish any group at the hard level of training. Instrument path length showed a difference in performance between groups at the onset of training (ANOVA, p<0.05). This study established construct validity for the haptic VR dental simulator by demonstrating its ability to discriminate between experts and non-experts. The experts' error scores and path lengths were used to define benchmarking criteria for optimal performance.

  8. Modeling laser speckle imaging of perfusion in the skin (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Regan, Caitlin; Hayakawa, Carole K.; Choi, Bernard

    2016-02-01

    Laser speckle imaging (LSI) enables visualization of relative blood flow and perfusion in the skin. It is frequently applied to monitor treatment of vascular malformations such as port wine stain birthmarks, and to measure changes in perfusion due to peripheral vascular disease. We developed a computational Monte Carlo simulation of laser speckle contrast imaging to quantify how tissue optical properties, blood vessel depths and speeds, and tissue perfusion affect speckle contrast values originating from coherent excitation. The simulated tissue geometry consisted of multiple layers representing the skin, or incorporated an inclusion such as a vessel or tumor at different depths. Our simulation used a 30 x 30 mm uniform flat light source to optically excite the region of interest in our sample, to better mimic wide-field imaging. We used our model to simulate how dynamically scattered photons from a buried blood vessel affect speckle contrast at different lateral distances (0-1 mm) away from the vessel, and how these speckle contrast changes vary with depth (0-1 mm) and flow speed (0-10 mm/s). We applied the model to simulate perfusion in the skin and observed how different optical properties, such as epidermal melanin concentration (1%-50%), affected speckle contrast. We simulated perfusion during a systolic forearm occlusion and found that contrast decreased by 35% (exposure time = 10 ms). Monte Carlo simulations of laser speckle contrast give us a tool to quantify which regions of the skin are probed with laser speckle imaging, and to measure how the tissue optical properties and blood flow affect the resulting images.
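
    Any LSI model must ultimately reproduce the standard spatial speckle contrast statistic, K = standard deviation over mean in a small sliding window of the raw speckle image. A minimal sketch with a synthetic image follows; the 7 x 7 window is a common choice, not necessarily the one used in this work.

      import numpy as np
      from scipy.ndimage import uniform_filter

      # Spatial speckle contrast K = std/mean in a sliding window. Fully
      # developed static speckle has K near 1; flow blurs the pattern during
      # the exposure and lowers K.

      def speckle_contrast(image, window=7):
          """Local contrast map computed with box filters (fast std/mean)."""
          mean = uniform_filter(image, size=window)
          mean_sq = uniform_filter(image ** 2, size=window)
          var = np.clip(mean_sq - mean ** 2, 0.0, None)
          return np.sqrt(var) / mean

      # Synthetic test: static speckle intensity is exponentially distributed.
      rng = np.random.default_rng(0)
      img = rng.exponential(scale=1.0, size=(128, 128))
      print("mean K:", speckle_contrast(img).mean())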

  9. Development of a Pebble-Bed Liquid-Nitrogen Evaporator and Superheater for the Scaled Large Blast/Thermal Simulator Facility

    DTIC Science & Technology

    1991-04-01

    [Fragmentary OCR excerpt] The recoverable text indicates that the pressure vessels are designed to the rules and practices of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section VIII, Division 1, with other design requirements developed from standard safety practice; three conditions, beginning with the design working pressure, constitute the primary design parameters for the pressure vessels.

  10. Structural response of 1/20-scale models of the Clinch River Breeder Reactor to a simulated hypothetical core-disruptive accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romander, C M; Cagliostro, D J

    Five experiments were performed to help evaluate the structural integrity of the reactor vessel and head design and to verify code predictions. In the first experiment (SM 1), a detailed model of the head was loaded statically to determine its stiffness. In the remaining four experiments (SM 2 to SM 5), models of the vessel and head were loaded dynamically under a simulated 661 MW-s hypothetical core disruptive accident (HCDA). Models SM 2 to SM 4, each of increasing complexity, systematically showed the effects of upper internals structures, a thermal liner, core support platform, and torospherical bottom on vessel response. Model SM 5, identical to SM 4 but more heavily instrumented, demonstrated experimental reproducibility and provided more comprehensive data. The models consisted of a Ni 200 vessel and core barrel, a head with shielding and simulated component masses, and an upper internals structure (UIS).

  11. Simulations of the Microcirculation in the Human Conjunctiva

    NASA Astrophysics Data System (ADS)

    Dow, William; Jacobitz, Frank; Chen, Peter

    2012-11-01

    The microcirculation in the conjunctiva of a healthy human subject is analyzed using a simulation approach. A comparison between healthy and diseased states may lead to early diagnosis of a variety of vascular-related disorders. Previous work suggests that hypertension, arteriosclerosis, and diabetes mellitus produce noticeable, very early changes in the microvasculature (Davis and Landau, 1957; Ditzel, 1968; Kunitomo, 1974), and the vessels of the conjunctiva are particularly useful for this research because they can be studied non-invasively. The microcirculation in the conjunctiva has been documented over the course of disease treatments, providing both still images and video footage with information on vessel length, diameter, and connectivity as well as the direction of blood flow. The numerical method is based on a Hagen-Poiseuille balance in the microvessels, and a sparse matrix solver is used to obtain the solution. The simulations use realistic vessel topology for the microvasculature, reconstructed from microscope images of tissue samples, and consider blood rheology as well as passive and active vessel properties.
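
    A Hagen-Poiseuille network balance of the kind named here reduces to a sparse linear system for the nodal pressures: each segment contributes a conductance g = πd⁴/(128μL), and mass conservation is enforced at interior nodes. The three-segment topology, viscosity, and boundary pressures below are hypothetical.

      import numpy as np
      from scipy.sparse import lil_matrix
      from scipy.sparse.linalg import spsolve

      MU = 3.0e-3                                  # blood viscosity, Pa*s (assumed)

      # segments: (node_i, node_j, diameter m, length m) -- hypothetical topology
      segments = [(0, 1, 20e-6, 500e-6),
                  (1, 2, 12e-6, 400e-6),
                  (1, 3, 12e-6, 400e-6)]
      n_nodes = 4
      fixed = {0: 4000.0, 2: 1000.0, 3: 1000.0}    # boundary pressures, Pa (assumed)

      A = lil_matrix((n_nodes, n_nodes))
      b = np.zeros(n_nodes)
      for i, j, d, L in segments:
          g = np.pi * d**4 / (128.0 * MU * L)      # Poiseuille conductance
          for m, n in ((i, j), (j, i)):            # symmetric assembly
              A[m, m] += g
              A[m, n] -= g

      for node, p in fixed.items():                # Dirichlet boundary nodes
          A[node, :] = 0.0
          A[node, node] = 1.0
          b[node] = p

      p = spsolve(A.tocsr(), b)                    # nodal pressures; segment flows
      print("nodal pressures (Pa):", p)            # follow from q = g*(p_i - p_j)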

  12. Benchmark Simulations of the Thermal-Hydraulic Responses during EBR-II Inherent Safety Tests using SAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui; Sumner, Tyler S.

    2016-04-17

    An advanced system analysis tool, SAM, is being developed for fast-running, improved-fidelity, whole-plant transient analyses at Argonne National Laboratory under DOE-NE's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. As an important part of code development, companion validation activities are being conducted to ensure the performance and validity of the SAM code. This paper presents benchmark simulations of two EBR-II tests, SHRT-45R and BOP-302R, whose data are available through the support of DOE-NE's Advanced Reactor Technology (ART) program. The code predictions of major primary coolant system parameters are compared with the test results. Additionally, SAS4A/SASSYS-1 code simulation results are included for a code-to-code comparison.

  13. Simulated microgravity upregulates an endothelial vasoconstrictor prostaglandin

    NASA Technical Reports Server (NTRS)

    Sangha, D. S.; Han, S.; Purdy, R. E.

    2001-01-01

    Endothelial nitric oxide contributes to the vascular hyporesponsiveness to norepinephrine (NE) observed in carotid arteries from rats exposed to simulated microgravity. The goal of the present study was to determine whether a cyclooxygenase product of arachidonic acid also influences vascular responsiveness in this setting. Microgravity was simulated in rats by hindlimb unweighting (HU). After 20 days of HU, carotid arteries were isolated from control and HU-treated rats, and vascular rings were mounted in tissue baths for the measurement of isometric contraction. Two cyclooxygenase inhibitors, indomethacin and ibuprofen, and the selective thromboxane A(2) prostanoid-receptor antagonist, SQ-29548, had no effect on the contraction to NE in control vessels but markedly reduced contraction to NE in HU vessels. When the endothelium was removed, indomethacin no longer had any effect on the NE-induced contraction in HU vessels. In endothelium-intact vessels in the presence of indomethacin, the addition of the nitric oxide synthase inhibitor, N(G)-L-nitro-arginine methyl ester, to the medium bathing HU vessels increased the contraction to NE to the level of that of the control vessels. These results indicate that HU treatment induced two endothelial changes in carotid artery that opposed each other. Nitric oxide activity was increased and was responsible for the vascular hyporesponsiveness to NE. The activity of a vasoconstrictor prostaglandin was also increased, and attenuated the vasodilating effect of nitric oxide.

  14. Heat sink phenomenon of bipolar and monopolar radiofrequency ablation observed using polypropylene tubes for vessel simulation.

    PubMed

    Al-Alem, Ihssan; Pillai, Krishna; Akhter, Javed; Chua, Terence C; Morris, David L

    2014-06-01

    Radiofrequency ablation (RFA) is widely used for treating liver tumors; recurrence is common for tumors close to blood vessels, possibly due to the heat sink effect. We investigated this phenomenon using unipolar and bipolar RFA on an egg white tumor tissue model and an animal liver model. Temperature profiles during ablation (with and without vessel simulation) were studied for both bipolar and unipolar RFA probes, using 4 strategically placed temperature leads to monitor the temperature profile during ablation. The volume of ablated tissue was also measured. The volume ablated during vessel simulation confirmed the impact of the heat sink phenomenon. The heat sink effect of unipolar RFA was greater than that of bipolar RFA (ratio of volume affected 2:1) in both tissue and liver models. The volume ablated using unipolar RFA was less than that using bipolar RFA (ratio of volume ablated = 1:4). Unipolar RFA achieved higher ablation temperatures (122°C vs 98°C). Unipolar RFA resulted in tissue damage beyond the vessel, which was not observed using bipolar RFA. Bipolar RFA ablates a larger tumor volume than unipolar RFA with a single ablation. The impact of the heat sink phenomenon in tumor ablation is smaller with bipolar than with unipolar RFA, and bipolar RFA spares the adjacent vessel from damage. © The Author(s) 2013.

  15. 3D-Printed Tissue-Mimicking Phantoms for Medical Imaging and Computational Validation Applications

    PubMed Central

    Shahmirzadi, Danial; Li, Ronny X.; Doyle, Barry J.; Konofagou, Elisa E.; McGloughlin, Tim M.

    2014-01-01

    Abdominal aortic aneurysm (AAA) is a permanent, irreversible dilation of the distal region of the aorta. Recent efforts have focused on improved AAA screening and biomechanics-based failure prediction. Idealized and patient-specific AAA phantoms are often employed to validate numerical models and imaging modalities. To produce such phantoms, the investment casting process is frequently used, reconstructing the 3D vessel geometry from computed tomography patient scans. In this study the alternative use of 3D printing to produce phantoms is investigated. The mechanical properties of flexible 3D-printed materials are benchmarked against proven elastomers. We demonstrate the utility of this process with particular application to the emerging imaging modality of ultrasound-based pulse wave imaging, a noninvasive diagnostic methodology being developed to obtain regional vascular wall stiffness properties, differentiating normal and pathologic tissue in vivo. Phantom wall displacements under pulsatile loading conditions were observed, showing good correlation to fluid–structure interaction simulations and regions of peak wall stress predicted by finite element analysis. 3D-printed phantoms show a strong potential to improve medical imaging and computational analysis, potentially helping bridge the gap between experimental and clinical diagnostic tools. PMID:28804733

  16. Heterogeneous mechanics of the mouse pulmonary arterial network.

    PubMed

    Lee, Pilhwa; Carlson, Brian E; Chesler, Naomi; Olufsen, Mette S; Qureshi, M Umar; Smith, Nicolas P; Sochi, Taha; Beard, Daniel A

    2016-10-01

    Individualized modeling and simulation of blood flow mechanics find applications in both animal research and patient care. Individual animal or patient models for blood vessel mechanics are based on combining measured vascular geometry with a fluid structure model coupling formulations describing dynamics of the fluid and mechanics of the wall. For example, one-dimensional fluid flow modeling requires a constitutive law relating vessel cross-sectional deformation to pressure in the lumen. To investigate means of identifying appropriate constitutive relationships, an automated segmentation algorithm was applied to micro-computerized tomography images from a mouse lung obtained at four different static pressures to identify the static pressure-radius relationship for four generations of vessels in the pulmonary arterial network. A shape-fitting function was parameterized for each vessel in the network to characterize the nonlinear and heterogeneous nature of vessel distensibility in the pulmonary arteries. These data on morphometric and mechanical properties were used to simulate pressure and flow velocity propagation in the network using one-dimensional representations of fluid and vessel wall mechanics. Moreover, wave intensity analysis was used to study effects of wall mechanics on generation and propagation of pressure wave reflections. Simulations were conducted to investigate the role of linear versus nonlinear formulations of wall elasticity and homogeneous versus heterogeneous treatments of vessel wall properties. Accounting for heterogeneity, by parameterizing the pressure/distention equation of state individually for each vessel segment, was found to have little effect on the predicted pressure profiles and wave propagation compared to a homogeneous parameterization based on average behavior. However, substantially different results were obtained using a linear elastic thin-shell model than were obtained using a nonlinear model that has a more physiologically realistic pressure versus radius relationship.

  17. Nonparametric estimation of benchmark doses in environmental risk assessment

    PubMed Central

    Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen

    2013-01-01

    An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits’ small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
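
    A minimal sketch of the nonparametric idea, using scikit-learn's isotonic regression in place of the authors' estimator: fit a monotone dose-response curve to quantal data, then invert it at a benchmark response. The data, the benchmark response level, and the interpolation-based inversion are illustrative, and the paper's bootstrap confidence limits are not reproduced.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

doses = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 4.0])
group_n = np.array([50, 50, 50, 50, 50, 50])          # animals per dose group
y = np.array([2, 4, 5, 11, 18, 30]) / group_n          # observed response proportions

iso = IsotonicRegression(y_min=0.0, y_max=1.0, increasing=True)
risk = iso.fit_transform(doses, y)                     # monotone risk estimates

bmr = 0.10                                             # benchmark response (extra risk)
extra = (risk - risk[0]) / (1.0 - risk[0])             # convert to the extra-risk scale
bmd = np.interp(bmr, extra, doses)                     # invert via interpolation
print(f"estimated BMD at BMR={bmr}: {bmd:.3f}")
```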

  18. Likelihood of a marine vessel accident from wind energy development in the Atlantic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Copping, Andrea; Breithaupt, Stephen; Whiting, Jonathan

    2015-11-02

    Offshore wind energy development is planned for areas off the Atlantic coast. Many of the planned wind development areas fall within traditional commercial vessel routes. In order to mitigate possible hazards to ships and to wind turbines, it is important to understand the potential for increased risk to commercial shipping from the presence of wind farms. Using Automatic Identification System (AIS) data, historical shipping routes between ports in the Atlantic were identified, from Maine to the Florida Straits. The AIS data were also used as inputs to a numerical model that can simulate cargo, tanker, and tug/towing vessel movement along typical routes. The model was used to recreate present-day vessel movement, as well as to simulate future routing that may be required to avoid wind farms. By comparing the present and future routing of vessels, a risk analysis was carried out to determine the increased marginal risk of vessel collisions, groundings, and allisions with stationary objects due to the presence of wind farms. The outcome of the analysis showed little increase in vessel collisions or allisions, and a decrease in groundings as more vessels were forced seaward by the wind farms.

  19. Large-scale boiling experiments of the flooded cavity concept for in-vessel core retention

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, T.Y.; Slezak, S.E.; Bentz, J.H.

    1994-03-01

    This paper presents results of ex-vessel boiling experiments performed in the CYBL (CYlindrical BoiLing) facility. CYBL is a reactor-scale facility for confirmatory research of the flooded cavity concept for accident management. CYBL has a tank-within-a-tank design; the inner tank simulates the reactor vessel and the outer tank simulates the reactor cavity. Experiments with uniform and edge-peaked heat flux distributions up to 20 W/cm² across the vessel bottom were performed. Boiling outside the reactor vessel was found to be subcooled nucleate boiling. The subcooling is mainly due to the gravity head which results from flooding the sides of the reactor vessel. The boiling process exhibits a cyclic pattern with four distinct phases: direct liquid/solid contact, bubble nucleation and growth, coalescence, and vapor mass dispersion (ejection). The results suggest that under prototypic heat load and heat flux distributions, the flooded cavity in a passive pressurized water reactor like the AP-600 should be capable of cooling the reactor pressure vessel in the central region of the lower head that is addressed by these tests.

  20. Surrogate model approach for improving the performance of reactive transport simulations

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2016-04-01

    Reactive transport models serve a large number of important geoscientific applications involving underground resources in industry and scientific research. A reactive transport simulation typically consists of at least two coupled simulation models. The first is a hydrodynamics simulator responsible for simulating the flow of groundwater and the transport of solutes. Hydrodynamics simulators are well-established technology and can be very efficient; when hydrodynamics simulations are performed without coupled geochemistry, their spatial geometries can span millions of elements even when running on desktop workstations. The second is a geochemical simulation model coupled to the hydrodynamics simulator. Geochemical simulation models are much more computationally costly, which makes reactive transport simulations spanning millions of spatial elements very difficult to achieve. To address this problem we propose to replace the coupled geochemical simulation model with a surrogate model. A surrogate is a statistical model created to include only the necessary subset of simulator complexity for a particular scenario. To demonstrate the viability of this approach we tested it on a published reactive transport benchmark problem involving 1D calcite transport (Kolditz, 2012). We evaluated a number of statistical models available through the caret and DiceEval packages for R as surrogate models. These were trained on a randomly sampled subset of the input-output data from the geochemical simulation model used in the original reactive transport simulation. For validation, we used the surrogate model to predict the simulator output for the sampled input data that was not used for training. For this scenario we find that the multivariate adaptive regression splines (MARS) method provides the best trade-off between speed and accuracy. This proof-of-concept forms an essential step towards building an interactive visual analytics system to enable user-driven, systematic creation of geochemical surrogate models. Such a system would enable reactive transport simulations with unprecedented spatial and temporal detail. References: Kolditz, O., Görke, U.J., Shao, H. and Wang, W., 2012. Thermo-hydro-mechanical-chemical processes in porous media: benchmarks and examples (Vol. 86). Springer Science & Business Media.
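
    The workflow described, training a fast statistical model on sampled simulator input-output pairs and validating on held-out samples, can be sketched as below. A cheap analytic function stands in for the geochemical simulator, and gradient boosting stands in for the MARS model the authors selected via caret; everything here is illustrative of the workflow, not of their actual setup.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def geochem_sim(X):
    """Cheap analytic stand-in for an expensive geochemical solve."""
    pH, ca = X[:, 0], X[:, 1]
    return np.exp(-(pH - 7.0) ** 2) * ca + 0.01 * rng.normal(size=len(ca))

X = np.column_stack([rng.uniform(5.0, 9.0, 2000),    # pH samples
                     rng.uniform(0.0, 1.0, 2000)])   # concentration samples
y = geochem_sim(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

surrogate = GradientBoostingRegressor().fit(X_tr, y_tr)   # train on the subset
print("held-out R^2 of the surrogate:",
      round(r2_score(y_te, surrogate.predict(X_te)), 3))  # validate on the rest
```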

  1. Simulation of guided-wave ultrasound propagation in composite laminates: Benchmark comparisons of numerical codes and experiment.

    PubMed

    Leckey, Cara A C; Wheeler, Kevin R; Hafiychuk, Vasyl N; Hafiychuk, Halyna; Timuçin, Doğan A

    2018-03-01

    Ultrasonic wave methods constitute the leading physical mechanism for nondestructive evaluation (NDE) and structural health monitoring (SHM) of solid composite materials, such as carbon fiber reinforced polymer (CFRP) laminates. Computational models of ultrasonic wave excitation, propagation, and scattering in CFRP composites can be extremely valuable in designing practicable NDE and SHM hardware, software, and methodologies that accomplish the desired accuracy, reliability, efficiency, and coverage. The development and application of ultrasonic simulation approaches for composite materials is an active area of research in the field of NDE. This paper presents comparisons of guided wave simulations for CFRP composites implemented using four different simulation codes: the commercial finite element modeling (FEM) packages ABAQUS, ANSYS, and COMSOL, and a custom code executing the Elastodynamic Finite Integration Technique (EFIT). Benchmark comparisons are made between the simulation tools and both experimental laser Doppler vibrometry data and theoretical dispersion curves. A pristine case and a delamination-type case (Teflon insert in the experimental specimen) are studied. A summary is given of the accuracy of simulation results and the respective computational performance of the four different simulation tools. Published by Elsevier B.V.

  2. Deepthi Vaidhynathan | NREL

    Science.gov Websites

    Works with the Complex Systems Simulation and Optimization Group on performance analysis and benchmarking. Research interests: high-performance computing, embedded systems, and microprocessors and microcontrollers.

  3. Benchmarking Data for the Proposed Signature of Used Fuel Casks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rauch, Eric Benton

    2016-09-23

    A set of benchmarking measurements to test facets of the proposed extended-storage signature was conducted on May 17, 2016. The measurements were designed to test the overall concept of how the proposed signature can be used to identify a used fuel cask based only on the distribution of neutron sources within the cask. To simulate the distribution, 4 Cf-252 sources were chosen and arranged on a 3x3 grid in 3 different patterns, and raw neutron total counts were taken at 6 locations around the grid. This is a very simplified version of the typical geometry studied previously in simulations with simulated used nuclear fuel.

  4. Numerical modeling of fluid and electrical currents through geometries based on synchrotron X-ray tomographic images of reservoir rocks using Avizo and COMSOL

    NASA Astrophysics Data System (ADS)

    Bird, M. B.; Butler, S. L.; Hawkes, C. D.; Kotzer, T.

    2014-12-01

    The use of numerical simulations to model physical processes occurring within subvolumes of rock samples that have been characterized using advanced 3D imaging techniques is becoming increasingly common. Not only do these simulations allow for the determination of macroscopic properties like hydraulic permeability and electrical formation factor, but they also allow the user to visualize processes taking place at the pore scale, and they allow multiple different processes to be simulated on the same geometry. Most efforts to date have used specialized research software for simulations. In this contribution, we outline the steps taken to use the commercial software Avizo to transform a 3D synchrotron X-ray-derived tomographic image of a rock core sample into an STL (STereoLithography) file which can be imported into the commercial multiphysics modeling package COMSOL. We demonstrate the use of COMSOL to perform fluid and electrical current flow simulations through the pore spaces. The permeability and electrical formation factor of the sample are calculated and compared with laboratory-derived values and benchmark calculations. Although the simulation domains that we were able to model on a desktop computer were significantly smaller than representative elementary volumes, we were able to establish Kozeny-Carman and Archie's Law trends on which laboratory measurements and previous benchmark solutions fall. The rock core samples include a Fontainebleau sandstone used for benchmarking and a marly dolostone sampled from a well in the Weyburn oil field of southeastern Saskatchewan, Canada. Such carbonates are known to have complicated pore structures compared with sandstones, yet we are able to calculate reasonable macroscopic properties. We discuss the computing resources required.

  5. The Stock Market Game: A Simulation of Stock Market Trading. Grades 5-8.

    ERIC Educational Resources Information Center

    Draze, Dianne

    This guide to a unit on a simulation game about the stock market contains an instructional text and two separate simulations. Through directed lessons and reproducible worksheets, the unit teaches students about business ownership, stock exchanges, benchmarks, commissions, why prices change, the logistics of buying and selling stocks, and how to…

  6. Modeling and Simulation With Operational Databases to Enable Dynamic Situation Assessment & Prediction

    DTIC Science & Technology

    2010-11-01

    ...subsections discuss the design of the simulations. 3.12.1 Lanchester5D Simulation: A Lanchester simulation was developed to conduct performance... benchmarks using the WarpIV Kernel and HyperWarpSpeed. The Lanchester simulation contains a user-definable number of grid cells in which blue and red... forces engage in battle using Lanchester equations. Having a user-definable number of grid cells enables the simulation to be stressed with high entity...
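
    For reference, the Lanchester aimed-fire ("square-law") equations the snippet refers to, dB/dt = -r*R and dR/dt = -b*B, can be integrated in a few lines. The coefficients and initial strengths below are illustrative; the gridded WarpIV/HyperWarpSpeed implementation is of course far more elaborate.

```python
import numpy as np
from scipy.integrate import solve_ivp

b_eff, r_eff = 0.8, 1.0        # per-shooter effectiveness of blue and red

def lanchester(t, y):
    B, R = y
    return [-r_eff * R, -b_eff * B]   # square law: losses scale with enemy strength

sol = solve_ivp(lanchester, (0.0, 5.0), [100.0, 90.0], dense_output=True)
for ti in np.linspace(0.0, 5.0, 6):
    B, R = sol.sol(ti)
    print(f"t={ti:.1f}  blue={max(B, 0):6.1f}  red={max(R, 0):6.1f}")
```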

  7. Risk Reduction of an Invasive Insect by Targeting Surveillance Efforts with the Assistance of a Phenology Model and International Maritime Shipping Routes and Schedules.

    PubMed

    Gray, David R

    2016-05-01

    Reducing the risk of introduction to North America of the invasive Asian gypsy moth (Lymantria dispar asiatica Vnukovskij and L. d. japonica [Motschulsky]) on international maritime vessels involves two tactics: (1) vessels that wish to arrive in Canada or the United States and have visited any Asian port that is subject to regulation during designated times must obtain a predeparture inspection certificate from an approved entity; and (2) vessels with a certificate may be subjected to an additional inspection upon arrival. A decision support tool is described here with which the allocation of inspection resources at North American ports can be partitioned among multiple vessels according to estimates of the potential onboard Asian gypsy moth population and estimates of the onboard larval emergence pattern. The decision support tool assumes that port inspection is uniformly imperfect at the Asian ports and that each visit to a regulated port has potential for the vessel to be contaminated with gypsy moth egg masses. The decision support tool uses a multigenerational phenology model to estimate the potential onboard population of egg masses by calculating the temporal intersection between the dates of port visits to regulated ports and the simulated oviposition pattern in each port. The phenological development of the onboard population is simulated each day of the vessel log until the vessel arrives at the port being protected from introduction. Multiple independent simulations are used to create a probability distribution of the size and timing of larval emergence. © 2015 Society for Risk Analysis.

  8. Preventing Pirates from Boarding Commercial Vessels - A Systems Approach

    DTIC Science & Technology

    2014-09-01

    ...was developed in MATLAB to run simulations designed to estimate the relative effectiveness of each assessed countermeasure. A cost analysis was... project indicated that the P-Trap countermeasure, designed to entangle the pirate’s propellers with thin lines, is both effective and economically viable... vessels.

  9. Benchmark Tests for Stirling Convertor Heater Head Life Assessment Conducted

    NASA Technical Reports Server (NTRS)

    Krause, David L.; Halford, Gary R.; Bowman, Randy R.

    2004-01-01

    A new in-house test capability has been developed at the NASA Glenn Research Center, where a critical component of the Stirling Radioisotope Generator (SRG) is undergoing extensive testing to aid the development of analytical life prediction methodology and to experimentally aid in verification of the flight-design component's life. The new facility includes two test rigs that are performing creep testing of the SRG heater head pressure vessel test articles at design temperature and with wall stresses ranging from the operating level to seven times that level.

  10. In vitro flow assessment: from PC-MRI to computational fluid dynamics including fluid-structure interaction

    NASA Astrophysics Data System (ADS)

    Kratzke, Jonas; Rengier, Fabian; Weis, Christian; Beller, Carsten J.; Heuveline, Vincent

    2016-04-01

    Initiation and development of cardiovascular diseases can be highly correlated with specific biomechanical parameters. To examine and assess biomechanical parameters, numerical simulation of cardiovascular dynamics has the potential to complement and enhance medical measurement and imaging techniques. Computational fluid dynamics (CFD) has been shown to be suitable for evaluating blood velocity and pressure in scenarios where vessel wall deformation plays a minor role. However, there is a need for further validation studies and for the inclusion of vessel wall elasticity for morphologies subject to large displacement. In this work, we consider a fluid-structure interaction (FSI) model including the full elasticity equation to take the deformability of aortic wall soft tissue into account. We present a numerical framework in which either a CFD study can be performed for less deformable aortic segments or an FSI simulation for regions of large displacement such as the aortic root and arch. Both methods are validated by means of an aortic phantom experiment. The computational results are in good agreement with 2D phase-contrast magnetic resonance imaging (PC-MRI) velocity measurements as well as catheter-based pressure measurements. The FSI simulation shows a characteristic vessel compliance effect on the flow field induced by the elasticity of the vessel wall, which the CFD model is not capable of reproducing. The in vitro validated FSI simulation framework can enable the computation of complementary biomechanical parameters such as the stress distribution within the vessel wall.

  11. MRI Simulation Study Investigating Effects of Vessel Topology, Diffusion, and Susceptibility on Transverse Relaxation Rates Using a Cylinder Fork Model.

    PubMed

    Shazeeb, Mohammed Salman; Kalpathy-Cramer, Jayashree; Issa, Bashar

    2017-11-24

    Brain vasculature is conventionally represented as straight cylinders when simulating blood oxygenation level dependent (BOLD) contrast effects in functional magnetic resonance imaging (fMRI). In reality, the vasculature is more complicated, with branching and coiling, especially in tumors. Diffusion and susceptibility changes can also introduce variations in the relaxation mechanisms within tumors. This study introduces a simple cylinder fork model (CFM) and investigates the effects of vessel topology, diffusion, and susceptibility on the transverse relaxation rates R2* and R2. Simulations using Monte Carlo methods were performed to quantify R2* and R2 by manipulating the CFM at different orientations, bifurcation angles, and rotation angles. Other parameters of the CFM were chosen based on physiologically relevant values: vessel diameters (~2–10 µm), diffusion rates (1 × 10⁻¹¹ to 1 × 10⁻⁹ m²/s), and susceptibility values (3 × 10⁻⁸ to 4 × 10⁻⁷ cgs units). R2* and R2 measurements showed a significant dependence on the bifurcation and rotation angles in several scenarios using different vessel diameters, orientations, diffusion rates, and susceptibility values. The angular dependence of R2* and R2 using the CFM could potentially be exploited as a tool to differentiate between normal and tumor vessels. The CFM can also serve as the elementary building block to simulate a capillary network reflecting realistic topological features.
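
    The underlying Monte Carlo mechanism, spins diffusing in the susceptibility-induced field around a vessel and dephasing the net signal, can be sketched for a single straight cylinder (not the fork geometry of the study). The extravascular field pattern below is the standard result for a cylinder perpendicular to B0; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
a = 5e-6             # vessel radius, m (illustrative)
D = 1e-9             # water diffusion coefficient, m^2/s
dw = 1e4             # frequency offset at the vessel wall, rad/s (illustrative)
dt, nstep, nspin = 1e-4, 400, 10000
L = 60e-6            # half-width of the 2D region around the vessel

pos = rng.uniform(-L, L, size=(nspin, 2))
phase = np.zeros(nspin)
signal = np.empty(nstep)
step = np.sqrt(2 * D * dt)
for k in range(nstep):
    r2 = np.maximum((pos**2).sum(axis=1), a * a)    # treat interior spins as at-wall
    cos2phi = (pos[:, 0]**2 - pos[:, 1]**2) / r2
    phase += dw * (a * a / r2) * cos2phi * dt       # extravascular cylinder field
    signal[k] = np.abs(np.exp(1j * phase).mean())   # net transverse magnetization
    pos += rng.normal(scale=step, size=pos.shape)   # free random walk

t = dt * np.arange(1, nstep + 1)
r2star = np.polyfit(t, -np.log(signal), 1)[0]       # slope of -ln|S| estimates R2*
print(f"estimated R2* ~ {r2star:.1f} 1/s")
```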

  12. Higher representations on the lattice: Numerical simulations, SU(2) with adjoint fermions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Del Debbio, Luigi; Patella, Agostino; Pica, Claudio

    2010-05-01

    We discuss the lattice formulation of gauge theories with fermions in arbitrary representations of the color group and present in detail the implementation of the hybrid Monte Carlo (HMC)/rational HMC algorithm for simulating dynamical fermions. We discuss the validation of the implementation through an extensive set of tests and the stability of simulations by monitoring the distribution of the lowest eigenvalue of the Wilson-Dirac operator. Working with two flavors of Wilson fermions in the adjoint representation, benchmark results for realistic lattice simulations are presented. Runs are performed on different lattice sizes ranging from 4³×8 to 24³×64 sites. For the two smallest lattices we also report the measured values of benchmark mesonic observables. These results can be used as a baseline for rapid cross-checks of simulations in higher representations. The results presented here are the first steps toward more extensive investigations with controlled systematic errors, aiming at a detailed understanding of the phase structure of these theories, and of their viability as candidates for strong dynamics beyond the standard model.

  13. Groundwater flow with energy transport and water-ice phase change: Numerical simulations, benchmarks, and application to freezing in peat bogs

    USGS Publications Warehouse

    McKenzie, J.M.; Voss, C.I.; Siegel, D.I.

    2007-01-01

    In northern peatlands, subsurface ice formation is an important process that can control heat transport, groundwater flow, and biological activity. Temperature was measured over one and a half years in a vertical profile in the Red Lake Bog, Minnesota. To successfully simulate the transport of heat within the peat profile, the U.S. Geological Survey's SUTRA computer code was modified. The modified code simulates fully saturated, coupled porewater-energy transport, with freezing and melting porewater, and includes proportional heat capacity and thermal conductivity of water and ice, decreasing matrix permeability due to ice formation, and latent heat. The model is verified by correctly simulating the Lunardini analytical solution for ice formation in a porous medium with a mixed ice-water zone. The modified SUTRA model correctly simulates the temperature and ice distributions in the peat bog. Two possible benchmark problems for groundwater and energy transport with ice formation and melting are proposed that may be used by other researchers for code comparison. © 2006 Elsevier Ltd. All rights reserved.

  14. Grizzly Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Benjamin; Zhang, Yongfeng; Chakraborty, Pritam

    2014-09-01

    This report summarizes work during FY 2014 to develop capabilities to predict embrittlement of reactor pressure vessel steel and to assess the response of embrittled reactor pressure vessels to postulated accident conditions. This work has been conducted at three length scales. At the engineering scale, 3D fracture mechanics capabilities have been developed to calculate stress intensities and fracture toughnesses and to perform a deterministic assessment of whether a crack would propagate at the location of an existing flaw. This capability has been demonstrated on several types of flaws in a generic reactor pressure vessel model. Models have been developed at the scale of fracture specimens to determine how irradiation affects the fracture toughness of the material. Verification work has been performed on a previously developed model to determine the sensitivity of the model to specimen geometry and size effects, and the effects of irradiation on the parameters of this model have been investigated. At lower length scales, work has continued in an ongoing effort to understand how irradiation and thermal aging affect the microstructure and mechanical properties of reactor pressure vessel steel. Previously developed atomistic kinetic Monte Carlo models have been further developed and benchmarked against experimental data. Initial work has been performed to develop models of nucleation in a phase field model. Additional modeling work has also been performed to improve the fundamental understanding of the formation mechanisms and stability of matrix defects caused by irradiation.

  15. Oxygen distribution in tumors: A qualitative analysis and modeling study providing a novel Monte Carlo approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lagerlöf, Jakob H., E-mail: Jakob@radfys.gu.se; Kindblom, Jon; Bernhardt, Peter

    2014-09-15

    Purpose: To construct a Monte Carlo (MC)-based simulation model for analyzing the dependence of tumor oxygen distribution on different variables related to tumor vasculature [blood velocity, vessel-to-vessel proximity (vessel proximity), and inflowing oxygen partial pressure (pO₂)]. Methods: A voxel-based tissue model containing parallel capillaries with square cross-sections (sides of 10 μm) was constructed. Green's function was used for diffusion calculations and Michaelis-Menten's kinetics to manage oxygen consumption. The model was tuned to approximately reproduce the oxygenational status of a renal carcinoma; the depth oxygenation curves (DOC) were fitted with an analytical expression to facilitate rapid MC simulations of tumor oxygen distribution. DOCs were simulated with three variables at three settings each (blood velocity, vessel proximity, and inflowing pO₂), which resulted in 27 combinations of conditions. To create a model that simulated variable oxygen distributions, the oxygen tension at a specific point was randomly sampled with trilinear interpolation in the dataset from the first simulation. Six correlations between blood velocity, vessel proximity, and inflowing pO₂ were hypothesized. Variable models with correlated parameters were compared to each other and to a nonvariable, DOC-based model to evaluate the differences in simulated oxygen distributions and tumor radiosensitivities for different tumor sizes. Results: For tumors with radii ranging from 5 to 30 mm, the nonvariable DOC model tended to generate normal or log-normal oxygen distributions, with a cut-off at zero. The pO₂ distributions simulated with the six-variable DOC models were quite different from the distributions generated with the nonvariable DOC model; in the former case the variable models simulated oxygen distributions that were more similar to in vivo results found in the literature. For larger tumors, the oxygen distributions became truncated in the lower end, due to anoxia, but smaller tumors showed undisturbed oxygen distributions. The six different models with correlated parameters generated three classes of oxygen distributions. The first was a hypothetical, negative covariance between vessel proximity and pO₂ (VPO-C scenario); the second was a hypothetical positive covariance between vessel proximity and pO₂ (VPO+C scenario); and the third was the hypothesis of no correlation between vessel proximity and pO₂ (UP scenario). The VPO-C scenario produced a distinctly different oxygen distribution than the two other scenarios. The shape of the VPO-C scenario was similar to that of the nonvariable DOC model, and the larger the tumor, the greater the similarity between the two models. For all simulations, the mean oxygen tension decreased and the hypoxic fraction increased with tumor size. The absorbed dose required for definitive tumor control was highest for the VPO+C scenario, followed by the UP and VPO-C scenarios. Conclusions: A novel MC algorithm was presented which simulated oxygen distributions and radiation response for various biological parameter values. The analysis showed that the VPO-C scenario generated a clearly different oxygen distribution from the VPO+C scenario; the former exhibited a lower hypoxic fraction and higher radiosensitivity. In future studies, this modeling approach might be valuable for qualitative analyses of factors that affect oxygen distribution as well as analyses of specific experimental and clinical situations.

  16. Oxygen distribution in tumors: a qualitative analysis and modeling study providing a novel Monte Carlo approach.

    PubMed

    Lagerlöf, Jakob H; Kindblom, Jon; Bernhardt, Peter

    2014-09-01

    To construct a Monte Carlo (MC)-based simulation model for analyzing the dependence of tumor oxygen distribution on different variables related to tumor vasculature [blood velocity, vessel-to-vessel proximity (vessel proximity), and inflowing oxygen partial pressure (pO2)]. A voxel-based tissue model containing parallel capillaries with square cross-sections (sides of 10 μm) was constructed. Green's function was used for diffusion calculations and Michaelis-Menten's kinetics to manage oxygen consumption. The model was tuned to approximately reproduce the oxygenational status of a renal carcinoma; the depth oxygenation curves (DOC) were fitted with an analytical expression to facilitate rapid MC simulations of tumor oxygen distribution. DOCs were simulated with three variables at three settings each (blood velocity, vessel proximity, and inflowing pO2), which resulted in 27 combinations of conditions. To create a model that simulated variable oxygen distributions, the oxygen tension at a specific point was randomly sampled with trilinear interpolation in the dataset from the first simulation. Six correlations between blood velocity, vessel proximity, and inflowing pO2 were hypothesized. Variable models with correlated parameters were compared to each other and to a nonvariable, DOC-based model to evaluate the differences in simulated oxygen distributions and tumor radiosensitivities for different tumor sizes. For tumors with radii ranging from 5 to 30 mm, the nonvariable DOC model tended to generate normal or log-normal oxygen distributions, with a cut-off at zero. The pO2 distributions simulated with the six-variable DOC models were quite different from the distributions generated with the nonvariable DOC model; in the former case the variable models simulated oxygen distributions that were more similar to in vivo results found in the literature. For larger tumors, the oxygen distributions became truncated in the lower end, due to anoxia, but smaller tumors showed undisturbed oxygen distributions. The six different models with correlated parameters generated three classes of oxygen distributions. The first was a hypothetical, negative covariance between vessel proximity and pO2 (VPO-C scenario); the second was a hypothetical positive covariance between vessel proximity and pO2 (VPO+C scenario); and the third was the hypothesis of no correlation between vessel proximity and pO2 (UP scenario). The VPO-C scenario produced a distinctly different oxygen distribution than the two other scenarios. The shape of the VPO-C scenario was similar to that of the nonvariable DOC model, and the larger the tumor, the greater the similarity between the two models. For all simulations, the mean oxygen tension decreased and the hypoxic fraction increased with tumor size. The absorbed dose required for definitive tumor control was highest for the VPO+C scenario, followed by the UP and VPO-C scenarios. A novel MC algorithm was presented which simulated oxygen distributions and radiation response for various biological parameter values. The analysis showed that the VPO-C scenario generated a clearly different oxygen distribution from the VPO+C scenario; the former exhibited a lower hypoxic fraction and higher radiosensitivity. In future studies, this modeling approach might be valuable for qualitative analyses of factors that affect oxygen distribution as well as analyses of specific experimental and clinical situations.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simonen, F.A.; Johnson, K.I.; Liebetrau, A.M.

    The VISA-II (Vessel Integrity Simulation Analysis) code was originally developed as part of the NRC staff evaluation of pressurized thermal shock. VISA-II uses Monte Carlo simulation to evaluate the failure probability of a pressurized water reactor (PWR) pressure vessel subjected to a pressure and thermal transient specified by the user. Linear elastic fracture mechanics methods are used to model crack initiation and propagation. Parameters for initial crack size and location, copper content, initial reference temperature of the nil-ductility transition, fluence, crack-initiation fracture toughness, and arrest fracture toughness are treated as random variables. This report documents an upgraded version of the original VISA code as described in NUREG/CR-3384. Improvements include a treatment of cladding effects, a more general simulation of flaw size, shape, and location, a simulation of inservice inspection, an updated simulation of the reference temperature of the nil-ductility transition, and treatment of vessels with multiple welds and initial flaws. The code has been extensively tested and verified and is written in FORTRAN for ease of installation on different computers. 38 refs., 25 figs.
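
    A heavily simplified sketch of this kind of Monte Carlo scheme: sample random flaw depths and fracture toughnesses, compare the applied stress intensity with the sampled toughness, and count failures. The edge-crack formula is standard LEFM, but the distributions and the applied stress are illustrative stand-ins for VISA-II's transient-specific models.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
sigma = 250.0                                      # applied wall stress, MPa (illustrative)
depth = rng.lognormal(mean=np.log(0.004), sigma=0.8, size=n)          # flaw depth, m
k_ic = np.clip(rng.normal(loc=90.0, scale=15.0, size=n), 20.0, None)  # toughness, MPa*sqrt(m)

k_applied = 1.12 * sigma * np.sqrt(np.pi * depth)  # shallow edge crack in a plate
p_fail = (k_applied > k_ic).mean()                 # fraction of sampled vessels that initiate
print(f"simulated crack-initiation probability: {p_fail:.2e}")
```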

  18. What are the assets and weaknesses of HFO detectors? A benchmark framework based on realistic simulations

    PubMed Central

    Pizzo, Francesca; Bartolomei, Fabrice; Wendling, Fabrice; Bénar, Christian-George

    2017-01-01

    High-frequency oscillations (HFO) have been suggested as biomarkers of epileptic tissues. While visual marking of these short and small oscillations is tedious and time-consuming, automatic HFO detectors have not yet met a large consensus. Even though detectors have been shown to perform well when validated against visual marking, the large number of false detections due to their lack of robustness hinders their clinical application. In this study, we developed a validation framework based on realistic and controlled simulations to quantify precisely the assets and weaknesses of current detectors. We constructed a dictionary of synthesized elements—HFOs and epileptic spikes—from different patients and brain areas by extracting these elements from the original data using discrete wavelet transform coefficients. These elements were then added to their corresponding simulated background activity (preserving patient- and region-specific spectra). We tested five existing detectors against this benchmark. Compared to other studies confronting detectors, we not only ranked them according to their performance but also investigated the reasons leading to these results. Our simulations, thanks to their realism and variability, enabled us to highlight unreported issues of current detectors: (1) the lack of robust estimation of the background activity, (2) the underestimated impact of the 1/f spectrum, and (3) the inadequate criteria defining an HFO. We believe that our benchmark framework could be a valuable tool to translate HFOs into a clinical environment. PMID:28406919

  19. High-order continuum kinetic method for modeling plasma dynamics in phase space

    DOE PAGES

    Vogman, G. V.; Colella, P.; Shumlak, U.

    2014-12-15

    Continuum methods offer a high-fidelity means of simulating plasma kinetics. While computationally intensive, these methods are advantageous because they can be cast in conservation-law form, are not susceptible to noise, and can be implemented using high-order numerical methods. Advances in continuum method capabilities for modeling kinetic phenomena in plasmas require the development of validation tools in higher-dimensional phase space and an ability to handle non-Cartesian geometries. To that end, a new benchmark for validating Vlasov-Poisson simulations in 3D (x, vx, vy) is presented. The benchmark is based on the Dory-Guest-Harris instability and is successfully used to validate a continuum finite volume algorithm. To address challenges associated with non-Cartesian geometries, unique features of cylindrical phase space coordinates are described. Preliminary results of continuum kinetic simulations in 4D (r, z, vr, vz) phase space are presented.

  20. Benchmarks for time-domain simulation of sound propagation in soft-walled airways: Steady configurations

    PubMed Central

    Titze, Ingo R.; Palaparthi, Anil; Smith, Simeon L.

    2014-01-01

    Time-domain computer simulation of sound production in airways is a widely used tool, both for research and synthetic speech production technology. Speed of computation is generally the rationale for one-dimensional approaches to sound propagation and radiation. Transmission line and wave-reflection (scattering) algorithms are used to produce formant frequencies and bandwidths for arbitrarily shaped airways. Some benchmark graphs and tables are provided for formant frequencies and bandwidth calculations based on specific mathematical terms in the one-dimensional Navier–Stokes equation. Some rules are provided here for temporal and spatial discretization in terms of desired accuracy and stability of the solution. Kinetic losses, which have been difficult to quantify in frequency-domain simulations, are quantified here on the basis of the measurements of Scherer, Torkaman, Kucinschi, and Afjeh [(2010). J. Acoust. Soc. Am. 128(2), 828–838]. PMID:25480071
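
    A minimal wave-reflection (scattering) simulation of the kind benchmarked here can be written as a digital waveguide. For a uniform tube closed at the glottis and open at the lips, the first resonance should appear near c/(4L); the section count, loss factor, and sample rate below are illustrative choices, not values from the paper.

```python
import numpy as np

fs, N, T = 44100, 25, 8192         # sample rate, tube sections, samples
c = 350.0                          # speed of sound in warm, moist air (m/s)
L = N * c / fs                     # tube length implied by the discretization
f = np.zeros(N)                    # forward-traveling pressure wave
b = np.zeros(N)                    # backward-traveling pressure wave
f[0] = 1.0                         # impulse injected at the glottis end
out = np.zeros(T)
for t in range(T):
    f_end, b_start = f[-1], b[0]
    f[1:] = f[:-1]                 # propagate one section per sample
    b[:-1] = b[1:]
    f[0] = b_start                 # closed glottis: reflection coefficient +1
    b[-1] = -0.98 * f_end          # open lips: reflection ~ -1, slight loss
    out[t] = f_end - b[-1]         # ~ volume velocity radiated at the lips

spec = np.abs(np.fft.rfft(out))
freqs = np.fft.rfftfreq(T, 1.0 / fs)
band = freqs < 800.0               # search band for the first resonance
print(f"tube length {100 * L:.1f} cm, theory F1 = {c / (4 * L):.0f} Hz, "
      f"simulated F1 = {freqs[band][spec[band].argmax()]:.0f} Hz")
```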

  1. Global Gridded Crop Model Evaluation: Benchmarking, Skills, Deficiencies and Implications.

    NASA Technical Reports Server (NTRS)

    Muller, Christoph; Elliott, Joshua; Chryssanthacopoulos, James; Arneth, Almut; Balkovic, Juraj; Ciais, Philippe; Deryng, Delphine; Folberth, Christian; Glotter, Michael; Hoek, Steven; et al.

    2017-01-01

    Crop models are increasingly used to simulate crop yields at the global scale, but so far there is no general framework on how to assess model performance. Here we evaluate the simulation results of 14 global gridded crop modeling groups that have contributed historic crop yield simulations for maize, wheat, rice and soybean to the Global Gridded Crop Model Intercomparison (GGCMI) of the Agricultural Model Intercomparison and Improvement Project (AgMIP). Simulation results are compared to reference data at global, national and grid cell scales and we evaluate model performance with respect to time series correlation, spatial correlation and mean bias. We find that global gridded crop models (GGCMs) show mixed skill in reproducing time series correlations or spatial patterns at the different spatial scales. Generally, maize, wheat and soybean simulations of many GGCMs are capable of reproducing larger parts of observed temporal variability (time series correlation coefficients (r) of up to 0.888 for maize, 0.673 for wheat and 0.643 for soybean at the global scale) but rice yield variability cannot be well reproduced by most models. Yield variability can be well reproduced for most major producing countries by many GGCMs and for all countries by at least some. A comparison with gridded yield data and a statistical analysis of the effects of weather variability on yield variability shows that the ensemble of GGCMs can explain more of the yield variability than an ensemble of regression models for maize and soybean, but not for wheat and rice. We identify future research needs in global gridded crop modeling and for all individual crop modeling groups. In the absence of a purely observation-based benchmark for model evaluation, we propose that the best performing crop model per crop and region establishes the benchmark for all others, and modelers are encouraged to investigate how crop model performance can be increased. We make our evaluation system accessible to all crop modelers so that other modeling groups can also test their model performance against the reference data and the GGCMI benchmark.
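
    The three evaluation measures used, time series correlation, spatial correlation, and mean bias, reduce to a few lines of array arithmetic. The yield arrays below are synthetic placeholders (years × grid cells), not GGCMI data.

```python
import numpy as np

rng = np.random.default_rng(4)
ref = rng.gamma(shape=5.0, scale=1.0, size=(30, 100))   # reference yields
sim = ref + rng.normal(0.0, 1.0, ref.shape) + 0.3        # a noisy, biased "model"

def pearson(a, b):
    a, b = a - a.mean(), b - b.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

ts_r = pearson(ref.mean(axis=1), sim.mean(axis=1))  # correlation of aggregate time series
sp_r = pearson(ref.mean(axis=0), sim.mean(axis=0))  # correlation of mean spatial pattern
bias = (sim - ref).mean()                           # mean bias over all cells and years
print(f"time-series r={ts_r:.3f}, spatial r={sp_r:.3f}, mean bias={bias:.3f}")
```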

  2. Supercritical and Transcritical Shear Flows in Microgravity: Experiments and Direct Numerical Simulations

    DTIC Science & Technology

    2006-08-01

    Boiler and Pressure Vessel Code were con... GRC, and to specifically state a general operating requirement. 1.1. The entire apparatus will be designed to the ASME Boiler and Pressure Vessel Code, whenever... calculations, including a finite element analysis (FEA), will be inspected to verify the ASME Boiler and Pressure Vessel Code has been met, whenever...

  3. Three-Dimensional Blood Vessel Model with Temperature-Indicating Function for Evaluation of Thermal Damage during Surgery

    PubMed Central

    Watanabe, Takafumi; Arai, Fumihito

    2018-01-01

    Surgical simulators have recently attracted attention because they enable the evaluation of the surgical skills of medical doctors and the performance of medical devices. However, thermal damage to the human body during surgery is difficult to evaluate using conventional surgical simulators. In this study, we propose a functional surgical model with a temperature-indicating function for the evaluation of thermal damage during surgery. The simulator is made of a composite material of polydimethylsiloxane and a thermochromic dye, which produces an irreversible color change as the temperature increases. Using this material, we fabricated a three-dimensional blood vessel model using the lost-wax process. We succeeded in fabricating a renal vessel model for simulation of catheter ablation. Increases in the temperature of the materials can be measured by image analysis of their color change. The maximum measurement error of the temperature was approximately −1.6 °C/+2.4 °C within the range of 60 °C to 100 °C. PMID:29370139
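
    The image-analysis step, mapping an irreversible color change back to peak temperature, amounts to inverting a monotone calibration curve. The hue-to-temperature calibration points in this sketch are hypothetical; a real phantom would need its own measured calibration.

```python
import numpy as np

# Hypothetical calibration: mean hue of the thermochromic dye after heating
# each calibration sample to a known peak temperature.
cal_temp = np.array([60.0, 70.0, 80.0, 90.0, 100.0])   # deg C
cal_hue = np.array([0.02, 0.08, 0.15, 0.24, 0.35])     # normalized hue

def temperature_from_hue(hue):
    """Invert the monotone hue(T) calibration by linear interpolation."""
    return np.interp(hue, cal_hue, cal_temp)

pixel_hues = np.array([0.05, 0.20, 0.30])               # hues measured in an image
print("estimated peak temperatures (deg C):", temperature_from_hue(pixel_hues))
```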

  4. Numerical simulation of microcarrier motion in a rotating wall vessel bioreactor.

    PubMed

    Ju, Zhi-Hao; Liu, Tian-Qing; Ma, Xue-Hu; Cui, Zhan-Feng

    2006-06-01

    To analyze the forces exerted by a rotating wall vessel (RWV) bioreactor on small tissue pieces or microcarrier particles and to determine the tracks of microcarrier particles in the RWV bioreactor, the motion of the microcarrier in the RWV bioreactor with both the inner and outer cylinders rotating was modeled by numerical simulation. The continuous trajectory of microcarrier particles, including possible collisions with the wall, was obtained. An expression was derived relating the minimum rotational speed difference of the inner and outer cylinders to the microcarrier particle or aggregate radius that avoids collisions with either wall. The range of microcarrier radius or tissue size that can be safely cultured in the RWV bioreactor, in terms of shear stress level, was determined. The model works well in describing the trajectory of a heavier microcarrier particle in a rotating wall vessel.

  5. Body surface detection method for photoacoustic image data using cloth-simulation technique

    NASA Astrophysics Data System (ADS)

    Sekiguchi, H.; Yoshikawa, A.; Matsumoto, Y.; Asao, Y.; Yagi, T.; Togashi, K.; Toi, M.

    2018-02-01

    Photoacoustic tomography (PAT) is a novel modality that can visualize blood vessels without contrast agents. It clearly shows blood vessels near the body surface; however, these vessels obstruct the observation of deeper blood vessels. Because the depth range of each vessel is determined by its distance from the body surface, the vessels can be separated if the position of the skin is known. However, skin tissue, which does not contain hemoglobin, does not appear in PAT results, so manual estimation is required. As this task is very labor-intensive, its automation is highly desirable. We therefore developed a method to estimate the body surface using the cloth-simulation technique, a method commonly used to create computer graphics (CG) animations that has not yet been employed for medical image processing. In a cloth simulation, the virtual cloth is represented by a two-dimensional array of mass nodes connected to each other by springs. Once the cloth is released from a position away from the body, each node begins to move downwards under the effect of gravity, spring, and other forces; some of the nodes hit the superficial vessels and stop. The cloth position in the stationary state represents the body surface. Body surface estimation, which required approximately 1 h with the manual method, is automated and takes only approximately 10 s with the proposed method.
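
    A quasi-static height-field sketch of the cloth idea: each node descends under gravity, spring-like constraints limit how far it can sag below its neighbors, and contact with occupied voxels stops it, so the sheet drapes over the superficial vessels. A full implementation would integrate mass-spring dynamics; the geometry, slack, and iteration scheme below are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
nx, ny, nz = 64, 64, 40
occ = np.zeros((nx, ny, nz), dtype=bool)
for _ in range(40):                      # scatter synthetic superficial "vessels"
    cx, cy = rng.integers(4, 60, size=2)
    occ[cx-2:cx+2, cy-2:cy+2, 28:31] = True

# highest occupied voxel under each (x, y); -1 where the column is empty
top = np.where(occ.any(axis=2),
               nz - 1 - np.argmax(occ[:, :, ::-1], axis=2), -1).astype(float)

cloth = np.full((nx, ny), float(nz))     # sheet released above the volume
slack = 1.0                              # max sag below the neighbor average
for _ in range(400):
    cloth -= 0.5                                         # gravity step
    nb = 0.25 * (np.roll(cloth, 1, 0) + np.roll(cloth, -1, 0)
                 + np.roll(cloth, 1, 1) + np.roll(cloth, -1, 1))
    cloth = np.maximum(cloth, nb - slack)                # springs limit local sag
    cloth = np.maximum(cloth, top + 1.0)                 # stop on vessel contact
    # (np.roll gives periodic edges; acceptable for this sketch)
print(f"estimated surface height: mean {cloth.mean():.1f} voxels")
```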

  6. Delay Tolerant Networking - Bundle Protocol Simulation

    NASA Technical Reports Server (NTRS)

    SeGui, John; Jenning, Esther

    2006-01-01

    In this paper, we report on the addition of MACHETE models needed to support DTN, namely the Bundle Protocol (BP) model. To illustrate the use of MACHETE with the additional DTN model, we provide an example simulation to benchmark its performance. We demonstrate the use of the DTN protocol and discuss statistics gathered concerning the total time needed to simulate numerous bundle transmissions.

  7. A Modular Simulation Framework for Assessing Swarm Search Models

    DTIC Science & Technology

    2014-09-01

    Author: Blake M. Wanier. Numerical studies demonstrate the ability to leverage the developed simulation and analysis framework to investigate three canonical swarm search models... as benchmarks for future exploration of more sophisticated swarm search scenarios. Subject terms: swarm search, search theory, modeling framework.

  8. Computer simulation to predict energy use, greenhouse gas emissions and costs for production of fluid milk using alternative processing methods

    USDA-ARS?s Scientific Manuscript database

    Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests; model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the "must test" functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
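
    An example of the kind of unit test such a library could contain, here checking mass conservation across a separation stage. The separate_stream function is a hypothetical stand-in for a fuel cycle simulator component, not the API of any existing tool.

```python
def separate_stream(feed_kg, efficiency=0.995):
    """Hypothetical separation stage: recover heavy metal into a product stream."""
    recovered = ("U235", "U238", "Pu239")
    product = {iso: m * efficiency for iso, m in feed_kg.items() if iso in recovered}
    waste = {iso: m - product.get(iso, 0.0) for iso, m in feed_kg.items()}
    return product, waste

def test_mass_is_conserved():
    """Essential-function check: nothing is created or destroyed by separation."""
    feed = {"U235": 8.0, "U238": 920.0, "Pu239": 9.5, "FP": 62.5}
    product, waste = separate_stream(feed)
    total_out = sum(product.values()) + sum(waste.values())
    assert abs(total_out - sum(feed.values())) < 1e-9, "mass balance violated"

test_mass_is_conserved()
print("mass-balance unit test passed")
```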

  10. A Comparison of Microvascular Responses to Visible and Near-Infrared Lasers

    PubMed Central

    Li, D.; Farshidi, D.; Wang, G.X.; He, Y.L.; Kelly, K.M.; Wu, W.J.; Chen, B.; Ying, Z.X.

    2015-01-01

    Background and Objective Pulsed dye laser (PDL) is a commonly used treatment for Port Wine Stain birthmarks (PWS). However, deeper components of PWS are often resistant to PDL. Deeper penetrating lasers, including the long pulsed Neodymium:Yttrium Aluminum Garnet (Nd:YAG) laser have been used, but carry greater risk. This study evaluates the distinct blood vessel thermal responses to visible (595 nm) and near infrared (1,064 nm) lasers using animal and numerical models. Study Design/Materials and Methods Blood vessels in the rodent dorsal skin chamber (DSC) were irradiated by a 595 nm PDL and a long-pulsed 1,064 nm Nd:YAG laser. Laser-induced immediate and 1-hour post-structural and functional changes in the vessels were documented. Numerical simulations were conducted using a 1,000 μm depth SD mouse skin fold to simulate experimental conditions. Results PDL irradiation produced immediate blood vessel hemorrhage. Modeling indicated this occurs due to preferential heating of the superior parts of large blood vessels. Nd:YAG irradiation resulted in blood vessel constriction; modeling indicated more uniform heating of vessel walls. Conclusion PDL and Nd:YAG lasers result in distinct tissue responses. This supports different observable clinical treatment end points when using these devices. Vessel constriction associated with the Nd:YAG may be more difficult to observe and is one reason this device may carry greater risk. Lasers Surg. Med. 46:479–487, 2014. PMID:24974953

  11. Quantitative evaluation of mucosal vascular contrast in narrow band imaging using Monte Carlo modeling

    NASA Astrophysics Data System (ADS)

    Le, Du; Wang, Quanzeng; Ramella-Roman, Jessica; Pfefer, Joshua

    2012-06-01

    Narrow-band imaging (NBI) is a spectrally-selective reflectance imaging technique for enhanced visualization of superficial vasculature. Prior clinical studies have indicated NBI's potential for detection of vasculature abnormalities associated with gastrointestinal mucosal neoplasia. While the basic mechanisms behind the increased vessel contrast - hemoglobin absorption and tissue scattering - are known, a quantitative understanding of the effect of tissue and device parameters has not been achieved. In this investigation, we developed and implemented a numerical model of light propagation that simulates NBI reflectance distributions. This was accomplished by incorporating mucosal tissue layers and vessel-like structures in a voxel-based Monte Carlo algorithm. Epithelial and mucosal layers as well as blood vessels were defined using wavelength-specific optical properties. The model was implemented to calculate reflectance distributions and vessel contrast values as a function of vessel depth (0.05 to 0.50 mm) and diameter (0.01 to 0.10 mm). These relationships were determined for NBI wavelengths of 410 nm and 540 nm, as well as broadband illumination common to standard endoscopic imaging. The effects of illumination bandwidth on vessel contrast were also simulated. Our results provide a quantitative analysis of the effect of absorption and scattering on vessel contrast. Additional insights and potential approaches for improving NBI system contrast are discussed.
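
    The depth- and diameter-dependence of contrast described above can be illustrated with a back-of-the-envelope model (not the paper's voxel-based Monte Carlo): the vessel casts a shadow set by Beer-Lambert absorption across its diameter, diluted by scattered glare that grows with depth. All optical coefficients below are assumed, order-of-magnitude values, not the study's properties.

      import numpy as np

      def vessel_contrast(depth_mm, diam_mm, mu_a_blood, mu_eff_tissue):
          # Fraction of detected light that actually traversed the vessel
          # shrinks with depth as scattered "glare" dilutes the shadow.
          f = np.exp(-2.0 * mu_eff_tissue * depth_mm)
          shadow = 1.0 - np.exp(-mu_a_blood * diam_mm)
          return f * shadow

      mu_a_blood = {410: 30.0, 540: 15.0}   # blood absorption, 1/mm (assumed)
      mu_eff = {410: 2.5, 540: 1.2}         # tissue attenuation, 1/mm (assumed)

      for wl in (410, 540):
          c = vessel_contrast(0.1, 0.05, mu_a_blood[wl], mu_eff[wl])
          print(f"{wl} nm: contrast of a 0.05 mm vessel at 0.1 mm depth = {c:.2f}")

    Even this crude model shows the qualitative trade-off the simulations quantify: 410 nm gives higher contrast for shallow vessels, while its advantage fades with depth as glare dominates.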

  12. PID controller tuning using metaheuristic optimization algorithms for benchmark problems

    NASA Astrophysics Data System (ADS)

    Gholap, Vishal; Naik Dessai, Chaitali; Bagyaveereswaran, V.

    2017-11-01

    This paper addresses finding optimal PID controller parameters using particle swarm optimization (PSO), a genetic algorithm (GA) and a simulated annealing (SA) algorithm. The algorithms were applied to simulations of a chemical process and an electrical system in which the PID controller is tuned. Two different fitness functions, integral time absolute error (ITAE) and time-domain specifications, were chosen and applied with PSO, GA and SA while tuning the controller. The proposed algorithms are implemented on two benchmark problems: a coupled tank system and a DC motor. Finally, a comparative study has been done across the algorithms based on best cost, number of iterations and the different objective functions. The closed-loop process response for each set of tuned parameters is plotted for each system with each fitness function.
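
    As a concrete illustration of the approach, the sketch below tunes PID gains with a minimal PSO against an ITAE cost on a simple second-order plant. The plant G(s) = 1/(s^2 + 2s + 1), the PSO constants and the gain bounds are assumptions for illustration, not the systems used in the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def itae(gains, t_end=5.0, dt=0.005):
          """ITAE for a unit step on plant x'' + 2x' + x = u under PID control."""
          kp, ki, kd = gains
          x1 = x2 = integ = 0.0
          prev_err, cost = 1.0, 0.0
          for k in range(int(t_end / dt)):
              err = 1.0 - x1
              integ += err * dt
              deriv = (err - prev_err) / dt
              u = kp * err + ki * integ + kd * deriv
              x2 += (u - 2.0 * x2 - x1) * dt    # Euler step of the plant
              x1 += x2 * dt
              prev_err = err
              cost += (k * dt) * abs(err) * dt  # integral of t*|e(t)| dt
              if cost > 1e6:                    # diverged closed loop
                  return 1e6
          return cost

      # minimal PSO over (Kp, Ki, Kd)
      n_part, n_iter = 20, 40
      pos = rng.uniform(0.0, 20.0, (n_part, 3))
      vel = np.zeros_like(pos)
      pbest = pos.copy()
      pcost = np.array([itae(p) for p in pos])
      gbest = pbest[pcost.argmin()].copy()
      for _ in range(n_iter):
          r1, r2 = rng.random((2, n_part, 3))
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, 0.0, 50.0)
          cost = np.array([itae(p) for p in pos])
          better = cost < pcost
          pbest[better], pcost[better] = pos[better], cost[better]
          gbest = pbest[pcost.argmin()].copy()

      print("tuned Kp, Ki, Kd:", gbest.round(2), " best ITAE:", round(pcost.min(), 4))

    Swapping the `itae` function for a settling-time/overshoot score reproduces the paper's second fitness-function variant without touching the optimizer.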

  13. Optimization and benchmarking of a perturbative Metropolis Monte Carlo quantum mechanics/molecular mechanics program

    NASA Astrophysics Data System (ADS)

    Feldt, Jonas; Miranda, Sebastião; Pratas, Frederico; Roma, Nuno; Tomás, Pedro; Mata, Ricardo A.

    2017-12-01

    In this work, we present an optimized perturbative quantum mechanics/molecular mechanics (QM/MM) method for use in Metropolis Monte Carlo simulations. The model adopted is particularly tailored for the simulation of molecular systems in solution but can be readily extended to other applications, such as catalysis in enzymatic environments. The electrostatic coupling between the QM and MM systems is simplified by applying perturbation theory to estimate the energy changes caused by a movement in the MM system. This approximation, together with the effective use of GPU acceleration, leads to a negligible added computational cost for the sampling of the environment. Benchmark calculations are carried out to evaluate the impact of the approximations applied and the overall computational performance.

  14. Optimization and benchmarking of a perturbative Metropolis Monte Carlo quantum mechanics/molecular mechanics program.

    PubMed

    Feldt, Jonas; Miranda, Sebastião; Pratas, Frederico; Roma, Nuno; Tomás, Pedro; Mata, Ricardo A

    2017-12-28

    In this work, we present an optimized perturbative quantum mechanics/molecular mechanics (QM/MM) method for use in Metropolis Monte Carlo simulations. The model adopted is particularly tailored for the simulation of molecular systems in solution but can be readily extended to other applications, such as catalysis in enzymatic environments. The electrostatic coupling between the QM and MM systems is simplified by applying perturbation theory to estimate the energy changes caused by a movement in the MM system. This approximation, together with the effective use of GPU acceleration, leads to a negligible added computational cost for the sampling of the environment. Benchmark calculations are carried out to evaluate the impact of the approximations applied and the overall computational performance.
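
    The perturbative trick can be sketched in a few lines: after an MM move, the QM region's charge distribution is held frozen, so the energy change reduces to classical electrostatics between the frozen QM charges and the moved MM site, avoiding a full QM recalculation. All charges, geometries and units below are illustrative assumptions, and MM-MM interactions are omitted for brevity.

      import numpy as np

      rng = np.random.default_rng(1)
      kT = 0.593  # kcal/mol at ~298 K

      # Hypothetical setup: QM region represented by frozen ESP point charges
      qm_xyz = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
      qm_q = np.array([-0.8, 0.8])
      mm_xyz = rng.uniform(2.0, 8.0, (50, 3))      # MM sites (Angstrom)
      mm_q = rng.choice([-0.4, 0.2], 50)

      def coulomb(xi, qi):
          """Electrostatic energy of one MM site with the frozen QM charges."""
          r = np.linalg.norm(qm_xyz - xi, axis=1)
          return 332.06 * qi * np.sum(qm_q / r)    # kcal/mol, r in Angstrom

      for step in range(1000):
          i = rng.integers(len(mm_xyz))
          trial = mm_xyz[i] + rng.normal(0.0, 0.2, 3)
          # Perturbative estimate: only the QM-MM electrostatic term changes
          dE = coulomb(trial, mm_q[i]) - coulomb(mm_xyz[i], mm_q[i])
          if dE <= 0 or rng.random() < np.exp(-dE / kT):
              mm_xyz[i] = trial

      E_total = sum(coulomb(x, q) for x, q in zip(mm_xyz, mm_q))
      print(f"QM-MM electrostatic energy after sampling: {E_total:.1f} kcal/mol")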

  15. Novel probabilistic neuroclassifier

    NASA Astrophysics Data System (ADS)

    Hong, Jiang; Serpen, Gursel

    2003-09-01

    A novel probabilistic potential function neural network classifier algorithm to deal with classes which are multi-modally distributed and formed from sets of disjoint pattern clusters is proposed in this paper. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and the pseudocode is presented. Simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better for a subset of problems for which other neural classifiers perform relatively poorly.
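
    The paper's exact architecture is not reproduced here, but the underlying potential-function idea can be sketched as a kernel-density classifier: each training pattern contributes a local Gaussian "potential", so a class formed from disjoint clusters is handled naturally. A minimal sketch on synthetic data:

      import numpy as np

      def fit(X, y):
          """Store training patterns per class (the 'potential' model)."""
          return {c: X[y == c] for c in np.unique(y)}

      def predict(model, X, sigma=0.5):
          labels = sorted(model)
          scores = []
          for c in labels:
              d2 = ((X[:, None, :] - model[c][None, :, :]) ** 2).sum(-1)
              # class potential = mean Gaussian kernel over that class's
              # patterns, so each disjoint cluster contributes locally
              scores.append(np.exp(-d2 / (2 * sigma**2)).mean(axis=1))
          return np.array(labels)[np.argmax(scores, axis=0)]

      # toy multi-modal problem: class 0 occupies two disjoint clusters
      rng = np.random.default_rng(2)
      a = rng.normal([0, 0], 0.3, (30, 2))
      b = rng.normal([4, 4], 0.3, (30, 2))
      c = rng.normal([2, 2], 0.3, (30, 2))
      X = np.vstack([a, b, c])
      y = np.array([0] * 60 + [1] * 30)
      model = fit(X, y)
      print("training accuracy:", (predict(model, X) == y).mean())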

  16. "Aid to Thought"--Just Simulate It!

    ERIC Educational Resources Information Center

    Kinczkowski, Linda; Cardon, Phillip; Speelman, Pamela

    2015-01-01

    This paper provides examples of Aid-to-Thought uses in urban decision making, classroom laboratory planning, and in a ship antiaircraft defense system. Aid-to-Thought modeling and simulations are tools students can use effectively in a STEM classroom while meeting Standards for Technological Literacy Benchmarks O and R. These projects prepare…

  17. Toward Automated Benchmarking of Atomistic Force Fields: Neat Liquid Densities and Static Dielectric Constants from the ThermoML Data Archive.

    PubMed

    Beauchamp, Kyle A; Behr, Julie M; Rustenburg, Ariën S; Bayly, Christopher I; Kroenlein, Kenneth; Chodera, John D

    2015-10-08

    Atomistic molecular simulations are a powerful way to make quantitative predictions, but the accuracy of these predictions depends entirely on the quality of the force field employed. Although experimental measurements of fundamental physical properties offer a straightforward approach for evaluating force field quality, the bulk of this information has been tied up in formats that are not machine-readable. Compiling benchmark data sets of physical properties from non-machine-readable sources requires substantial human effort and is prone to the accumulation of human errors, hindering the development of reproducible benchmarks of force-field accuracy. Here, we examine the feasibility of benchmarking atomistic force fields against the NIST ThermoML data archive of physicochemical measurements, which aggregates thousands of experimental measurements in a portable, machine-readable, self-annotating IUPAC-standard format. As a proof of concept, we present a detailed benchmark of the generalized Amber small-molecule force field (GAFF) using the AM1-BCC charge model against experimental measurements (specifically, bulk liquid densities and static dielectric constants at ambient pressure) automatically extracted from the archive and discuss the extent of data available for use in larger scale (or continuously performed) benchmarks. The results of even this limited initial benchmark highlight a general problem with fixed-charge force fields in the representation of low-dielectric environments, such as those seen in binding cavities or biological membranes.
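
    Once the measurements are machine-readable, the benchmarking step itself is mechanical. A minimal sketch of the comparison stage, with hypothetical simulated-versus-experimental values standing in for the ThermoML extraction and the simulation outputs:

      import numpy as np

      # Hypothetical records: (compound, property, experiment, simulation)
      records = [
          ("ethanol",     "density_g_ml", 0.789, 0.781),
          ("cyclohexane", "density_g_ml", 0.774, 0.792),
          ("ethanol",     "dielectric",  24.5,  21.9),
          ("cyclohexane", "dielectric",   2.02,  1.45),
      ]

      for prop in ("density_g_ml", "dielectric"):
          exp = np.array([r[2] for r in records if r[1] == prop])
          sim = np.array([r[3] for r in records if r[1] == prop])
          rel = (sim - exp) / exp
          print(f"{prop}: mean signed error {rel.mean():+.1%}, "
                f"RMS relative error {np.sqrt((rel**2).mean()):.1%}")

    Run continuously against the archive, the same loop turns force-field validation into a regression test.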

  18. 3D J-Integral Capability in Grizzly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Benjamin; Backman, Marie; Chakraborty, Pritam

    2014-09-01

    This report summarizes work done to develop a capability to evaluate fracture contour J-Integrals in 3D in the Grizzly code. In the current fiscal year, a previously-developed 2D implementation of a J-Integral evaluation capability has been extended to work in 3D, and to include terms due both to mechanically-induced strains and due to gradients in thermal strains. This capability has been verified against a benchmark solution on a model of a curved crack front in 3D. The thermal term in this integral has been verified against a benchmark problem with a thermal gradient. These developments are part of a larger effort to develop Grizzly as a tool that can be used to predict the evolution of aging processes in nuclear power plant systems, structures, and components, and assess their capacity after being subjected to those aging processes. The capabilities described here have been developed to enable evaluations of Mode-I stress intensity factors on axis-aligned flaws in reactor pressure vessels. These can be compared with the fracture toughness of the material to determine whether a pre-existing flaw would begin to propagate during a postulated pressurized thermal shock accident. This report includes a demonstration calculation to show how Grizzly is used to perform a deterministic assessment of such a flaw propagation in a degraded reactor pressure vessel under pressurized thermal shock conditions. The stress intensity is calculated from J, and the toughness is computed using the fracture master curve and the degraded ductile to brittle transition temperature.
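
    The J-to-K conversion and master-curve comparison mentioned above follow standard linear-elastic fracture mechanics relations, stated here in their common textbook form (which may differ in detail from Grizzly's implementation):

      K_I = \sqrt{J\,E'}, \qquad E' = \frac{E}{1-\nu^{2}} \quad \text{(plane strain)}

      K_{Jc,\mathrm{med}}(T) = 30 + 70\,\exp\!\left[0.019\,(T - T_{0})\right] \ \mathrm{MPa\sqrt{m}}

    Here T_0 is the ductile-to-brittle reference transition temperature, shifted upward by irradiation embrittlement; a flaw is predicted to propagate when K_I at the flaw exceeds the toughness at the local temperature.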

  19. CFD-Based Design of Turbopump Inlet Duct for Reduced Dynamic Loads

    NASA Technical Reports Server (NTRS)

    Rothermel, Jeffry; Dorney, Suzanne M.; Dorney, Daniel J.

    2003-01-01

    Numerical simulations have been completed for a variety of designs for a 90 deg elbow duct. The objective is to identify a design that minimizes the dynamic load entering a LOX turbopump located at the elbow exit. Designs simulated to date indicate that simpler duct geometries result in lower losses. Benchmark simulations have verified that the compressible flow codes used in this study are applicable to these incompressible flow simulations.

  20. CFD-based Design of LOX Pump Inlet Duct for Reduced Dynamic Loads

    NASA Technical Reports Server (NTRS)

    Rothermel, Jeffry; Dorney, Daniel J.; Dorney, Suzanne M.

    2003-01-01

    Numerical simulations have been completed for a variety of designs for a 90 deg elbow duct. The objective is to identify a design that minimizes the dynamic load entering a LOX turbopump located at the elbow exit. Designs simulated to date indicate that simpler duct geometries result in lower losses. Benchmark simulations have verified that the compressible flow code used in this study is applicable to these incompressible flow simulations.

  1. Numerical investigation of vessel heating using a copper vapor laser and a pulsed dye laser in treating vascular skin lesions

    NASA Astrophysics Data System (ADS)

    Pushkareva, A. E.; Ponomarev, I. V.; Isaev, A. A.; Klyuchareva, S. V.

    2018-02-01

    A computer simulation technique was employed to study the selective heating of a tissue vessel using emission from a pulsed copper vapor laser and a pulsed dye laser. The depth and size of vessels that could be selectively and safely removed were determined for the lasers under examination.

  2. Investigating the limitations of single breath-hold renal artery blood flow measurements using spiral phase contrast MR with R-R interval averaging.

    PubMed

    Steeden, Jennifer A; Muthurangu, Vivek

    2015-04-01

    1) To validate an R-R interval averaged golden angle spiral phase contrast magnetic resonance (RAGS PCMR) sequence against conventional cine PCMR for assessment of renal blood flow (RBF) in normal volunteers; and 2) To investigate the effects of motion and heart rate on the accuracy of flow measurements using an in silico simulation. In 20 healthy volunteers RAGS (∼6 sec breath-hold) and respiratory-navigated cine (∼5 min) PCMR were performed in both renal arteries to assess RBF. A simulation of RAGS PCMR was used to assess the effect of heart rate (30-105 bpm), vessel expandability (0-150%) and translational motion (×1.0-4.0) on the accuracy of RBF measurements. There was good agreement between RAGS and cine PCMR in the volunteer study (bias: 0.01 L/min, limits of agreement: -0.04 to +0.06 L/min, P = 0.0001). The simulation demonstrated a positive linear relationship between heart rate and error (r = 0.9894, P < 0.0001), a negative linear relationship between vessel expansion and error (r = -0.9484, P < 0.0001), and a nonlinear, heart rate-dependent relationship between vessel translation and error. We have demonstrated that RAGS PCMR accurately measures RBF in vivo. However, the simulation reveals limitations in this technique at extreme heart rates (<40 bpm, >100 bpm), or when there is significant motion (vessel expandability: >80%, vessel translation: >×2.2). © 2014 Wiley Periodicals, Inc.

  3. Turbulent flow in a vessel agitated by side entering inclined blade turbine with different diameter using CFD simulation

    NASA Astrophysics Data System (ADS)

    Fathonah, N. N.; Nurtono, T.; Kusdianto; Winardi, S.

    2018-03-01

    Single-phase turbulent flow in a vessel agitated by a side-entering inclined blade turbine has been simulated using CFD. The aim of this work is to identify the hydrodynamic characteristics of a model vessel whose geometrical configuration is adopted from industrial scale. The laboratory-scale model vessel is a flat-bottomed cylindrical tank agitated by a side-entering 4-blade inclined blade turbine with impeller rotational speed N=100-400 rpm. The effect of the impeller diameter on the fluid flow pattern has been investigated. The fluid flow pattern in the vessel is essentially characterized by the phenomenon of macro-instabilities, i.e. the flow pattern changes at large spatial scales and low frequencies. The intensity of fluid flow in the tank increases with impeller rotational speed from 100, 200, 300, and 400 rpm. This was accompanied by a shift of the core of the circulation flow away from the impeller discharge stream and toward the front of the tank wall. The intensity of fluid flow in the vessel also increases with impeller diameter from d=3 cm to d=4 cm.

  4. Assessment of the effect of population and diary sampling methods on estimation of school-age children exposure to fine particles.

    PubMed

    Che, W W; Frey, H Christopher; Lau, Alexis K H

    2014-12-01

    Population and diary sampling methods are employed in exposure models to sample simulated individuals and their daily activity on each simulation day. Different sampling methods may lead to variations in estimated human exposure. In this study, two population sampling methods (stratified-random and random-random) and three diary sampling methods (random resampling, diversity and autocorrelation, and Markov-chain cluster [MCC]) are evaluated. Their impacts on estimated children's exposure to ambient fine particulate matter (PM2.5) are quantified via case studies for children in Wake County, NC for July 2002. The estimated mean daily average exposure is 12.9 μg/m³ for simulated children using the stratified population sampling method, and 12.2 μg/m³ using the random sampling method. These minor differences are caused by the random sampling among ages within census tracts. Among the three diary sampling methods, there are differences in the estimated number of individuals with multiple days of exposures exceeding a benchmark of concern of 25 μg/m³ due to differences in how multiday longitudinal diaries are estimated. The MCC method is relatively more conservative. In case studies evaluated here, the MCC method led to 10% higher estimation of the number of individuals with repeated exposures exceeding the benchmark. The comparisons help to identify and contrast the capabilities of each method and to offer insight regarding implications of method choice. Exposure simulation results are robust to the two population sampling methods evaluated, and are sensitive to the choice of method for simulating longitudinal diaries, particularly when analyzing results for specific microenvironments or for exposures exceeding a benchmark of concern. © 2014 Society for Risk Analysis.
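
    The sensitivity to diary sampling can be reproduced with a toy model: two diary clusters with different exposure levels and a single "persistence" parameter controlling day-to-day autocorrelation (0.5 behaves like random resampling; 0.9 mimics a Markov-chain method). All numbers below are illustrative assumptions, not the study's data.

      import numpy as np

      rng = np.random.default_rng(3)
      n_people, n_days = 1000, 30
      means, sds = np.array([10.0, 22.0]), np.array([3.0, 5.0])  # ug/m3

      def exceed_days(persistence):
          """Days above 25 ug/m3 per person; persistence = P(keep cluster)."""
          state = rng.integers(0, 2, n_people)
          exceed = np.zeros(n_people, dtype=int)
          for _ in range(n_days):
              switch = rng.random(n_people) >= persistence
              state = np.where(switch, 1 - state, state)
              exposure = rng.normal(means[state], sds[state])
              exceed += exposure > 25.0
          return exceed

      for p, label in [(0.5, "random resampling"), (0.9, "Markov-chain-like")]:
          e = exceed_days(p)
          print(f"{label}: people with >=10 exceedance days = {(e >= 10).sum()}")

    Mean exposure barely moves between the two settings, but the count of individuals with repeated exceedances changes sharply, which is exactly the tail statistic the study flags as method-sensitive.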

  5. Measurement with microscopic MRI and simulation of flow in different aneurysm models.

    PubMed

    Edelhoff, Daniel; Walczak, Lars; Frank, Frauke; Heil, Marvin; Schmitz, Inge; Weichert, Frank; Suter, Dieter

    2015-10-01

    The impact and the development of aneurysms depend to a significant degree on the exchange of liquid between the regular vessel and the pathological extension. A better understanding of this process will lead to improved prediction capabilities. The aim of the current study was to investigate fluid-exchange in aneurysm models of different complexities by combining microscopic magnetic resonance measurements with numerical simulations. In order to evaluate the accuracy and applicability of these methods, the fluid-exchange process between the unaltered vessel lumen and the aneurysm phantoms was analyzed quantitatively using high spatial resolution. Magnetic resonance flow imaging was used to visualize fluid-exchange in two different models produced with a 3D printer. One model of an aneurysm was based on histological findings. The flow distribution in the different models was measured on a microscopic scale using time of flight magnetic resonance imaging. The whole experiment was simulated using fast graphics processing unit-based numerical simulations. The obtained simulation results were compared qualitatively and quantitatively with the magnetic resonance imaging measurements, taking into account flow and spin-lattice relaxation. The results of both presented methods compared well for the used aneurysm models and the chosen flow distributions. The results from the fluid-exchange analysis showed comparable characteristics concerning measurement and simulation. Similar symmetry behavior was observed. Based on these results, the amount of fluid-exchange was calculated. Depending on the geometry of the models, 7% to 45% of the liquid was exchanged per second. The result of the numerical simulations coincides well with the experimentally determined velocity field. The rate of fluid-exchange between vessel and aneurysm was well-predicted. Hence, the results obtained by simulation could be validated by the experiment. The observed deviations can be caused by the noise in the measurement and by the limited resolution of the simulation. The resulting differences are small enough to allow reliable predictions of the flow distribution in vessels with stents and for pulsed blood flow.
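
    The per-second exchange fractions quoted above can be estimated by fitting a single-compartment wash-in model to the tracer (time-of-flight) signal in the aneurysm sac; the intensity values below are hypothetical stand-ins for such measurements.

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical normalized signal in the sac as fresh spins flow in
      t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0])     # seconds
      s = np.array([0.00, 0.16, 0.30, 0.41, 0.50, 0.65, 0.75])

      def wash_in(t, k):
          # single-compartment exchange: s(t) = 1 - exp(-k t)
          return 1.0 - np.exp(-k * t)

      (k,), _ = curve_fit(wash_in, t, s, p0=[1.0])
      print(f"exchange rate k = {k:.2f} 1/s -> {100 * (1 - np.exp(-k)):.0f}% "
            "of sac volume replaced in the first second")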

  6. Identification of fuel cycle simulator functionalities for analysis of transition to a new fuel cycle

    DOE PAGES

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...

    2016-06-09

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.

  7. International benchmarking of longitudinal train dynamics simulators: results

    NASA Astrophysics Data System (ADS)

    Wu, Qing; Spiryagin, Maksym; Cole, Colin; Chang, Chongyi; Guo, Gang; Sakalo, Alexey; Wei, Wei; Zhao, Xubao; Burgelman, Nico; Wiersma, Pier; Chollet, Hugues; Sebes, Michel; Shamdani, Amir; Melzi, Stefano; Cheli, Federico; di Gialleonardo, Egidio; Bosso, Nicola; Zampieri, Nicolò; Luo, Shihui; Wu, Honghua; Kaza, Guy-Léon

    2018-03-01

    This paper presents the results of the International Benchmarking of Longitudinal Train Dynamics Simulators which involved participation of nine simulators (TABLDSS, UM, CRE-LTS, TDEAS, PoliTo, TsDyn, CARS, BODYSIM and VOCO) from six countries. Longitudinal train dynamics results and computing time of four simulation cases are presented and compared. The results show that all simulators had basic agreement in simulations of locomotive forces, resistance forces and track gradients. The major differences among different simulators lie in the draft gear models. TABLDSS, UM, CRE-LTS, TDEAS, TsDyn and CARS had general agreement in terms of the in-train forces; minor differences exist as reflections of draft gear model variations. In-train force oscillations were observed in VOCO due to the introduction of wheel-rail contact. In-train force instabilities were sometimes observed in PoliTo and BODYSIM due to the velocity controlled transitional characteristics which could have generated unreasonable transitional stiffness. Regarding computing time per train operational second, the following list is in order of increasing computing speed: VOCO, TsDyn, PoliTO, CARS, BODYSIM, UM, TDEAS, CRE-LTS and TABLDSS (fastest); all simulators except VOCO, TsDyn and PoliTo achieved faster speeds than real-time simulations. Similarly, regarding computing time per integration step, the computing speeds in order are: CRE-LTS, VOCO, CARS, TsDyn, UM, TABLDSS and TDEAS (fastest).

  8. Real-time surgery simulation of intracranial aneurysm clipping with patient-specific geometries and haptic feedback

    NASA Astrophysics Data System (ADS)

    Fenz, Wolfgang; Dirnberger, Johannes

    2015-03-01

    Providing suitable training for aspiring neurosurgeons is becoming more and more problematic. The increasing popularity of the endovascular treatment of intracranial aneurysms leads to a lack of simple surgical situations for clipping operations, leaving mainly the complex cases, which present even experienced surgeons with a challenge. To alleviate this situation, we have developed a training simulator with haptic interaction allowing trainees to practice virtual clipping surgeries on real patient-specific vessel geometries. By using specialized finite element (FEM) algorithms (fast finite element method, matrix condensation) combined with GPU acceleration, we can achieve the necessary frame rate for smooth real-time interaction with the detailed models needed for a realistic simulation of the vessel wall deformation caused by the clamping with surgical clips. Vessel wall geometries for typical training scenarios were obtained from 3D-reconstructed medical image data, while for the instruments (clipping forceps, various types of clips, suction tubes) we use models provided by manufacturer Aesculap AG. Collisions between vessel and instruments have to be continuously detected and transformed into corresponding boundary conditions and feedback forces, calculated using a contact plane method. After a training, the achieved result can be assessed based on various criteria, including a simulation of the residual blood flow into the aneurysm. Rigid models of the surgical access and surrounding brain tissue, plus coupling a real forceps to the haptic input device further increase the realism of the simulation.

  9. Three-dimensional multi-scale model of deformable platelets adhesion to vessel wall in blood flow

    PubMed Central

    Wu, Ziheng; Xu, Zhiliang; Kim, Oleg; Alber, Mark

    2014-01-01

    When a blood vessel ruptures or gets inflamed, the human body responds by rapidly forming a clot to restrict the loss of blood. Platelet aggregation at the injury site of the blood vessel, occurring via platelet–platelet adhesion, tethering and rolling on the injured endothelium, is a critical initial step in blood clot formation. A novel three-dimensional multi-scale model is introduced and used in this paper to simulate receptor-mediated adhesion of deformable platelets at the site of vascular injury under different shear rates of blood flow. The novelty of the model is based on a new approach to coupling submodels at three biological scales crucial for early clot formation: a novel hybrid cell membrane submodel to represent the physiological elastic properties of a platelet, a stochastic receptor–ligand binding submodel to describe cell adhesion kinetics and a lattice Boltzmann submodel for simulating blood flow. The model implementation on a GPU cluster significantly improved simulation performance. Predictive model simulations revealed that platelet deformation, interactions between platelets in the vicinity of the vessel wall as well as the number of functional GPIbα platelet receptors played significant roles in platelet adhesion to the injury site. Variation of the number of functional GPIbα platelet receptors as well as changes of platelet stiffness can represent effects of specific drugs reducing or enhancing platelet activity. Therefore, predictive simulations can improve the search for new drug targets and help to make treatment of thrombosis patient-specific. PMID:24982253

  10. Fluid Dynamics of Magnetic Nanoparticles in Simulated Blood Vessels

    NASA Astrophysics Data System (ADS)

    Blue, Lauren; Sewell, Mary Kathryn; Brazel, Christopher S.

    2008-11-01

    Magnetic nanoparticles (MNPs) can be used to locally target therapies and offer the benefit of using an AC magnetic field to combine hyperthermia treatment with the triggered release of therapeutic agents. Here, we investigate localization of MNPs in a simulated environment to understand the relationship between magnetic field intensity and bulk fluid dynamics to determine MNP retention in a simulated blood vessel. As MNPs travel through blood vessels, they can be slowed or trapped in a specific area by applying a magnetic field. Magnetic cobalt ferrite nanoparticles were synthesized and labeled with a fluorescent rhodamine tag to visualize patterns in a flow cell, as monitored by a fluorescence microscope. Particle retention was determined as a function of flow rate, concentration, and magnetic field strength. Understanding the relationship between magnetic field intensity, flow behavior and nanoparticle characteristics will aid in the development of therapeutic systems specifically targeted to diseased tissue.

  11. Computational fluid-structure interaction: methods and application to a total cavopulmonary connection

    NASA Astrophysics Data System (ADS)

    Bazilevs, Yuri; Hsu, M.-C.; Benson, D. J.; Sankaran, S.; Marsden, A. L.

    2009-12-01

    The Fontan procedure is a surgery that is performed on single-ventricle heart patients, and, due to the wide range of anatomies and variations among patients, lends itself nicely to study by advanced numerical methods. We focus on a patient-specific Fontan configuration, and perform a fully coupled fluid-structure interaction (FSI) analysis of hemodynamics and vessel wall motion. To enable physiologically realistic simulations, a simple approach to constructing a variable-thickness blood vessel wall description is proposed. Rest and exercise conditions are simulated and rigid versus flexible vessel wall simulation results are compared. We conclude that flexible wall modeling plays an important role in predicting quantities of hemodynamic interest in the Fontan connection. To the best of our knowledge, this paper presents the first three-dimensional patient-specific fully coupled FSI analysis of a total cavopulmonary connection that also includes large portions of the pulmonary circulation.

  12. Refilling of a Hydraulically Isolated Embolized Xylem Vessel: Model Calculations

    PubMed Central

    VESALA, TIMO; HÖLTTÄ, TEEMU; PERÄMÄKI, MARTTI; NIKINMAA, EERO

    2003-01-01

    When they are hydraulically isolated, embolized xylem vessels can be refilled, while adjacent vessels remain under tension. This implies that the pressure of water in the refilling vessel must be equal to the bubble gas pressure, which sets physical constraints for recovery. A model of water exudation into the cylindrical vessel and of bubble dissolution based on the assumption of hydraulic isolation is developed. Refilling is made possible by the turgor of the living cells adjacent to the refilling vessel, and by a reflection coefficient below 1 for the exchange of solutes across the interface between the vessel and the adjacent cells. No active transport of solutes is assumed. Living cells are also capable of importing water from the water-conducting vessels. The most limiting factors were found to be the osmotic potential of living cells and the ratio of the volume of the adjacent living cells to that of the embolized vessel. With values for these of 1.5 MPa and 1, respectively, refilling times were in the order of hours for a broad range of possible values of water conductivity coefficients and effective diffusion distances for dissolved air, when the xylem water tension was below 0.6 MPa and constant. Inclusion of the daily pattern for xylem tension improved the simulations. The simulated gas pressure within the refilling vessel was in accordance with recent experimental results. The study shows that the refilling process is physically possible under hydraulic isolation, while water in surrounding vessels is under negative pressure. However, the osmotic potentials in the refilling vessel tend to be large (in the order of 1 MPa). Only if the xylem water tension is, at most, twice atmospheric pressure, the reflection coefficient remains close to 1 (0.95) and the ratio of the volume of the adjacent living cells to that of the embolized vessel is about 2, does the osmotic potential stay below 0.4 MPa. PMID:12588721

  13. 2D Quantum Simulation of MOSFET Using the Non Equilibrium Green's Function Method

    NASA Technical Reports Server (NTRS)

    Svizhenko, Alexel; Anantram, M. P.; Govindan, T. R.; Yan, Jerry (Technical Monitor)

    2000-01-01

    The objectives summarized in this viewgraph presentation include: (1) development of a quantum mechanical simulator for ultra-short-channel MOSFET simulation, including theory, physical approximations, and computer code; (2) exploration of physics that is not accessible by semiclassical methods; (3) benchmarking of semiclassical and classical methods; and (4) study of other two-dimensional devices and molecular structures, from discretized Hamiltonians to tight-binding Hamiltonians.

  14. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.

  15. Influence of cerebral blood vessel movements on the position of perivascular synapses.

    PubMed

    Urrecha, Miguel; Romero, Ignacio; DeFelipe, Javier; Merchán-Pérez, Angel

    2017-01-01

    Synaptic activity is regulated and limited by blood flow, which is controlled by blood vessel dilation and contraction. Traditionally, the study of neurovascular coupling has mainly focused on energy consumption and oxygen delivery. However, the mechanical changes that blood vessel movements induce in the surrounding tissue have not been considered. We have modeled the mechanical changes that movements of blood vessels cause in neighboring synapses. Our simulations indicate that synaptic densities increase or decrease during vascular dilation and contraction, respectively, near the blood vessel walls. This phenomenon may alter the concentration of neurotransmitters and vasoactive substances in the immediate vicinity of the vessel wall and thus may have an influence on local blood flow.

  16. NetBenchmark: a bioconductor package for reproducible benchmarks of gene regulatory network inference.

    PubMed

    Bellot, Pau; Olsen, Catharina; Salembier, Philippe; Oliveras-Vergés, Albert; Meyer, Patrick E

    2015-09-29

    In the last decade, a great number of methods for reconstructing gene regulatory networks from expression data have been proposed. However, very few tools and datasets allow those methods to be evaluated accurately and reproducibly. Hence, we propose here a new tool able to perform a systematic, yet fully reproducible, evaluation of transcriptional network inference methods. Our open-source and freely available Bioconductor package aggregates a large set of tools to assess the robustness of network inference algorithms against different simulators, topologies, sample sizes and noise intensities. The benchmarking framework that uses various datasets highlights the specialization of some methods toward network types and data. As a result, it is possible to identify the techniques that have broad overall performances.

  17. A benchmark for fault tolerant flight control evaluation

    NASA Astrophysics Data System (ADS)

    Smaili, H.; Breeman, J.; Lombaerts, T.; Stroosma, O.

    2013-12-01

    A large transport aircraft simulation benchmark (REconfigurable COntrol for Vehicle Emergency Return - RECOVER) has been developed within the GARTEUR (Group for Aeronautical Research and Technology in Europe) Flight Mechanics Action Group 16 (FM-AG(16)) on Fault Tolerant Control (2004-2008) for the integrated evaluation of fault detection and identification (FDI) and reconfigurable flight control strategies. The benchmark includes a suitable set of assessment criteria and failure cases, based on reconstructed accident scenarios, to assess the potential of new adaptive control strategies to improve aircraft survivability. The application of reconstruction and modeling techniques, based on accident flight data, has resulted in high-fidelity nonlinear aircraft and fault models to evaluate new Fault Tolerant Flight Control (FTFC) concepts and their real-time performance to accommodate in-flight failures.

  18. 78 FR 30956 - Cruise Vessel Security and Safety Training Provider Certification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-23

    ..., practical demonstration, or simulation program. A detailed instructor manual must be submitted. Submissions... simulation programs to be used. If a simulator or simulation program is to be used, include technical... lessons and, if appropriate, for practical demonstrations or simulation exercises and assessments...

  19. PHISICS/RELAP5-3D RESULTS FOR EXERCISES II-1 AND II-2 OF THE OECD/NEA MHTGR-350 BENCHMARK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, Gerhard

    2016-03-01

    The Idaho National Laboratory (INL) Advanced Reactor Technologies (ART) High-Temperature Gas-Cooled Reactor (HTGR) Methods group currently leads the Modular High-Temperature Gas-Cooled Reactor (MHTGR) 350 benchmark. The benchmark consists of a set of lattice-depletion, steady-state, and transient problems that can be used by HTGR simulation groups to assess the performance of their code suites. The paper summarizes the results obtained for the first two transient exercises defined for Phase II of the benchmark. The Parallel and Highly Innovative Simulation for INL Code System (PHISICS), coupled with the INL system code RELAP5-3D, was used to generate the results for the Depressurized Conduction Cooldown (DCC) (exercise II-1a) and Pressurized Conduction Cooldown (PCC) (exercise II-2) transients. These exercises require the time-dependent simulation of coupled neutronics and thermal-hydraulics phenomena, and utilize the steady-state solution previously obtained for exercise I-3 of Phase I. This paper also includes a comparison of the benchmark results obtained with a traditional system code “ring” model against a more detailed “block” model that include kinetics feedback on an individual block level and thermal feedbacks on a triangular sub-mesh. The higher spatial fidelity that can be obtained by the block model is illustrated with comparisons of the maximum fuel temperatures, especially in the case of natural convection conditions that dominate the DCC and PCC events. Differences up to 125 K (or 10%) were observed between the ring and block model predictions of the DCC transient, mostly due to the block model’s capability of tracking individual block decay powers and more detailed helium flow distributions. In general, the block model only required DCC and PCC calculation times twice as long as the ring models, and it therefore seems that the additional development and calculation time required for the block model could be worth the gain that can be obtained in the spatial resolution.

  20. Simulation-based inter-professional education to improve attitudes towards collaborative practice: a prospective comparative pilot study in a Chinese medical centre

    PubMed Central

    Yang, Ling-Yu; Yang, Ying-Ying; Huang, Chia-Chang; Liang, Jen-Feng; Lee, Fa-Yauh; Cheng, Hao-Min; Huang, Chin-Chou; Kao, Shou-Yen

    2017-01-01

    Objectives Inter-professional education (IPE) builds inter-professional collaboration (IPC) attitude/skills of health professionals. This interventional IPE programme evaluates whether benchmarking sharing can successfully cultivate seed instructors responsible for improving their team members’ IPC attitudes. Design Prospective, pre-post comparative cross-sectional pilot study. Setting/participants Thirty-four physicians, 30 nurses and 24 pharmacists, who volunteered to be trained as seed instructors, participated in 3.5-hour preparation and 3.5-hour simulation courses. Then, participants (n=88) drew lots to decide 44 presenters, half of each profession, who needed to prepare IPC benchmarking and formed Group 1. The remaining participants formed Group 2 (regular). Facilitators rated the Group 1 participants’ degree of appropriate transfer and sustainable practice of the learnt IPC skills in the workplace according to successful IPC examples in their benchmarking sharing. Results For the three professions, improvement in IPC attitude was identified by sequential increases in the post-course (second month, T2) and end-of-study (third month, T3) Interdisciplinary Education Perception Scale (IEPS) and Attitudes Towards Healthcare Teams Scale (ATHCTS) scores, compared with pre-course (first month, T1) scores. By IEPS and ATHCTS-based assessment, the degree of sequential improvement in IPC attitude was found to be higher among nurses and pharmacists than among physicians. In benchmarking sharing, the facilitators’ agreement about the degree of participants’ appropriate transfer and sustainable practice of the learnt ‘communication and teamwork’ skills in the workplace was significantly higher among pharmacists and nurses than among physicians. The post-intervention random sampling survey (sixth month, Tpost) found that the IPC attitude of the three professions improved after on-site IPC skill promotion by new programme-trained seed instructors within teams. Conclusions Addition of benchmark sharing to a diamond-based IPE simulation programme enhances participants’ IPC attitudes, self-reflection, workplace transfer and practice of the learnt skills. Furthermore, IPC promotion within teams by newly trained seed instructors improved the IPC attitudes across all three professions. PMID:29122781

  1. Validation of Shielding Analysis Capability of SuperMC with SINBAD

    NASA Astrophysics Data System (ADS)

    Chen, Chaobin; Yang, Qi; Wu, Bin; Han, Yuncheng; Song, Jing

    2017-09-01

    The shielding analysis capability of SuperMC was validated with the Shielding Integral Benchmark Archive Database (SINBAD). SINBAD, compiled by RSICC and NEA, includes numerous benchmark experiments performed with the D-T fusion neutron source facilities of OKTAVIAN, FNS, IPPE, etc. The results from SuperMC simulations were compared with experimental data and MCNP results. Very good agreement, with deviations lower than 1%, was achieved, suggesting that SuperMC is reliable for shielding calculations.

  2. A resolved two-way coupled CFD/6-DOF approach for predicting embolus transport and the embolus-trapping efficiency of IVC filters.

    PubMed

    Aycock, Kenneth I; Campbell, Robert L; Manning, Keefe B; Craven, Brent A

    2017-06-01

    Inferior vena cava (IVC) filters are medical devices designed to provide a mechanical barrier to the passage of emboli from the deep veins of the legs to the heart and lungs. Despite decades of development and clinical use, IVC filters still fail to prevent the passage of all hazardous emboli. The objective of this study is to (1) develop a resolved two-way computational model of embolus transport, (2) provide verification and validation evidence for the model, and (3) demonstrate the ability of the model to predict the embolus-trapping efficiency of an IVC filter. Our model couples computational fluid dynamics simulations of blood flow to six-degree-of-freedom simulations of embolus transport and resolves the interactions between rigid, spherical emboli and the blood flow using an immersed boundary method. Following model development and numerical verification and validation of the computational approach against benchmark data from the literature, embolus transport simulations are performed in an idealized IVC geometry. Centered and tilted filter orientations are considered using a nonlinear finite element-based virtual filter placement procedure. A total of 2048 coupled CFD/6-DOF simulations are performed to predict the embolus-trapping statistics of the filter. The simulations predict that the embolus-trapping efficiency of the IVC filter increases with increasing embolus diameter and increasing embolus-to-blood density ratio. Tilted filter placement is found to decrease the embolus-trapping efficiency compared with centered filter placement. Multiple embolus-trapping locations are predicted for the IVC filter, and the trapping locations are predicted to shift upstream and toward the vessel wall with increasing embolus diameter. Simulations of the injection of successive emboli into the IVC are also performed and reveal that the embolus-trapping efficiency decreases with increasing thrombus load in the IVC filter. In future work, the computational tool could be used to investigate IVC filter design improvements, the effect of patient anatomy on embolus transport and IVC filter embolus-trapping efficiency, and, with further development and validation, optimal filter selection and placement on a patient-specific basis.

  3. Benchmark simulation model no 2: general protocol and exploratory case studies.

    PubMed

    Jeppsson, U; Pons, M-N; Nopens, I; Alex, J; Copp, J B; Gernaey, K V; Rosen, C; Steyer, J-P; Vanrolleghem, P A

    2007-01-01

    Over a decade ago, the concept of objectively evaluating the performance of control strategies by simulating them using a standard model implementation was introduced for activated sludge wastewater treatment plants. The resulting Benchmark Simulation Model No 1 (BSM1) has been the basis for a significant new development that is reported on here: Rather than only evaluating control strategies at the level of the activated sludge unit (bioreactors and secondary clarifier) the new BSM2 now allows the evaluation of control strategies at the level of the whole plant, including primary clarifier and sludge treatment with anaerobic sludge digestion. In this contribution, the decisions that have been made over the past three years regarding the models used within the BSM2 are presented and argued, with particular emphasis on the ADM1 description of the digester, the interfaces between activated sludge and digester models, the included temperature dependencies and the reject water storage. BSM2-implementations are now available in a wide range of simulation platforms and a ring test has verified their proper implementation, consistent with the BSM2 definition. This guarantees that users can focus on the control strategy evaluation rather than on modelling issues. Finally, for illustration, twelve simple operational strategies have been implemented in BSM2 and their performance evaluated. Results show that it is an interesting control engineering challenge to further improve the performance of the BSM2 plant (which is the whole idea behind benchmarking) and that integrated control (i.e. acting at different places in the whole plant) is certainly worthwhile to achieve overall improvement.

  4. Benchmarking urban flood models of varying complexity and scale using high resolution terrestrial LiDAR data

    NASA Astrophysics Data System (ADS)

    Fewtrell, Timothy J.; Duncan, Alastair; Sampson, Christopher C.; Neal, Jeffrey C.; Bates, Paul D.

    2011-01-01

    This paper describes benchmark testing of a diffusive and an inertial formulation of the de St. Venant equations implemented within the LISFLOOD-FP hydraulic model using high resolution terrestrial LiDAR data. The models are applied to a hypothetical flooding scenario in a section of Alcester, UK, which experienced significant surface water flooding in June and July 2007. The sensitivity of water elevation and velocity simulations to model formulation and grid resolution is analyzed. The differences in depth and velocity estimates between the diffusive and inertial approximations are within 10% of the simulated value but inertial effects persist at the wetting front in steep catchments. Both models portray a similar scale dependency between 50 cm and 5 m resolution which reiterates previous findings that errors in coarse scale topographic data sets are significantly larger than differences between numerical approximations. In particular, these results confirm the need to distinctly represent the camber and curbs of roads in the numerical grid when simulating surface water flooding events. Furthermore, although water depth estimates at grid scales coarser than 1 m appear robust, velocity estimates at these scales seem to be inconsistent compared to the 50 cm benchmark. The inertial formulation is shown to reduce computational cost by up to three orders of magnitude at high resolutions thus making simulations at this scale viable in practice compared to diffusive models. For the first time, this paper highlights the utility of high resolution terrestrial LiDAR data to inform small-scale flood risk management studies.
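
    The inertial formulation benchmarked here follows the explicit scheme with semi-implicit friction introduced for LISFLOOD-FP by Bates et al. (2010). A minimal 1D sketch of that update (the grid, roughness value and boundary handling below are simplified assumptions):

      import numpy as np

      g, n_mann = 9.81, 0.035          # gravity, Manning roughness (assumed)
      dx, dt = 5.0, 0.5                # 5 m cells, 0.5 s step (CFL-safe here)
      z = np.linspace(10.0, 9.0, 200)  # bed elevation (m), gentle slope
      h = np.zeros(200); h[0] = 0.5    # fixed-depth inflow boundary
      q = np.zeros(199)                # unit discharge at cell faces (m^2/s)

      for step in range(2000):
          eta = z + h
          hflow = np.maximum(eta[:-1], eta[1:]) - np.maximum(z[:-1], z[1:])
          hflow = np.maximum(hflow, 0.0)
          slope = (eta[1:] - eta[:-1]) / dx
          wet = hflow > 1e-6
          # semi-implicit friction term keeps the explicit update stable
          q[wet] = (q[wet] - g * hflow[wet] * dt * slope[wet]) / (
              1.0 + g * dt * n_mann**2 * np.abs(q[wet]) / hflow[wet] ** (7 / 3))
          q[~wet] = 0.0
          h[1:-1] += dt / dx * (q[:-1] - q[1:])   # mass conservation
          h[-1] = 0.0                              # crude open outfall
          h[0] = 0.5                               # hold inflow depth

      print("wetted cells:", int((h > 0.01).sum()))

    Treating friction implicitly is what removes the severe time-step restriction of a fully explicit friction term, which is the source of the order-of-magnitude speed-ups reported above.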

  5. An imaging-based stochastic model for simulation of tumour vasculature

    NASA Astrophysics Data System (ADS)

    Adhikarla, Vikram; Jeraj, Robert

    2012-10-01

    A mathematical model which reconstructs the structure of existing vasculature using patient-specific anatomical, functional and molecular imaging as input was developed. The vessel structure is modelled according to empirical vascular parameters, such as the mean vessel branching angle. The model is calibrated such that the resultant oxygen map modelled from the simulated microvasculature stochastically matches the input oxygen map to a high degree of accuracy (R2 ≈ 1). The calibrated model was successfully applied to preclinical imaging data. Starting from the anatomical vasculature image (obtained from contrast-enhanced computed tomography), a representative map of the complete vasculature was stochastically simulated as determined by the oxygen map (obtained from hypoxia [64Cu]Cu-ATSM positron emission tomography). The simulated microscopic vasculature and the calculated oxygenation map successfully represent the imaged hypoxia distribution (R2 = 0.94). The model elicits the parameters required to simulate vasculature consistent with imaging and provides a key mathematical relationship relating the vessel volume to the tissue oxygen tension. Apart from providing an excellent framework for visualizing the imaging gap between the microscopic and macroscopic imagings, the model has the potential to be extended as a tool to study the dynamics between the tumour and the vasculature in a patient-specific manner and has an application in the simulation of anti-angiogenic therapies.

  6. Developing a molecular dynamics force field for both folded and disordered protein states.

    PubMed

    Robustelli, Paul; Piana, Stefano; Shaw, David E

    2018-05-07

    Molecular dynamics (MD) simulation is a valuable tool for characterizing the structural dynamics of folded proteins and should be similarly applicable to disordered proteins and proteins with both folded and disordered regions. It has been unclear, however, whether any physical model (force field) used in MD simulations accurately describes both folded and disordered proteins. Here, we select a benchmark set of 21 systems, including folded and disordered proteins, simulate these systems with six state-of-the-art force fields, and compare the results to over 9,000 available experimental data points. We find that none of the tested force fields simultaneously provided accurate descriptions of folded proteins, of the dimensions of disordered proteins, and of the secondary structure propensities of disordered proteins. Guided by simulation results on a subset of our benchmark, however, we modified parameters of one force field, achieving excellent agreement with experiment for disordered proteins, while maintaining state-of-the-art accuracy for folded proteins. The resulting force field, a99SB-disp, should thus greatly expand the range of biological systems amenable to MD simulation. A similar approach could be taken to improve other force fields. Copyright © 2018 the Author(s). Published by PNAS.

  7. Analysis of a Neutronic Experiment on a Simulated Mercury Spallation Neutron Target Assembly Bombarded by Giga-Electron-Volt Protons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maekawa, Fujio; Meigo, Shin-ichiro; Kasugai, Yoshimi

    2005-05-15

    A neutronic benchmark experiment on a simulated spallation neutron target assembly was conducted by using the Alternating Gradient Synchrotron at Brookhaven National Laboratory and was analyzed to investigate the prediction capability of Monte Carlo simulation codes used in neutronic designs of spallation neutron sources. The target assembly consisting of a mercury target, a light water moderator, and a lead reflector was bombarded by 1.94-, 12-, and 24-GeV protons, and the fast neutron flux distributions around the target and the spectra of thermal neutrons leaking from the moderator were measured in the experiment. In this study, the Monte Carlo particle transport simulation codes NMTC/JAM, MCNPX, and MCNP-4A with associated cross-section data in JENDL and LA-150 were verified based on benchmark analysis of the experiment. As a result, all the calculations predicted the measured quantities adequately; calculated integral fluxes of fast and thermal neutrons agreed approximately within ±40% with the experiments although the overall energy range encompassed more than 12 orders of magnitude. Accordingly, it was concluded that these simulation codes and cross-section data were adequate for neutronics designs of spallation neutron sources.

  8. Benchmarking Model Variants in Development of a Hardware-in-the-Loop Simulation System

    NASA Technical Reports Server (NTRS)

    Aretskin-Hariton, Eliot D.; Zinnecker, Alicia M.; Kratz, Jonathan L.; Culley, Dennis E.; Thomas, George L.

    2016-01-01

    Distributed engine control architecture presents a significant increase in complexity over traditional implementations when viewed from the perspective of system simulation and hardware design and test. Even if the overall function of the control scheme remains the same, the hardware implementation can have a significant effect on the overall system performance due to differences in the creation and flow of data between control elements. A Hardware-in-the-Loop (HIL) simulation system is under development at NASA Glenn Research Center that enables the exploration of these hardware dependent issues. The system is based on, but not limited to, the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k). This paper describes the step-by-step conversion from the self-contained baseline model to the hardware in the loop model, and the validation of each step. As the control model hardware fidelity was improved during HIL system development, benchmarking simulations were performed to verify that engine system performance characteristics remained the same. The results demonstrate the goal of the effort; the new HIL configurations have similar functionality and performance compared to the baseline C-MAPSS40k system.

  9. A new formation control of multiple underactuated surface vessels

    NASA Astrophysics Data System (ADS)

    Xie, Wenjing; Ma, Baoli; Fernando, Tyrone; Iu, Herbert Ho-Ching

    2018-05-01

    This work investigates a new formation control problem for multiple underactuated surface vessels. The controller design is based on an input-output linearisation technique, graph theory, consensus ideas and some nonlinear tools. The proposed smooth time-varying distributed control law guarantees that the multiple underactuated surface vessels globally exponentially converge to a desired geometric shape, centred in particular at the initial average position of the vessels. Furthermore, the stability analysis of the zero dynamics proves that the orientations of the vessels tend to constants that depend on the initial values, and that the velocities and control inputs of the vessels decay to zero. All the results are obtained under communication scenarios of a static directed balanced graph with a spanning tree. Effectiveness of the proposed distributed control scheme is demonstrated using a simulation example.
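
    The flavour of such a distributed law can be conveyed with a much-simplified sketch: single-integrator agents (the paper's underactuated vessel dynamics and input-output linearisation are not modelled here) run Laplacian consensus on formation-offset coordinates over a directed balanced ring, which preserves the initial average position, matching the centring property claimed above. All gains and the formation shape are illustrative choices.

      import numpy as np

      # directed balanced graph with a spanning tree: ring 0<-1<-2<-3<-0
      A = np.zeros((4, 4))
      for i in range(4):
          A[i, (i + 1) % 4] = 1.0          # agent i listens to agent i+1

      # desired square formation offsets (zero mean)
      delta = np.array([[1, 1], [-1, 1], [-1, -1], [1, -1]], float)

      rng = np.random.default_rng(4)
      x = rng.uniform(-5, 5, (4, 2))       # initial positions
      x0_mean = x.mean(axis=0)

      dt = 0.01
      for _ in range(3000):
          err = x - delta                  # positions in formation coordinates
          u = -(A.sum(1)[:, None] * err - A @ err)   # u = -L @ err
          x = x + dt * u                   # single-integrator kinematics

      print("final offsets from centre:", (x - x.mean(axis=0)).round(2))
      print("centre drift:", np.linalg.norm(x.mean(0) - x0_mean).round(4))

    Because the graph is balanced, the row and column sums of the Laplacian vanish, so the group centroid is invariant and the formation settles around the agents' initial average position.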

  10. Effect of physical variables on capture of magnetic nanoparticles in simulated blood vessels

    NASA Astrophysics Data System (ADS)

    Zhang, Minghui; Brazel, Christopher

    2011-11-01

    This study investigated how the percent capture of magnetic nanoparticles in a simulated vessel varies with physical variables. Magnetic nanoparticles (MNPs) can be used as part of therapeutic or diagnostic materials for cancer patients. By capturing these particles with a magnetic field, they can be concentrated in an area of diseased tissue. In this study, flow of nanoparticles in simulated blood vessels was used to determine the effect of applying an external magnetic field. Maghemite nanoparticles served as the MNPs, with either water or fetal bovine serum as the carrier fluid, and capture data were collected by UV-Vis spectroscopy. The percent capture of MNPs was positively influenced by six physical variables: larger vessel diameter, lower linear flow velocity, higher magnetic field strength, better dispersion, lower MNP concentration, and lower protein content in the fluid. Free MNPs were also compared to micelles, with the free particles showing more successful magnetic capture. Four factors contributed to these trends: the strength of the magnetic field's influence on the MNPs, the MNPs' interactions with other particles and the fluid, the momentum of the nanoparticles, and the magnetic-mass-to-total-mass ratio of the flowing particles. Funded by NSF REU Site #1062611.

  11. Interactive Physical Simulation of Catheter Motion within Major Vessel Structures and Cavities for ASD/VSD Treatment

    NASA Astrophysics Data System (ADS)

    Becherer, Nico; Hesser, Jürgen; Kornmesser, Ulrike; Schranz, Dietmar; Männer, Reinhard

    2007-03-01

    Simulation systems are becoming increasingly essential in medical education. Capturing the physical behaviour of the real world requires sophisticated modelling of instruments within the virtual environment. Most models currently in use are not capable of user-interactive simulation because of the cost of computing the complex underlying analytical equations. Alternatives are often based on simplifying mass-spring systems, which deliver high update rates at the cost of less realistic motion. In addition, most techniques are limited to narrow, tubular vessel structures or restrict shape alterations to two degrees of freedom, not allowing instrument deformations such as torsion. In contrast, our approach combines high update rates with highly realistic motion and can, in addition, be applied to arbitrary structures such as vessels or cavities (e.g. atrium, ventricle) without limiting the degrees of freedom. The method is based on energy minimization: bending energies and vessel structures are treated as linear elastic elements, and energies are evaluated at regularly spaced points on the instrument whose spacing is fixed, i.e. we simulate an articulated structure of joints with fixed connections between them. Arbitrary tissue structures are modeled through adaptive distance fields and are connected by nodes via an undirected graph system. The instrument points are linked to nodes by a system of rules. Energy minimization uses a quasi-Newton method without preconditioning, with gradients estimated using a combination of analytical and numerical terms. Results show high-quality motion simulation when compared to a phantom model. The approach is also robust and fast: simulating an instrument with 100 joints runs at 100 Hz on a 3 GHz PC.
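
    As a rough illustration of the energy-minimisation idea, the sketch below minimises a discrete bending energy over an articulated chain of fixed-length segments with a quasi-Newton (BFGS) method and numerically estimated gradients. The planar geometry, stiffness, and tip target are assumptions for the sketch; the paper's 3D model, adaptive distance fields, and rule-based instrument-tissue coupling are omitted.

    ```python
    # Articulated-chain energy minimisation sketch (assumptions: planar chain,
    # fixed segment lengths enforced by the joint-angle parametrisation,
    # hypothetical tip target standing in for instrument-tissue constraints).
    import numpy as np
    from scipy.optimize import minimize

    n, seg, kb = 20, 1.0, 1.0        # joints, segment length, bending stiffness
    target = np.array([10.0, 6.0])   # hypothetical tip goal

    def positions(theta):
        ang = np.cumsum(theta)                                # absolute angles
        steps = seg * np.stack([np.cos(ang), np.sin(ang)], axis=1)
        return np.vstack([[0.0, 0.0], np.cumsum(steps, axis=0)])

    def energy(theta):
        bend = 0.5 * kb * np.sum(np.diff(theta) ** 2)   # linear elastic joints
        tip = positions(theta)[-1]
        return bend + 50.0 * np.sum((tip - target) ** 2)  # soft tip constraint

    # Quasi-Newton minimisation; gradients are estimated numerically by
    # default, loosely mirroring the gradient-based scheme in the abstract.
    res = minimize(energy, np.zeros(n), method="BFGS")
    print("final energy:", res.fun, "tip:", positions(res.x)[-1])
    ```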

  12. Using Machine Learning to Predict MCNP Bias

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grechanuk, Pavel Aleksandrovi

    For many real-world applications in radiation transport where simulations are compared to experimental measurements, as in nuclear criticality safety, the bias (simulated minus experimental keff) in the calculation is an extremely important quantity used for code validation. The objective of this project is to accurately predict the bias of MCNP6 [1] criticality calculations using machine learning (ML) algorithms, with the intention of creating a tool that can complement current nuclear criticality safety methods. In the latest release of MCNP6, the Whisper tool is available for criticality safety analysts and includes a large catalogue of experimental benchmarks, sensitivity profiles, and nuclear data covariance matrices. This data, coming from 1100+ benchmark cases, is used in this study of ML algorithms for criticality safety bias predictions.
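
    A hedged sketch of the general workflow follows: fit a regression model to benchmark features and observed biases, then estimate predictive error by cross-validation. The synthetic features and gradient-boosting model are stand-ins; the study's actual inputs (Whisper sensitivity profiles and covariance data) and algorithm choices may differ.

    ```python
    # Illustrative bias-regression sketch (synthetic data; NOT the study's
    # Whisper-derived features). X mimics per-benchmark sensitivity features,
    # y mimics the keff bias to be predicted.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1100, 30))                  # ~1100 benchmark cases
    w = rng.normal(size=30) * 1e-3                   # hidden linear structure
    y = X @ w + rng.normal(scale=2e-4, size=1100)    # synthetic bias (dk-eff)

    model = GradientBoostingRegressor()
    mae = -cross_val_score(model, X, y, cv=5,
                           scoring="neg_mean_absolute_error").mean()
    print(f"cross-validated |bias error| ~ {mae:.2e} (synthetic data)")
    ```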

  13. Numerical simulation of air distribution in a room with a sidewall jet under benchmark test conditions

    NASA Astrophysics Data System (ADS)

    Zasimova, Marina; Ivanov, Nikolay

    2018-05-01

    The goal of the study is to validate Large Eddy Simulation (LES) data on mixing ventilation in an isothermal room under the conditions of the benchmark experiments by Hurnik et al. (2015). The focus is on the accuracy of the predicted mean and rms velocity fields in the quasi-free jet zone of a room with a 3D jet supplied from a sidewall rectangular diffuser. Calculations were carried out using the ANSYS Fluent 16.2 software with an algebraic wall-modeled LES subgrid-scale model. CFD results for the mean velocity vector are compared with Laser Doppler Anemometry data. The difference between the mean velocity vector and the mean air speed in the jet zone, both LES-computed, is presented and discussed.

  14. Application of CFX-10 to the Investigation of RPV Coolant Mixing in VVER Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moretti, Fabio; Melideo, Daniele; Terzuoli, Fulvio

    2006-07-01

    Coolant mixing phenomena occurring in the pressure vessel of a nuclear reactor constitute one of the main objectives of investigation for researchers concerned with nuclear reactor safety. For instance, mixing plays a relevant role in reactivity-induced accidents initiated by de-boration or boron dilution events followed by transport of a de-borated slug into the vessel of a pressurized water reactor. Another example is temperature mixing, which may significantly affect the consequences of a pressurized thermal shock scenario. Predictive analysis of mixing phenomena is strongly improved by the availability of computational tools able to cope with the inherent three-dimensionality of such problems, such as system codes with three-dimensional capabilities and Computational Fluid Dynamics (CFD) codes. The present paper deals with numerical analyses of coolant mixing in the reactor pressure vessel of a VVER-1000 reactor, performed with the ANSYS CFX-10 CFD code. In particular, the 'swirl' effect that has been observed in the downcomer of this kind of reactor has been addressed, with the aim of assessing the capability of the code to predict that effect and of understanding the reasons for its occurrence. Results have been compared against experimental data from the V1000CT-2 Benchmark. Moreover, a boron mixing problem has been investigated, under the hypothesis that a de-borated slug, transported by natural circulation, enters the vessel. Sensitivity analyses have been conducted on some geometrical features, model parameters and boundary conditions. (authors)

  15. Spiking neural network simulation: memory-optimal synaptic event scheduling.

    PubMed

    Stewart, Robert D; Gurney, Kevin N

    2011-06-01

    Spiking neural network simulations incorporating variable transmission delays require synaptic events to be scheduled prior to delivery. Conventional methods have memory requirements that scale with the total number of synapses in a network. We introduce novel scheduling algorithms for both discrete and continuous event delivery, where the memory requirement scales instead with the number of neurons. Superior algorithmic performance is demonstrated using large-scale, benchmarking network simulations.
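
    One way to obtain neuron-scaled scheduling memory, sketched below under assumptions of discrete time steps and integer delays, is a per-neuron spike-history ring buffer that postsynaptic targets read back at their own delays, instead of a per-synapse event queue. This illustrates the scaling idea only; the paper's actual algorithms, including the continuous-delivery variant, are not reproduced.

    ```python
    # Neuron-indexed spike scheduling sketch: scheduling memory is O(N * D)
    # (neurons x max delay), independent of the number of synapses or queued
    # events. Assumptions: discrete time, integer delays, toy dynamics.
    import numpy as np

    N, D, T = 200, 16, 100                    # neurons, delay horizon, steps
    history = np.zeros((D, N), dtype=bool)    # circular per-neuron spike log

    rng = np.random.default_rng(1)
    conn = rng.random((N, N)) < 0.1           # synapse j -> i exists
    delay = rng.integers(1, D, size=(N, N))   # per-synapse integer delay
    w = rng.normal(size=(N, N)) * conn        # synaptic weights

    for t in range(T):
        # Delivery: synapse (j -> i) is active if j spiked delay[j, i] steps
        # ago; arrived[j, i] = history[(t - delay[j, i]) % D, j].
        arrived = history[(t - delay) % D, np.arange(N)[:, None]]
        drive = (w * arrived).sum(axis=0)
        fired = drive + rng.normal(size=N) > 2.0      # toy spiking rule
        history[t % D] = fired                        # overwrite oldest slot
    ```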

  16. Effects of amlodipine and adenosine on coronary haemodynamics: in vivo study and numerical simulation.

    PubMed

    De Lazzari, Claudio; L'Abbate, Antonio; Micalizzi, Mauro; Trivella, Maria Giovanna; Neglia, Danilo

    2014-11-01

    Amlodipine (AMLO) is a calcium channel blocker with vasodilating properties whose specific effects on the coronary circulation are not fully known. Coronary flow velocity-pressure (F/P) curves were obtained at rest and during administration of AMLO (10 mg to 20 mg iv) or adenosine (ADO, 1 mg ic) in 10 normal subjects (six women, age 48 ± 14 years). F/P curves were reproduced in a numerical simulator of the systemic and coronary circulations (CARDIOSIM©) by adjusting coronary resistance (vessels > or < 100 μm in diameter) and the extravascular resistance applied to smaller vessels at the endocardial (ENDO), middle and epicardial (EPI) myocardial layers. The best match of in silico to in vivo curves was achieved by a trial-and-error approach. ADO induced 170% and 250% increases in coronary flow velocity (CFV) and F/P diastolic slope, compared with the 80% and 25-30% increases induced by AMLO, respectively. In the cardiovascular model, AMLO effects were predicted by progressive reduction of >100 μm vessel resistance from EPI to ENDO. ADO effects were mimicked by reducing the resistance of both >100 μm and <100 μm vessels, progressively from EPI to ENDO in the latter. An additional reduction in extravascular resistance made it unnecessary to impose a transmural gradient of vasodilating effect for either drug. Numerical simulation predicts vasodilating effects of AMLO mainly on larger arteries and of ADO on both >100 μm and <100 μm vessels. In vivo F/P loops could be completely reproduced in silico by adding an extravascular resistance reduction for both drugs. The numerical simulator is a useful tool for exploring the coronary effects of cardioactive drugs.

  17. Sensitivity Analysis of OECD Benchmark Tests in BISON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Gamble, Kyle; Schmidt, Rodney C.

    2015-09-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on sensitivity analysis of a fuels performance benchmark problem. The benchmark problem was defined by the Uncertainty Analysis in Modeling working group of the Nuclear Science Committee, part of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development (OECD). The benchmark problem involved steady-state behavior of a fuel pin in a Pressurized Water Reactor (PWR). The problem was created in the BISON Fuels Performance code. Dakota was used to generate and analyze 300 samples of 17 input parameters defining core boundary conditions, manufacturing tolerances, and fuel properties. There were 24 responses of interest, including fuel centerline temperatures at a variety of locations and burnup levels, fission gas released, axial elongation of the fuel pin, etc. Pearson and Spearman correlation coefficients and Sobol' variance-based indices were used to perform the sensitivity analysis. This report summarizes the process and presents results from this study.
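
    The correlation step is straightforward to reproduce in outline. The sketch below computes Pearson and Spearman coefficients for sampled inputs against a response, with synthetic data and illustrative parameter names standing in for the 300 Dakota-generated BISON samples; Sobol' index estimation is omitted.

    ```python
    # Correlation-based sensitivity sketch (synthetic stand-ins for the 300
    # sampled BISON runs; parameter names and the toy response are assumed).
    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    rng = np.random.default_rng(42)
    n = 300
    inputs = {
        "gap_thickness": rng.normal(1.0, 0.05, n),
        "fuel_conductivity": rng.normal(1.0, 0.10, n),
        "linear_power": rng.normal(1.0, 0.03, n),
    }
    # Toy response standing in for a normalised fuel centerline temperature:
    # thicker gap and higher power raise it, better conductivity lowers it.
    response = (1.5 * inputs["gap_thickness"]
                - 2.0 * inputs["fuel_conductivity"]
                + 0.5 * inputs["linear_power"]
                + rng.normal(0.0, 0.05, n))

    for name, x in inputs.items():
        r = pearsonr(x, response)[0]
        rho = spearmanr(x, response)[0]
        print(f"{name:18s} Pearson={r:+.2f}  Spearman={rho:+.2f}")
    ```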

  18. Influence of cerebral blood vessel movements on the position of perivascular synapses

    PubMed Central

    DeFelipe, Javier

    2017-01-01

    Synaptic activity is regulated and limited by blood flow, which is controlled by blood vessel dilation and contraction. Traditionally, the study of neurovascular coupling has mainly focused on energy consumption and oxygen delivery. However, the mechanical changes that blood vessel movements induce in the surrounding tissue have not been considered. We have modeled the mechanical changes that movements of blood vessels cause in neighboring synapses. Our simulations indicate that synaptic densities increase or decrease during vascular dilation and contraction, respectively, near the blood vessel walls. This phenomenon may alter the concentration of neurotransmitters and vasoactive substances in the immediate vicinity of the vessel wall and thus may have an influence on local blood flow. PMID:28199396

  19. A wind energy benchmark for ABL modelling of a diurnal cycle with a nocturnal low-level jet: GABLS3 revisited

    DOE PAGES

    Rodrigo, J. Sanz; Churchfield, M.; Kosović, B.

    2016-10-03

    The third GEWEX Atmospheric Boundary Layer Studies (GABLS3) model intercomparison study, around the Cabauw met tower in the Netherlands, is revisited as a benchmark for wind energy atmospheric boundary layer (ABL) models. The case was originally developed by the boundary layer meteorology community, interested in analysing the performance of single-column and large-eddy simulation atmospheric models dealing with a diurnal cycle leading to the development of a nocturnal low-level jet. The case addresses fundamental questions related to the definition of the large-scale forcing, the interaction of the ABL with the surface and the evaluation of model results with observations. The characterization of mesoscale forcing for asynchronous microscale modelling of the ABL is discussed based on momentum budget analysis of WRF simulations. Then a single-column model is used to demonstrate the added value of incorporating different forcing mechanisms in microscale models. The simulations are evaluated in terms of wind energy quantities of interest.

  20. FLUKA Monte Carlo simulations and benchmark measurements for the LHC beam loss monitors

    NASA Astrophysics Data System (ADS)

    Sarchiapone, L.; Brugger, M.; Dehning, B.; Kramer, D.; Stockner, M.; Vlachoudis, V.

    2007-10-01

    One of the crucial elements of machine protection for CERN's Large Hadron Collider (LHC) is its beam loss monitoring (BLM) system. On-line loss measurements must prevent the superconducting magnets from quenching and protect the machine components from damage due to unforeseen critical beam losses. In order to ensure the BLM system's design quality, detailed FLUKA Monte Carlo simulations were performed for the betatron collimation insertion in the final design phase of the LHC. In addition, benchmark measurements were carried out with LHC-type BLMs installed at the CERN-EU high-energy Reference Field facility (CERF). This paper presents results of FLUKA calculations performed for BLMs installed in the collimation region, compares the results of the CERF measurements with FLUKA simulations and evaluates the related uncertainties. This, together with the fact that the CERF source spectra at the respective BLM locations are comparable with those at the LHC, allows assessing the sensitivity of the performed LHC design studies.

  1. Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schulz, Roland; Lindner, Benjamin; Petridis, Loukas

    2009-01-01

    A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three-million- and five-million-atom biological systems scale well up to 30k cores, producing 30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.

  2. Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer.

    PubMed

    Schulz, Roland; Lindner, Benjamin; Petridis, Loukas; Smith, Jeremy C

    2009-10-13

    A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three million- and five million-atom biological systems scale well up to ∼30k cores, producing ∼30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.

  3. Numerical modeling of fluid-structure interaction in arteries with anisotropic polyconvex hyperelastic and anisotropic viscoelastic material models at finite strains.

    PubMed

    Balzani, Daniel; Deparis, Simone; Fausten, Simon; Forti, Davide; Heinlein, Alexander; Klawonn, Axel; Quarteroni, Alfio; Rheinbach, Oliver; Schröder, Joerg

    2016-10-01

    The accurate prediction of transmural stresses in arterial walls requires, on the one hand, robust and efficient numerical schemes for the solution of boundary value problems including fluid-structure interactions and, on the other hand, the use of a material model for the vessel wall that is able to capture the relevant features of the material behavior. One of the main contributions of this paper is the application of a highly nonlinear, polyconvex anisotropic structural model for the solid in the context of fluid-structure interaction, together with a suitable discretization. Additionally, the influence of viscoelasticity is investigated. The fluid-structure interaction problem is solved using a monolithic approach; that is, the nonlinear system is solved (after time and space discretizations) as a whole without splitting among its components. The linearized block systems are solved iteratively using parallel domain decomposition preconditioners. A simple, but nonsymmetric, curved geometry is proposed and demonstrated to be suitable as a benchmark testbed for fluid-structure interaction simulations in biomechanics where nonlinear structural models are used. Based on the curved benchmark geometry, the influence of different material models, spatial discretizations, and meshes of varying refinement is investigated. It turns out that often-used standard displacement elements with linear shape functions are not sufficient to provide good approximations of the arterial wall stresses, whereas for standard displacement elements or F-bar formulations with quadratic shape functions, suitable results are obtained. For the time discretization, a second-order backward differentiation formula scheme is used. It is shown that the curved geometry enables the analysis of non-rotationally symmetric distributions of the mechanical fields. For instance, the maximal shear stresses in the fluid-structure interface are found to be higher in the inner curve, which corresponds to clinical observations indicating a high plaque nucleation probability at such locations. Copyright © 2015 John Wiley & Sons, Ltd.

  4. Dose reduction potential of iterative reconstruction algorithms in neck CTA-a simulation study.

    PubMed

    Ellmann, Stephan; Kammerer, Ferdinand; Allmendinger, Thomas; Brand, Michael; Janka, Rolf; Hammon, Matthias; Lell, Michael M; Uder, Michael; Kramer, Manuel

    2016-10-01

    This study aimed to determine the degree of radiation dose reduction achievable in neck CT angiography (CTA) with sinogram-affirmed iterative reconstruction (SAFIRE) algorithms. 10 consecutive patients scheduled for neck CTA were included in this study. CTA images of the external carotid arteries were either reconstructed with filtered back projection (FBP) at the full radiation dose level or underwent simulated dose reduction by proprietary reconstruction software. The dose-reduced images were reconstructed using either SAFIRE 3 or SAFIRE 5 and compared with full-dose FBP images in terms of vessel definition. 5 observers performed a total of 3000 pairwise comparisons. SAFIRE allowed substantial radiation dose reductions in neck CTA while maintaining vessel definition. The achievable radiation dose reduction ranged from approximately 34% to approximately 90% and depended on the SAFIRE algorithm strength and the size of the vessel of interest. In general, larger vessels permitted higher degrees of radiation dose reduction, especially with higher SAFIRE strength levels. With small vessels, the superiority of SAFIRE 5 over SAFIRE 3 was lost. Neck CTA can be performed with substantially less radiation dose when SAFIRE is applied. The exact degree of radiation dose reduction should be adapted to the clinical question, in particular to the smallest vessel needing excellent definition.

  5. Ray tracing analysis of overlapping objects in refraction contrast imaging.

    PubMed

    Hirano, Masatsugu; Yamasaki, Katsuhito; Okada, Hiroshi; Sakurai, Takashi; Kondoh, Takeshi; Katafuchi, Tetsuro; Sugimura, Kazuro; Kitazawa, Sohei; Kitazawa, Riko; Maeda, Sakan; Tamura, Shinichi

    2005-08-01

    We simulated refraction contrast imaging of overlapping objects using the ray tracing method. The simplest case, in which two columnar objects (blood vessels) with a density of 1.0 g/cm³ run at right angles in air, was calculated. Absorption was included, and Snell's law was applied at the object boundaries. A pair of bright and dark spots results from the interference of refracted X-rays where the blood vessels cross. This has the potential to increase the visibility of the image.

  6. Comparative cath-lab assessment of coronary stenosis by radiology technician, junior and senior interventional cardiologist in patients treated with coronary angioplasty.

    PubMed

    Brunetti, Natale Daniele; Delli Carri, Felice; Ruggiero, Maria Assunta; Cuculo, Andrea; Ruggiero, Antonio; Ziccardi, Luigi; De Gennaro, Luisa; Di Biase, Matteo

    2014-03-01

    Exact quantification of plaque extension during coronary angioplasty (PCI) usually falls to the interventional cardiologist (IC). Quantitative coronary stenosis assessment (QCA) could possibly be delegated to the radiology technician (RT), who usually supports the cath-lab nurse and IC during PCI. We therefore sought to investigate the reliability of QCA performed by an RT in comparison with ICs. Forty-four consecutive patients with acute coronary syndrome underwent PCI; target coronary vessel size beneath the target coronary lesion (S) and target coronary lesion length (L) were assessed by the RT, a junior IC (JIC), and a senior IC (SIC) and then compared. The SIC evaluation, which determined the final stent selection for coronary stenting, was considered the reference benchmark. RT performance with QCA support in assessing target vessel size and target lesion length was not significantly different from the SIC (r = 0.46, p < 0.01; r = 0.64, p < 0.001, respectively) or the JIC (r = 0.79, r = 0.75, p < 0.001, respectively). JIC performance was significantly better than the RT in assessing target vessel size (p < 0.05), but not significantly different when assessing target lesion length. An RT may reliably assess the target lesion by using adequate QCA software in the cath-lab during PCI; RT performance does not differ from the SIC.

  7. Implications of Upwells as Hydrodynamic Jets in a Pulse Jet Mixed System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pease, Leonard F.; Bamberger, Judith A.; Minette, Michael J.

    2015-08-01

    This report evaluates the physics of the upwell flow in pulse jet mixed systems in the Hanford Tank Waste Treatment and Immobilization Plant (WTP). Although the initial downward flow and radial flow from pulse jet mixers (PJMs) have been analyzed in some detail, the upwells have received considerably less attention despite having significant implications for vessel mixing. Do the upwells behave like jets? How do the upwells scale? When will the central upwell break through? What proportion of the vessel is blended by the upwells themselves? Indeed, how the physics of the central upwell is affected by multiple PJMs (e.g., six in the proposed mixing vessels), non-Newtonian rheology, and significant multicomponent solids loadings remains unexplored. The central upwell must satisfy several criteria to be considered a free jet. First, it must travel for several diameters in a nearly constant direction. Second, its velocity must decay with the inverse of elevation. Third, it should have an approximately Gaussian profile. Fourth, the influence of surface or body forces must be negligible. A combination of historical data in a 12.75 ft test vessel, newly analyzed data from the 8 ft test vessel, and conservation of momentum arguments derived specifically for PJM operating conditions demonstrates that the central upwell satisfies these criteria where vigorous breakthrough is achieved. An essential feature of scaling from one vessel to the next is the requirement that the underlying physics does not change adversely. One may have confidence in scaling if (1) correlations and formulas capture the relevant physics; (2) the underlying physics does not change from the conditions under which it was developed to the conditions of interest; (3) all factors relevant to scaling have been incorporated, including flow, material, and geometric considerations; and (4) the uncertainty in the relationships is sufficiently narrow to meet required specifications. Although the central upwell satisfies these criteria when vigorous breakthrough is achieved, not all available data follow the free jet profile for the central upwell, particularly at lower nozzle velocities. Alternative flow regimes are considered, and new models for cloud height, “cavern height,” and the rate of jet penetration (jet celerity) are benchmarked against data to anchor the scaling analyses. This analytical modeling effort to provide a technical basis for scaling PJM mixed vessels has significant implications for vessel mixing, because jet physics underlies “cavern” height, cloud height, and volume-of-mixing considerations. A new four-parameter cloud height model compares favorably to experimental results. This model is predictive of breakthrough in 8 ft vessel tests with the two-part simulant. Analysis of the upwell in the presence of yield stresses finds evidence of expanding turbulent jets, confined turbulent jets, and confined laminar flows. For each, the critical elevation at which jet momentum depletes is predicted; these predictions compare favorably to experimental cavern height data. Partially coupled momentum and energy balances suggest that these are limiting cases of a gradual transition from a turbulent expanding flow to a confined laminar flow. Consideration of jet celerity shows that the rate of jet penetration is a governing consideration in breakthrough to the surface. Estimates of the volume of mixing are presented. This analysis shows that flow along the vessel wall is sluggish, such that the central upwell governs the volume of mixing. This analysis of the central upwell alone lays essential groundwork for complete analysis of mode three mixing (i.e., breakthrough with slow peripheral mixing) and for estimates of hydrogen release rates from first principles.

  8. SU-E-J-30: Benchmark Image-Based TCP Calculation for Evaluation of PTV Margins for Lung SBRT Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, M; Chetty, I; Zhong, H

    2014-06-01

    Purpose: Tumor control probability (TCP) calculated with accumulated radiation doses may help design appropriate treatment margins. Image registration errors, however, may compromise the calculated TCP. The purpose of this study is to develop benchmark CT images to quantify registration-induced errors in the accumulated doses and their corresponding TCP. Methods: 4DCT images were registered from end-inhale (EI) to end-exhale (EE) using a “demons” algorithm. The demons DVFs were corrected by an FEM model to obtain realistic deformation fields. The FEM DVFs were used to warp the EI images to create the FEM-simulated images. The two images combined with the FEM DVF formed a benchmark model. Maximum intensity projection (MIP) images, created from the EI and simulated images, were used to develop IMRT plans. Two plans with 3 and 5 mm margins were developed for each patient. With these plans, radiation doses were recalculated on the simulated images and warped back to the EI images using the FEM DVFs to obtain the accumulated doses. The Elastix software was used to register the FEM-simulated images to the EI images. TCPs calculated with the Elastix-accumulated doses were compared with those generated by the FEM to quantify the TCP error of the Elastix registrations. Results: For six lung patients, the mean Elastix registration error ranged from 0.93 to 1.98 mm. The relative dose errors in the PTV were between 0.28% and 6.8% for 3 mm margin plans, and between 0.29% and 6.3% for 5 mm margin plans. As the PTV margin was reduced from 5 to 3 mm, the mean TCP error of the Elastix-reconstructed doses increased from 2.0% to 2.9%, and the mean NTCP error decreased from 1.2% to 1.1%. Conclusion: Patient-specific benchmark images can be used to evaluate the impact of registration errors on the computed TCPs, and may help select appropriate PTV margins for lung SBRT patients.
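
    For orientation, one widely used voxelised TCP form (Poisson statistics with linear-quadratic cell kill) is shown below; the abstract does not state the authors' exact TCP model, so this is background rather than their formula.

    ```latex
    % Poisson-LQ TCP over voxels i (background form; conventions vary):
    \begin{equation}
      \mathrm{TCP} \;=\; \prod_{i}\,
        \exp\!\Bigl(-\rho\, v_i\,
          e^{-\alpha D_i \,-\, \beta D_i^{2}/n}\Bigr),
    \end{equation}
    % rho: clonogen density, v_i: voxel volume, D_i: accumulated voxel dose
    % delivered in n fractions, alpha/beta: LQ radiosensitivity parameters.
    % Registration error perturbs the accumulated D_i and hence the TCP.
    ```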

  9. A Monte-Carlo Benchmark of TRIPOLI-4® and MCNP on ITER neutronics

    NASA Astrophysics Data System (ADS)

    Blanchet, David; Pénéliau, Yannick; Eschbach, Romain; Fontaine, Bruno; Cantone, Bruno; Ferlet, Marc; Gauthier, Eric; Guillon, Christophe; Letellier, Laurent; Proust, Maxime; Mota, Fernando; Palermo, Iole; Rios, Luis; Guern, Frédéric Le; Kocan, Martin; Reichle, Roger

    2017-09-01

    Radiation protection and shielding studies are often based on the extensive use of 3D Monte-Carlo neutron and photon transport simulations. The ITER organization hence recommends the use of the MCNP-5 code (version 1.60), in association with the FENDL-2.1 neutron cross section data library, specifically dedicated to fusion applications. The MCNP reference model of the ITER tokamak, the 'C-lite', is being continuously developed and improved. This article proposes to develop an alternative model, equivalent to the 'C-lite', but for the Monte-Carlo code TRIPOLI-4®. A benchmark study is defined to test this new model. Since one of the most critical areas for ITER neutronics analysis concerns the assessment of radiation levels and Shutdown Dose Rates (SDDR) behind the Equatorial Port Plugs (EPP), the benchmark compares the neutron flux through the EPP. This problem is quite challenging with regard to the complex geometry and the important neutron flux attenuation, ranging from 10¹⁴ down to 10⁸ n·cm⁻²·s⁻¹. Such a code-to-code comparison provides independent validation of the Monte-Carlo simulations, improving the confidence in neutronic results.

  10. Bio-inspired benchmark generator for extracellular multi-unit recordings

    PubMed Central

    Mondragón-González, Sirenia Lizbeth; Burguière, Eric

    2017-01-01

    The analysis of multi-unit extracellular recordings of brain activity has led to the development of numerous tools, ranging from signal processing algorithms to electronic devices and applications. Currently, the evaluation and optimisation of these tools are hampered by the lack of ground-truth databases of neural signals. These databases must be parameterisable, easy to generate and bio-inspired, i.e. containing features encountered in real electrophysiological recording sessions. Towards that end, this article introduces an original computational approach to create fully annotated and parameterised benchmark datasets, generated from the summation of three components: neural signals from compartmental models and recorded extracellular spikes, non-stationary slow oscillations, and a variety of different types of artefacts. We present three application examples. (1) We reproduced in-vivo extracellular hippocampal multi-unit recordings from either tetrode or polytrode designs. (2) We simulated recordings in two different experimental conditions: anaesthetised and awake subjects. (3) Last, we also conducted a series of simulations to study the impact of different level of artefacts on extracellular recordings and their influence in the frequency domain. Beyond the results presented here, such a benchmark dataset generator has many applications such as calibration, evaluation and development of both hardware and software architectures. PMID:28233819

  11. How well does your model capture the terrestrial ecosystem dynamics of the Arctic-Boreal Region?

    NASA Astrophysics Data System (ADS)

    Stofferahn, E.; Fisher, J. B.; Hayes, D. J.; Huntzinger, D. N.; Schwalm, C.

    2016-12-01

    The Arctic-Boreal Region (ABR) is a major source of uncertainty in terrestrial biosphere model (TBM) simulations. These uncertainties stem from a lack of observational data from the region, which affects the parameterizations of cold-environment processes in the models. Addressing these uncertainties requires a coordinated effort to collect and integrate the following key indicators of the ABR ecosystem: disturbance, flora/fauna and related ecosystem function, carbon pools and biogeochemistry, permafrost, and hydrology. We are developing a model-data integration framework for NASA's Arctic Boreal Vulnerability Experiment (ABoVE), wherein data collection is driven by matching observations and model outputs to the key ABoVE indicators. The data are used as reference datasets for a benchmarking system that evaluates TBM performance with respect to ABR processes. The benchmarking system uses performance metrics to identify intra-model and inter-model strengths and weaknesses, which in turn provides guidance to model development teams for reducing uncertainties in TBM simulations of the ABR. The system is directly connected to the International Land Model Benchmarking (ILaMB) system, as an ABR-focused application.

  12. Adaptive unified continuum FEM modeling of a 3D FSI benchmark problem.

    PubMed

    Jansson, Johan; Degirmenci, Niyazi Cem; Hoffman, Johan

    2017-09-01

    In this paper, we address a 3D fluid-structure interaction benchmark problem that represents important characteristics of biomedical modeling. We present a goal-oriented adaptive finite element methodology for incompressible fluid-structure interaction based on a streamline diffusion-type stabilization of the balance equations for mass and momentum for the entire continuum in the domain, which is implemented in the Unicorn/FEniCS software framework. A phase marker function and its corresponding transport equation are introduced to select the constitutive law, where the mesh tracks the discontinuous fluid-structure interface. This results in a unified simulation method for fluids and structures. We present detailed results for the benchmark problem compared with experiments, together with a mesh convergence study. Copyright © 2016 John Wiley & Sons, Ltd.

  13. A Level-set based framework for viscous simulation of particle-laden supersonic flows

    NASA Astrophysics Data System (ADS)

    Das, Pratik; Sen, Oishik; Jacobs, Gustaaf; Udaykumar, H. S.

    2017-06-01

    Particle-laden supersonic flows are important in natural and industrial processes such as volcanic eruptions, explosions, and the pneumatic conveyance of particles in material processing. Numerical study of such high-speed particle-laden flows at the mesoscale calls for a numerical framework that allows simulation of supersonic flow around multiple moving solid objects. Only a few efforts have been made toward the development of numerical frameworks for viscous simulation of particle-fluid interaction in the supersonic flow regime. The current work presents a Cartesian-grid-based sharp-interface method for viscous simulation of the interaction between supersonic flow and moving rigid particles. The no-slip boundary condition is imposed at the solid-fluid interfaces using a modified ghost fluid method (GFM). The current method is validated against the similarity solution of the compressible boundary layer over a flat plate and a benchmark numerical solution for steady supersonic flow over a cylinder. Further validation is carried out against benchmark numerical results for shock-induced lift-off of a cylinder in a shock tube. A 3D simulation of steady supersonic flow over a sphere is performed to compare the numerically obtained drag coefficient with experimental results. A particle-resolved viscous simulation of shock interaction with a cloud of particles demonstrates that the current method is suitable for large-scale particle-resolved simulations of particle-laden supersonic flows.

  14. Benchmark of multi-phase method for the computation of fast ion distributions in a tokamak plasma in the presence of low-amplitude resonant MHD activity

    NASA Astrophysics Data System (ADS)

    Bierwage, A.; Todo, Y.

    2017-11-01

    The transport of fast ions in a beam-driven JT-60U tokamak plasma subject to resonant magnetohydrodynamic (MHD) mode activity is simulated using the so-called multi-phase method, where 4 ms intervals of classical Monte-Carlo simulations (without MHD) are interlaced with 1 ms intervals of hybrid simulations (with MHD). The multi-phase simulation results are compared to results obtained with continuous hybrid simulations, which were recently validated against experimental data (Bierwage et al., 2017). It is shown that the multi-phase method, in spite of causing significant overshoots in the MHD fluctuation amplitudes, accurately reproduces the frequencies and positions of the dominant resonant modes, as well as the spatial profile and velocity distribution of the fast ions, while consuming only a fraction of the computation time required by the continuous hybrid simulation. The present paper is limited to low-amplitude fluctuations consisting of a few long-wavelength modes that interact only weakly with each other. The success of this benchmark study paves the way for applying the multi-phase method to the simulation of Abrupt Large-amplitude Events (ALE), which were seen in the same JT-60U experiments but at larger time intervals. Possible implications for the construction of reduced models for fast ion transport are discussed.

  15. Interfacing VPSC with finite element codes. Demonstration of irradiation growth simulation in a cladding tube

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patra, Anirban; Tome, Carlos

    This milestone report shows good progress in interfacing VPSC with the FE codes ABAQUS and MOOSE to perform component-level simulations of irradiation-induced deformation in zirconium alloys. In this preliminary application, we have performed an irradiation growth simulation in the quarter geometry of a cladding tube. We have benchmarked VPSC-ABAQUS and VPSC-MOOSE predictions against VPSC-SA predictions to verify the accuracy of the VPSC-FE interface. Predictions from the FE simulations are in general agreement with VPSC-SA simulations and also with experimental trends.

  16. A Simple Graphical Method for Quantification of Disaster Management Surge Capacity Using Computer Simulation and Process-control Tools.

    PubMed

    Franc, Jeffrey Michael; Ingrassia, Pier Luigi; Verde, Manuela; Colombo, Davide; Della Corte, Francesco

    2015-02-01

    Surge capacity, or the ability to manage an extraordinary volume of patients, is fundamental for hospital management of mass-casualty incidents. However, quantification of surge capacity is difficult and no universal standard for its measurement has emerged, nor has a standardized statistical method been advocated. As mass-casualty incidents are rare, simulation may represent a viable alternative for measuring surge capacity. Hypothesis/Problem: The objective of the current study was to develop a statistical method for the quantification of surge capacity using a combination of computer simulation and simple process-control statistical tools. Length-of-stay (LOS) and patient volume (PV) were used as metrics. The use of this method was then demonstrated on a subsequent computer simulation of an emergency department (ED) response to a mass-casualty incident. In the derivation phase, 357 participants in five countries performed 62 computer simulations of an ED response to a mass-casualty incident. Benchmarks for ED response were derived from these simulations, including LOS and PV metrics for triage, bed assignment, physician assessment, and disposition. In the application phase, 13 students of the European Master in Disaster Medicine (EMDM) program completed the same simulation scenario, and the results were compared to the standards obtained in the derivation phase. Patient-volume metrics included the numbers of patients to be triaged, assigned to rooms, assessed by a physician, and disposed. Length-of-stay metrics included median time to triage, room assignment, physician assessment, and disposition. Simple graphical methods were used to compare the application-phase group to the derived benchmarks using process-control statistical tools. The group in the application phase failed to meet the indicated standard for LOS from admission to disposition decision. This study demonstrates how simulation software can be used to derive values for objective benchmarks of ED surge capacity using PV and LOS metrics. These objective metrics can then be applied to other simulation groups using simple graphical process-control tools to provide a numeric measure of surge capacity. Repeated use in simulations of actual EDs may represent a potential means of objectively quantifying disaster management surge capacity. It is hoped that the described statistical method, which is simple and reusable, will be useful for investigators in this field to apply to their own research.
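
    The graphical process-control comparison can be sketched in a few lines: derive control limits from the benchmark simulations, then check where a test group falls. The distribution, limits, and values below are invented for illustration, not the study's data.

    ```python
    # Process-control-style surge-capacity check (all numbers are synthetic;
    # the study derives its benchmarks from 62 recorded simulations).
    import numpy as np

    rng = np.random.default_rng(7)
    benchmark_los = rng.lognormal(mean=3.0, sigma=0.4, size=62)  # minutes, toy
    lcl, ucl = np.percentile(benchmark_los, [2.5, 97.5])         # limits

    test_median_los = 38.0          # hypothetical median LOS of a test group
    status = "within" if lcl <= test_median_los <= ucl else "outside"
    print(f"limits [{lcl:.1f}, {ucl:.1f}] min -> test group is {status}")
    ```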

  17. Microvascular stress analysis. Part I: simulation of microvascular anastomoses using finite element analysis.

    PubMed

    Al-Sukhun, Jehad; Lindqvist, Christian; Ashammakhi, Nureddin; Penttilä, Heikki

    2007-03-01

    To develop a finite element model (FEM) to study the stress and strain in microvascular anastomoses that result from the geometrical mismatch of the anastomosed vessels. FEMs of end-to-end and end-to-side anastomoses were constructed, and simulations were made using finite element software (NISA). We investigated the angle of inset in the end-to-side anastomosis and the discrepancy between host and recipient vessels in the size of the opening. The FEMs were used to predict principal and shear stress and strain at the position of each node. Two types of vascular deformation were predicted during the different simulations: longitudinal distortion and rotational distortion. Stress values ranged from 151.1 to 282.4 MPa for the maximum principal stress, from -122.9 to -432.2 MPa for the minimum principal stress, and from 122.1 to 333.1 MPa for the maximum shear stress. The highest values were recorded when there was a 50% mismatch in the diameter of the vessels at the site of the end-to-end anastomosis. The effect of the vessels' size discrepancy on blood flow and deformation was remarkable in the end-to-end anastomosis; end-to-side anastomosis was superior to end-to-end anastomosis. FEM is a powerful tool for studying vascular deformation, as it predicts deformation and biomechanical processes at sites where physical measurements are likely to remain impossible in living humans.

  18. Analysis of 2D Torus and Hub Topologies of 100Mb/s Ethernet for the Whitney Commodity Computing Testbed

    NASA Technical Reports Server (NTRS)

    Pedretti, Kevin T.; Fineberg, Samuel A.; Kutler, Paul (Technical Monitor)

    1997-01-01

    A variety of different network technologies and topologies are currently being evaluated as part of the Whitney Project. This paper reports on the implementation and performance of a Fast Ethernet network configured in a 4x4 2D torus topology in a testbed cluster of 'commodity' Pentium Pro PCs. Several benchmarks were used for performance evaluation: an MPI point-to-point message passing benchmark, an MPI collective communication benchmark, and the NAS Parallel Benchmarks version 2.2 (NPB2). Our results show that for point-to-point communication on an unloaded network, the hub and 1-hop routes on the torus have about the same bandwidth and latency. However, the bandwidth decreases and the latency increases on the torus with each additional route hop. Collective communication benchmarks show that the torus provides roughly four times more aggregate bandwidth and eight times faster MPI barrier synchronizations than a hub-based network for 16-processor systems. Finally, the SOAPBOX benchmarks, which simulate real-world CFD applications, generally demonstrated substantially better performance on the torus than on the hub; in the few cases where the hub was faster, the difference was negligible. In total, our experimental results lead to the conclusion that for Fast Ethernet networks, the torus topology has better performance and scales better than a hub-based network.

  19. Production and Testing of the VITAMIN-B7 Fine-Group and BUGLE-B7 Broad-Group Coupled Neutron/Gamma Cross-Section Libraries Derived from ENDF/B-VII.0 Nuclear Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Risner, J. M.; Wiarda, D.; Dunn, M. E.

    2011-09-30

    New coupled neutron-gamma cross-section libraries have been developed for use in light water reactor (LWR) shielding applications, including pressure vessel dosimetry calculations. The libraries, which were generated using Evaluated Nuclear Data File/B Version VII Release 0 (ENDF/B-VII.0), use the same fine-group and broad-group energy structures as the VITAMIN-B6 and BUGLE-96 libraries. The processing methodology used to generate both libraries is based on the methods used to develop VITAMIN-B6 and BUGLE-96 and is consistent with ANSI/ANS 6.1.2. The ENDF data were first processed into the fine-group pseudo-problem-independent VITAMIN-B7 library and then collapsed into the broad-group BUGLE-B7 library. The VITAMIN-B7 library contains data for 391 nuclides. This represents a significant increase compared to the VITAMIN-B6 library, which contained data for 120 nuclides. The BUGLE-B7 library contains data for the same nuclides as BUGLE-96, and maintains the same numeric IDs for those nuclides. The broad-group data include nuclides that are infinitely dilute and group-collapsed using a concrete weighting spectrum, as well as nuclides that are self-shielded and group-collapsed using weighting spectra representative of important regions of LWRs. The verification and validation of the new libraries includes a set of critical benchmark experiments, a set of regression tests that are used to evaluate multigroup cross-section libraries in the SCALE code system, and three pressure vessel dosimetry benchmarks. Results of these tests confirm that the new libraries are appropriate for use in LWR shielding analyses and meet the requirements of Regulatory Guide 1.190.

  20. Reactor Pressure Vessel Fracture Analysis Capabilities in Grizzly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Benjamin; Backman, Marie; Chakraborty, Pritam

    2015-03-01

    Efforts have been underway to develop fracture mechanics capabilities in the Grizzly code to enable it to be used to perform deterministic fracture assessments of degraded reactor pressure vessels (RPVs). Development in prior years resulted in a capability to calculate J-integrals. For this application, these are used to calculate stress intensity factors for cracks, to be used in deterministic linear elastic fracture mechanics (LEFM) assessments of fracture in degraded RPVs. The J-integral can only be used to evaluate stress intensity factors for axis-aligned flaws because it can only provide the stress intensity factor for pure Mode I loading. Off-axis flaws are subjected to mixed-mode loading. For this reason, work has continued to expand the set of fracture mechanics capabilities to permit the evaluation of off-axis flaws. This report documents the following work to enhance Grizzly’s engineering fracture mechanics capabilities for RPVs: • Interaction integral and T-stress: To obtain mixed-mode stress intensity factors, a capability to evaluate interaction integrals for 2D or 3D flaws has been developed. A T-stress evaluation capability has been developed to evaluate the constraint at crack tips in 2D or 3D. Initial verification testing of these capabilities is documented here. • Benchmarking for axis-aligned flaws: Grizzly’s capabilities to evaluate stress intensity factors for axis-aligned flaws have been benchmarked against calculations for the same conditions in FAVOR. • Off-axis flaw demonstration: The newly developed interaction integral capabilities are demonstrated in an application to calculate the mixed-mode stress intensity factors for off-axis flaws. • Other code enhancements: Other enhancements to the thermomechanics capabilities that relate to the solution of the engineering RPV fracture problem are documented here.
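
    For context, the standard LEFM relation behind interaction integrals is reproduced below; normalisation conventions vary between implementations, and this is textbook background rather than Grizzly's documented formulation.

    ```latex
    % Interaction integral between the actual field (1) and an auxiliary
    % field (2):
    \begin{equation}
      I^{(1,2)} \;=\; \frac{2}{E'}\Bigl(K_{I}^{(1)}K_{I}^{(2)}
          + K_{II}^{(1)}K_{II}^{(2)}\Bigr)
          + \frac{1}{\mu}\,K_{III}^{(1)}K_{III}^{(2)},
    \end{equation}
    % with E' = E (plane stress) or E/(1 - nu^2) (plane strain) and mu the
    % shear modulus. Choosing a pure Mode I, II, or III auxiliary field
    % (unit K in one mode, zero in the others) extracts the corresponding
    % mixed-mode K^{(1)} from the computed I^{(1,2)}.
    ```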

  1. WE-G-BRE-04: Gold Nanoparticle Induced Vasculature Damage for Proton Therapy: Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Y; Paganetti, H; Schuemann, J

    2014-06-15

    Purpose: The aim of this work is to investigate gold nanoparticle (GNP) induced vasculature damage in a proton beam, compared with a clinical 6 MV photon beam and two kilovoltage photon beams. Methods: Monte Carlo simulations were carried out using TOPAS (TOol for PArticle Simulation) to obtain the spatial dose distribution in close proximity to GNPs, up to a distance of 20 μm. The spatial dose distribution was used as an input to calculate the additional dose deposited in the blood vessels. GNP-induced vasculature damage was evaluated for three particle sources (a proton beam, an MV photon beam and a kV photon beam), various treatment depths for each source, various GNP uptakes and three vessel diameters (8 μm, 14 μm and 20 μm). Results: For kV photons, GNPs induce more dose in the vessel wall for the 150 kVp source than for the 250 kVp source. For proton therapy, GNPs cause more dose in the vessel wall at shallower treatment depths. For 6 MV photons, GNPs induce more dose in the vessel wall at deeper treatment depths. For the same GNP concentration and prescribed dose, the additional dose at the inner vessel wall is 30% more than the prescribed dose for the kVp photon source, 15% more for the proton source and only 2% more for the 6 MV photon source. In addition, the dose from GNPs decreases more sharply with distance from the inner vessel wall for proton therapy than for kVp photon therapy. Conclusion: We show in this study that GNPs can potentially be used to enhance radiation therapy by causing vasculature damage with clinical proton beams. The GNP-induced damage for proton therapy is less than for the kVp photon source but significantly larger than for the clinical MV photon source.

  2. Preliminary evaluation of the SimPORTAL major vessel injury (MVI) repair model.

    PubMed

    Veneziano, Domenico; Poniatowski, Lauren H; Reihsen, Troy E; Sweet, Robert M

    2016-04-01

    Major vessel injury (MVI) is a dangerous complication of laparoscopic surgery that, if not properly handled, leads to blood loss, conversion to open surgery, and potentially death. In this paper, we describe the preliminary evaluation of the SimPORTAL MVI model, created with the goal of simulating an intra-corporeal injury to a large vessel. For this study, we created MVI models for 17 residents (PGY 1-4). Each resident was asked to perform an intracorporeal knot on a Penrose drain within a maximum time limit of 6 min (in accordance with European basic laparoscopic urological skills rules) and then to repair a vessel injury on the MVI model, which was perfused with synthetic blood, within a maximum blood loss of 3 L. During the vessel repair, low lights and pulse sounds were used to simulate the operating room environment. All participants filled out a survey pre- and post-task to score various aspects of the model. We successfully created a model that simulates a critical surgical event. None of the participants reported previous experience repairing an MVI. Six participants were able to perform the intracorporeal knot, and 12 residents (70.5%) were able to repair the MVI model within the given time and blood loss limits. Eleven participants agreed that the MVI model behaves like a real vessel, and six felt capable of performing the task before attempting it. Sixteen participants thought that the MVI model should be part of laparoscopic curricula during residency. The SimPORTAL MVI model is a feasible low-cost model that would be well received as part of a laparoscopic curriculum for residents. Minor improvements, including pressure measurement in the vessel for task assessment, will be made in the future, and further studies are necessary to definitively validate this model.

  3. Mechanical characterization of atherosclerotic arteries using finite-element modeling: feasibility study on mock arteries.

    PubMed

    Pazos, Valérie; Mongrain, Rosaire; Tardif, Jean-Claude

    2010-06-01

    Clinical studies on lipid-lowering therapy have shown that changing the composition of lipid pools significantly reduces the risk of cardiac events associated with plaque rupture. It has also been shown that changing the composition of the lipid pool affects its mechanical properties. However, knowledge about the mechanical properties of human atherosclerotic lesions remains limited due to the difficulty of the experiments. This paper aims to assess the feasibility of characterizing a lipid pool embedded in the wall of a pressurized vessel using finite-element simulations and an optimization algorithm. Finite-element simulations of inflation experiments were used together with a nonlinear least squares algorithm to estimate the material model parameters of the wall and of the inclusion. An optimal fit between the simulated and real experiments was sought with the parameter estimation algorithm. The method was first tested on a single-layer polyvinyl alcohol (PVA) cryogel stenotic vessel and then applied to a double-layered PVA cryogel stenotic vessel with a lipid inclusion.

  4. Measurement with microscopic MRI and simulation of flow in different aneurysm models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edelhoff, Daniel, E-mail: daniel.edelhoff@tu-dortmund.de; Frank, Frauke; Heil, Marvin

    2015-10-15

    Purpose: The impact and development of aneurysms depend to a significant degree on the exchange of liquid between the regular vessel and the pathological extension. A better understanding of this process will lead to improved prediction capabilities. The aim of the current study was to investigate fluid exchange in aneurysm models of different complexities by combining microscopic magnetic resonance measurements with numerical simulations. In order to evaluate the accuracy and applicability of these methods, the fluid-exchange process between the unaltered vessel lumen and the aneurysm phantoms was analyzed quantitatively at high spatial resolution. Methods: Magnetic resonance flow imaging was used to visualize fluid exchange in two different models produced with a 3D printer, one of which was based on histological findings. The flow distribution in the different models was measured on a microscopic scale using time-of-flight magnetic resonance imaging. The whole experiment was simulated using fast graphics-processing-unit-based numerical simulations. The simulation results were compared qualitatively and quantitatively with the magnetic resonance imaging measurements, taking into account flow and spin–lattice relaxation. Results: The results of both presented methods compared well for the aneurysm models and flow distributions used. The fluid-exchange analysis showed comparable characteristics between measurement and simulation, and similar symmetry behavior was observed. Based on these results, the amount of fluid exchange was calculated. Depending on the geometry of the models, 7% to 45% of the liquid was exchanged per second. Conclusions: The numerical simulations coincide well with the experimentally determined velocity field, and the rate of fluid exchange between vessel and aneurysm was well predicted. Hence, the simulation results could be validated by the experiment. The observed deviations can be attributed to noise in the measurement and the limited resolution of the simulation; the resulting differences are small enough to allow reliable predictions of the flow distribution in vessels with stents and for pulsed blood flow.

  5. Benchmark results for few-body hypernuclei

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruffino, Fabrizio Ferrari; Lonardoni, Diego; Barnea, Nir

    2017-03-16

    Here, the Non-Symmetrized Hyperspherical Harmonics method (NSHH) is introduced in the hypernuclear sector and benchmarked with three different ab-initio methods, namely the Auxiliary Field Diffusion Monte Carlo method, the Faddeev–Yakubovsky approach and the Gaussian Expansion Method. Binding energies and hyperon separation energies of three- to five-body hypernuclei are calculated by employing the two-body ΛN component of the phenomenological Bodmer–Usmani potential, and a hyperon-nucleon interaction simulating the scattering phase shifts given by NSC97f. The range of applicability of the NSHH method is briefly discussed.

  6. Simulation-based validation and arrival-time correction for Patlak analyses of Perfusion-CT scans

    NASA Astrophysics Data System (ADS)

    Bredno, Jörg; Hom, Jason; Schneider, Thomas; Wintermark, Max

    2009-02-01

    Blood-brain-barrier (BBB) breakdown is a hypothesized mechanism for hemorrhagic transformation in acute stroke. The Patlak analysis of a Perfusion Computed Tomography (PCT) scan measures the BBB permeability, but the method yields higher estimates when applied to the first pass of the contrast bolus compared to a delayed phase. We present a numerical phantom that simulates vascular and parenchymal time-attenuation curves to determine the validity of permeability measurements obtained with different acquisition protocols. A network of tubes represents the major cerebral arteries ipsi- and contralateral to an ischemic event. These tubes branch off into smaller segments that represent capillary beds. Blood flow in the phantom is freely defined and simulated as non-Newtonian tubular flow. Diffusion of contrast in the vessels and permeation through vessel walls is part of the simulation. The phantom allows us to compare the results of a permeability measurement to the simulated vessel wall status. A Patlak analysis reliably detects areas with BBB breakdown for acquisitions of 240s duration, whereas results obtained from the first pass are biased in areas of reduced blood flow. Compensating for differences in contrast arrival times reduces this bias and gives good estimates of BBB permeability for PCT acquisitions of 90-150s duration.
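    As a side note, once the vascular and tissue time-attenuation curves are known, a Patlak analysis reduces to a linear regression whose slope estimates permeability and whose intercept estimates the blood volume fraction; arrival-time correction amounts to shifting the curves before this regression. A toy sketch of that regression (our own, with assumed curves and parameters):

    ```python
    import numpy as np

    t = np.arange(0, 241.0)                            # s, 1 s sampling (assumed)
    aif = np.exp(-((t - 30.0) / 12.0) ** 2) + 0.05 * np.exp(-t / 180.0)  # toy AIF
    K_true, v0_true = 0.002, 0.08                      # toy ground truth
    tissue = v0_true * aif + K_true * np.cumsum(aif)   # Patlak model, dt = 1 s

    mask = aif > 0.02                                  # avoid dividing by ~0
    x = np.cumsum(aif)[mask] / aif[mask]               # Patlak abscissa
    y = tissue[mask] / aif[mask]                       # Patlak ordinate
    K_est, v0_est = np.polyfit(x, y, 1)                # slope = permeability
    print(f"estimated K = {K_est:.4f} 1/s, v0 = {v0_est:.3f}")
    ```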

  7. Quantitative phenomenological model of the BOLD contrast mechanism

    NASA Astrophysics Data System (ADS)

    Dickson, John D.; Ash, Tom W. J.; Williams, Guy B.; Sukstanskii, Alexander L.; Ansorge, Richard E.; Yablonskiy, Dmitriy A.

    2011-09-01

    Different theoretical models of the BOLD contrast mechanism are used for many applications including BOLD quantification (qBOLD) and vessel size imaging, both in health and disease. Each model simplifies the system under consideration, making approximations about the structure of the blood vessel network and diffusion of water molecules through inhomogeneities in the magnetic field created by deoxyhemoglobin-containing blood vessels. In this study, Monte-Carlo methods are used to simulate the BOLD MR signal generated by diffusing water molecules in the presence of long, cylindrical blood vessels. Using these simulations we introduce a new, phenomenological model that is far more accurate over a range of blood oxygenation levels and blood vessel radii than existing models. This model could be used to extract physiological parameters of the blood vessel network from experimental data in BOLD-based experiments. We use our model to establish ranges of validity for the existing analytical models of Yablonskiy and Haacke, Kiselev and Posse, Sukstanskii and Yablonskiy (extended to the case of arbitrary time in the spin echo sequence) and Bauer et al. (extended to the case of randomly oriented cylinders). Although these models are shown to be accurate in the limits of diffusion under which they were derived, none of them is accurate for the whole physiological range of blood vessels radii and blood oxygenation levels. We also show the extent of systematic errors that are introduced due to the approximations of these models when used for BOLD signal quantification.
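    For intuition, here is a heavily simplified Monte Carlo sketch in the spirit of such simulations (our own construction, not the paper's code; a real qBOLD simulation tracks many vessels, orientations, and sequence timings): spins random-walk around a single cylinder perpendicular to B0 and accumulate phase from the (R/r)^2·cos(2φ) extravascular frequency pattern.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, steps, dt = 20000, 400, 2e-4           # spins, time steps, s
    D, R, dw0 = 1e-9, 5e-6, 200.0             # m^2/s, vessel radius (m), rad/s
    sigma = np.sqrt(2 * D * dt)               # RMS step per axis

    pos = rng.uniform(-10 * R, 10 * R, size=(n, 2))  # plane perpendicular to vessel
    phase = np.zeros(n)
    for _ in range(steps):
        pos += rng.normal(0.0, sigma, size=(n, 2))
        r2 = np.maximum(np.sum(pos ** 2, axis=1), R ** 2)  # clamp spins inside
        cos2phi = (pos[:, 0] ** 2 - pos[:, 1] ** 2) / r2
        phase += dw0 * (R ** 2 / r2) * cos2phi * dt        # extravascular offset
    signal = np.abs(np.mean(np.exp(1j * phase)))
    print(f"free-induction signal after {steps * dt * 1e3:.0f} ms: {signal:.3f}")
    ```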

  8. Using SPARK as a Solver for Modelica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael; Wetter, Michael; Haves, Philip

    Modelica is an object-oriented acausal modeling language that is well positioned to become a de-facto standard for expressing models of complex physical systems. To simulate a model expressed in Modelica, it needs to be translated into executable code. For generating run-time efficient code, such a translation needs to employ algebraic formula manipulations. As the SPARK solver has been shown to be competitive for generating such code but currently cannot be used with the Modelica language, we report in this paper how SPARK's symbolic and numerical algorithms can be implemented in OpenModelica, an open-source implementation of a Modelica modeling and simulation environment. We also report benchmark results that show that for our air flow network simulation benchmark, the SPARK solver is competitive with Dymola, which is believed to provide the best solver for Modelica.

  9. Engine dynamic analysis with general nonlinear finite element codes. II - Bearing element implementation, overall numerical characteristics and benchmarking

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Adams, M.; Lam, P.; Fertis, D.; Zeid, I.

    1982-01-01

    Second-year efforts within a three-year study to develop and extend finite element (FE) methodology to efficiently handle the transient/steady state response of rotor-bearing-stator structure associated with gas turbine engines are outlined. The two main areas aim at (1) implementing the squeeze film damper element in a general-purpose FE code for testing and evaluation and (2) determining the numerical characteristics of the FE-generated rotor-bearing-stator simulation scheme. The governing FE field equations are set out and the solution methodology is presented. The choice of ADINA as the general-purpose FE code is explained, and the numerical operational characteristics of the direct integration approach to FE-generated rotor-bearing-stator simulations are determined, including benchmarking, comparison of explicit vs. implicit methodologies of direct integration, and demonstration problems.

  10. The development of a virtual reality training curriculum for colonoscopy.

    PubMed

    Sugden, Colin; Aggarwal, Rajesh; Banerjee, Amrita; Haycock, Adam; Thomas-Gibson, Siwan; Williams, Christopher B; Darzi, Ara

    2012-07-01

    The aim of this work was the development of a structured virtual reality (VR) training curriculum for colonoscopy using high-fidelity simulation. Colonoscopy requires detailed knowledge and technical skill. Changes to working practices in recent times have reduced the availability of traditional training opportunities. Much might, therefore, be achieved by applying novel technologies such as VR simulation to colonoscopy. Scientifically developed, device-specific curricula aim to maximize the yield of laboratory-based training by focusing on validated modules and linking progression to the attainment of benchmarked proficiency criteria. Fifty participants were recruited: 30 novices (<10 colonoscopies), 10 intermediates (100 to 500 colonoscopies), and 10 experienced colonoscopists (>500 colonoscopies). Surrogates of proficiency, such as the number of procedures undertaken, determined prospective allocation to 1 of 3 groups (novice, intermediate, and experienced). Construct validity and learning value (comparison between groups and within groups, respectively) for each task and metric on the chosen simulator model determined suitability for inclusion in the curriculum. Eight tasks in possession of construct validity and significant learning curves were included in the curriculum: 3 abstract tasks, 4 part-procedural tasks, and 1 procedural task. The whole-procedure task was valid for 11 metrics, including "time taken to complete the task" (1238, 343, and 293 s; P < 0.001) and "insertion length with embedded tip" (23.8, 3.6, and 4.9 cm; P = 0.005). Learning curves consistently plateaued at or beyond the ninth attempt. Valid metrics were used to define benchmarks, derived from the performance of the experienced cohort, for each included task. A comprehensive, stratified, benchmarked, whole-procedure curriculum has been developed for a modern high-fidelity VR colonoscopy simulator.

  11. Erythroid cell growth and differentiation in vitro in the simulated microgravity environment of the NASA rotating wall vessel bioreactor

    NASA Technical Reports Server (NTRS)

    Sytkowski, A. J.; Davis, K. L.

    2001-01-01

    Prolonged exposure of humans and experimental animals to the altered gravitational conditions of space flight has adverse effects on the lymphoid and erythroid hematopoietic systems. Although some information is available regarding the cellular and molecular changes in lymphocytes exposed to microgravity, little is known about the erythroid cellular changes that may underlie the reduction in erythropoiesis and resultant anemia. We now report a reduction in erythroid growth and a profound inhibition of erythropoietin (Epo)-induced differentiation in a ground-based simulated microgravity model system. Rauscher murine erythroleukemia cells were grown either in tissue culture vessels at 1 x g or in the simulated microgravity environment of the NASA-designed rotating wall vessel (RWV) bioreactor. Logarithmic growth was observed under both conditions; however, the doubling time in simulated microgravity was only one-half of that seen at 1 x g. No difference in apoptosis was detected. Induction with Epo at the initiation of the culture resulted in differentiation of approximately 25% of the cells at 1 x g, consistent with our previous observations. In contrast, induction with Epo at the initiation of simulated microgravity resulted in only one-half of this degree of differentiation. Significantly, the growth of cells in simulated microgravity for 24 h prior to Epo induction inhibited the differentiation almost completely. The results suggest that the NASA RWV bioreactor may serve as a suitable ground-based microgravity simulator to model the cellular and molecular changes in erythroid cells observed in true microgravity.

  12. Raman Monte Carlo simulation for light propagation for tissue with embedded objects

    NASA Astrophysics Data System (ADS)

    Periyasamy, Vijitha; Jaafar, Humaira Bte; Pramanik, Manojit

    2018-02-01

    Monte Carlo (MC) simulation is one of the most prominent simulation techniques and is rapidly becoming the model of choice for studying light-tissue interaction. Monte Carlo simulation for light transport in multi-layered tissue (MCML) is adapted here to different geometries by integrating embedded objects of various shapes (sphere, cylinder, cuboid, and ellipsoid) into the multi-layered structure. These geometries are useful for representing realistic tissue structures such as lymph nodes, tumors, blood vessels, and the head. MC simulations were performed on the various geometric media. The simulation of MCML with embedded objects (MCML-EO) was extended to include Raman scattering during photon propagation in the defined medium, and the location of each Raman photon's generation is recorded. Simulations were run on a modelled breast tissue with a tumor (spherical and ellipsoidal) and blood vessels (cylindrical). Results are presented as both A-line and B-line scans of the embedded objects to determine the spatial locations where Raman photons were generated. Studies were performed for different Raman probabilities.
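    A toy sketch of the Raman-enabled random walk described above (our own construction, not MCML-EO itself; all values are assumptions): photons step through a homogeneous scattering medium, and at each scattering event a photon converts to a Raman photon with a small probability, with the conversion site recorded.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    mu_s, mu_a, p_raman = 10.0, 0.1, 1e-3  # 1/cm scattering, absorption, Raman prob.
    raman_sites = []

    for _ in range(2000):                   # launched photons
        pos = np.zeros(3)
        direction = np.array([0.0, 0.0, 1.0])
        weight = 1.0
        while weight > 1e-2:                # crude termination criterion
            step = -np.log(rng.random()) / (mu_s + mu_a)  # free path, cm
            pos = pos + step * direction
            weight *= mu_s / (mu_s + mu_a)  # absorption reduces photon weight
            if rng.random() < p_raman:      # Raman conversion at this event
                raman_sites.append(pos.copy())
                break
            v = rng.normal(size=3)          # isotropic rescatter (real codes use
            direction = v / np.linalg.norm(v)  # Henyey-Greenstein phase functions)

    print(f"{len(raman_sites)} Raman photons generated out of 2000 launched")
    ```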

  13. Stochastic simulation of human pulmonary blood flow and transit time frequency distribution based on anatomic and elasticity data.

    PubMed

    Huang, Wei; Shi, Jun; Yen, R T

    2012-12-01

    The objective of our study was to develop a program for computing the transit time frequency distributions of red blood cells in the human pulmonary circulation, based on our anatomic and elasticity data for blood vessels in the human lung. A stochastic simulation model was introduced to simulate blood flow in the human pulmonary circulation: the connectivity data of the pulmonary blood vessels were converted into a probability matrix. Based on this model, the transit time of red blood cells in the pulmonary circulation and the output blood pressure were studied. The stochastic simulation model can also be used to predict changes of blood flow in the human pulmonary circulation, with the advantages of lower computing cost and higher flexibility. In conclusion, a stochastic simulation approach was introduced to simulate blood flow in the hierarchical structure of the pulmonary circulation and to calculate transit time distributions and blood pressure outputs.
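    The core idea can be illustrated with a minimal Markov-chain sketch (our own construction; the real model uses the measured connectivity of the human pulmonary vasculature rather than this five-compartment toy): a red cell walks from the arterial inlet to the venous outlet according to a transition matrix, accumulating dwell time along the way.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    # Toy 5-compartment chain: artery -> arteriole -> capillary -> venule -> vein
    P = np.array([[0.0, 1.0, 0.0, 0.0, 0.0],
                  [0.1, 0.0, 0.9, 0.0, 0.0],   # small chance of backflow
                  [0.0, 0.1, 0.0, 0.9, 0.0],
                  [0.0, 0.0, 0.1, 0.0, 0.9],
                  [0.0, 0.0, 0.0, 0.0, 1.0]])  # vein = absorbing outlet
    dwell = np.array([0.2, 0.3, 1.0, 0.3, 0.0])  # s per compartment, assumed

    def transit_time():
        state, t = 0, 0.0
        while state != 4:
            t += dwell[state]
            state = rng.choice(5, p=P[state])
        return t

    times = [transit_time() for _ in range(10000)]
    print(f"mean transit {np.mean(times):.2f} s, "
          f"95th pct {np.percentile(times, 95):.2f} s")
    ```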

  14. 3D printing of microtube in solid phantom to simulate tissue oxygenation and perfusion (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Lv, Xiang; Xue, Yue; Wang, Haili; Shen, Shu Wei; Zhou, Ximing; Liu, Guangli; Dong, Erbao; Xu, Ronald X.

    2017-03-01

    Tissue-simulating phantoms with an interior vascular network may facilitate traceable calibration and quantitative validation of many medical optical devices. However, a solid phantom that reliably simulates tissue oxygenation and blood perfusion is still not available. This paper presents a new method to fabricate hollow microtubes for blood vessel simulation in solid phantoms. The fabrication process combines an ultraviolet (UV) rapid prototyping technique with the fluid mechanics of a coaxial jet flow. Polydimethylsiloxane (PDMS) and a UV-curable polymer are mixed at a designated ratio and extruded through a coaxial needle device to produce a coaxial jet flow. The extruded jet flow is quickly photo-polymerized by UV light to form vessel-simulating solid structures at sizes ranging from 700 μm to 1000 μm. Microtube structures with adequate mechanical properties can be fabricated by adjusting material compositions and illumination intensity. Curved, straight, and stretched microtubes can be formed by adjusting the extrusion speed of the materials and the speed of the 3D printing platform. To simulate vascular structures in biologic tissue, we embed vessel-simulating microtubes in a gel wax phantom of 10 cm × 10 cm × 5 cm at depths of 1 to 2 mm. Blood at different oxygenation and hemoglobin concentration levels is circulated through the microtubes at different flow rates in order to simulate different oxygenation and perfusion conditions. The simulated physiologic parameters are detected by a tissue oximeter and a laser speckle blood flow meter, respectively, and compared with the actual values. Our experiments demonstrate that the proposed 3D printing process is able to produce solid phantoms with simulated vascular networks for potential applications in medical device calibration and drug delivery studies.

  15. 76 FR 52569 - Regulated Navigation Area; Arthur Kill, NY and NJ

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-23

    ... simulator assessments of the drilling and blasting areas. These simulations studied the possibility of... all times for vessel transits. The results of the simulation allowed the USACE to determine that... 2011 to discuss the results of the navigation simulations. In June the USACE and the contractor...

  16. Simply actuated closure for a pressure vessel - Design for use to trap deep-sea animals

    NASA Technical Reports Server (NTRS)

    Yayanos, A. A.

    1977-01-01

    A pressure vessel is described that can be closed by a single translational motion within 1 sec. The vessel is a key component of a trap for small marine animals and operates automatically on the sea floor. As the vessel descends to the sea floor, it is subjected both internally and externally to the high pressures of the deep sea. The mechanism for closing the pressure vessel on the sea floor is activated by the timed release of the ballast which was used to sink the trap. As it rises to the sea surface, the internal pressure of the vessel remains near the value present on the sea floor. The pressure vessel has been used in simulated ocean deployments and in the deep ocean (9500 m) with a 75%-85% retention of the deep-sea pressure. Nearly 100% retention of pressure can be achieved by using an accumulator filled with a gas.

  17. PEP Integrated Test D Run Report Caustic and Oxidative Leaching in UFP-VSL-T02A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevigny, Gary J.; Bredt, Ofelia P.; Burns, Carolyn A.

    2009-12-11

    Pacific Northwest National Laboratory (PNNL) has been tasked by Bechtel National Inc. (BNI) on the River Protection Project-Hanford Tank Waste Treatment and Immobilization Plant (RPP-WTP) project to perform research and development activities to resolve technical issues identified for the Pretreatment Facility (PTF). The Pretreatment Engineering Platform (PEP) was designed, constructed, and operated as part of a plan to respond to issue M12, "Undemonstrated Leaching Processes," of the External Flowsheet Review Team (EFRT) issue response plan. The PEP is a 1/4.5-scale test platform designed to simulate the WTP pretreatment caustic leaching, oxidative leaching, ultrafiltration solids concentration, and slurry washing processes. The PEP replicates the WTP leaching processes using prototypic equipment and control strategies, and it also includes non-prototypic ancillary equipment to support the core processing. Two operating scenarios are currently being evaluated for the ultrafiltration process (UFP) and leaching operations. The first scenario (Tests B and D) has caustic leaching performed in the UFP-2 ultrafiltration feed vessels (i.e., vessel UFP-VSL-T02A in the PEP and vessels UFP-VSL-00002A and B in the WTP PTF). The second scenario (Test A) has caustic leaching conducted in the UFP-1 ultrafiltration feed preparation vessels (i.e., vessels UFP-VSL-T01A and B in the PEP and vessels UFP-VSL-00001A and B in the WTP PTF). In Test D, 19 M sodium hydroxide (NaOH, caustic) was added to the waste slurry in the UFP-VSL-T02A vessel after the solids were concentrated to ~20% undissolved solids. The NaOH was added to leach solid aluminum compounds (e.g., gibbsite, boehmite). Caustic addition was followed by heating to 85°C using direct injection of steam to accelerate the leach process. The main difference between Test D and Test B is that the leach temperature was 85°C for 24 hours rather than 100°C for 12 hours; in addition, the Test D simulant contained Cr from the start of processing, whereas in Test B Cr was added to adjust the simulant composition after aluminum leaching. Following the caustic leach, the UFP-VSL-T02A vessel contents were cooled using the vessel cooling jacket. The slurry was then concentrated to 17 wt% undissolved solids and washed with inhibited water to remove NaOH and other soluble salts. Next, the slurry was oxidatively leached using sodium permanganate to solubilize chromium, then washed again to remove the dissolved chromium and concentrated.

  18. Standard High Solids Vessel Design De-inventory Simulant Qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiskum, Sandra K.; Burns, Carolyn A.M.; Gauglitz, Phillip A.

    The Hanford Tank Waste Treatment and Immobilization Plant (WTP) is working to develop a Standard High Solids Vessel Design (SHSVD) process vessel. To support testing of this new design, WTP engineering staff requested that a Newtonian simulant be developed that would represent the de-inventory (residual high-density tank solids cleanout) process. Its basis and target characteristics are defined in 24590-WTP-ES-ENG-16-021 and implemented through PNNL Test Plan TP-WTPSP-132 Rev. 1.0. This document describes the de-inventory Newtonian carrier fluid (DNCF) simulant composition that will satisfy the basis requirement to mimic the density (1.18 g/mL ± 0.1 g/mL) and viscosity (2.8 cP ± 0.5 cP) of 5 M NaOH at 25 °C. The simulant viscosity changes significantly with temperature. Therefore, various solution compositions may be required, dependent on the test stand process temperature range, to meet these requirements. Table ES.1 provides DNCF compositions at selected temperatures that will meet the density and viscosity specifications as well as the temperature range at which the solution will meet the acceptable viscosity tolerance.

  19. Assessment of Polarimetric SAR Interferometry for Improving Ship Classification based on Simulated Data

    PubMed Central

    Margarit, Gerard; Mallorqui, Jordi J.

    2008-01-01

    This paper uses a complete and realistic SAR simulation processing chain, GRECOSAR, to study the potential of Polarimetric SAR Interferometry (POLInSAR) for developing new ship classification methods. Its high processing efficiency and scenario flexibility have allowed exhaustive scattering studies to be carried out. The results have revealed, first, that vessels' geometries can be described by specific combinations of Permanent Polarimetric Scatterers (PePS) and, second, that each type of vessel can be characterized by a particular spatial and polarimetric distribution of PePS. These properties have recently been exploited to propose a new Vessel Classification Algorithm (VCA) working with POLInSAR data, which, according to several simulation tests, may provide promising performance in real scenarios. The paper explains the main steps of the research activity carried out with ships and GRECOSAR and gives examples of the main results and VCA validation tests. Special attention is devoted to the new improvements achieved, which relate to simulations using a new and highly realistic sea-surface model. The paper shows that, for POLInSAR data with fine resolution, VCA can help to classify ships with notable robustness under diverse and adverse observation conditions. PMID:27873954

  20. New methods to benchmark simulations of accreting black holes systems against observations

    NASA Astrophysics Data System (ADS)

    Markoff, Sera; Chatterjee, Koushik; Liska, Matthew; Tchekhovskoy, Alexander; Hesp, Casper; Ceccobello, Chiara; Russell, Thomas

    2017-08-01

    The field of black hole accretion has been significantly advanced by the use of complex ideal general relativistic magnetohydrodynamics (GRMHD) codes, now capable of simulating scales from the event horizon out to ~10^5 gravitational radii at high resolution. The challenge remains how to test these simulations against data, because the self-consistent treatment of radiation is still in its early days and is complicated by dependence on non-ideal/microphysical processes not yet included in the codes. On the other extreme, a variety of phenomenological models (disk, corona, jet, wind) can describe spectra or variability signatures well in a particular waveband, although often not both. To bring these two methodologies together, we need robust observational “benchmarks” that can be identified and studied in simulations. I will focus on one example of such a benchmark, from recent observational campaigns on black holes across the mass scale: the jet break. I will describe new work attempting to understand what drives this feature by searching for regions that share similar trends in terms of dependence on accretion power or magnetisation. Such methods can allow early tests of simulation assumptions and help pinpoint which regions will dominate the light production, well before full radiative processes are incorporated, and will help guide the interpretation of, e.g., Event Horizon Telescope data.

  1. Genomic prediction in animals and plants: simulation of data, validation, reporting, and benchmarking.

    PubMed

    Daetwyler, Hans D; Calus, Mario P L; Pong-Wong, Ricardo; de Los Campos, Gustavo; Hickey, John M

    2013-02-01

    The genomic prediction of phenotypes and breeding values in animals and plants has developed rapidly into its own research field. Results of genomic prediction studies are often difficult to compare because data simulation varies, real or simulated data are not fully described, and not all relevant results are reported. In addition, some new methods have been compared only in limited genetic architectures, leading to potentially misleading conclusions. In this article we review simulation procedures, discuss validation and reporting of results, and apply benchmark procedures for a variety of genomic prediction methods in simulated and real example data. Plant and animal breeding programs are being transformed by the use of genomic data, which are becoming widely available and cost-effective to predict genetic merit. A large number of genomic prediction studies have been published using both simulated and real data. The relative novelty of this area of research has made the development of scientific conventions difficult with regard to description of the real data, simulation of genomes, validation and reporting of results, and forward in time methods. In this review article we discuss the generation of simulated genotype and phenotype data, using approaches such as the coalescent and forward in time simulation. We outline ways to validate simulated data and genomic prediction results, including cross-validation. The accuracy and bias of genomic prediction are highlighted as performance indicators that should be reported. We suggest that a measure of relatedness between the reference and validation individuals be reported, as its impact on the accuracy of genomic prediction is substantial. A large number of methods were compared in example simulated and real (pine and wheat) data sets, all of which are publicly available. In our limited simulations, most methods performed similarly in traits with a large number of quantitative trait loci (QTL), whereas in traits with fewer QTL variable selection did have some advantages. In the real data sets examined here all methods had very similar accuracies. We conclude that no single method can serve as a benchmark for genomic prediction. We recommend comparing accuracy and bias of new methods to results from genomic best linear prediction and a variable selection approach (e.g., BayesB), because, together, these methods are appropriate for a range of genetic architectures. An accompanying article in this issue provides a comprehensive review of genomic prediction methods and discusses a selection of topics related to application of genomic prediction in plants and animals.
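    A hedged sketch of the validation loop the review recommends: k-fold cross-validation reporting accuracy (the correlation between predicted and observed values) and bias (the regression slope of observed on predicted). Ridge regression stands in here for genomic best linear unbiased prediction, and the data are simulated; none of this is the authors' code.

    ```python
    import numpy as np
    from numpy.linalg import solve

    rng = np.random.default_rng(3)
    n, m, n_qtl = 500, 2000, 50
    X = rng.binomial(2, 0.3, size=(n, m)).astype(float)  # SNP genotypes 0/1/2
    beta = np.zeros(m)
    beta[rng.choice(m, n_qtl, False)] = rng.normal(size=n_qtl)
    y = X @ beta + rng.normal(scale=np.std(X @ beta), size=n)  # h2 ~ 0.5

    lam, k = 1000.0, 5
    folds = np.array_split(rng.permutation(n), k)
    acc, bias = [], []
    for test in folds:
        train = np.setdiff1d(np.arange(n), test)
        Xt, yt = X[train], y[train] - y[train].mean()
        b = solve(Xt.T @ Xt + lam * np.eye(m), Xt.T @ yt)  # ridge/GBLUP-like
        pred = X[test] @ b
        acc.append(np.corrcoef(pred, y[test])[0, 1])       # accuracy
        bias.append(np.polyfit(pred, y[test], 1)[0])       # slope, 1 = unbiased
    print(f"accuracy {np.mean(acc):.2f}, bias (slope) {np.mean(bias):.2f}")
    ```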

  2. Quantitative analysis of artifacts in 4D DSA: the relative contributions of beam hardening and scatter to vessel dropout behind highly attenuating structures

    NASA Astrophysics Data System (ADS)

    Hermus, James; Szczykutowicz, Timothy P.; Strother, Charles M.; Mistretta, Charles

    2014-03-01

    When performing Computed Tomographic (CT) image reconstruction on digital subtraction angiography (DSA) projections, loss of vessel contrast has been observed behind highly attenuating anatomy, such as dental implants and large contrast-filled aneurysms. Because this typically occurs only in a limited range of projection angles, the observed contrast time course can potentially be altered. In this work, we have developed a model for acquiring DSA projections that captures both the polychromatic nature of the x-ray spectrum and the x-ray scattering interactions to investigate this problem. In our simulation framework, scatter and beam hardening contributions to vessel dropout can be analyzed separately. We constructed digital phantoms with large, clearly defined regions containing iodine contrast, bone, soft tissue, titanium (dental implants), or combinations of these materials. As the regions containing the materials were large and rectangular, when the phantoms were forward projected, the projections contained uniform regions of interest (ROI) and enabled accurate vessel dropout analysis. Two phantom models were used, one to model the case of a vessel behind a large contrast-filled aneurysm and the other to model a vessel behind a dental implant. Cases in which both beam hardening and scatter were turned off, only scatter was turned on, only beam hardening was turned on, and both scatter and beam hardening were turned on were simulated for both phantom models. The analysis of these data showed that the contrast degradation is primarily due to scatter. In the aneurysm case, 90.25% of the vessel contrast was lost in the polychromatic scatter image, whereas only 50.5% of the vessel contrast was lost in the beam-hardening-only image. In the teeth case, 44.2% of the vessel contrast was lost in the polychromatic scatter image and only 26.2% of the vessel contrast was lost in the beam-hardening-only image.
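    A toy illustration of the beam-hardening half of this effect (our own sketch, not the authors' simulation framework; the spectrum and attenuation values are assumptions): with a polychromatic beam, the effective attenuation coefficient of bone falls with thickness, so signal behind dense material is misestimated.

    ```python
    import numpy as np

    E = np.array([40.0, 60.0, 80.0, 100.0])       # keV bins, assumed
    spectrum = np.array([0.2, 0.4, 0.3, 0.1])     # relative fluence, assumed
    mu_bone = np.array([0.67, 0.45, 0.36, 0.31])  # 1/cm, roughly energy-dependent

    def measured_attenuation(thickness_cm):
        """Effective -ln(I/I0) for a polychromatic beam through bone."""
        I = np.sum(spectrum * np.exp(-mu_bone * thickness_cm))
        return -np.log(I)

    for t in [1.0, 3.0, 5.0]:
        eff_mu = measured_attenuation(t) / t      # falls with thickness = hardening
        print(f"{t:.0f} cm bone: effective mu = {eff_mu:.3f} 1/cm")
    ```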

  3. Computational Investigation of In-Flight Temperature in Shaped Charge Jets and Explosively Formed Penetrators

    NASA Astrophysics Data System (ADS)

    Sable, Peter; Helminiak, Nathaniel; Harstad, Eric; Gullerud, Arne; Hollenshead, Jeromy; Hertel, Eugene; Sandia National Laboratories Collaboration; Marquette University Collaboration

    2017-06-01

    With the increasing use of hydrocodes in modeling and system design, experimental benchmarking of software has never been more important. While this has been a large area of focus since the inception of computational design, comparisons with temperature data are sparse due to experimental limitations. A novel temperature measurement technique, magnetic diffusion analysis, has enabled the acquisition of in-flight temperature measurements of hypervelocity projectiles. Using this, an AC-14 bare shaped charge and an LX-14 EFP, both with copper linings, were simulated using CTH to benchmark temperature against experimental results. Particular attention was given to the slug temperature profiles after separation and to the effect of varying the equation-of-state and strength models. Simulations agree with experiment, attaining better than 2% error relative to observed shaped-charge temperatures; this varied notably depending on the strength model used. Similar observations were made when simulating the EFP case, with a minimum 4% deviation. Jet structures compare well with radiographic images and are consistent with ALEGRA simulations previously conducted. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  4. Measurements and FLUKA simulations of bismuth and aluminium activation at the CERN Shielding Benchmark Facility (CSBF)

    NASA Astrophysics Data System (ADS)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.

    2018-03-01

    The CERN High Energy AcceleRator Mixed field facility (CHARM) is located in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, 5 × 10^11 protons per pulse, a pulse length of 350 ms, and a maximum average beam intensity of 6.7 × 10^10 p/s; the beam then impacts the CHARM target. The shielding of the CHARM facility also includes the CERN Shielding Benchmark Facility (CSBF), situated laterally above the target. This facility consists of 80 cm of cast iron and 360 cm of concrete, with barite concrete in some places. Activation samples of bismuth and aluminium were placed in the CSBF and in the CHARM access corridor in July 2015. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields for these samples. The results estimated by FLUKA Monte Carlo simulations are compared to activation measurements of these samples. The comparison between FLUKA simulations and the measured values from γ-spectrometry shows agreement to better than a factor of 2.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold H. Kritz

    PTRANSP, which is the predictive version of the TRANSP code, was developed in a collaborative effort involving the Princeton Plasma Physics Laboratory, General Atomics Corporation, Lawrence Livermore National Laboratory, and Lehigh University. The PTRANSP/TRANSP suite of codes is the premier integrated tokamak modeling software in the United States. A production service for PTRANSP/TRANSP simulations is maintained at the Princeton Plasma Physics Laboratory; the server has a simple command line client interface and is subscribed to by about 100 researchers from tokamak projects in the US, Europe, and Asia. This service produced nearly 13,000 PTRANSP/TRANSP simulations in the four-year period FY 2005 through FY 2008. Major archives of TRANSP results are maintained at PPPL, MIT, General Atomics, and JET. Recent utilization, counting experimental analysis simulations as well as predictive simulations, more than doubled from slightly over 2000 simulations per year in FY 2005 and FY 2006 to over 4300 simulations per year in FY 2007 and FY 2008. PTRANSP predictive simulations applied to ITER increased eight fold from 30 simulations per year in FY 2005 and FY 2006 to 240 simulations per year in FY 2007 and FY 2008, accounting for more than half of combined PTRANSP/TRANSP service CPU resource utilization in FY 2008. PTRANSP studies focused on ITER played a key role in journal articles. Examples of validation studies carried out for momentum transport in PTRANSP simulations were presented at the 2008 IAEA conference. The increase in the number of PTRANSP simulations has continued (more than 7000 TRANSP/PTRANSP simulations in 2010), and results of PTRANSP simulations appear in conference proceedings, for example the 2010 IAEA conference, and in peer reviewed papers. PTRANSP provides a bridge to the Fusion Simulation Program (FSP) and to the future of integrated modeling. Through years of widespread usage, each of the many parts of the PTRANSP suite of codes has been thoroughly validated against experimental data and benchmarked against other codes. At the same time, architectural modernizations are improving the modularity of the PTRANSP code base. The NUBEAM neutral beam and fusion products fast ion model, the Plasma State data repository (developed originally in the SWIM SciDAC project and adapted for use in PTRANSP), and other components are already shared with the SWIM, FACETS, and CPES SciDAC FSP prototype projects. Thus, the PTRANSP code is already serving as a bridge between our present integrated modeling capability and future capability. As the Fusion Simulation Program builds toward the facility currently available in the PTRANSP suite of codes, early versions of the FSP core plasma model will need to be benchmarked against the PTRANSP simulations. This will be necessary to build user confidence in FSP, but this benchmarking can only be done if PTRANSP itself is maintained and developed.

  6. An End-to-End simulator for the development of atmospheric corrections and temperature-emissivity separation algorithms in the TIR spectral domain

    NASA Astrophysics Data System (ADS)

    Rock, Gilles; Fischer, Kim; Schlerf, Martin; Gerhards, Max; Udelhoven, Thomas

    2017-04-01

    The development and optimization of image processing algorithms requires the availability of datasets depicting every step from the earth's surface to the sensor's detector. The lack of ground-truth data makes it necessary to develop algorithms on simulated data. The simulation of hyperspectral remote sensing data is a useful tool for a variety of tasks such as the design of systems, the understanding of the image formation process, and the development and validation of data processing algorithms. An end-to-end simulator has been set up consisting of a forward simulator, a backward simulator, and a validation module. The forward simulator derives radiance datasets based on laboratory sample spectra, applies atmospheric contributions using radiative transfer equations, and simulates the instrument response using configurable sensor models. This is followed by the backward simulation branch, consisting of an atmospheric correction (AC), a temperature and emissivity separation (TES), or a hybrid AC and TES algorithm. An independent validation module allows the comparison between input and output datasets and the benchmarking of different processing algorithms. In this study, hyperspectral thermal infrared scenes of a variety of surfaces have been simulated to analyze existing AC and TES algorithms. The ARTEMISS algorithm was optimized and benchmarked against the original implementations. The errors in TES were found to be related to incorrect water vapor retrieval. The atmospheric characterization could be optimized, resulting in increased accuracy in temperature and emissivity retrieval. Airborne datasets of different spectral resolutions were simulated from terrestrial HyperCam-LW measurements. The simulated airborne radiance spectra were subjected to atmospheric correction and TES and further used for a plant species classification study analyzing effects related to noise and mixed pixels.
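    The forward step of such a simulator is essentially the standard thermal-infrared radiative transfer equation. The sketch below (our own, with assumed per-band atmospheric terms, not the authors' implementation) computes at-sensor radiance from surface temperature and emissivity:

    ```python
    import numpy as np

    h, c, kB = 6.626e-34, 2.998e8, 1.381e-23

    def planck(wl_um, T):
        """Spectral radiance in W m^-2 sr^-1 um^-1 at wavelength wl_um (microns)."""
        wl = wl_um * 1e-6
        return 2 * h * c ** 2 / wl ** 5 / (np.exp(h * c / (wl * kB * T)) - 1) * 1e-6

    wl = np.linspace(8.0, 12.0, 5)                   # um, LWIR bands
    eps = np.array([0.97, 0.96, 0.98, 0.97, 0.95])   # assumed surface emissivity
    tau, L_up, L_down = 0.85, 0.4, 1.2               # assumed atmospheric terms

    T_s = 300.0                                      # surface temperature, K
    L_sensor = tau * (eps * planck(wl, T_s) + (1 - eps) * L_down) + L_up
    print(np.round(L_sensor, 2))                     # at-sensor radiance per band
    ```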

  7. Non-rigid point set registration of curves: registration of the superficial vessel centerlines of the brain

    NASA Astrophysics Data System (ADS)

    Marreiros, Filipe M. M.; Wang, Chunliang; Rossitti, Sandro; Smedby, Örjan

    2016-03-01

    In this study we present a non-rigid point-set registration method for 3D curves (composed of sets of 3D points). The method was evaluated on the task of registering the 3D superficial vessels of the brain, where it was used to match vessel centerline points. It consists of a combination of Coherent Point Drift (CPD) and Thin-Plate Spline (TPS) semilandmarks: CPD performs the initial matching of the centerline 3D points, while the semilandmark method iteratively relaxes/slides the points. For the evaluation, a Magnetic Resonance Angiography (MRA) dataset was used. Deformations were applied to the extracted vessel centerlines to simulate brain bulging and sinking, using a TPS deformation in which a few control points were manipulated to obtain the desired transformation (T1). Once the correspondences are known, the corresponding points are used to define a new TPS deformation (T2). The errors are measured in the deformed space, by transforming the original points using T1 and T2 and measuring the distance between them. To simulate cases where the deformed vessel data are incomplete, parts of the reference vessels were cut and then deformed. Furthermore, anisotropic normally distributed noise was added. The results show that the error estimates (root mean square error and mean error) are below 1 mm, even in the presence of noise and incomplete data.

  8. Differential Die-Away Instrument: Report on Benchmark Measurements and Comparison with Simulation for the Effects of Neutron Poisons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodsell, Alison Victoria; Swinhoe, Martyn Thomas; Henzl, Vladimir

    2015-03-30

    In this report, new experimental data and MCNPX simulation results of the differential die-away (DDA) instrument response to the presence of neutron absorbers are evaluated. In our previous fresh nuclear fuel experiments and simulations, no neutron absorbers or poisons were included in the fuel definition. These new results showcase the capability of the DDA instrument to acquire data from a system that better mimics spent nuclear fuel.

  9. Analysis of the Effect of Environmental Conditions in Conducting Amphibious Assaults Using a Ship Simulator/Vessel-Response Model Proof-of-Concept Study

    DTIC Science & Technology

    2017-05-01

    Center ESRI Environmental Systems Research Institute GIS Geographic Information System HTML Hyper-Text Markup Language LCAC Landing Craft Air... loop.” The ship simulator bridge is generic in that its layout is similar to that found in a variety of ships. As shown in Figures 17 and 18, the... information stored in the geodatabases. The Hyper-Text Markup Language (HTML) capability built into ArcMap permits a planner to click on a vessel track and

  10. Commercial Building Energy Saver, Web App

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Piette, Mary; Lee, Sang Hoon

    The CBES App is a web-based toolkit for small businesses and for building owners and operators of small and medium-size commercial buildings to perform energy benchmarking and retrofit analysis. The CBES App analyzes the energy performance of the user's building pre- and post-retrofit, in conjunction with the user's input data, to identify recommended retrofit measures and the energy savings and economics of the selected measures. The CBES App provides energy benchmarking, including an EnergyStar score obtained through the EnergyStar API and benchmarking against California peer buildings using the EnergyIQ API. The retrofit analysis includes a preliminary analysis, which looks up retrofit measures from the pre-simulated database DEEP, and a detailed analysis, which creates and runs EnergyPlus models to calculate the energy savings of retrofit measures. The CBES App builds upon the LBNL CBES API.

  11. Benchmarking study of the MCNP code against cold critical experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sitaraman, S.

    1991-01-01

    The purpose of this study was to benchmark the widely used Monte Carlo code MCNP against a set of cold critical experiments with a view to using the code as a means of independently verifying the performance of faster but less accurate Monte Carlo and deterministic codes. The experiments simulated consisted of both fast and thermal criticals as well as fuel in a variety of chemical forms. A standard set of benchmark cold critical experiments was modeled. These included the two fast experiments, GODIVA and JEZEBEL, the TRX metallic uranium thermal experiments, the Babcock and Wilcox oxide and mixed oxide experiments, and the Oak Ridge National Laboratory (ORNL) and Pacific Northwest Laboratory (PNL) nitrate solution experiments. The principal case studied was a small critical experiment that was performed with boiling water reactor bundles.

  12. Creation of problem-dependent Doppler-broadened cross sections in the KENO Monte Carlo code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Shane W. D.; Celik, Cihangir; Maldonado, G. Ivan

    2015-11-06

    In this paper, we introduce a quick method for improving the accuracy of Monte Carlo simulations by generating one- and two-dimensional cross sections at a user-defined temperature before performing transport calculations. A finite difference method is used to Doppler-broaden cross sections to the desired temperature, and unit-base interpolation is done to generate the probability distributions for double-differential two-dimensional thermal moderator cross sections at any arbitrary user-defined temperature. The accuracy of these methods is tested using a variety of contrived problems. In addition, various benchmarks at elevated temperatures are modeled, and results are compared with benchmark results. Lastly, the problem-dependent cross sections are observed to produce eigenvalue estimates that are closer to the benchmark results than those without the problem-dependent cross sections.
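    For illustration only (this is not the KENO algorithm): Doppler broadening of a single resonance can be approximated by convolving the 0 K cross section with a Gaussian of width Γ_D = sqrt(4·E0·kB·T/A), the standard Doppler width. All resonance parameters below are toy values.

    ```python
    import numpy as np

    kB = 8.617e-5                                  # eV/K
    E = np.linspace(5.0, 8.0, 3001)                # eV grid around the resonance
    E0, Gamma, sigma0 = 6.67, 0.027, 2.0e4         # toy U-238-like resonance (barns)
    xs_cold = sigma0 * (Gamma / 2) ** 2 / ((E - E0) ** 2 + (Gamma / 2) ** 2)

    def broaden(xs, T, A=238.0):
        gd = np.sqrt(4.0 * E0 * kB * T / A)        # Doppler width, eV
        d = E[1] - E[0]
        u = (np.arange(E.size) - E.size // 2) * d  # centered offset grid
        kernel = np.exp(-(u / gd) ** 2)
        return np.convolve(xs, kernel / kernel.sum(), mode="same")

    for T in (300.0, 1200.0):
        print(f"T = {T:4.0f} K: peak {broaden(xs_cold, T).max():7.0f} b "
              f"(0 K peak {xs_cold.max():.0f} b)")
    ```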

  13. Validation of updated neutronic calculation models proposed for Atucha-II PHWR. Part I: Benchmark comparisons of WIMS-D5 and DRAGON cell and control rod parameters with MCNP5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mollerach, R.; Leszczynski, F.; Fink, J.

    2006-07-01

    In 2005 the Argentine Government took the decision to complete the construction of the Atucha-II nuclear power plant, whose construction had been progressing slowly during the previous ten years. Atucha-II is a 745 MWe nuclear station moderated and cooled with heavy water, of German (Siemens) design, located in Argentina. It has a pressure-vessel design with 451 vertical coolant channels, and the fuel assemblies (FA) are clusters of 37 natural UO2 rods with an active length of 530 cm. In the reactor physics area, a revision and update of the calculation methods and models (cell, supercell, and reactor) was recently carried out, covering cell, supercell (control rod), and core calculations. As a validation of the new models, benchmark comparisons were made against Monte Carlo calculations with MCNP5. This paper presents comparisons of cell and supercell benchmark problems, based on a slightly idealized model of the Atucha-I core, obtained with the WIMS-D5 and DRAGON codes against MCNP5 results. The Atucha-I core was selected because it is smaller, similar from a neutronic point of view, and more symmetric than Atucha-II. Cell parameters compared include cell k-infinity, relative power levels of the different rings of fuel rods, and some two-group macroscopic cross sections. Supercell comparisons include supercell k-infinity changes due to the control rods (tubes) of steel and hafnium. (authors)

  14. A Novel Pairwise Comparison-Based Method to Determine Radiation Dose Reduction Potentials of Iterative Reconstruction Algorithms, Exemplified Through Circle of Willis Computed Tomography Angiography.

    PubMed

    Ellmann, Stephan; Kammerer, Ferdinand; Brand, Michael; Allmendinger, Thomas; May, Matthias S; Uder, Michael; Lell, Michael M; Kramer, Manuel

    2016-05-01

    The aim of this study was to determine the dose reduction potential of iterative reconstruction (IR) algorithms in computed tomography angiography (CTA) of the circle of Willis using a novel method of evaluating the quality of radiation-dose-reduced images. This study relied on ReconCT, a proprietary reconstruction software that allows simulating CT scans acquired with reduced radiation dose based on the raw data of true scans. To evaluate the performance of ReconCT in this regard, a phantom study was performed to compare the image noise of true and simulated scans within simulated vessels of a head phantom. Following that, 10 patients scheduled for CTA of the circle of Willis were scanned according to our institute's standard protocol (100 kV, 145 reference mAs). Subsequently, CTA images of these patients were reconstructed either as full-dose weighted filtered back projections or with radiation dose reductions down to 10% of the full-dose level and Sinogram-Affirmed Iterative Reconstruction (SAFIRE) of either strength 3 or 5. Images were marked with arrows pointing to vessels of different sizes, and image pairs were presented to observers. Five readers assessed image quality with 2-alternative forced choice comparisons. In the phantom study, no significant differences were observed between the noise levels of simulated and true scans in filtered back projection, SAFIRE 3, and SAFIRE 5 reconstructions. The dose reduction potential for patient scans showed a strong dependence on IR strength as well as on the size of the vessel of interest. Thus, the potential radiation dose reductions ranged from 84.4% for the evaluation of great vessels reconstructed with SAFIRE 5 to 40.9% for the evaluation of small vessels reconstructed with SAFIRE 3. This study provides a novel image quality evaluation method based on 2-alternative forced choice comparisons. In CTA of the circle of Willis, higher IR strengths and greater vessel sizes allowed higher degrees of radiation dose reduction.

  15. Aortic dissection simulation models for clinical support: fluid-structure interaction vs. rigid wall models.

    PubMed

    Alimohammadi, Mona; Sherwood, Joseph M; Karimpour, Morad; Agu, Obiekezie; Balabani, Stavroula; Díaz-Zuccarini, Vanessa

    2015-04-15

    The management and prognosis of aortic dissection (AD) is often challenging and the use of personalised computational models is being explored as a tool to improve clinical outcome. Including vessel wall motion in such simulations can provide more realistic and potentially accurate results, but requires significant additional computational resources, as well as expertise. With clinical translation as the final aim, trade-offs between complexity, speed and accuracy are inevitable. The present study explores whether modelling wall motion is worth the additional expense in the case of AD, by carrying out fluid-structure interaction (FSI) simulations based on a sample patient case. Patient-specific anatomical details were extracted from computed tomography images to provide the fluid domain, from which the vessel wall was extrapolated. Two-way fluid-structure interaction simulations were performed, with coupled Windkessel boundary conditions and hyperelastic wall properties. The blood was modelled using the Carreau-Yasuda viscosity model and turbulence was accounted for via a shear stress transport model. A simulation without wall motion (rigid wall) was carried out for comparison purposes. The displacement of the vessel wall was comparable to reports from imaging studies in terms of intimal flap motion and contraction of the true lumen. Analysis of the haemodynamics around the proximal and distal false lumen in the FSI model showed complex flow structures caused by the expansion and contraction of the vessel wall. These flow patterns led to significantly different predictions of wall shear stress, particularly its oscillatory component, which were not captured by the rigid wall model. Through comparison with imaging data, the results of the present study indicate that the fluid-structure interaction methodology employed herein is appropriate for simulations of aortic dissection. Regions of high wall shear stress were not significantly altered by the wall motion, however, certain collocated regions of low and oscillatory wall shear stress which may be critical for disease progression were only identified in the FSI simulation. We conclude that, if patient-tailored simulations of aortic dissection are to be used as an interventional planning tool, then the additional complexity, expertise and computational expense required to model wall motion is indeed justified.
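    As background, the "coupled Windkessel boundary conditions" mentioned above are lumped-parameter outlet models. A minimal sketch of a three-element Windkessel (our own toy, with assumed parameter values, not the authors' code): proximal resistance Rp in series with a parallel compliance C and distal resistance Rd.

    ```python
    import numpy as np

    Rp, Rd, C = 0.05, 1.0, 1.2     # mmHg·s/mL and mL/mmHg, assumed values
    dt, T = 1e-3, 0.8              # s; time step and cardiac period

    def inflow(t):                 # toy systolic flow waveform, mL/s
        return 400 * np.sin(np.pi * t / 0.3) ** 2 if t < 0.3 else 0.0

    Pc = 80.0                      # distal (capacitor) pressure, mmHg
    trace = []
    for i in range(int(5 * T / dt)):      # run 5 cycles toward steady state
        t = (i * dt) % T
        Q = inflow(t)
        Pc += dt * (Q - Pc / Rd) / C      # charge/discharge of the compliance
        trace.append(Pc + Rp * Q)         # outlet pressure seen by the fluid
    print(f"outlet pressure range: {min(trace):.0f}-{max(trace):.0f} mmHg")
    ```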

  16. Validation of numerical codes for impact and explosion cratering: Impacts on strengthless and metal targets

    NASA Astrophysics Data System (ADS)

    Pierazzo, E.; Artemieva, N.; Asphaug, E.; Baldwin, E. C.; Cazamias, J.; Coker, R.; Collins, G. S.; Crawford, D. A.; Davison, T.; Elbeshausen, D.; Holsapple, K. A.; Housen, K. R.; Korycansky, D. G.; Wünnemann, K.

    2008-12-01

    Over the last few decades, rapid improvement of computer capabilities has allowed impact cratering to be modeled with increasing complexity and realism, and has paved the way for a new era of numerical modeling of the impact process, including full, three-dimensional (3D) simulations. When properly benchmarked and validated against observation, computer models offer a powerful tool for understanding the mechanics of impact crater formation. This work presents results from the first phase of a project to benchmark and validate shock codes. A variety of 2D and 3D codes were used in this study, from commercial products like AUTODYN, to codes developed within the scientific community like SOVA, SPH, ZEUS-MP, iSALE, and codes developed at U.S. National Laboratories like CTH, SAGE/RAGE, and ALE3D. Benchmark calculations of shock wave propagation in aluminum-on-aluminum impacts were performed to examine the agreement between codes for simple idealized problems. The benchmark simulations show that variability in code results is to be expected due to differences in the underlying solution algorithm of each code, artificial stability parameters, spatial and temporal resolution, and material models. Overall, the inter-code variability in peak shock pressure as a function of distance is around 10 to 20%. In general, if the impactor is resolved by at least 20 cells across its radius, the underestimation of peak shock pressure due to spatial resolution is less than 10%. In addition to the benchmark tests, three validation tests were performed to examine the ability of the codes to reproduce the time evolution of crater radius and depth observed in vertical laboratory impacts in water and two well-characterized aluminum alloys. Results from these calculations are in good agreement with experiments. There appears to be a general tendency of shock physics codes to underestimate the radius of the forming crater. Overall, the discrepancy between the model and experiment results is between 10 and 20%, similar to the inter-code variability.

  17. Design and Application of a Community Land Benchmarking System for Earth System Models

    NASA Astrophysics Data System (ADS)

    Mu, M.; Hoffman, F. M.; Lawrence, D. M.; Riley, W. J.; Keppel-Aleks, G.; Koven, C. D.; Kluzek, E. B.; Mao, J.; Randerson, J. T.

    2015-12-01

    Benchmarking has been widely used to assess the ability of climate models to capture the spatial and temporal variability of observations during the historical era. For the carbon cycle and terrestrial ecosystems, the design and development of an open-source community platform has been an important goal as part of the International Land Model Benchmarking (ILAMB) project. Here we developed a new benchmarking software system that enables the user to specify the models, benchmarks, and scoring metrics, so that results can be tailored to specific model intercomparison projects. Evaluation data sets included soil and aboveground carbon stocks, fluxes of energy, carbon and water, burned area, leaf area, and climate forcing and response variables. We used this system to evaluate simulations from the 5th Phase of the Coupled Model Intercomparison Project (CMIP5) with prognostic atmospheric carbon dioxide levels over the period from 1850 to 2005 (i.e., esmHistorical simulations archived on the Earth System Grid Federation). We found that the multi-model ensemble had a high bias in incoming solar radiation across Asia, likely as a consequence of incomplete representation of aerosol effects in this region, and in South America, primarily as a consequence of a low bias in mean annual precipitation. The reduced precipitation in South America had a larger influence on gross primary production than the high bias in incoming light, and as a consequence gross primary production had a low bias relative to the observations. Although model to model variations were large, the multi-model mean had a positive bias in atmospheric carbon dioxide that has been attributed in past work to weak ocean uptake of fossil emissions. In mid latitudes of the northern hemisphere, most models overestimate latent heat fluxes in the early part of the growing season, and underestimate these fluxes in mid-summer and early fall, whereas sensible heat fluxes show the opposite trend.
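    A generic sketch of the kind of scoring such a benchmarking system applies (our own construction, not the ILAMB implementation): compare a modeled field to observations, then map bias and RMSE into dimensionless scores in [0, 1] that can be aggregated across variables and models.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    obs = rng.normal(2.0, 0.5, size=(90, 180))          # e.g. GPP, kg C m-2 yr-1
    model = obs + rng.normal(0.3, 0.4, size=obs.shape)  # biased model field

    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    bias_score = np.exp(-abs(bias) / np.std(obs))       # 1 = unbiased
    rmse_score = np.exp(-rmse / np.std(obs))            # 1 = error-free
    overall = 0.5 * (bias_score + rmse_score)           # assumed equal weighting
    print(f"bias {bias:+.2f}, rmse {rmse:.2f}, score {overall:.2f}")
    ```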

  19. Development of Benchmark Examples for Delamination Onset and Fatigue Growth Prediction

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2011-01-01

    An approach for assessing the delamination propagation and growth capabilities in commercial finite element codes was developed and demonstrated for the Virtual Crack Closure Technique (VCCT) implementations in ABAQUS. The Double Cantilever Beam (DCB) specimen was chosen as an example. First, benchmark results to assess delamination propagation capabilities under static loading were created using models simulating specimens with different delamination lengths. For each delamination length modeled, the load and displacement at the load point were monitored. The mixed-mode strain energy release rate components were calculated along the delamination front across the width of the specimen. A failure index was calculated by correlating the results with the mixed-mode failure criterion of the graphite/epoxy material. The calculated critical loads and critical displacements for delamination onset for each delamination length modeled were used as a benchmark. The load/displacement relationship computed during automatic propagation should closely match the benchmark case. Second, starting from an initially straight front, the delamination was allowed to propagate based on the algorithms implemented in the commercial finite element software. The load-displacement relationship obtained from the propagation analysis results and the benchmark results were compared. Good agreement could be achieved by selecting the appropriate input parameters, which were determined in an iterative procedure.

  20. Cyclic crack growth behavior of reactor pressure vessel steels in light water reactor environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Der Sluys, W.A.; Emanuelson, R.H.

    1986-01-01

    During normal operation, light water reactor (LWR) pressure vessels are subjected to a variety of transients resulting in time-varying stresses. Consequently, fatigue and environmentally assisted fatigue are growth mechanisms relevant to flaws in these pressure vessels. In order to provide a better understanding of the resistance of nuclear pressure vessel steels to the flaw growth process, a series of fracture mechanics experiments was conducted to generate data on the rate of cyclic crack growth in SA508-2 and SA533B-1 steels in simulated 550°F boiling water reactor (BWR) and 550°F pressurized water reactor (PWR) environments. Areas investigated over the course of the test program included the effects of loading frequency and R ratio (Kmin/Kmax) on crack growth rate as a function of the stress intensity factor range (ΔK). In addition, the effect of the sulfur content of the test material on the cyclic crack growth rate was studied. Cyclic crack growth rates were found to be controlled by ΔK, R ratio, and loading frequency. The sulfur impurity content of the reactor pressure vessel steels studied had a significant effect on the cyclic crack growth rates: the higher growth rates were always associated with materials of higher sulfur content. For a given level of sulfur, growth rates were higher in a 550°F simulated BWR environment than in a 550°F simulated PWR environment. In both environments cyclic crack growth rates were a strong function of the loading frequency.

  1. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying uncertainty, the most important tasks are to analyze how the uncertainties arise and propagate, and how the simulations develop from benchmark models to new models. Based on the practical needs of engineering and the technology of verification & validation, a framework of QU (quantification of uncertainty) is put forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  2. Verification and benchmark testing of the NUFT computer code

    NASA Astrophysics Data System (ADS)

    Lee, K. H.; Nitao, J. J.; Kulshrestha, A.

    1993-10-01

    This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be Cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasianalytical solutions. In the benchmark testing, results of code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.

  3. Optimization of Deep Drilling Performance - Development and Benchmark Testing of Advanced Diamond Product Drill Bits & HP/HT Fluids to Significantly Improve Rates of Penetration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alan Black; Arnis Judzis

    2005-09-30

    This document details the progress to date on the OPTIMIZATION OF DEEP DRILLING PERFORMANCE--DEVELOPMENT AND BENCHMARK TESTING OF ADVANCED DIAMOND PRODUCT DRILL BITS AND HP/HT FLUIDS TO SIGNIFICANTLY IMPROVE RATES OF PENETRATION contract for the year starting October 2004 through September 2005. This industry cost-shared program aims to benchmark drilling rates of penetration in selected simulated deep formations and to significantly improve ROP through team development of aggressive diamond product drill bit-fluid system technologies. Overall the objectives are as follows: Phase 1--Benchmark ''best in class'' diamond and other product drilling bits and fluids and develop concepts for a next level of deep drilling performance; Phase 2--Develop advanced smart bit-fluid prototypes and test at large scale; and Phase 3--Field trial smart bit-fluid concepts, modify as necessary and commercialize products. As of the report date, TerraTek has concluded all Phase 1 testing and is planning Phase 2 development.

  4. Experimental Creep Life Assessment for the Advanced Stirling Convertor Heater Head

    NASA Technical Reports Server (NTRS)

    Krause, David L.; Kalluri, Sreeramesh; Shah, Ashwin R.; Korovaichuk, Igor

    2010-01-01

    The United States Department of Energy is planning to develop the Advanced Stirling Radioisotope Generator (ASRG) for the National Aeronautics and Space Administration (NASA) for potential use on future space missions. The ASRG provides substantial efficiency and specific power improvements over radioisotope power systems of heritage designs. The ASRG would use General Purpose Heat Source modules as energy sources and the free-piston Advanced Stirling Convertor (ASC) to convert heat into electrical energy. Lockheed Martin Corporation of Valley Forge, Pennsylvania, is integrating the ASRG systems, and Sunpower, Inc., of Athens, Ohio, is designing and building the ASC. NASA Glenn Research Center of Cleveland, Ohio, manages the Sunpower contract and provides technology development in several areas for the ASC. One area is reliability assessment for the ASC heater head, a critical pressure vessel within which heat is converted into mechanical oscillation of a displacer piston. For high system efficiency, the ASC heater head operates at very high temperature (850 C) and therefore is fabricated from an advanced heat-resistant nickel-based superalloy, Microcast MarM-247. Since use of MarM-247 in a thin-walled pressure vessel is atypical, much effort is required to assure that the system will operate reliably for its design life of 17 years. One life-limiting structural response for this application is creep; creep deformation is the accumulation of time-dependent inelastic strain under sustained loading over time. If allowed to progress, the deformation eventually results in creep rupture. Since creep material properties are not available in the open literature, a detailed creep life assessment effort for the ASC heater head is underway. This paper presents an overview of that creep life assessment approach, including the reliability-based creep criteria developed from coupon testing, and the associated heater head deterministic and probabilistic analyses. The approach also includes direct benchmark experimental creep assessment. This element provides high-fidelity creep testing of prototypical heater head test articles to investigate the relevant material issues and multiaxial stress state. Benchmark testing provides the data required to evaluate the complex life assessment methodology and to validate that analysis. Results from current benchmark heater head tests and newly developed experimental methods are presented. In the concluding remarks, the test results are shown to compare favorably with the creep strain predictions, providing the first experimental evidence of robust ASC heater head creep life.

  5. A Benchmark and Comparative Study of Video-Based Face Recognition on COX Face Database.

    PubMed

    Huang, Zhiwu; Shan, Shiguang; Wang, Ruiping; Zhang, Haihong; Lao, Shihong; Kuerban, Alifu; Chen, Xilin

    2015-12-01

    Face recognition with still face images has been widely studied, while research on video-based face recognition remains relatively inadequate, especially in terms of benchmark datasets and comparisons. Real-world video-based face recognition applications require techniques for three distinct scenarios: 1) Video-to-Still (V2S); 2) Still-to-Video (S2V); and 3) Video-to-Video (V2V), taking video or still images as query or target, respectively. To the best of our knowledge, few datasets and evaluation protocols have been established for all three scenarios. In order to facilitate the study of this specific topic, this paper contributes a benchmarking and comparative study based on a newly collected still/video face database, named COX Face DB. Specifically, we make three contributions. First, we collect and release a large-scale still/video face database to simulate video surveillance with three different video-based face recognition scenarios (i.e., V2S, S2V, and V2V). Second, for benchmarking the three scenarios designed on our database, we review and experimentally compare a number of existing set-based methods. Third, we further propose a novel Point-to-Set Correlation Learning (PSCL) method, and experimentally show that it can be used as a promising baseline method for V2S/S2V face recognition on COX Face DB. Extensive experimental results clearly demonstrate that video-based face recognition needs more effort, and our COX Face DB is a good benchmark database for evaluation.

  6. Mechanisms of Microgravity Effect on Vascular Function

    NASA Technical Reports Server (NTRS)

    Purdy, Ralph E.

    1995-01-01

    The overall goal of the project is to characterize the effects of simulated microgravity on vascular function. Microgravity is simulated using the hindlimb unweighted (HU) rat, and the following vessels are removed from HU and paired control rats for in vitro analysis: abdominal aorta, carotid and femoral arteries, jugular and femoral veins. These vessels are cut into 3 mm long rings and mounted in tissue baths for the measurement of either isometric contraction, or relaxation of pre- contracted vessels. The isolated mesenteric vascular bed is perfused for the measurement of changes in perfusion pressure as an index of arteriolar constriction or dilation. This report presents, in addition to the statement of the overall goal of the project, a summary list of the specific hypotheses to be tested. These are followed by sections on results, conclusions, significance and plans for the next year.

  7. A Zr-based bulk metallic glass for future stent applications: Materials properties, finite element modeling, and in vitro human vascular cell response.

    PubMed

    Huang, Lu; Pu, Chao; Fisher, Richard K; Mountain, Deidra J H; Gao, Yanfei; Liaw, Peter K; Zhang, Wei; He, Wei

    2015-10-01

    Despite the prevalent use of crystalline alloys in current vascular stent technology, new biomaterials are being actively sought after to improve stent performance. In this study, we demonstrated the potential of a Zr-Al-Fe-Cu bulk metallic glass (BMG) to serve as a candidate stent material. The mechanical properties of the Zr-based BMG, determined under both static and cyclic loadings, were characterized by high strength, which would allow for the design of thinner stent struts to improve stent biocompatibility. Finite element analysis further complemented the experimental results and revealed that a stent made of the Zr-based BMG was more compliant with the pulsation of a blood vessel, compared with medical 316L stainless steel. The Zr-based BMG was found to be corrosion resistant in a simulated body environment, owing to the presence of a highly stable ZrO2-rich surface passive film. Application-specific biocompatibility studies were conducted using human aortic endothelial cells and smooth muscle cells. The Zr-Al-Fe-Cu BMG was found to support stronger adhesion and faster coverage of endothelial cells and slower growth of smooth muscle cells than 316L stainless steel. These results suggest that the Zr-based BMG could promote re-endothelialization and potentially lower the risk of restenosis, which are critical to improve vascular stent implantation integration. In general, findings in this study raised the curtain for the potential application of BMGs as future candidates for stent applications. Vascular stents are medical devices typically used to restore the lumen of narrowed or clogged blood vessels. Despite the clinical success of metallic materials in stent-assisted angioplasty, post-surgery complications persist due to the mechanical failures, corrosion, and in-stent restenosis of current stents. To overcome these hurdles, strategies including new designs and surface functionalization have been explored. In addition, the development of new materials with higher performance and biocompatibility can intrinsically reduce stent failure rates. The present study demonstrates the advantages of a novel material, named bulk metallic glass (BMG), over the benchmark 316L stainless steel through experimental methods and computational simulations. It opens new research endeavors on BMGs as competitive alternatives for stent applications. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  8. Evaluation of the synoptic and mesoscale predictive capabilities of a mesoscale atmospheric simulation system

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K.; Keyser, D. A.; Mccumber, M. C.

    1983-01-01

    The overall performance characteristics of a limited area, hydrostatic, fine (52 km) mesh, primitive equation, numerical weather prediction model are determined in anticipation of satellite data assimilations with the model. The synoptic and mesoscale predictive capabilities of version 2.0 of this model, the Mesoscale Atmospheric Simulation System (MASS 2.0), were evaluated. The two-part study is based on a sample of approximately thirty 12 h and 24 h forecasts of atmospheric flow patterns during spring and early summer. The synoptic-scale evaluation results benchmark the performance of MASS 2.0 against that of an operational, synoptic-scale weather prediction model, the Limited-area Fine Mesh (LFM). The large sample allows for the calculation of statistically significant measures of forecast accuracy and the determination of systematic model errors. The synoptic-scale benchmark is required before unsmoothed mesoscale forecast fields can be seriously considered.

  9. Binary Associative Memories as a Benchmark for Spiking Neuromorphic Hardware

    PubMed Central

    Stöckel, Andreas; Jenzen, Christoph; Thies, Michael; Rückert, Ulrich

    2017-01-01

    Large-scale neuromorphic hardware platforms, specialized computer systems for energy efficient simulation of spiking neural networks, are being developed around the world, for example as part of the European Human Brain Project (HBP). Due to conceptual differences, a universal performance analysis of these systems in terms of runtime, accuracy and energy efficiency is non-trivial, yet indispensable for further hardware and software development. In this paper we describe a scalable benchmark based on a spiking neural network implementation of the binary neural associative memory. We treat neuromorphic hardware and software simulators as black boxes and execute exactly the same network description across all devices. Experiments on the HBP platforms under varying configurations of the associative memory show that the presented method makes it possible to test the quality of the neuron model implementation, and to explain significant deviations from the expected reference output. PMID:28878642
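
    The network at the heart of the benchmark is the classic binary (Willshaw-style) associative memory, which is simple enough to serve as a software reference for hardware comparisons. A minimal sketch with hypothetical pattern sizes, not the dimensions used in the paper:

        import numpy as np

        rng = np.random.default_rng(0)
        n, m, k, n_pairs = 64, 64, 4, 20   # hypothetical sizes, not the paper's

        def sparse_pattern(length, active):
            """Binary pattern with a fixed number of active bits."""
            p = np.zeros(length, dtype=np.uint8)
            p[rng.choice(length, active, replace=False)] = 1
            return p

        xs = [sparse_pattern(n, k) for _ in range(n_pairs)]
        ys = [sparse_pattern(m, k) for _ in range(n_pairs)]

        # Clipped Hebbian learning: a synapse is set once any stored pair
        # has both its input and output bits active.
        W = np.zeros((m, n), dtype=np.uint8)
        for x, y in zip(xs, ys):
            W |= np.outer(y, x).astype(np.uint8)

        def recall(x):
            """Threshold the dendritic sums at the number of active input bits."""
            return (W @ x >= x.sum()).astype(np.uint8)

        errors = sum(int(np.any(recall(x) != y)) for x, y in zip(xs, ys))
        print(f"{errors} of {n_pairs} stored pairs recalled with errors")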

  10. A simple numerical model for membrane oxygenation of an artificial lung machine

    NASA Astrophysics Data System (ADS)

    Subraveti, Sai Nikhil; Sai, P. S. T.; Viswanathan Pillai, Vinod Kumar; Patnaik, B. S. V.

    2015-11-01

    Optimal design of membrane oxygenators will have far-reaching ramifications for the development of artificial heart-lung systems. In the present CFD study, we simulate the gas exchange between the venous blood and the air that passes through the hollow fiber membranes of a benchmark device. The gas exchange between the tube-side fluid and the shell-side venous liquid is modeled by solving the mass and momentum conservation equations. The fiber bundle was modeled as a porous block with a bundle porosity of 0.6. The resistance offered by the fiber bundle was estimated by the standard Ergun correlation. The present numerical simulations are validated against available benchmark data. The effects of bundle porosity, bundle size, Reynolds number, non-Newtonian constitutive relation, and upstream velocity distribution on the pressure drop and oxygen saturation levels are investigated. To emulate the features of gas transfer past the alveoli, the effect of pulsatility on membrane oxygenation is also investigated.
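
    The porous-block treatment rests on the Ergun relation, which splits the pressure gradient across the bundle into viscous and inertial parts. A short sketch; apart from the porosity of 0.6 quoted in the abstract, the fluid and fiber parameters below are illustrative guesses, not values from the study:

        def ergun_pressure_gradient(u, eps, d, mu, rho):
            """Pressure gradient (Pa/m) through a packed or fiber bed (Ergun equation).

            u: superficial velocity (m/s), eps: bed porosity, d: effective
            particle/fiber diameter (m), mu: viscosity (Pa*s), rho: density (kg/m^3).
            """
            viscous = 150.0 * mu * (1.0 - eps) ** 2 * u / (eps ** 3 * d ** 2)
            inertial = 1.75 * rho * (1.0 - eps) * u ** 2 / (eps ** 3 * d)
            return viscous + inertial

        # Blood-like fluid through a fiber bundle with porosity 0.6 (illustrative).
        dp_dx = ergun_pressure_gradient(u=0.01, eps=0.6, d=300e-6, mu=3.5e-3, rho=1060.0)
        print(f"pressure gradient = {dp_dx:.0f} Pa/m")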

  11. Integrated Prediction and Mitigation Methods of Materials Damage and Lifetime Assessment during Plasma Operation and Various Instabilities in Fusion Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassanein, Ahmed

    2015-03-31

    This report describes implementation of comprehensive and integrated models to evaluate plasma material interactions during normal and abnormal plasma operations. The models, in full 3D simulations, represent state-of-the-art worldwide development with extensive benchmarking against various tokamak devices and plasma simulators. In addition, a significant amount of experimental work has been performed in our Center for Materials Under Extreme Environment (CMUXE) at Purdue to benchmark the effect of intense particle and heat fluxes on plasma-facing components. This represents one year's worth of work and has resulted in more than 23 journal publications and numerous conference presentations. The funding has helped several students to obtain their M.Sc. and Ph.D. degrees, and many of them are now faculty members in the US and around the world, teaching and conducting fusion research. Our work has also been recognized through many awards.

  12. Simulated annealing with probabilistic analysis for solving traveling salesman problems

    NASA Astrophysics Data System (ADS)

    Hong, Pei-Yee; Lim, Yai-Fung; Ramli, Razamin; Khalid, Ruzelan

    2013-09-01

    Simulated Annealing (SA) is a widely used meta-heuristic inspired by the annealing process of recrystallization in metals; its efficiency is therefore highly affected by the annealing schedule. In this paper, we present an empirical study to identify a well-performing annealing schedule for solving symmetric traveling salesman problems (TSP). A randomized complete block design is also used in this study. The results show that different parameters do affect the efficiency of SA, and thus we propose the best-found annealing schedule based on the post hoc test. SA was tested on seven selected benchmark problems of symmetric TSP with the proposed annealing schedule. The performance of SA was evaluated empirically against benchmark solutions, with simple analysis to validate the quality of solutions. Computational results show that the proposed annealing schedule provides good solution quality.
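
    The loop being tuned is the standard one: a 2-opt neighborhood explored at a falling temperature, with worse tours accepted with Boltzmann probability. A compact sketch with a geometric cooling schedule; the schedule parameters are placeholders, since choosing them well is exactly what the paper studies:

        import math
        import random

        def tour_length(tour, dist):
            return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

        def simulated_annealing(dist, t0=100.0, alpha=0.95, steps_per_t=200, t_min=1e-3):
            """Minimal SA for symmetric TSP; (t0, alpha, steps_per_t) are illustrative."""
            n = len(dist)
            tour = list(range(n))
            random.shuffle(tour)
            cur_len = tour_length(tour, dist)
            best, best_len = tour[:], cur_len
            t = t0
            while t > t_min:
                for _ in range(steps_per_t):
                    i, j = sorted(random.sample(range(n), 2))
                    cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt move
                    cand_len = tour_length(cand, dist)
                    # Accept improvements always, worse tours with Boltzmann probability.
                    if cand_len < cur_len or random.random() < math.exp((cur_len - cand_len) / t):
                        tour, cur_len = cand, cand_len
                        if cur_len < best_len:
                            best, best_len = tour[:], cur_len
                t *= alpha  # geometric cooling
            return best, best_len

        pts = [(random.random(), random.random()) for _ in range(30)]
        dist = [[math.dist(a, b) for b in pts] for a in pts]
        print(f"best tour length: {simulated_annealing(dist)[1]:.3f}")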

  13. The Equivalent Thermal Resistance of Tile Roofs with and without Batten Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, William A

    Clay and concrete tile roofs were installed on a fully instrumented attic test facility operating in East Tennessee's climate. Roof, attic and deck temperatures and heat flows were recorded for each of the tile roofs and also on an adjacent attic cavity covered with a conventionally pigmented and direct-nailed asphalt shingle roof. The data were used to benchmark a computer tool for the simulation of roofs and attics, and the tool was used to develop an approach for computing an equivalent seasonal R-value for sub-tile venting. The approach computed equal heat fluxes through the ceilings of roofs having different combinations of surface radiation properties and/or building constructions. A direct-nailed shingle roof served as a control for estimating the equivalent thermal resistance of the air space. Simulations were benchmarked against data in the ASHRAE Fundamentals for the thermal resistance of inclined and closed air spaces.

  14. Verification of gyrokinetic particle simulation of current-driven instability in fusion plasmas. I. Internal kink mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClenaghan, J.; Lin, Z.; Holod, I.

    The gyrokinetic toroidal code (GTC) capability has been extended for simulating internal kink instability with kinetic effects in toroidal geometry. The global simulation domain covers the magnetic axis, which is necessary for simulating current-driven instabilities. GTC simulation in the fluid limit of the kink modes in cylindrical geometry is verified by benchmarking with a magnetohydrodynamic eigenvalue code. Gyrokinetic simulations of the kink modes in the toroidal geometry find that ion kinetic effects significantly reduce the growth rate even when the banana orbit width is much smaller than the radial width of the perturbed current layer at the mode rational surface.

  15. Large eddy simulation of the FDA benchmark nozzle for a Reynolds number of 6500.

    PubMed

    Janiga, Gábor

    2014-04-01

    This work investigates the flow in a benchmark nozzle model of an idealized medical device proposed by the FDA using computational fluid dynamics (CFD). Previous studies showed that proper modeling of the transitional flow features is particularly challenging, leading to large discrepancies and inaccurate predictions from the different research groups using Reynolds-averaged Navier-Stokes (RANS) modeling. In spite of the relatively simple, axisymmetric computational geometry, the resulting turbulent flow is fairly complex and non-axisymmetric, in particular due to the sudden expansion. The resulting flow cannot be well predicted with simple modeling approaches. Due to the varying diameters and flow velocities encountered in the nozzle, different typical flow regions and regimes can be distinguished, from laminar to transitional and to weakly turbulent. The purpose of the present work is to re-examine the FDA-CFD benchmark nozzle model at a Reynolds number of 6500 using large eddy simulation (LES). The LES results are compared with published experimental data obtained by Particle Image Velocimetry (PIV), and excellent agreement is observed for the temporally averaged flow velocities. Different flow regimes are characterized by computing the temporal energy spectra at different locations along the main axis. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Microbially Mediated Kinetic Sulfur Isotope Fractionation: Reactive Transport Modeling Benchmark

    NASA Astrophysics Data System (ADS)

    Wanner, C.; Druhan, J. L.; Cheng, Y.; Amos, R. T.; Steefel, C. I.; Ajo Franklin, J. B.

    2014-12-01

    Microbially mediated sulfate reduction is a ubiquitous process in many subsurface systems. Isotopic fractionation is characteristic of this anaerobic process, since sulfate reducing bacteria (SRB) favor the reduction of the lighter sulfate isotopologue (³²SO₄²⁻) over the heavier isotopologue (³⁴SO₄²⁻). Detection of isotopic shifts has been utilized as a proxy for the onset of sulfate reduction in subsurface systems such as oil reservoirs and aquifers undergoing uranium bioremediation. Reactive transport modeling (RTM) of kinetic sulfur isotope fractionation has been applied to field and laboratory studies. These RTM approaches employ different mathematical formulations in the representation of kinetic sulfur isotope fractionation. In order to test the various formulations, we propose a benchmark problem set for the simulation of kinetic sulfur isotope fractionation during microbially mediated sulfate reduction. The benchmark problem set is comprised of four problem levels and is based on a recent laboratory column experimental study of sulfur isotope fractionation. Pertinent processes impacting sulfur isotopic composition, such as microbial sulfate reduction and dispersion, are included in the problem set. To date, participating RTM codes are: CRUNCHTOPE, TOUGHREACT, MIN3P and THE GEOCHEMIST'S WORKBENCH. Preliminary results from various codes show reasonable agreement for the problem levels simulating sulfur isotope fractionation in 1D.
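
    In closed-system form, the behavior these codes must reproduce reduces to the classic Rayleigh relation between the remaining sulfate fraction and its isotopic composition. A brief sketch, using a typical enrichment factor for sulfate-reducing bacteria rather than a value specified by the benchmark:

        import numpy as np

        def rayleigh_delta34S(delta0, eps, f):
            """delta-34S (permil) of residual sulfate under Rayleigh distillation.

            delta0: initial composition (permil); eps: kinetic enrichment factor
            (permil, negative for sulfate reduction since 32S reacts faster);
            f: fraction of sulfate remaining. Uses the exact Rayleigh form.
            """
            alpha = 1.0 + eps / 1000.0
            r = (delta0 / 1000.0 + 1.0) * f ** (alpha - 1.0)
            return (r - 1.0) * 1000.0

        # eps = -30 permil is typical of SRB cultures (illustrative, not from the set).
        f = np.linspace(1.0, 0.1, 10)
        print(np.round(rayleigh_delta34S(0.0, -30.0, f), 2))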

  17. An Approach to Assess Delamination Propagation Simulation Capabilities in Commercial Finite Element Codes

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2008-01-01

    An approach for assessing the delamination propagation simulation capabilities in commercial finite element codes is presented and demonstrated. For this investigation, the Double Cantilever Beam (DCB) specimen and the Single Leg Bending (SLB) specimen were chosen for full three-dimensional finite element simulations. First, benchmark results were created for both specimens. Second, starting from an initially straight front, the delamination was allowed to propagate. The load-displacement relationship and the total strain energy obtained from the propagation analysis results and the benchmark results were compared, and good agreement could be achieved by selecting the appropriate input parameters. Selecting the appropriate input parameters, however, was not straightforward and often required an iterative procedure. Qualitatively, the delamination front computed for the DCB specimen did not take the shape of a curved front as expected. However, the analysis of the SLB specimen yielded a curved front as was expected from the distribution of the energy release rate and the failure index across the width of the specimen. Overall, the results are encouraging but further assessment on a structural level is required.

  18. Mesoscale Simulation of Blood Flow in Small Vessels

    PubMed Central

    Bagchi, Prosenjit

    2007-01-01

    Computational modeling of blood flow in microvessels with internal diameter 20–500 μm is a major challenge. This is because blood in such vessels behaves as a multiphase suspension of deformable particles. A continuum model of blood is not adequate if the motion of individual red blood cells in the suspension is of interest. At the same time, multiple cells, often a few thousand in number, must also be considered to account for cell-cell hydrodynamic interaction. Moreover, the red blood cells (RBCs) are highly deformable. Deformation of the cells must also be considered in the model, as it is a major determinant of many physiologically significant phenomena, such as the formation of a cell-free layer and the Fahraeus-Lindqvist effect. In this article, we present two-dimensional computational simulation of blood flow in vessels of size 20–300 μm at discharge hematocrit of 10–60%, taking into consideration the particulate nature of blood and cell deformation. The numerical model is based on the immersed boundary method, and the red blood cells are modeled as liquid capsules. A large RBC population comprising as many as 2500 cells is simulated. Migration of the cells normal to the wall of the vessel and the formation of the cell-free layer are studied. Results on the trajectory and velocity traces of the RBCs, and their fluctuations, are presented. Also presented are the results on the plug-flow velocity profile of blood, the apparent viscosity, and the Fahraeus-Lindqvist effect. The numerical results also allow us to investigate the variation of apparent blood viscosity along the cross-section of a vessel. The computational results are compared with the experimental results. To the best of our knowledge, this article presents the first simulation to simultaneously consider a large ensemble of red blood cells and the cell deformation. PMID:17208982

  19. Effect of blood vessels on light distribution in optogenetic stimulation of cortex.

    PubMed

    Azimipour, Mehdi; Atry, Farid; Pashaie, Ramin

    2015-05-15

    In this Letter, the impact of blood vessels on light distribution during photostimulation of cortical tissue in small rodents is investigated. Brain optical properties were extracted using a double-integrating sphere setup, and optical coherence tomography was used to image cortical vessels and capillaries to generate a three-dimensional angiogram of the cortex. By combining these two datasets, a complete volumetric structure of the cortical tissue was developed and linked to a Monte Carlo code which simulates light propagation in this inhomogeneous structure and illustrates the effect of blood vessels on the penetration depth and pattern preservation in optogenetic stimulation.

  20. Low energy electron transport in furfural

    NASA Astrophysics Data System (ADS)

    Lozano, Ana I.; Krupa, Kateryna; Ferreira da Silva, Filipe; Limão-Vieira, Paulo; Blanco, Francisco; Muñoz, Antonio; Jones, Darryl B.; Brunger, Michael J.; García, Gustavo

    2017-09-01

    We report on an initial investigation into the transport of electrons through a gas cell containing 1 mTorr of gaseous furfural. Results from our Monte Carlo simulation are implicitly checked against those from a corresponding electron transmission measurement. To enable this simulation a self-consistent cross section data base was constructed. This data base is benchmarked through new total cross section measurements which are also described here. In addition, again to facilitate the simulation, our preferred energy loss distribution function is presented and discussed.

  1. Neutron streaming studies along JET shielding penetrations

    NASA Astrophysics Data System (ADS)

    Stamatelatos, Ion E.; Vasilopoulou, Theodora; Batistoni, Paola; Obryk, Barbara; Popovichev, Sergey; Naish, Jonathan

    2017-09-01

    Neutronic benchmark experiments are carried out at JET aiming to assess the neutronic codes and data used in ITER analysis. Among other activities, experiments are performed in order to validate neutron streaming simulations along long penetrations in the JET shielding configuration. In this work, neutron streaming calculations along the JET personnel entrance maze are presented. Simulations were performed using the MCNP code for Deuterium-Deuterium and Deuterium-Tritium plasma sources. The results of the simulations were compared against experimental data obtained using thermoluminescence detectors and activation foils.

  2. Shuttle Main Propulsion System LH2 Feed Line and Inducer Simulations

    NASA Technical Reports Server (NTRS)

    Dorney, Daniel J.; Rothermel, Jeffry

    2002-01-01

    This viewgraph presentation includes simulations of the unsteady flow field in the LH2 feed line, flow liner, backing cavity, and inducer of Shuttle engine #1. It also evaluates aerodynamic forcing functions which may contribute to the formation of the cracks observed on the flow liner slots. The presentation lists the numerical methods used and profiles a benchmark test case.

  3. Validation of Tendril TrueHome Using Software-to-Software Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maguire, Jeffrey B; Horowitz, Scott G; Moore, Nathan

    This study performed comparative evaluation of EnergyPlus version 8.6 and Tendril TrueHome, two physics-based home energy simulation models, to identify differences in energy consumption predictions between the two programs and resolve discrepancies between them. EnergyPlus is considered a benchmark, best-in-class software tool for building energy simulation. This exercise sought to improve both software tools through additional evaluation/scrutiny.

  4. A simulation study of thinning and fuel treatments on a wildland-urban interface in eastern Oregon, USA

    Treesearch

    Alan A. Ager; Andrew J. McMahan; James J. Barrett; Charles W. McHugh

    2007-01-01

    We simulated long-term forest management activities on a 16,000-ha wildland-urban interface in the Blue Mountains near La Grande, Oregon. The study area is targeted for thinning and fuels treatments on both private and Federally managed lands to address forest health and sustainability concerns and reduce the risk of severe wildfire. We modeled a number of benchmark...

  5. Gaming in risk-adjusted mortality rates: effect of misclassification of risk factors in the benchmarking of cardiac surgery risk-adjusted mortality rates.

    PubMed

    Siregar, Sabrina; Groenwold, Rolf H H; Versteegh, Michel I M; Noyez, Luc; ter Burg, Willem Jan P P; Bots, Michiel L; van der Graaf, Yolanda; van Herwerden, Lex A

    2013-03-01

    Upcoding or undercoding of risk factors could affect the benchmarking of risk-adjusted mortality rates. The aim was to investigate the effect of misclassification of risk factors on the benchmarking of mortality rates after cardiac surgery. A prospective cohort was used comprising all adult cardiac surgery patients in all 16 cardiothoracic centers in The Netherlands from January 1, 2007, to December 31, 2009. A random effects model, including the logistic European system for cardiac operative risk evaluation (EuroSCORE) was used to benchmark the in-hospital mortality rates. We simulated upcoding and undercoding of 5 selected variables in the patients from 1 center. These patients were selected randomly (nondifferential misclassification) or by the EuroSCORE (differential misclassification). In the random patients, substantial misclassification was required to affect benchmarking: a 1.8-fold increase in prevalence of the 4 risk factors changed an underperforming center into an average performing one. Upcoding of 1 variable required even more. When patients with the greatest EuroSCORE were upcoded (ie, differential misclassification), a 1.1-fold increase was sufficient: moderate left ventricular function from 14.2% to 15.7%, poor left ventricular function from 8.4% to 9.3%, recent myocardial infarction from 7.9% to 8.6%, and extracardiac arteriopathy from 9.0% to 9.8%. Benchmarking using risk-adjusted mortality rates can be manipulated by misclassification of the EuroSCORE risk factors. Misclassification of random patients or of single variables will have little effect. However, limited upcoding of multiple risk factors in high-risk patients can greatly influence benchmarking. To minimize "gaming," the prevalence of all risk factors should be carefully monitored. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
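
    The mechanism of the effect is visible in any logistic risk model: each upcoded factor adds its coefficient to the linear predictor, which multiplies a high-risk patient's expected mortality and flatters the observed-to-expected ratio. A sketch with deliberately hypothetical coefficients, not the published logistic EuroSCORE values:

        import math

        # Hypothetical coefficients for illustration only.
        INTERCEPT = -4.8
        BETA = {"poor_lv": 1.1, "recent_mi": 0.55, "arteriopathy": 0.66}

        def predicted_mortality(risk_factors):
            """Predicted in-hospital mortality from a toy logistic risk model."""
            z = INTERCEPT + sum(BETA[f] for f in risk_factors)
            return 1.0 / (1.0 + math.exp(-z))

        # Upcoding one extra factor in an already high-risk patient raises the
        # expected mortality against which observed deaths are benchmarked.
        print(f"{predicted_mortality({'poor_lv'}):.2%}")
        print(f"{predicted_mortality({'poor_lv', 'arteriopathy'}):.2%}")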

  6. Results of the 2013 UT modeling benchmark obtained with models implemented in CIVA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toullelan, Gwénaël; Raillon, Raphaële; Chatillon, Sylvain

    The 2013 Ultrasonic Testing (UT) modeling benchmark concerns direct echoes from side-drilled holes (SDH) and flat-bottom holes (FBH) and corner echoes from backwall-breaking artificial notches inspected with a matrix phased array probe. This communication presents the results obtained with the models implemented in the CIVA software: the pencil model is used to compute the field radiated by the probe, the Kirchhoff approximation is applied to predict the response of FBH and notches, and the SOV (Separation Of Variables) model is used for the SDH responses. The comparison between simulated and experimental results is presented and discussed.

  7. Benchmarking of relative permeability

    NASA Astrophysics Data System (ADS)

    DiCarlo, D. A.

    2017-12-01

    Relative permeability is the key relation in terms of multi-phase flow through porous media. There are hundreds of published relative permeability curves for various media, some classic (Oak 90 and 91), some contradictory. This can lead to a confusing situation if one is trying to benchmark simulation results to "experimental data". Coming from the experimental side, I have found that modelers have too much trust in relative permeability data sets. In this talk, I will discuss reasons for discrepancies within and between data sets, and give guidance on which portions of the data sets are most solid in terms of matching through models.
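
    When benchmarking a simulator against such data, the measured points are usually condensed into a parametric curve; a common choice is a Brooks-Corey power law in effective saturation, sketched here with illustrative endpoint saturations and exponent:

        def brooks_corey_krw(sw, swr=0.2, snr=0.15, nw=4.0):
            """Wetting-phase relative permeability vs. saturation (Brooks-Corey form).

            swr/snr: residual wetting and non-wetting saturations; nw: exponent.
            All parameter values here are illustrative, not fitted to any data set.
            """
            se = (sw - swr) / (1.0 - swr - snr)   # effective (mobile) saturation
            se = min(max(se, 0.0), 1.0)
            return se ** nw

        for sw in (0.3, 0.5, 0.7):
            print(f"Sw = {sw}: krw = {brooks_corey_krw(sw):.4f}")

    The fitted exponents and endpoints are exactly where published curves for nominally similar media diverge, which is the caution raised above.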

  8. Construction of topological structure of 3D coronary vessels for analysis of catheter navigation in interventional cardiology simulation

    NASA Astrophysics Data System (ADS)

    Wang, Yaoping; Chui, Cheekong K.; Cai, Yiyu; Mak, KoonHou

    1998-06-01

    This study presents an approach to building a 3D model of the coronary vascular system for the development of a virtual cardiology simulator. The 3D model of the coronary arterial tree is reconstructed from geometric information segmented from the Visible Human data set for physical analysis of catheterization. The process of segmentation is guided by a 3D topological hierarchy of the coronary vessels, which is obtained from a mechanical model by using Coordinate Measuring Machine (CMM) probing. This mechanical professional model includes all major coronary arteries, ranging from the right coronary artery to the atrioventricular branch and from the left main trunk to the left anterior descending branch. All of these branches are considered the main operating sites for cardiology catheterization. Along with the primary arterial vasculature and the accompanying secondary and tertiary networks obtained from a previous work, a more complete vascular structure can then be built for the simulation of catheterization. A novel method has been developed for real-time finite element analysis of catheter navigation based on this featured vasculature.

  9. Study on underclad cracking in nuclear reactor vessel steels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horiya, T.; Takeda, T.; Yamato, K.

    1985-02-01

    Susceptibility to underclad cracking in nuclear reactor vessel steels, such as SA533 Grade B Class 1 and SA508 Class 2, was studied in detail. A convenient simulation test method using small simulated-HAZ specimens has been developed for quantitative evaluation of susceptibility to underclad cracks. The method can precisely predict the cracking behavior in weldments of steels with relatively low crack susceptibility. The effect of chemical composition on susceptibility to cracking was examined systematically using the developed simulation test method, and the following index was obtained from the test results: U = 20(V) + 7(C) + 4(Mo) + (Cr) + (Cu) - 0.5(Mn) + 1.5 log(X), where X = Al if Al/2N ≤ 1 and X = 2N if Al/2N > 1. It was confirmed that the new index (U) is useful for the prediction of crack susceptibility of the nuclear vessel steels; i.e., no crack initiation is detected in weldments in the roller bend test for steels having a U value below 0.90.
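
    The index is straightforward to evaluate once a composition is known. A sketch, assuming element contents in wt% and a base-10 logarithm, neither of which the abstract states explicitly:

        import math

        def underclad_cracking_index(comp):
            """Susceptibility index U from the empirical relation above.

            comp: element contents, assumed wt%. X is Al when Al/2N <= 1,
            otherwise 2N. No cracking is predicted for U below 0.90.
            """
            x = comp["Al"] if comp["Al"] / (2.0 * comp["N"]) <= 1.0 else 2.0 * comp["N"]
            return (20.0 * comp["V"] + 7.0 * comp["C"] + 4.0 * comp["Mo"]
                    + comp["Cr"] + comp["Cu"] - 0.5 * comp["Mn"]
                    + 1.5 * math.log10(x))

        # Hypothetical SA508-like composition, for illustration only.
        comp = {"V": 0.01, "C": 0.20, "Mo": 0.60, "Cr": 0.35,
                "Cu": 0.05, "Mn": 1.40, "Al": 0.02, "N": 0.008}
        print(f"U = {underclad_cracking_index(comp):.2f}")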

  10. Modeling of Non-Homogeneous Containment Atmosphere in the ThAI Experimental Facility Using a CFD Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babic, Miroslav; Kljenak, Ivo; Mavko, Borut

    2006-07-01

    The CFD code CFX4.4 was used to simulate an experiment in the ThAI facility, which was designed for investigation of thermal-hydraulic processes during a severe accident inside a Light Water Reactor containment. In the considered experiment, air was initially present in the vessel, and helium and steam were injected during different phases of the experiment at various mass flow rates and at different locations. The main purpose of the simulation was to reproduce the non-homogeneous temperature and species concentration distributions in the ThAI experimental facility. A three-dimensional model of the ThAI vessel for the CFX4.4 code was developed. The flow in the simulation domain was modeled as single-phase. Steam condensation on vessel walls was modeled as a sink of mass and energy using a correlation that was originally developed for an integral approach. A simple model of bulk phase change was also introduced. The calculated time-dependent variables together with temperature and concentration distributions at the end of experiment phases are compared to experimental results. (authors)

  11. Three dimensional computed tomography lung modeling is useful in simulation and navigation of lung cancer surgery.

    PubMed

    Ikeda, Norihiko; Yoshimura, Akinobu; Hagiwara, Masaru; Akata, Soichi; Saji, Hisashi

    2013-01-01

    The number of minimally invasive operations, such as video-assisted thoracoscopic surgery (VATS) lobectomy or segmentectomy, has increased enormously in recent years. These operations require detailed knowledge of the anatomy of the pulmonary vessels and bronchi in each patient, and surgeons must carefully dissect the branches of the pulmonary vessels during the operation. Thus, foreknowledge of the anatomy of each patient greatly contributes to the safety and accuracy of the operation. The development of multi-detector computed tomography (MDCT) has enabled three-dimensional (3D) imaging of lung structures. It is possible to see the vascular and bronchial structures from the operator's point of view; therefore, such imaging is employed for preoperative simulation as well as navigation during the operation. Due to advances in software, even small vessels can be accurately imaged, which is useful in performing segmentectomy. Surgical simulation and navigation systems based on high-quality 3D lung modeling, including vascular and bronchial structures, can be used routinely to enhance operative safety and the education of junior staff, as well as to provide a greater sense of security to the operators.

  12. [Simulation of lung lobe resection with personal computer].

    PubMed

    Onuki, T; Murasugi, M; Mae, M; Koyama, K; Ikeda, T; Shimizu, T

    2005-09-01

    Various patterns of branching are seen for the pulmonary arteries and veins in the lung hilum. However, thoracic surgeons usually cannot discern much of this anatomical detail preoperatively. If the surgeon can gain an understanding of the individual pattern preoperatively, the risks inherent in exposing the pulmonary vessels in the hilum can be avoided, reducing invasiveness. This software meets the increasing needs of surgeons in video-assisted thoracoscopic surgery (VATS), which favors less dissection of the hilar vessels and bronchus. We have produced free application software with which we can mark the pulmonary arteries, veins, bronchus, and tumor on successive computed tomography (CT) images. After receiving a compact disk containing 60 images of 2 mm CT slices, from tumor to hilum, in DICOM format, we required only 1 hour to obtain 3-dimensional images for a patient with other free software (Metasequoia LE). Furthermore, with Metasequoia LE, we can simulate cutting the vessels and change their figures 3-dimensionally. Although the picture image leaves much room for improvement, we believe it is very attractive for residents because they can simulate operations.

  13. Shock-induced collapse of a bubble inside a deformable vessel

    PubMed Central

    Coralic, Vedran; Colonius, Tim

    2013-01-01

    Shockwave lithotripsy repeatedly focuses shockwaves on kidney stones to induce their fracture, partially through cavitation erosion. A typical side effect of the procedure is hemorrhage, which is potentially the result of the growth and collapse of bubbles inside blood vessels. To identify the mechanisms by which shock-induced collapse could lead to the onset of injury, we study an idealized problem involving a preexisting bubble in a deformable vessel. We utilize a high-order accurate, shock- and interface-capturing, finite-volume scheme and simulate the three-dimensional shock-induced collapse of an air bubble immersed in a cylindrical water column which is embedded in a gelatin/water mixture. The mixture is a soft tissue simulant, 10% gelatin by weight, and is modeled by the stiffened gas equation of state. The bubble dynamics of this model configuration are characterized by the collapse of the bubble and its subsequent jetting in the direction of the propagation of the shockwave. The vessel wall, which is defined by the material interface between the water and gelatin/water mixture, is invaginated by the collapse and distended by the impact of the jet. The present results show that the highest measured pressures and deformations occur when the volumetric confinement of the bubble is strongest, the bubble is nearest the vessel wall and/or the angle of incidence of the shockwave reduces the distance between the jet tip and the nearest vessel surface. For a particular case considered, the 40 MPa shockwave utilized in this study to collapse the bubble generated a vessel wall pressure of almost 450 MPa and produced both an invagination and distention of nearly 50% of the initial vessel radius on a 𝒪(10) ns timescale. These results are indicative of the significant potential of shock-induced collapse to contribute to the injury of blood vessels in shockwave lithotripsy. PMID:24015027
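
    The stiffened gas closure used for the water and gelatin/water phases is a one-line equation of state. A sketch with commonly quoted water parameters; calibrations vary, and these numbers are not taken from the paper:

        def stiffened_gas_pressure(rho, e, gamma, p_inf):
            """Stiffened gas EOS: p = (gamma - 1) * rho * e - gamma * p_inf."""
            return (gamma - 1.0) * rho * e - gamma * p_inf

        # Commonly quoted stiffened-gas parameters for water (illustrative).
        GAMMA_W, PINF_W = 6.12, 3.43e8
        rho, e = 1000.0, 4.2e5   # density (kg/m^3), specific internal energy (J/kg)
        print(f"p = {stiffened_gas_pressure(rho, e, GAMMA_W, PINF_W) / 1e6:.1f} MPa")

    The large p_inf term is what lets a single gamma-law-like expression mimic the stiffness of a nearly incompressible liquid.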

  14. Field Test of a Hybrid Finite-Difference and Analytic Element Regional Model.

    PubMed

    Abrams, D B; Haitjema, H M; Feinstein, D T; Hunt, R J

    2016-01-01

    Regional finite-difference models often have cell sizes that are too large to sufficiently model well-stream interactions. Here, a steady-state hybrid model is applied whereby the upper layer or layers of a coarse MODFLOW model are replaced by the analytic element model GFLOW, which represents surface waters and wells as line and point sinks. The two models are coupled by transferring cell-by-cell leakage obtained from the original MODFLOW model to the bottom of the GFLOW model. A real-world test of the hybrid model approach is applied on a subdomain of an existing model of the Lake Michigan Basin. The original (coarse) MODFLOW model consists of six layers, the top four of which are aggregated into GFLOW as a single layer, while the bottom two layers remain part of MODFLOW in the hybrid model. The hybrid model and a refined "benchmark" MODFLOW model simulate similar baseflows. The hybrid and benchmark models also simulate similar baseflow reductions due to nearby pumping when the well is located within the layers represented by GFLOW. However, the benchmark model requires refinement of the model grid in the local area of interest, while the hybrid approach uses a gridless top layer and is thus unaffected by grid discretization errors. The hybrid approach is well suited to facilitate cost-effective retrofitting of existing coarse grid MODFLOW models commonly used for regional studies because it leverages the strengths of both finite-difference and analytic element methods for predictions in mildly heterogeneous systems that can be simulated with steady-state conditions. © 2015, National Ground Water Association.

  15. Retinal blood vessel segmentation in high resolution fundus photographs using automated feature parameter estimation

    NASA Astrophysics Data System (ADS)

    Orlando, José Ignacio; Fracchia, Marcos; del Río, Valeria; del Fresno, Mariana

    2017-11-01

    Several ophthalmological and systemic diseases are manifested through pathological changes in the properties and the distribution of the retinal blood vessels. The characterization of such alterations requires the segmentation of the vasculature, which is a tedious and time-consuming task that is infeasible to perform manually. Numerous attempts have been made to propose automated methods for segmenting the retinal vasculature from fundus photographs, although their application in real clinical scenarios is usually limited by their ability to deal with images taken at different resolutions. This is likely due to the large number of parameters that have to be properly calibrated according to each image scale. In this paper we propose to apply a novel strategy for automated feature parameter estimation, combined with a vessel segmentation method based on fully connected conditional random fields. The estimation model is learned by linear regression from structural properties of the images and known optimal configurations that were previously obtained for low resolution data sets. Our experiments on high resolution images show that this approach is able to estimate appropriate configurations that are suitable for performing the segmentation task without requiring to re-engineer parameters. Furthermore, our combined approach reported state-of-the-art performance on the benchmark data set HRF, as measured in terms of the F1-score and the Matthews correlation coefficient.

  16. A Multi-Anatomical Retinal Structure Segmentation System for Automatic Eye Screening Using Morphological Adaptive Fuzzy Thresholding

    PubMed Central

    Elleithy, Khaled; Elleithy, Abdelrahman

    2018-01-01

    An eye exam can be as efficacious as a physical one in determining health concerns. Retina screening can be the very first clue for detecting a variety of hidden health issues, including pre-diabetes and diabetes. Through the process of clinical diagnosis and prognosis, ophthalmologists rely heavily on the binary segmented version of the retina fundus image, where the accuracy of the segmented vessels, optic disc, and abnormal lesions strongly affects the diagnostic accuracy, which in turn affects the subsequent clinical treatment steps. This paper proposes an automated retinal fundus image segmentation system composed of three segmentation subsystems that follow the same core segmentation algorithm. Despite broad differences in features and characteristics, retinal vessels, optic disc, and exudate lesions are extracted by each subsystem without the need for texture analysis or synthesis. For the sake of compact diagnosis and complete clinical insight, our proposed system can detect these anatomical structures in one session with high accuracy, even in pathological retina images. The proposed system uses a robust hybrid segmentation algorithm that combines adaptive fuzzy thresholding and mathematical morphology. The proposed system is validated using four benchmark datasets: DRIVE and STARE (vessels), DRISHTI-GS (optic disc), and DIARETDB1 (exudates lesions). Competitive segmentation performance is achieved, outperforming a variety of up-to-date systems and demonstrating the capacity to deal with other heterogeneous anatomical structures. PMID:29888146

  17. Comparative cath-lab assessment of coronary stenosis by radiology technician, junior and senior interventional cardiologist in patients treated with coronary angioplasty

    PubMed Central

    Delli Carri, Felice; Ruggiero, Maria Assunta; Cuculo, Andrea; Ruggiero, Antonio; Ziccardi, Luigi; De Gennaro, Luisa; Di Biase, Matteo

    2014-01-01

    Background Exact quantification of plaque extension during coronary angioplasty (PCI) usually falls to the interventional cardiologist (IC). Quantitative coronary stenosis assessment (QCA) might instead be delegated to the radiology technician (RT), who usually supports the cath-lab nurse and IC during PCI. We therefore sought to investigate the reliability of QCA performed by the RT in comparison with the IC. Methods Forty-four consecutive patients with acute coronary syndrome underwent PCI; target coronary vessel size beneath the target coronary lesion (S) and target coronary lesion length (L) were assessed by the RT, junior IC (JIC), and senior IC (SIC) and then compared. The SIC evaluation, which determined the final stent selection for coronary stenting, was considered the reference benchmark. Results RT performance with QCA support in assessing target vessel size and target lesion length was not significantly different from the SIC (r = 0.46, p < 0.01; r = 0.64, p < 0.001, respectively), nor from the JIC (r = 0.79, r = 0.75, p < 0.001, respectively). JIC performance was significantly better than RT in assessing target vessel size (p < 0.05), but not significantly different when assessing target lesion length. Conclusions The RT may reliably assess the target lesion by using adequate QCA software in the cath-lab in case of PCI; RT performance does not differ from the SIC. PMID:24672672

  18. HRSSA - Efficient hybrid stochastic simulation for spatially homogeneous biochemical reaction networks

    NASA Astrophysics Data System (ADS)

    Marchetti, Luca; Priami, Corrado; Thanh, Vo Hong

    2016-07-01

    This paper introduces HRSSA (Hybrid Rejection-based Stochastic Simulation Algorithm), a new efficient hybrid stochastic simulation algorithm for spatially homogeneous biochemical reaction networks. HRSSA is built on top of RSSA, an exact stochastic simulation algorithm which relies on propensity bounds to select next reaction firings and to reduce the average number of reaction propensity updates needed during the simulation. HRSSA exploits the computational advantage of propensity bounds to manage time-varying transition propensities and to apply dynamic partitioning of reactions, which constitute the two most significant bottlenecks of hybrid simulation. A comprehensive set of simulation benchmarks is provided for evaluating performance and accuracy of HRSSA against other state of the art algorithms.
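
    For context, the exact kernel that RSSA and HRSSA accelerate is the classic SSA: draw an exponential waiting time from the total propensity, then pick the firing reaction in proportion to its propensity. A minimal direct-method sketch (combinatorial factors in the propensities are omitted for brevity); RSSA replaces the per-step propensity recomputation with rejection sampling against precomputed lower/upper bounds, and HRSSA additionally integrates fast reactions deterministically:

        import math
        import random

        def gillespie_direct(x, reactions, rates, t_end):
            """Minimal exact SSA (direct method) for a mass-action network.

            x: species counts; reactions: list of (reactant indices, change dict).
            """
            t = 0.0
            while t < t_end:
                props = [k * math.prod(x[s] for s in reac)
                         for k, (reac, _) in zip(rates, reactions)]
                a0 = sum(props)
                if a0 == 0.0:
                    break
                t += random.expovariate(a0)      # time to the next firing
                r = random.uniform(0.0, a0)      # which reaction fires
                for p, (_, change) in zip(props, reactions):
                    r -= p
                    if r <= 0.0:
                        for s, dv in change.items():
                            x[s] += dv
                        break
            return x

        # Toy dimerization: A + A -> B (rate 0.001), B -> A + A (rate 0.01).
        state = [1000, 0]
        reactions = [((0, 0), {0: -2, 1: +1}), ((1,), {1: -1, 0: +2})]
        print(gillespie_direct(state, reactions, [0.001, 0.01], t_end=10.0))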

  19. Quantification of Processing Effects on Filament Wound Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Aiello, Robert A.; Chamis, Christos C.

    1999-01-01

    A computational simulation procedure is described which is designed specifically for the modeling and analysis of filament wound pressure vessels. Cylindrical vessels with spherical or elliptical end caps can be generated automatically. End caps other than spherical or elliptical may be modeled by varying circular sections along the x-axis according to the end cap shape. The finite element model generated is composed of plate type quadrilateral shell elements on the entire vessel surface. This computational procedure can also be used to generate grid, connectivity and material cards (bulk data) for component parts of a larger model. These bulk data are assigned to a user designated file for finite element structural/stress analysis of composite pressure vessels. The procedure accommodates filament wound pressure vessels of all types of shells-of-revolution. It has provisions to readily evaluate initial stresses due to pretension in the winding filaments and residual stresses due to cure temperature.

  20. Quantification of Processing Effects on Filament Wound Pressure Vessels. Revision

    NASA Technical Reports Server (NTRS)

    Aiello, Robert A.; Chamis, Christos C.

    2002-01-01

    A computational simulation procedure is described which is designed specifically for the modeling and analysis of filament wound pressure vessels. Cylindrical vessels with spherical or elliptical end caps can be generated automatically. End caps other than spherical or elliptical may be modeled by varying circular sections along the x-axis according to the end cap shape. The finite element model generated is composed of plate type quadrilateral shell elements on the entire vessel surface. This computational procedure can also be used to generate grid, connectivity and material cards (bulk data) for component parts of a larger model. These bulk data are assigned to a user designated file for finite element structural/stress analysis of composite pressure vessels. The procedure accommodates filament wound pressure vessels of all types of shells-of-revolution. It has provisions to readily evaluate initial stresses due to pretension in the winding filaments and residual stresses due to cure temperature.
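
    As a rough illustration of the section-sweeping idea (not the NASA code itself), the Python sketch below builds a node grid for a cylinder with spherical end caps by varying the radius of circular sections along the x-axis; all dimensions and mesh densities are arbitrary assumptions.

      import math

      def section_radius(x, R=1.0, L=2.0):
          """Radius of the circular section at axial station x for a cylinder
          of length L with spherical end caps (toy geometry)."""
          if x < 0.0:                                  # left cap, centre x = 0
              return math.sqrt(max(R * R - x * x, 0.0))
          if x > L:                                    # right cap, centre x = L
              return math.sqrt(max(R * R - (x - L) ** 2, 0.0))
          return R                                     # cylindrical body

      def generate_nodes(R=1.0, L=2.0, n_axial=20, n_theta=16):
          """Node grid for quadrilateral shell elements: circular sections
          swept along the x-axis (pole sections collapse to a point)."""
          nodes = []
          for i in range(n_axial + 1):
              x = -R + i * (L + 2 * R) / n_axial
              r = section_radius(x, R, L)
              for j in range(n_theta):
                  th = 2.0 * math.pi * j / n_theta
                  nodes.append((x, r * math.cos(th), r * math.sin(th)))
          return nodes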

  1. Experimental depth dose curves of a 67.5 MeV proton beam for benchmarking and validation of Monte Carlo simulation

    PubMed Central

    Faddegon, Bruce A.; Shin, Jungwook; Castenada, Carlos M.; Ramos-Méndez, José; Daftari, Inder K.

    2015-01-01

    Purpose: To measure depth dose curves for a 67.5 ± 0.1 MeV proton beam for benchmarking and validation of Monte Carlo simulation. Methods: Depth dose curves were measured in 2 beam lines. Protons in the raw beam line traversed a Ta scattering foil, 0.1016 or 0.381 mm thick, a secondary emission monitor comprised of thin Al foils, and a thin Kapton exit window. The beam energy and peak width and the composition and density of material traversed by the beam were known with sufficient accuracy to permit benchmark quality measurements. Diodes for charged particle dosimetry from two different manufacturers were used to scan the depth dose curves with 0.003 mm depth reproducibility in a water tank placed 300 mm from the exit window. Depth in water was determined with an uncertainty of 0.15 mm, including the uncertainty in the water equivalent depth of the sensitive volume of the detector. Parallel-plate chambers were used to verify the accuracy of the shape of the Bragg peak and the peak-to-plateau ratio measured with the diodes. The uncertainty in the measured peak-to-plateau ratio was 4%. Depth dose curves were also measured with a diode for a Bragg curve and treatment beam spread out Bragg peak (SOBP) on the beam line used for eye treatment. The measurements were compared to Monte Carlo simulation done with geant4 using topas. Results: The 80% dose at the distal side of the Bragg peak for the thinner foil was at 37.47 ± 0.11 mm (average of measurement with diodes from two different manufacturers), compared to the simulated value of 37.20 mm. The 80% dose for the thicker foil was at 35.08 ± 0.15 mm, compared to the simulated value of 34.90 mm. The measured peak-to-plateau ratio was within one standard deviation of the experimental uncertainty of the simulated result for the thinner foil and within two standard deviations for the thicker foil. It was necessary to include the collimation in the simulation, which had a more pronounced effect on the peak-to-plateau ratio for the thicker foil. The treatment beam, being unfocussed, had a broader Bragg peak than the raw beam. A 1.3 ± 0.1 MeV FWHM peak width in the energy distribution was used in the simulation to match the Bragg peak width. An additional 1.3–2.24 mm of water in the water column was required over the nominal values to match the measured depth penetration. Conclusions: The proton Bragg curve measured for the 0.1016 mm thick Ta foil provided the most accurate benchmark, having a low contribution of proton scatter from upstream of the water tank. The accuracy was 0.15% in measured beam energy and 0.3% in measured depth penetration at the Bragg peak. The depth of the distal edge of the Bragg peak in the simulation fell short of measurement, suggesting that the mean ionization potential of water is 2–5 eV higher than the 78 eV used in the stopping power calculation for the simulation. The eye treatment beam line depth dose curves provide validation of Monte Carlo simulation of a Bragg curve and SOBP with 4%/2 mm accuracy. PMID:26133619
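
    The distal 80% depth quoted above is a standard way to characterize a Bragg curve. The helper below shows one plausible way to extract it from a measured depth-dose scan by linear interpolation; it is generic post-processing written for illustration, not the authors' analysis code.

      def distal_depth(depths, doses, level=0.8):
          """Depth on the distal side of the Bragg peak where the dose falls
          to `level` of the peak value, by linear interpolation between scan
          points; assumes depths increase monotonically."""
          i_pk = max(range(len(doses)), key=doses.__getitem__)
          target = level * doses[i_pk]
          for i in range(i_pk, len(doses) - 1):
              if doses[i] >= target >= doses[i + 1] and doses[i] > doses[i + 1]:
                  f = (doses[i] - target) / (doses[i] - doses[i + 1])
                  return depths[i] + f * (depths[i + 1] - depths[i])
          raise ValueError("dose never falls below the requested level")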

  2. A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems

    NASA Astrophysics Data System (ADS)

    Abtahi, Amir-Reza; Bijari, Afsane

    2017-03-01

    In this paper, a hybrid meta-heuristic algorithm, based on imperialistic competition algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The proposed hybrid algorithm inherits the advantages of the process of harmony creation in the HS algorithm to improve the exploitation phase of the ICA algorithm. In addition, the proposed hybrid algorithm uses SA to balance the exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including genetic algorithm (GA), HS, and ICA on several well-known benchmark instances. The comprehensive experiments and statistical analysis on standard benchmark functions confirm the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising and can be used in several real-life engineering and management problems.
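
    A minimal sketch of the SA ingredient, assuming the standard Metropolis acceptance rule with geometric cooling (the paper's actual operators and parameters are not reproduced); in the hybrid scheme the candidate solutions would come from the ICA/HS steps rather than the Gaussian perturbation used here for brevity.

      import math, random

      def sa_accept(f_new, f_old, T):
          """Metropolis-style acceptance: always keep improvements, accept a
          worse candidate with probability exp(-(f_new - f_old)/T) so the
          search can escape local optima (minimisation; temperature T > 0)."""
          return f_new <= f_old or random.random() < math.exp(-(f_new - f_old) / T)

      # Tiny usage example on the sphere benchmark function (illustrative only)
      def sphere(x):
          return sum(v * v for v in x)

      x = [random.uniform(-5, 5) for _ in range(10)]
      T = 1.0
      for it in range(1000):
          cand = [v + random.gauss(0, 0.1) for v in x]   # ICA/HS would propose here
          if sa_accept(sphere(cand), sphere(x), T):
              x = cand
          T *= 0.995                                     # geometric cooling schedule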

  3. Anharmonic Vibrational Spectroscopy on Transition Metal Complexes

    NASA Astrophysics Data System (ADS)

    Latouche, Camille; Bloino, Julien; Barone, Vincenzo

    2014-06-01

    Advances in hardware performance and the availability of efficient and reliable computational models have made possible the application of computational spectroscopy to ever larger molecular systems. The systematic interpretation of experimental data and the full characterization of complex molecules can then be facilitated. Focusing on vibrational spectroscopy, several approaches have been proposed to simulate spectra beyond the double harmonic approximation, so that more details become available. However, a routine use of such tools requires the preliminary definition of a valid protocol with the most appropriate combination of electronic structure and nuclear calculation models. Several benchmarks of anharmonic frequency calculations have been performed on organic molecules. Nevertheless, benchmarks at this level are sorely lacking for organometallic and inorganic metal complexes, despite the interest in these systems owing to their strong emission and vibrational properties. Herein we report a benchmark study of anharmonic calculations on simple metal complexes, along with some pilot applications on systems of direct technological or biological interest.

  4. GLOFRIM v1.0 - A globally applicable computational framework for integrated hydrological-hydrodynamic modelling

    NASA Astrophysics Data System (ADS)

    Hoch, Jannis M.; Neal, Jeffrey C.; Baart, Fedor; van Beek, Rens; Winsemius, Hessel C.; Bates, Paul D.; Bierkens, Marc F. P.

    2017-10-01

    We here present GLOFRIM, a globally applicable computational framework for integrated hydrological-hydrodynamic modelling. GLOFRIM facilitates spatially explicit coupling of hydrodynamic and hydrologic models and caters for an ensemble of models to be coupled. It currently encompasses the global hydrological model PCR-GLOBWB as well as the hydrodynamic models Delft3D Flexible Mesh (DFM; solving the full shallow-water equations and allowing for spatially flexible meshing) and LISFLOOD-FP (LFP; solving the local inertia equations and running on regular grids). The main advantages of the framework are its open and free access, its global applicability, its versatility, and its extensibility with other hydrological or hydrodynamic models. Before applying GLOFRIM to an actual test case, we benchmarked both DFM and LFP for a synthetic test case. Results show that for sub-critical flow conditions, discharge response to the same input signal is near-identical for both models, which agrees with previous studies. We subsequently applied the framework to the Amazon River basin not only to test the framework thoroughly, but also to perform a first-ever benchmark of flexible and regular grids at a large scale. Both DFM and LFP produce comparable results in terms of simulated discharge, with LFP exhibiting slightly higher accuracy as expressed by a Kling-Gupta efficiency of 0.82 compared to 0.76 for DFM. However, when benchmarking inundation extent between DFM and LFP over the entire study area, a critical success index of 0.46 was obtained, indicating that the models disagree as often as they agree. Differences between models in both simulated discharge and inundation extent are to a large extent attributable to the gridding techniques employed. In fact, the results show that both the numerical scheme of the inundation model and the gridding technique can contribute to deviations in simulated inundation extent, as we control for model forcing and boundary conditions. This study shows that the presented computational framework is robust and widely applicable. GLOFRIM is designed as open access and easily extendable, and thus we hope that other large-scale hydrological and hydrodynamic models will be added. Eventually, this would capture more locally relevant processes and allow for more robust model inter-comparison, benchmarking, and ensemble simulations of flood hazard at a large scale.
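
    For reference, the two skill scores quoted above can be computed as follows. This is a sketch of the standard definitions (the 2009 Kling-Gupta formulation and the hits/misses/false-alarms form of the critical success index), not code taken from GLOFRIM itself.

      import numpy as np

      def kge(sim, obs):
          """Kling-Gupta efficiency: 1 - sqrt((r-1)^2 + (a-1)^2 + (b-1)^2)
          with correlation r, variability ratio a, and bias ratio b."""
          sim, obs = np.asarray(sim, float), np.asarray(obs, float)
          r = np.corrcoef(sim, obs)[0, 1]
          a = sim.std() / obs.std()
          b = sim.mean() / obs.mean()
          return 1.0 - np.sqrt((r - 1) ** 2 + (a - 1) ** 2 + (b - 1) ** 2)

      def csi(sim_wet, obs_wet):
          """Critical success index for binary inundation maps (bool arrays)."""
          hits = np.sum(sim_wet & obs_wet)
          misses = np.sum(~sim_wet & obs_wet)
          false_alarms = np.sum(sim_wet & ~obs_wet)
          return hits / float(hits + misses + false_alarms)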

  5. Numerical investigation of hyperelastic wall deformation characteristics in a micro-scale stenotic blood vessel

    NASA Astrophysics Data System (ADS)

    Cheema, Taqi Ahmad; Park, Cheol Woo

    2013-08-01

    Stenosis is the drastic reduction of blood vessel diameter because of cholesterol accumulation in the vessel wall. In addition to the changes in blood flow characteristics, significant changes occur in the mechanical behavior of a stenotic blood vessel. We conducted a 3-D study of such behavior in micro-scale blood vessels by considering the fluid-structure interaction between blood flow and the vessel wall structure. The simulation consisted of a one-way coupled analysis of blood flow and the resulting structural deformation without a moving mesh. A commercial code based on a finite element method with a hyperelastic material model (Neo-Hookean) of the wall was used to calculate wall deformation. Three different cases of stenosis severity and aspect ratios, with and without muscles around the blood vessel, were considered. The results showed that the wall deformation in a stenotic channel is directly related to stenosis severity and aspect ratio. The presence of muscles reduces the degree of deformation even in very severe stenosis.
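
    For context, a widely used compressible Neo-Hookean strain energy density takes the form below; this is a standard textbook expression, assumed here for illustration, since the paper's exact material constants are not given:

      W = \frac{\mu}{2}\left(\bar{I}_1 - 3\right) + \frac{\kappa}{2}(J - 1)^2,
      \qquad \bar{I}_1 = J^{-2/3}\operatorname{tr}\!\left(\mathbf{F}^{\mathsf{T}}\mathbf{F}\right),
      \quad J = \det\mathbf{F},

    where \mathbf{F} is the deformation gradient, \mu the shear modulus, and \kappa the bulk modulus.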

  6. Benchmark levels for the consumptive water footprint of crop production for different environmental conditions: a case study for winter wheat in China

    NASA Astrophysics Data System (ADS)

    Zhuo, La; Mekonnen, Mesfin M.; Hoekstra, Arjen Y.

    2016-11-01

    Meeting growing food demands while simultaneously shrinking the water footprint (WF) of agricultural production is one of the greatest societal challenges. Benchmarks for the WF of crop production can serve as a reference and be helpful in setting WF reduction targets. The consumptive WF of crops, the consumption of rainwater stored in the soil (green WF), and the consumption of irrigation water (blue WF) over the crop growing period varies spatially and temporally depending on environmental factors like climate and soil. The study explores which environmental factors should be distinguished when determining benchmark levels for the consumptive WF of crops. To this end, we determine benchmark levels for the consumptive WF of winter wheat production in China for all separate years in the period 1961-2008, for rain-fed vs. irrigated croplands, for wet vs. dry years, for warm vs. cold years, for four different soil classes, and for two different climate zones. We simulate consumptive WFs of winter wheat production with the crop water productivity model AquaCrop at a 5 by 5 arcmin resolution, accounting for water stress only. The results show that (i) benchmark levels determined for individual years for the country as a whole remain within a range of ±20 % around long-term mean levels over 1961-2008, (ii) the WF benchmarks for irrigated winter wheat are 8-10 % larger than those for rain-fed winter wheat, (iii) WF benchmarks for wet years are 1-3 % smaller than for dry years, (iv) WF benchmarks for warm years are 7-8 % smaller than for cold years, (v) WF benchmarks differ by about 10-12 % across different soil texture classes, and (vi) WF benchmarks for the humid zone are 26-31 % smaller than for the arid zone, which has relatively higher reference evapotranspiration in general and lower yields in rain-fed fields. We conclude that when determining benchmark levels for the consumptive WF of a crop, it is useful to primarily distinguish between different climate zones. If actual consumptive WFs of winter wheat throughout China were reduced to the benchmark levels set by the best 25 % of Chinese winter wheat production (1224 m3 t-1 for arid areas and 841 m3 t-1 for humid areas), the water saving in an average year would be 53 % of the current water consumption at winter wheat fields in China. The majority of the yield increase and associated improvement in water productivity can be achieved in southern China.
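
    The "best 25 %" benchmark above is a production-weighted percentile of unit water footprints. A minimal sketch of that computation, assuming per-grid-cell unit WFs (m3 t-1) and production tonnages as inputs (not the study's actual script):

      import numpy as np

      def wf_benchmark(unit_wf, production, best_fraction=0.25):
          """Unit WF achieved by the best `best_fraction` of total production:
          sort cells by unit WF and take the WF at the point where the
          cumulative production share reaches that fraction."""
          order = np.argsort(unit_wf)
          cum_share = np.cumsum(np.asarray(production, float)[order])
          cum_share /= cum_share[-1]
          idx = np.searchsorted(cum_share, best_fraction)
          return np.asarray(unit_wf, float)[order][idx]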

  7. Visco-Resistive MHD Modeling Benchmark of Forced Magnetic Reconnection

    NASA Astrophysics Data System (ADS)

    Beidler, M. T.; Hegna, C. C.; Sovinec, C. R.; Callen, J. D.; Ferraro, N. M.

    2016-10-01

    The presence of externally-applied 3D magnetic fields can affect important phenomena in tokamaks, including mode locking, disruptions, and edge localized modes. External fields penetrate into the plasma and can lead to forced magnetic reconnection (FMR), and hence magnetic islands, on resonant surfaces if the local plasma rotation relative to the external field is slow. Preliminary visco-resistive MHD simulations of FMR in a slab geometry are consistent with theory. Specifically, linear simulations exhibit proper scaling of the penetrated field with resistivity, viscosity, and flow, and nonlinear simulations exhibit a bifurcation from a flow-screened to a field-penetrated, magnetic island state as the external field is increased, due to the 3D electromagnetic force. These results will be compared to simulations of FMR in a circular cross-section, cylindrical geometry by way of a benchmark between the NIMROD and M3D-C1 extended-MHD codes. Because neither this geometry nor the MHD model has the physics of poloidal flow damping, the theory will be expanded to include poloidal flow effects. The resulting theory will be tested with linear and nonlinear simulations that vary the resistivity, viscosity, flow, and external field. Supported by OFES DoE Grants DE-FG02-92ER54139, DE-FG02-86ER53218, DE-AC02-09CH11466, and the SciDAC Center for Extended MHD Modeling.

  8. Solids Erosion Patterns Developed by Pulse Jet Mixers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bamberger, Judith A.; Pease, Leonard F.; Minette, Michael J.

    Millions of gallons of radioactive waste are stored in underground storage tanks at the Hanford Site in Washington State. This waste will be vitrified at the Waste Treatment and Immobilization Plant that is under construction. Vessels in the pretreatment portion of the plant are being configured for processing waste slurries with challenging physical and rheological properties that range from Newtonian slurries to non-Newtonian sludge. Pulse jet mixing technology has been selected for mobilizing and mixing this waste. In the pulse jet mixing process, slurry is expelled from pulse tube nozzles directed towards the vessel floor. The expelled fluid forms a radial jet that erodes the settled layer of solids. The pulse tubes are configured in a ring or multiple rings and operate concurrently. The expelled fluid and mobilized solids traverse toward the center of the tank. At the tank center the jets from pulse tubes in the ring collide and lift solids upward in a central plume. At the end of the pulse, when the desired fluid volume is expelled from the pulse tube, the applied pressure switches to suction and the pulse tube is refilled. This cycle is used to mobilize and mix the tank contents. An initial step of the process is the erosion of solids from the vessel floor by the radial jets that form on the vessel floor beneath each pulse tube. Experiments have been conducted using simulants to evaluate the ability of the pulse jet mixing system radial jets to combine to develop the central upwell and lift solids into the vessel. These experiments have been conducted at three scales using a range of granular simulants over a range of concentrations. The vessels have elliptical, spherical, or flanged and dished bottoms. Process parameters evaluated include the velocity of fluid expelled from the pulse tube, the duration of the pulse, and the duty cycle, the ratio of pulse duration to cycle time. Videos taken from beneath the vessel show the growth of the cleared area from each pulse tube as a function of time. All solids are lifted from the vessel bottom when the system is operating at the critical suspension velocity. The focus of this paper is to compare and contrast erosion patterns developed from different simulants and pulse tube configurations. The cases are evaluated to determine how changes in process parameters affect the PJM ability to mobilize solids from the vessel floor.

  9. Analysis of ship maneuvering data from simulators

    NASA Astrophysics Data System (ADS)

    Frette, V.; Kleppe, G.; Christensen, K.

    2011-03-01

    We analyze complex maneuvering histories of ships obtained from training sessions on bridge simulators. Advanced ships are used in fields like offshore oil exploration: dive support vessels, supply vessels, anchor handling vessels, tugs, cable layers, and multi-purpose vessels. Due to high demands from the operations carried out, these ships need to have very high maneuverability. This is achieved through a propulsion system with several thrusters, water jets, and rudders in addition to standard propellers. For some operations, like subsea maintenance, it is crucial that the ship accurately keeps a fixed position. Therefore, bridge systems usually incorporate equipment for Dynamic Positioning (DP). DP is a method to keep ships and semi-submersible rigs in a fixed position using the propulsion systems instead of anchors. It may also be used for sailing a vessel from one position to another along a predefined route. Like an autopilot on an airplane, DP may operate without human involvement. The method relies on accurate determination of position from external reference systems like GPS, as well as a continuously adjusted mathematical model of the ship and external forces from wind, waves and currents. In a specific simulator exercise for offshore crews, a ship is to be taken up to an installation consisting of three nearby oil platforms connected by bridges (Frigg field, North Sea), where a subsea inspection is to be carried out. Due to the many degrees of freedom during maneuvering, including partly or full use of DP, the chosen routes vary significantly. In this poster we report preliminary results on representations of the complex maneuvering histories; representations that allow comparison between crew groups and, possibly, sorting of the different strategic choices behind them.
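
    As a toy illustration of the feedback idea behind DP (a deliberately simplified sketch: real systems use a Kalman-filtered vessel model, three degrees of freedom, and wind/wave/current feed-forward), consider a one-dimensional proportional-derivative station-keeping law; all masses, gains, and forces below are invented for the example.

      def dp_thrust(pos, vel, setpoint, kp=2.0e5, kd=8.0e5):
          """Toy 1-D station-keeping law: thrust (N) that pulls the vessel
          towards `setpoint` while damping velocity. Gains are invented for
          the example, not taken from any DP product."""
          return kp * (setpoint - pos) - kd * vel

      # Hold x = 0 m against a constant 50 kN current force (values assumed)
      m, x, v, dt = 5.0e6, 10.0, 0.0, 1.0   # vessel mass (kg), state, step (s)
      for _ in range(600):                  # ten minutes of simulated time
          f = dp_thrust(x, v, setpoint=0.0) + 5.0e4
          v += f / m * dt
          x += v * dt                       # x settles near the +0.25 m offset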

  10. Monte Carlo modeling of light-tissue interactions in narrow band imaging.

    PubMed

    Le, Du V N; Wang, Quanzeng; Ramella-Roman, Jessica C; Pfefer, T Joshua

    2013-01-01

    Light-tissue interactions that influence vascular contrast enhancement in narrow band imaging (NBI) have not been the subject of extensive theoretical study. In order to elucidate relevant mechanisms in a systematic and quantitative manner we have developed and validated a Monte Carlo model of NBI and used it to study the effect of device and tissue parameters, specifically, imaging wavelength (415 versus 540 nm) and vessel diameter and depth. Simulations provided quantitative predictions of contrast, including up to 125% improvement in small, superficial vessel contrast for 415 over 540 nm. Our findings indicated that absorption, rather than scattering (the mechanism often cited in prior studies), was the dominant factor behind spectral variations in vessel depth-selectivity. Narrow-band images of a tissue-simulating phantom showed good agreement with model predictions in terms of trends and quantitative values. Numerical modeling represents a powerful tool for elucidating the factors that affect the performance of spectral imaging approaches such as NBI.
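
    A toy illustration of why the shorter band darkens vessels more: hemoglobin absorbs far more strongly near 415 nm than near 540 nm, so a thin vessel transmits less light at 415 nm under a simple Beer-Lambert picture. The absorption coefficients and vessel size below are round illustrative numbers, not measured data or values from the paper.

      import math

      def vessel_contrast(i_vessel, i_background):
          """Weber-type contrast commonly used for vessel visibility."""
          return (i_background - i_vessel) / i_background

      mu_a = {415: 100.0, 540: 25.0}     # assumed absorption coefficients, 1/cm
      d = 0.005                          # vessel diameter: 50 micrometres in cm
      for wl in (415, 540):
          i_v = math.exp(-mu_a[wl] * d)  # Beer-Lambert transmission (toy model)
          print(wl, "nm contrast:", round(vessel_contrast(i_v, 1.0), 3))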

  11. Validating the performance of correlated fission multiplicity implementation in radiation transport codes with subcritical neutron multiplication benchmark experiments

    DOE PAGES

    Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson; ...

    2018-06-14

    Historically, radiation transport codes have treated fission emissions as uncorrelated. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.
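
    The Feynman histogram mentioned above tracks the excess variance of gated counts as a function of gate width. A minimal sketch of the standard definition (Y = variance-to-mean minus one), written for illustration rather than taken from the benchmark's analysis tools:

      import numpy as np

      def feynman_y(times, gate_width):
          """Feynman-Y of gated counts from a list of detection times.
          Assumes the measurement spans many gates."""
          times = np.sort(np.asarray(times, dtype=float))
          n_gates = int(times[-1] // gate_width)
          counts, _ = np.histogram(times[times < n_gates * gate_width],
                                   bins=n_gates,
                                   range=(0.0, n_gates * gate_width))
          return counts.var() / counts.mean() - 1.0

      # A Feynman histogram plots Y against increasing gate width, e.g.:
      # ys = [feynman_y(times, w) for w in np.logspace(-6, -2, 20)]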

  12. Validating the performance of correlated fission multiplicity implementation in radiation transport codes with subcritical neutron multiplication benchmark experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson

    Historically, radiation transport codes have treated fission emissions as uncorrelated. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.

  13. A Uranium Bioremediation Reactive Transport Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yabusaki, Steven B.; Sengor, Sevinc; Fang, Yilin

    A reactive transport benchmark problem set has been developed based on in situ uranium bio-immobilization experiments that have been performed at a former uranium mill tailings site in Rifle, Colorado, USA. Acetate-amended groundwater stimulates indigenous microorganisms to catalyze the reduction of U(VI) to a sparingly soluble U(IV) mineral. The interplay between the flow, acetate loading periods and rates, microbially-mediated and geochemical reactions leads to dynamic behavior in metal- and sulfate-reducing bacteria, pH, alkalinity, and reactive mineral surfaces. The benchmark is based on an 8.5 m long one-dimensional model domain with constant saturated flow and uniform porosity. The 159-day simulation introduces acetate and bromide through the upgradient boundary in 14-day and 85-day pulses separated by a 10-day interruption. Acetate loading is tripled during the second pulse, which is followed by a 50-day recovery period. Terminal electron accepting processes for goethite, phyllosilicate Fe(III), U(VI), and sulfate are modeled using Monod-type rate laws. Major ion geochemistry modeled includes mineral reactions, as well as aqueous and surface complexation reactions for UO2++, Fe++, and H+. In addition to the dynamics imparted by the transport of the acetate pulses, U(VI) behavior involves the interplay between bioreduction, which is dependent on acetate availability, and speciation-controlled surface complexation, which is dependent on pH, alkalinity and available surface complexation sites. The general difficulty of this benchmark is the large number of reactions (74), multiple rate law formulations, a multisite uranium surface complexation model, and the strong interdependency and sensitivity of the reaction processes. Results are presented for three simulators: HYDROGEOCHEM, PHT3D, and PHREEQC.
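
    A minimal sketch of the Monod-type rate law named above, in its generic dual-substrate form; the actual parameter values and any inhibition terms of the problem set are not reproduced here, and the numbers in the usage line are purely illustrative.

      def monod_rate(v_max, donor, k_donor, acceptor, k_acceptor):
          """Dual-Monod rate law: r = v_max * [D]/(K_D+[D]) * [A]/(K_A+[A]),
          with electron donor concentration [D] and acceptor [A]."""
          return (v_max * donor / (k_donor + donor)
                        * acceptor / (k_acceptor + acceptor))

      # Example: acetate-limited U(VI) reduction rate (illustrative numbers)
      r = monod_rate(v_max=1.0e-8, donor=5.0e-4, k_donor=1.0e-4,
                     acceptor=1.0e-6, k_acceptor=5.0e-7)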

  14. Model evaluation using a community benchmarking system for land surface models

    NASA Astrophysics Data System (ADS)

    Mu, M.; Hoffman, F. M.; Lawrence, D. M.; Riley, W. J.; Keppel-Aleks, G.; Kluzek, E. B.; Koven, C. D.; Randerson, J. T.

    2014-12-01

    Evaluation of atmosphere, ocean, sea ice, and land surface models is an important step in identifying deficiencies in Earth system models and developing improved estimates of future change. For the land surface and carbon cycle, the design of an open-source system has been an important objective of the International Land Model Benchmarking (ILAMB) project. Here we evaluated CMIP5 and CLM models using a benchmarking system that enables users to specify models, data sets, and scoring systems so that results can be tailored to specific model intercomparison projects. Our scoring system used information from four different aspects of global datasets, including climatological mean spatial patterns, seasonal cycle dynamics, interannual variability, and long-term trends. Variable-to-variable comparisons enable investigation of the mechanistic underpinnings of model behavior, and allow for some control of biases in model drivers. Graphics modules allow users to evaluate model performance at local, regional, and global scales. Use of modular structures makes it relatively easy for users to add new variables, diagnostic metrics, benchmarking datasets, or model simulations. Diagnostic results are automatically organized into HTML files, so users can conveniently share results with colleagues. We used this system to evaluate atmospheric carbon dioxide, burned area, global biomass and soil carbon stocks, net ecosystem exchange, gross primary production, ecosystem respiration, terrestrial water storage, evapotranspiration, and surface radiation from CMIP5 historical and ESM historical simulations. We found that the multi-model mean often performed better than many of the individual models for most variables. We plan to publicly release a stable version of the software during fall of 2014 that has land surface, carbon cycle, hydrology, radiation and energy cycle components.
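
    As a schematic of the configurable scoring idea (not ILAMB's actual formula; the per-aspect scores and weights are assumed inputs already normalized to [0, 1]):

      def variable_score(s_mean, s_seasonal, s_interannual, s_trend,
                         weights=(0.25, 0.25, 0.25, 0.25)):
          """Combine per-aspect scores (mean spatial pattern, seasonal cycle,
          interannual variability, long-term trend) into one variable score
          with user-set weights."""
          parts = (s_mean, s_seasonal, s_interannual, s_trend)
          return sum(w * s for w, s in zip(weights, parts)) / sum(weights)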

  15. Accelerating cardiac bidomain simulations using graphics processing units.

    PubMed

    Neic, A; Liebmann, M; Hoetzl, E; Mitchell, L; Vigmond, E J; Haase, G; Plank, G

    2012-08-01

    Anatomically realistic and biophysically detailed multiscale computer models of the heart are playing an increasingly important role in advancing our understanding of integrated cardiac function in health and disease. Such detailed simulations, however, are computationally vastly demanding, which is a limiting factor for a wider adoption of in-silico modeling. While current trends in high-performance computing (HPC) hardware promise to alleviate this problem, exploiting the potential of such architectures remains challenging since strongly scalable algorithms are necessitated to reduce execution times. Alternatively, acceleration technologies such as graphics processing units (GPUs) are being considered. While the potential of GPUs has been demonstrated in various applications, benefits in the context of bidomain simulations where large sparse linear systems have to be solved in parallel with advanced numerical techniques are less clear. In this study, the feasibility of multi-GPU bidomain simulations is demonstrated by running strong scalability benchmarks using a state-of-the-art model of rabbit ventricles. The model is spatially discretized using the finite element methods (FEM) on fully unstructured grids. The GPU code is directly derived from a large pre-existing code, the Cardiac Arrhythmia Research Package (CARP), with very minor perturbation of the code base. Overall, bidomain simulations were sped up by a factor of 11.8 to 16.3 in benchmarks running on 6-20 GPUs compared to the same number of CPU cores. To match the fastest GPU simulation, which engaged 20 GPUs, 476 CPU cores were required on a national supercomputing facility.

  16. Accelerating Cardiac Bidomain Simulations Using Graphics Processing Units

    PubMed Central

    Neic, Aurel; Liebmann, Manfred; Hoetzl, Elena; Mitchell, Lawrence; Vigmond, Edward J.; Haase, Gundolf

    2013-01-01

    Anatomically realistic and biophysically detailed multiscale computer models of the heart are playing an increasingly important role in advancing our understanding of integrated cardiac function in health and disease. Such detailed simulations, however, are computationally vastly demanding, which is a limiting factor for a wider adoption of in-silico modeling. While current trends in high-performance computing (HPC) hardware promise to alleviate this problem, exploiting the potential of such architectures remains challenging since strongly scalable algorithms are necessitated to reduce execution times. Alternatively, acceleration technologies such as graphics processing units (GPUs) are being considered. While the potential of GPUs has been demonstrated in various applications, benefits in the context of bidomain simulations where large sparse linear systems have to be solved in parallel with advanced numerical techniques are less clear. In this study, the feasibility of multi-GPU bidomain simulations is demonstrated by running strong scalability benchmarks using a state-of-the-art model of rabbit ventricles. The model is spatially discretized using the finite element methods (FEM) on fully unstructured grids. The GPU code is directly derived from a large pre-existing code, the Cardiac Arrhythmia Research Package (CARP), with very minor perturbation of the code base. Overall, bidomain simulations were sped up by a factor of 11.8 to 16.3 in benchmarks running on 6–20 GPUs compared to the same number of CPU cores. To match the fastest GPU simulation, which engaged 20 GPUs, 476 CPU cores were required on a national supercomputing facility. PMID:22692867

  17. Entropic multirelaxation-time lattice Boltzmann method for moving and deforming geometries in three dimensions

    NASA Astrophysics Data System (ADS)

    Dorschner, B.; Chikatamarla, S. S.; Karlin, I. V.

    2017-06-01

    Entropic lattice Boltzmann methods have been developed to alleviate intrinsic stability issues of lattice Boltzmann models for under-resolved simulations. Their reliability in combination with moving objects was established for various laminar benchmark flows in two dimensions in our previous work [B. Dorschner, S. Chikatamarla, F. Bösch, and I. Karlin, J. Comput. Phys. 295, 340 (2015), 10.1016/j.jcp.2015.04.017] as well as for three-dimensional one-way coupled simulations of engine-type geometries in B. Dorschner, F. Bösch, S. Chikatamarla, K. Boulouchos, and I. Karlin [J. Fluid Mech. 801, 623 (2016), 10.1017/jfm.2016.448] for flat moving walls. The present contribution aims to fully exploit the advantages of entropic lattice Boltzmann models in terms of stability and accuracy and extends the methodology to three-dimensional cases, including two-way coupling between fluid and structure, turbulence, and deforming geometries. To cover this wide range of applications, the classical benchmark of a sedimenting sphere is chosen first to validate the general two-way coupling algorithm. Increasing the complexity, we subsequently consider the simulation of a plunging SD7003 airfoil in the transitional regime at a Reynolds number of Re = 40 000 and, finally, to assess the model's performance for deforming geometries, we conduct a two-way coupled simulation of a self-propelled anguilliform swimmer. These simulations confirm the viability of the new fluid-structure interaction lattice Boltzmann algorithm to simulate flows of engineering relevance.

  18. V&V Of CFD Modeling Of The Argonne Bubble Experiment: FY15 Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyt, Nathaniel C.; Wardle, Kent E.; Bailey, James L.

    2015-09-30

    In support of the development of accelerator-driven production of the fission product Mo-99, computational fluid dynamics (CFD) simulations of an electron-beam irradiated, experimental-scale bubble chamber have been conducted in order to aid in interpretation of existing experimental results, provide additional insights into the physical phenomena, and develop predictive thermal hydraulic capabilities that can be applied to full-scale target solution vessels. Toward that end, a custom hybrid Eulerian-Eulerian-Lagrangian multiphase solver was developed, and simulations have been performed on high-resolution meshes. Good agreement between experiments and simulations has been achieved, especially with respect to the prediction of the maximum temperature of the uranyl sulfate solution in the experimental vessel. These positive results suggest that the simulation methodology that has been developed will prove to be suitable to assist in the development of full-scale production hardware.

  19. GAPD: a GPU-accelerated atom-based polychromatic diffraction simulation code.

    PubMed

    E, J C; Wang, L; Chen, S; Zhang, Y Y; Luo, S N

    2018-03-01

    GAPD, a graphics-processing-unit (GPU)-accelerated atom-based polychromatic diffraction simulation code for direct, kinematics-based, simulations of X-ray/electron diffraction of large-scale atomic systems with mono-/polychromatic beams and arbitrary plane detector geometries, is presented. This code implements GPU parallel computation via both real- and reciprocal-space decompositions. With GAPD, direct simulations are performed of the reciprocal lattice node of ultralarge systems (∼5 billion atoms) and diffraction patterns of single-crystal and polycrystalline configurations with mono- and polychromatic X-ray beams (including synchrotron undulator sources), and validation, benchmark and application cases are presented.
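
    The kinematic (single-scattering) sum that such atom-based codes parallelize is simple to state; below is a minimal NumPy version for identical atoms with unit form factors. This is an illustrative reduction of the idea, not GAPD's implementation, which handles form factors, polychromatic beams, and GPU decomposition.

      import numpy as np

      def kinematic_intensity(positions, q):
          """Kinematic diffraction intensity at scattering vector q for N
          identical atoms: I(q) = |sum_j exp(i q . r_j)|^2 (unit form
          factors assumed for brevity)."""
          phases = np.asarray(positions) @ np.asarray(q)   # N phase terms
          amp = np.exp(1j * phases).sum()
          return float((amp * np.conj(amp)).real)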

  20. Commercial Building Energy Saver, API

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Piette, Mary; Lee, Sang Hoon

    2015-08-27

    The CBES API provides Application Programming Interface to a suite of functions to improve energy efficiency of buildings, including building energy benchmarking, preliminary retrofit analysis using a pre-simulation database DEEP, and detailed retrofit analysis using energy modeling with the EnergyPlus simulation engine. The CBES API is used to power the LBNL CBES Web App. It can be adopted by third party developers and vendors into their software tools and platforms.

  1. Scalable Metropolis Monte Carlo for simulation of hard shapes

    NASA Astrophysics Data System (ADS)

    Anderson, Joshua A.; Eric Irrgang, M.; Glotzer, Sharon C.

    2016-07-01

    We design and implement a scalable hard particle Monte Carlo simulation toolkit (HPMC), and release it open source as part of HOOMD-blue. HPMC runs in parallel on many CPUs and many GPUs using domain decomposition. We employ BVH trees instead of cell lists on the CPU for fast performance, especially with large particle size disparity, and optimize inner loops with SIMD vector intrinsics on the CPU. Our GPU kernel proposes many trial moves in parallel on a checkerboard and uses a block-level queue to redistribute work among threads and avoid divergence. HPMC supports a wide variety of shape classes, including spheres/disks, unions of spheres, convex polygons, convex spheropolygons, concave polygons, ellipsoids/ellipses, convex polyhedra, convex spheropolyhedra, spheres cut by planes, and concave polyhedra. NVT and NPT ensembles can be run in 2D or 3D triclinic boxes. Additional integration schemes permit Frenkel-Ladd free energy computations and implicit depletant simulations. In a benchmark system of a fluid of 4096 pentagons, HPMC performs 10 million sweeps in 10 min on 96 CPU cores on XSEDE Comet. The same simulation would take 7.6 h in serial. HPMC also scales to large system sizes, and the same benchmark with 16.8 million particles runs in 1.4 h on 2048 GPUs on OLCF Titan.
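
    For orientation, a serial toy version of the hard-particle Metropolis trial move that HPMC parallelizes (HPMC's actual kernels use BVH trees, SIMD intrinsics, and checkerboard GPU decomposition; the O(N) overlap scan below is purely illustrative):

      import random

      def try_move(disks, i, radius, box, max_step=0.1):
          """One hard-disk trial move in a periodic square box of side `box`:
          displace disk i and accept only if no overlap results (hard core:
          non-overlapping moves are always accepted)."""
          x, y = disks[i]
          nx = (x + random.uniform(-max_step, max_step)) % box
          ny = (y + random.uniform(-max_step, max_step)) % box
          for j, (ox, oy) in enumerate(disks):
              if j == i:
                  continue
              dx = (nx - ox + box / 2) % box - box / 2   # minimum-image distance
              dy = (ny - oy + box / 2) % box - box / 2
              if dx * dx + dy * dy < (2 * radius) ** 2:
                  return False                            # overlap: reject
          disks[i] = (nx, ny)
          return True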

  2. Cherry-picking functionally relevant substates from long MD trajectories using a stratified sampling approach.

    PubMed

    Chandramouli, Balasubramanian; Mancini, Giordano

    2016-01-01

    Classical Molecular Dynamics (MD) simulations can provide insights at the nanoscopic scale into protein dynamics. Currently, simulations of large proteins and complexes can be routinely carried out in the ns-μs time regime. Clustering of MD trajectories is often performed to identify selective conformations and to compare simulation and experimental data coming from different sources on closely related systems. However, clustering techniques are usually applied without a careful validation of results, and benchmark studies involving the application of different algorithms to MD data often deal with relatively small peptides instead of average or large proteins; finally, clustering is often applied as a means to analyze refined data and also as a way to simplify further analysis of trajectories. Herein, we propose a strategy to classify MD data while carefully benchmarking the performance of clustering algorithms and internal validation criteria for such methods. We demonstrate the method on two showcase systems with different features, and compare the classification of trajectories in real and PCA space. We posit that the prototype procedure adopted here could be highly fruitful in clustering large trajectories of multiple systems, or those resulting from enhanced sampling techniques such as replica exchange simulations. Copyright: © 2016 by Fabrizio Serra editore, Pisa · Roma.
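
    A generic stand-in for the benchmarking strategy described (not the authors' pipeline): cluster per-frame feature vectors in PCA space and select the number of clusters with an internal validation criterion, here the silhouette score from scikit-learn.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans
      from sklearn.metrics import silhouette_score

      def cluster_in_pca_space(features, max_k=8, n_components=2):
          """Cluster MD-frame feature vectors in PCA space and pick k by the
          silhouette criterion; returns (best_silhouette, k, labels)."""
          reduced = PCA(n_components=n_components).fit_transform(features)
          best = None
          for k in range(2, max_k + 1):
              labels = KMeans(n_clusters=k, n_init=10).fit_predict(reduced)
              score = silhouette_score(reduced, labels)
              if best is None or score > best[0]:
                  best = (score, k, labels)
          return best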

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinilla, Maria Isabel

    This report seeks to study and benchmark code predictions against experimental data; determine parameters to match MCNP-simulated detector response functions to experimental stilbene measurements; add stilbene processing capabilities to DRiFT; and improve NEUANCE detector array modeling and analysis using new MCNP6 and DRiFT features.

  4. HEURISTIC OPTIMIZATION AND ALGORITHM TUNING APPLIED TO SORPTIVE BARRIER DESIGN

    EPA Science Inventory

    While heuristic optimization is applied in environmental applications, ad-hoc algorithm configuration is typical. We use a multi-layer sorptive barrier design problem as a benchmark for an algorithm-tuning procedure, as applied to three heuristics (genetic algorithms, simulated ...

  5. Summary of comparison and analysis of results from exercises 1 and 2 of the OECD PBMR coupled neutronics/thermal hydraulics transient benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mkhabela, P.; Han, J.; Tyobeka, B.

    2006-07-01

    The Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) has accepted, through the Nuclear Science Committee (NSC), the inclusion of the Pebble-Bed Modular Reactor 400 MW design (PBMR-400) coupled neutronics/thermal hydraulics transient benchmark problem as part of their official activities. The scope of the benchmark is to establish a well-defined problem, based on a common given library of cross sections, to compare methods and tools in core simulation and thermal hydraulics analysis with a specific focus on transient events through a set of multi-dimensional computational test problems. The benchmark includes three steady state exercises and six transient exercises. This paper describes the first two steady state exercises, their objectives and the international participation in terms of organization, country and computer code utilized. This description is followed by a comparison and analysis of the participants' results submitted for these two exercises. The comparison of results from different codes allows for an assessment of the sensitivity of a result to the method employed and can thus help to focus the development efforts on the most critical areas. The first two exercises also allow for the removal of user-related modeling errors and prepare the core neutronics and thermal-hydraulics models of the different codes for the rest of the exercises in the benchmark. (authors)

  6. Mathematical simulations of photon interactions using Monte Carlo analysis to evaluate the uncertainty associated with in vivo K X-ray fluorescence measurements of stable lead in bone

    NASA Astrophysics Data System (ADS)

    Lodwick, Camille J.

    This research utilized Monte Carlo N-Particle version 4C (MCNP4C) to simulate K X-ray fluorescent (K XRF) measurements of stable lead in bone. Simulations were performed to investigate the effects that overlying tissue thickness, bone-calcium content, and shape of the calibration standard have on detector response in XRF measurements at the human tibia. Additional simulations of a knee phantom considered uncertainty associated with rotation about the patella during XRF measurements. Simulations tallied the distribution of energy deposited in a high-purity germanium detector originating from collimated 88 keV 109Cd photons in backscatter geometry. Benchmark measurements were performed on simple and anthropometric XRF calibration phantoms of the human leg and knee developed at the University of Cincinnati with materials proven to exhibit radiological characteristics equivalent to human tissue and bone. Initial benchmark comparisons revealed that MCNP4C limits coherent scatter of photons to six inverse angstroms of momentum transfer, and a Modified MCNP4C was developed to circumvent the limitation. Subsequent benchmark measurements demonstrated that Modified MCNP4C adequately models photon interactions associated with in vivo K XRF of lead in bone. Further simulations of a simple leg geometry possessing tissue thicknesses from 0 to 10 mm revealed that increasing overlying tissue thickness from 5 to 10 mm reduced predicted lead concentrations by an average of 1.15% per 1 mm increase in tissue thickness (p < 0.0001). An anthropometric leg phantom was mathematically defined in MCNP to more accurately reflect the human form. A simulated one percent increase in calcium content (by mass) of the anthropometric leg phantom's cortical bone was shown to significantly reduce the K XRF normalized ratio by 4.5% (p < 0.0001). Comparison of the simple and anthropometric calibration phantoms also suggested that cylindrical calibration standards can underestimate the lead content of a human leg by up to 4%. The patellar bone structure in which the fluorescent photons originate was found to vary dramatically with measurement angle. The relative contribution of lead signal from the patella declined from 65% to 27% when rotated 30°. However, rotation of the source-detector about the patella from 0 to 45° demonstrated no significant effect on the net K XRF response at the knee.

  7. Gas Gun Model and Comparison to Experimental Performance of Pipe Guns Operating with Light Propellant Gases and Large Cryogenic Pellets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, J. R.; Carmichael, J. R.; Gebhart, T. E.

    Injection of multiple large (~10 to 30 mm diameter) shattered pellets into ITER plasmas is presently part of the scheme planned to mitigate the deleterious effects of disruptions on the vessel components. To help in the design and optimize performance of the pellet injectors for this application, a model referred to as “the gas gun simulator” has been developed and benchmarked against experimental data. The computer code simulator is a Java program that models the gas-dynamics characteristics of a single-stage gas gun. Following a stepwise approach, the code utilizes a variety of input parameters to incrementally simulate and analyze the dynamics of the gun as the projectile is launched down the barrel. Using input data, the model can calculate gun performance based on physical characteristics, such as propellant-gas and fast-valve properties, barrel geometry, and pellet mass. Although the model is fundamentally generic, the present version is configured to accommodate cryogenic pellets composed of H2, D2, Ne, Ar, and mixtures of them, and light propellant gases (H2, D2, and He). The pellets are solidified in situ in pipe guns that consist of stainless steel tubes and fast-acting valves that provide the propellant gas for pellet acceleration (to speeds ~200 to 700 m/s). The pellet speed is the key parameter in determining the response time of a shattered pellet system to a plasma disruption event. The calculated speeds from the code simulations of experiments were typically in excellent agreement with the measured values. With the gas gun simulator validated for many test shots and over a wide range of physical and operating parameters, it is a valuable tool for optimization of the injector design, including the fast valve design (orifice size and volume) for any operating pressure (~40 bar expected for the ITER application) and barrel length for any pellet size (mass, diameter, and length). Key design parameters and proposed values for the pellet injectors for the ITER disruption mitigation systems are discussed.

  8. Gas Gun Model and Comparison to Experimental Performance of Pipe Guns Operating with Light Propellant Gases and Large Cryogenic Pellets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combs, S. K.; Reed, J. R.; Lyttle, M. S.

    2016-01-01

    Injection of multiple large (~10 to 30 mm diameter) shattered pellets into ITER plasmas is presently part of the scheme planned to mitigate the deleterious effects of disruptions on the vessel components. To help in the design and optimize performance of the pellet injectors for this application, a model referred to as “the gas gun simulator” has been developed and benchmarked against experimental data. The computer code simulator is a Java program that models the gas-dynamics characteristics of a single-stage gas gun. Following a stepwise approach, the code utilizes a variety of input parameters to incrementally simulate and analyze the dynamics of the gun as the projectile is launched down the barrel. Using input data, the model can calculate gun performance based on physical characteristics, such as propellant-gas and fast-valve properties, barrel geometry, and pellet mass. Although the model is fundamentally generic, the present version is configured to accommodate cryogenic pellets composed of H2, D2, Ne, Ar, and mixtures of them, and light propellant gases (H2, D2, and He). The pellets are solidified in situ in pipe guns that consist of stainless steel tubes and fast-acting valves that provide the propellant gas for pellet acceleration (to speeds ~200 to 700 m/s). The pellet speed is the key parameter in determining the response time of a shattered pellet system to a plasma disruption event. The calculated speeds from the code simulations of experiments were typically in excellent agreement with the measured values. With the gas gun simulator validated for many test shots and over a wide range of physical and operating parameters, it is a valuable tool for optimization of the injector design, including the fast valve design (orifice size and volume) for any operating pressure (~40 bar expected for the ITER application) and barrel length for any pellet size (mass, diameter, and length). Key design parameters and proposed values for the pellet injectors for the ITER disruption mitigation systems are discussed.
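
    A highly simplified stand-in for the stepwise gas-dynamics idea (not the Java simulator itself): the propellant gas behind the pellet expands adiabatically as the pellet accelerates down the barrel. All numbers are invented for the example, not ITER design values, though the resulting speed happens to land in the ~200 to 700 m/s range quoted above.

      GAMMA = 1.4            # ratio of specific heats, diatomic propellant
      p0, p_atm = 40.0e5, 1.0e5   # driver and downstream pressure, Pa
      A = 3.1e-4             # bore area, m^2 (~20 mm diameter barrel)
      V0 = 1.0e-4            # initial driver gas volume, m^3
      m, L = 2.0e-3, 1.0     # pellet mass, kg; barrel length, m

      x, v, dt = 0.0, 0.0, 1.0e-6
      while x < L:
          p = p0 * (V0 / (V0 + A * x)) ** GAMMA   # adiabatic expansion law
          v += (p - p_atm) * A / m * dt           # accelerate the pellet
          x += v * dt
      print("muzzle speed ~ %.0f m/s" % v)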

  9. Analysis of the Effect of Environmental Conditions in Conducting Amphibious Assaults Using a Ship Simulator/Vessel-Response Model Proof-of-Concept Study

    DTIC Science & Technology

    2017-05-01

    Analyzing these factors enables a planner to develop an axis-of-advance that a vessel can easily maintain, as well as to reduce the travel time from... operational risk by testing the feasibility of the navigability of an area; 2) determining the capacity and timing of that operation; 3) defining the... conditions at this location dictate that only a narrow window of time is available for conducting surface ship-to-shore operations. The vessel

  10. Nitinol Embolic Protection Filters: Design Investigation by Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Conti, Michele; de Beule, Matthieu; Mortier, Peter; van Loo, Denis; Verdonck, Pascal; Vermassen, Frank; Segers, Patrick; Auricchio, Ferdinando; Verhegghe, Benedict

    2009-08-01

    The widespread acceptance of carotid artery stenting (CAS) to treat carotid artery stenosis and its effectiveness compared with its surgical counterpart, carotid endarterectomy (CEA), is still a matter of debate. Transient or permanent neurological deficits may develop in patients undergoing CAS due to distal embolization or hemodynamic changes. Design, development, and usage of embolic protection devices (EPDs), such as embolic protection filters, appear to have a significant impact on the success of CAS. Unfortunately, some drawbacks, such as filtering failure, inability to cross tortuous high-grade stenoses, malpositioning and vessel injury, still remain and require design improvement. Currently, many different designs of such devices are available on the rapidly growing dedicated market. In spite of such a growing commercial interest, there is a significant need for design tools as well as for careful engineering investigations and design analyses of such nitinol devices. The present study aims to investigate embolic protection filter design by finite element analysis. We first developed a parametrical computer-aided design model of an embolic filter based on micro-CT scans of the Angioguard™ XP (Cordis Endovascular, FL) EPD by means of the open source pyFormex software. Subsequently, we used the finite element method to simulate the deployment of the nitinol filter as it exits the delivery sheath. Comparison with micro-CT images of the real device exiting the catheter showed excellent correspondence with our simulations. Finally, we evaluated circumferential basket-vessel wall apposition of a 4 mm size filter in a straight vessel of different sizes and shape. We conclude that the proposed methodology offers a useful tool to evaluate and to compare current or new designs of EPDs. Further simulations will investigate vessel wall apposition in a realistic tortuous anatomy.

  11. Direct Numerical Simulation of Cellular-Scale Blood Flow in 3D Microvascular Networks.

    PubMed

    Balogh, Peter; Bagchi, Prosenjit

    2017-12-19

    We present, to our knowledge, the first direct numerical simulation of 3D cellular-scale blood flow in physiologically realistic microvascular networks. The vascular networks are designed following in vivo images and data, and are comprised of bifurcating, merging, and winding vessels. Our model resolves the large deformation and dynamics of each individual red blood cell flowing through the networks with high fidelity, while simultaneously retaining the highly complex geometric details of the vascular architecture. To our knowledge, our simulations predict several novel and unexpected phenomena. We show that heterogeneity in hemodynamic quantities, which is a hallmark of microvascular blood flow, appears both in space and time, and that the temporal heterogeneity is more severe than its spatial counterpart. The cells are observed to frequently jam at vascular bifurcations resulting in reductions in hematocrit and flow rate in the daughter and mother vessels. We find that red blood cell jamming at vascular bifurcations results in several orders-of-magnitude increase in hemodynamic resistance, and thus provides an additional mechanism of increased in vivo blood viscosity as compared to that determined in vitro. A striking result from our simulations is negative pressure-flow correlations observed in several vessels, implying a significant deviation from Poiseuille's law. Furthermore, negative correlations between vascular resistance and hematocrit are observed in various vessels, also defying a major principle of particulate suspension flow. To our knowledge, these novel findings are absent in blood flow in straight tubes, and they underscore the importance of considering realistic physiological geometry and resolved cellular interactions in modeling microvascular hemodynamics. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
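
    For reference, Poiseuille's law for steady laminar flow in a straight rigid tube, the relation from which these networks are found to deviate, reads (standard form):

      Q = \frac{\pi R^{4}\,\Delta p}{8\,\mu\,L},

    where Q is the volumetric flow rate, R the vessel radius, \Delta p the pressure drop over length L, and \mu the dynamic viscosity; a negative pressure-flow correlation cannot occur under this relation.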

  12. Limits on estimating the width of thin tubular structures in 3D images.

    PubMed

    Wörz, Stefan; Rohr, Karl

    2006-01-01

    This work studies limits on estimating the width of thin tubular structures in 3D images. Based on nonlinear estimation theory we analyze the minimal stochastic error of estimating the width. Given a 3D analytic model of the image intensities of tubular structures, we derive a closed-form expression for the Cramér-Rao bound of the width estimate under image noise. We use the derived lower bound as a benchmark and compare it with three previously proposed accuracy limits for vessel width estimation. Moreover, by experimental investigations we demonstrate that the derived lower bound can be achieved by fitting a 3D parametric intensity model directly to the image data.
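
    The paper's closed-form expression is not reproduced here, but it is an instance of the generic Cramér-Rao inequality for an unbiased estimator \hat{w} of the width w:

      \operatorname{Var}(\hat{w}) \geq I(w)^{-1},
      \qquad I(w) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial w}\ln p(\mathbf{g}; w)\right)^{2}\right],

    where p(\mathbf{g}; w) is the likelihood of the image data \mathbf{g}. For additive Gaussian noise of variance \sigma^{2}, the Fisher information reduces to the standard form I(w) = \sigma^{-2}\sum_i \left(\partial g_i(w)/\partial w\right)^{2}.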

  13. ZPR-6 assembly 7 high 240Pu core: a cylindrical assembly with mixed (Pu,U)-oxide fuel and a central high 240Pu zone.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lell, R. M.; Schaefer, R. W.; McKnight, R. D.

    Over a period of 30 years more than a hundred Zero Power Reactor (ZPR) critical assemblies were constructed at Argonne National Laboratory. The ZPR facilities, ZPR-3, ZPR-6, ZPR-9 and ZPPR, were all fast critical assembly facilities. The ZPR critical assemblies were constructed to support fast reactor development, but data from some of these assemblies are also well suited to form the basis for criticality safety benchmarks. Of the three classes of ZPR assemblies, engineering mockups, engineering benchmarks and physics benchmarks, the last group tends to be most useful for criticality safety. Because physics benchmarks were designed to test fast reactor physics data and methods, they were as simple as possible in geometry and composition. The principal fissile species was 235U or 239Pu. Fuel enrichments ranged from 9% to 95%. Often there were only one or two main core diluent materials, such as aluminum, graphite, iron, sodium or stainless steel. The cores were reflected (and insulated from room return effects) by one or two layers of materials such as depleted uranium, lead or stainless steel. Despite their more complex nature, a small number of assemblies from the other two classes would make useful criticality safety benchmarks because they have features related to criticality safety issues, such as reflection by soil-like material. The term 'benchmark' in a ZPR program connotes a particularly simple loading aimed at gaining basic reactor physics insight, as opposed to studying a reactor design. In fact, the ZPR-6/7 Benchmark Assembly (Reference 1) had a very simple core unit cell assembled from plates of depleted uranium, sodium, iron oxide, U3O8, and plutonium. The ZPR-6/7 core cell-average composition is typical of the interior region of liquid-metal fast breeder reactors (LMFBRs) of the era. It was one part of the Demonstration Reactor Benchmark Program, which provided integral experiments characterizing the important features of demonstration-size LMFBRs. As a benchmark, ZPR-6/7 was devoid of many 'real' reactor features, such as simulated control rods and multiple enrichment zones, in its reference form. Those kinds of features were investigated experimentally in variants of the reference ZPR-6/7 or in other critical assemblies in the Demonstration Reactor Benchmark Program.

  14. Performance Evaluation and Benchmarking of Next Intelligent Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    del Pobil, Angel; Madhavan, Raj; Bonsignorio, Fabio

    Performance Evaluation and Benchmarking of Intelligent Systems presents research dedicated to the subject of performance evaluation and benchmarking of intelligent systems by drawing from the experiences and insights of leading experts gained both through theoretical development and practical implementation of intelligent systems in a variety of diverse application domains. This contributed volume offers a detailed and coherent picture of state-of-the-art, recent developments, and further research areas in intelligent systems. The chapters cover a broad range of applications, such as assistive robotics, planetary surveying, urban search and rescue, and line tracking for automotive assembly. Subsystems or components described in this book include human-robot interaction, multi-robot coordination, communications, perception, and mapping. Chapters are also devoted to simulation support and open source software for cognitive platforms, providing examples of the type of enabling underlying technologies that can help intelligent systems to propagate and increase in capabilities. Performance Evaluation and Benchmarking of Intelligent Systems serves as a professional reference for researchers and practitioners in the field. This book is also applicable to advanced courses for graduate level students and robotics professionals in a wide range of engineering and related disciplines including computer science, automotive, healthcare, manufacturing, and service robotics.

  15. Design and development of a community carbon cycle benchmarking system for CMIP5 models

    NASA Astrophysics Data System (ADS)

    Mu, M.; Hoffman, F. M.; Lawrence, D. M.; Riley, W. J.; Keppel-Aleks, G.; Randerson, J. T.

    2013-12-01

    Benchmarking has been widely used to assess the ability of atmosphere, ocean, sea ice, and land surface models to capture the spatial and temporal variability of observations during the historical period. For the carbon cycle and terrestrial ecosystems, the design and development of an open-source community platform has been an important goal as part of the International Land Model Benchmarking (ILAMB) project. Here we designed and developed a software system that enables the user to specify the models, benchmarks, and scoring systems so that results can be tailored to specific model intercomparison projects. We used this system to evaluate the performance of CMIP5 Earth system models (ESMs). Our scoring system used information from four different aspects of climate, including the climatological mean spatial pattern of gridded surface variables, seasonal cycle dynamics, the amplitude of interannual variability, and long-term decadal trends. We used this system to evaluate burned area, global biomass stocks, net ecosystem exchange, gross primary production, and ecosystem respiration from CMIP5 historical simulations. Initial results indicated that the multi-model mean often performed better than many of the individual models for most of the observational constraints.
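
    A minimal sketch of such a configurable scoring step; the exp(-error) mapping and equal weights are assumptions for illustration, not the ILAMB formulas:

        import numpy as np

        def aspect_score(model, obs):
            """Map a normalized RMSE to a (0, 1] score; 1 = perfect agreement."""
            err = np.sqrt(np.mean((model - obs) ** 2)) / (np.std(obs) + 1e-12)
            return float(np.exp(-err))

        def overall_score(model_aspects, obs_aspects, weights=None):
            """Weighted mean of per-aspect scores (e.g. mean state, seasonal
            cycle, interannual amplitude, decadal trend)."""
            scores = [aspect_score(m, o) for m, o in zip(model_aspects, obs_aspects)]
            w = np.ones(len(scores)) if weights is None else np.asarray(weights)
            return float(np.average(scores, weights=w)), scores

    Separating the per-aspect scoring from the aggregation is what lets the benchmarks and weights be swapped per intercomparison project.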

  16. The PAC-MAN model: Benchmark case for linear acoustics in computational physics

    NASA Astrophysics Data System (ADS)

    Ziegelwanger, Harald; Reiter, Paul

    2017-10-01

    Benchmark cases in the field of computational physics, on the one hand, have to contain a certain complexity to test numerical edge cases and, on the other hand, require the existence of an analytical solution, because an analytical solution allows the exact quantification of the accuracy of a numerical simulation method. This dilemma causes a need for analytical sound field formulations of complex acoustic problems. A well known example of such a benchmark case for harmonic linear acoustics is the "Cat's Eye model", which describes the three-dimensional sound field radiated from a sphere with a missing octant analytically. In this paper, a benchmark case for two-dimensional (2D) harmonic linear acoustic problems, viz., the "PAC-MAN model", is proposed. The PAC-MAN model describes the radiated and scattered sound field around an infinitely long cylinder with a cut-out sector of variable angular width. While the analytical calculation of the 2D sound field allows different angular cut-out widths and arbitrarily positioned line sources, the computational cost associated with the solution of this problem is similar to a 1D problem because of a modal formulation of the sound field in the PAC-MAN model.
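
    For orientation, a minimal Python sketch of the textbook modal series for plane-wave scattering from a full (uncut) rigid cylinder, i.e. the closed limiting case of the PAC-MAN geometry, not the cut-out solution itself:

        import numpy as np
        from scipy.special import jvp, hankel1, h1vp

        def scattered_pressure(k, a, r, theta, n_modes=40):
            """Modal sum for a plane wave exp(ikx) scattered by a rigid cylinder
            of radius a, evaluated at (r, theta) with r > a."""
            p = np.zeros_like(np.asarray(theta, dtype=complex))
            for m in range(n_modes):
                eps = 1.0 if m == 0 else 2.0
                # Rigid wall: total radial velocity vanishes at r = a
                A_m = -eps * (1j ** m) * jvp(m, k * a) / h1vp(m, k * a)
                p += A_m * hankel1(m, k * r) * np.cos(m * np.asarray(theta))
            return p

        theta = np.linspace(0, 2 * np.pi, 361)
        p = scattered_pressure(k=2 * np.pi, a=1.0, r=3.0, theta=theta)
        print(abs(p).max())

    The PAC-MAN formulation keeps this 1D-like modal structure while admitting the cut-out sector and arbitrary line-source positions.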

  17. Acoustic hemostasis of porcine superficial femoral artery: Simulation and in-vivo experimental studies

    NASA Astrophysics Data System (ADS)

    Zeng, Xiaozheng; Mitchell, Stuart; Miller, Matthew; Barnes, Stephen; Hopple, Jerry; Kook, John; Moreau-Gobard, Romain; Hsu, Stephen; Ahiekpor-Dravi, Alexis; Crum, Lawrence A.; Eaton, John; Wong, Keith; Sekins, K. Michael

    2012-10-01

    In-vivo focused ultrasound studies were computationally simulated and conducted experimentally with the aim of occluding porcine superficial femoral arteries (SFA) via thermal coagulation. A multi-array HIFU applicator was used, which electronically scanned multiple beam foci around the target point. The spatio-temporally averaged acoustic and temperature fields were simulated in a fluid-dynamics and acousto-thermal finite element model with representative tissue fields, including muscle, vessel and blood. Simulations showed that with an acoustic power of 200 W and a dose time of 60 s, perivascular tissue reached 91 °C, yet blood reached a maximum of 59 °C, below the coagulation objective for this dose regime (75 °C). Per the simulations, acoustic-streaming-induced velocity in blood reached 6.1 cm/s. In the in-vivo experiments, several arteries were treated. As simulated, thermal lesions were observed in the muscle surrounding the SFA in all cases. With dosing limited to 30 to 60 seconds, 257 W was required to provide occlusion (one complete and one partial occlusion). Angiography and histology showed evidence of thrombogenesis and collagen-shrinkage-based vessel constriction at these doses.
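
    A minimal sketch of the kind of thermal model involved: a 1D explicit finite-difference Pennes bioheat solver with a Gaussian acoustic heat source. All parameters are illustrative, not those of the study:

        import numpy as np

        # Pennes bioheat: rho*c dT/dt = k d2T/dx2 - w_b*c_b*(T - T_a) + Q(x)
        rho, c, k = 1050.0, 3600.0, 0.5     # tissue density, heat capacity, conductivity
        wb, cb, Ta = 0.5, 3800.0, 37.0      # perfusion (kg/m^3/s), blood c_p, arterial T
        L, nx, dt, t_end = 0.04, 201, 0.01, 60.0
        x = np.linspace(0, L, nx); dx = x[1] - x[0]
        Q = 5e6 * np.exp(-((x - L / 2) / 2e-3) ** 2)   # W/m^3, Gaussian "focus"
        T = np.full(nx, 37.0)
        for _ in range(int(t_end / dt)):
            lap = np.zeros(nx)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
            T += dt * (k * lap - wb * cb * (T - Ta) + Q) / (rho * c)
            T[0] = T[-1] = 37.0             # fixed far-field temperature
        print(f"peak temperature after {t_end:.0f} s: {T.max():.1f} C")

    The actual study couples the acoustic field, fluid dynamics (streaming and convective cooling by blood), and heat transfer; this sketch shows only the conduction-perfusion-source balance.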

  18. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2012-01-01

    The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training the users on homogenization software was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can perform as well as manual ones.
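
    Two of the performance metrics named above, (i) the centered root mean square error and (ii) the error in linear trend estimates, can be sketched directly; a minimal version:

        import numpy as np

        def centered_rmse(homogenized, truth):
            """RMSE after removing each series' mean, so that only the
            variability errors (not constant offsets) are scored."""
            h = homogenized - np.mean(homogenized)
            t = truth - np.mean(truth)
            return float(np.sqrt(np.mean((h - t) ** 2)))

        def trend_error(homogenized, truth):
            """Difference in fitted linear trend (per time step) between the
            homogenized series and the true homogeneous series."""
            t = np.arange(len(truth))
            return float(np.polyfit(t, homogenized, 1)[0]
                         - np.polyfit(t, truth, 1)[0])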

  19. Benchmarking monthly homogenization algorithms

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.

  20. Comparative study on gene set and pathway topology-based enrichment methods.

    PubMed

    Bayerlová, Michaela; Jung, Klaus; Kramer, Frank; Klemm, Florian; Bleckmann, Annalen; Beißbarth, Tim

    2015-10-22

    Enrichment analysis is a popular approach to identify pathways or sets of genes which are significantly enriched in the context of differentially expressed genes. The traditional gene set enrichment approach considers a pathway as a simple gene list, disregarding any knowledge of gene or protein interactions. In contrast, the newer group of so-called pathway topology-based methods integrates the topological structure of a pathway into the analysis. We comparatively investigated gene set and pathway topology-based enrichment approaches, considering three gene set and four topological methods. These methods were compared in two extensive simulation studies and on a benchmark of 36 real datasets, providing the same pathway input data for all methods. In the benchmark data analysis both types of methods showed a comparable ability to detect enriched pathways. The first simulation study was conducted with KEGG pathways, which showed considerable gene overlaps between each other. In this study with original KEGG pathways, none of the topology-based methods outperformed the gene set approach. Therefore, a second simulation study was performed on non-overlapping pathways created by unique gene IDs. Here, methods accounting for pathway topology reached higher accuracy than the gene set methods, although their sensitivity was lower. We conducted one of the first comprehensive comparative works on evaluating gene set against pathway topology-based enrichment methods. The topological methods showed better performance in the simulation scenarios with non-overlapping pathways, however, they were not conclusively better in the other scenarios. This suggests that a simple gene set approach might be sufficient to detect an enriched pathway under realistic circumstances. Nevertheless, more extensive studies and further benchmark data are needed to systematically evaluate these methods and to assess what gain and cost pathway topology information introduces into enrichment analysis. Both types of methods for enrichment analysis require further improvements in order to deal with the problem of pathway overlaps.
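
    The gene set baseline in such comparisons is often a hypergeometric over-representation test; a minimal sketch (illustrative counts, not the paper's data):

        from scipy.stats import hypergeom

        def ora_pvalue(n_genes, n_de, pathway_size, n_de_in_pathway):
            """One-sided over-representation p-value: probability of seeing at
            least n_de_in_pathway differentially expressed genes in the pathway
            under a hypergeometric null (random gene sampling)."""
            return float(hypergeom.sf(n_de_in_pathway - 1, n_genes,
                                      pathway_size, n_de))

        # e.g. 20000 genes, 500 DE genes, a 100-gene pathway containing 10 DE genes
        print(ora_pvalue(20000, 500, 100, 10))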

  1. HRSSA – Efficient hybrid stochastic simulation for spatially homogeneous biochemical reaction networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marchetti, Luca, E-mail: marchetti@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; University of Trento, Department of Mathematics

    This paper introduces HRSSA (Hybrid Rejection-based Stochastic Simulation Algorithm), a new efficient hybrid stochastic simulation algorithm for spatially homogeneous biochemical reaction networks. HRSSA is built on top of RSSA, an exact stochastic simulation algorithm which relies on propensity bounds to select next reaction firings and to reduce the average number of reaction propensity updates needed during the simulation. HRSSA exploits the computational advantage of propensity bounds to manage time-varying transition propensities and to apply dynamic partitioning of reactions, which constitute the two most significant bottlenecks of hybrid simulation. A comprehensive set of simulation benchmarks is provided for evaluating the performance and accuracy of HRSSA against other state-of-the-art algorithms.
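
    For orientation, a minimal Python sketch of the exact direct-method SSA that RSSA and HRSSA build on, applied to a toy reversible isomerization network; the propensity-bound and hybrid-partitioning machinery of HRSSA is not shown:

        import numpy as np

        def ssa_direct(x0, stoich, propensities, t_end, rng=np.random.default_rng()):
            """Gillespie direct method: exact trajectory of a well-mixed network."""
            t, x, traj = 0.0, np.array(x0, dtype=float), []
            while t < t_end:
                a = np.array([f(x) for f in propensities])
                a0 = a.sum()
                if a0 <= 0.0:
                    break
                t += rng.exponential(1.0 / a0)      # time to next reaction
                j = rng.choice(len(a), p=a / a0)    # which reaction fires
                x += stoich[j]
                traj.append((t, x.copy()))
            return traj

        # Toy network (illustrative): A -> B at rate 0.5*A, B -> A at rate 0.3*B
        stoich = np.array([[-1, 1], [1, -1]])
        props = [lambda x: 0.5 * x[0], lambda x: 0.3 * x[1]]
        print(len(ssa_direct([100, 0], stoich, props, t_end=10.0)))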

  2. Quantitative Oxygenation Venography from MRI Phase

    PubMed Central

    Fan, Audrey P.; Bilgic, Berkin; Gagnon, Louis; Witzel, Thomas; Bhat, Himanshu; Rosen, Bruce R.; Adalsteinsson, Elfar

    2014-01-01

    Purpose To demonstrate acquisition and processing methods for quantitative oxygenation venograms that map in vivo oxygen saturation (SvO2) along the cerebral venous vasculature. Methods Regularized quantitative susceptibility mapping (QSM) is used to reconstruct susceptibility values and estimate SvO2 in veins. QSM with ℓ1 and ℓ2 regularization is compared in numerical simulations of vessel structures with known magnetic susceptibility. Dual-echo, flow-compensated phase images are collected in three healthy volunteers to create QSM images. Bright veins in the susceptibility maps are vectorized and used to form a three-dimensional vascular mesh, or venogram, along which to display SvO2 values from QSM. Results Quantitative oxygenation venograms that map SvO2 along brain vessels of arbitrary orientation and geometry are shown in vivo. SvO2 values in major cerebral veins lie within the normal physiological range reported by 15O positron emission tomography. SvO2 from QSM is consistent with previous MR susceptometry methods for vessel segments oriented parallel to the main magnetic field. In vessel simulations, ℓ1 regularization results in less than 10% SvO2 absolute error across all vessel tilt orientations and provides more accurate SvO2 estimation than ℓ2 regularization. Conclusion The proposed analysis of susceptibility images enables reliable mapping of quantitative SvO2 along venograms and may facilitate clinical use of venous oxygenation imaging. PMID:24006229
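
    The susceptibility-to-SvO2 conversion can be sketched in one line of algebra: with the common model Δχ = Δχ_do · Hct · (1 − SvO2), solving for SvO2 gives the function below. The deoxygenated-blood susceptibility difference and hematocrit used here are assumed round numbers, not the paper's calibration:

        def svo2_from_susceptibility(delta_chi_ppm, hct=0.42, dchi_do_ppm=3.39):
            """Invert delta_chi = dchi_do * Hct * (1 - SvO2) for SvO2, where
            delta_chi is the vein-tissue susceptibility difference (ppm, SI)."""
            return 1.0 - delta_chi_ppm / (dchi_do_ppm * hct)

        # e.g. a vein measuring 0.45 ppm above surrounding tissue
        print(f"SvO2 = {svo2_from_susceptibility(0.45):.2f}")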

  3. Stress Rupture Life Reliability Measures for Composite Overwrapped Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Thesken, John C.; Phoenix, S. Leigh; Grimes-Ledesma, Lorie

    2007-01-01

    Composite Overwrapped Pressure Vessels (COPVs) are often used for storing pressurant gases onboard spacecraft. Kevlar (DuPont), glass, carbon and other more recent fibers have all been used as overwraps. Because the overwraps are subjected to sustained loads for an extended period during a mission, stress rupture failure is a major concern. It is therefore important to ascertain the reliability of these vessels by analysis, since the testing of each flight design cannot be completed on a practical time scale. The present paper examines a Weibull-statistics-based stress rupture model and considers the various uncertainties associated with the model parameters. The paper also examines several reliability estimate measures that would be of use for recertification and for qualifying the flight worthiness of these vessels. Specifically, deterministic values for a point estimate, a mean estimate and 90/95 percent confidence estimates of the reliability are all examined for a typical flight-quality vessel under constant stress. The mean and the 90/95 percent confidence estimates are computed using Monte Carlo simulation techniques, assuming distribution statistics of model parameters based on simulation and on the available data, especially the sample sizes represented in the data. The data for the stress rupture model are obtained from the Lawrence Livermore National Laboratory (LLNL) stress rupture testing program, carried out for the past 35 years. Deterministic as well as probabilistic sensitivities are examined.
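
    A minimal sketch of the Monte Carlo step: propagate parameter uncertainty through a Weibull lifetime model R(t) = exp(-(t/eta)^beta) and read off mean and conservative percentile estimates. The lognormal parameter distributions below are assumptions for illustration, not the LLNL-derived statistics:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        beta = rng.lognormal(mean=np.log(1.5), sigma=0.1, size=n)   # shape
        eta = rng.lognormal(mean=np.log(200.0), sigma=0.2, size=n)  # scale, years
        t_mission = 15.0                                            # years at load
        R = np.exp(-(t_mission / eta) ** beta)
        print(f"mean reliability estimate: {R.mean():.4f}")
        print(f"5th-percentile (conservative) estimate: {np.percentile(R, 5):.4f}")

    A 90/95-style confidence statement additionally accounts for sampling uncertainty in the fitted parameters, which is what the paper's treatment of sample sizes addresses.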

  4. Modeling of turbulent separated flows for aerodynamic applications

    NASA Technical Reports Server (NTRS)

    Marvin, J. G.

    1983-01-01

    Steady, high speed, compressible separated flows modeled through numerical simulations resulting from solutions of the mass-averaged Navier-Stokes equations are reviewed. Emphasis is placed on benchmark flows that represent simplified (but realistic) aerodynamic phenomena. These include impinging shock waves, compression corners, glancing shock waves, trailing edge regions, and supersonic high angle of attack flows. A critical assessment of modeling capabilities is provided by comparing the numerical simulations with experiment. The importance of combining experiment, numerical algorithm, grid, and turbulence model to effectively develop this potentially powerful simulation technique is stressed.

  5. Approach for Configuring a Standardized Vessel for Processing Radioactive Waste Slurries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bamberger, Judith A.; Enderlin, Carl W.; Minette, Michael J.

    2015-09-10

    A standardized vessel design is being considered at the Waste Treatment and Immobilization Plant (WTP) that is under construction at Hanford, Washington. The standardized vessel design will be used for storing, blending, and chemical processing of slurries that exhibit a variable process feed including Newtonian to non-Newtonian rheologies over a range of solids loadings. Developing a standardized vessel is advantageous and reduces the testing required to evaluate the performance of the design. The objectives of this paper are to: 1) present a design strategy for developing a standard vessel mixing system design for the pretreatment portion of the waste treatment plant that must process rheologically and physically challenging process streams, 2) identify performance criteria that the design for the standard vessel must satisfy, 3) present parameters that are to be used for assessing the performance criteria, and 4) describe operation of the selected technology. Vessel design performance will be assessed for both Newtonian and non-Newtonian simulants which represent a range of waste types expected during operation. Desired conditions for the vessel operations are the ability to shear the slurry so that flammable gas does not accumulate within the vessel, that settled solids will be mobilized, that contents can be blended, and that contents can be transferred from the vessel. A strategy is presented for adjusting the vessel configuration to ensure that all these conditions are met.

  6. Validation of the second-generation Olympus colonoscopy simulator for skills assessment.

    PubMed

    Haycock, A V; Bassett, P; Bladen, J; Thomas-Gibson, S

    2009-11-01

    Simulators have potential value in providing objective evidence of technical skill for procedures within medicine. The aim of this study was to determine face and construct validity for the Olympus colonoscopy simulator and to establish which assessment measures map to clinical benchmarks of expertise. Thirty-four participants were recruited: 10 novices with no prior colonoscopy experience, 13 intermediate (trainee) endoscopists with fewer than 1000 previous colonoscopies, and 11 experienced endoscopists with more than 1000 previous colonoscopies. All participants completed three standardized cases on the simulator, and experts gave feedback regarding the realism of the simulator. Forty metrics recorded automatically by the simulator were analyzed for their ability to distinguish between the groups. The simulator discriminated participants by experience level on 22 different parameters. Completion rates were lower for novices than for trainees and experts (37% vs. 79% and 88% respectively, P < 0.001), and both novices and trainees took significantly longer to reach all major landmarks than the experts. Several technical aspects of competency were discriminatory: pushing with an embedded tip (P = 0.03), correct use of the variable stiffness function (P = 0.004), number of sigmoid N-loops (P = 0.02), size of sigmoid N-loops (P = 0.01), and time to remove alpha loops (P = 0.004). On a 10-point scale, experts rated the realism of movement at 6.4, force feedback at 6.6, looping at 6.6, and loop resolution at 6.8. The Olympus colonoscopy simulator has good face validity and excellent construct validity. It provides an objective assessment of colonoscopic skill on multiple measures, and benchmarks have been set to allow its use as both a formative and a summative assessment tool.

  7. Increasing the relevance of GCM simulations for Climate Services

    NASA Astrophysics Data System (ADS)

    Smith, L. A.; Suckling, E.

    2012-12-01

    The design and interpretation of model simulations for climate services differ significantly from experimental design for the advancement of the fundamental research on predictability that underpins it. Climate services consider the sources of the best information available today; this calls for a frank evaluation of model skill against statistical benchmarks defined by empirical models. The fact that physical simulation models are thought to provide the only reliable method for extrapolating into conditions not previously observed has no bearing on whether or not today's simulation models outperform empirical models. Evidence on the length scales at which today's simulation models fail to outperform empirical benchmarks is presented; this occurs even on global scales in decadal prediction. At all timescales considered thus far (as of July 2012), predictions based on simulation models are improved by blending with the output of statistical models. Blending is shown to be more interesting in the climate context than in the weather context, where blending with a history-based climatology is straightforward. As GCMs improve and as the Earth's climate moves further from that of the last century, the skill of simulation models and their relevance to climate services are expected to increase. Examples from both seasonal and decadal forecasting are used to discuss a third approach that may increase the role of current GCMs more quickly. Specifically, aspects of the experimental design in previous hindcast experiments are shown to hinder the use of GCM simulations for climate services. Alternative designs are proposed.
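
    One simple way to realize the blending described here is to fit a single weight on a hindcast archive; a sketch with synthetic data and a grid-search weight (the authors' actual blending scheme may differ):

        import numpy as np

        def blend_weight(sim_hindcasts, emp_hindcasts, obs):
            """Pick alpha in [0, 1] minimizing the MSE of
            alpha*simulation + (1 - alpha)*empirical over a hindcast archive."""
            alphas = np.linspace(0.0, 1.0, 101)
            mse = [np.mean((a * sim_hindcasts + (1 - a) * emp_hindcasts - obs) ** 2)
                   for a in alphas]
            return float(alphas[int(np.argmin(mse))])

        # Illustrative synthetic hindcasts
        rng = np.random.default_rng(1)
        obs = rng.normal(size=200)
        sim = obs + rng.normal(scale=0.8, size=200)        # noisy simulation
        emp = 0.6 * obs + rng.normal(scale=0.5, size=200)  # damped empirical model
        print(blend_weight(sim, emp, obs))

    A weight strictly between 0 and 1 is exactly the situation described above: the simulation adds skill, yet does not dominate the empirical benchmark outright.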

  8. Dimethyl methylphosphonate adsorption and decomposition on MoO2 as studied by ambient pressure x-ray photoelectron spectroscopy and DFT calculations

    NASA Astrophysics Data System (ADS)

    Head, Ashley R.; Tsyshevsky, Roman; Trotochaud, Lena; Yu, Yi; Karslıoǧlu, Osman; Eichhorn, Bryan; Kuklja, Maija M.; Bluhm, Hendrik

    2018-04-01

    Organophosphonates range in their toxicity and are used as pesticides, herbicides, and chemical warfare agents (CWAs). Few laboratories are equipped to handle the most toxic molecules, thus simulants, such as dimethyl methylphosphonate (DMMP), are used as a first step in studying adsorption and reactivity on materials. Benchmarked by combined experimental and theoretical studies of simulants, calculations offer an opportunity to understand how molecular interactions with a surface change when a CWA is used. However, most calculations of DMMP and CWAs on surfaces are limited to adsorption studies on clusters of atoms, which may differ markedly from the behavior on bulk solid-state materials with extended surfaces. We have benchmarked our solid-state periodic calculations of DMMP adsorption and reactivity on MoO2 with ambient pressure x-ray photoelectron spectroscopy (APXPS) studies. DMMP is found to interact strongly with a MoO2 film, a model system for the MoOx component in the ASZM-TEDA© gas filtration material. Density functional theory modeling of several adsorption and decomposition mechanisms assists the assignment of APXPS peaks. Our results show that some of the adsorbed DMMP decomposes, with all the products remaining on the surface. The rigorous calculations benchmarked with experiments pave a path to reliable and predictive theoretical studies of CWA interactions with surfaces.

  9. Benchmark of the local drift-kinetic models for neoclassical transport simulation in helical plasmas

    NASA Astrophysics Data System (ADS)

    Huang, B.; Satake, S.; Kanno, R.; Sugama, H.; Matsuoka, S.

    2017-02-01

    The benchmarks of the neoclassical transport codes based on several local drift-kinetic models are reported here. The drift-kinetic models are zero orbit width (ZOW), zero magnetic drift, DKES-like, and global, as classified in Matsuoka et al. [Phys. Plasmas 22, 072511 (2015)]. The magnetic geometries of the Helically Symmetric Experiment, the Large Helical Device (LHD), and Wendelstein 7-X are employed in the benchmarks. It is found that the assumption of E×B incompressibility causes discrepancies in the neoclassical radial flux and parallel flow among the models when E×B is sufficiently large compared to the magnetic drift velocities, for example, Mp ≤ 0.4, where Mp is the poloidal Mach number. On the other hand, when E×B and the magnetic drift velocities are comparable, the tangential magnetic drift, which is included in both the global and ZOW models, fills the role of suppressing the unphysical peaking of neoclassical radial fluxes found in the other local models at Er ≃ 0. In low-collisionality plasmas in particular, the tangential drift effect works well to suppress such unphysical behavior of the radial transport in the simulations. It is demonstrated that the ZOW model has the advantage of mitigating the unphysical behavior in the several magnetic geometries, and that it also enables the evaluation of the bootstrap current in LHD at low computational cost compared to the global model.

  10. FDA Benchmark Medical Device Flow Models for CFD Validation.

    PubMed

    Malinauskas, Richard A; Hariharan, Prasanna; Day, Steven W; Herbertson, Luke H; Buesen, Martin; Steinseifer, Ulrich; Aycock, Kenneth I; Good, Bryan C; Deutsch, Steven; Manning, Keefe B; Craven, Brent A

    Computational fluid dynamics (CFD) is increasingly being used to develop blood-contacting medical devices. However, the lack of standardized methods for validating CFD simulations and blood damage predictions limits its use in the safety evaluation of devices. Through a U.S. Food and Drug Administration (FDA) initiative, two benchmark models of typical device flow geometries (nozzle and centrifugal blood pump) were tested in multiple laboratories to provide experimental velocities, pressures, and hemolysis data to support CFD validation. In addition, computational simulations were performed by more than 20 independent groups to assess current CFD techniques. The primary goal of this article is to summarize the FDA initiative and to report recent findings from the benchmark blood pump model study. Discrepancies between CFD-predicted velocities and those measured using particle image velocimetry most often occurred in regions of flow separation (e.g., downstream of the nozzle throat, and in the pump exit diffuser). For the six pump test conditions, 57% of the CFD predictions of pressure head were within one standard deviation of the mean measured values. Notably, only 37% of all CFD submissions contained hemolysis predictions. This project aided in the development of an FDA Guidance Document on factors to consider when reporting computational studies in medical device regulatory submissions.

  11. Virtual-Reality Simulator System for Double Interventional Cardiac Catheterization Using Fractional-Order Vascular Access Tracker and Haptic Force Producer

    PubMed Central

    Chen, Guan-Chun; Lin, Chia-Hung; Hsieh, Kai-Sheng; Du, Yi-Chun; Chen, Tainsong

    2015-01-01

    This study proposes a virtual-reality (VR) simulator system for double interventional cardiac catheterization (ICC) using a fractional-order vascular access tracker and a haptic force producer. An endoscope or a catheter for the diagnosis and surgery of cardiovascular disease is commonly used in minimally invasive surgery. Operating a Berman catheter and a pigtail catheter inside the human body while avoiding vessel damage requires specific skills and experience of young surgeons or postgraduate year (PGY) students. To improve training in inserting catheters, a double-catheter mechanism is designed for the ICC procedures. A fractional-order vascular access tracker is used to trace the senior surgeons' consoled trajectories and to transmit frictional feedback and visual feedback during the insertion of catheters. Based on the clinical feel of passing through the aortic arch and vein into the ventricle, or through tortuous blood vessels, the haptic force producer is used to mimic the elasticity of the vessel wall using voice coil motors (VCMs). The VR establishment with surgeons' consoled vessel trajectories and hand feel is achieved, and the experimental results show the effectiveness of the system for double ICC procedures. PMID:26171419

  12. Numerical simulation of vessel dynamics in manoeuvrability and seakeeping problems

    NASA Astrophysics Data System (ADS)

    Blishchik, A. E.; Taranov, A. E.

    2018-05-01

    This paper presents examples of numerical modelling of ship dynamics problems and comparisons with corresponding experimental results. Two kinds of simulation were considered: the self-propelled turning motion of the crude carrier KVLCC2, and the changing position of the container carrier S 175 under wave loading. Mesh generation and calculation were performed in the STAR-CCM+ package. URANS equations were used as the governing system of equations, closed by the k-ω SST turbulence model. The vessel had several degrees of freedom, depending on the task. Based on the results of this research, conclusions were drawn concerning the applicability of the numerical methods used.

  13. Formation of acrylamide at temperatures lower than 100°C: the case of prunes and a model study

    PubMed Central

    Becalski, A.; Brady, B.; Feng, S.; Gauthier, B.R.; Zhao, T.

    2011-01-01

    Acrylamide concentrations in prune products – baby strained prunes (range = 75–265 μg kg-1), baby apple/prune juice (33–61 μg kg-1), prune juice (186–916 μg kg-1) and prunes (58–332 μg kg-1) – on the Canadian market were determined. The formation of acrylamide in a simulated plum juice was also investigated under ‘drying conditions’ in an open vessel at temperatures <100°C for 24 h and under ‘wet conditions’ in a closed vessel at a temperature of 120°C for 1 h. Acrylamide was produced in the simulated plum juice under ‘drying conditions’ in amounts comparable with those found in prunes and prune juices. Acrylamide was not produced in the simulated plum juice under ‘wet conditions’ in a closed vessel at a temperature of 120°C for 1 h, but under the same conditions an authentic prune juice doubled its acrylamide concentration. The formation of acrylamide in prune products was attributed to the presence of asparagine and sugars in the starting materials. PMID:21623495

  14. A three-dimensional numerical simulation of cell behavior in a flow chamber based on fluid-solid interaction.

    PubMed

    Bai, Long; Cui, Yuhong; Zhang, Yixia; Zhao, Na

    2014-01-01

    The mechanical behavior of blood cells in vessels has a close relationship with the physical characteristics of the blood and the cells. In this paper, a numerical simulation method is proposed to understand a single blood cell's behavior in vessels based on a fluid-solid interaction method, conducted under an adaptive time step and a fixed time step, respectively. The main program was C++ code, which called the FLUENT and ANSYS software; UDF and APDL scripts acted as messengers connecting FLUENT and ANSYS for data exchange. The computed results show that: (1) the blood cell moved towards the bottom of the flow chamber in the beginning due to the influence of gravity, then began to jump up on reaching a certain height rather than touching the bottom; after jumping up it could move downwards again, and the blood cell could keep moving in this dancing-like fashion continuously in the vessel; (2) the blood cell was rolling and deforming all the time; the rotation showed oscillatory changes and the deformation became conspicuous when the blood cell was dancing. This new simulation method and these results can be widely used in research in cytology, blood, cells, etc.

  15. Influence of simulated microgravity on the longevity of insect-cell culture

    NASA Technical Reports Server (NTRS)

    Cowger, N. L.; O'Connor, K. C.; Bivins, J. E.

    1997-01-01

    Simulated microgravity within the NASA High Aspect Rotating-Wall Vessel (HARV) provides a quiescent environment to culture fragile insect cells. In this vessel, the duration of stationary and death phase for cultures of Spodoptera frugiperda cells was greatly extended over that achieved in shaker-flask controls. For both HARV and control cultures, S. frugiperda cells grew to concentrations in excess of 1 × 10^7 viable cells ml^-1 with viabilities greater than 90%. In the HARV, stationary phase was maintained 9-15 days in contrast to 4-5 days in the shaker flask. Furthermore, the rate of cell death was reduced in the HARV by a factor of 20-90 relative to the control culture and was characterized with a death rate constant of 0.01-0.02 day^-1. Beginning in the stationary phase and continuing in the death phase, there was a significant decrease in population size in the HARV versus an increase in the shaker flask. This phenomenon could represent cell adaptation to simulated microgravity and/or a change in the ratio of apoptotic to necrotic cells. Differences observed in this research between the HARV and its control were attributed to a reduction in hydrodynamic forces in the microgravity vessel.

  16. Simulation of Containment Atmosphere Mixing and Stratification Experiment in the ThAI Facility with a CFD Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babic, Miroslav; Kljenak, Ivo; Mavko, Borut

    2006-07-01

    The CFD code CFX4.4 was used to simulate an experiment in the ThAI facility, which was designed for investigation of thermal-hydraulic processes during a severe accident inside a Light Water Reactor containment. In the considered experiment, air was initially present in the vessel, and helium and steam were injected during different phases of the experiment at various mass flow rates and at different locations. The main purpose of the proposed work was to assess the capabilities of the CFD code to reproduce the atmosphere structure with a three-dimensional model, coupled with condensation models proposed by the authors. A three-dimensional model of the ThAI vessel for the CFX4.4 code was developed. The flow in the simulation domain was modeled as single-phase. Steam condensation on vessel walls was modeled as a sink of mass and energy using a correlation that was originally developed for an integral approach. A simple model of bulk phase change was also included. Calculated time-dependent variables together with temperature and volume fraction distributions at the end of different experiment phases are compared to experimental results.

  17. ALE3D Simulation and Measurement of Violence in a Fast Cookoff Experiment with LX-10

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClelland, M A; Maienschein, J L; Howard, W M

    We performed a computational and experimental analysis of fast cookoff of LX-10 (94.7% HMX, 5.3% Viton A) confined in a 2 kbar steel tube with reinforced end caps. A Scaled-Thermal-Explosion-eXperiment (STEX) was completed in which three radiant heaters were used to heat the vessel until ignition, resulting in a moderately violent explosion after 20.4 minutes. Thermocouple measurements showed tube temperatures as high as 340 C at ignition and LX-10 surface temperatures as high as 279 C, which is near the melting point of HMX. Three micro-power radar systems were used to measure mean fragment velocities of 840 m/s. Photonic Doppler Velocimeters (PDVs) showed a rapid acceleration of fragments over 80 μs. A one-dimensional ALE3D cookoff model at the vessel midplane was used to simulate the heating, thermal expansion, LX-10 decomposition, and closing of the gap between the HE (High Explosive) and the vessel wall. Although the ALE3D simulation terminated before ignition, the model provided a good representation of heat transfer through the case and across the dynamic gap to the explosive.

  18. Initial Development of an Exploding Aerosol Can Simulator

    DOT National Transportation Integrated Search

    1998-04-01

    A device was constructed to simulate an exploding aerosol can. The device consisted of a cylindrical pressure vessel for storage of flammable propellants and base product and a high-rate discharge (HRD) valve for quick release of the constituents. Si...

  19. New Turbulent Multiphase Flow Facilities for Simulation Benchmarking

    NASA Astrophysics Data System (ADS)

    Teoh, Chee Hau; Salibindla, Ashwanth; Masuk, Ashik Ullah Mohammad; Ni, Rui

    2017-11-01

    The Fluid Transport Lab at Penn State has devoted the last few years to developing new experimental facilities to unveil the underlying physics of coupling between solid-gas and gas-liquid multiphase flow in a turbulent environment. In this poster, I will introduce one bubbly flow facility and one dusty flow facility for validating and verifying simulation results. Financial support for this project was provided by the National Science Foundation under Grant Numbers 1653389 and 1705246.

  20. An HLA-Based Approach to Quantify Achievable Performance for Tactical Edge Applications

    DTIC Science & Technology

    2011-05-01

    in: Proceedings of the 2002 Fall Simulation Interoperability Workshop, 02F-SIW-068, Nov 2002. [16] P. Knight, et al., "WBT RTI Independent...Benchmark Tests: Design, Implementation, and Updated Results", in: Proceedings of the 2002 Spring Simulation Interoperability Workshop, 02S-SIW-081, March...Interoperability Workshop, 98F-SIW-085, Nov 1998. [18] S. Ferenci and R. Fujimoto, "RTI Performance on Shared Memory and Message Passing Architectures", in

  1. MEqTrees Telescope and Radio-sky Simulations and CPU Benchmarking

    NASA Astrophysics Data System (ADS)

    Shanmugha Sundaram, G. A.

    2009-09-01

    MEqTrees is a Python-based implementation of the classical Measurement Equation, wherein the various 2×2 Jones matrices are parametrized representations in the spatial and sky domains for any generic radio telescope. Customized simulations of radio-source sky models and corrupt Jones terms are demonstrated based on a policy framework, with performance estimates derived for array configurations, "dirty"-map residuals and processing power requirements for such computations on conventional platforms.
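
    The 2×2 measurement-equation algebra that such frameworks parametrize can be illustrated in a few lines; a sketch with complex-gain Jones terms (illustrative values, not MEqTrees code):

        import numpy as np

        def predict_visibility(jones_p, jones_q, brightness):
            """One term of the radio-interferometric measurement equation:
            V_pq = J_p B J_q^H, with 2x2 complex Jones matrices per antenna."""
            return jones_p @ brightness @ jones_q.conj().T

        # Illustrative: an unpolarized source (B proportional to the identity)
        # seen through a complex-gain Jones term on each of two antennas.
        B = np.eye(2, dtype=complex)
        G_p = np.diag([1.05 * np.exp(1j * 0.10), 0.98 * np.exp(-1j * 0.05)])
        G_q = np.diag([1.02 * np.exp(1j * 0.02), 1.00 * np.exp(1j * 0.07)])
        print(predict_visibility(G_p, G_q, B))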

  2. Fluid-Structure Model of Lymphatic Valve and Vessel

    NASA Astrophysics Data System (ADS)

    Wolf, Ki; Ballard, Matthew; Nepiyushchikh, Zhanna; Razavi, Mohammad; Dixon, Brandon; Alexeev, Alexander

    The lymphatic system is a part of the circulatory system that performs a range of important functions, such as the transportation of interstitial fluid, fatty acids, and immune cells. Lymphatic vessels are composed of contractile walls that pump lymph against an adverse pressure gradient and lymphatic valves that prevent backflow. Despite the importance of the lymphatic system, the contribution of mechanical and geometric changes of lymphatic valves and vessels to pathologies of lymphatic dysfunction, such as lymphedema, is not well understood. We developed a coupled fluid-solid computational model to simultaneously simulate a lymphatic vessel, valve, and flow. A lattice Boltzmann model is used to represent the fluid component, while a lattice spring model is used for the solid component of the lymphatic vessel, whose mechanical properties are derived experimentally. Behaviors such as lymph flow patterns and lymphatic valve performance against backflow and an adverse pressure gradient are investigated under varied lymphatic valve and vessel geometries and mechanical properties, to provide better insight into the dynamics of lymphatic vessels, valves, and the system as a whole, and into how they might fail in disease. NSF CMMI-1635133.

  3. MCNP Simulation Benchmarks for a Portable Inspection System for Narcotics, Explosives, and Nuclear Material Detection

    NASA Astrophysics Data System (ADS)

    Alfonso, Krystal; Elsalim, Mashal; King, Michael; Strellis, Dan; Gozani, Tsahi

    2013-04-01

    MCNPX simulations have been used to guide the development of a portable inspection system for narcotics, explosives, and special nuclear material (SNM) detection. The system seeks to address these threats to national security by utilizing a high-yield, compact neutron source to actively interrogate the threats and produce characteristic signatures that can then be detected by radiation detectors. The portability of the system enables rapid deployment and proximity to threats concealed in small spaces. Both D-D and D-T electronic neutron generators (ENGs) were used to interrogate ammonium nitrate fuel oil (ANFO) and cocaine hydrochloride, and the detector responses of NaI, CsI, and LaBr3 were compared. The effect of tungsten shielding on the neutron flux in the gamma-ray detectors was investigated, while carbon, beryllium, and polyethylene ENG moderator materials were optimized by determining the reaction rate density in the threats. In order to benchmark the modeling results, experimental measurements are compared with MCNPX simulations. In addition, the efficiency and die-away time of a portable differential die-away analysis (DDAA) detector using 3He proportional counters for SNM detection have been determined.

  4. Control strategies for nitrous oxide emissions reduction on wastewater treatment plants operation.

    PubMed

    Santín, I; Barbu, M; Pedret, C; Vilanova, R

    2017-11-15

    The present paper focuses on reducing greenhouse gas emissions in wastewater treatment plant operation by the application of suitable control strategies. Specifically, the objective is to reduce nitrous oxide emissions during the nitrification process. Incomplete nitrification in the aerobic tanks can lead to an accumulation of nitrite that triggers nitrous oxide emissions. In order to avoid peaks of nitrous oxide emissions, this paper proposes a cascade control configuration that manipulates the dissolved oxygen set-points in the aerobic tanks. This control strategy is combined with the ammonia cascade control already applied in the literature, in order to also take into account effluent pollutants and operational costs. In addition, other sources of greenhouse gas emissions are also evaluated. Results have been obtained by simulation, using a modified version of Benchmark Simulation Model no. 2 that takes greenhouse gas emissions into account, called Benchmark Simulation Model no. 2 Gas. The results show that the proposed control strategies are able to reduce nitrous oxide emissions by 29.86% compared with the default control strategy, while maintaining a satisfactory trade-off between water quality and costs.
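
    A minimal sketch of the cascade idea with textbook discrete PI loops; the gains, limits, signs, and set-points below are assumptions for illustration, not the tuned BSM2 controllers:

        class PI:
            """Discrete PI controller with output clamping."""
            def __init__(self, kp, ki, out_min, out_max):
                self.kp, self.ki, self.lo, self.hi = kp, ki, out_min, out_max
                self.integral = 0.0
            def step(self, setpoint, measurement, dt):
                e = setpoint - measurement
                self.integral += e * dt
                u = self.kp * e + self.ki * self.integral
                return min(max(u, self.lo), self.hi)

        # Cascade: the ammonia loop sets the DO set-point; the DO loop sets aeration.
        ammonia_loop = PI(kp=-1.0, ki=-0.05, out_min=0.5, out_max=3.0)  # -> DO set-point (g/m^3)
        do_loop = PI(kp=25.0, ki=5.0, out_min=0.0, out_max=240.0)      # -> kLa (1/d)
        do_setpoint = ammonia_loop.step(setpoint=1.5, measurement=2.1, dt=1/96)
        kla = do_loop.step(setpoint=do_setpoint, measurement=1.8, dt=1/96)
        print(do_setpoint, kla)

    Clamping the outer-loop output bounds the DO set-point, which is the handle the paper uses to avoid the incomplete-nitrification conditions that trigger N2O peaks.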

  5. Experimental validation of the TOPAS Monte Carlo system for passive scattering proton therapy

    PubMed Central

    Testa, M.; Schümann, J.; Lu, H.-M.; Shin, J.; Faddegon, B.; Perl, J.; Paganetti, H.

    2013-01-01

    Purpose: TOPAS (TOol for PArticle Simulation) is a particle simulation code recently developed with the specific aim of making Monte Carlo simulations user-friendly for research and clinical physicists in the particle therapy community. The authors present a thorough and extensive experimental validation of Monte Carlo simulations performed with TOPAS in a variety of setups relevant for proton therapy applications. The set of validation measurements performed in this work represents an overall end-to-end testing strategy recommended for all clinical centers planning to rely on TOPAS for quality assurance or patient dose calculation and, more generally, for all the institutions using passive-scattering proton therapy systems. Methods: The authors systematically compared TOPAS simulations with measurements that are performed routinely within the quality assurance (QA) program in our institution as well as experiments specifically designed for this validation study. First, the authors compared TOPAS simulations with measurements of depth-dose curves for spread-out Bragg peak (SOBP) fields. Second, absolute dosimetry simulations were benchmarked against measured machine output factors (OFs). Third, the authors simulated and measured 2D dose profiles and analyzed the differences in terms of field flatness and symmetry and usable field size. Fourth, the authors designed a simple experiment using a half-beam shifter to assess the effects of multiple Coulomb scattering, beam divergence, and inverse square attenuation on lateral and longitudinal dose profiles measured and simulated in a water phantom. Fifth, TOPAS's capability to simulate time-dependent beam delivery was benchmarked against dose rate functions (i.e., dose per unit time vs time) measured at different depths inside an SOBP field. Sixth, simulations of the charge deposited by protons fully stopping in two different types of multilayer Faraday cups (MLFCs) were compared with measurements to benchmark the nuclear interaction models used in the simulations. Results: SOBPs' range and modulation width were reproduced, on average, with an accuracy of +1, −2 and ±3 mm, respectively. OF simulations reproduced measured data within ±3%. Simulated 2D dose profiles show field flatness and average field radius within ±3% of measured profiles. The field symmetry resulted, on average, in ±3% agreement with commissioned profiles. TOPAS accuracy in reproducing measured dose profiles downstream of the half-beam shifter is better than 2%. Dose rate function simulations reproduced the measurements within ∼2%, showing that the four-dimensional modeling of the passive modulation system was implemented correctly and millimeter accuracy can be achieved in reproducing measured data. For MLFC simulations, 2% agreement was found between TOPAS and both sets of experimental measurements. The overall results show that TOPAS simulations are within the clinically accepted tolerances for all QA measurements performed at our institution. Conclusions: Our Monte Carlo simulations reproduced accurately the experimental data acquired through all the measurements performed in this study. Thus, TOPAS can reliably be applied to quality assurance for proton therapy and also as an input for commissioning of commercial treatment planning systems. This work also provides the basis for routine clinical dose calculations in patients for all passive scattering proton therapy centers using TOPAS. PMID:24320505

  6. A robustness test of the braided device foreshortening algorithm

    NASA Astrophysics Data System (ADS)

    Moyano, Raquel Kale; Fernandez, Hector; Macho, Juan M.; Blasco, Jordi; San Roman, Luis; Narata, Ana Paula; Larrabide, Ignacio

    2017-11-01

    Different computational methods have recently been proposed to simulate the virtual deployment of a braided stent inside a patient's vasculature. Those methods are primarily based on the segmentation of the region of interest to obtain the local vessel morphology descriptors. The goal of this work is to evaluate the influence of segmentation quality on the method named "Braided Device Foreshortening" (BDF). METHODS: We used the 3DRA images of 10 aneurysmatic patients (cases). The cases were segmented by applying a marching cubes algorithm with a broad range of thresholds in order to generate 10 surface models each. We selected a braided device and applied the BDF algorithm to each surface model. The range of the computed flow diverter lengths for each case was obtained to calculate the variability of the method against the threshold segmentation values. RESULTS: An evaluation study over 10 clinical cases indicates that the final length of the deployed flow diverter in each vessel model is stable, yielding a maximum difference of 11.19% in vessel diameter and a maximum of 9.14% in the simulated stent length over the threshold values. The average coefficient of variation was found to be 4.08%. CONCLUSION: A study evaluating how the segmentation threshold affects the simulated length of the deployed FD was presented. The segmentation algorithm used to segment intracranial aneurysm 3D angiography images presents small variation in the resulting stent simulation.
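
    The robustness metric here is simply the coefficient of variation of the simulated lengths across segmentation thresholds; a sketch with hypothetical lengths:

        import numpy as np

        def coefficient_of_variation(lengths_mm):
            """CV (sample std / mean) of simulated stent lengths obtained from
            surface models segmented at different thresholds."""
            lengths = np.asarray(lengths_mm, dtype=float)
            return float(lengths.std(ddof=1) / lengths.mean())

        # Hypothetical lengths (mm) of one device deployed in 10 surface models
        lengths = [24.1, 23.8, 24.5, 25.0, 23.9, 24.3, 24.8, 23.7, 24.2, 24.6]
        print(f"CV = {coefficient_of_variation(lengths):.2%}")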

  7. Why do veins appear blue? A new look at an old question

    NASA Astrophysics Data System (ADS)

    Kienle, Alwin; Hibst, Raimund; Steiner, Rudolf; Lilge, Lothar; Vitkin, I. Alex; Wilson, Brian C.; Patterson, Michael S.

    1996-03-01

    We investigate why vessels that contain blood, which has a red or dark red color, may look bluish in human tissue. A CCD camera was used to make images of diffusely reflected light at different wavelengths. Measurements of reflectance due to model blood vessels in scattering media and of human skin containing a prominent vein are presented. Monte Carlo simulations were used to calculate the spatially resolved diffuse reflectance for both situations. We show that the color of blood vessels is determined by (i) the scattering and absorption characteristics of skin at different wavelengths, (ii) the oxygenation state of blood, which affects its absorption properties, (iii) the diameter and the depth of the vessels, and (iv) the visual perception process.

  8. Manure nutrient management effects in the Leon River Watershed

    USDA-ARS?s Scientific Manuscript database

    The Leon River Watershed (LRW) in central Texas is a Benchmark and Special Emphasis watershed within the Conservation Effects Assessment Project (CEAP). Model simulations from 1977 through 2006 were used to evaluate six manure nutrient management scenarios that reflect reali...

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, Jay Prakash

    The objectives of this project are to calibrate the Advanced Experimental Fuel Counter (AEFC), benchmark MCNP simulations using experimental results, investigate the effects of changes in fuel assembly geometry, and finally to show the boost in doubles count rates with 252Cf active sources due to the time-correlated induced fission (TCIF) effect.

  10. Simulations of hypervelocity impacts for asteroid deflection studies

    NASA Astrophysics Data System (ADS)

    Heberling, T.; Ferguson, J. M.; Gisler, G. R.; Plesko, C. S.; Weaver, R.

    2016-12-01

    The possibility of kinetic-impact deflection of threatening near-Earth asteroids will be tested for the first time in the proposed AIDA (Asteroid Impact Deflection Assessment) mission, involving two independent spacecraft: NASA's DART (Double Asteroid Redirection Test) and ESA's AIM (Asteroid Impact Mission). The impact of the DART spacecraft onto the secondary of the binary asteroid 65803 Didymos, at a speed of 5 to 7 km/s, is expected to alter the mutual orbit by an observable amount. The velocity imparted to the secondary depends on the geometry and dynamics of the impact, and especially on the momentum enhancement factor, conventionally called beta. We use the Los Alamos hydrocodes Rage and Pagosa to estimate beta in laboratory-scale benchmark experiments and in the large-scale asteroid deflection test. Simulations are performed in two and three dimensions, using a variety of equations of state and strength models for both the lab-scale and large-scale cases. This work is being performed as part of a systematic benchmarking study for the AIDA mission that includes other hydrocodes.
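
    The role of beta can be made concrete with the standard head-on momentum balance, delta_v = beta * m * v / M; the numbers below are assumed round values, not mission figures:

        def deflection_delta_v(beta, m_impactor_kg, v_impact_m_s, m_target_kg):
            """Along-track velocity change of the target from a kinetic impact,
            delta_v = beta * m * v / M (head-on geometry assumed). beta = 1 is
            pure momentum transfer; beta > 1 adds ejecta recoil."""
            return beta * m_impactor_kg * v_impact_m_s / m_target_kg

        # Assumed DART-like numbers: 500 kg impactor at 6 km/s into a ~5e9 kg
        # secondary, with beta = 2
        dv = deflection_delta_v(2.0, 500.0, 6000.0, 5e9)
        print(f"delta_v = {dv * 1000:.2f} mm/s")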

  11. Adaptive Grid Refinement for Atmospheric Boundary Layer Simulations

    NASA Astrophysics Data System (ADS)

    van Hooft, Antoon; van Heerwaarden, Chiel; Popinet, Stephane; van der Linden, Steven; de Roode, Stephan; van de Wiel, Bas

    2017-04-01

    We validate and benchmark an adaptive mesh refinement (AMR) algorithm for numerical simulations of the atmospheric boundary layer (ABL). The AMR technique aims to distribute computational resources efficiently over a domain by refining and coarsening the numerical grid locally and in time. This can be beneficial for studying cases in which length scales vary significantly in time and space. We present results for a case describing the growth and decay of a convective boundary layer. The AMR results are benchmarked against two runs using a fixed, finely meshed grid: first with the same numerical formulation as the AMR code, and second with a code dedicated to ABL studies. Compared to the fixed and isotropic grid runs, the AMR algorithm can coarsen and refine the grid such that accurate results are obtained whilst using only a fraction of the grid cells. In terms of performance, the AMR run was cheaper than the fixed and isotropic grid run with a similar numerical formulation. However, for this specific case, the dedicated code outperformed both aforementioned runs.

  12. Benchmarking electrophysiological models of human atrial myocytes

    PubMed Central

    Wilhelms, Mathias; Hettmann, Hanne; Maleckar, Mary M.; Koivumäki, Jussi T.; Dössel, Olaf; Seemann, Gunnar

    2013-01-01

    Mathematical modeling of cardiac electrophysiology is an insightful method to investigate the underlying mechanisms responsible for arrhythmias such as atrial fibrillation (AF). In past years, five models of human atrial electrophysiology with different formulations of ionic currents, and consequently diverging properties, have been published. The aim of this work is to give an overview of strengths and weaknesses of these models depending on the purpose and the general requirements of simulations. Therefore, these models were systematically benchmarked with respect to general mathematical properties and their ability to reproduce certain electrophysiological phenomena, such as action potential (AP) alternans. To assess the models' ability to replicate modified properties of human myocytes and tissue in cardiac disease, electrical remodeling in chronic atrial fibrillation (cAF) was chosen as test case. The healthy and remodeled model variants were compared with experimental results in single-cell, 1D and 2D tissue simulations to investigate AP and restitution properties, as well as the initiation of reentrant circuits. PMID:23316167

  13. Similarity indices of meteo-climatic gauging stations: definition and comparison.

    PubMed

    Barca, Emanuele; Bruno, Delia Evelina; Passarella, Giuseppe

    2016-07-01

    Space-time dependencies among monitoring network stations have been investigated to detect and quantify similarity relationships among gauging stations. In this work, besides the well-known rank correlation index, two new similarity indices have been defined and applied to compute the similarity matrix for the Apulian meteo-climatic monitoring network. The similarity matrices can be applied to reliably address the issue of missing data in space-time series. To establish the effectiveness of the similarity indices, a simulation test was designed and performed with the aim of estimating missing monthly rainfall rates at a suitably selected gauging station. The results of the simulation allowed us to evaluate the effectiveness of the proposed similarity indices. Finally, the multiple imputation by chained equations method was used as a benchmark to provide an absolute yardstick for comparing the outcomes of the test. In conclusion, the newly proposed multiplicative similarity index proved at least as reliable as the selected benchmark.
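
    A rough illustration of the idea rather than the authors' indices: a Spearman rank-correlation similarity matrix between stations, used to impute a missing monthly value from the remaining stations (hypothetical data; NumPy and SciPy assumed):

      import numpy as np
      from scipy.stats import spearmanr

      # Hypothetical monthly rainfall at 4 stations sharing a regional signal.
      rng = np.random.default_rng(0)
      months = 120
      regional = rng.gamma(shape=2.0, scale=30.0, size=months)
      rain = np.clip(regional[:, None] + rng.normal(scale=10.0, size=(months, 4)),
                     0.0, None)

      rho, _ = spearmanr(rain)        # 4 x 4 rank-correlation similarity matrix

      def impute(rain, month, station):
          """Fill a missing value with the similarity-weighted mean of the other
          stations' observations for the same month."""
          others = [s for s in range(rain.shape[1]) if s != station]
          w = np.clip(rho[station, others], 0.0, None)   # keep positive similarities
          return float(np.dot(w, rain[month, others]) / w.sum())

      print(f"imputed value: {impute(rain, month=7, station=2):.1f} mm")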

  14. An experimental phylogeny to benchmark ancestral sequence reconstruction

    PubMed Central

    Randall, Ryan N.; Radford, Caelan E.; Roof, Kelsey A.; Natarajan, Divya K.; Gaucher, Eric A.

    2016-01-01

    Ancestral sequence reconstruction (ASR) is a still-burgeoning method that has revealed many key mechanisms of molecular evolution. One criticism of the approach is an inability to validate its algorithms within a biological context as opposed to a computer simulation. Here we build an experimental phylogeny using the gene of a single red fluorescent protein to address this criticism. The evolved phylogeny consists of 19 operational taxonomic units (leaves) and 17 ancestral bifurcations (nodes) that display a wide variety of fluorescent phenotypes. The 19 leaves then serve as 'modern' sequences that we subject to ASR analyses using various algorithms and benchmark against the known ancestral genotypes and ancestral phenotypes. We confirm computer simulations that show all algorithms infer ancient sequences with high accuracy, yet we also reveal wide variation in the phenotypes encoded by incorrectly inferred sequences. Specifically, Bayesian methods incorporating rate variation significantly outperform the maximum parsimony criterion in phenotypic accuracy. Subsampling of extant sequences had a minor effect on the inference of ancestral sequences. PMID:27628687

  15. The MCUCN simulation code for ultracold neutron physics

    NASA Astrophysics Data System (ADS)

    Zsigmond, G.

    2018-02-01

    Ultracold neutrons (UCN) have very low kinetic energies (0-300 neV) and can therefore be stored in suitable material or magnetic confinements for many hundreds of seconds. This makes them a very useful tool for probing fundamental symmetries of nature (for instance, charge-parity violation in neutron electric dipole moment experiments) and for contributing important parameters to Big Bang nucleosynthesis (neutron lifetime measurements). Improved-precision experiments are under construction at new and planned UCN sources around the world. MC simulations play an important role in the optimization of such systems with a large number of parameters, but also in the estimation of systematic effects, in the benchmarking of analysis codes, and as part of the analysis itself. The MCUCN code written at PSI has been used extensively for the optimization of the UCN source optics and in the optimization and analysis of (test) experiments within the nEDM project based at PSI. In this paper we present the main features of MCUCN and interesting benchmark and application examples.

  16. Mathematical model and metaheuristics for simultaneous balancing and sequencing of a robotic mixed-model assembly line

    NASA Astrophysics Data System (ADS)

    Li, Zixiang; Janardhanan, Mukund Nilakantan; Tang, Qiuhua; Nielsen, Peter

    2018-05-01

    This article presents the first method to simultaneously balance and sequence robotic mixed-model assembly lines (RMALB/S), which involves three sub-problems: task assignment, model sequencing, and robot allocation. A new mixed-integer programming model is developed to minimize makespan and, using the CPLEX solver, small-size problems are solved to optimality. Two metaheuristics, a restarted simulated annealing algorithm and a co-evolutionary algorithm, are developed and improved to address this NP-hard problem. The restarted simulated annealing method replaces the current temperature with a new temperature to restart the search process. The co-evolutionary method uses a restart mechanism to generate a new population by modifying several vectors simultaneously. The proposed algorithms are tested on a set of benchmark problems and compared with five other high-performing metaheuristics; they outperform both their original versions and the benchmarked methods. The proposed algorithms are able to solve the balancing and sequencing problem of a robotic mixed-model assembly line effectively and efficiently.
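
    A bare-bones sketch of the restart mechanism described above, under our simplifying assumption that a candidate solution can be scored by a user-supplied cost function (the authors' full algorithm handles task assignment, sequencing, and robot allocation together):

      import math
      import random

      def restarted_sa(initial, neighbor, cost, t0=100.0, alpha=0.95,
                       t_min=1e-3, restart_t=10.0, iters=2000):
          """Simulated annealing; when the temperature decays below t_min it is
          reset to restart_t so the search can escape local optima."""
          best = cur = initial
          t = t0
          for _ in range(iters):
              cand = neighbor(cur)
              delta = cost(cand) - cost(cur)
              if delta < 0 or random.random() < math.exp(-delta / t):
                  cur = cand
                  if cost(cur) < cost(best):
                      best = cur
              t *= alpha
              if t < t_min:        # the restart step: raise the temperature again
                  t = restart_t
          return best

      # Toy usage: minimise a quadratic over the integers.
      print(restarted_sa(initial=50,
                         neighbor=lambda x: x + random.choice((-1, 1)),
                         cost=lambda x: (x - 7) ** 2))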

  17. Analysis of the effects of gravity and wall thickness in a model of blood flow through axisymmetric vessels.

    PubMed

    Payne, S J

    2004-11-01

    The effects of gravitational forces and wall thickness on the behaviour of a model of blood flow through axisymmetric vessels were studied. The governing fluid dynamic equations were derived from the Navier-Stokes equations for an incompressible fluid and linked to a simple model of the vessel wall. A closed form of the hyperbolic partial differential equations was found, including a significant source term from the gravitational forces. The inclination of the vessel is modelled using a slope parameter that varied between -1 and 1. The wave speed was shown to be related to the wall thickness, and the time to first shock formation was shown to be directly proportional to this thickness. Two non-dimensional parameters were derived for the ratio of gravitational forces to viscous and momentum forces, respectively, and their values were calculated for the different types of vessel found in the human vasculature, showing that gravitational forces were significant in comparison with either viscous or momentum forces for every type of vessel. The steady-state solution of the governing equations showed that gravitational forces cause an increase in area of approximately 5% per metre per unit slope. Numerical simulations of the flow field in the aorta showed that a positive slope causes a velocity pulse to change in amplitude approximately linearly with distance: -4% per metre and +5% per metre for vessels inclined vertically upwards and downwards, respectively, in comparison with only +0.5% for a horizontal vessel. These simulations also showed that the change relative to the zero slope condition in the maximum rate of change of area with distance, which was taken to be a measure of the rate of shock formation, is proportional to both the slope and the wall thickness-to-inner radius ratio, with a constant of proportionality of 1.2. At a ratio of 0.25, typical of that found in human arteries, the distance to shock formation is thus decreased and increased by 30% for vessels inclined vertically downwards and upwards, respectively. Gravity and wall thickness thus have a significant impact on a number of aspects of the fluid and wall behaviour, despite conventionally being neglected.
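
    For orientation, the classical Moens-Korteweg relation reproduces the wave-speed/wall-thickness dependence noted above (the paper derives its own closed form from the linked wall model; the relation, the force-ratio scalings, and all parameter values below are illustrative assumptions):

      import math

      rho = 1050.0   # blood density, kg/m^3
      mu = 3.5e-3    # blood viscosity, Pa*s
      E = 4.0e5      # effective Young's modulus of the wall, Pa (illustrative)
      R = 0.01       # inner radius, m (aorta-like)
      U = 0.5        # characteristic velocity, m/s
      g = 9.81       # gravitational acceleration, m/s^2
      L = 0.5        # vessel length scale, m

      def moens_korteweg(h):
          """Wave speed c = sqrt(E*h / (2*rho*R)): grows with wall thickness h."""
          return math.sqrt(E * h / (2.0 * rho * R))

      for h_ratio in (0.10, 0.25, 0.40):        # wall thickness / inner radius
          print(f"h/R = {h_ratio:.2f}: c = {moens_korteweg(h_ratio * R):.1f} m/s")

      # Plausible scalings for the two force ratios named in the abstract.
      print(f"gravity/momentum ~ g*L/U^2          = {g * L / U**2:.1f}")
      print(f"gravity/viscous  ~ rho*g*R^2/(mu*U) = {rho * g * R**2 / (mu * U):.0f}")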

  18. Hybrid stochastic simulation of reaction-diffusion systems with slow and fast dynamics.

    PubMed

    Strehl, Robert; Ilie, Silvana

    2015-12-21

    In this paper, we present a novel hybrid method to simulate discrete stochastic reaction-diffusion models arising in biochemical signaling pathways. We study moderately stiff systems, for which we can partition each reaction or diffusion channel into either a slow or fast subset, based on its propensity. Numerical approaches missing this distinction are often limited with respect to computational run time or approximation quality. We design an approximate scheme that remedies these pitfalls by using a new blending strategy of the well-established inhomogeneous stochastic simulation algorithm and the tau-leaping simulation method. The advantages of our hybrid simulation algorithm are demonstrated on three benchmarking systems, with special focus on approximation accuracy and efficiency.
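
    A schematic of the slow/fast partitioning step only (the blending of the inhomogeneous SSA with tau-leaping is considerably more involved); the fixed-threshold rule and the values below are our assumptions, not the authors':

      import numpy as np

      def partition_channels(propensities, threshold=10.0):
          """Split reaction/diffusion channels into slow and fast subsets by
          comparing each propensity against a fixed threshold."""
          a = np.asarray(propensities, dtype=float)
          return np.flatnonzero(a < threshold), np.flatnonzero(a >= threshold)

      # Toy propensities for six channels.
      slow, fast = partition_channels([0.3, 42.0, 7.5, 120.0, 0.01, 15.0])
      print("slow channels (exact SSA):  ", slow)
      print("fast channels (tau-leaping):", fast)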

  19. A simple design for microwave assisted digestion vessel with low reagent consumption suitable for food and environmental samples

    NASA Astrophysics Data System (ADS)

    Gholami, Mehrdad; Behkami, Shima; Zain, Sharifuddin Md.; Bakirdere, Sezgin

    2016-11-01

    The objective of this work is to prepare a cost-effective, low reagent consumption and high performance polytetrafluoroethylene (PTFE) vessel that is capable to work in domestic microwave for digesting food and environmental samples. The designed vessel has a relatively thicker wall compared to that of commercial vessels. In this design, eight vessels are placed in an acrylonitrile butadiene styrene (ABS) holder to keep them safe and stable. This vessel needs only 2.0 mL of HNO3 and 1.0 mL H2O2 to digest 100 mg of biological sample. The performance of this design is then evaluated with an ICP-MS instrument in the analysis of the several NIST standard reference material of milk 1849a, rice flour 1568b, spinach leave 1570a and Peach Leaves 1547 in a domestic microwave oven with inverter technology. Outstanding agreement to (SRM) values are observed by using the suggested power to time microwave program, which simulates the reflux action occurring in this closed vessel. Taking into account the high cost of commercial microwave vessels and the volume of chemicals needed for various experiments (8-10 mL), this simple vessel is cost effective and suitable for digesting food and environmental samples.

  20. Testing the robustness of optimal access vessel fleet selection for operation and maintenance of offshore wind farms

    DOE PAGES

    Sperstad, Iver Bakken; Stålhane, Magnus; Dinwoodie, Iain; ...

    2017-09-23

    Optimising the operation and maintenance (O&M) and logistics strategy of offshore wind farms implies the decision problem of selecting the vessel fleet for O&M. Different strategic decision support tools can be applied to this problem, but much uncertainty remains regarding both input data and modelling assumptions. Our paper aims to investigate and ultimately reduce this uncertainty by comparing four simulation tools, one mathematical optimisation tool, and one analytic spreadsheet-based tool applied to select the O&M access vessel fleet that minimizes the total O&M cost of a reference wind farm. The comparison shows that the tools generally agree on the optimal vessel fleet, but only partially agree on the relative ranking of the different vessel fleets in terms of total O&M cost. The robustness of the vessel fleet selection to various input data assumptions was tested, and the ranking was found to be particularly sensitive to the vessels' limiting significant wave height for turbine access. This is also the parameter with the greatest discrepancy between the tools, which implies that accurate quantification and modelling of this parameter is crucial. The ranking is moderately sensitive to turbine failure rates and vessel day rates but less sensitive to electricity price and vessel transit speed.
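
    A toy illustration of why the limiting significant wave height dominates such rankings: the fraction of hours a vessel can access turbines is a steep function of its Hs limit when evaluated against a synthetic sea state (the lognormal wave model and all values are hypothetical):

      import numpy as np

      rng = np.random.default_rng(1)
      # Hypothetical hourly significant wave height for one year, metres.
      hs = rng.lognormal(mean=0.3, sigma=0.5, size=8760)

      for hs_limit in (1.2, 1.5, 1.8, 2.0):    # vessel access limits, metres
          accessible = np.mean(hs <= hs_limit)
          print(f"Hs limit {hs_limit:.1f} m -> accessible {100*accessible:.0f}% of hours")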

  2. A preliminary assessment of the effects of heat flux distribution and penetration on the creep rupture of a reactor vessel lower head

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, T.Y.; Bentz, J.; Simpson, R.

    1997-02-01

    The objective of the Lower Head Failure (LHF) Experiment Program is to experimentally investigate and characterize the failure of the reactor vessel lower head due to thermal and pressure loads under severe accident conditions. The experiments are performed using 1/5-scale models of a typical PWR pressure vessel, for various internal pressures and imposed heat flux distributions, with and without instrumentation guide tube penetrations. The experimental program is complemented by a modest modeling program based on the application of vessel creep rupture codes developed in the TMI Vessel Investigation Project. The first three experiments under the LHF program investigated the creep rupture of simulated reactor pressure vessels without penetrations. The heat flux distributions for the three experiments are uniform (LHF-1), center-peaked (LHF-2), and side-peaked (LHF-3), respectively. For all the experiments, appreciable vessel deformation was observed to initiate at vessel wall temperatures above 900 K, and the vessel typically failed at approximately 1000 K. The size of the failure was always observed to be smaller than the heated region. For experiments with non-uniform heat flux distributions, failure typically occurs in the region of peak temperature. A brief discussion of the effect of penetration is also presented.

  3. Absolute probability estimates of lethal vessel strikes to North Atlantic right whales in Roseway Basin, Scotian Shelf.

    PubMed

    van der Hoop, Julie M; Vanderlaan, Angelia S M; Taggart, Christopher T

    2012-10-01

    Vessel strikes are the primary source of known mortality for the endangered North Atlantic right whale (Eubalaena glacialis). Multi-institutional efforts to reduce mortality associated with vessel strikes include vessel-routing amendments such as the International Maritime Organization voluntary "area to be avoided" (ATBA) in the Roseway Basin right whale feeding habitat on the southwestern Scotian Shelf. Though relative probabilities of lethal vessel strikes have been estimated and published, absolute probabilities remain unknown. We used a modeling approach to determine the regional effect of the ATBA, by estimating reductions in the expected number of lethal vessel strikes. This analysis differs from others in that it explicitly includes a spatiotemporal analysis of real-time transits of vessels through a population of simulated, swimming right whales. Combining automatic identification system (AIS) vessel navigation data and an observationally based whale movement model allowed us to determine the spatial and temporal intersection of vessels and whales, from which various probability estimates of lethal vessel strikes are derived. We estimate one lethal vessel strike every 0.775-2.07 years prior to ATBA implementation, consistent with and more constrained than previous estimates of every 2-16 years. Following implementation, a lethal vessel strike is expected every 41 years. When whale abundance is held constant across years, we estimate that voluntary vessel compliance with the ATBA results in an 82% reduction in the per capita rate of lethal strikes; very similar to a previously published estimate of 82% reduction in the relative risk of a lethal vessel strike. The models we developed can inform decision-making and policy design, based on their ability to provide absolute, population-corrected, time-varying estimates of lethal vessel strikes, and they are easily transported to other regions and situations.
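
    A heavily simplified stand-in for the strike-rate estimate (the published model uses real AIS transits and an observation-based whale movement model; every number below is hypothetical):

      import numpy as np

      rng = np.random.default_rng(7)

      # Hypothetical ingredients: annual transits through the habitat, the
      # probability a transit crosses a whale path, and lethality per crossing.
      transits_per_year = 1500
      p_cross, p_lethal = 2.0e-4, 0.35

      def lethal_strikes_per_year(n_transits, n_sims=100_000):
          """Monte Carlo: each transit independently yields a lethal strike
          with probability p_cross * p_lethal."""
          return rng.binomial(n_transits, p_cross * p_lethal, size=n_sims).mean()

      before = lethal_strikes_per_year(transits_per_year)
      after = lethal_strikes_per_year(int(transits_per_year * 0.2))  # 80% reroute
      print(f"return period: {1/before:.1f} y before, {1/after:.1f} y after rerouting")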

  4. Influence of Assimilation of Subsurface Temperature Measurements on Simulations of Equatorial Undercurrent and South Equatorial Current Along the Pacific Equator

    NASA Technical Reports Server (NTRS)

    Halpern, David; Leetmaan, Ants; Reynolds, Richard W.; Ji, Ming

    1997-01-01

    Equatorial Pacific current and temperature fields were simulated with and without assimilation of subsurface temperature measurements for April 1992 - March 1995, and compared with moored buoy and research vessel current measurements.

  5. Distributed sensing of Composite Over-wrapped Pressure Vessel using Fiber-Bragg Gratings at Ambient and Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Grant, Joseph

    2005-01-01

    Fiber Bragg gratings are used to monitor the structural properties of composite pressure vessels. These gratings, optically inscribed into the core of a single-mode fiber, are used as a tool to monitor the stress-strain relation in laminate structures. The fiber Bragg sensors are both embedded within the composite laminates and bonded to the surface of the vessel, with varying orientations with respect to the carbon fiber in the epoxy matrix. The response of these fiber-optic sensors is investigated by pressurizing the cylinder up to its burst pressure of around 2800 psi. This is done at both ambient and cryogenic temperatures using water and liquid nitrogen. The recorded response is compared with the response from conventional strain gauges also present on the vessel. Additionally, several vessels were tested that had been damaged to simulate different types of events, such as cut tow, delamination, and impact damage.

  7. Two-Dimensional Self-Consistent Radio Frequency Plasma Simulations Relevant to the Gaseous Electronics Conference RF Reference Cell

    PubMed Central

    Lymberopoulos, Dimitris P.; Economou, Demetre J.

    1995-01-01

    Over the past few years multidimensional self-consistent plasma simulations including complex chemistry have been developed which are promising tools for furthering our understanding of reactive gas plasmas and for reactor design and optimization. These simulations must be benchmarked against experimental data obtained in well-characterized systems such as the Gaseous Electronics Conference (GEC) reference cell. Two-dimensional simulations relevant to the GEC Cell are reviewed in this paper with emphasis on fluid simulations. Important features observed experimentally, such as off-axis maxima in the charge density and hot spots of metastable species density near the electrode edges in capacitively-coupled GEC cells, have been captured by these simulations. PMID:29151756

  8. New Multi-group Transport Neutronics (PHISICS) Capabilities for RELAP5-3D and its Application to Phase I of the OECD/NEA MHTGR-350 MW Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom; Cristian Rabiti; Andrea Alfonsi

    2012-10-01

    PHISICS is a neutronics code system currently under development at the Idaho National Laboratory (INL). Its goal is to provide state-of-the-art simulation capability to reactor designers. The PHISICS modules currently under development are a nodal and semi-structured transport core solver (INSTANT), a depletion module (MRTAU), and a cross-section interpolation (MIXER) module. The INSTANT module is the most developed of the three. Basic functionalities are ready to use, but the code is still under continuous development to extend its capabilities. This paper reports on the effort of coupling the nodal kinetics code package PHISICS (INSTANT/MRTAU/MIXER) to the thermal-hydraulics system code RELAP5-3D, to enable full core and system modeling. This opens the possibility of modeling coupled (thermal-hydraulics and neutronics) problems with more options for 3D neutron kinetics than the existing diffusion theory neutron kinetics module in RELAP5-3D (NESTLE) offers. In the second part of the paper, an overview of the OECD/NEA MHTGR-350 MW benchmark is given. This benchmark has been approved by the OECD and is based on the General Atomics 350 MW Modular High Temperature Gas Reactor (MHTGR) design. The benchmark includes coupled neutronics/thermal-hydraulics exercises that require more capabilities than RELAP5-3D with NESTLE offers; therefore, the MHTGR benchmark makes extensive use of the new PHISICS/RELAP5-3D coupling capabilities. The paper presents preliminary results of the three steady-state exercises specified in Phase I of the benchmark using PHISICS/RELAP5-3D.

  9. Computer simulation of multigrid body dynamics and control

    NASA Technical Reports Server (NTRS)

    Swaminadham, M.; Moon, Young I.; Venkayya, V. B.

    1990-01-01

    The objective is to set up and analyze benchmark problems on multibody dynamics and to verify the predictions of two multibody computer simulation codes. TREETOPS and DISCOS have been used to run three example problems: a one-degree-of-freedom spring-mass-dashpot system, an inverted pendulum system, and a triple pendulum. To study the dynamics and control interaction, an inverted planar pendulum with an external body force and a torsional control spring was modeled as a hinge-connected two-rigid-body system. TREETOPS and DISCOS were used to carry out the time-history simulation of this problem. System state-space variables and their time derivatives from the two simulation codes were compared.

  10. AGREEMENT AND COVERAGE OF INDICATORS OF RESPONSE TO INTERVENTION: A MULTI-METHOD COMPARISON AND SIMULATION

    PubMed Central

    Fletcher, Jack M.; Stuebing, Karla K.; Barth, Amy E.; Miciak, Jeremy; Francis, David J.; Denton, Carolyn A.

    2013-01-01

    Purpose Agreement across methods for identifying students as inadequate responders or as learning disabled is often poor. We report (1) an empirical examination of final status (post-intervention benchmarks) and dual-discrepancy growth methods based on growth during the intervention and final status for assessing response to intervention; and (2) a statistical simulation of psychometric issues that may explain low agreement. Methods After a Tier 2 intervention, final status benchmark criteria were used to identify 104 inadequate and 85 adequate responders to intervention, with comparisons of agreement and coverage for these methods and a dual-discrepancy method. Factors affecting agreement were investigated using computer simulation to manipulate reliability, the intercorrelation between measures, cut points, normative samples, and sample size. Results Identification of inadequate responders based on individual measures showed that single measures tended not to identify many members of the pool of 104 inadequate responders. Poor to fair levels of agreement for identifying inadequate responders were apparent between pairs of measures. In the simulation, comparisons across two simulated measures generated indices of agreement (kappa) that were generally low because of multiple psychometric issues inherent in any test. Conclusions Expecting excellent agreement between two correlated tests with even small amounts of unreliability may not be realistic. Assessing outcomes based on multiple measures, such as level of CBM performance and short norm-referenced assessments of fluency, may improve the reliability of diagnostic decisions. PMID:25364090
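
    A compact version of the simulation logic described above: two measures of the same latent skill with imperfect reliability are dichotomised at a cut point and their agreement summarised by Cohen's kappa (parameter values arbitrary; NumPy assumed):

      import numpy as np

      rng = np.random.default_rng(3)
      n, reliability, cut = 5000, 0.8, -0.5   # sample size, reliability, cut (z-score)

      true_skill = rng.normal(size=n)

      def observed(true, rel):
          """Observed score = attenuated true score plus measurement error."""
          return np.sqrt(rel) * true + np.sqrt(1.0 - rel) * rng.normal(size=true.size)

      a = observed(true_skill, reliability) < cut   # inadequate responder, measure A
      b = observed(true_skill, reliability) < cut   # inadequate responder, measure B

      po = np.mean(a == b)                                         # observed agreement
      pe = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())   # chance agreement
      print(f"kappa = {(po - pe) / (1 - pe):.2f}")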

  11. ff14ipq: A Self-Consistent Force Field for Condensed-Phase Simulations of Proteins

    PubMed Central

    2015-01-01

    We present the ff14ipq force field, implementing the previously published IPolQ charge set for simulations of complete proteins. Minor modifications to the charge derivation scheme and van der Waals interactions between polar atoms are introduced. Torsion parameters are developed through a generational learning approach, based on gas-phase MP2/cc-pVTZ single-point energies computed for structures optimized by the force field itself rather than the quantum benchmark. In this manner, we sacrifice information about the true quantum minima in order to ensure that the force field maintains optimal agreement with the MP2/cc-pVTZ benchmark for the ensembles it will actually produce in simulations. A means of making the gas-phase torsion parameters compatible with solution-phase IPolQ charges is presented. The ff14ipq model is an alternative to ff99SB and other Amber force fields for protein simulations in programs that accommodate pair-specific Lennard-Jones combining rules. The force field gives strong performance on α-helical and β-sheet oligopeptides as well as globular proteins over microsecond time scale simulations, although it has not yet been tested in conjunction with lipid and nucleic acid models. We show how our choices in parameter development influence the resulting force field and how other choices that may have appeared reasonable would actually have led to poorer results. The tools we developed may also aid in the development of future fixed-charge and even polarizable biomolecular force fields. PMID:25328495

  12. Cell culture for three-dimensional modeling in rotating-wall vessels: an application of simulated microgravity

    NASA Technical Reports Server (NTRS)

    Schwarz, R. P.; Goodwin, T. J.; Wolf, D. A.

    1992-01-01

    High-density, three-dimensional cell cultures are difficult to grow in vitro. The rotating-wall vessel (RWV) described here has cultured BHK-21 cells to a density of 1.1 × 10^7 cells/ml. Cells on microcarriers were observed to grow with enhanced bridging in this batch culture system. The RWV is a horizontally rotated tissue culture vessel with silicon membrane oxygenation. This design results in a low-turbulence, low-shear cell culture environment with abundant oxygenation. The RWV has the potential to culture a wide variety of normal and neoplastic cells.

  13. Continuum mathematical modelling of pathological growth of blood vessels

    NASA Astrophysics Data System (ADS)

    Stadnik, N. E.; Dats, E. P.

    2018-04-01

    The present study is devoted to the mathematical modelling of pathological growth of a human blood vessel. The vessel is simulated as a thin-walled circular tube. The boundary value problem of the surface growth of an elastic thin-walled cylinder is solved. The analytical solution is obtained in terms of the velocities of the stress-strain state parameters. The thinness condition allows us to study finite displacements of the cylinder surfaces by means of infinitesimal deformations. The stress-strain state characteristics, which depend on the mechanical parameters of the biological processes, are numerically computed and graphically analysed.

  14. Development of a Pebble-Bed Liquid-Nitrogen Evaporator/Superheater for the BRL 1/6th Scale Large Blast/Thermal Simulator Test Bed. Phase 1. Prototype Design and Analysis

    DTIC Science & Technology

    1991-08-01

    Specifications are taken primarily from the 1983 version of the ASME Boiler and Pressure Vessel Code. Other design requirements were developed from standard safe... rules and practices of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code to provide a safe and reliable system

  15. Benchmarking the evaluated proton differential cross sections suitable for the EBS analysis of natSi and 16O

    NASA Astrophysics Data System (ADS)

    Kokkoris, M.; Dede, S.; Kantre, K.; Lagoyannis, A.; Ntemou, E.; Paneta, V.; Preketes-Sigalas, K.; Provatas, G.; Vlastou, R.; Bogdanović-Radović, I.; Siketić, Z.; Obajdin, N.

    2017-08-01

    The evaluated proton differential cross sections suitable for the Elastic Backscattering Spectroscopy (EBS) analysis of natSi and 16O, as obtained from SigmaCalc 2.0, have been benchmarked over a wide energy and angular range at two different accelerator laboratories, namely at N.C.S.R. 'Demokritos', Athens, Greece and at Ruđer Bošković Institute (RBI), Zagreb, Croatia, using a variety of high-purity thick targets of known stoichiometry. The results are presented in graphical and tabular forms, while the observed discrepancies, as well as, the limits in accuracy of the benchmarking procedure, along with target related effects, are thoroughly discussed and analysed. In the case of oxygen the agreement between simulated and experimental spectra was generally good, while for silicon serious discrepancies were observed above Ep,lab = 2.5 MeV, suggesting that a further tuning of the appropriate nuclear model parameters in the evaluated differential cross-section datasets is required.

  16. OPTIMIZATION OF DEEP DRILLING PERFORMANCE--DEVELOPMENT AND BENCHMARK TESTING OF ADVANCED DIAMOND PRODUCT DRILL BITS & HP/HT FLUIDS TO SIGNIFICANTLY IMPROVE RATES OF PENETRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alan Black; Arnis Judzis

    2004-10-01

    The industry cost-shared program aims to benchmark drilling rates of penetration (ROP) in selected simulated deep formations and to significantly improve ROP through team development of aggressive diamond-product drill bit and fluid system technologies. Overall the objectives are as follows: Phase 1--Benchmark ''best in class'' diamond and other product drilling bits and fluids and develop concepts for a next level of deep drilling performance; Phase 2--Develop advanced smart bit-fluid prototypes and test at large scale; and Phase 3--Field trial smart bit-fluid concepts, modify as necessary, and commercialize products. As of the report date, TerraTek has concluded all major preparations for the high-pressure drilling campaign. Baker Hughes encountered difficulties in providing additional pumping capacity before TerraTek's scheduled relocation to another facility; thus the program was delayed further to accommodate the full testing program.

  17. Impact of chemistry on Standard High Solids Vessel Design mixing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poirier, M.

    2016-03-02

    The plan for resolving technical issues regarding mixing performance within vessels of the Hanford Waste Treatment Plant Pretreatment Facility directs a chemical impact study to be performed. The vessels involved are those that will process higher (e.g., 5 wt % or more) concentrations of solids. The mixing equipment design for these vessels includes both pulse jet mixers (PJM) and air spargers. This study assesses the impact of feed chemistry on the effectiveness of PJM mixing in the Standard High Solids Vessel Design (SHSVD). The overall purpose of this study is to complement the Properties that Matter document in helping to establish an acceptable physical simulant for full-scale testing. The specific objectives for this study are (1) to identify the relevant properties and behavior of the in-process tank waste that control the performance of the system being tested, (2) to assess the solubility limits of key components that are likely to precipitate or crystallize due to PJM and sparger interaction with the waste feeds, (3) to evaluate the impact of waste chemistry on rheology and agglomeration, (4) to assess the impact of temperature on rheology and agglomeration, (5) to assess the impact of organic compounds on PJM mixing, and (6) to provide the technical basis for using a physical-rheological simulant rather than a physical-rheological-chemical simulant for full-scale vessel testing. Among the conclusions reached are the following: The primary impact of precipitation or crystallization of salts due to interactions between PJMs or spargers and waste feeds is to increase the insoluble solids concentration in the slurries, which will increase the slurry yield stress. Slurry yield stress is a function of pH, ionic strength, insoluble solids concentration, and particle size. Ionic strength and chemical composition can affect particle size. Changes in temperature can affect SHSVD mixing through its effect on properties such as viscosity, yield stress, solubility, and vapor pressure, or chemical reactions that occur at high temperatures. Organic compounds will affect SHSVD mixing through their effect on properties such as rheology, particle agglomeration/size, particle density, and particle concentration.

  18. Consideration of Real World Factors Influencing Greenhouse Gas Emissions in ALPHA

    EPA Science Inventory

    Discuss a variety of factors that influence the simulated fuel economy and GHG emissions that are often overlooked and updates made to ALPHA based on actual benchmarking data observed across a range of vehicles and transmissions. ALPHA model calibration is also examined, focusin...

  19. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Izzuddin, Nur; Sunarsih; Priyanto, Agoes

    A marine diesel engine simulator, whose engine rotation is controlled and transmitted through a propeller shaft, offers a new methodology for self-propulsion tests that track fuel savings in real time as a vessel operates in the open seas. Considering this circumstance, this paper presents a real-time marine diesel engine simulator system that tracks the real performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate the fuel rate, engine rotating speed, and thrust and torque of the propeller, and thus achieve the target vessel speed. The input and output form a real-time control system for the fuel saving rate and propeller rotating speed, representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate the fuel saving obtained by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will be beneficial for users analyzing different vessel-speed conditions to obtain better characteristics and hence optimize the fuel saving rate.
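
    One standard building block of such an engine-propeller model is the open-water propeller relations, in which thrust and torque scale with the square of shaft speed; whether the authors use exactly this form is not stated in the abstract, and the coefficients below are illustrative:

      rho = 1025.0            # sea-water density, kg/m^3
      D = 4.0                 # propeller diameter, m
      KT, KQ = 0.25, 0.035    # thrust/torque coefficients at a fixed advance ratio

      def thrust(n):
          """Open-water thrust T = KT * rho * n^2 * D^4 (n in revolutions/s)."""
          return KT * rho * n**2 * D**4

      def torque(n):
          """Open-water torque Q = KQ * rho * n^2 * D^5 (n in revolutions/s)."""
          return KQ * rho * n**2 * D**5

      for n in (1.0, 1.5, 2.0):
          print(f"n = {n:.1f} rps: thrust = {thrust(n)/1e3:.0f} kN, "
                f"torque = {torque(n)/1e3:.0f} kN*m")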

  20. NRL 1989 Beam Propagation Studies in Support of the ATA Multi-Pulse Propagation Experiment

    DTIC Science & Technology

    1990-08-31

    The papers presented here were all written prior to the completion of the experiment. The first of these papers presents simulation results which modeled... beam stability and channel evolution for an entire five-pulse burst. The second paper describes a new air chemistry model used in the SARLAC... Experiment: a new air chemistry model for use in the propagation codes simulating the MPPE was developed by making analytic fits to benchmark runs with

  1. A partial entropic lattice Boltzmann MHD simulation of the Orszag-Tang vortex

    NASA Astrophysics Data System (ADS)

    Flint, Christopher; Vahala, George

    2018-02-01

    Karlin has introduced an analytically determined entropic lattice Boltzmann (LB) algorithm for Navier-Stokes turbulence. Here, this is partially extended to an LB model of magnetohydrodynamics, on using the vector distribution function approach of Dellar for the magnetic field (which is permitted to have field reversal). The partial entropic algorithm is benchmarked successfully against standard simulations of the Orszag-Tang vortex [Orszag, S.A.; Tang, C.M. J. Fluid Mech. 1979, 90 (1), 129-143].

  2. MCNP modelling of scintillation-detector gamma-ray spectra from natural radionuclides.

    PubMed

    Hendriks, P H G M; Maucec, M; de Meijer, R J

    2002-09-01

    Gamma-ray spectra of natural radionuclides are simulated for a BGO detector in a borehole geometry using the Monte Carlo code MCNP. All gamma-ray emissions from the decay of 40K and from the 232Th and 238U series are used to describe the source. A procedure is proposed that excludes time-consuming electron tracking in the less relevant areas of the geometry. The simulated gamma-ray spectra are benchmarked against laboratory data.

  3. Target Lagrangian kinematic simulation for particle-laden flows.

    PubMed

    Murray, S; Lightstone, M F; Tullis, S

    2016-09-01

    The target Lagrangian kinematic simulation method was motivated as a stochastic Lagrangian particle model that better synthesizes turbulence structure, relative to stochastic separated flow models. By this method, the trajectories of particles are constructed according to synthetic turbulent-like fields, which conform to a target Lagrangian integral timescale. In addition to recovering the expected Lagrangian properties of fluid tracers, this method is shown to reproduce the crossing trajectories and continuity effects, in agreement with an experimental benchmark.

  4. Study on the Depth, Rate, Shape, and Strength of Pulse with Cardiovascular Simulator.

    PubMed

    Lee, Ju-Yeon; Jang, Min; Shin, Sang-Hoon

    2017-01-01

    Pulse diagnosis is important in oriental medicine. The purpose of this study is to explain the mechanisms of the pulse with a cardiovascular simulator. The simulator comprises a pulse-generating part, a vessel part, and a measurement part. The pulse-generating part was composed of a motor, a slider-crank mechanism, and a piston pump. The vessel part, composed of the aorta and a radial artery, was fabricated from silicone to implement pulse wave propagation. The pulse parameters, such as depth, rate, shape, and strength, were simulated. Changing the mean pressure generated the floating pulse and the sunken pulse. Changing the heart rate generated the slow pulse and the rapid pulse. Controlling the superposition time of the reflected wave generated the string-like pulse and the slippery pulse. Changing the pulse pressure generated the vacuous pulse and the replete pulse. The generated pulses showed good agreement with the typical pulses.

  5. Simulation of magnetic island dynamics under resonant magnetic perturbation with the TEAR code and validation of the results on T-10 tokamak data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanov, N. V.; Kakurin, A. M.

    2014-10-15

    Simulation of the magnetic island evolution under a Resonant Magnetic Perturbation (RMP) in rotating T-10 tokamak plasma is presented, with the intent of experimentally validating the TEAR code. In the T-10 experiment chosen for simulation, the RMP consists of a stationary error field, the magnetic field of the eddy current in the resistive vacuum vessel, and the magnetic field of an externally applied, controlled halo current in the plasma scrape-off layer (SOL). The halo-current loop consists of a rail limiter, the plasma SOL, the vacuum vessel, and the external part of the circuit. Effects of plasma resistivity, viscosity, and RMP are taken into account in the TEAR code, which is based on the two-fluid MHD approximation. The radial distribution of the magnetic flux perturbation is calculated taking the externally applied RMP into account. Good agreement is obtained between the simulation results and experimental data for the cases of preprogrammed and feedback-controlled halo current in the plasma SOL.

  6. Investigating the Transonic Flutter Boundary of the Benchmark Supercritical Wing

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Chwalowski, Pawel

    2017-01-01

    This paper builds on the computational aeroelastic results published previously and generated in support of the second Aeroelastic Prediction Workshop for the NASA Benchmark Supercritical Wing configuration. The computational results are obtained using FUN3D, an unstructured-grid Reynolds-Averaged Navier-Stokes solver developed at the NASA Langley Research Center. The analysis results focus on understanding the dip in the transonic flutter boundary at a single Mach number (0.74), exploring an angle-of-attack range of -1 to 8 degrees and dynamic pressures from wind-off to beyond flutter onset. The rigid analysis results are examined for insights into the behavior of the aeroelastic system. Both static and dynamic aeroelastic simulation results are also examined.

  7. Test and Verification of AES Used for Image Encryption

    NASA Astrophysics Data System (ADS)

    Zhang, Yong

    2018-03-01

    In this paper, an image encryption program based on AES in cipher block chaining (CBC) mode was designed in the C language. The encryption/decryption speed and security performance of the AES-based image cryptosystem were tested and used to compare the proposed cryptosystem with some existing image cryptosystems based on chaos. Simulation results show that AES can be applied to image encryption, which refutes the widely accepted view that AES is not suitable for image encryption. This paper also suggests taking the speed of AES-based image encryption as the speed benchmark for image encryption algorithms; image encryption algorithms slower than this benchmark should be discarded in practical communications.
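
    The paper's program is written in C; as a sketch of the same construction in Python, AES in CBC mode can be applied to raw image bytes using the cryptography package (PKCS7 padding is added because CBC needs block-aligned input; the random byte string stands in for image data):

      import os
      from cryptography.hazmat.primitives import padding
      from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

      key, iv = os.urandom(32), os.urandom(16)   # AES-256 key and CBC initialisation vector
      pixels = os.urandom(512 * 512)             # stand-in for the raw bytes of an image

      # Pad to the 128-bit AES block size, then encrypt in CBC mode.
      padder = padding.PKCS7(128).padder()
      data = padder.update(pixels) + padder.finalize()
      enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
      ciphertext = enc.update(data) + enc.finalize()

      # Decrypt and unpad; the round trip must recover the original bytes.
      dec = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
      unpadder = padding.PKCS7(128).unpadder()
      plain = unpadder.update(dec.update(ciphertext) + dec.finalize()) + unpadder.finalize()
      assert plain == pixels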

  8. A method for increasing the homogeneity of the temperature distribution during magnetic fluid hyperthermia with a Fe-Cr-Nb-B alloy in the presence of blood vessels

    NASA Astrophysics Data System (ADS)

    Tang, Yundong; Flesch, Rodolfo C. C.; Jin, Tao

    2017-06-01

    Magnetic hyperthermia ablates tumor cells with the thermal energy released by magnetic nanoparticles (MNPs) under an external alternating magnetic field. Blood vessels (BVs) within the tumor region can reduce treatment effectiveness due to the cooling effect of blood flow. This paper investigates the cooling effect of BVs on the temperature field of malignant tumor regions using a complex geometric model and numerical simulation. For the model, the Navier-Stokes equation for blood flow is combined with the Pennes bio-heat transfer equation for human tissue. The effects on treatment temperature caused by two different BV distributions inside a mammary tumor are analyzed through numerical simulation under different flow-rate conditions, considering a Fe-Cr-Nb-B alloy with a low Curie temperature ranging from 42 °C to 45 °C. Numerical results show that the multi-vessel system has a more pronounced cooling effect on the hyperthermia temperature field than the single-vessel one. Simulation results also show that the temperature field within the tumor area is influenced by the velocity and diameter of the BVs. To minimize the cooling effect, this article proposes a treatment method based on increasing the thermal energy provided to the MNPs, combined with the adoption of the low-Curie-temperature particles recently reported in the literature. Results demonstrate that this approach noticeably improves the uniformity of the temperature field and shortens the treatment time in a Fe-Cr-Nb-B system, thus reducing side effects for the patient.
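
    The model above couples the Navier-Stokes equations with the Pennes bio-heat equation; a one-dimensional explicit finite-difference toy of the Pennes equation alone, with the perfusion term standing in for vessel cooling, illustrates the temperature-field computation (all constants are textbook-order values, not the paper's):

      import numpy as np

      # Textbook-order tissue constants (illustrative, not the paper's values).
      k, rho, c = 0.5, 1050.0, 3600.0        # W/(m K), kg/m^3, J/(kg K)
      wb, rhob, cb = 0.004, 1060.0, 3770.0   # perfusion 1/s; blood density, heat cap.
      Ta, Q = 37.0, 1.0e5                    # arterial temp (C); MNP source, W/m^3

      nx, dx, dt, steps = 101, 1.0e-4, 0.005, 60000
      T = np.full(nx, 37.0)
      heated = slice(40, 61)                 # region containing the nanoparticles

      for _ in range(steps):
          lap = (np.roll(T, -1) - 2.0 * T + np.roll(T, 1)) / dx**2
          src = np.zeros(nx)
          src[heated] = Q
          perf = wb * rhob * cb * (Ta - T)   # blood-perfusion cooling (Pennes term)
          T += dt * (k * lap + perf + src) / (rho * c)
          T[0] = T[-1] = 37.0                # body-temperature boundaries

      print(f"peak tissue temperature: {T.max():.1f} C")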

  9. Numerical investigations of the unsteady blood flow in the end-to-side arteriovenous fistula for hemodialysis.

    PubMed

    Jodko, Daniel; Obidowski, Damian; Reorowicz, Piotr; Jóźwik, Krzysztof

    2016-01-01

    The aim of this study was to investigate the blood flow in an end-to-side arteriovenous (a-v) fistula, taking into account its pulsating nature and the patient-specific geometry of the blood vessels. Computational Fluid Dynamics (CFD) methods were used for this analysis. DICOM images of the fistula, obtained from angio-computed tomography, were the source of the data used to develop a 3D geometrical model of the fistula. The model was meshed, and the ANSYS CFX v. 15.0 code was then used to simulate the flow in the vessels under analysis. Mesh independence tests were conducted. A non-Newtonian rheological model of blood and the Shear Stress Transport model of turbulence were employed. Blood vessel walls were assumed to be rigid. Flow patterns, velocity fields, the volume flow rate, and the wall shear stress (WSS) propagation on particular blood vessel walls were shown versus time. The maximal blood velocity was identified in the anastomosis, the place where the artery is connected to the vein. The flow rate was calculated for all veins receiving blood. The blood flow in the geometrically complicated a-v fistula was simulated. The values and oscillations of the WSS are largest in the anastomosis, much lower in the artery, and lowest in the cephalic vein. A strong influence of the mesh on the results for the maximal and area-averaged WSS was shown. The relation between simulations of the pulsating flow and of the stationary flow under time-averaged flow conditions was presented.

  10. Visible-light OCT to quantify retinal oxygen metabolism (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Zhang, Hao F.; Yi, Ji; Chen, Siyu; Liu, Wenzhong; Soetikno, Brian T.

    2016-03-01

    We explored, both numerically and experimentally, whether OCT can be a good candidate to accurately measure retinal oxygen metabolism. We first used statistical methods to numerically simulate photon transport in the retina to mimic OCT working in different spectral ranges. We then analyzed the accuracy of OCT oximetry subject to parameter variations such as vessel size, pigmentation, and oxygenation. We further developed an experimental OCT system based on the spectral range identified by our simulation work. We applied the newly developed OCT to measure both retinal hemoglobin oxygen saturation (sO2) and retinal blood flow. After obtaining the retinal sO2 and blood velocity, we further measured retinal vessel diameter and calculated the retinal oxygen metabolic rate (MRO2). To test the capability of our OCT, we imaged wild-type Long-Evans rats ventilated with both normal air and air mixtures with various oxygen concentrations. Our simulation suggested that OCT working within the visible spectral range is able to provide accurate measurement of retinal MRO2 using inverse Fourier transform spectral reconstruction. We called this newly developed technology vis-OCT and showed that it was able to measure the sO2 value in every major retinal vessel around the optic disk as well as in retinal microvessels. When breathing normal air, the averaged sO2 in arterial and venous blood in Long-Evans rats was measured to be 95% and 72%, respectively. When we challenged the rats with air mixtures of different oxygen concentrations, the vis-OCT measurements followed analytical models of retinal oxygen diffusion and pulse oximeter readings well.

  11. On the linear stability of blood flow through model capillary networks.

    PubMed

    Davis, Jeffrey M

    2014-12-01

    Under the approximation that blood behaves as a continuum, a numerical implementation is presented to analyze the linear stability of capillary blood flow through model tree and honeycomb networks that are based on the microvascular structures of biological tissues. The tree network is comprised of a cascade of diverging bifurcations, in which a parent vessel bifurcates into two descendent vessels, while the honeycomb network also contains converging bifurcations, in which two parent vessels merge into one descendent vessel. At diverging bifurcations, a cell partitioning law is required to account for the nonuniform distribution of red blood cells as a function of the flow rate of blood into each descendent vessel. A linearization of the governing equations produces a system of delay differential equations involving the discharge hematocrit entering each network vessel and leads to a nonlinear eigenvalue problem. All eigenvalues in a specified region of the complex plane are captured using a transformation based on contour integrals to construct a linear eigenvalue problem with identical eigenvalues, which are then determined using a standard QR algorithm. The predicted value of the dimensionless exponent in the cell partitioning law at the instability threshold corresponds to a supercritical Hopf bifurcation in numerical simulations of the equations governing unsteady blood flow. Excellent agreement is found between the predictions of the linear stability analysis and nonlinear simulations. The relaxation of the assumption of plug flow made in previous stability analyses typically has a small, quantitative effect on the stability results that depends on the specific network structure. This implementation of the stability analysis can be applied to large networks with arbitrary structure provided only that the connectivity among the network segments is known.

  12. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Effect of Rolling Massage on Particle Moving Behaviour in Blood Vessels

    NASA Astrophysics Data System (ADS)

    Yi, Hou-Hui; Fan, Li-Juan; Yang, Xiao-Feng; Chen, Yan-Yan

    2008-09-01

    The rolling massage manipulation is a classic Chinese massage technique, which is expected to eliminate many diseases. Here, the effect of rolling massage on the motion of particles in blood vessels is studied by lattice Boltzmann simulation. The simulation results show that the particle motion depends on the rolling velocity and on the distance between the particle position and the rolling position. The average values, including the particle translational velocity and angular velocity, increase almost linearly as the rolling velocity increases. The result is helpful for understanding the mechanism of the massage and for developing rolling techniques.

  13. Investigation of laser Doppler techniques using the Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ruetten, Walter; Gellekum, Thomas; Jessen, Katrin

    1995-01-01

    Laser Doppler techniques are increasingly used in research and clinical applications to study perfusion phenomena in the skin, yet the influences of changing scattering parameters and geometry on the measured perfusion are not well explored. To investigate these influences, a simulation program based on the Monte Carlo method was developed, which is capable of determining the Doppler spectra caused by moving red blood cells. The simulation model allows for the definition of arbitrary networks of blood vessels with individual velocities. The volume is represented by a voxel tree with adaptive spatial resolution, which contains references to the optical properties and is used to store the location-dependent photon fluence determined during the simulation. Two evaluation methods for Doppler spectra from biological tissue described in the literature were investigated with the simulation program. The results obtained suggest that both methods give a measure of perfusion nearly proportional to the velocity of the red blood cells. However, simulations done with different geometries of the blood vessels seem to indicate a nonlinear behavior with respect to the concentration of red blood cells in the measurement volume. Nevertheless, these simulation results may help in the interpretation of measurements obtained from devices using the investigated evaluation methods.

  14. The Eighth Industrial Fluids Properties Simulation Challenge

    PubMed Central

    Schultz, Nathan E.; Ahmad, Riaz; Brennan, John K.; Frankel, Kevin A.; Moore, Jonathan D.; Moore, Joshua D.; Mountain, Raymond D.; Ross, Richard B.; Thommes, Matthias; Shen, Vincent K.; Siderius, Daniel W.; Smith, Kenneth D.

    2016-01-01

    The goal of the eighth industrial fluid properties simulation challenge was to test the ability of molecular simulation methods to predict the adsorption of organic adsorbates in activated carbon materials. In particular, the eighth challenge focused on the adsorption of perfluorohexane in the activated carbon BAM-109. Entrants were challenged to predict the adsorption in the carbon at 273 K and relative pressures of 0.1, 0.3, and 0.6. The predictions were judged by comparison to a benchmark set of experimentally determined values. Overall good agreement and consistency were found between the predictions of most entrants. PMID:27840542

  15. Monte Carlo errors with less errors

    NASA Astrophysics Data System (ADS)

    Wolff, Ulli; Alpha Collaboration

    2004-01-01

    We explain in detail how to estimate mean values and assess statistical errors for arbitrary functions of elementary observables in Monte Carlo simulations. The method is to estimate and sum the relevant autocorrelation functions, which is argued to produce more reliable error estimates than binning techniques and hence to help toward a better exploitation of expensive simulations. An effective integrated autocorrelation time is computed, which is suitable for benchmarking the efficiencies of simulation algorithms with regard to specific observables of interest. Matlab code implementing the method is offered for download; it can also combine independent runs (replica), allowing one to judge their consistency.
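
    A minimal Python sketch of the approach is shown below (the paper itself provides Matlab code with an automatic windowing procedure; the fixed summation window used here is a crude stand-in for that): estimate the autocovariance, sum it into an integrated autocorrelation time tau_int, and inflate the naive error of the mean by 2*tau_int.

        import numpy as np

        def mean_with_autocorr_error(x, w_max=None):
            # Mean and error of a correlated MC series via summed autocorrelations;
            # the fixed window w_max is a crude stand-in for Wolff's automatic one.
            x = np.asarray(x, dtype=float)
            n = x.size
            dx = x - x.mean()
            if w_max is None:
                w_max = n // 50
            gamma = np.array([np.dot(dx[:n - t], dx[t:]) / (n - t)
                              for t in range(w_max + 1)])        # autocovariance
            rho = gamma / gamma[0]
            tau_int = 0.5 + rho[1:].sum()                        # integrated autocorr. time
            err = np.sqrt(2.0 * tau_int * gamma[0] / n)          # naive error x sqrt(2 tau)
            return x.mean(), err, tau_int

        # Check on an AR(1) chain with known tau_int = (1 + phi) / (2 (1 - phi)) = 9.5
        rng = np.random.default_rng(3)
        phi, x = 0.9, np.zeros(100_000)
        for i in range(1, x.size):
            x[i] = phi * x[i - 1] + rng.standard_normal()
        m, e, tau = mean_with_autocorr_error(x)
        print(f"mean = {m:.4f} +/- {e:.4f}, tau_int ~ {tau:.1f}")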

  16. CAFE simulation of columnar-to-equiaxed transition in Al-7wt%Si alloys directionally solidified under microgravity

    NASA Astrophysics Data System (ADS)

    Liu, D. R.; Mangelinck-Noël, N.; Gandin, Ch-A.; Zimmermann, G.; Sturz, L.; Nguyen Thi, H.; Billia, B.

    2016-03-01

    A two-dimensional multi-scale cellular automaton-finite element (CAFE) model is used to simulate grain structure evolution and microsegregation formation during the solidification of refined Al-7wt%Si alloys under microgravity. The CAFE simulations are first qualitatively compared with the benchmark experimental data obtained under microgravity. Qualitative agreement is found for the position of the columnar-to-equiaxed transition (CET) and for its mode (sharp or progressive). Further comparisons of the distributions of grain elongation factor and equivalent diameter reveal fair quantitative agreement.

  17. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  18. A conservative approach to parallelizing the Sharks World simulation

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Riffe, Scott E.

    1990-01-01

    The parallelization of the Sharks World, a benchmark problem for parallel simulation, is described. The solution is conservative, in the sense that no state information is saved and no 'rollbacks' occur. The approach illustrates both the principal advantage and the principal disadvantage of conservative parallel simulation. The advantage is that, by exploiting lookahead, an approach was found that dramatically improves on the serial execution time and also achieves excellent speedups. The disadvantage is that if the model rules are changed in a way that destroys the lookahead, it is difficult to modify the solution to accommodate the changes.
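
    The heart of any conservative approach is a safe-time bound derived from lookahead: a logical process (LP) may process events strictly earlier than (incoming link clock + lookahead) without risking causality violations, so no rollback machinery is needed. The following is a toy, sequentially executed sketch of that bound, a hypothetical structure rather than the paper's Sharks World code; note that it stalls once the horizons stop advancing, which is why real conservative protocols also circulate null messages.

        import heapq

        LOOKAHEAD = 1.0   # hypothetical minimum delay between logical processes

        class LP:
            def __init__(self, name):
                self.name = name
                self.events = []          # pending (timestamp, payload) events
                self.link_clock = 0.0     # latest timestamp heard from the other LP

            def safe_time(self):
                # Events below this bound cannot be invalidated by future input.
                return self.link_clock + LOOKAHEAD

            def process_safe(self, other):
                while self.events and self.events[0][0] < self.safe_time():
                    t, payload = heapq.heappop(self.events)
                    print(f"{self.name} @ t={t:.1f}: {payload}")
                    heapq.heappush(other.events, (t + LOOKAHEAD, payload))
                    other.link_clock = max(other.link_clock, t)

        a, b = LP("A"), LP("B")
        heapq.heappush(a.events, (0.0, "shark moves"))
        heapq.heappush(b.events, (0.5, "fish moves"))
        for _ in range(4):
            a.process_safe(b)
            b.process_safe(a)
        # The run eventually stalls when neither safe horizon advances; real
        # conservative protocols exchange null messages to break such deadlocks.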

  19. Benchmarking Gas Path Diagnostic Methods: A Public Approach

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Bird, Jeff; Davison, Craig; Volponi, Al; Iverson, R. Eugene

    2008-01-01

    Recent technology reviews have identified the need for objective assessments of engine health management (EHM) technology. The need is two-fold: technology developers require relevant data and problems to design and validate new algorithms and techniques, while engine system integrators and operators need practical tools to direct development and then evaluate the effectiveness of proposed solutions. This paper presents a publicly available gas path diagnostic benchmark problem developed by the Propulsion and Power Systems Panel of The Technical Cooperation Program (TTCP) to help address these needs. The problem is coded in MATLAB (The MathWorks, Inc.) and coupled with a non-linear turbofan engine simulation to produce "snap-shot" measurements, with relevant noise levels, as if collected from a fleet of engines over their lifetime of use. Each engine within the fleet experiences a unique operating and deterioration profile and may encounter randomly occurring gas path faults, including sensor, actuator, and component faults. The challenge to the EHM community is to develop gas path diagnostic algorithms that reliably perform fault detection and isolation. An example solution to the benchmark problem is provided along with associated evaluation metrics. A plan is presented to disseminate this benchmark problem to the engine health management technical community and to invite technology solutions.
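
    As a toy illustration of the kind of snapshot-based diagnosis the benchmark exercises, the sketch below flags a fault when any noise-normalized residual against a healthy baseline exceeds a threshold. The baseline, noise levels, and injected bias are all hypothetical values, and this simple test is only a stand-in for the far more elaborate algorithms the benchmark is meant to evaluate.

        import numpy as np

        rng = np.random.default_rng(4)
        # Hypothetical healthy-engine baseline and sensor noise levels for four
        # normalized gas path measurements -- illustrative values only.
        baseline = np.array([1.00, 1.00, 1.00, 1.00])
        noise_sd = np.array([0.004, 0.004, 0.006, 0.006])

        def detect_fault(snapshot, threshold=3.0):
            # Flag a fault when any noise-normalized residual exceeds the
            # threshold; fleet EHM diagnostics are far more elaborate than this.
            residual = (snapshot - baseline) / noise_sd
            return bool(np.abs(residual).max() > threshold), residual

        healthy = baseline + noise_sd * rng.standard_normal(4)
        faulty = healthy + np.array([0.0, 0.0, 0.05, 0.0])   # injected bias fault
        for label, snap in (("healthy", healthy), ("faulty", faulty)):
            flag, res = detect_fault(snap)
            print(label, "-> fault detected:", flag, "residuals:", res.round(1))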

  20. PPI4DOCK: large scale assessment of the use of homology models in free docking over more than 1000 realistic targets.

    PubMed

    Yu, Jinchao; Guerois, Raphaël

    2016-12-15

    Protein-protein docking methods are of great importance for understanding interactomes at the structural level. It has become increasingly appealing to use not only experimental structures but also homology models of unbound subunits as input for docking simulations. However, a large-scale assessment of the success of rigid-body free docking methods on homology models has so far been missing. We explored how comparative modelling of unbound subunits can be used to expand docking benchmark datasets. Starting from a collection of 3157 non-redundant heterodimers with high-resolution X-ray structures, we developed the PPI4DOCK benchmark, containing 1417 docking targets based on unbound homology models. Rigid-body docking by Zdock showed that for 1208 cases (85.2%) at least one correct decoy was generated, emphasizing the efficiency of rigid-body docking in generating correct assemblies. Overall, the PPI4DOCK benchmark contains a large set of realistic cases and provides new ground for assessing docking and scoring methodologies. Benchmark sets can be downloaded from http://biodev.cea.fr/interevol/ppi4dock/ Contact: guerois@cea.fr Supplementary information: Supplementary data are available at Bioinformatics online.
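
    Decoy correctness in assessments like this is judged against the reference complex using RMSD-based criteria (the established criteria also consider interface residues and native contacts, which are omitted here). At the core of any such measure is an optimal rigid superposition; a minimal Kabsch-algorithm RMSD sketch on toy coordinates follows.

        import numpy as np

        def kabsch_rmsd(P, Q):
            # RMSD between two point sets after optimal rigid superposition
            # (Kabsch algorithm); P and Q are (n, 3) coordinate arrays.
            P = P - P.mean(axis=0)
            Q = Q - Q.mean(axis=0)
            U, S, Vt = np.linalg.svd(P.T @ Q)
            d = np.sign(np.linalg.det(Vt.T @ U.T))        # avoid improper rotation
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            return np.sqrt(((P @ R.T - Q) ** 2).sum() / len(P))

        rng = np.random.default_rng(5)
        P = rng.standard_normal((30, 3))                   # toy 'decoy' coordinates
        theta = 0.7                                        # arbitrary rotation angle
        Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
        Q = P @ Rz.T + 0.1 * rng.standard_normal((30, 3))  # rotated, noisy copy
        print(round(kabsch_rmsd(P, Q), 3))                 # ~0.17: the noise, not the rotation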
