Sample records for four-component streamline-based simulator

  1. Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.

    2010-11-01

We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible, lightweight component model that streamlines the integration of standalone codes into coupled simulations. Standalone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation; configuration, task, and data management; asynchronous event management; simulation monitoring; and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model are presented, along with an overview of several use cases.

  2. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: (1) aerospace system and component representation using a hierarchical object-oriented component model that enables the use of multimodels and enforces component interoperability; (2) a collaborative software environment that streamlines the process of developing, sharing, and integrating aerospace design and analysis models; and (3) development of a distributed infrastructure that enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  3. Regimes of Flow over Complex Structures of Endothelial Glycocalyx: A Molecular Dynamics Simulation Study.

    PubMed

    Jiang, Xi Zhuo; Feng, Muye; Ventikos, Yiannis; Luo, Kai H

    2018-04-10

Flow patterns on surfaces grafted with complex structures play a pivotal role in many engineering and biomedical applications. In this research, large-scale molecular dynamics (MD) simulations are conducted to study the flow over complex surface structures of an endothelial glycocalyx layer. A detailed structure of the glycocalyx has been adopted, and the flow/glycocalyx system comprises about 5,800,000 atoms. Four cases involving varying external forces and modified glycocalyx configurations are constructed to reveal intricate fluid behaviour. Flow profiles, including temporal evolutions and spatial distributions of velocity, are illustrated. Moreover, streamline length and vorticity distributions under the four scenarios are compared and discussed to elucidate the effects of external forces and glycocalyx configurations on flow patterns. Results show that sugar chain configurations affect streamline length distributions but their impact on vorticity distributions is statistically insignificant, whilst the influence of the external forces on both streamline length and vorticity distributions is negligible. Finally, a regime diagram for flow over complex surface structures is proposed to categorise flow patterns.

  4. 3D aquifer characterization using stochastic streamline calibration

    NASA Astrophysics Data System (ADS)

    Jang, Minchul

    2007-03-01

In this study, a new inverse approach, stochastic streamline calibration, is proposed. Using both a streamline concept and a stochastic technique, stochastic streamline calibration optimizes an identified field to fit given observation data in an exceptionally fast and stable fashion. In stochastic streamline calibration, streamlines are adopted as basic elements not only for describing fluid flow but also for identifying the permeability distribution. Based on the streamline-based inversion by Agarwal et al. [Agarwal B, Blunt MJ. Streamline-based method with full-physics forward simulation for history matching performance data of a North Sea field. SPE J 2003;8(2):171-80] and Wang and Kovscek [Wang Y, Kovscek AR. Streamline approach for history matching production data. SPE J 2000;5(4):353-62], permeability is modified along streamlines rather than at individual gridblocks. Permeabilities in the gridblocks through which a streamline passes are adjusted by multiplication by a factor chosen so that the flow and transport properties of the streamline are matched. This enables the inverse process to achieve fast convergence. In addition, equipped with a stochastic module, the proposed technique calibrates the identified field in a stochastic manner while incorporating spatial information into the field. This prevents the inverse process from becoming stuck in local minima and helps the search for a globally optimized solution. Simulation results indicate that stochastic streamline calibration identifies an unknown permeability field exceptionally quickly. More notably, the identified permeability distribution reflects realistic geological features, which had not been achieved in the original work by Agarwal et al. owing to the large modifications along streamlines for matching production data only. The model constructed by stochastic streamline calibration forecast plume transport similar to that of a reference model. These results suggest that the proposed approach can be applied to the construction of aquifer models and the forecasting of aquifer performance measures of interest.
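The core update described above, multiplying permeability in the gridblocks along a streamline by a factor that matches the streamline's flow properties, can be sketched in a few lines. This is a minimal illustration with hypothetical values, assuming travel time along a streamline scales roughly inversely with permeability; the actual method also includes the stochastic module and spatial-correlation handling.

```python
import numpy as np

def calibrate_along_streamline(perm, cells, t_obs, t_sim):
    """Multiply permeability in the gridblocks a streamline passes
    through so that its simulated travel time moves toward the
    observed one (travel time assumed ~ 1/permeability)."""
    factor = t_sim / t_obs          # > 1 speeds the streamline up
    for ij in cells:
        perm[ij] *= factor
    return perm

# Hypothetical 2D field (mD) and a streamline crossing three gridblocks.
perm = np.full((4, 4), 100.0)
cells = [(0, 0), (1, 1), (2, 2)]
perm = calibrate_along_streamline(perm, cells, t_obs=80.0, t_sim=100.0)
```

Gridblocks on the streamline are scaled up (the simulated tracer arrived late, so permeability must rise), while off-streamline blocks are untouched; the stochastic module would then smooth these local corrections into a geostatistically consistent field.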

  5. The design of the Comet streamliner: An electric land speed record motorcycle

    NASA Astrophysics Data System (ADS)

    McMillan, Ethan Alexander

The development of the land speed record electric motorcycle streamliner, the Comet, is discussed herein. The design process includes a detailed literature review of past and current motorcycle streamliners, highlighting the main components of such a vehicle's design while providing baseline data for performance comparisons. A new approach to balancing a streamliner at low speeds is also addressed, a system henceforth referred to as landing gear, which has proven an effective means of allowing the driver to control the low-speed instabilities of the vehicle with relative ease compared to traditional designs. This is accompanied by a dynamic stability analysis conducted on a test chassis that was developed primarily to understand the handling dynamics of streamliners, while also providing a test bed for implementation of the landing gear system and a means to familiarize the driver with the operation and handling of such a vehicle. Data gathered through GPS-based velocity tracking, accelerometers, and a linear potentiometer provided a means to validate a dynamic stability analysis of the weave and wobble modes of the vehicle through linearization of a streamliner model developed in the BikeSIM software suite. Results show agreement between the experimental data and the simulation, indicating that the conventional recumbent design of a streamliner chassis is in fact highly stable throughout the performance envelope beyond extremely low speeds. A computational fluid dynamics study was also performed and used in developing the body of the Comet, with a series of tests conducted to develop a shape that was both practical to transport and highly efficient. By creating a hybrid airfoil from a NACA 0018 and a NACA 66-018, a drag coefficient of 0.1 and a frontal area of 0.44 m² were found for the final design. 
Utilizing a performance model based on the proposed vehicle's motor, its rolling resistance, and the body's aerodynamic drag, the top speed is predicted to be 226 mph. Further design considerations are also addressed, including development of the component-level layout of the motorcycle, weighing factors such as safety and ease of fabrication against performance and accessibility. At the time of writing, fabrication of the Comet had begun, and it is the intent of the author that the finished product compete in the 2016 Bonneville Motorcycle Speed Trials to set the first world record for a single-track electric motorcycle streamliner.
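A performance model of the kind described above can be sketched by balancing available motor power against aerodynamic drag and rolling-resistance power. The drag coefficient and frontal area below are the abstract's reported figures for the Comet's body; the motor power, vehicle mass, and rolling-resistance coefficient are assumed placeholders, so the resulting speed is only indicative of the method, not the thesis's 226 mph prediction.

```python
# Assumed placeholders (not from the abstract): motor power, mass, Crr.
RHO, CD, AREA = 1.225, 0.10, 0.44      # air density kg/m^3, Cd, frontal area m^2
P_MOTOR = 30e3                         # W, assumed continuous motor power
F_ROLL = 0.015 * 300.0 * 9.81          # N, assumed Crr * mass * g

def power_required(v):
    """Aerodynamic drag power plus rolling-resistance power at speed v (m/s)."""
    return 0.5 * RHO * CD * AREA * v**3 + F_ROLL * v

def top_speed(p_avail, lo=1.0, hi=200.0):
    """Bisect for the speed where required power equals available power."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if power_required(mid) < p_avail:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

v = top_speed(P_MOTOR)     # m/s
v_mph = v * 2.23694
```

With these placeholder inputs the balance lands near 100 m/s (roughly 220 mph), illustrating why the very low drag area of the hybrid airfoil body dominates the top-speed prediction.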

  6. STEADY GENERAL RELATIVISTIC MAGNETOHYDRODYNAMIC INFLOW/OUTFLOW SOLUTION ALONG LARGE-SCALE MAGNETIC FIELDS THAT THREAD A ROTATING BLACK HOLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pu, Hung-Yi; Nakamura, Masanori; Hirotani, Kouichi

    2015-03-01

General relativistic magnetohydrodynamic (GRMHD) flows along magnetic fields threading a black hole can be divided into inflow and outflow parts, according to the result of the competition between the black hole gravity and magneto-centrifugal forces along the field line. Here we present the first self-consistent, semi-analytical solution for a cold, Poynting flux-dominated (PFD) GRMHD flow, which passes all four critical (inner and outer Alfvén and fast magnetosonic) points along a parabolic streamline. By assuming that the dominating (electromagnetic) component of the energy flux per flux tube is conserved at the surface where the inflow and outflow are separated, the outflow part of the solution can be constrained by the inflow part. The semi-analytical method can provide fiducial and complementary solutions for GRMHD simulations around a rotating black hole, given that the black hole spin, global streamline, and magnetization (i.e., the mass loading at the inflow/outflow separation) are prescribed. For reference, we demonstrate quantitative consistency with the work by McKinney.

  7. Computational fluid dynamics modeling of laminar, transitional, and turbulent flows with sensitivity to streamline curvature and rotational effects

    NASA Astrophysics Data System (ADS)

    Chitta, Varun

Modeling of complex flows involving the combined effects of flow transition and streamline curvature is considered in this research effort, using two advanced turbulence models: one in the Reynolds-averaged Navier-Stokes (RANS) category and the other in the hybrid RANS-large eddy simulation (LES) category. In the first part of the research, a new scalar eddy-viscosity model (EVM) is proposed, designed to exhibit physically correct responses to flow transition, streamline curvature, and system rotation effects. The four-equation model developed herein is a curvature-sensitized version of a commercially available three-equation transition-sensitive model. The physical effects of rotation and curvature (RC) enter the model through the added transport equation, analogous to a transverse turbulent velocity scale. The eddy viscosity has been redefined such that the proposed model is constrained to reduce to the original transition-sensitive model in nonrotating flows or in regions with negligible RC effects. In the second part of the research, the developed four-equation model is combined with an LES technique using a new hybrid modeling framework, dynamic hybrid RANS-LES (DHRL). The new framework is highly generalized, allowing coupling of any desired LES model with any given RANS model, and addresses several deficiencies inherent in most current hybrid models. In the present research effort, the DHRL model comprises the proposed four-equation model as the RANS component and the MILES scheme as the LES component. Both models were implemented into a commercial computational fluid dynamics (CFD) solver and tested on a number of engineering and generic flow problems. Results from both the RANS and hybrid models show successful resolution of the combined effects of transition and curvature with reasonable engineering accuracy, and for only a small increase in computational cost. 
In addition, results from the hybrid model indicate significant levels of turbulent fluctuations in the flowfield and improved accuracy compared to RANS model predictions, obtained at a significantly reduced computational cost compared to full LES. The results suggest that the advanced turbulence modeling techniques presented in this research effort have potential as practical tools for solving low/high-Re flows over blunt/curved bodies for the prediction of transition and RC effects.

  8. Second-Moment RANS Model Verification and Validation Using the Turbulence Modeling Resource Website (Invited)

    NASA Technical Reports Server (NTRS)

    Eisfeld, Bernhard; Rumsey, Chris; Togiti, Vamshi

    2015-01-01

    The implementation of the SSG/LRR-omega differential Reynolds stress model into the NASA flow solvers CFL3D and FUN3D and the DLR flow solver TAU is verified by studying the grid convergence of the solution of three different test cases from the Turbulence Modeling Resource Website. The model's predictive capabilities are assessed based on four basic and four extended validation cases also provided on this website, involving attached and separated boundary layer flows, effects of streamline curvature and secondary flow. Simulation results are compared against experimental data and predictions by the eddy-viscosity models of Spalart-Allmaras (SA) and Menter's Shear Stress Transport (SST).

  9. Two-dimensional Lagrangian simulation of suspended sediment

    USGS Publications Warehouse

    Schoellhamer, David H.

    1988-01-01

A two-dimensional, laterally averaged model for suspended sediment transport in steady, gradually varied flow, based on the Lagrangian reference frame, is presented. The layered Lagrangian transport model (LLTM) for suspended sediment computes laterally averaged concentrations. The elevations of nearly horizontal streamlines and the simulation time step are selected to optimize model stability and efficiency. The computational elements are parcels of water that are moved along the streamlines in the Lagrangian sense and are mixed with neighboring parcels. Three applications show that the LLTM can accurately simulate theoretical and empirical nonequilibrium suspended sediment distributions and slug injections of suspended sediment in a laboratory flume.
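The parcel-mixing step implied by the LLTM description, parcels advected along streamlines and mixed with their neighbors, can be sketched as a mass-conserving exchange between adjacent layers. This is an illustrative scheme only, not the model's actual formulation; `alpha` is a hypothetical mixing fraction.

```python
import numpy as np

def mix_layers(conc, alpha):
    """One explicit vertical-mixing step: each parcel exchanges a
    fraction alpha of the concentration difference with the layers
    above and below (mass-conserving, no-flux boundaries)."""
    c = conc.copy()
    c[1:-1] += alpha * (conc[2:] - 2 * conc[1:-1] + conc[:-2])
    c[0]    += alpha * (conc[1] - conc[0])
    c[-1]   += alpha * (conc[-2] - conc[-1])
    return c

conc = np.array([0.0, 0.0, 1.0, 0.0, 0.0])   # slug injection, middle layer
for _ in range(10):
    conc = mix_layers(conc, alpha=0.2)       # alpha <= 0.5 keeps it stable
```

After a few steps the slug spreads toward the boundaries while total mass stays constant, which is the qualitative behavior the flume slug-injection tests verify.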

  10. Turbulence modeling in simulation of gas-turbine flow and heat transfer.

    PubMed

    Brereton, G; Shih, T I

    2001-05-01

The popular k-epsilon type two-equation turbulence models, which are calibrated with experimental data from simple shear flows, are analyzed for their ability to predict flows involving shear and an extra strain: flow with shear and rotation, and flow with shear and streamline curvature. The analysis is based on comparisons between model predictions and those from measurements and large-eddy simulations of homogeneous flows involving shear and an extra strain, either from rotation or from streamline curvature. Parameters are identified that show the conditions under which the performance of k-epsilon type models can be expected to be poor.

  11. Experimental evaluation of neural probe’s insertion induced injury based on digital image correlation method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Wenguang, E-mail: zhwg@sjtu.edu.cn; Ma, Yakun; Li, Zhengwei

Purpose: The application of neural probes in the clinic has been challenged by the probes' short lifetime when implanted into brain tissue. The primary goal is to develop an evaluation system for testing brain tissue injury induced by neural probe insertion using a microscope-based digital image correlation method. Methods: A brain tissue phantom made of silicone rubber with a speckle pattern on its surface was fabricated. To obtain the optimal speckle pattern, the mean intensity gradient parameter was used for quality assessment. The designed testing system consists of three modules: (a) a load module for simulating the neural electrode implantation process; (b) a data acquisition module to capture micrographs of the speckle pattern and to obtain reactive forces during insertion of the probe; (c) a postprocessing module for extracting tissue deformation information from the captured speckle patterns. On the basis of the evaluation system, the effects of probe wedge angle, insertion speed, and probe streamline on insertion-induced tissue injury were investigated. Results: The optimal quality speckle pattern can be attained with the following fabrication parameters: spin coating rate of 1000 r/min; silicone rubber component A : silicone rubber component B : softener : graphite = 5 ml : 5 ml : 2 ml : 0.6 g. The probe wedge angle has a significant effect on tissue injury. Compared to wedge angles of 40° and 20°, the maximum principal strain for the 60° wedge angle was increased by 40.3% and 87.5%, respectively; compared with a relatively high speed (500 μm/s), the maximum principal strain within the tissue induced by a slow insertion speed (100 μm/s) was increased by 14.3%; the insertion force required by a probe with a convex streamline was smaller than that of a traditional probe. 
Based on the experimental results, a novel neural probe with a rounded tip covered by a biodegradable silk protein coating and a convex streamline was proposed, which reduces both insertion-induced and micromotion-induced tissue injury. Conclusions: The established evaluation system provides a simulation environment for testing brain tissue injury produced by various insertion conditions. At the same time, it eliminates the adverse effect of biological factors on tissue deformation during the experiment, improving the repeatability of measurement results. As a result, the evaluation system will support novel neural probe designs that can reduce acute tissue injury during implantation of the probe.

  12. Designing simulator-based training: an approach integrating cognitive task analysis and four-component instructional design.

    PubMed

    Tjiam, Irene M; Schout, Barbara M A; Hendrikx, Ad J M; Scherpbier, Albert J J M; Witjes, J Alfred; van Merriënboer, Jeroen J G

    2012-01-01

Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance, which consists partly of unconscious automatic processes that experts are not always able to explicate, simulator developers should collaborate with educational experts and physicians in developing efficient and effective training programmes. This article presents an approach to designing simulator-based skill training comprising cognitive task analysis integrated with instructional design according to the four-component instructional design model. This theory-driven approach is illustrated by a description of how it was used in the development of simulator-based training for the nephrostomy procedure.

  13. Can streamlined multi-criteria decision analysis be used to implement shared decision making for colorectal cancer screening?

    PubMed Central

    Dolan, James G.; Boohaker, Emily; Allison, Jeroan; Imperiale, Thomas F.

    2013-01-01

Background: Current US colorectal cancer screening guidelines that call for shared decision making regarding the choice among several recommended screening options are difficult to implement. Multi-criteria decision analysis (MCDA) is an established methodology well suited for supporting shared decision making. Our study goal was to determine if a streamlined form of MCDA using rank order-based judgments can accurately assess patients' colorectal cancer screening priorities. Methods: We converted priorities for four decision criteria and three sub-criteria regarding colorectal cancer screening, obtained from 484 average-risk patients using the Analytic Hierarchy Process (AHP) in a prior study, into rank order-based priorities using rank order centroids. We compared the two sets of priorities using Spearman rank correlation and non-parametric Bland-Altman limits of agreement analysis. We assessed the differential impact of using the rank order-based versus the AHP-based priorities on the results of a full MCDA comparing three currently recommended colorectal cancer screening strategies. Generalizability of the results was assessed using Monte Carlo simulation. Results: Correlations between the two sets of priorities for the seven criteria ranged from 0.55 to 0.92. The proportions of absolute differences between rank order-based and AHP-based priorities that were more than ± 0.15 ranged from 1% to 16%. Differences in the full MCDA results were minimal and the relative rankings of the three screening options were identical more than 88% of the time. The Monte Carlo simulation results were similar. Conclusion: Rank order-based MCDA could be a simple, practical way to guide individual decisions and assess population decision priorities regarding colorectal cancer screening strategies. Additional research is warranted to further explore the use of these methods for promoting shared decision making. PMID: 24300851
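The rank order centroid weights used in the study above have a standard closed form: with n ranked criteria, the i-th ranked (1-based) gets w_i = (1/n) * sum_{k=i}^{n} 1/k, so the weights are decreasing and sum to 1. A minimal sketch for the study's four top-level criteria:

```python
def rank_order_centroids(n):
    """ROC weights: the i-th ranked of n criteria (1-based) gets
    w_i = (1/n) * sum_{k=i}^{n} 1/k; weights sum to 1."""
    return [sum(1.0 / k for k in range(i, n + 1)) / n
            for i in range(1, n + 1)]

w = rank_order_centroids(4)
# w[0] ≈ 0.5208, w[1] ≈ 0.2708, w[2] ≈ 0.1458, w[3] = 0.0625
```

This is why only a rank ordering is needed from each patient: the centroid formula converts ranks directly into numeric priorities that can stand in for full AHP pairwise-comparison weights.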

  14. Exploring the Biotic Pump Hypothesis along Non-linear Transects in Tropical South America

    NASA Astrophysics Data System (ADS)

    Molina, R.; Bettin, D. M.; Salazar, J. F.; Villegas, J. C.

    2014-12-01

Forests might actively transport atmospheric moisture from the oceans, according to the biotic pump of atmospheric moisture (BiPAM) hypothesis. The BiPAM hypothesis appears to be supported by the fact that precipitation drops exponentially with distance from the ocean along non-forested land transects, but not along their forested counterparts. Yet researchers have discussed the difficulty of defining proper transects for BiPAM studies. Previous studies calculate precipitation gradients either along linear transects maximizing distance to the ocean, or along polylines following specific atmospheric pathways (e.g., aerial rivers). In this study we analyzed precipitation gradients along curvilinear streamlines of wind in tropical South America. Wind streamlines were computed using long-term quarterly averages of meridional and zonal wind components from the ERA-Interim and NCEP/NCAR reanalyses. Total precipitation along streamlines was obtained from four data sources: TRMM, UDEL, ERA-Interim, and NCEP/NCAR. Precipitation on land versus distance from the ocean was analyzed along selected streamlines for each data source. As predicted by BiPAM, precipitation gradients did not decrease exponentially along streamlines in the vicinity of the Amazon forest, but dropped rapidly as distance from the forest increased. Remarkably, precipitation along streamlines in some areas outside the Amazon forest did not decrease exponentially either, possibly owing to convergence of moisture conveyed by low-level jets (LLJs) in those areas (e.g., streamlines driven by the Caribbean and CHOCO jets on the Pacific coast of Colombia). Significantly, BiPAM held true even along long transects displaying strong sinuosity. In fact, the general conclusions of previous studies remain valid. Yet the effects of LLJs on precipitation gradients need to be thoroughly considered in future BiPAM studies.
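The exponential precipitation decay tested in BiPAM studies, P(x) = P0 * exp(-x/L) along a transect, can be fitted by log-linear regression. A minimal sketch on synthetic data (the e-folding length and precipitation values below are illustrative placeholders, not from the study):

```python
import math

def fit_exponential(xs, ps):
    """Least-squares fit of P(x) = P0 * exp(-x / L) via linear
    regression on log P; returns (P0, L)."""
    ys = [math.log(p) for p in ps]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return math.exp(ybar - slope * xbar), -1.0 / slope

# Synthetic transect: e-folding length 500 km, coastal value 2000 mm/yr.
xs = [0, 250, 500, 750, 1000]                       # km from ocean
ps = [2000.0 * math.exp(-x / 500.0) for x in xs]    # mm/yr
p0, L = fit_exponential(xs, ps)
```

Deviation of observed transect data from such a fit (a long flat tail over forest instead of decay) is the signature the study looks for along wind streamlines.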

  15. A streamlined failure mode and effects analysis.

    PubMed

    Ford, Eric C; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg

    2014-06-01

The goal was to explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process designed to minimize staff effort. An FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed that included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Fifty-two failure modes were identified: 22 collected during meetings and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required, plus an additional 20 h by the facilitator. The streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
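The RPN ranking used above is the standard FMEA product of occurrence, severity, and detectability scores (each typically on a 1-10 scale), combined with the study's intervention threshold of 150. A minimal sketch with hypothetical scores (the failure-mode names echo the abstract, but the numeric ratings below are invented for illustration):

```python
# Hypothetical failure modes with (occurrence, severity, detectability);
# RPN = O * S * D, and RPN > 150 triggers a safety intervention.
failure_modes = [
    ("Delay in film check",              7, 6, 5),
    ("Missing pacemaker protocol",       4, 9, 5),
    ("Critical structure not contoured", 3, 9, 7),
    ("Wrong patient chart pulled",       2, 8, 3),
]

scored = sorted(
    ((name, o * s * d) for name, o, s, d in failure_modes),
    key=lambda t: t[1], reverse=True,
)
needs_intervention = [name for name, rpn in scored if rpn > 150]
```

Ranking by RPN, rather than by severity alone, is what lets a small core group focus its limited hours on the handful of failure modes above the threshold.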

  16. An Integrated Approach to Characterizing Bypassed Oil in Heterogeneous and Fractured Reservoirs Using Partitioning Tracers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akhil Datta-Gupta

    2006-12-31

We explore the use of efficient streamline-based simulation approaches for modeling partitioning interwell tracer tests in hydrocarbon reservoirs. Specifically, we utilize the unique features of streamline models to develop an efficient approach for interpretation and history matching of field tracer response. A critical aspect here is the underdetermined and highly ill-posed nature of the associated inverse problems. We have investigated the relative merits of traditional history matching ('amplitude inversion') and a novel travel time inversion in terms of robustness of the method and convergence behavior of the solution. We show that the traditional amplitude inversion is orders of magnitude more non-linear, and its solution is likely to become trapped in a local minimum, leading to an inadequate history match. The proposed travel time inversion is shown to be extremely efficient and robust for practical field applications. The streamline approach is generalized to model water injection in naturally fractured reservoirs through the use of a dual-media approach. The fractures and matrix are treated as separate continua connected through a transfer function, as in conventional finite difference simulators for modeling fractured systems. A detailed comparison with a commercial finite difference simulator shows very good agreement. Furthermore, an examination of the scaling behavior of the computation time indicates that the streamline approach is likely to result in significant savings for large-scale field applications. We also propose a novel approach to history matching finite-difference models that combines the advantages of streamline models with the versatility of finite-difference simulation. In our approach, we utilize the streamline-derived sensitivities to facilitate history matching during finite-difference simulation. The use of a finite-difference model allows us to account for detailed process physics and compressibility effects. 
The approach is very fast and avoids much of the subjective judgment and time-consuming trial and error associated with manual history matching. We demonstrate the power and utility of our approach using a synthetic example and two field examples. We have also explored the use of a finite difference reservoir simulator, UTCHEM, for field-scale design and optimization of partitioning interwell tracer tests. The finite-difference model allows us to include detailed physics associated with reactive tracer transport, particularly those related to transverse and cross-streamline mechanisms. We have investigated the potential use of downhole tracer samplers and also the use of natural tracers for the design of partitioning tracer tests. Finally, we discuss several alternative ways of using partitioning interwell tracer tests (PITTs) in oil fields for the calculation of oil saturation, swept pore volume, and sweep efficiency, and assess the accuracy of such tests under a variety of reservoir conditions.
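The contrast between amplitude and travel-time inversion described above can be illustrated with two toy objective functions: the amplitude misfit compares whole breakthrough curves point by point, while the travel-time misfit compares only peak arrival times, which vary much more smoothly with permeability. The Gaussian breakthrough curves below are synthetic placeholders, not field data:

```python
import numpy as np

def amplitude_misfit(obs, sim):
    """Classical history-matching objective: sum of squared
    concentration differences over all sample times."""
    return float(np.sum((obs - sim) ** 2))

def travel_time_misfit(t, obs, sim):
    """Travel-time objective: squared difference in peak arrival time."""
    return float((t[np.argmax(obs)] - t[np.argmax(sim)]) ** 2)

# Hypothetical tracer breakthrough curves: the simulated peak arrives
# 20 days later than the observed one.
t = np.linspace(0.0, 200.0, 201)               # days
obs = np.exp(-((t - 80.0) / 15.0) ** 2)
sim = np.exp(-((t - 100.0) / 15.0) ** 2)
```

Because the travel-time objective depends on the arrival shift alone, it varies quasi-linearly as the model speeds up or slows down the tracer, whereas the amplitude objective can plateau or oscillate while the two peaks barely overlap, which is the local-minimum behavior the report describes.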

  17. Automated electric valve for electrokinetic separation in a networked microfluidic chip.

    PubMed

    Cui, Huanchun; Huang, Zheng; Dutta, Prashanta; Ivory, Cornelius F

    2007-02-15

This paper describes an automated electric valve system designed to reduce dispersion and sample loss into a side channel when an electrokinetically mobilized concentration zone passes a T-junction in a networked microfluidic chip. One way to reduce dispersion is to control current streamlines, since charged species are driven along them in the absence of electroosmotic flow. Computer simulations demonstrate that dispersion and sample loss can be reduced by applying a constant additional electric field in the side channel to straighten current streamlines in linear electrokinetic flow (zone electrophoresis). This additional electric field was provided by a pair of platinum microelectrodes integrated into the chip in the vicinity of the T-junction. Both simulations and experiments showed, however, that this electric valve gave unsatisfactory performance at constant valve voltages during nonlinear electrophoresis (isotachophoresis). On the basis of these results, an automated electric valve system was developed with improved valve performance. Experiments conducted with this system showed decreased dispersion and increased reproducibility as protein zones isotachophoretically passed the T-junction. Simulations of the automated electric valve offer further support that the desired shape of current streamlines was maintained at the T-junction during isotachophoresis. Valve performance was evaluated at different valve currents based on statistical variance due to dispersion. With the automated control system, two integrated microelectrodes provide an effective way to manipulate current streamlines, thus acting as an electric valve for charged species in electrokinetic separations.

  18. Using sequential self-calibration method to identify conductivity distribution: Conditioning on tracer test data

    USGS Publications Warehouse

    Hu, B.X.; He, C.

    2008-01-01

An iterative inverse method, the sequential self-calibration method, is developed for mapping spatial distribution of a hydraulic conductivity field by conditioning on nonreactive tracer breakthrough curves. A streamline-based, semi-analytical simulator is adopted to simulate solute transport in a heterogeneous aquifer. The simulation is used as the forward modeling step. In this study, the hydraulic conductivity is assumed to be a deterministic or random variable. Within the framework of the streamline-based simulator, the efficient semi-analytical method is used to calculate sensitivity coefficients of the solute concentration with respect to the hydraulic conductivity variation. The calculated sensitivities account for spatial correlations between the solute concentration and parameters. The performance of the inverse method is assessed by two synthetic tracer tests conducted in an aquifer with a distinct spatial pattern of heterogeneity. The study results indicate that the developed iterative inverse method is able to identify and reproduce the large-scale heterogeneity pattern of the aquifer given appropriate observation wells in these synthetic cases. © International Association for Mathematical Geology 2008.

  19. Simulation of Cold Flow in a Truncated Ideal Nozzle with Film Cooling

    NASA Technical Reports Server (NTRS)

    Braman, Kalen; Ruf, Joseph

    2015-01-01

    Flow transients during rocket start-up and shut-down can lead to significant side loads on rocket nozzles. The capability to estimate these side loads computationally can streamline the nozzle design process. Towards this goal, the flow in a truncated ideal contour (TIC) nozzle has been simulated for a range of nozzle pressure ratios (NPRs) aimed to match a series of cold flow experiments performed at the NASA MSFC Nozzle Test Facility. These simulations were performed with varying turbulence model choices and with four different versions of the TIC nozzle model geometry, each of which was created with a different simplification to the test article geometry.

  20. Recognition of white matter bundles using local and global streamline-based registration and clustering.

    PubMed

    Garyfallidis, Eleftherios; Côté, Marc-Alexandre; Rheault, Francois; Sidhu, Jasmeen; Hau, Janice; Petit, Laurent; Fortin, David; Cunanne, Stephen; Descoteaux, Maxime

    2018-04-15

    Virtual dissection of diffusion MRI tractograms is cumbersome and requires extensive knowledge of white matter anatomy. This virtual dissection often requires several inclusion and exclusion regions-of-interest, making it a process that is very hard to reproduce across experts. Having automated tools that can extract white matter bundles for tract-based studies of large numbers of people is of great interest for neuroscience and neurosurgical planning. The purpose of our proposed method, named RecoBundles, is to segment white matter bundles and make virtual dissection easier to perform. This can help explore large tractograms from multiple persons directly in their native space. RecoBundles leverages the latest state-of-the-art streamline-based registration and clustering to recognize and extract bundles using prior bundle models. RecoBundles uses bundle models as shape priors for detecting similar streamlines and bundles in tractograms. RecoBundles is 100% streamline-based, is efficient even with millions of streamlines and, most importantly, is robust and adaptive to incomplete data and bundles with missing components. It is also robust to pathological brains with tumors and deformations. We evaluated our results using multiple bundles and showed that RecoBundles is in good agreement with the neuroanatomical experts and generally produced denser bundles. Across all the different experiments reported in this paper, RecoBundles was able to identify the core parts of the bundles, independently of tractography type (deterministic or probabilistic) or size. Thus, RecoBundles can be a valuable method for exploring tractograms and facilitating tractometry studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. A streamlined failure mode and effects analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, Eric C., E-mail: eford@uw.edu; Smith, Koren; Terezakis, Stephanie

    Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number (RPN) was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
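As a loose illustration of the RPN ranking described in this record, a minimal sketch might look like the following. The failure modes and 1-10 severity/occurrence/detectability scores below are invented for illustration; only the formula (RPN as the product of the three scores) and the 150 threshold come from the record.

```python
# Hypothetical FMEA ranking sketch: RPN = severity * occurrence * detectability,
# with failure modes above a threshold flagged for safety intervention.

def rank_failure_modes(modes, threshold=150):
    """Return (rpn, name, flagged) tuples sorted by descending RPN."""
    scored = []
    for name, severity, occurrence, detectability in modes:
        rpn = severity * occurrence * detectability  # the FMEA risk priority number
        scored.append((rpn, name, rpn > threshold))
    return sorted(scored, reverse=True)

# Invented failure modes with illustrative 1-10 scores
modes = [
    ("delay in film check", 7, 6, 5),
    ("missing pacemaker protocol", 9, 3, 7),
    ("critical structures not contoured", 8, 4, 6),
    ("wrong accessory mounted", 5, 3, 4),
]

for rpn, name, flagged in rank_failure_modes(modes):
    print(f"{name}: RPN={rpn}{'  <- intervene' if flagged else ''}")
```

Ranking by the product rather than by severity alone is what lets a frequent, hard-to-detect failure outrank a rare catastrophic one.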

  2. Visualizing Time-Varying Phenomena In Numerical Simulations Of Unsteady Flows

    NASA Technical Reports Server (NTRS)

    Lane, David A.

    1996-01-01

    Streamlines, contour lines, vector plots, and volume slices (cutting planes) are commonly used for flow visualization. These techniques are sometimes referred to as instantaneous flow visualization techniques because calculations are based on an instant of the flowfield in time. Although instantaneous flow visualization techniques are effective for depicting phenomena in steady flows, they sometimes do not adequately depict time-varying phenomena in unsteady flows. Streaklines and timelines are effective visualization techniques for depicting vortex shedding, vortex breakdown, and shock waves in unsteady flows. These techniques are examples of time-dependent flow visualization techniques, which are based on many instants of the flowfields in time. This paper describes the algorithms for computing streaklines and timelines. Using numerically simulated unsteady flows, streaklines and timelines are compared with streamlines, contour lines, and vector plots. It is shown that streaklines and timelines reveal vortex shedding and vortex breakdown more clearly than instantaneous flow visualization techniques.
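The streakline idea in this record (particles released repeatedly from a fixed seed point and advected through the time-varying velocity field) can be sketched as follows. The analytic unsteady velocity field and the forward-Euler integrator are simplified stand-ins, not the paper's implementation.

```python
import math

def velocity(x, y, t):
    """Toy unsteady 2D velocity field (stand-in for simulation data)."""
    return (1.0, 0.3 * math.sin(2.0 * math.pi * (x - t)))

def streakline(seed, t_end, dt=0.01, release_every=10):
    """Release a particle from `seed` every `release_every` steps and
    advect all released particles to t_end. Connecting the returned
    positions in release order traces the streakline."""
    particles = []                       # mutable [x, y] per released particle
    n_steps = int(round(t_end / dt))
    for step in range(n_steps):
        t = step * dt
        if step % release_every == 0:
            particles.append(list(seed)) # a new particle enters the flow
        for p in particles:              # forward-Euler advection of every particle
            u, v = velocity(p[0], p[1], t)
            p[0] += u * dt
            p[1] += v * dt
    return [(p[0], p[1]) for p in particles]

pts = streakline(seed=(0.0, 0.0), t_end=1.0)
```

A timeline would differ only in the seeding: one simultaneous release of a line of particles, instead of repeated releases from one point.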

  3. An Overview of the Distributed Space Exploration Simulation (DSES) Project

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.

    2007-01-01

    This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.

  4. An N-body Integrator for Planetary Rings

    NASA Astrophysics Data System (ADS)

    Hahn, Joseph M.

    2011-04-01

    A planetary ring that is disturbed by a satellite's resonant perturbation can respond in an organized way. When the resonance lies in the ring's interior, the ring responds via an m-armed spiral wave, while a ring whose edge is confined by the resonance exhibits an m-lobed scalloping along the ring-edge. The amplitudes of these disturbances are sensitive to ring surface density and viscosity, so modelling these phenomena can provide estimates of the ring's properties. However, a brute-force attempt to simulate a ring's full azimuthal extent with an N-body code will likely fail because of the large number of particles needed to resolve the ring's behavior. Another impediment is the gravitational stirring that occurs among the simulated particles, which can wash out the ring's organized response. However, it is possible to adapt an N-body integrator so that it can simulate a ring's collective response to resonant perturbations. The code developed here uses a few thousand massless particles to trace streamlines within the ring. Particles are close in a radial sense to these streamlines, which allows streamlines to be treated as straight wires of constant linear density. Consequently, gravity due to these streamlines is a simple function of the particle's radial distance to all streamlines. And because particles respond to smooth gravitating streamlines, rather than discrete particles, this method eliminates the stirring that ordinarily occurs in brute-force N-body calculations. Note also that ring surface density is now a simple function of streamline separations, so effects due to ring pressure and viscosity are easily accounted for, too. A poster will describe this N-body method in greater detail. Simulations of spiral density waves and scalloped ring-edges execute in typically ten minutes on a desktop PC, and results for Saturn's A and B rings will be presented at conference time.
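The streamline-as-wire idea in this record can be sketched with the textbook result that an infinite straight wire of linear density λ attracts a particle at perpendicular distance d with acceleration 2Gλ/d. The function below sums that contribution over a set of streamline radii; the numbers are illustrative, not from the paper.

```python
# Sketch of streamline-as-wire gravity: each streamline is treated as a
# straight wire of constant linear density lam, so its pull on a nearby
# particle depends only on the radial distance d to that wire.

G = 6.674e-11  # gravitational constant, SI units

def radial_acceleration(r_particle, streamline_radii, lam):
    """Net signed radial acceleration on a particle at radius r_particle
    due to wire-like streamlines at the given radii (linear density lam).
    Negative values point inward (toward smaller radius)."""
    a = 0.0
    for r_s in streamline_radii:
        d = r_particle - r_s
        if d != 0.0:
            a -= 2.0 * G * lam / d  # attraction pulls the particle toward the wire
    return a
```

Because the force depends smoothly on distances to streamlines rather than on discrete neighbor particles, the particle-particle stirring of a brute-force N-body sum never enters.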

  5. MDA-based EHR application security services.

    PubMed

    Blobel, Bernd; Pharow, Peter

    2004-01-01

    Component-oriented, distributed, virtual EHR systems have to meet enhanced security and privacy requirements. In the context of advanced architectural paradigms such as component-orientation, model-driven, and knowledge-based, the standardised security services needed have to be specified and implemented in an integrated way following the same paradigm. This concerns the deployment of formal models, meta-languages, reference models such as the ISO RM-ODP, and development as well as implementation tools. The international projects' results presented here proceed along that streamline.

  6. Information Environments

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia

    2003-01-01

    The objective of GRC CNIS/IE work is to build a plug-n-play infrastructure that provides the Grand Challenge Applications with a suite of tools for coupling codes together, numerical zooming between fidelity of codes and gaining deployment of these simulations onto the Information Power Grid. The GRC CNIS/IE work will streamline and improve this process by providing tighter integration of various tools through the use of object oriented design of component models and data objects and through the use of CORBA (Common Object Request Broker Architecture).

  7. SimBRS: A University/Industry Consortium Focused on Simulation Based Solutions for Ground Vehicles

    DTIC Science & Technology

    2009-07-29

    plan is to use the SimBRS contract mechanism to streamline a process that applies research funds into a managed program, that is cognizant to the... designs. Therefore, the challenge for the SimBRS team is to establish an approach based on the capacity of measured data and simulations to support... by systematically relating appropriate results from measurements and applied research in engineering and science. In turn, basic research and

  8. Iron Catalyst Chemistry in High Pressure Carbon Monoxide Nanotube Reactor

    NASA Technical Reports Server (NTRS)

    Scott, Carl D.; Povitsky, Alexander; Dateo, Christopher; Gokcen, Tahir; Smalley, Richard E.

    2001-01-01

    The high-pressure carbon monoxide (HiPco) technique for producing single wall carbon nanotubes (SWNT) is analyzed using a chemical reaction model coupled with properties calculated along streamlines. Streamline properties for mixing jets are calculated by the FLUENT code using the k-ε turbulence model for pure carbon monoxide. The HiPco process introduces cold iron pentacarbonyl diluted in CO, or alternatively nitrogen, at high pressure, ca. 30 atmospheres, into a conical mixing zone. Hot CO is also introduced via three jets at angles with respect to the axis of the reactor. Hot CO decomposes the Fe(CO)5 to release atomic Fe. Cluster reaction rates are from Krestinin et al., based on shock tube measurements. Another model is from classical cluster theory, given by Girshick's team. The calculations are performed on streamlines that assume that a cold mixture of Fe(CO)5 in CO is introduced along the reactor axis. Iron then forms clusters that catalyze the formation of SWNTs from the Boudouard reaction on Fe-containing clusters by reaction with CO. To simulate the chemical process along streamlines that were calculated by the fluid dynamics code FLUENT, a time history of temperature and dilution is determined along streamlines. Alternative catalyst injection schemes are also evaluated.

  9. Developing and utilizing an Euler computational method for predicting the airframe/propulsion effects for an aft-mounted turboprop transport. Volume 2: User guide

    NASA Technical Reports Server (NTRS)

    Chen, H. C.; Neback, H. E.; Kao, T. J.; Yu, N. Y.; Kusunose, K.

    1991-01-01

    This manual explains how to use an Euler-based computational method for predicting the airframe/propulsion integration effects for an aft-mounted turboprop transport. The propeller power effects are simulated by the actuator disk concept. This method consists of global flow field analysis and the embedded flow solution for predicting the detailed flow characteristics in the local vicinity of an aft-mounted propfan engine. The computational procedure includes the use of several computer programs performing four main functions: grid generation, Euler solution, grid embedding, and streamline tracing. This user's guide provides information for these programs, including input data preparations with sample input decks, output descriptions, and sample Unix scripts for program execution in the UNICOS environment.

  10. Prediction of Gas Injection Performance for Heterogeneous Reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blunt, Martin J.; Orr, Franklin M.

    This report describes research carried out in the Department of Petroleum Engineering at Stanford University from September 1997 - September 1998 under the second year of a three-year grant from the Department of Energy on the "Prediction of Gas Injection Performance for Heterogeneous Reservoirs." The research effort is an integrated study of the factors affecting gas injection, from the pore scale to the field scale, and involves theoretical analysis, laboratory experiments, and numerical simulation. The original proposal described research in four areas: (1) Pore-scale modeling of three-phase flow in porous media; (2) Laboratory experiments and analysis of factors influencing gas injection performance at the core scale, with an emphasis on the fundamentals of three-phase flow; (3) Benchmark simulations of gas injection at the field scale; and (4) Development of a streamline-based reservoir simulator. Each stage of the research is planned to provide input and insight into the next stage, such that at the end we should have an integrated understanding of the key factors affecting field-scale displacements.

  11. LOADING SIMULATION PROGRAM C

    EPA Pesticide Factsheets

    LSPC is the Loading Simulation Program in C++, a watershed modeling system that includes streamlined Hydrologic Simulation Program Fortran (HSPF) algorithms for simulating hydrology, sediment, and general water quality

  12. Streamlines behind curved shock waves in axisymmetric flow fields

    NASA Astrophysics Data System (ADS)

    Filippi, A. A.; Skews, B. W.

    2018-07-01

    Streamlines behind axisymmetric curved shock waves were used to predict the internal surfaces that produced them. Axisymmetric ring wedge models with varying internal radii of curvature and leading-edge angles were used to produce numerical results. These numerical simulations were validated using experimental shadowgraph results for a series of ring wedge test pieces. The streamlines behind curved shock waves for lower leading-edge angles are examined at Mach 3.4, whereas the highest leading-edge angle cases are explored at Mach 2.8 and 3.4. Numerical and theoretical streamlines are compared for the highest leading-edge angle cases at Mach 3.6. It was found that wall-bounding theoretical streamlines did not match the internal curved surface. This was due to extreme streamline curvature as the shock angle approached the Mach angle at lower leading-edge angles. Increased Mach number and internal radius of curvature produced more reasonable results. Very good agreement was found between the theoretical and numerical streamlines at lower curvatures before the influence of the trailing-edge expansion fan.

  13. Flow and transport in digitized images of Berea sandstone: ergodicity, stationarity and upscaling

    NASA Astrophysics Data System (ADS)

    Puyguiraud, A.; Dentz, M.; Gouze, P.

    2017-12-01

    We perform Stokes flow simulations on digitized images of a Berea sandstone sample obtained through micro-tomography imaging and segmentation processes. We obtain accurate information on the transport using a streamline reconstruction algorithm which uses the velocity field obtained from the flow simulation as input data. This technique is based on the method proposed by Pollock (Groundwater, 1988) but employs a quadratic interpolation near the rock mesh cells of the domain, similarly to Mostaghimi et al. (SPE, 2012). This allows an accurate resolution of the velocity field near the solid interface, which plays an important role in the transport characteristics, such as the probability density of first arrival times and the growth of the mean squared displacement, among others, which exhibit non-Fickian behavior. We analyze Lagrangian and Eulerian velocity statistics and their relation, and then focus on the ergodicity and stationarity properties of the transport. We analyze the temporal evolution of Lagrangian velocity statistics for different injection conditions, and find quick convergence to a limiting velocity distribution, indicating the transport to be near-stationary. The equivalence between velocity samplings within and across streamlines, as well as the independence of the statistics from the number of sampled streamlines, leads us to conclude that the transport may be modeled as ergodic. These characteristics then allow us to upscale the 3-dimensional simulations using a 1-dimensional Continuous Time Random Walk model. This model, parametrized by the velocity results and the characteristic correlation length obtained from the above-mentioned simulations, is able to efficiently reproduce the results and to predict larger-scale behaviors.
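The Pollock (1988) scheme cited in this record exploits the fact that, with velocity varying linearly across a grid cell, the time to reach a cell face has a closed form. A minimal one-axis sketch of that cell-transit calculation follows (without the quadratic near-wall interpolation the record adds); in 3D the particle exits through the face with the smallest such time.

```python
import math

def exit_time_1d(x, x0, dx, v0, v1):
    """Travel time for a particle at position x inside a cell [x0, x0+dx]
    to reach the face it is heading toward, given face velocities v0
    (at x0) and v1 (at x0+dx) with linear variation in between."""
    A = (v1 - v0) / dx                 # velocity gradient within the cell
    vp = v0 + A * (x - x0)             # interpolated velocity at the particle
    if abs(A) < 1e-15:                 # uniform velocity: plain distance/speed
        face = x0 + dx if vp > 0 else x0
        return (face - x) / vp
    v_exit = v1 if vp > 0 else v0      # velocity at the downstream face
    return math.log(v_exit / vp) / A   # Pollock's semi-analytical transit time
```

Because the transit time is exact for the linear velocity model, streamlines can be traced cell by cell with no time-step truncation error.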

  14. RTDS-Based Design and Simulation of Distributed P-Q Power Resources in Smart Grid

    NASA Astrophysics Data System (ADS)

    Taylor, Zachariah David

    In this thesis, we propose to utilize a battery system together with its power electronics interfaces and bidirectional charger as a distributed P-Q resource in power distribution networks. First, we present an optimization-based approach to operate such distributed P-Q resources based on the characteristics of the battery and charger system as well as the features and needs of the power distribution network. Then, we use the RTDS Simulator, which is an industry-standard simulation tool of power systems, to develop two RTDS-based design approaches. The first design is based on an ideal four-quadrant distributed P-Q power resource. The second design is based on a detailed four-quadrant distributed P-Q power resource that is developed using power electronics components. The hardware and power electronics circuitry as well as the control units are explained for the second design. After that, given the two RTDS designs, we conduct extensive RTDS simulations to assess the performance of the designed distributed P-Q power resource in an IEEE 13-bus test system. We observed that the proposed design can noticeably improve the operational performance of the power distribution grid in at least four key aspects: reducing power loss, active power peak load shaving at the substation, reactive power peak load shaving at the substation, and voltage regulation. We examine these performance measures across three design cases. Case 1: there is no P-Q power resource available on the power distribution network. Case 2: the installed P-Q power resource only supports active power, i.e., it only utilizes its battery component. Case 3: the installed P-Q power resource supports both active and reactive power, i.e., it utilizes both its battery component and its power electronics charger component. In the end, we present insightful interpretations of the simulation results and suggest some future work.

  15. Event-based hydrological modeling for detecting dominant hydrological process and suitable model strategy for semi-arid catchments

    NASA Astrophysics Data System (ADS)

    Huang, Pengnian; Li, Zhijia; Chen, Ji; Li, Qiaoling; Yao, Cheng

    2016-11-01

    Properly simulating hydrological processes in semi-arid areas remains challenging. This study assesses the impact of different modeling strategies on simulating flood processes in semi-arid catchments. Four classic hydrological models, TOPMODEL, XINANJIANG (XAJ), SAC-SMA and TANK, were selected and applied to three semi-arid catchments in North China. Based on analysis and comparison of the simulation results of these classic models, four new flexible models were constructed and used to further investigate the suitability of various modeling strategies for semi-arid environments. Numerical experiments were also designed to examine the performances of the models. The results show that in semi-arid catchments a suitable model needs to include at least one nonlinear component to simulate the main process of surface runoff generation. If there are more than two nonlinear components in the hydrological model, they should be arranged in parallel, rather than in series. In addition, the results show that the parallel nonlinear components should be combined by multiplication rather than addition. Moreover, this study reveals that the key hydrological process over semi-arid catchments is infiltration-excess surface runoff, a nonlinear component.

  16. Evaluating methods to visualize patterns of genetic differentiation on a landscape.

    PubMed

    House, Geoffrey L; Hahn, Matthew W

    2018-05-01

    With advances in sequencing technology, research in the field of landscape genetics can now be conducted at unprecedented spatial and genomic scales. This has been especially evident when using sequence data to visualize patterns of genetic differentiation across a landscape due to demographic history, including changes in migration. Two recent model-based visualization methods that can highlight unusual patterns of genetic differentiation across a landscape, SpaceMix and EEMS, are increasingly used. While SpaceMix's model can infer long-distance migration, EEMS' model is more sensitive to short-distance changes in genetic differentiation, and it is unclear how these differences may affect their results in various situations. Here, we compare SpaceMix and EEMS side by side using landscape genetics simulations representing different migration scenarios. While both methods excel when patterns of simulated migration closely match their underlying models, they can produce either unintuitive or misleading results when the simulated migration patterns match their models less well, and this may be difficult to assess in empirical data sets. We also introduce unbundled principal components (un-PC), a fast, model-free method to visualize patterns of genetic differentiation by combining principal components analysis (PCA), which is already used in many landscape genetics studies, with the locations of sampled individuals. Un-PC has characteristics of both SpaceMix and EEMS and works well with simulated and empirical data. Finally, we introduce msLandscape, a collection of tools that streamline the creation of customizable landscape-scale simulations using the popular coalescent simulator ms and conversion of the simulated data for use with un-PC, SpaceMix and EEMS. © 2017 John Wiley & Sons Ltd.
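The general idea behind un-PC (pairing PC scores with sampling locations) can be sketched loosely as follows. This is not the un-PC algorithm itself, just the PCA step plus the pairing of scores with coordinates; the genotype matrix and locations are random illustrative data.

```python
import numpy as np

rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=(40, 200)).astype(float)  # 40 samples x 200 SNPs (0/1/2)
coords = rng.uniform(0, 10, size=(40, 2))                     # sampling locations

# PCA via thin SVD of the centered genotype matrix
centered = genotypes - genotypes.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
pc_scores = U[:, :2] * S[:2]            # first two PC scores per sample

# Each sample now carries (x, y, PC1, PC2); mapping PC scores at their
# sampling locations highlights spatial patterns of differentiation.
table = np.hstack([coords, pc_scores])
```

With real data, the model-based methods in the record go further by fitting migration surfaces; this sketch only shows the model-free ingredients un-PC starts from.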

  17. STELLAR SURFACE MAGNETO-CONVECTION AS A SOURCE OF ASTROPHYSICAL NOISE. I. MULTI-COMPONENT PARAMETERIZATION OF ABSORPTION LINE PROFILES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cegla, H. M.; Shelyag, S.; Watson, C. A.

    2013-02-15

    We outline our techniques to characterize photospheric granulation as an astrophysical noise source. A four-component parameterization of granulation is developed that can be used to reconstruct stellar line asymmetries and radial velocity shifts due to photospheric convective motions. The four components are made up of absorption line profiles calculated for granules, magnetic intergranular lanes, non-magnetic intergranular lanes, and magnetic bright points at disk center. These components are constructed by averaging Fe I 6302 Å magnetically sensitive absorption line profiles output from detailed radiative transport calculations of the solar photosphere. Each of the four categories adopted is based on magnetic field and continuum intensity limits determined from examining three-dimensional magnetohydrodynamic simulations with an average magnetic flux of 200 G. Using these four-component line profiles we accurately reconstruct granulation profiles, produced from modeling 12 × 12 Mm² areas on the solar surface, to within ≈ ±20 cm s⁻¹ on a ≈100 m s⁻¹ granulation signal. We have also successfully reconstructed granulation profiles from a 50 G simulation using the parameterized line profiles from the 200 G average magnetic field simulation. This test demonstrates applicability of the characterization to a range of magnetic stellar activity levels.

  18. NICER Mission

    NASA Image and Video Library

    2017-12-08

    This video previews the Neutron star Interior Composition Explorer (NICER). NICER is an Astrophysics Mission of Opportunity within NASA’s Explorer program, which provides frequent flight opportunities for world-class scientific investigations from space utilizing innovative, streamlined and efficient management approaches within the heliophysics and astrophysics science areas. NASA’s Space Technology Mission Directorate supports the SEXTANT component of the mission, demonstrating pulsar-based spacecraft navigation. NICER is an upcoming International Space Station payload scheduled to launch in June 2017. Learn more about the mission at nasa.gov/nicer

  19. Recirculating, passive micromixer with a novel sawtooth structure.

    PubMed

    Nichols, Kevin P; Ferullo, Julia R; Baeumner, Antje J

    2006-02-01

    A microfluidic device capable of recirculating nanolitre to microlitre volumes in order to efficiently mix solutions is described. The device consists of molded polydimethylsiloxane (PDMS) channels with pressure inlet and outlet holes sealed by a glass lid. Recirculation is accomplished by a repeatedly reciprocated flow over an iterated sawtooth structure. The sawtooth structure serves to change the fluid velocity of individual streamlines differently depending on whether the fluid is flowing backward or forward over the structure. Thus, individual streamlines can be accelerated or decelerated relative to the other streamlines to allow sections of the fluid to interact that would normally be linearly separated. Low Reynolds numbers imply that the process is reversible, neglecting diffusion. Computer simulations were carried out using FLUENT. Subsequently, fluorescent indicators were employed to experimentally verify these numerical simulations of the recirculation principle. Finally, mixing of a carboxyfluorescein-labeled DMSO plug with an unlabeled DMSO plug across an immiscible hydrocarbon plug was investigated. At cycling rates of 1 Hz across five sawtooth units, the time was recorded to reach steady state in the channels, i.e., until both DMSO plugs had the same fluorescence intensity. With the sawtooth structures, steady state was reached five times faster than in channels without them, which verified what would be expected based on the numerical simulations. The microfluidic mixer is unique due to its versatility with respect to scaling, its potential to also mix solutions containing small particles such as beads and cells, and its ease of fabrication and use.

  20. Modeling surgical tool selection patterns as a "traveling salesman problem" for optimizing a modular surgical tool system.

    PubMed

    Nelson, Carl A; Miller, David J; Oleynikov, Dmitry

    2008-01-01

    As modular systems come to the forefront of robotic telesurgery, streamlining the process of selecting surgical tools becomes an important consideration. This paper presents a method for optimal queuing of tools in modular surgical tool systems, based on patterns in tool-use sequences, in order to minimize time spent changing tools. The solution approach is to model the set of tools as a graph, with tool-change frequency expressed as edge weights, and to solve the Traveling Salesman Problem for the graph. In a set of simulations, this method has shown superior performance at optimizing tool arrangements for streamlining surgical procedures.
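The tools-as-graph formulation in this record can be sketched as a small brute-force Traveling Salesman solver, which is feasible for the handful of tools in a modular system. The tool names and change costs below are invented for illustration; the record does not specify how the authors solved the TSP.

```python
from itertools import permutations

def best_tool_order(tools, change_cost):
    """Return the cyclic tool ordering minimizing total change cost.
    change_cost maps unordered tool pairs (a, b) to a switching cost."""
    cost = lambda a, b: change_cost.get((a, b), change_cost.get((b, a)))
    best, best_cost = None, float("inf")
    first = tools[0]  # fix one tool to avoid re-counting rotations of the same tour
    for rest in permutations(tools[1:]):
        tour = (first,) + rest
        total = sum(cost(tour[i], tour[(i + 1) % len(tour)])
                    for i in range(len(tour)))
        if total < best_cost:
            best, best_cost = tour, total
    return best, best_cost

# Invented example: three tools with pairwise change costs
tools = ["grasper", "scissors", "needle"]
costs = {("grasper", "scissors"): 1, ("scissors", "needle"): 2, ("grasper", "needle"): 3}
order, total = best_tool_order(tools, costs)
```

In practice the edge weights would come from observed tool-change frequencies, so the tour places frequently-exchanged tools adjacent in the queue.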

  1. An investigation of the internal and external aerodynamics of cattle trucks

    NASA Technical Reports Server (NTRS)

    Muirhead, V. U.

    1983-01-01

    Wind tunnel tests were conducted on a one-tenth scale model of a conventional tractor-trailer livestock hauler to determine the air flow through the trailer and the drag of the vehicle. These tests were conducted with the trailer empty and with a full load of simulated cattle. Additionally, the drag was determined for six configurations, of which details for three are documented herein. These are: (1) conventional livestock trailer empty, (2) conventional trailer with smooth sides (i.e., without ventilation openings), and (3) a streamlined tractor with modified livestock trailer (cab streamlining and gap fairing). The internal flow of the streamlined modification with simulated cattle was determined with two different ducting systems: a ram air inlet over the cab and NACA submerged inlets between the cab and trailer. The air flow within the conventional trailer was random and variable. The streamlined vehicle with ram air inlet provided a nearly uniform air flow which could be controlled. The streamlined vehicle with NACA submerged inlets provided better flow conditions than the conventional livestock trailer, but not as uniform or controllable as the ram inlet configuration.

  2. HEAVY AND THERMAL OIL RECOVERY PRODUCTION MECHANISMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony R. Kovscek

    2003-04-01

    This technical progress report describes work performed from January 1 through March 31, 2003 for the project ''Heavy and Thermal Oil Recovery Production Mechanisms,'' DE-FC26-00BC15311. In this project, a broad spectrum of research is undertaken related to thermal and heavy-oil recovery. The research tools and techniques span from pore-level imaging of multiphase fluid flow to definition of reservoir-scale features through streamline-based history matching techniques. During this period, previous analysis of experimental data regarding multidimensional imbibition to obtain shape factors appropriate for dual-porosity simulation was verified by comparison among analytic, dual-porosity simulation, and fine-grid simulation. We continued to study the mechanisms by which oil is produced from fractured porous media at high pressure and high temperature. Temperature has a beneficial effect on recovery and reduces residual oil saturation. A new experiment was conducted on diatomite core. Significantly, we show that elevated temperature induces fines release in sandstone cores and this behavior may be linked to wettability. Our work in the area of primary production of heavy oil continues with field cores and crude oil. On the topic of reservoir definition, work continued on developing techniques that integrate production history into reservoir models using streamline-based properties.

  3. From the track to the ocean: Using flow control to improve marine bio-logging tags for cetaceans

    PubMed Central

    Fiore, Giovani; Anderson, Erik; Garborg, C. Spencer; Murray, Mark; Johnson, Mark; Moore, Michael J.; Howle, Laurens

    2017-01-01

    Bio-logging tags are an important tool for the study of cetaceans, but superficial tags inevitably increase hydrodynamic loading. Substantial forces can be generated by tags on fast-swimming animals, potentially affecting behavior and energetics or promoting early tag removal. Streamlined forms have been used to reduce loading, but these designs can accelerate flow over the top of the tag. This non-axisymmetric flow results in large lift forces (normal to the animal) that become the dominant force component at high speeds. In order to reduce lift and minimize total hydrodynamic loading, this work presents a new tag design (Model A) that incorporates a hydrodynamic body, a channel to reduce fluid speed differences above and below the housing, and a wing to redirect flow to counter lift. Additionally, three derivatives of the Model A design were used to examine the contribution of individual flow control features to overall performance. Hydrodynamic loadings of the four models were compared using computational fluid dynamics (CFD). The Model A design eliminated all lift force and generated up to ~30 N of downward force in simulated 6 m/s aligned flow. The simulations were validated using particle image velocimetry (PIV) to experimentally characterize the flow around the tag design. The results of these experiments confirm the trends predicted by the simulations and demonstrate the potential benefit of flow control elements for the reduction of tag-induced forces on the animal. PMID:28196148

  4. 3D Visualization of Global Ocean Circulation

    NASA Astrophysics Data System (ADS)

    Nelson, V. G.; Sharma, R.; Zhang, E.; Schmittner, A.; Jenny, B.

    2015-12-01

    Advanced 3D visualization techniques are seldom used to explore the dynamic behavior of ocean circulation. Streamlines are an effective method for visualization of flow, and they can be designed to clearly show the dynamic behavior of a fluidic system. We employ vector field editing and extraction software to examine the topology of velocity vector fields generated by a 3D global circulation model coupled to a one-layer atmosphere model simulating preindustrial and last glacial maximum (LGM) conditions. This results in a streamline-based visualization along multiple density isosurfaces on which we visualize points of vertical exchange and the distribution of properties such as temperature and biogeochemical tracers. Previous work involving this model examined the change in the energetics driving overturning circulation and mixing between simulations of LGM and preindustrial conditions. This visualization elucidates the relationship between locations of vertical exchange and mixing, as well as demonstrates the effects of circulation and mixing on the distribution of tracers such as carbon isotopes.
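At its core, a streamline-based visualization like the one described above integrates seed points through a velocity field. The sketch below (the field, seed, and step size are illustrative stand-ins, not output of the circulation model) traces one streamline with fourth-order Runge-Kutta integration:

```python
import numpy as np

def trace_streamline(velocity, seed, step=0.1, n_steps=100):
    """Trace a streamline through a steady velocity field with RK4.

    `velocity` maps a position to a velocity vector; both the field
    and the integration parameters here are illustrative.
    """
    points = [np.asarray(seed, dtype=float)]
    for _ in range(n_steps):
        p = points[-1]
        k1 = velocity(p)
        k2 = velocity(p + 0.5 * step * k1)
        k3 = velocity(p + 0.5 * step * k2)
        k4 = velocity(p + step * k3)
        points.append(p + (step / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.array(points)

def rotation(p):
    """Solid-body rotation about the z-axis; streamlines are circles."""
    return np.array([-p[1], p[0], 0.0])

line = trace_streamline(rotation, seed=[1.0, 0.0, 0.0])
radii = np.linalg.norm(line[:, :2], axis=1)  # stays ~1 on a circle
```

For a solid-body rotation the exact streamline is a circle, so the near-constant radius of the traced points is a quick sanity check on the integrator.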

  5. AN INTEGRATED APPROACH TO CHARACTERIZING BYPASSED OIL IN HETEROGENEOUS AND FRACTURED RESERVOIRS USING PARTITIONING TRACERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akhil Datta-Gupta

    2003-08-01

    We explore the use of efficient streamline-based simulation approaches for modeling partitioning interwell tracer tests in hydrocarbon reservoirs. Specifically, we utilize the unique features of streamline models to develop an efficient approach for interpretation and history matching of field tracer response. A critical aspect here is the underdetermined and highly ill-posed nature of the associated inverse problems. We have adopted an integrated approach whereby we combine data from multiple sources to minimize the uncertainty and non-uniqueness in the interpreted results. For partitioning interwell tracer tests, these are primarily the distributions of reservoir permeability and oil saturation. A novel approach to multiscale data integration using Markov Random Fields (MRF) has been developed to integrate static data sources from the reservoir such as core, well log and 3-D seismic data. We have also explored the use of a finite difference reservoir simulator, UTCHEM, for field-scale design and optimization of partitioning interwell tracer tests. The finite-difference model allows us to include detailed physics associated with reactive tracer transport, particularly those related to transverse and cross-streamline mechanisms. We have investigated the potential use of downhole tracer samplers and also the use of natural tracers for the design of partitioning tracer tests. Finally, the behavior of partitioning tracer tests in fractured reservoirs is investigated using a dual-porosity finite-difference model.

  6. Stripping Away the Forest; Sweden's Glacially Streamlined Landscape Evaluated through Lidar

    NASA Astrophysics Data System (ADS)

    Dowling, T.; Spagnolo, M.; Moller, P.

    2014-12-01

    The newly available Swedish National Height Model (SNHM) is a 2.0 m horizontal, 0.1 m vertical resolution digital elevation model (DEM) that is free at the point of use for researchers based at Swedish institutions. With coverage currently at ~80% of the country and due to be completed by 2015, this spatially extensive, high-resolution dataset has opened up new avenues of research for Quaternary geology in the country. The work presented here utilises the SNHM to map and evaluate more than 10,000 glacially streamlined landforms in the south-east of Sweden. The subsequently extracted morphological variables of length, width and height are then used to investigate three areas: to test recent conclusions drawn from the glacially streamlined landscapes of Great Britain and North America/Canada, to assess the impact of different core types on the morphological expression of said features, and to attempt to calculate which morphological variable best accounts for the variability seen in the dataset. It is found that, in common with drumlins found in the British Isles and elsewhere, their characteristics can be described by a log-normal distribution. However, the long tail of the features' characteristic distributions can cause problems for many of the commonly applied statistical methods of evaluation. Furthermore, a re-appraisal of some conclusions drawn by previous works as to the presence of a fundamental scaling law in streamlined feature elongation is necessary due to evidence gathered here. Additionally, based on a limited sample size, it was found that it is not possible to differentiate a streamlined landform's core type from its morphological characteristics alone. Larger 'known'-core data sets may be able to do so, based upon the length of a feature for example; however, the sample size here was not sufficient for significant differences to come to the fore, should they exist. Lastly, the extracted variable 'height' was found to account for the vast majority of the variance seen in the dataset when subject to a principal component analysis (PCA).
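The log-normal behaviour reported above is easy to demonstrate on synthetic data: after a log transform the heavy right tail disappears and standard statistics apply. The distribution parameters below are illustrative, not fitted to the Swedish dataset.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic landform lengths (metres): log-normal, as reported for
# drumlin populations; mean/sigma here are illustrative only.
lengths = rng.lognormal(mean=6.0, sigma=0.5, size=10_000)

# A log-normal sample becomes normal after a log transform, so the
# skewed "long tail" disappears and standard statistics apply.
log_lengths = np.log(lengths)
mu_hat, sigma_hat = log_lengths.mean(), log_lengths.std()

def skewness(x):
    """Sample skewness (third standardized moment)."""
    d = x - x.mean()
    return (d**3).mean() / (d**2).mean() ** 1.5

raw_skew = skewness(lengths)      # strongly positive: the long tail
log_skew = skewness(log_lengths)  # near zero after the transform
```

This is the practical reason log-transforming (or using rank-based statistics) is advisable before applying methods that assume symmetric distributions to such data.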

  7. Four Single-Page Learning Models.

    ERIC Educational Resources Information Center

    Hlynka, Denis

    1979-01-01

    Identifies four models of single-page learning systems that can streamline lengthy, complex prose: Information Mapping, Focal Press Model, Behavioral Objectives Model, and School Mathematics Model. (CMV)

  8. New insights into the folding of a β-sheet miniprotein in a reduced space of collective hydrogen bond variables: application to a hydrodynamic analysis of the folding flow.

    PubMed

    Kalgin, Igor V; Caflisch, Amedeo; Chekmarev, Sergei F; Karplus, Martin

    2013-05-23

    A new analysis of the 20 μs equilibrium folding/unfolding molecular dynamics simulations of the three-stranded antiparallel β-sheet miniprotein (beta3s) in implicit solvent is presented. The conformation space is reduced in dimensionality by introduction of linear combinations of hydrogen bond distances as the collective variables making use of a specially adapted principal component analysis (PCA); i.e., to make structured conformations more pronounced, only the formed bonds are included in determining the principal components. It is shown that a three-dimensional (3D) subspace gives a meaningful representation of the folding behavior. The first component, to which eight native hydrogen bonds make the major contribution (four in each beta hairpin), is found to play the role of the reaction coordinate for the overall folding process, while the second and third components distinguish the structured conformations. The representative points of the trajectory in the 3D space are grouped into conformational clusters that correspond to locally stable conformations of beta3s identified in earlier work. A simplified kinetic network based on the three components is constructed, and it is complemented by a hydrodynamic analysis. The latter, making use of "passive tracers" in 3D space, indicates that the folding flow is much more complex than suggested by the kinetic network. A 2D representation of streamlines shows there are vortices which correspond to repeated local rearrangement, not only around minima of the free energy surface but also in flat regions between minima. The vortices revealed by the hydrodynamic analysis are apparently not evident in folding pathways generated by transition-path sampling. Making use of the fact that the values of the collective hydrogen bond variables are linearly related to the Cartesian coordinate space, the RMSD between clusters is determined. 
Interestingly, the transition rates show an approximate exponential correlation with distance in the hydrogen bond subspace. Comparison with the many published studies shows good agreement with the present analysis for the parts that can be compared, supporting the robust character of our understanding of this "hydrogen atom" of protein folding.
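The dimensionality reduction described above, i.e. PCA over collective hydrogen-bond distance variables, can be sketched as follows. The synthetic "trajectory" with one dominant collective mode is an illustration only, not the beta3s simulation data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "trajectory": 500 frames x 8 hydrogen-bond distances.
# One collective mode (all bonds forming/breaking together)
# dominates, loosely mimicking a folding coordinate; all numbers
# are illustrative.
folding = rng.normal(size=(500, 1))
frames = 3.0 + folding @ np.ones((1, 8)) + 0.1 * rng.normal(size=(500, 8))

# Standard PCA: diagonalize the covariance of the centered data.
centered = frames - frames.mean(axis=0)
cov = centered.T @ centered / (len(frames) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)       # ascending order
explained = eigvals[::-1] / eigvals.sum()    # variance fraction per PC

# The first component captures the shared bond motion and would
# serve as the reaction-coordinate-like variable.
pc1_fraction = explained[0]
```

In the paper's adapted variant, only formed bonds enter the analysis so that structured conformations stand out; the plain PCA above shows only the mechanics of the projection.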

  9. A case study of development and application of a streamlined control and response modeling system for PM2.5 attainment assessment in China.

    PubMed

    Long, Shicheng; Zhu, Yun; Jang, Carey; Lin, Che-Jen; Wang, Shuxiao; Zhao, Bin; Gao, Jian; Deng, Shuang; Xie, Junping; Qiu, Xuezhen

    2016-03-01

    This article describes the development and application of a streamlined air control and response modeling system with a novel response surface modeling-linear coupled fitting method and a new module to provide streamlined model data for PM2.5 attainment assessment in China. This method is capable of significantly reducing the dimensions required to establish a response surface model, as well as capturing more realistic responses of PM2.5 to emission changes with a limited number of model simulations. The newly developed module establishes a data link between the system and the Software for Model Attainment Test-Community Edition (SMAT-CE), and has the ability to rapidly provide model responses to emission control scenarios for SMAT-CE using a simple interface. The performance of this streamlined system is demonstrated through a case study of the Yangtze River Delta (YRD) in China. Our results show that this system is capable of reproducing the Community Multi-Scale Air Quality (CMAQ) model simulation results with a maximum mean normalized error < 3.5%. It is also demonstrated that primary emissions make a major contribution to ambient levels of PM2.5 in January and August (e.g., more than 50% contributed by primary emissions in Shanghai), and that Shanghai needs regional emission control both locally and in its neighboring provinces to meet China's annual PM2.5 National Ambient Air Quality Standard. The streamlined system provides a real-time control/response assessment to identify the contributions of major emission sources to ambient PM2.5 (and potentially O3 as well) and streamlines air quality data for SMAT-CE to perform attainment assessments.

  10. SUPG Finite Element Simulations of Compressible Flows

    NASA Technical Reports Server (NTRS)

    Kirk, Benjamin S.

    2006-01-01

    Streamline-Upwind Petrov-Galerkin (SUPG) finite element simulations of compressible flows are presented. The topics include: 1) Introduction; 2) SUPG Galerkin Finite Element Methods; 3) Applications; and 4) Bibliography.
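For context, SUPG stabilizes the Galerkin method for advection-dominated problems by augmenting each test function w with a streamline-upwind term tau * (u . grad w). A common 1D choice of the stabilization parameter is sketched below; the values are illustrative and not taken from the presentation summarized above.

```python
import numpy as np

def supg_tau(u, h, k):
    """Classic 1D SUPG stabilization parameter.

    tau = (h / (2|u|)) * (coth(Pe) - 1/Pe), Pe = |u| h / (2 k),
    for advection speed u, element size h, diffusivity k.
    """
    pe = abs(u) * h / (2.0 * k)  # element Peclet number
    return h / (2.0 * abs(u)) * (1.0 / np.tanh(pe) - 1.0 / pe)

# Advection-dominated element: tau approaches the upwind limit h/(2|u|).
tau_advective = supg_tau(u=1.0, h=0.1, k=1e-6)

# Diffusion-dominated element: tau vanishes and plain Galerkin suffices.
tau_diffusive = supg_tau(u=1.0, h=0.1, k=10.0)
```

The two limits show why the formula works: stabilization is added only where the element Peclet number makes the centered Galerkin discretization oscillatory.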

  11. Simulation of Cold Flow in a Truncated Ideal Nozzle with Film Cooling

    NASA Technical Reports Server (NTRS)

    Braman, K. E.; Ruf, J. H.

    2015-01-01

    Flow transients during rocket start-up and shut-down can lead to significant side loads on rocket nozzles. The capability to estimate these side loads computationally can streamline the nozzle design process. Towards this goal, the flow in a truncated ideal contour (TIC) nozzle has been simulated using RANS and URANS for a range of nozzle pressure ratios (NPRs) aimed to match a series of cold flow experiments performed at the NASA MSFC Nozzle Test Facility. These simulations were performed with varying turbulence model choices and for four approximations of the supersonic film injection geometry, each of which was created with a different simplification of the test article geometry. The results show that although a reasonable match to experiment can be obtained with varying levels of geometric fidelity, the modeling choices made do not fully represent the physics of flow separation in a TIC nozzle with film cooling.

  12. Quantifying effects of humans and climate on groundwater resources of Hawaii through sharp-interface modeling

    NASA Astrophysics Data System (ADS)

    Rotzoll, K.; Izuka, S. K.; Nishikawa, T.; Fienen, M. N.; El-Kadi, A. I.

    2016-12-01

    Some of the volcanic-rock aquifers of the islands of Hawaii are substantially developed, leading to concerns related to the effects of groundwater withdrawals on saltwater intrusion and stream base-flow reduction. A numerical modeling analysis using recent available information (e.g., recharge, withdrawals, hydrogeologic framework, and conceptual models of groundwater flow) advances current understanding of groundwater flow and provides insight into the effects of human activity and climate change on Hawaii's water resources. Three island-wide groundwater-flow models (Kauai, Oahu, and Maui) were constructed using MODFLOW 2005 coupled with the Seawater-Intrusion Package (SWI2), which simulates the transition between saltwater and freshwater in the aquifer as a sharp interface. This approach allowed coarse vertical discretization (maximum of two layers) without ignoring the freshwater-saltwater system at the regional scale. Model construction (FloPy3), parameter estimation (PEST), and analysis of results were streamlined using Python scripts. Model simulations included pre-development (1870) and recent (average of 2001-10) scenarios for each island. Additionally, scenarios for future withdrawals and climate change were simulated for Oahu. We present our streamlined approach and results showing estimated effects of human activity on the groundwater resource by quantifying decline in water levels, rise of the freshwater-saltwater interface, and reduction in stream base flow. Water-resource managers can use this information to evaluate consequences of groundwater development that can constrain future groundwater availability.
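Sharp-interface approaches such as SWI2 rest on the density contrast between freshwater and saltwater; the classic Ghyben-Herzberg relation gives the intuition for why modest head declines matter. A minimal sketch with typical densities (the heads below are illustrative, not model output for Hawaii):

```python
# Ghyben-Herzberg relation behind sharp-interface freshwater lenses:
# the depth of the freshwater-saltwater interface below sea level is
#   z = rho_f / (rho_s - rho_f) * h  ~  40 * h
# for a freshwater head h above sea level. Densities are typical
# values for fresh water and seawater.

RHO_FRESH = 1000.0  # kg/m^3
RHO_SALT = 1025.0   # kg/m^3

def interface_depth(head_m):
    """Interface depth (m below sea level) for a given freshwater head (m)."""
    return RHO_FRESH / (RHO_SALT - RHO_FRESH) * head_m

# A 0.5 m decline in head (e.g. from withdrawals) raises the
# interface by ~20 m, which is why small head changes matter.
rise = interface_depth(2.0) - interface_depth(1.5)
```

This 40:1 leverage between head decline and interface rise is the mechanism behind the concern, quantified in the abstract, that groundwater withdrawals promote saltwater intrusion.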

  13. Streamlining DOD Acquisitions: Balancing Schedule with Complexity

    DTIC Science & Technology

    2006-09-01

    from them has a distinct industrial flavor: streamlined processes, benchmarking, and business models. The requirements generation community led by... model), and the Department of the Navy assumed program lead. [Stable Program Inputs (-)] By 1984, the program goals included delivery of 913 V-22... they subsequently specified a crew of two. [Stable Program Input (-)] The contractor team won in a “fly-off” solely via modeling and simulation

  14. Efficient seeding and defragmentation of curvature streamlines for colonic polyp detection

    NASA Astrophysics Data System (ADS)

    Zhao, Lingxiao; Botha, Charl P.; Truyen, Roel; Vos, Frans M.; Post, Frits H.

    2008-03-01

    Many computer aided diagnosis (CAD) schemes have been developed for colon cancer detection using Virtual Colonoscopy (VC). In earlier work, we developed an automatic polyp detection method integrating flow visualization techniques, which forms part of the CAD functionality of an existing Virtual Colonoscopy pipeline. Curvature streamlines were used to characterize polyp surface shape. Features derived from curvature streamlines correlated highly with true polyp detections. During testing with a large number of patient data sets, we found that the correlation between streamline features and true polyps could be affected by noise and our streamline generation technique. The seeding and spacing constraints and CT noise could lead to streamline fragmentation, which reduced the discriminating power of our streamline features. In this paper, we present two major improvements to our curvature streamline generation. First, we adapted our streamline seeding strategy to the local surface properties and made the streamline generation faster. It generates a significantly smaller number of seeds but still results in a comparable and suitable streamline distribution. Second, based on our observation that longer streamlines are better surface shape descriptors, we improved our streamline tracing algorithm to produce longer streamlines. Our improved techniques are more efficient and also guide the streamline geometry to correspond better to colonic surface shape. These two adaptations support a robust and high correlation between our streamline features and true positive detections and lead to better polyp detection results.

  15. Developing and upgrading of solar system thermal energy storage simulation models. Technical progress report, March 1, 1979-February 29, 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn, J K; von Fuchs, G F; Zob, A P

    1980-05-01

    Two water tank component simulation models have been selected and upgraded. These models are called the CSU Model and the Extended SOLSYS Model. The models have been standardized and links have been provided for operation in the TRNSYS simulation program. The models are described in analytical terms as well as in computer code. Specific water tank tests were performed for the purpose of model validation. Agreement between model data and test data is excellent. A description of the limitations has also been included. Streamlining results and criteria for the reduction of computer time have also been shown for both water tank computer models. Computer codes for the models and instructions for operating these models in TRNSYS have also been included, making the models readily available for DOE and industry use. Rock bed component simulation models have been reviewed and a model selected and upgraded. This model is a logical extension of the Mumma-Marvin model. Specific rock bed tests have been performed for the purpose of validation. Data have been reviewed for consistency. Details of the test results concerned with rock characteristics and pressure drop through the bed have been explored and are reported.

  16. Hybrid Modeling for Testing Intelligent Software for Lunar-Mars Closed Life Support

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Nicholson, Leonard S. (Technical Monitor)

    1999-01-01

    Intelligent software is being developed for closed life support systems with biological components, for human exploration of the Moon and Mars. The intelligent software functions include planning/scheduling, reactive discrete control and sequencing, management of continuous control, and fault detection, diagnosis, and management of failures and errors. Four types of modeling information have been essential to system modeling and simulation to develop and test the software and to provide operational model-based what-if analyses: discrete component operational and failure modes; continuous dynamic performance within component modes, modeled qualitatively or quantitatively; configuration of flows and power among components in the system; and operations activities and scenarios. CONFIG, a multi-purpose discrete event simulation tool that integrates all four types of models for use throughout the engineering and operations life cycle, has been used to model components and systems involved in the production and transfer of oxygen and carbon dioxide in a plant-growth chamber and between that chamber and a habitation chamber with physicochemical systems for gas processing.

  17. Graphical simulation for aerospace manufacturing

    NASA Technical Reports Server (NTRS)

    Babai, Majid; Bien, Christopher

    1994-01-01

    Simulation software has become a key technological enabler for integrating flexible manufacturing systems and streamlining the overall aerospace manufacturing process. In particular, robot simulation and offline programming software is being credited for reducing down time and labor cost, while boosting quality and significantly increasing productivity.

  18. Adapting large batteries of research measures for immigrants.

    PubMed

    Aroian, Karen J

    2013-06-01

    A four-step, streamlined process to adapt a large battery of measures for a study of mother-child adjustment in Arab Muslim immigrants and the lessons learned are described. The streamlined process includes adapting content, translation, pilot testing, and extensive psychometric evaluation but omits in-depth qualitative inquiry to identify the full content domain of the constructs of interest and cognitive interviews to assess how respondents interpret items. Lessons learned suggest that the streamlined process is not sufficient for certain measures, particularly when there is little published information about how the measure performs with different groups, the measure requires substantial item revision to achieve content equivalence, and the measure is both challenging to translate and has little to no redundancy. When these conditions are present, condition-specific procedures need to be added to the streamlined process.

  19. WMT: The CSDMS Web Modeling Tool

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server, where it is stored and from which a user can download it as a single compressed archive file.

  20. Impingement of Droplets in 90 deg Elbows with Potential Flow

    NASA Technical Reports Server (NTRS)

    Hacker, Paul T.; Brun, Rinaldo J.; Boyd, Bemrose

    1953-01-01

    Trajectories were determined for droplets in air flowing through 90 deg elbows especially designed for two-dimensional potential motion with low pressure losses. The elbows were established by selecting as walls of each elbow two streamlines of the flow field produced by a complex potential function that establishes a two-dimensional flow around a 90 deg bend. An unlimited number of elbows with slightly different shapes can be established by selecting different pairs of streamlines as walls. The elbows produced by the complex potential function selected are suitable for use in aircraft air-intake ducts. The droplet impingement data derived from the trajectories are presented along with equations in such a manner that the collection efficiency, the area, the rate, and the distribution of droplet impingement can be determined for any elbow defined by any pair of streamlines within a portion of the flow field established by the complex potential function. Coordinates for some typical streamlines of the flow field and velocity components for several points along these streamlines are presented in tabular form.
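The construction described above can be illustrated with the simplest corner-flow potential: for W(z) = A z^2 the stream function psi = 2Axy is constant along the hyperbolas xy = const, and any pair of those level curves can serve as elbow walls. The sketch below uses that textbook potential, not the NACA report's actual (unstated) function; A and the sample points are illustrative.

```python
import numpy as np

A = 1.0  # illustrative strength of the corner-flow potential W(z) = A z**2

def velocity(z):
    """Velocity components (u, v) at complex point z, from dW/dz = u - i v."""
    dW = 2 * A * z
    return dW.real, -dW.imag

def psi(z):
    """Stream function: the imaginary part of W(z) = A z**2, i.e. 2*A*x*y."""
    return (A * z**2).imag

# Check: along the hyperbola x*y = 0.5 (a candidate elbow wall),
# psi is constant, confirming the curve is a streamline.
x = np.linspace(0.5, 2.0, 50)
pts = x + 1j * (0.5 / x)
psi_vals = np.array([psi(z) for z in pts])
spread = psi_vals.max() - psi_vals.min()  # ~0 on a streamline
```

The velocity field (u, v) = (2Ax, -2Ay) is everywhere tangent to these hyperbolas, which is why droplet trajectories can be integrated directly against such a field.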

  1. Real-time dynamics and control strategies for space operations of flexible structures

    NASA Technical Reports Server (NTRS)

    Park, K. C.; Alvin, K. F.; Alexander, S.

    1993-01-01

    This project (NAG9-574) was intended to be a three-year research effort. However, due to NASA's reorganizations during 1992, the project was funded for only one year. Accordingly, every effort was made to prepare the present final report as if the project had been planned for a one-year duration. During the first year we originally planned to accomplish the following: start with a three-dimensional flexible manipulator beam with articulated joints and a linear control-based controller applied at the joints; using this simple example, design the software systems requirements for real-time processing, introduce the streamlining of various computational algorithms, perform the necessary reorganization of the partitioned simulation procedures, and assess the potential speed-up realization of the solution process by parallel computations. The three reports included as part of the final report address: the streamlining of various computational algorithms; the necessary reorganization of the partitioned simulation procedures, in particular the observer models; and an initial attempt at reconfiguring the flexible space structures.

  2. Interactive Streamline Exploration and Manipulation Using Deformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tong, Xin; Chen, Chun-Ming; Shen, Han-Wei

    2015-01-12

    Occlusion presents a major challenge in visualizing three-dimensional flow fields with streamlines. Displaying too many streamlines at once makes it difficult to locate interesting regions, but displaying too few streamlines risks missing important features. A more ideal streamline exploration model is to allow the viewer to freely move across the field that has been populated with interesting streamlines and pull away the streamlines that cause occlusion so that the viewer can inspect the hidden ones in detail. In this paper, we present a streamline deformation algorithm that supports such user-driven interaction with three-dimensional flow fields. We define a view-dependent focus+context technique that moves the streamlines occluding the focus area using a novel displacement model. To preserve the context surrounding the user-chosen focus area, we propose two shape models to define the transition zone for the surrounding streamlines, and the displacement of the contextual streamlines is solved interactively with a goal of preserving their shapes as much as possible. Based on our deformation model, we design an interactive streamline exploration tool using a lens metaphor. Our system runs interactively so that users can move their focus and examine the flow field freely.

  3. An investigation of the flow characteristics in the blade endwall corner region

    NASA Technical Reports Server (NTRS)

    Hazarika, Birinchi K.; Raj, Rishi S.

    1987-01-01

    Studies were undertaken to determine the structure of the flow in the blade end wall corner region simulated by attaching two uncambered airfoils on either side of a flat plate with a semicircular leading edge. Detailed measurements of the corner flow were obtained with conventional pressure probes, hot wire anemometry, and flow visualization. The mean velocity profiles and six components of the Reynolds stress tensor were obtained with an inclined single sensor hot wire probe, whereas power spectra were obtained with a single sensor oriented normal to the flow. Three streamwise vortices were identified based on the surface streamlines, distortion of total pressure profiles, and variation of mean velocity components in the corner. A horseshoe vortex formed near the leading edge of the airfoil. Within a short distance downstream, a corner vortex was detected between the horseshoe vortex and the surfaces forming the corner. A third vortex was formed at the rear portion of the corner between the corner vortex and the surface of the flat plate. Turbulent shear stress and production of turbulence are negligibly small. A region of negative turbulent shear stress was also observed near the region of low turbulence intensity in the vicinity of the flat plate.

  4. Vision and air flow combine to streamline flying honeybees

    PubMed Central

    Taylor, Gavin J.; Luu, Tien; Ball, David; Srinivasan, Mandyam V.

    2013-01-01

    Insects face the challenge of integrating multi-sensory information to control their flight. Here we study a 'streamlining' response in honeybees, whereby honeybees raise their abdomen to reduce drag. We find that this response, which was recently reported to be mediated by optic flow, is also strongly modulated by the presence of air flow simulating a head wind. The Johnston's organs in the antennae were found to play a role in the measurement of the air speed that is used to control the streamlining response. The response to a combination of visual motion and wind is complex and can be explained by a model that incorporates a non-linear combination of the two stimuli. The use of visual and mechanosensory cues increases the strength of the streamlining response when the stimuli are present concurrently. We propose that this multisensory integration will make the response more robust to transient disturbances in either modality. PMID:24019053

  5. AxTract: Toward microstructure informed tractography.

    PubMed

    Girard, Gabriel; Daducci, Alessandro; Petit, Laurent; Thiran, Jean-Philippe; Whittingstall, Kevin; Deriche, Rachid; Wassermann, Demian; Descoteaux, Maxime

    2017-11-01

    Diffusion-weighted (DW) magnetic resonance imaging (MRI) tractography has become the tool of choice to probe the human brain's white matter in vivo. However, tractography algorithms produce a large number of erroneous streamlines (false positives), largely due to complex ambiguous tissue configurations. Moreover, the relationship between the resulting streamlines and the underlying white matter microstructure characteristics remains poorly understood. In this work, we introduce a new approach to simultaneously reconstruct white matter fascicles and characterize the apparent distribution of axon diameters within fascicles. To achieve this, our method, AxTract, takes full advantage of recent developments in DW-MRI microstructure acquisition, modeling, and reconstruction techniques. This enables AxTract to separate parallel fascicles with different microstructure characteristics, hence reducing ambiguities in areas of complex tissue configuration. We report a decrease in the incidence of erroneous streamlines compared to conventional deterministic tractography algorithms on simulated data. We also report an average increase in streamline density over 15 known fascicles of the 34 healthy subjects. Our results suggest that microstructure information improves tractography in crossing areas of the white matter. Moreover, AxTract provides additional microstructure information along the fascicle that can be studied alongside other streamline-based indices. Overall, AxTract provides the means to distinguish and follow white matter fascicles using their microstructure characteristics, bringing new insights into the white matter organization. This is a step forward in microstructure informed tractography, paving the way to a new generation of algorithms able to deal with intricate configurations of white matter fibers and providing quantitative brain connectivity analysis. Hum Brain Mapp 38:5485-5500, 2017. © 2017 Wiley Periodicals, Inc.

  6. Self-diffusion in the non-Newtonian regime of shearing liquid crystal model systems based on the Gay-Berne potential

    NASA Astrophysics Data System (ADS)

    Sarman, Sten; Wang, Yong-Lei; Laaksonen, Aatto

    2016-02-01

    The self-diffusion coefficients of nematic phases of various model systems consisting of regular convex calamitic and discotic ellipsoids and non-convex bodies such as bent-core molecules and soft ellipsoid strings have been obtained as functions of the shear rate in a shear flow. In this case, the self-diffusion coefficient is a second-rank tensor with three different diagonal components and two off-diagonal components. These coefficients were found to be determined by a combination of two mechanisms, which previously have been found to govern the self-diffusion of shearing isotropic liquids, namely, (i) shear alignment enhancing the diffusion in the direction parallel to the streamlines and hindering the diffusion in the perpendicular directions and (ii) the distortion of the shell structure in the liquid whereby a molecule can more readily escape from a surrounding shell of nearest neighbors, so that the mobility increases in every direction. Thus, the diffusion parallel to the streamlines always increases with the shear rate since these mechanisms cooperate in this direction. In the perpendicular directions, these mechanisms counteract each other so that the behaviour becomes less regular. In the case of the nematic phases of the calamitic and discotic ellipsoids and of the bent-core molecules, mechanism (ii) prevails so that the diffusion coefficients increase. However, the diffusion coefficients of the soft ellipsoid strings decrease in the direction of the velocity gradient because the broadsides of these molecules are oriented perpendicularly to this direction due to the shear alignment (i). The cross-coupling coefficient relating a gradient of tracer particles in the direction of the velocity gradient and their flow in the direction of the streamlines is negative and rather large, whereas the other coupling coefficient, relating a gradient in the direction of the streamlines and a flow in the direction of the velocity gradient, is very small.
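The second-rank self-diffusion tensor described above can be estimated from particle displacements via the generalized Einstein relation D_ab = ⟨Δr_a Δr_b⟩ / (2t). The sketch below applies it to a synthetic anisotropic random walk; the per-axis step widths are illustrative assumptions, not Gay-Berne results:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic trajectories: independent Gaussian steps with a different
# variance per axis mimic anisotropic self-diffusion (arbitrary units).
n_particles, n_steps, dt = 2000, 500, 1.0
sigma = np.array([0.30, 0.20, 0.10])          # per-axis step std. dev.
steps = rng.normal(0.0, sigma, size=(n_particles, n_steps, 3))
disp = steps.sum(axis=1)                      # total displacement per particle

# Einstein relation, generalized to a second-rank tensor:
#   D_ab = <dr_a dr_b> / (2 t)
t = n_steps * dt
D = (disp[:, :, None] * disp[:, None, :]).mean(axis=0) / (2.0 * t)
```

For this construction the diagonal components approach sigma**2 / (2*dt) and the off-diagonal components vanish; in a sheared liquid the off-diagonal terms would carry the cross-coupling discussed above.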

  7. A Conformal, Fully-Conservative Approach for Predicting Blast Effects on Ground Vehicles

    DTIC Science & Technology

    2014-04-01

    time integration  Approximate Riemann Fluxes (HLLE, HLLC) ◦ Robust mixture model for multi-material flows  Multiple Equations of State ◦ Perfect Gas...Loci/CHEM: Chemically reacting compressible flow solver . ◦ Currently in production use by NASA for the simulation of rocket motors, plumes, and...vehicles  Loci/DROPLET: Eulerian and Lagrangian multiphase solvers  Loci/STREAM: pressure-based solver ◦ Developed by Streamline Numerics and

  8. Quantifying effects of humans and climate on groundwater resources through modeling of volcanic-rock aquifers of Hawaii

    NASA Astrophysics Data System (ADS)

    Rotzoll, K.; Izuka, S. K.; Nishikawa, T.; Fienen, M. N.; El-Kadi, A. I.

    2015-12-01

    The volcanic-rock aquifers of Kauai, Oahu, and Maui are heavily developed, leading to concerns related to the effects of groundwater withdrawals on saltwater intrusion and streamflow. A numerical modeling analysis using the most recently available data (e.g., information on recharge, withdrawals, hydrogeologic framework, and conceptual models of groundwater flow) will substantially advance current understanding of groundwater flow and provide insight into the effects of human activity and climate change on Hawaii's water resources. Three island-wide groundwater-flow models were constructed using MODFLOW 2005 coupled with the Seawater-Intrusion Package (SWI2), which simulates the transition between saltwater and freshwater in the aquifer as a sharp interface. This approach allowed relatively fast model run times without ignoring the freshwater-saltwater system at the regional scale. Model construction (FloPy3), automated-parameter estimation (PEST), and analysis of results were streamlined using Python scripts. Model simulations included pre-development (1870) and current (average of 2001-10) scenarios for each island. Additionally, scenarios for future withdrawals and climate change were simulated for Oahu. We present our streamlined approach and preliminary results showing estimated effects of human activity on the groundwater resource by quantifying decline in water levels, reduction in stream base flow, and rise of the freshwater-saltwater interface.

  9. Off-design Performance Analysis of Multi-Stage Transonic Axial Compressors

    NASA Astrophysics Data System (ADS)

    Du, W. H.; Wu, H.; Zhang, L.

    Because of the complex flow fields and component interactions in modern gas turbine engines, extensive experiments are required to validate their performance and stability. This experimental process can become expensive and complex. Modeling and simulation of gas turbine engines are a way to reduce experimental costs while enhancing the quality of the essential experiments. The flow field of a transonic compressor contains all of the flow features that are difficult to predict: boundary-layer transition and separation, shock/boundary-layer interactions, and large flow unsteadiness. Accurate transonic axial compressor off-design performance prediction is especially difficult, due in large part to three-dimensional blade design and the resulting flow field. Although recent advancements in computer capacity have brought computational fluid dynamics to the forefront of turbomachinery design and analysis, the grid and turbulence model still limit Reynolds-averaged Navier-Stokes (RANS) approximations in the multi-stage transonic axial compressor flow field. Streamline curvature methods remain the dominant numerical approach for turbomachinery analysis and design, and it is generally accepted that streamline curvature solution techniques will provide satisfactory flow prediction as long as the losses, deviation and blockage are accurately modeled.
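The balance at the heart of streamline curvature methods can be illustrated by the simplest form of the radial equilibrium equation (written here without the meridional curvature, slope, and blade-force terms that full formulations add):

```latex
\frac{1}{\rho}\,\frac{\partial p}{\partial r} \;=\; \frac{V_\theta^{2}}{r}
```

That is, the radial pressure gradient balances the centripetal acceleration of the swirling flow; the losses, deviation and blockage mentioned above enter through the empirical correlations that close this balance.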

  10. Agent-based models in translational systems biology

    PubMed Central

    An, Gary; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram

    2013-01-01

    Effective translational methodologies for knowledge representation are needed in order to make strides against the constellation of diseases that affect the world today. These diseases are defined by their mechanistic complexity, redundancy, and nonlinearity. Translational systems biology aims to harness the power of computational simulation to streamline drug/device design, simulate clinical trials, and eventually to predict the effects of drugs on individuals. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for translational systems biology. This review describes agent-based modeling and gives examples of its translational applications in the context of acute inflammation and wound healing. PMID:20835989
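A minimal sketch of the agent-based paradigm described above, assuming a toy wound-healing rule in which "macrophage" agents clear damaged grid sites they walk onto; all parameters are illustrative, not from any published model:

```python
import random

random.seed(1)
SIZE = 20

# Toy wound field: 1 = damaged tissue, 0 = healthy.
grid = [[0] * SIZE for _ in range(SIZE)]
for r in range(8, 12):
    for c in range(8, 12):
        grid[r][c] = 1

# Agents do a damage-biased walk on a periodic grid: each moves to the
# neighboring site with the most damage (random tie-break) and clears it.
agents = [(9, 9)] + [(random.randrange(SIZE), random.randrange(SIZE))
                     for _ in range(14)]

def damage():
    return sum(map(sum, grid))

for _ in range(200):
    new_agents = []
    for (r, c) in agents:
        nbrs = [((r + dr) % SIZE, (c + dc) % SIZE)
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
        r, c = max(nbrs, key=lambda p: (grid[p[0]][p[1]], random.random()))
        grid[r][c] = 0            # clear damage at the new position
        new_agents.append((r, c))
    agents = new_agents
```

Emergent behavior (here, wound clearance) arises from simple local rules rather than from global equations, which is the property that makes the paradigm attractive for translational systems biology.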

  11. Toward Robust Estimation of the Components of Forest Population Change

    Treesearch

    Francis A. Roesch

    2014-01-01

    Multiple levels of simulation are used to test the robustness of estimators of the components of change. I first created a variety of spatial-temporal populations based on, but more variable than, an actual forest monitoring data set and then sampled those populations under a variety of sampling error structures. The performance of each of four estimation approaches is...

  12. Novel diffusion tensor imaging technique reveals developmental streamline volume changes in the corticospinal tract associated with leg motor control

    PubMed Central

    Kamson, David O.; Juhász, Csaba; Chugani, Harry T.; Jeong, Jeong-Won

    2014-01-01

    Background Diffusion tensor imaging (DTI) has expanded our knowledge of corticospinal tract (CST) anatomy and development. However, previous developmental DTI studies assessed the CST as a whole, overlooking potential differences in development of its components related to control of the upper and lower extremities. The present cross-sectional study investigated age-related changes, side and gender differences in streamline volume of the leg- and hand-related segments of the CST in children. Subjects and methods DTI data of 31 children (1–14 years; mean age: 6±4 years; 17 girls) with normal conventional MRI were analyzed. Leg- and hand-related CST streamline volumes were quantified separately, using a recently validated novel tractography approach. CST streamline volumes on both sides were compared between genders and correlated with age. Results Higher absolute streamline volumes were found in the left leg-related CST compared to the right (p=0.001) without a gender effect (p=0.4), whereas no differences were found in the absolute hand-related CST volumes (p>0.4). CST leg-related streamline volumes, normalized to hemispheric white matter volumes, declined with age in the right hemisphere only (R=−.51; p=0.004). Absolute leg-related CST streamline volumes showed similar, but slightly weaker correlations. Hand-related absolute or normalized CST streamline volumes showed no age-related variations on either side. Conclusion These results suggest differential development of CST segments controlling hand vs. leg movements. Asymmetric volume changes in the lower limb motor pathway may be secondary to gradually strengthening left hemispheric dominance and are consistent with previous data suggesting that footedness is a better predictor of hemispheric lateralization than handedness. PMID:25027193

  13. Novel diffusion tensor imaging technique reveals developmental streamline volume changes in the corticospinal tract associated with leg motor control.

    PubMed

    Kamson, David O; Juhász, Csaba; Chugani, Harry T; Jeong, Jeong-Won

    2015-04-01

    Diffusion tensor imaging (DTI) has expanded our knowledge of corticospinal tract (CST) anatomy and development. However, previous developmental DTI studies assessed the CST as a whole, overlooking potential differences in development of its components related to control of the upper and lower extremities. The present cross-sectional study investigated age-related changes, side and gender differences in streamline volume of the leg- and hand-related segments of the CST in children. DTI data of 31 children (1-14 years; mean age: 6±4 years; 17 girls) with normal conventional MRI were analyzed. Leg- and hand-related CST streamline volumes were quantified separately, using a recently validated novel tractography approach. CST streamline volumes on both sides were compared between genders and correlated with age. Higher absolute streamline volumes were found in the left leg-related CST compared to the right (p=0.001) without a gender effect (p=0.4), whereas no differences were found in the absolute hand-related CST volumes (p>0.4). CST leg-related streamline volumes, normalized to hemispheric white matter volumes, declined with age in the right hemisphere only (R=-.51; p=0.004). Absolute leg-related CST streamline volumes showed similar, but slightly weaker correlations. Hand-related absolute or normalized CST streamline volumes showed no age-related variations on either side. These results suggest differential development of CST segments controlling hand vs. leg movements. Asymmetric volume changes in the lower limb motor pathway may be secondary to gradually strengthening left hemispheric dominance and are consistent with previous data suggesting that footedness is a better predictor of hemispheric lateralization than handedness. Copyright © 2014 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  14. Optical ensemble analysis of intraocular lens performance through a simulated clinical trial with ZEMAX.

    PubMed

    Zhao, Huawei

    2009-01-01

    A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.
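The Monte Carlo ensemble idea can be sketched without ZEMAX using a thin-lens pseudophakic eye model: sample postoperative ocular parameters and compute the distribution of required IOL power. The means and standard deviations below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n_eyes = 10_000
n_aq = 1.336                      # aqueous/vitreous refractive index

# Postoperative parameters sampled around typical values (assumed for
# illustration): corneal power K (diopters), axial length L (m),
# effective lens position C (m).
K = rng.normal(43.5, 1.5, n_eyes)
L = rng.normal(23.5e-3, 0.4e-3, n_eyes)
C = rng.normal(5.0e-3, 0.3e-3, n_eyes)

# Thin-lens pseudophakic eye: vergence reaching the IOL plane for a
# distant object, and the IOL power that focuses it on the retina.
V_iol = n_aq / (n_aq / K - C)
P_req = n_aq / (L - C) - V_iol

print(P_req.mean(), P_req.std())
```

The ensemble spread of `P_req` plays the role of the simulated clinical-trial outcome distribution; a real OEA run would instead propagate each sampled eye through the full ZEMAX optical model.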

  15. Overview of the Helios Version 2.0 Computational Platform for Rotorcraft Simulations

    NASA Technical Reports Server (NTRS)

    Sankaran, Venkateswaran; Wissink, Andrew; Datta, Anubhav; Sitaraman, Jayanarayanan; Jayaraman, Buvna; Potsdam, Mark; Katz, Aaron; Kamkar, Sean; Roget, Beatrice; Mavriplis, Dimitri; et al.

    2011-01-01

    This article summarizes the capabilities and development of the Helios version 2.0 (Shasta) software for rotary-wing simulations. Specific capabilities enabled by Shasta include off-body adaptive mesh refinement and the ability to handle multiple interacting rotorcraft components such as the fuselage, rotors, flaps and stores. In addition, a new run-mode to handle maneuvering flight has been added. Fundamental changes to the Helios interfaces have been introduced to streamline the integration of these capabilities. Various modifications have also been carried out in the underlying modules for near-body solution, off-body solution, domain connectivity, rotor fluid-structure interface and comprehensive analysis to accommodate these interfaces and to enhance operational robustness and efficiency. Results are presented to demonstrate the mesh adaptation features of the software for the NACA0015 wing, the TRAM rotor in hover and the UH-60A in forward flight.

  16. JELC-LITE: Unconventional Instructional Design for Special Operations Training

    NASA Technical Reports Server (NTRS)

    Friedman, Mark

    2012-01-01

    Current special operations staff training is based on the Joint Event Life Cycle (JELC). It addresses operational level tasks in multi-week, live military exercises which are planned over a 12 to 18 month timeframe. As the military experiences changing global mission sets, shorter training events using distributed technologies will increasingly be needed to augment traditional training. JELC-Lite is a new approach for providing relevant training between large scale exercises. This new streamlined, responsive training model uses distributed and virtualized training technologies to establish simulated scenarios. It keeps proficiency levels closer to optimal levels -- thereby reducing the performance degradation inherent in periodic training. It can be delivered to military as well as under-reached interagency groups to facilitate agile, repetitive training events. JELC-Lite is described by four phases paralleling the JELC, differing mostly in scope and scale. It has been successfully used with a Theater Special Operations Command and fits well within the current environment of reduced personnel and financial resources.

  17. Simulation of Natural Convection Heat Transfer in an Inclined Square Cavity With Perfectly Conducting Side Walls Using Finite Difference Approach

    NASA Astrophysics Data System (ADS)

    Azwadi, C. S. Nor; Fairus, M. Y. Mohd

    2010-06-01

    This study concerns the numerical simulation of natural convection heat transfer inside an inclined square cavity with perfectly conducting boundary conditions on the side walls. The Navier-Stokes equations were solved using a finite difference approach on a uniform mesh. Three different inclination angles were applied, and the results are presented in terms of streamline and isotherm plots. Based on the fluid flow pattern and the behaviour of the isothermal lines, convection heat transfer increasingly dominates conduction as the tilt angle increases. To the best of our knowledge, this is the first simulation of natural convection inside an air-filled, tilted cavity with perfectly conducting side walls.

  18. Communication interval selection in distributed heterogeneous simulation of large-scale dynamical systems

    NASA Astrophysics Data System (ADS)

    Lucas, Charles E.; Walters, Eric A.; Jatskevich, Juri; Wasynczuk, Oleg; Lamm, Peter T.

    2003-09-01

    In this paper, a new technique useful for the numerical simulation of large-scale systems is presented. This approach enables the overall system simulation to be formed by the dynamic interconnection of the various interdependent simulations, each representing a specific component or subsystem such as control, electrical, mechanical, hydraulic, or thermal. Each simulation may be developed separately using possibly different commercial-off-the-shelf simulation programs thereby allowing the most suitable language or tool to be used based on the design/analysis needs. These subsystems communicate the required interface variables at specific time intervals. A discussion concerning the selection of appropriate communication intervals is presented herein. For the purpose of demonstration, this technique is applied to a detailed simulation of a representative aircraft power system, such as that found on the Joint Strike Fighter (JSF). This system is comprised of ten component models each developed using MATLAB/Simulink, EASY5, or ACSL. When the ten component simulations were distributed across just four personal computers (PCs), a greater than 15-fold improvement in simulation speed (compared to the single-computer implementation) was achieved.
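The communication-interval trade-off can be illustrated with two exactly solvable first-order subsystems coupled by zero-order-hold exchange of interface variables; halving the interval reduces the coupling error. The system and intervals below are illustrative assumptions, not the aircraft power system of the paper:

```python
import math

def cosimulate(H, T=5.0):
    """Jacobi-type co-simulation of dx/dt = -x + y, dy/dt = -y + x.

    Each subsystem is solved exactly over a communication interval while
    holding the other's output constant (zero-order hold); the interface
    variables are then exchanged and the next interval begins.
    """
    x, y, t = 1.0, 0.0, 0.0
    while t < T - 1e-12:
        u, v = y, x                       # interface values exchanged at t
        x = u + (x - u) * math.exp(-H)    # exact ZOH solution on [t, t+H]
        y = v + (y - v) * math.exp(-H)
        t += H
    return x

exact = (1.0 + math.exp(-2 * 5.0)) / 2.0   # monolithic solution at T = 5
err_coarse = abs(cosimulate(0.5) - exact)
err_fine = abs(cosimulate(0.05) - exact)
```

The ZOH coupling introduces an error per interval that shrinks with H, which is the trade-off the paper weighs against the communication overhead of shorter intervals.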

  19. U.S. Marine Corps Communication-Electronics School Training Process: Discrete-Event Simulation and Lean Options

    DTIC Science & Technology

    2007-12-01

    acknowledged that Continuous Improvement (CI), or Kaizen in Japanese, is practiced in some way, shape, or form by most if not all Fortune 500 companies...greater resistance in the individualistic U.S. culture. Kaizen generally involves methodical examination and testing, followed by the adoption of new...or streamlined procedures, including scrupulous measurement and changes based on statistical deviation formulas. Kaizen appears to be a perfect fit

  20. The Teamwork Assessment Scale: A Novel Instrument to Assess Quality of Undergraduate Medical Students' Teamwork Using the Example of Simulation-based Ward-Rounds.

    PubMed

    Kiesewetter, Jan; Fischer, Martin R

    2015-01-01

    Simulation-based teamwork trainings are considered a powerful training method to advance teamwork, which becomes more relevant in medical education. The measurement of teamwork is of high importance and several instruments have been developed for various medical domains to meet this need. To our knowledge, no theoretically-based and easy-to-use measurement instrument has been published nor developed specifically for simulation-based teamwork trainings of medical students. Internist ward-rounds function as an important example of teamwork in medicine. The purpose of this study was to provide a validated, theoretically-based instrument that is easy-to-use. Furthermore, this study aimed to identify if and when rater scores relate to performance. Based on a theoretical framework for teamwork behaviour, items regarding four teamwork components (Team Coordination, Team Cooperation, Information Exchange, Team Adjustment Behaviours) were developed. In study one, three ward-round scenarios, simulated by 69 students, were videotaped and rated independently by four trained raters. The instrument was tested for the embedded psychometric properties and factorial structure. In study two, the instrument was tested for construct validity with an external criterion with a second set of 100 students and four raters. In study one, the factorial structure matched the theoretical components but was unable to separate Information Exchange and Team Cooperation. The preliminary version showed adequate psychometric properties (Cronbach's α=.75). In study two, the instrument showed physician rater scores were more reliable in measurement than those of student raters. Furthermore, a close correlation between the scale and clinical performance as an external criteria was shown (r=.64) and the sufficient psychometric properties were replicated (Cronbach's α=.78). 
The validation allows for use of the simulated teamwork assessment scale in undergraduate medical ward-round trainings to reliably measure teamwork by physicians. Further studies are needed to verify the applicability of the instrument.

  1. The Teamwork Assessment Scale: A Novel Instrument to Assess Quality of Undergraduate Medical Students' Teamwork Using the Example of Simulation-based Ward-Rounds

    PubMed Central

    Kiesewetter, Jan; Fischer, Martin R.

    2015-01-01

    Background: Simulation-based teamwork trainings are considered a powerful training method to advance teamwork, which becomes more relevant in medical education. The measurement of teamwork is of high importance and several instruments have been developed for various medical domains to meet this need. To our knowledge, no theoretically-based and easy-to-use measurement instrument has been published nor developed specifically for simulation-based teamwork trainings of medical students. Internist ward-rounds function as an important example of teamwork in medicine. Purposes: The purpose of this study was to provide a validated, theoretically-based instrument that is easy-to-use. Furthermore, this study aimed to identify if and when rater scores relate to performance. Methods: Based on a theoretical framework for teamwork behaviour, items regarding four teamwork components (Team Coordination, Team Cooperation, Information Exchange, Team Adjustment Behaviours) were developed. In study one, three ward-round scenarios, simulated by 69 students, were videotaped and rated independently by four trained raters. The instrument was tested for the embedded psychometric properties and factorial structure. In study two, the instrument was tested for construct validity with an external criterion with a second set of 100 students and four raters. Results: In study one, the factorial structure matched the theoretical components but was unable to separate Information Exchange and Team Cooperation. The preliminary version showed adequate psychometric properties (Cronbach’s α=.75). In study two, the instrument showed physician rater scores were more reliable in measurement than those of student raters. Furthermore, a close correlation between the scale and clinical performance as an external criteria was shown (r=.64) and the sufficient psychometric properties were replicated (Cronbach’s α=.78). 
Conclusions: The validation allows for use of the simulated teamwork assessment scale in undergraduate medical ward-round trainings to reliably measure teamwork by physicians. Further studies are needed to verify the applicability of the instrument. PMID:26038684

  2. The impact of groundwater velocity fields on streamlines in an aquifer system with a discontinuous aquitard (Inner Mongolia, China)

    NASA Astrophysics Data System (ADS)

    Wu, Qiang; Zhao, Yingwang; Xu, Hua

    2018-04-01

    Many numerical methods that simulate groundwater flow, particularly the continuous Galerkin finite element method, do not produce velocity information directly. Many algorithms have been proposed to improve the accuracy of velocity fields computed from hydraulic potentials. The differences in the streamlines generated from velocity fields obtained using different algorithms are presented in this report. The superconvergence method employed by FEFLOW, a popular commercial code, and some dual-mesh methods proposed in recent years are selected for comparison. Streamlines are used to depict hydrogeologic conditions, and errors in the streamlines are shown to lead to notable errors in boundary conditions, the locations of material interfaces, fluxes and conductivities. Furthermore, the effects of the procedures used in these two types of methods, including velocity integration and local conservation, are analyzed. The method of interpolating velocities across edges using fluxes is shown to be able to eliminate errors associated with refraction points that are not located along material interfaces and streamlines that end at no-flow boundaries. Local conservation is shown to be a crucial property of velocity fields and can result in more accurate streamline densities. A case study involving both three-dimensional and two-dimensional cross-sectional models of a coal mine in Inner Mongolia, China, is used to support the conclusions presented.
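Local conservation can be checked directly by integrating the normal velocity around a cell boundary. A minimal sketch with analytic toy fields (an assumption for illustration, not FEFLOW output): a divergence-free field has zero net flux through any cell, while a non-conservative one does not.

```python
import numpy as np

def net_flux(vel, x0, y0, h, n=200):
    """Net outward flux of a 2-D velocity field across the square cell
    [x0, x0+h] x [y0, y0+h], by midpoint quadrature on each face."""
    s = (np.arange(n) + 0.5) * h / n           # midpoints along each face
    w = h / n                                  # quadrature weight
    u_r, _ = vel(np.full(n, x0 + h), y0 + s)   # right face, +x normal
    u_l, _ = vel(np.full(n, x0), y0 + s)       # left face, -x normal
    _, v_t = vel(x0 + s, np.full(n, y0 + h))   # top face, +y normal
    _, v_b = vel(x0 + s, np.full(n, y0))       # bottom face, -y normal
    return w * (u_r.sum() - u_l.sum() + v_t.sum() - v_b.sum())

div_free = lambda x, y: (y, x)   # divergence = 0: mass-conservative
leaky = lambda x, y: (x, y)      # divergence = 2: violates conservation
```

By the divergence theorem the net flux equals the cell-integrated divergence, so `net_flux` is exactly the per-cell conservation defect that, per the abstract, degrades streamline densities when nonzero.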

  3. Investigation of smoothness-increasing accuracy-conserving filters for improving streamline integration through discontinuous fields.

    PubMed

    Steffen, Michael; Curtis, Sean; Kirby, Robert M; Ryan, Jennifer K

    2008-01-01

    Streamline integration of fields produced by computational fluid mechanics simulations is a commonly used tool for the investigation and analysis of fluid flow phenomena. Integration is often accomplished through the application of ordinary differential equation (ODE) integrators--integrators whose error characteristics are predicated on the smoothness of the field through which the streamline is being integrated--smoothness which is not available at the inter-element level of finite volume and finite element data. Adaptive error control techniques are often used to ameliorate the challenge posed by inter-element discontinuities. As the root of the difficulties is the discontinuous nature of the data, we present a complementary approach of applying smoothness-enhancing accuracy-conserving filters to the data prior to streamline integration. We investigate whether such an approach applied to uniform quadrilateral discontinuous Galerkin (high-order finite volume) data can be used to augment current adaptive error control approaches. We discuss and demonstrate through numerical example the computational trade-offs exhibited when one applies such a strategy.
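A toy stand-in for the filtering idea (a simple moving average is assumed here rather than the actual smoothness-increasing accuracy-conserving B-spline kernel): smoothing piecewise-constant element data shrinks the inter-element jumps that force adaptive ODE integrators to cut their step size.

```python
import numpy as np

# Piecewise-constant ("DG p=0") samples of a smooth velocity: the field
# is constant on each element, so an integrator crossing an element
# interface sees an O(h) jump.
n_elem, pts_per_elem = 20, 10
x = np.linspace(0, 2 * np.pi, n_elem * pts_per_elem, endpoint=False)
elem_mid = (np.floor(x / (2 * np.pi / n_elem)) + 0.5) * (2 * np.pi / n_elem)
v_dg = np.sin(elem_mid)                      # discontinuous at interfaces

# Stand-in filter: one-element-wide moving average with periodic padding.
kernel = np.ones(pts_per_elem) / pts_per_elem
v_filt = np.convolve(np.r_[v_dg[-pts_per_elem:], v_dg, v_dg[:pts_per_elem]],
                     kernel, mode="same")[pts_per_elem:-pts_per_elem]

jump_raw = np.abs(np.diff(v_dg)).max()       # largest inter-sample jump
jump_filt = np.abs(np.diff(v_filt)).max()
```

The actual SIAC filter is designed to preserve (indeed raise) the accuracy order while smoothing; the moving average above only demonstrates the jump reduction that eases streamline integration.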

  4. A systems engineering analysis of three-point and four-point wind turbine drivetrain configurations

    DOE PAGES

    Guo, Yi; Parsons, Tyler; Dykes, Katherine; ...

    2016-08-24

    This study compares the impact of drivetrain configuration on the mass and capital cost of a series of wind turbines ranging from 1.5 MW to 5.0 MW power ratings for both land-based and offshore applications. The analysis is performed with a new physics-based drivetrain analysis and sizing tool, Drive Systems Engineering (DriveSE), which is part of the Wind-Plant Integrated System Design & Engineering Model. DriveSE uses physics-based relationships to size all major drivetrain components according to given rotor loads simulated based on International Electrotechnical Commission design load cases. The model's sensitivity to input loads that contain a high degree of variability was analyzed. Aeroelastic simulations are used to calculate the rotor forces and moments imposed on the drivetrain for each turbine design. DriveSE is then used to size all of the major drivetrain components for each turbine for both three-point and four-point configurations. The simulation results quantify the trade-offs in mass and component costs for the different configurations. On average, a 16.7% decrease in total nacelle mass can be achieved when using a three-point drivetrain configuration, resulting in a 3.5% reduction in turbine capital cost. This analysis is driven by extreme loads and does not consider fatigue. Thus, the effects of configuration choices on reliability and serviceability are not captured. Furthermore, a first-order estimate of the sizing, dimensioning and costing of major drivetrain components is made, which can be used in larger system studies that consider trade-offs between subsystems such as the rotor, drivetrain and tower.

  5. A systems engineering analysis of three-point and four-point wind turbine drivetrain configurations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Yi; Parsons, Tyler; Dykes, Katherine

    This study compares the impact of drivetrain configuration on the mass and capital cost of a series of wind turbines ranging from 1.5 MW to 5.0 MW power ratings for both land-based and offshore applications. The analysis is performed with a new physics-based drivetrain analysis and sizing tool, Drive Systems Engineering (DriveSE), which is part of the Wind-Plant Integrated System Design & Engineering Model. DriveSE uses physics-based relationships to size all major drivetrain components according to given rotor loads simulated based on International Electrotechnical Commission design load cases. The model's sensitivity to input loads that contain a high degree of variability was analyzed. Aeroelastic simulations are used to calculate the rotor forces and moments imposed on the drivetrain for each turbine design. DriveSE is then used to size all of the major drivetrain components for each turbine for both three-point and four-point configurations. The simulation results quantify the trade-offs in mass and component costs for the different configurations. On average, a 16.7% decrease in total nacelle mass can be achieved when using a three-point drivetrain configuration, resulting in a 3.5% reduction in turbine capital cost. This analysis is driven by extreme loads and does not consider fatigue. Thus, the effects of configuration choices on reliability and serviceability are not captured. Furthermore, a first-order estimate of the sizing, dimensioning and costing of major drivetrain components is made, which can be used in larger system studies that consider trade-offs between subsystems such as the rotor, drivetrain and tower.

  6. New Insights into the Folding of a β-Sheet Miniprotein in a Reduced Space of Collective Hydrogen Bond Variables: Application to a Hydrodynamic Analysis of the Folding Flow

    PubMed Central

    Kalgin, Igor V.; Caflisch, Amedeo; Chekmarev, Sergei F.; Karplus, Martin

    2013-01-01

    A new analysis of the 20 μs equilibrium folding/unfolding molecular dynamics simulations of the three-stranded antiparallel β-sheet miniprotein (beta3s) in implicit solvent is presented. The conformation space is reduced in dimensionality by introduction of linear combinations of hydrogen bond distances as the collective variables making use of a specially adapted Principal Component Analysis (PCA); i.e., to make structured conformations more pronounced, only the formed bonds are included in determining the principal components. It is shown that a three-dimensional (3D) subspace gives a meaningful representation of the folding behavior. The first component, to which eight native hydrogen bonds make the major contribution (four in each beta hairpin), is found to play the role of the reaction coordinate for the overall folding process, while the second and third components distinguish the structured conformations. The representative points of the trajectory in the 3D space are grouped into conformational clusters that correspond to locally stable conformations of beta3s identified in earlier work. A simplified kinetic network based on the three components is constructed and it is complemented by a hydrodynamic analysis. The latter, making use of “passive tracers” in 3D space, indicates that the folding flow is much more complex than suggested by the kinetic network. A 2D representation of streamlines shows there are vortices which correspond to repeated local rearrangement, not only around minima of the free energy surface, but also in flat regions between minima. The vortices revealed by the hydrodynamic analysis are apparently not evident in folding pathways generated by transition-path sampling. Making use of the fact that the values of the collective hydrogen bond variables are linearly related to the Cartesian coordinate space, the RMSD between clusters is determined. 
Interestingly, the transition rates show an approximate exponential correlation with distance in the hydrogen bond subspace. Comparison with the many published studies shows good agreement with the present analysis for the parts that can be compared, supporting the robust character of our understanding of this “hydrogen atom” of protein folding. PMID:23621790
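The bond-filtered PCA step described above can be sketched as follows. This is a minimal illustration with numpy only; the distance cutoff, the toy trajectory, and the feature zeroing rule are assumptions for demonstration, not the values or exact procedure used in the paper:

```python
import numpy as np

def hbond_pca(distances, formed_cutoff=2.5, n_components=3):
    """Reduce hydrogen-bond distance trajectories to a few collective variables.

    distances: (n_frames, n_bonds) array of donor-acceptor distances.
    Only *formed* bonds (distance below a cutoff) contribute; broken bonds
    are zeroed so that structured conformations dominate the principal
    components. The cutoff here is an illustrative assumption.
    """
    X = np.where(distances < formed_cutoff, distances, 0.0)
    X = X - X.mean(axis=0)                      # center each bond coordinate
    _, s, Vt = np.linalg.svd(X, full_matrices=False)
    components = Vt[:n_components]              # principal axes in bond space
    projection = X @ components.T               # frames in the reduced 3D space
    explained = (s**2) / np.sum(s**2)
    return projection, components, explained[:n_components]

# toy trajectory: 100 frames, 12 hydrogen bonds
rng = np.random.default_rng(0)
traj = 2.0 + rng.random((100, 12))
proj, comps, var = hbond_pca(traj)
print(proj.shape, comps.shape)
```

Because the collective variables are linear in the bond distances, distances between cluster centers in this reduced space relate directly back to the original coordinates, which is what makes the RMSD comparison above possible.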

  7. Smart management of sample dilution using an artificial neural network to achieve streamlined processes and saving resources: the automated nephelometric testing of serum free light chain as case study.

    PubMed

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

Saving resources is a paramount issue for the modern laboratory, and new trainable, smart technologies can allow automated instrumentation to manage samples more efficiently in order to achieve streamlined processes. In this regard, serum free light chain (sFLC) testing represents an interesting challenge, as it usually requires several assays before an acceptable result within the analytical range is achieved. An artificial neural network based on the multi-layer perceptron (MLP-ANN) was used to infer the starting dilution status of sFLC samples based on the information available through the laboratory information system (LIS). After the learning phase, the MLP-ANN simulation was applied to the nephelometric testing routinely performed in our laboratory on a BN ProSpec® System analyzer (Siemens Healthcare) using the N Latex FLC kit. The MLP-ANN reduced the serum kappa free light chain (κ-FLC) and serum lambda free light chain (λ-FLC) wasted tests by 69.4% and 70.8% with respect to the naïve stepwise dilution scheme used by the automated analyzer, and by 64.9% and 66.9% compared to a "rational" dilution scheme based on a 4-step dilution. Although restricted to follow-up samples, the MLP-ANN showed good predictive performance, which, alongside the possibility of implementing it in any automated system, makes it a suitable solution for achieving streamlined laboratory processes and saving resources.
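The idea of training a multi-layer perceptron to predict a starting dilution class from LIS data can be sketched with a minimal numpy MLP. Everything here is assumed for illustration: the single input feature, the three dilution classes, and the threshold rule generating the labels stand in for the historical LIS data the study actually learned from:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical LIS-derived feature: each sample's previous sFLC result on a
# normalised scale. The "correct" starting dilution class (0, 1 or 2) is
# assumed here to follow simple thresholds; the study's real decision rule
# is learned from historical data and is not reproduced by this sketch.
raw = rng.random((300, 1))
y = np.digitize(raw[:, 0], [0.33, 0.66])         # dilution class labels 0..2
x = (raw - raw.mean()) / raw.std()               # standardised input feature
T = np.eye(3)[y]                                 # one-hot targets

# Minimal multi-layer perceptron: 1 input -> 16 tanh units -> 3 softmax outputs
W1 = rng.normal(0.0, 1.0, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 1.0, (16, 3)); b2 = np.zeros(3)

def forward(inp):
    h = np.tanh(inp @ W1 + b1)
    z = h @ W2 + b2
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)   # hidden layer, softmax output

lr = 0.5
for _ in range(3000):                            # full-batch gradient descent
    h, p = forward(x)
    dz = (p - T) / len(x)                        # softmax + cross-entropy gradient
    dW2, db2 = h.T @ dz, dz.sum(axis=0)
    dh = (dz @ W2.T) * (1.0 - h**2)              # back-propagate through tanh
    dW1, db1 = x.T @ dh, dh.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = forward(x)[1].argmax(axis=1)
print(f"training accuracy: {(pred == y).mean():.2f}")
```

A correctly predicted starting dilution skips the wasted intermediate assays of a stepwise scheme, which is the source of the savings reported above.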

  8. Status of the Flooding Fragility Testing Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, C. L.; Savage, B.; Bhandari, B.

    2016-06-01

This report provides an update on research addressing nuclear power plant component reliability under flooding conditions. The research includes use of the Component Flooding Evaluation Laboratory (CFEL), where individual components and component subassemblies will be tested to failure under various flooding conditions. The resulting component reliability data can then be incorporated with risk simulation strategies to provide a more thorough representation of overall plant risk. The CFEL development strategy consists of four interleaved phases. Phase 1 addresses design and application of CFEL with water rise and water spray capabilities, allowing testing of passive and active components, including fully electrified components. Phase 2 addresses research into wave generation techniques, followed by the design and addition of the wave generation capability to CFEL. Phase 3 addresses methodology development activities, including small scale component testing, development of full scale component testing protocol, and simulation techniques including Smoothed Particle Hydrodynamics (SPH) based computer codes. Phase 4 involves full scale component testing, including work in a surrogate CFEL testing apparatus.

  9. On 3-D inelastic analysis methods for hot section components (base program)

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Bak, M. J.; Nakazawa, S.; Banerjee, P. K.

    1986-01-01

A 3-D Inelastic Analysis Method program is described. This program consists of a series of new computer codes embodying a progression of mathematical models (mechanics of materials, special finite element, boundary element) for streamlined analysis of: (1) combustor liners, (2) turbine blades, and (3) turbine vanes. These models address the effects of high temperatures and thermal/mechanical loadings on the local (stress/strain) and global (dynamics, buckling) structural behavior of the three selected components. Three computer codes, referred to as MOMM (Mechanics of Materials Model), MHOST (Marc-Hot Section Technology), and BEST (Boundary Element Stress Technology), have been developed and are briefly described in this report.

  10. Low-Reynolds Number Aerodynamics of an 8.9 Percent Scale Semispan Swept Wing for Assessment of Icing Effects

    NASA Technical Reports Server (NTRS)

    Broeren, Andy P.; Woodard, Brian S.; Diebold, Jeffrey M.; Moens, Frederic

    2017-01-01

Aerodynamic assessment of icing effects on swept wings is an important component of a larger effort to improve three-dimensional icing simulation capabilities. An understanding of ice-shape geometric fidelity and Reynolds and Mach number effects on iced-wing aerodynamics is needed to guide the development and validation of ice-accretion simulation tools. To this end, wind-tunnel testing and computational flow simulations were carried out for an 8.9%-scale semispan wing based upon the Common Research Model airplane configuration. The wind-tunnel testing was conducted at the Wichita State University 7 ft x 10 ft Beech wind tunnel at Reynolds numbers of 0.8×10^6 to 2.4×10^6 and corresponding Mach numbers of 0.09 to 0.27. This paper presents the results of initial studies investigating the model mounting configuration, clean-wing aerodynamics and effects of artificial ice roughness. Four different model mounting configurations were considered, and a circular splitter plate combined with a streamlined shroud was selected as the baseline geometry for the remainder of the experiments and computational simulations. A detailed study of the clean-wing aerodynamics and stall characteristics was made. In all cases, the flow over the outboard sections of the wing separated as the wing stalled, with the inboard sections near the root maintaining attached flow. Computational flow simulations were carried out with the ONERA elsA software, which solves the compressible, three-dimensional RANS equations. The computations were carried out in either fully turbulent mode or with natural transition. Better agreement between the experimental and computational results was obtained with free transition than with fully turbulent solutions. 
These results indicate that the experimentally observed evolution of the clean-wing performance coefficients was due to the effect of three-dimensional transition location, and that this must be taken into account in future data analysis. This research also confirmed that artificial ice roughness created with rapid-prototype manufacturing methods can generate aerodynamic performance effects comparable to grit roughness of equivalent size when proper care is exercised in design and installation. The conclusions of this combined experimental and computational study contributed directly to the successful implementation of follow-on test campaigns with numerous artificial ice-shape configurations for this 8.9%-scale model.

  12. Multi-sources data fusion framework for remote triage prioritization in telehealth.

    PubMed

    Salman, O H; Rasid, M F A; Saripan, M I; Subramaniam, S K

    2014-09-01

The healthcare industry is streamlining processes to offer more timely and effective services to all patients. Computerized software algorithms and smart devices can streamline the relation between users and doctors by providing more services inside healthcare telemonitoring systems. This paper proposes a multi-source framework to support advanced healthcare applications. The proposed framework, named Multi Sources Healthcare Architecture (MSHA), considers multiple sources: sensors (ECG, SpO2 and blood pressure) and text-based inputs from wireless and pervasive devices of a Wireless Body Area Network. The proposed framework is used to improve healthcare scalability efficiency by enhancing the remote triage and remote prioritization processes for patients. The framework is also used to provide intelligent services over telemonitoring healthcare systems by using a data fusion method and a prioritization technique. As the telemonitoring system consists of three tiers (sensors/sources, base station and server), the simulation of the MSHA algorithm in the base station is demonstrated in this paper. Achieving a high level of accuracy in remotely triaging and prioritizing patients is our main goal, and the role of multi-source data fusion in telemonitoring healthcare services is demonstrated. In addition, we discuss how the proposed framework can be applied in a healthcare telemonitoring scenario. Simulation results for different symptoms, related to different emergency levels of chronic heart disease, demonstrate the superiority of our algorithm over conventional algorithms in remotely classifying and prioritizing patients.
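Fusing several vital-sign sources into a single emergency level, as the framework above does at the base station, can be sketched with a simple rule-based score. The thresholds and weights below are illustrative textbook values, not the MSHA fusion rules:

```python
def triage_priority(hr, spo2, sys_bp):
    """Fuse three vital-sign sources (heart rate, SpO2, systolic blood
    pressure) into a coarse emergency level from 0 (stable) to 4 (critical).
    Thresholds are illustrative, not the paper's."""
    score = 0
    if hr > 120 or hr < 50:
        score += 2
    elif hr > 100:
        score += 1
    if spo2 < 90:
        score += 2
    elif spo2 < 94:
        score += 1
    if sys_bp < 90 or sys_bp > 180:
        score += 2
    return min(score, 4)          # cap at the highest emergency level

# rank patients so the base station forwards the most urgent case first
patients = {"A": (75, 98, 120), "B": (130, 88, 85), "C": (105, 93, 140)}
ranked = sorted(patients, key=lambda p: triage_priority(*patients[p]), reverse=True)
print(ranked)  # most urgent first
```

Sorting by the fused score is exactly the remote prioritization step: the patient with several abnormal sources outranks one with a single mild abnormality.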

  13. CFD Prediction on the Pressure Distribution and Streamlines around an Isolated Single-Storey House Considering the Effect of Topographic Characteristics

    NASA Astrophysics Data System (ADS)

    Abdullah, J.; Zaini, S. S.; Aziz, M. S. A.; Majid, T. A.; Deraman, S. N. C.; Yahya, W. N. W.

    2018-04-01

Single-storey houses are classified as low-rise buildings and are vulnerable to damage during windstorm events. This study was carried out to investigate the pressure distribution and streamlines around an isolated house, considering the effect of terrain characteristics. Topographic features such as flat, depression, ridge and valley terrain are considered. The simulations were performed with the Ansys FLUENT 14.0 software package. The results showed that the topographic characteristics influence the pressure coefficients and streamlines, especially when the house was located on ridge terrain. The findings strongly suggest that wind analysis should include topographic features in order to establish the true wind force exerted on a structure.

  14. Simulation of a navigator algorithm for a low-cost GPS receiver

    NASA Technical Reports Server (NTRS)

    Hodge, W. F.

    1980-01-01

The analytical structure of an existing navigator algorithm for a low-cost Global Positioning System receiver is described in detail to facilitate its implementation on in-house digital computers and real-time simulators. The material presented includes a simulation of GPS pseudorange measurements, based on a two-body representation of the NAVSTAR spacecraft orbits, and a four-component model of the receiver bias errors. A simpler test for loss of pseudorange measurements due to spacecraft shielding is also noted.
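A pseudorange simulation of the kind described above reduces to geometric range plus clock-error terms. The sketch below uses a simple two-term clock model (bias + drift) as a stand-in for the report's four-component receiver bias model, whose exact terms are not given here; the satellite and receiver positions are illustrative:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def pseudorange(sat_pos, rcv_pos, clock_bias_s, drift_s_per_s, t):
    """Simulated pseudorange: geometric range plus receiver clock error.

    A two-term clock model (bias + drift * t) stands in for the report's
    four-component receiver bias model."""
    geometric = np.linalg.norm(np.asarray(sat_pos, float) - np.asarray(rcv_pos, float))
    return geometric + C * (clock_bias_s + drift_s_per_s * t)

sat = (15_600e3, 7_540e3, 20_140e3)   # example NAVSTAR-like ECEF position, m
rcv = (1_917e3, 6_029e3, -801e3)      # example receiver position, m
rho = pseudorange(sat, rcv, clock_bias_s=1e-6, drift_s_per_s=1e-9, t=10.0)
print(f"{rho:.1f} m")
```

Note how even a microsecond of clock bias contributes roughly 300 m to the pseudorange, which is why the receiver bias model matters to the navigator.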

  15. The use of laterally vectored thrust to counter thrust asymmetry in a tactical jet aircraft

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A nonlinear, six degree-of-freedom flight simulator for a twin engine tactical jet was built on a hybrid computer to investigate lateral vectoring of the remaining thrust component for the case of a single engine failure at low dynamic pressures. Aircraft control was provided by an automatic controller rather than a pilot, and thrust vector control was provided by an open-loop controller that deflected a vane (located on the periphery of each exhaust jet and normally streamlined for noninterference with the flow). Lateral thrust vectoring decreased peak values of lateral control deflections, eliminated the requirement for steady-state lateral aerodynamic control deflections, and decreased the amount of altitude lost for a single engine failure.

  16. Motorcycle Drag Reduction using a Streamlined Object Ahead of the Rider

    NASA Astrophysics Data System (ADS)

    Selvamuthu, Thirukumaran; Thangadurai, Murugan

    2018-05-01

Aerodynamic design of various components plays a significant role in reducing the overall drag of a vehicle and improving its fuel efficiency. In the present study, the effects of a semi-ellipsoidal structure placed ahead of a rider on the HONDA CBR 600 RR bike were studied in detail for Reynolds numbers varying from 1.24 to 3.72 million. Three-dimensional numerical simulations were performed by solving the Reynolds-averaged Navier-Stokes equations with the SST k-ω turbulence model. The numerical results were validated against wind tunnel tests performed on a 1:12 scale model using an external pyramidal balance. It was observed that the wake pattern behind the vehicle and the pressure and velocity distributions over the vehicle were modified remarkably by the inclusion of the semi-ellipsoidal structure compared to the model with the rider alone. The drag coefficient of the bike increased by about 16% when a dummy rider was placed on the vehicle. However, it decreased substantially, reaching close to the base model value, when the semi-ellipsoidal structure was placed ahead of the rider. Further, the semi-ellipsoidal structure produced a negative lift, which improves traction on the road compared to the base model.
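The drag-coefficient comparison above uses the standard nondimensionalisation; a minimal sketch, with illustrative numbers rather than the paper's measured values:

```python
def drag_coefficient(drag_n, rho, v, frontal_area):
    """Cd = D / (0.5 * rho * v^2 * A): the standard definition used to
    compare the base bike, bike + rider, and bike + fairing configurations.
    drag_n in newtons, rho in kg/m^3, v in m/s, frontal_area in m^2."""
    return drag_n / (0.5 * rho * v**2 * frontal_area)

# illustrative numbers only (not from the wind-tunnel tests above)
cd_base = drag_coefficient(drag_n=180.0, rho=1.225, v=30.0, frontal_area=0.65)
cd_rider = cd_base * 1.16        # the reported ~16% rise with a dummy rider
print(round(cd_base, 3), round(cd_rider, 3))
```

Because Cd scales out dynamic pressure and frontal area, it is what lets the 1:12 tunnel model be compared against the full-scale CFD results.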

  17. ``Magical'' fluid pathways: inspired airflow corridors for optimal drug delivery to human sinuses

    NASA Astrophysics Data System (ADS)

    Basu, Saikat; Farzal, Zainab; Kimbell, Julia S.

    2017-11-01

    Topical delivery methods like nasal sprays are an important therapeutic component for sinusitis (inflammation and clogging of the paranasal sinuses). The sinuses are air-filled sacs, identified as: maxillaries (under the eyes and deep to cheeks bilaterally; largest in volume), frontals (above and medial to the eyes, behind forehead area), ethmoids (between the eyes, inferior to the frontal sinuses), and sphenoids (superior and posterior to ethmoids). We develop anatomic CT-based 3D reconstructions of the human nasal cavity for multiple subjects. Through CFD simulations on Fluent for measured breathing rates, we track inspiratory airflow in all the models and the corresponding sprayed drug transport (for a commercially available sprayer, with experimentally tested particle size distributions). The protocol is implemented for a wide array of spray release points. We make the striking observation that the same release points in each subject provide better particle deposition in all the sinuses, despite the sinuses being located at different portions of the nasal cavity. This leads to the conjecture that the complicated anatomy-based flow physics artifacts in the nasal canal generate certain ``magical'' streamlines, providing passage for improved drug transport to all sinus targets. Supported by NIH Grant R01 HL122154.

  18. Migration of finite sized particles in a laminar square channel flow from low to high Reynolds numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbas, M., E-mail: micheline.abbas@ensiacet.fr; CNRS, Fédération de recherche FERMaT, CNRS, 31400, Toulouse; Magaud, P.

    2014-12-15

The migration of neutrally buoyant finite sized particles in a Newtonian square channel flow is investigated in the limit of very low solid volumetric concentration, within a wide range of channel Reynolds numbers Re = [0.07-120]. In situ microscope measurements of particle distributions, taken far from the channel inlet (at a distance several thousand times the channel height), revealed that particles are preferentially located near the channel walls at Re > 10 and near the channel center at Re < 1. Whereas the cross-streamline particle motion is governed by inertia-induced lift forces at high inertia, it seems to be controlled by shear-induced particle interactions at low (but finite) Reynolds numbers, despite the low solid volume fraction (<1%). The transition between both regimes is observed in the range Re = [1-10]. In order to exclude the effect of multi-body interactions, the trajectories of single freely moving particles are calculated by means of numerical simulations based on the force coupling method. With the deployed numerical tool, the complete particle trajectories are accessible within a reasonable computational time only in the inertial regime (Re > 10). In this regime, we show that (i) the particle undergoes cross-streamline migration followed by a cross-lateral migration (parallel to the wall) in agreement with previous observations, and (ii) the stable equilibrium positions are located at the midline of the channel faces while the diagonal equilibrium positions are unstable. At low flow inertia, the first instants of the numerical simulations (carried out at Re = O(1)) reveal that the cross-streamline migration of a single particle is oriented towards the channel wall, suggesting that the particle preferential positions around the channel center, observed in the experiments, are rather due to multi-body interactions.

  19. Assessing flow paths in a karst aquifer based on multiple dye tracing tests using stochastic simulation and the MODFLOW-CFP code

    NASA Astrophysics Data System (ADS)

    Assari, Amin; Mohammadi, Zargham

    2017-09-01

    Karst systems show high spatial variability of hydraulic parameters over small distances and this makes their modeling a difficult task with several uncertainties. Interconnections of fractures have a major role on the transport of groundwater, but many of the stochastic methods in use do not have the capability to reproduce these complex structures. A methodology is presented for the quantification of tortuosity using the single normal equation simulation (SNESIM) algorithm and a groundwater flow model. A training image was produced based on the statistical parameters of fractures and then used in the simulation process. The SNESIM algorithm was used to generate 75 realizations of the four classes of fractures in a karst aquifer in Iran. The results from six dye tracing tests were used to assign hydraulic conductivity values to each class of fractures. In the next step, the MODFLOW-CFP and MODPATH codes were consecutively implemented to compute the groundwater flow paths. The 9,000 flow paths obtained from the MODPATH code were further analyzed to calculate the tortuosity factor. Finally, the hydraulic conductivity values calculated from the dye tracing experiments were refined using the actual flow paths of groundwater. The key outcomes of this research are: (1) a methodology for the quantification of tortuosity; (2) hydraulic conductivities, that are incorrectly estimated (biased low) with empirical equations that assume Darcian (laminar) flow with parallel rather than tortuous streamlines; and (3) an understanding of the scale-dependence and non-normal distributions of tortuosity.
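The tortuosity quantification at the heart of the methodology above reduces, per flow path, to the ratio of along-path length to straight-line (chord) distance. A minimal sketch over MODPATH-style particle positions, with a toy path in place of the 9,000 simulated ones:

```python
import numpy as np

def tortuosity(path_xyz):
    """Tortuosity = along-path length / straight-line distance between the
    path endpoints. path_xyz: (n, 3) array of particle positions, e.g. one
    flow path exported from MODPATH."""
    p = np.asarray(path_xyz, dtype=float)
    steps = np.linalg.norm(np.diff(p, axis=0), axis=1)  # segment lengths
    chord = np.linalg.norm(p[-1] - p[0])                # end-to-end distance
    return steps.sum() / chord

# a zig-zag path is longer than its end-to-end chord, so tortuosity > 1
path = [(0, 0, 0), (1, 1, 0), (2, 0, 0), (3, 1, 0), (4, 0, 0)]
print(round(tortuosity(path), 3))
```

This is also why conductivities from empirical Darcian equations come out biased low, as noted above: they implicitly assume tortuosity of 1 (parallel streamlines), while real karst flow paths are longer than their chords.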

  20. Development of an Integrated Nozzle for a Symmetric, RBCC Launch Vehicle Configuration

    NASA Technical Reports Server (NTRS)

    Smith, Timothy D.; Canabal, Francisco, III; Rice, Tharen; Blaha, Bernard

    2000-01-01

The development of rocket based combined cycle (RBCC) engines is highly dependent upon integrating several different modes of operation into a single system. One of the key components for developing acceptable performance levels through each mode of operation is the nozzle. It must be highly integrated to serve the expansion processes of both rocket and air-breathing modes without undue weight, drag, or complexity. The NASA GTX configuration requires a fixed geometry, altitude-compensating nozzle configuration. The initial configuration, used mainly to estimate weight and cooling requirements, was a 15° half-angle cone, which cuts a concave surface from a point within the flowpath to the vehicle trailing edge. Results of 3-D CFD calculations on this geometry are presented. To address the critical issues associated with integrated, fixed geometry, multimode nozzle development, the GTX team has initiated a series of tasks to evolve the nozzle design and validate performance levels. An overview of these tasks is given. The first element is a design activity to develop tools for integration of efficient expansion surfaces with the existing flowpath and vehicle aft-body, and to develop a second-generation nozzle design. A preliminary result using a "streamline-tracing" technique is presented. As the nozzle design evolves, a combination of 3-D CFD analysis and experimental evaluation will be used to validate the design procedure and determine the installed performance for propulsion cycle modeling. The initial experimental effort will consist of cold-flow experiments designed to validate the general trends of the streamline-tracing methodology and anchor the CFD analysis. Experiments will also be conducted to simulate nozzle performance during each mode of operation. As the design matures, hot-fire tests will be conducted to refine performance estimates and anchor more sophisticated reacting-flow analysis.
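The basic operation behind streamline tracing is integrating a point through a velocity field; the traced curves then define candidate expansion surfaces. A minimal midpoint-rule sketch, with an analytic rigid-rotation field standing in for a real CFD solution:

```python
import numpy as np

def trace_streamline(velocity, seed, step=0.05, n_steps=100):
    """Trace a streamline through a steady velocity field by explicit
    midpoint (RK2) integration. `velocity` maps a position to a velocity
    vector; in practice it would interpolate a CFD flow solution."""
    p = np.asarray(seed, dtype=float)
    path = [p.copy()]
    for _ in range(n_steps):
        v1 = velocity(p)
        v2 = velocity(p + 0.5 * step * v1)   # midpoint evaluation
        p = p + step * v2
        path.append(p.copy())
    return np.array(path)

# illustrative field: rigid-body rotation, whose streamlines are circles,
# so the traced radius should stay (nearly) constant
omega = 1.0
field = lambda p: np.array([-omega * p[1], omega * p[0]])
path = trace_streamline(field, seed=(1.0, 0.0), step=0.05, n_steps=126)
radii = np.linalg.norm(path, axis=1)
print(radii.min().round(4), radii.max().round(4))
```

The near-constant radius is a quick sanity check on the integrator; the same check against known analytic fields is the kind of trend validation the cold-flow experiments above are meant to provide for the full methodology.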

  1. Animating streamlines with repeated asymmetric patterns for steady flow visualization

    NASA Astrophysics Data System (ADS)

    Yeh, Chih-Kuo; Liu, Zhanping; Lee, Tong-Yee

    2012-01-01

    Animation provides intuitive cueing for revealing essential spatial-temporal features of data in scientific visualization. This paper explores the design of Repeated Asymmetric Patterns (RAPs) in animating evenly-spaced color-mapped streamlines for dense accurate visualization of complex steady flows. We present a smooth cyclic variable-speed RAP animation model that performs velocity (magnitude) integral luminance transition on streamlines. This model is extended with inter-streamline synchronization in luminance varying along the tangential direction to emulate orthogonal advancing waves from a geometry-based flow representation, and then with evenly-spaced hue differing in the orthogonal direction to construct tangential flow streaks. To weave these two mutually dual sets of patterns, we propose an energy-decreasing strategy that adopts an iterative yet efficient procedure for determining the luminance phase and hue of each streamline in HSL color space. We also employ adaptive luminance interleaving in the direction perpendicular to the flow to increase the contrast between streamlines.
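The cyclic, direction-revealing luminance variation along a streamline can be sketched as a periodic asymmetric profile of arc length and time. The saw-tooth-like cubic profile, wavelength, and speed below are illustrative assumptions; the paper's actual velocity-integral profile differs:

```python
def rap_luminance(s, t, wavelength=1.0, speed=0.5, phase=0.0):
    """Cyclic luminance along a streamline for a repeated asymmetric pattern.

    s: arc length along the streamline; t: animation time. An asymmetric
    profile (dim for most of the cycle, bright tail) cues the flow
    direction as the pattern advects downstream. The cubic profile is a
    sketch, not the paper's velocity-integral transition."""
    u = ((s - speed * t) / wavelength + phase) % 1.0  # position in the cycle
    return u ** 3   # asymmetric: slow rise, sharp drop at the cycle boundary

# luminance sampled along one wavelength at t = 0 rises monotonically,
# then resets: that asymmetry is what encodes direction
samples = [rap_luminance(s * 0.1, t=0.0) for s in range(10)]
print([round(v, 2) for v in samples])
```

Per-streamline `phase` values are what the energy-decreasing procedure above assigns, so that neighbouring streamlines interleave rather than pulse in unison.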

  2. Mapping population-based structural connectomes.

    PubMed

    Zhang, Zhengwu; Descoteaux, Maxime; Zhang, Jingwen; Girard, Gabriel; Chamberland, Maxime; Dunson, David; Srivastava, Anuj; Zhu, Hongtu

    2018-05-15

Advances in understanding the structural connectomes of the human brain require improved approaches for the construction, comparison and integration of high-dimensional whole-brain tractography data from a large number of individuals. This article develops a population-based structural connectome (PSC) mapping framework to address these challenges. PSC simultaneously characterizes a large number of white matter bundles within and across different subjects by registering different subjects' brains based on coarse cortical parcellations, compressing the bundles of each connection, and extracting novel connection weights. A robust tractography algorithm and streamline post-processing techniques, including dilation of gray matter regions, streamline cutting, and outlier streamline removal, are applied to improve the robustness of the extracted structural connectomes. The developed PSC framework can be used to reproducibly extract binary networks, weighted networks and streamline-based brain connectomes. We apply the PSC to Human Connectome Project data to illustrate its application in characterizing normal variations and heritability of structural connectomes in healthy subjects. Copyright © 2018 Elsevier Inc. All rights reserved.
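One of the post-processing steps named above, outlier streamline removal, can be sketched as dropping streamlines far from the bundle core. The distance measure here (mean pointwise gap to the bundle's mean streamline, with all lines resampled to equal length) and the z-score threshold are illustrative stand-ins for the PSC pipeline's actual criteria:

```python
import numpy as np

def remove_outlier_streamlines(streamlines, z_thresh=2.0):
    """Drop streamlines far from the bundle core.

    streamlines: list of (n_points, 3) arrays, all resampled to the same
    point count. Distance is the mean point-to-point gap to the bundle's
    mean streamline; lines beyond mean + z_thresh * std are removed."""
    stack = np.stack(streamlines)              # (n_lines, n_points, 3)
    mean_line = stack.mean(axis=0)
    d = np.linalg.norm(stack - mean_line, axis=2).mean(axis=1)
    keep = d <= d.mean() + z_thresh * d.std()
    return [s for s, k in zip(streamlines, keep) if k]

# a tight synthetic bundle plus one obvious outlier
rng = np.random.default_rng(1)
bundle = [np.linspace((0, 0, 0), (10, 0, 0), 20) + rng.normal(0, 0.1, (20, 3))
          for _ in range(30)]
bundle.append(np.linspace((0, 5, 0), (10, 5, 0), 20))   # stray streamline
cleaned = remove_outlier_streamlines(bundle)
print(len(bundle), len(cleaned))
```

Removing such strays before compressing each connection keeps spurious tracking errors from inflating the extracted connection weights.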

  3. The impact of a streamlined funding application process on application time: two cross-sectional surveys of Australian researchers

    PubMed Central

    Barnett, Adrian G; Graves, Nicholas; Clarke, Philip; Herbert, Danielle

    2015-01-01

    Objective To examine if streamlining a medical research funding application process saved time for applicants. Design Cross-sectional surveys before and after the streamlining. Setting The National Health and Medical Research Council (NHMRC) of Australia. Participants Researchers who submitted one or more NHMRC Project Grant applications in 2012 or 2014. Main outcome measures Average researcher time spent preparing an application and the total time for all applications in working days. Results The average time per application increased from 34 working days before streamlining (95% CI 33 to 35) to 38 working days after streamlining (95% CI 37 to 39; mean difference 4 days, bootstrap p value <0.001). The estimated total time spent by all researchers on applications after streamlining was 614 working years, a 67-year increase from before streamlining. Conclusions Streamlined applications were shorter but took longer to prepare on average. Researchers may be allocating a fixed amount of time to preparing funding applications based on their expected return, or may be increasing their time in response to increased competition. Many potentially productive years of researcher time are still being lost to preparing failed applications. PMID:25596201
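The before/after comparison of mean preparation time with a bootstrap uncertainty estimate can be sketched as below. The data are synthetic draws centred on the reported 34 and 38 working days; the survey's actual responses and bootstrap details are not reproduced:

```python
import numpy as np

def bootstrap_mean_diff(after, before, n_boot=5000, seed=0):
    """Point estimate and bootstrap 95% interval for the difference in
    mean preparation time (after - before), resampling each group with
    replacement."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(after, float), np.asarray(before, float)
    diffs = [rng.choice(a, a.size).mean() - rng.choice(b, b.size).mean()
             for _ in range(n_boot)]
    return a.mean() - b.mean(), np.percentile(diffs, [2.5, 97.5])

# synthetic working-day data centred on the reported means (34 vs 38 days)
rng = np.random.default_rng(1)
before = rng.normal(34, 8, 200)
after = rng.normal(38, 8, 200)
diff, (lo, hi) = bootstrap_mean_diff(after, before)
print(round(diff, 1), (round(lo, 1), round(hi, 1)))
```

Resampling whole groups respects the unpaired design (different applicants responded in each survey year), which is why a paired test would not be appropriate here.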

  4. Computer simulation of refining process of a high consistency disc refiner based on CFD

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Yang, Jianwei; Wang, Jiahui

    2017-08-01

In order to reduce refining energy consumption, ANSYS CFX was used to simulate the refining process of a high consistency disc refiner. First, the pulp in the disc refiner was assumed to be a uniform Newtonian fluid in a turbulent state, modelled with the k-ɛ flow model; the 3-D model of the disc refiner was then meshed and boundary conditions were set; the flow was then simulated and analyzed; finally, the viscosity of the pulp was measured. The results show that the CFD method can be used to analyze the pressure and torque on the disc plate, so as to calculate the refining power, and that streamlines and velocity vectors can also be observed. CFD simulation can optimize the parameters of the bar and groove, which is of great significance for reducing experimental cost and cycle time.
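The refining-power calculation from the disc torque reduces to P = Tω with ω = 2πn/60; a minimal sketch with illustrative torque and speed values, not results from the paper's simulation:

```python
import math

def refining_power(torque_nm, rpm):
    """Refining power from the disc torque reported by the CFD solution:
    P = T * omega, with omega = 2 * pi * n / 60 for a rotor speed n in rpm."""
    omega = 2.0 * math.pi * rpm / 60.0   # angular speed, rad/s
    return torque_nm * omega             # power, W

# illustrative values only
p = refining_power(torque_nm=1200.0, rpm=1500)
print(f"{p / 1000:.1f} kW")
```

Integrating pressure over the plate gives thrust, while integrating the tangential stress moments gives the torque fed into this formula, which is how the simulated flow field connects to energy consumption.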

  5. Mechanism analysis of Magnetohydrodynamic heat shield system and optimization of externally applied magnetic field

    NASA Astrophysics Data System (ADS)

    Li, Kai; Liu, Jun; Liu, Weiqiang

    2017-04-01

As a novel thermal protection technique for hypersonic vehicles, the Magnetohydrodynamic (MHD) heat shield system has been proved to be of great intrinsic value in the hypersonic field. In order to analyze the thermal protection mechanisms of such a system, a physical model is constructed for analyzing the effect of the Lorentz force components in the counter-flow and normal directions. With a series of numerical simulations, the dominating Lorentz force components are analyzed for MHD heat flux mitigation in different regions of a typical reentry vehicle. Then, a novel magnetic field with a variable included angle between the magnetic induction line and the streamline is designed, which significantly improves the performance of MHD thermal protection in the stagnation and shoulder areas. After that, the relationships between MHD shock control and MHD thermal protection are investigated, based on which the magnetic field is further optimized, achieving better performance in both shock control and thermal protection. Results show that MHD thermal protection is mainly determined by the Lorentz force's effect on the boundary layer. From the stagnation to the shoulder region, the flow deceleration effect of the counter-flow component is weakened while the flow deflection effect of the normal component is enhanced. Moreover, there is no obvious positive correlation between MHD shock control and thermal protection. But once a good Lorentz force effect on the boundary layer is guaranteed, the thermal protection performance can be further improved with an enlarged shock stand-off distance by strengthening the counter-flow Lorentz force right after the shock.
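The counter-flow/normal decomposition of the Lorentz force used in the analysis above can be sketched directly: project f = J × B onto the local flow direction and take the remainder. The current density, field, and velocity vectors below are illustrative placeholders, not values from the simulations:

```python
import numpy as np

def lorentz_components(J, B, v):
    """Split the Lorentz force f = J x B into the component along (or
    against) the local flow velocity v (the counter-flow part, which
    decelerates the flow) and the component normal to it (which deflects
    the streamline)."""
    f = np.cross(J, B)
    v_hat = v / np.linalg.norm(v)
    f_counter = np.dot(f, v_hat) * v_hat   # projection onto the flow direction
    f_normal = f - f_counter               # remainder, normal to the flow
    return f_counter, f_normal

J = np.array([0.0, 1.0, 0.0])   # illustrative current density direction
B = np.array([0.0, 0.0, 2.0])   # illustrative applied magnetic field
v = np.array([1.0, 1.0, 0.0])   # illustrative local flow direction
fc, fn = lorentz_components(J, B, v)
print(fc, fn)
```

Varying the included angle between the field line and the streamline, as the optimized magnet design does, shifts force magnitude between these two components region by region.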

  6. White Matter Tract Segmentation as Multiple Linear Assignment Problems

    PubMed Central

    Sharmin, Nusrat; Olivetti, Emanuele; Avesani, Paolo

    2018-01-01

Diffusion magnetic resonance imaging (dMRI) allows reconstruction of the main pathways of axons within the white matter of the brain as a set of polylines, called streamlines. The set of streamlines of the whole brain is called the tractogram. Organizing tractograms into anatomically meaningful structures, called tracts, is known as the tract segmentation problem, with important applications to neurosurgical planning and tractometry. Automatic tract segmentation techniques can be unsupervised or supervised. A common criticism of unsupervised methods, like clustering, is that there is no guarantee of obtaining anatomically meaningful tracts. In this work, we focus on supervised tract segmentation, which is driven by prior knowledge from anatomical atlases or from examples, i.e., segmented tracts from different subjects. We present a supervised tract segmentation method that segments a given tract of interest in the tractogram of a new subject using multiple examples as prior information. Our proposed tract segmentation method is based on the idea of streamline correspondence, i.e., on finding corresponding streamlines across different tractograms. In the literature, streamline correspondence has been addressed with the nearest neighbor (NN) strategy. Here, in contrast, we formulate the problem of streamline correspondence as a linear assignment problem (LAP), which is a cornerstone of combinatorial optimization. With respect to the NN, the LAP introduces a constraint of one-to-one correspondence between streamlines, which forces the correspondences to follow the local anatomical differences between the example and the target tract, neglected by the NN. In the proposed solution, we combine the Jonker-Volgenant algorithm (LAPJV) for solving the LAP with an efficient way of computing the nearest neighbors of a streamline, which massively reduces the total amount of computation needed to segment a tract. 
Moreover, we propose a ranking strategy to merge correspondences coming from different examples. We validate the proposed method on tractograms generated from the human connectome project (HCP) dataset and compare the segmentations with the NN method and the ROI-based method. The results show that LAP-based segmentation is vastly more accurate than ROI-based segmentation and substantially more accurate than the NN strategy. We provide a Free/OpenSource implementation of the proposed method. PMID:29467600

  7. White Matter Tract Segmentation as Multiple Linear Assignment Problems.

    PubMed

    Sharmin, Nusrat; Olivetti, Emanuele; Avesani, Paolo

    2017-01-01

    Diffusion magnetic resonance imaging (dMRI) allows reconstruction of the main pathways of axons within the white matter of the brain as a set of polylines, called streamlines. The set of streamlines of the whole brain is called the tractogram. Organizing tractograms into anatomically meaningful structures, called tracts, is known as the tract segmentation problem, with important applications to neurosurgical planning and tractometry. Automatic tract segmentation techniques can be unsupervised or supervised. A common criticism of unsupervised methods, like clustering, is that there is no guarantee of obtaining anatomically meaningful tracts. In this work, we focus on supervised tract segmentation, which is driven by prior knowledge from anatomical atlases or from examples, i.e., segmented tracts from different subjects. We present a supervised tract segmentation method that segments a given tract of interest in the tractogram of a new subject using multiple examples as prior information. Our proposed tract segmentation method is based on the idea of streamline correspondence, i.e., on finding corresponding streamlines across different tractograms. In the literature, streamline correspondence has been addressed with the nearest neighbor (NN) strategy. By contrast, here we formulate the problem of streamline correspondence as a linear assignment problem (LAP), a cornerstone of combinatorial optimization. With respect to the NN strategy, the LAP introduces a constraint of one-to-one correspondence between streamlines, which forces the correspondences to follow the local anatomical differences between the example and the target tract that the NN neglects. In the proposed solution, we combine the Jonker-Volgenant algorithm (LAPJV) for solving the LAP with an efficient way of computing the nearest neighbors of a streamline, which massively reduces the total amount of computation needed to segment a tract.
Moreover, we propose a ranking strategy to merge correspondences coming from different examples. We validate the proposed method on tractograms generated from the Human Connectome Project (HCP) dataset and compare the segmentations with the NN method and the ROI-based method. The results show that LAP-based segmentation is vastly more accurate than ROI-based segmentation and substantially more accurate than the NN strategy. We provide a Free/OpenSource implementation of the proposed method.
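
    The LAP formulation above can be sketched in a few lines: a streamline distance (here a symmetric mean closest-point distance, a common dMRI choice assumed for illustration) fills a cost matrix, and a Jonker-Volgenant-style solver, as provided by `scipy.optimize.linear_sum_assignment`, returns a one-to-one assignment. This is a minimal sketch of the idea, not the authors' implementation; the `streamline_distance` helper and the toy streamlines are hypothetical.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def streamline_distance(a, b):
    """Symmetric mean closest-point distance between two streamlines,
    each given as an (n, 3) array of points (illustrative metric)."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

def lap_correspondence(example, target):
    """One-to-one streamline correspondence via the linear assignment
    problem; unlike NN, no two example streamlines may share a target."""
    cost = np.array([[streamline_distance(s, t) for t in target]
                     for s in example])
    rows, cols = linear_sum_assignment(cost)
    return cols, cost[rows, cols].sum()

# Toy tractograms: straight "streamlines" offset along x. NN would map the
# first two example streamlines onto the same target; the LAP cannot.
line = lambda x: np.array([[x, 0.0, 0.0], [x, 1.0, 0.0], [x, 2.0, 0.0]])
example = [line(0.0), line(0.4), line(10.0)]
target = [line(0.0), line(1.0), line(10.2)]
cols, total = lap_correspondence(example, target)
```

    Here `cols` comes out as `[0, 1, 2]`: the second example streamline is pushed to the second target even though its nearest neighbor is the first one, which is exactly the one-to-one constraint the abstract describes.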

  8. Material Model Evaluation of a Composite Honeycomb Energy Absorber

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Annett, Martin S.; Fasanella, Edwin L.; Polanco, Michael A.

    2012-01-01

    A study was conducted to evaluate four different material models in predicting the dynamic crushing response of solid-element-based models of a composite honeycomb energy absorber, designated the Deployable Energy Absorber (DEA). Dynamic crush tests of three DEA components were simulated using the nonlinear, explicit transient dynamic code, LS-DYNA. In addition, a full-scale crash test of an MD-500 helicopter, retrofitted with DEA blocks, was simulated. The four material models used to represent the DEA included: *MAT_CRUSHABLE_FOAM (Mat 63), *MAT_HONEYCOMB (Mat 26), *MAT_SIMPLIFIED_RUBBER/FOAM (Mat 181), and *MAT_TRANSVERSELY_ANISOTROPIC_CRUSHABLE_FOAM (Mat 142). Test-analysis calibration metrics included simple percentage error comparisons of initial peak acceleration, sustained crush stress, and peak compaction acceleration of the DEA components. In addition, the Roadside Safety Verification and Validation Program (RSVVP) was used to assess similarities and differences between the experimental and analytical curves for the full-scale crash test.

  9. Ringing in an HNBody Upgrade

    NASA Astrophysics Data System (ADS)

    Rimlinger, Thomas; Hamilton, Douglas; Hahn, Joseph M.

    2017-06-01

    We are in the process of developing a useful extension to the N-body integrator HNBody (Rauch & Hamilton 2002), enabling it to simulate a viscous, self-gravitating ring orbiting an oblate body. Our algorithm follows that used in the symplectic integrator epi_int (Hahn & Spitale 2013), in which the ring is simulated as many (~100) interacting, elliptic, confocal streamlines. This idea was first introduced in an analytic context by Goldreich & Tremaine (1979) and enabled rapid progress in the theory of ring evolution; since then, such discretization has been standard in the literature. While we adopt epi_int’s streamline formalism, we nevertheless improve upon its design in several ways. Epi_int uses epicyclic elements in its drift step; approximating these elements introduces small, systematic errors that build up with time. We sidestep this problem by instead using the more traditional Keplerian osculating elements. In addition, epi_int uses several particles per wire to effectively calculate the inter-gravitational forces everywhere along each streamline. We replicate this ability but can often gain a speed boost by using a single tracer particle per streamline; while this restricts us to simulating rings dominated by the m = 1 mode, this is typical of most observed narrow eccentric ringlets. We have also extended epi_int’s two-dimensional algorithm into 3D. Finally, whereas epi_int is written in IDL, HNBody is written in C, which yields considerably faster integrations. Braga-Ribas et al. (2014) reported a set of narrow rings orbiting the Centaur Chariklo, but neither their investigation nor that of Pan & Wu (2016) yielded a satisfactory origin and evolution scenario. Eschewing the assumption that such rings must be short-lived, we instead argue (as in Rimlinger et al. 2016) that sufficiently eccentric rings can self-confine for hundreds of millions of years while circularizing. In this case, Chariklo may have formed rings as a KBO.
We are working towards demonstrating both the feasibility of this theory and the utility of the HNBody extension by using it to simulate such a ring around Chariklo.

  10. Theoretical Grounds for the Propagation of Uncertainties in Monte Carlo Particle Transport

    NASA Astrophysics Data System (ADS)

    Saracco, Paolo; Pia, Maria Grazia; Batic, Matej

    2014-04-01

    We introduce a theoretical framework for the calculation of uncertainties affecting observables produced by Monte Carlo particle transport, which derive from uncertainties in the physical parameters that are input to the simulation. The theoretical developments are complemented by a heuristic application, which illustrates the method of calculation in a streamlined simulation environment.
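
    As a toy illustration of the idea (not the authors' framework), one can propagate the uncertainty of a single input parameter through a highly simplified "transport" estimate by re-running the Monte Carlo estimator with the parameter sampled from its uncertainty distribution; the attenuation-coefficient model and all numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def transmission(mu, n):
    """Toy Monte Carlo observable: fraction of particles crossing a unit
    slab with attenuation coefficient mu (exact answer is exp(-mu))."""
    return (rng.exponential(1.0 / mu, size=n) > 1.0).mean()

mu0, rel_sigma = 2.0, 0.05          # nominal parameter and its relative uncertainty
runs = [transmission(mu0 * (1.0 + rng.normal(0.0, rel_sigma)), 20_000)
        for _ in range(200)]        # re-sample the uncertain input for each run
mean, spread = np.mean(runs), np.std(runs)
# 'spread' mixes statistical noise (~0.0024 per run here) with the parametric
# contribution (~|dQ/dmu| * mu0 * rel_sigma = exp(-2) * 0.1 ~ 0.0135)
```

    Separating the two contributions (e.g. by also recording the per-run statistical variance) is the kind of bookkeeping such a framework formalizes.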

  11. SUPG Finite Element Simulations of Compressible Flows for Aerothermodynamic Applications

    NASA Technical Reports Server (NTRS)

    Kirk, Benjamin S.

    2007-01-01

    This viewgraph presentation reviews the Streamline-Upwind Petrov-Galerkin (SUPG) Finite Element Simulation. It covers the background, governing equations, weak formulation, shock capturing, inviscid flux discretization, time discretization, linearization, and implicit solution strategies. It also reviews some applications such as Type IV Shock Interaction, Forward-Facing Cavity and AEDC Sharp Double Cone.

  12. Streamlining genomes: toward the generation of simplified and stabilized microbial systems.

    PubMed

    Leprince, Audrey; van Passel, Mark W J; dos Santos, Vitor A P Martins

    2012-10-01

    At the junction between systems and synthetic biology, genome streamlining provides a solid foundation both for increased understanding of cellular circuitry and for the tailoring of microbial chassis towards innovative biotechnological applications. Iterative genomic deletions (targeted and random) help to generate simplified, stabilized and predictable genomes, whereas multiplexed genome engineering reveals a broad functional genetic diversity. The decrease in oligo and gene synthesis costs promises effective combinatorial tools for the generation of chassis based on streamlined and tractable genomes. Here we review recent progress in streamlining genomes through recombineering techniques, aiming both at insights into cellular mechanisms and responses and at the design and assembly of streamlined genome chassis carrying new cellular modules for diverse biotechnological applications.

  13. The impact of a streamlined funding application process on application time: two cross-sectional surveys of Australian researchers.

    PubMed

    Barnett, Adrian G; Graves, Nicholas; Clarke, Philip; Herbert, Danielle

    2015-01-16

    To examine if streamlining a medical research funding application process saved time for applicants. Cross-sectional surveys before and after the streamlining. The National Health and Medical Research Council (NHMRC) of Australia. Researchers who submitted one or more NHMRC Project Grant applications in 2012 or 2014. Average researcher time spent preparing an application and the total time for all applications in working days. The average time per application increased from 34 working days before streamlining (95% CI 33 to 35) to 38 working days after streamlining (95% CI 37 to 39; mean difference 4 days, bootstrap p value <0.001). The estimated total time spent by all researchers on applications after streamlining was 614 working years, a 67-year increase from before streamlining. Streamlined applications were shorter but took longer to prepare on average. Researchers may be allocating a fixed amount of time to preparing funding applications based on their expected return, or may be increasing their time in response to increased competition. Many potentially productive years of researcher time are still being lost to preparing failed applications.

  14. Reduced Structural Connectivity in Frontostriatal White Matter Tracts in the Associative Loop in Schizophrenia.

    PubMed

    Levitt, James J; Nestor, Paul G; Levin, Laura; Pelavin, Paula; Lin, Pan; Kubicki, Marek; McCarley, Robert W; Shenton, Martha E; Rathi, Yogesh

    2017-11-01

    The striatum receives segregated and integrative white matter tracts from the cortex facilitating information processing in the cortico-basal ganglia network. The authors examined both types of input tracts in the striatal associative loop in chronic schizophrenia patients and healthy control subjects. Structural and diffusion MRI scans were acquired on a 3-T system from 26 chronic schizophrenia patients and 26 matched healthy control subjects. Using FreeSurfer, the associative cortex was parcellated into ventrolateral prefrontal cortex and dorsolateral prefrontal cortex subregions. The striatum was manually parcellated into its associative and sensorimotor functional subregions. Fractional anisotropy and normalized streamlines, an estimate of fiber counts, were assessed in four frontostriatal tracts (dorsolateral prefrontal cortex-associative striatum, dorsolateral prefrontal cortex-sensorimotor striatum, ventrolateral prefrontal cortex-associative striatum, and ventrolateral prefrontal cortex-sensorimotor striatum). Furthermore, these measures were correlated with a measure of cognitive control, the Trail-Making Test, Part B. Results showed reduced fractional anisotropy and fewer streamlines in chronic schizophrenia patients for all four tracts, both segregated and integrative. Post hoc t tests showed reduced fractional anisotropy in the left ventrolateral prefrontal cortex-associative striatum and left ventrolateral prefrontal cortex-sensorimotor striatum and fewer normalized streamlines in the right dorsolateral prefrontal cortex-sensorimotor striatum and in the left and right ventrolateral prefrontal cortex-sensorimotor striatum in chronic schizophrenia patients. Furthermore, normalized streamlines in the right dorsolateral prefrontal cortex-sensorimotor striatum negatively correlated with Trail-Making Test, Part B, time spent in healthy control subjects but not in chronic schizophrenia patients. 
These findings demonstrated that structural connectivity is reduced in both segregated and integrative tracts in the striatal associative loop in chronic schizophrenia and that reduced normalized streamlines in the right-hemisphere dorsolateral prefrontal cortex-sensorimotor striatum predicted worse cognitive control in healthy control subjects but not in chronic schizophrenia patients, suggesting a loss of a "normal" brain-behavior correlation in chronic schizophrenia.

  15. Optimization of Simplex Atomizer Inlet Port Configuration through Computational Fluid Dynamics and Experimental Study for Aero-Gas Turbine Applications

    NASA Astrophysics Data System (ADS)

    Marudhappan, Raja; Chandrasekhar, Udayagiri; Hemachandra Reddy, Koni

    2017-10-01

    The design of a plain-orifice simplex atomizer for use in the annular combustion system of an 1100 kW turboshaft engine is optimized. The discrete flow field of jet fuel inside the swirl chamber of the atomizer, and up to 1.0 mm downstream of the atomizer exit, is simulated using commercial Computational Fluid Dynamics (CFD) software. The Euler-Euler multiphase model is used to solve two sets of momentum equations for the liquid and gaseous phases, and the volume fraction of each phase is tracked throughout the computational domain. The atomizer design is optimized after performing several 2D axisymmetric analyses with swirl, and the optimized inlet port design parameters are used for 3D simulation. The Volume of Fluid (VOF) multiphase model is used in the 3D simulation. The orifice exit diameter is 0.6 mm. The atomizer is fabricated with the optimized geometric parameters and its performance is tested in the laboratory. The experimental observations are compared with the results obtained from the 2D and 3D CFD simulations. The simulated velocity components, pressure field, streamlines and air-core dynamics along the atomizer axis are compared with previous research and found satisfactory. The work has led to a novel approach to the design of pressure-swirl atomizers.

  16. A streamline curvature method for design of supercritical and subcritical airfoils

    NASA Technical Reports Server (NTRS)

    Barger, R. L.; Brooks, C. W., Jr.

    1974-01-01

    An airfoil design procedure, applicable to both subcritical and supercritical airfoils, is described. The method is based on the streamline curvature velocity equation. Several examples illustrating this method are presented and discussed.

  17. eSBMTools 1.0: enhanced native structure-based modeling tools.

    PubMed

    Lutz, Benjamin; Sinner, Claude; Heuermann, Geertje; Verma, Abhinav; Schug, Alexander

    2013-11-01

    Molecular dynamics simulations provide detailed insights into the structure and function of biomolecular systems, complementing experimental measurements by giving access to experimentally inaccessible regimes. Among the different molecular dynamics techniques, native structure-based models (SBMs) are based on energy landscape theory and the principle of minimal frustration. Typically used in protein and RNA folding simulations, they coarse-grain the biomolecular system and/or simplify the Hamiltonian, resulting in modest computational requirements while achieving high agreement with experimental data. eSBMTools streamlines running and evaluating SBMs in a comprehensive package and offers high flexibility in adding experimental or bioinformatics-derived restraints. We present a software package that allows setting up, modifying and evaluating SBMs for both RNA and proteins. The implemented workflows include predicting protein complexes based on bioinformatics-derived inter-protein contact information, a standardized setup of protein folding simulations based on the common PDB format, calculating reaction coordinates, and evaluating the simulation by free-energy calculations with the weighted histogram analysis method or by phi-values. The modules interface with the molecular dynamics simulation program GROMACS. The package is open source and written in architecture-independent Python 2. http://sourceforge.net/projects/esbmtools/. alexander.schug@kit.edu. Supplementary data are available at Bioinformatics online.

  18. Water resources planning based on complex system dynamics: A case study of Tianjin city

    NASA Astrophysics Data System (ADS)

    Zhang, X. H.; Zhang, H. W.; Chen, B.; Chen, G. Q.; Zhao, X. H.

    2008-12-01

    A complex system dynamics (SD) model focusing on water resources, termed TianjinSD, is developed for the integrated and scientific management of the water resources of Tianjin. The model contains the information feedback that governs interactions in the system and is capable of synthesizing component-level knowledge into system behavior simulation at an integrated level, thus producing reasonable predictive results for policy-making on water resources allocation and management. For Tianjin, interactions among 96 components over 12 years are explored and four planning alternatives are considered: one based on the conventional mode, which assumes that the existing pattern of human activities will prevail, and the others on alternative planning designs based on the interaction of local authorities and planning researchers. The optimal mode is then obtained by comparing the simulation results under different scenarios, supporting evaluation of different decisions and their dynamic consequences.
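
    The feedback structure such SD models rely on can be illustrated with a single balancing loop, sketched below with entirely hypothetical names and coefficients (a generic stock-flow toy, not the TianjinSD model): water demand grows, but the growth rate is damped as demand approaches the available supply.

```python
def water_demand_loop(years=12, supply=1.0e9, demand0=8.0e8, growth=0.04):
    """One balancing feedback loop: demand stress damps demand growth."""
    demand, trajectory = demand0, []
    for _ in range(years):
        stress = demand / supply                  # feedback signal in [0, 1)
        demand *= 1.0 + growth * (1.0 - stress)   # damped growth
        trajectory.append(demand)
    return trajectory
```

    Demand rises monotonically but flattens toward the supply ceiling instead of overshooting it, which is the qualitative behavior a balancing loop contributes to the larger model.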

  19. Vlasov Simulation of Electrostatic Solitary Structures in Multi-Component Plasmas

    NASA Technical Reports Server (NTRS)

    Umeda, Takayuki; Ashour-Abdalla, Maha; Pickett, Jolene S.; Goldstein, Melvyn L.

    2012-01-01

    Electrostatic solitary structures have been observed in the Earth's magnetosheath by the Cluster spacecraft. Recent theoretical work has suggested that these solitary structures are modeled by electron acoustic solitary waves existing in a four-component plasma system consisting of core electrons, two counter-streaming electron beams, and one species of background ions. In this paper, the excitation of electron acoustic waves and the formation of solitary structures are studied by means of a one-dimensional electrostatic Vlasov simulation. The results first show that either electron acoustic solitary waves with negative potential or electron phase-space holes with positive potential are excited in four-component plasma systems. However, these electrostatic solitary structures have longer durations and higher wave amplitudes than the solitary structures observed in the magnetosheath. This indicates that a fast, weak free-energy source may be needed as a fifth component. An additional simulation of a five-component plasma consisting of a stable four-component plasma and a weak electron beam shows the generation of small and fast electron phase-space holes by the bump-on-tail instability. The physical properties of these small and fast electron phase-space holes are very similar to those obtained by the previous theoretical analysis. The amplitude and duration of solitary structures in the simulation are also in agreement with the Cluster observations.
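
    A minimal 1D electrostatic Vlasov-Poisson solver of the kind used for such studies can be sketched with Cheng-Knorr operator splitting: semi-Lagrangian advection in x and v around a spectral Poisson solve. The sketch below reduces the abstract's four-component system to its unstable core, two counter-streaming electron beams over a fixed neutralizing ion background; grid sizes and beam parameters are illustrative, not those of the paper.

```python
import numpy as np

nx, nv, L, vmax = 32, 64, 10.0, 6.0
dx, dt = L / nx, 0.1
x = np.arange(nx) * dx
v = np.linspace(-vmax, vmax, nv)
dv = v[1] - v[0]

# two counter-streaming warm beams, weakly perturbed along x
f = (np.exp(-((v - 2.0) ** 2) / 0.5) + np.exp(-((v + 2.0) ** 2) / 0.5))[None, :] \
    * (1.0 + 0.05 * np.cos(2 * np.pi * x / L))[:, None]
f /= f.sum() * dx * dv / L            # normalize mean electron density to 1

kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)

def advect_x(f, dt):
    """Periodic semi-Lagrangian shift f(x,v) <- f(x - v*dt, v)."""
    out = np.empty_like(f)
    for j in range(nv):
        s = v[j] * dt / dx
        m = int(np.floor(s)); w = s - m
        out[:, j] = w * np.roll(f[:, j], m + 1) + (1 - w) * np.roll(f[:, j], m)
    return out

def efield(f):
    """Spectral Poisson solve: dE/dx = 1 - integral f dv (fixed ions)."""
    rho_hat = np.fft.fft(1.0 - f.sum(axis=1) * dv)
    e_hat = np.zeros_like(rho_hat)
    e_hat[1:] = rho_hat[1:] / (1j * kx[1:])
    return np.fft.ifft(e_hat).real

for _ in range(50):                   # split steps: x half, v full, x half
    f = advect_x(f, 0.5 * dt)
    E = efield(f)
    for i in range(nx):               # electrons: f(x,v) <- f(x, v + E*dt)
        f[i, :] = np.interp(v + E[i] * dt, v, f[i, :], left=0.0, right=0.0)
    f = advect_x(f, 0.5 * dt)
# total mass stays at L (= 10 here) up to tiny interpolation losses at the
# v-boundary, since the periodic x-shift conserves it exactly
```

    Diagnosing solitary structures then amounts to inspecting the electrostatic potential and the phase-space density f(x, v) for holes.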

  20. Dynamic Investigation of Release Characteristics of a Streamlined Internal Store from a Simulated Bomb Bay of the Republic F-105 Airplane at Mach Numbers of 0.8, 1.4, and 1.98, Coord. No. AF-222

    NASA Technical Reports Server (NTRS)

    Lee, John B.

    1956-01-01

    An investigation has been conducted in the 27- by 27-inch preflight jet of the Langley Pilotless Aircraft Research Station at Wallops Island, Va., of the release characteristics of a dynamically scaled, streamlined-type, internally carried store from a simulated bomb bay at Mach numbers M(sub o) of 0.8, 1.4, and 1.98. A 1/17-scale model of the Republic F-105 half-fuselage and bomb-bay configuration was used with a streamlined store shape with a fineness ratio of 6.00. Simulated altitudes were 3,400 feet at M(sub o) = 0.8; 3,400 and 29,000 feet at M(sub o) = 1.4; and 29,000 feet at M(sub o) = 1.98. At supersonic speeds, high pitching moments are induced on the store in the vicinity of the bomb bay at high dynamic pressures. Successful ejections could not be made with the original configuration at supersonic speeds at near sea-level conditions. The pitching moments caused by unsymmetrical pressures on the store in a disturbed flow field were overcome by replacing the high-aspect-ratio fin with a low-aspect-ratio fin that had 30 percent more area and was less subject to aeroelastic effects. Release characteristics of the store were improved by orienting the fins so that they were in a more uniform flow field at the point of store release. The store pitching moments were shown to be reduced by increasing the simulated altitude. Favorable ejections were made at subsonic speeds at near sea-level conditions.

  1. Modified current follower-based immittance function simulators

    NASA Astrophysics Data System (ADS)

    Alpaslan, Halil; Yuce, Erkan

    2017-12-01

    In this paper, four immittance function simulators, each consisting of a single modified current follower with a single Z- terminal and a minimum number of passive components, are proposed. The first proposed circuit can provide +L in parallel with +R and the second can realise -L in parallel with -R. The third proposed structure can provide +L in series with +R and the fourth can realise -L in series with -R. However, all the proposed immittance function simulators need a single resistive matching constraint. Parasitic impedance effects on all the proposed immittance function simulators are investigated. A second-order current-mode (CM) high-pass filter derived from the first proposed immittance function simulator is given as an application example. Also, a second-order CM low-pass filter derived from the third proposed immittance function simulator is given as an application example. A number of simulation results based on the SPICE programme and an experimental test result are given to verify the theory.

  2. Segmentation of discrete vector fields.

    PubMed

    Li, Hongyu; Chen, Wenbin; Shen, I-Fan

    2006-01-01

    In this paper, we propose an approach for 2D discrete vector field segmentation based on the Green function and the normalized cut. The method is inspired by the discrete Hodge decomposition, by which a discrete vector field can be broken down into three simpler components, namely curl-free, divergence-free, and harmonic components. We show that the Green Function Method (GFM) can be used to approximate the curl-free and divergence-free components to achieve our goal of vector field segmentation. The final segmentation curves, which represent the boundaries of the influence regions of singularities, are obtained from the optimal vector field segmentations. These curves are composed of piecewise smooth contours or streamlines. Our method is applicable to both linear and nonlinear discrete vector fields. Experiments show that the segmentations obtained using our approach essentially agree with human perceptual judgement.
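
    The decomposition underlying the method can be sketched with an FFT-based Helmholtz-Hodge projection on a periodic grid. This takes a different numerical route than the paper's Green-function approximation, but produces the same three components (on a torus the harmonic part is just the mean flow). A toy field assembled from a known gradient part and a known rotational part verifies the split.

```python
import numpy as np

def hodge_decompose(u, v):
    """Split a periodic 2D field (u, v) into curl-free, divergence-free
    and harmonic (mean-flow) parts by Fourier projection k k^T / |k|^2."""
    ny, nx = u.shape
    kx = np.fft.fftfreq(nx)[None, :]
    ky = np.fft.fftfreq(ny)[:, None]
    k2 = kx ** 2 + ky ** 2
    k2[0, 0] = 1.0                          # k = 0 handled as the harmonic part
    uh, vh = np.fft.fft2(u), np.fft.fft2(v)
    uc = np.fft.ifft2((kx * kx * uh + kx * ky * vh) / k2).real
    vc = np.fft.ifft2((ky * kx * uh + ky * ky * vh) / k2).real
    hu, hv = u.mean(), v.mean()             # constant mean flow
    return (uc, vc), (u - uc - hu, v - vc - hv), (hu, hv)

# toy field: gradient of sin(x), plus a rotational part, plus a constant drift
n = 32
X, Y = np.meshgrid(np.arange(n) * 2 * np.pi / n,
                   np.arange(n) * 2 * np.pi / n)
u = np.cos(X) + 1.0                         # cos(X) = d/dx sin(X); 1.0 is drift
v = np.cos(X)                               # (0, cos X) is divergence-free
(uc, vc), (ud, vd), (hu, hv) = hodge_decompose(u, v)
```

    Segmentation then proceeds on the separated components, e.g. locating sources and sinks in the curl-free part and vortices in the divergence-free part.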

  3. Comparative Analysis of Wolbachia Genomes Reveals Streamlining and Divergence of Minimalist Two-Component Systems

    PubMed Central

    Christensen, Steen; Serbus, Laura Renee

    2015-01-01

    Two-component regulatory systems are commonly used by bacteria to coordinate intracellular responses with environmental cues. These systems are composed of functional protein pairs consisting of a sensor histidine kinase and cognate response regulator. In contrast to the well-studied Caulobacter crescentus system, which carries dozens of these pairs, the streamlined bacterial endosymbiont Wolbachia pipientis encodes only two pairs: CckA/CtrA and PleC/PleD. Here, we used bioinformatic tools to compare characterized two-component system relays from C. crescentus, the related Anaplasmataceae species Anaplasma phagocytophilum and Ehrlichia chaffeensis, and 12 sequenced Wolbachia strains. We found the core protein pairs and a subset of interacting partners to be highly conserved within Wolbachia and these other Anaplasmataceae. Genes involved in two-component signaling were positioned differently within the various Wolbachia genomes, whereas the local context of each gene was conserved. Unlike Anaplasma and Ehrlichia, Wolbachia two-component genes were more consistently found clustered with metabolic genes. The domain architecture and key functional residues standard for two-component system proteins were well-conserved in Wolbachia, although residues that specify cognate pairing diverged substantially from other Anaplasmataceae. These findings indicate that Wolbachia two-component signaling pairs share considerable functional overlap with other α-proteobacterial systems, whereas their divergence suggests the potential for regulatory differences and cross-talk. PMID:25809075

  4. Streamlining Field Data Collection With Mobile Apps

    NASA Astrophysics Data System (ADS)

    Camp, Reid J.; Wheaton, Joseph M.

    2014-12-01

    Fieldwork is a major component of nearly every geoscience discipline. Over the past 3 decades, scientists have amassed an array of specialized instrumentation and equipment to help them measure and monitor a staggering assortment of geophysical phenomena.

  5. Using McStas for modelling complex optics, using simple building bricks

    NASA Astrophysics Data System (ADS)

    Willendrup, Peter K.; Udby, Linda; Knudsen, Erik; Farhi, Emmanuel; Lefmann, Kim

    2011-04-01

    The McStas neutron ray-tracing simulation package is a versatile tool for producing accurate neutron simulations, extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. In McStas, component organization and simulation flow are intrinsically linear: the neutron interacts with the beamline components in sequential order, one by one. Historically, a beamline component with several parts had to be implemented with a complete, internal description of all these parts, e.g. a guide component including all four mirror plates and the logic required to allow scattering between the mirrors. For quite a while, users have requested the ability to place "components inside components", or meta-components, combining the functionality of several simple components to achieve more complex behaviour, e.g. four single mirror plates together defining a guide. We show here that it is now possible to define meta-components in McStas, and present a set of detailed, validated examples, including a guide with an embedded, wedged, polarizing mirror system of the Helmholtz-Zentrum Berlin type.

  6. Study on optimization of multiionization-chamber system for BNCT.

    PubMed

    Fujii, T; Tanaka, H; Maruhashi, A; Ono, K; Sakurai, Y

    2011-12-01

    In order to monitor the stability of doses from the four components (thermal neutrons, epi-thermal neutrons, fast neutrons and gamma rays) during BNCT irradiation, we are developing a multi-ionization-chamber system. This system consists of four kinds of ionization chamber, each with specific sensitivity to one component. Since a suitable structure for each chamber depends on the energy spectrum of the irradiation field, an optimization study of the chamber structures for the epi-thermal neutron beam of the cyclotron-based epi-thermal neutron source (C-BENS) was performed using the Monte Carlo simulation code PHITS, and suitable chamber structures were determined.

  7. Finite element procedures for time-dependent convection-diffusion-reaction systems

    NASA Technical Reports Server (NTRS)

    Tezduyar, T. E.; Park, Y. J.; Deans, H. A.

    1988-01-01

    New finite element procedures based on the streamline-upwind/Petrov-Galerkin formulations are developed for time-dependent convection-diffusion-reaction equations. These procedures minimize spurious oscillations for convection-dominated and reaction-dominated problems. The results obtained for representative numerical examples are accurate with minimal oscillations. As a special application problem, the single-well chemical tracer test (a procedure for measuring oil remaining in a depleted field) is simulated numerically. The results show the importance of temperature effects on the interpreted value of residual oil saturation from such tests.

  8. Simulation of an enhanced TCAS 2 system in operation

    NASA Technical Reports Server (NTRS)

    Rojas, R. G.; Law, P.; Burnside, W. D.

    1987-01-01

    Described is a computer simulation of a Boeing 737 aircraft equipped with an enhanced Traffic and Collision Avoidance System (TCAS II). In particular, an algorithm is developed that permits computer simulation of the tracking of a target airplane by a Boeing 737 with a TCAS II array mounted on top of its fuselage. This algorithm has four main components: the target path, the noise source, the alpha-beta filter, and threat detection. The implementation of each of these four components is described. Furthermore, the areas where the present algorithm needs to be improved are also discussed.
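
    Of the four components, the alpha-beta filter is the easiest to sketch: a fixed-gain tracker that predicts position from the current velocity estimate and corrects both states with the measurement residual. The gains and units below are illustrative, not those of the TCAS II simulation.

```python
def alpha_beta_track(measurements, dt=1.0, alpha=0.5, beta=0.1):
    """Fixed-gain alpha-beta tracker: predict, then correct position and
    velocity estimates with the measurement residual."""
    x, v = measurements[0], 0.0
    estimates = []
    for z in measurements[1:]:
        x_pred = x + v * dt        # predict along current velocity estimate
        r = z - x_pred             # innovation (residual)
        x = x_pred + alpha * r     # position correction
        v = v + (beta / dt) * r    # velocity correction
        estimates.append(x)
    return estimates
```

    For a constant-velocity target the residual decays geometrically and the track locks on; noisy measurements are smoothed at the cost of a small lag, which is the trade-off the gains control.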

  9. Structures and mechanisms - Streamlining for fuel economy

    NASA Technical Reports Server (NTRS)

    Card, M. F.

    1983-01-01

    The design of prospective NASA space station components which inherently possess the means for structural growth without compromising initial system characteristics is considered. In structural design terms, space station growth can be achieved by increasing design safety factors, introducing dynamic isolators to prevent loads from reaching the initial components, or preplanning the refurbishment of the original structure with stronger elements. Design tradeoffs will be based on the definition of on-orbit loads, including docking and maneuvering, whose derived load spectra will allow the estimation of fatigue life. Improvements must be made in structural materials selection in order to reduce contamination, slow degradation, and extend the life of coatings. To minimize on-orbit maintenance, long service life lubrication systems with advanced sealing devices must be developed.

  10. Sensitivity studies of the new Coastal Surge and Inundation Prediction System

    NASA Astrophysics Data System (ADS)

    Condon, A. J.; Veeramony, J.

    2012-12-01

    This paper details the sensitivity studies involved in the validation of a coastal storm surge and inundation prediction system for operational use by the United States Navy. The system consists of the Delft3D-FLOW model coupled with the Delft3D-WAVE model. This dynamically coupled system will replace the current operational system, PC-Tides, which includes neither waves nor global ocean circulation. The Delft3D modeling system uses multiple nests to capture large, basin-scale circulation as well as coastal circulation, and tightly couples waves and circulation at all scales. An additional benefit of the presented system is that the Delft Dashboard, a graphical user interface product, can be used to simplify the set-up of Delft3D features such as the grid, elevation data, boundary forcing, and nesting. In this way fewer man-hours and less training will be needed to perform inundation forecasts. The new coupled system is used to model storm surge and inundation produced by Hurricane Ike (2008) along the Gulf of Mexico coast. Due to the time constraints of an operational forecasting environment, storm simulations must be as streamlined as possible. Many factors, such as model resolution, elevation data sets, parametrization of bottom friction, frequency of coupling between hydrodynamic and wave components, and atmospheric forcing, among others, can influence the run times and results of the simulations. To assess the sensitivity of the modeling system to these various components, a "best" simulation was first developed. The best simulation consists of reanalysis atmospheric forcing in the form of Oceanweather wind and pressure fields. Further, the wind field is modified by applying a directional land-masking to account for changes in land roughness in the coastal zone. A number of air-sea drag coefficient formulations were tested to find the best match with observed results.
An analysis of sea-level trends for the region reveals a seasonal trend of elevated sea level in the region which is applied throughout the Gulf of Mexico. The hydrodynamic model is run in 2D depth averaged mode with a spatially varying Manning's N coefficient based on land cover data. Multiple nests are used with resolutions varying between 0.1° and 0.004°. A blended bathymetry and topography dataset from multiple sources is used. Tidal constituents are obtained from the Oregon State University global model of ocean tides based on TOPEX7.2 satellite altimeter data. Simulated water level is compared to data from NOAA National Ocean Service observing stations throughout the region. Simulated inundation is compared to observations by means of Federal Emergency Management Agency High Water Mark (HWM) data. Results from the "best" simulation show very favorable comparison to observations. Simulated peak water levels are generally within 0.25 m and HWMs are well correlated with observations. Once the "best" simulation was established, sensitivity of the system to the wind model, drag coefficient, elevation dataset, initial water level, wave coupling, bottom roughness, and domain resolution was investigated. Each component has an influence on the simulation results, some much more than others. As expected the atmospheric forcing is the key component, however all other factors must be carefully chosen to obtain the best results.
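Not part of the record: as a rough illustration of the bottom-friction parametrization the abstract mentions, the standard depth-averaged Manning formulation can be sketched as follows (the function name and all values are illustrative assumptions, not taken from the Delft3D configuration):

```python
import numpy as np

RHO = 1025.0  # seawater density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def manning_bottom_stress(u, v, h, n):
    """Quadratic bottom stress from a Manning coefficient n (depth-averaged flow).

    tau = rho * g * n^2 * |U| * U / h^(1/3) -- the standard 2D formulation,
    with u, v in m/s, depth h in m, and n in s/m^(1/3).
    """
    speed = np.hypot(u, v)
    factor = RHO * G * n**2 * speed / np.cbrt(h)
    return factor * u, factor * v  # (tau_x, tau_y) in N/m^2

# Example: a 1 m/s current in 2 m of water over rough terrain (n ~ 0.05)
tau_x, tau_y = manning_bottom_stress(1.0, 0.0, 2.0, 0.05)
```

Spatially varying `n` (as in the land-cover-based field described above) simply makes `n` an array of the same shape as the velocity fields.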

  11. A novel drug management system in the Febuxostat versus Allopurinol Streamlined Trial: A description of a pharmacy system designed to supply medications directly to patients within a prospective multicenter randomised clinical trial.

    PubMed

    Rogers, Amy; Flynn, Robert Wv; McDonnell, Patrick; Mackenzie, Isla S; MacDonald, Thomas M

    2016-12-01

    Trials of investigational medicinal products are required to adhere to strict guidelines with regard to the handling and supply of medication. Information technology offers opportunities to approach clinical trial methodology in new ways. This report summarises a novel pharmacy system designed to supply trial medications directly to patients by post in the Febuxostat versus Allopurinol Streamlined Trial. A bespoke web-based software package was designed to facilitate the direct supply of trial medications to Febuxostat versus Allopurinol Streamlined Trial participants from a pharmacy based in the Medicines Monitoring Unit, University of Dundee. To date, 65,467 packs of medication have been dispensed using the system to 3978 patients. Up to 238 packs per day have been dispensed. The Medicines Monitoring Unit Febuxostat versus Allopurinol Streamlined Trial drug management system is an effective method of administering the complex drug supply requirements of a large-scale clinical trial with advantages over existing arrangements. A low rate of loss to follow-up in the Febuxostat versus Allopurinol Streamlined Trial may be attributable to the drug management system. © The Author(s) 2016.

  12. Lightweighting Impacts on Fuel Economy, Cost, and Component Losses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooker, A. D.; Ward, J.; Wang, L.

    2013-01-01

The Future Automotive Systems Technology Simulator (FASTSim) is the U.S. Department of Energy's high-level vehicle powertrain model developed at the National Renewable Energy Laboratory. It uses a time versus speed drive cycle to estimate the powertrain forces required to meet the cycle. It simulates the major vehicle powertrain components and their losses. It includes a cost model based on component sizing and fuel prices. FASTSim simulated different levels of lightweighting for four different powertrains: a conventional gasoline engine vehicle, a hybrid electric vehicle (HEV), a plug-in hybrid electric vehicle (PHEV), and a battery electric vehicle (EV). Weight reductions impacted the conventional vehicle's efficiency more than that of the HEV, PHEV, and EV. Although lightweighting impacted the advanced vehicles' efficiency less, it reduced component cost and overall costs more. The PHEV and EV are less cost-effective than the conventional vehicle and HEV using current battery costs. Assuming the DOE's battery cost target of $100/kWh, however, the PHEV attained similar cost and lightweighting benefits. Generally, lightweighting was cost-effective when it cost less than $6/kg of mass eliminated.
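As a hedged illustration of the kind of road-load calculation a drive-cycle simulator performs at each cycle point (the actual FASTSim model is more detailed; the function and coefficient values below are illustrative assumptions):

```python
def road_load_power(v, a, mass, cd_a=0.6, crr=0.009, rho_air=1.2, g=9.81):
    """Tractive power (W) needed to follow one drive-cycle point.

    v: speed (m/s), a: acceleration (m/s^2), mass: vehicle mass (kg).
    Force = inertia + aerodynamic drag + rolling resistance;
    cd_a is the drag-area product Cd*A (m^2), crr the rolling coefficient.
    """
    force = mass * a + 0.5 * rho_air * cd_a * v**2 + crr * mass * g
    return force * v

# Lightweighting example: the same cycle point with a 10% lighter vehicle
p_base = road_load_power(v=20.0, a=0.5, mass=1500.0)
p_light = road_load_power(v=20.0, a=0.5, mass=1350.0)
```

Both the inertial and rolling-resistance terms scale with mass, which is why mass reduction shows up directly in the cycle energy demand.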

  13. Controlling Factors of the Fate of Ionospheric Outflow at Earth and Mars

    NASA Astrophysics Data System (ADS)

    Liemohn, M. W.; Welling, D. T.; Ilie, R.; Ganushkina, N. Y.; Johnson, B. C.; Xu, S.; Dong, C.

    2015-12-01

    Both Earth and Mars experience ionospheric outflow, but the radically different magnetic field configurations at the two planets yield significantly different patterns of outflow and processes governing outflow. This study examines a set of numerical simulations for Earth and Mars to explore the factors controlling ionospheric outflow and the fate of the escaping ions (immediate precipitation, magnetospheric recirculation, or loss to deep space). Specifically, simulation results from the Space Weather Modeling Framework (SWMF), which is capable of handling both planetary space environments, are analyzed to assess the physical processes governing the fate of ionospheric ions. Velocity streamlines from the SWMF results are traced from the high-latitude inner boundary of the BATS-R-US MHD simulation domain and followed through geospace. Some of these streamlines return to the inner boundary of the simulation domain, others extend to the outer boundary of the domain, while most others eventually cross (or at least approach) the magnetospheric equatorial plane. At Earth, this plane is well defined, while at Mars there are multiple mini-magnetospheres in which ionospheric ions can become trapped. These streamlines are categorized according to their eventual destination. Multi-fluid MHD simulations are examined in this study, assessing the influence of species mass on trajectories through near-planet space. Steady-state numerical experiments with different levels of solar driving are examined to quantify the influence of each driver on outflow characteristics and the fate of outflowing ions. Real event intervals are considered to assess flows in a time-varying magnetospheric system. For Earth, as solar wind dynamic pressure increases, the dominant outflow region moves to lower latitudes and significantly more of the outflowing ions escape to deep space. 
As the interplanetary magnetic field increases in southward magnitude, the region of dominant outflow shifts to lower latitudes and more is injected into the inner magnetosphere. The ionospheric regions dominantly contributing to mass within the magnetosphere are assessed and compared for the different driving conditions. At Mars, the situation is much more complicated.
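A minimal sketch of the streamline-tracing-and-categorization idea described above, assuming a generic steady velocity field rather than actual SWMF/BATS-R-US output (the function, boundary radii, and step size are illustrative):

```python
import numpy as np

def trace_streamline(velocity, x0, r_inner=1.0, r_outer=10.0,
                     step=0.01, max_steps=10000):
    """Trace a streamline through a steady velocity field and report its fate.

    velocity: callable mapping a position 3-vector to a velocity 3-vector.
    Returns ('returned' | 'escaped' | 'circulating', final position).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_steps):
        v = np.asarray(velocity(x))
        speed = np.linalg.norm(v)
        if speed == 0.0:
            break
        # midpoint (RK2) step along the unit tangent of the flow
        xm = x + 0.5 * step * v / speed
        vm = np.asarray(velocity(xm))
        x = x + step * vm / np.linalg.norm(vm)
        r = np.linalg.norm(x)
        if r <= r_inner:
            return "returned", x    # falls back to the inner boundary
        if r >= r_outer:
            return "escaped", x     # lost through the outer boundary
    return "circulating", x         # still inside the domain

# Purely radial outflow: every seed should escape
fate, _ = trace_streamline(lambda r: r, [0.0, 1.5, 0.0])
```

Binning many seeds by their returned category gives the fate statistics (precipitation, recirculation, escape) discussed in the abstract.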

  14. Calculation of laminar heating rates on three-dimensional configurations using the axisymmetric analogue

    NASA Technical Reports Server (NTRS)

    Hamilton, H. H., II

    1980-01-01

A theoretical method was developed for computing approximate laminar heating rates on three-dimensional configurations at angle of attack. The method is based on the axisymmetric analogue, which reduces the three-dimensional boundary-layer equations along surface streamlines to an equivalent axisymmetric form by using the metric coefficient that describes streamline divergence (or convergence). The method was coupled with a three-dimensional inviscid flow field program for computing surface streamline paths, metric coefficients, and boundary-layer edge conditions.

  15. Climate change and northern prairie wetlands: Simulations of long-term dynamics

    USGS Publications Warehouse

    Poiani, Karen A.; Johnson, W. Carter; Swanson, George A.; Winter, Thomas C.

    1996-01-01

A mathematical model (WETSIM 2.0) was used to simulate wetland hydrology and vegetation dynamics over a 32-yr period (1961–1992) in a North Dakota prairie wetland. A hydrology component of the model calculated changes in water storage based on precipitation, evapotranspiration, snowpack, surface runoff, and subsurface inflow. A spatially explicit vegetation component in the model calculated changes in the distribution of vegetative cover and open water, depending on water depth, seasonality, and existing type of vegetation. The model reproduced four known dry periods and one extremely wet period during the three decades. One simulated dry period in the early 1980s did not actually occur. Simulated water levels compared favorably with continuous observed water levels outside the calibration period (1990–1992). Changes in vegetative cover were realistic except for years when simulated water levels were significantly different from actual levels. These generally positive results support the use of the model for exploring the effects of possible climate changes on wetland resources.
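A minimal sketch of the kind of water-balance bookkeeping such a hydrology component performs (WETSIM's actual formulation is not reproduced; the function name, units, and basin capacity are illustrative assumptions):

```python
def step_water_storage(storage, precip, pet, runoff_in, seep_in,
                       max_storage=2.0):
    """One time-step update of wetland water storage (all terms in metres).

    Storage rises with precipitation, surface runoff and subsurface inflow,
    falls with evapotranspiration, and is clipped to basin capacity.
    """
    storage += precip + runoff_in + seep_in - pet
    return min(max(storage, 0.0), max_storage)

# A wet step followed by a dry one
s = step_water_storage(0.5, precip=0.12, pet=0.06, runoff_in=0.03, seep_in=0.01)
s = step_water_storage(s, precip=0.01, pet=0.10, runoff_in=0.0, seep_in=0.01)
```

A vegetation component would then map the resulting water depth, by zone, to cover classes.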

  16. Real-time simulation of a Doubly-Fed Induction Generator based wind power system on the eMEGASim™ Real-Time Digital Simulator

    NASA Astrophysics Data System (ADS)

    Boakye-Boateng, Nasir Abdulai

The growing demand for wind power integration into the generation mix prompts the need to subject these systems to stringent performance requirements. This study sought to identify the tools and procedures needed to perform real-time simulation studies of Doubly-Fed Induction Generator (DFIG) based wind generation systems as a basis for more practical tests of reliability and performance for both grid-connected and islanded wind generation systems. The author focused on developing a platform for wind generation studies and, in addition, tested the performance of two DFIG models on the platform's real-time simulation model: an average SimPowerSystems™ DFIG wind turbine, and a detailed DFIG-based wind turbine using ARTEMiS™ components. The platform model implemented here consists of a high-voltage transmission system with four integrated wind farm models comprising in total 65 DFIG-based wind turbines; it was developed and tested on OPAL-RT's eMEGASim™ Real-Time Digital Simulator.

  17. Consolidation of cloud computing in ATLAS

    NASA Astrophysics Data System (ADS)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  18. Numerical Simulation of Airflow Fields in Two Typical Nasal Structures of Empty Nose Syndrome: A Computational Fluid Dynamics Study

    PubMed Central

    Di, Meng-Yang; Jiang, Zhe; Gao, Zhi-Qiang; Li, Zhi; An, Yi-Ran; Lv, Wei

    2013-01-01

Background The pathogenesis of empty nose syndrome (ENS) has not been elucidated so far. Though postulated, there remains a lack of experimental evidence about the role of nasal aerodynamics in the development of ENS. Objective To investigate the nasal aerodynamic features of ENS and to explore the role of aerodynamic changes in the pathogenesis of ENS. Methods Seven sinonasal models were numerically constructed, based on the high-resolution computed tomography images of seven healthy male adults. Bilateral radical inferior/middle turbinectomies were numerically performed to mimic the typical nasal structures of ENS-inferior turbinate (ENS-IT) and ENS-middle turbinate (ENS-MT). A steady laminar model was applied in calculation. Velocity, pressure, streamlines, air flux and wall shear stress were numerically investigated. Each parameter of the normal structures was compared with those of the corresponding pathological models of ENS-IT and ENS-MT, respectively. Results ENS-MT: Streamlines, air flux distribution, and wall shear stress distribution were generally similar to those of the normal structures; nasal resistances decreased. Velocities decreased locally, while increasing around the sphenopalatine ganglion by 0.20±0.17 m/s and 0.22±0.10 m/s during inspiration and expiration, respectively. ENS-IT: Streamlines were less organized, with new vortexes appearing near the bottom wall. The airflow rates passing through the nasal olfactory area decreased by 26.27%±8.68% and 13.18%±7.59% during inspiration and expiration, respectively. Wall shear stresses, nasal resistances and local velocities all decreased. Conclusion Our CFD simulation study suggests that the changes in nasal aerodynamics may play an essential role in the pathogenesis of ENS. An increased velocity around the sphenopalatine ganglion in the ENS-MT models could be responsible for headache in patients with ENS-MT.
However, these results need to be validated in further studies with a larger sample size and more sophisticated computational models. PMID:24367645
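As a side note on the nasal resistances reported above, a common working definition is the transnasal pressure drop divided by the volumetric flow rate; a trivial sketch with illustrative (not paper-derived) numbers:

```python
def nasal_resistance(delta_p, flow_rate):
    """Nasal resistance in Pa·s/mL: transnasal pressure drop (Pa)
    divided by volumetric flow rate (mL/s)."""
    return delta_p / flow_rate

# Illustrative adult-scale values: ~15 Pa drop at 250 mL/s inspiration
r = nasal_resistance(15.0, 250.0)
```

In a CFD model, `delta_p` comes from the computed pressure field between nostril and choana, and `flow_rate` from integrating velocity over an airway cross-section.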

  19. Deep Part Load Flow Analysis in a Francis Model turbine by means of two-phase unsteady flow simulations

    NASA Astrophysics Data System (ADS)

    Conrad, Philipp; Weber, Wilhelm; Jung, Alexander

    2017-04-01

Hydropower plants are indispensable for stabilizing the grid by reacting quickly to changes in energy demand. However, an extension of the operating range towards high and deep part load conditions without fatigue of the hydraulic components is desirable to increase their flexibility. In this paper a model-sized Francis turbine at low discharge operating conditions (Q/QBEP = 0.27) is analyzed by means of computational fluid dynamics (CFD). Unsteady two-phase simulations for two Thoma-number conditions are conducted. Stochastic pressure oscillations, observed on the test rig at low discharge, require sophisticated numerical models together with small time steps, large grid sizes, and long simulation times to cope with these fluctuations. In this paper the BSL-EARSM model (Explicit Algebraic Reynolds Stress) was applied as a compromise between scale-resolving and two-equation turbulence models with respect to computational effort and accuracy. Simulation results are compared to pressure measurements, showing reasonable agreement in resolving the frequency spectra and amplitudes. Inner blade vortices were predicted successfully in shape and size. Surface streamlines in blade-to-blade view are presented, giving insight into the formation of the inner blade vortices. The acquired time-dependent pressure fields can be used for quasi-static structural analysis (FEA) for fatigue calculations in the future.

  20. A microfluidic device for continuous manipulation of biological cells using dielectrophoresis.

    PubMed

    Das, Debanjan; Biswas, Karabi; Das, Soumen

    2014-06-01

The present study demonstrates the design, simulation, fabrication, and testing of a label-free microdevice for the continuous manipulation and separation of particles/biological cells suspended in medium, based on conventional dielectrophoresis. The current dielectrophoretic device uses three planar electrodes to generate a non-uniform electric field and induces both p-DEP and n-DEP forces simultaneously, depending on the dielectric properties of the particles, thus influencing at least two types of particles at a time. Numerical simulations were performed to predict the distribution of the non-uniform electric field, the DEP force, and particle trajectories. The device is fabricated utilizing the advantage of bonding between PDMS and SU8 polymer. The p-DEP particles move away from the center of the streamline, while the n-DEP particles follow the central streamline along the channel length. Dielectrophoretic effects were initially tested using polystyrene beads, followed by manipulation of HeLa cells. In the experiment, it was observed that polystyrene beads in DI water always respond with n-DEP up to 1 MHz, whereas HeLa cells in PBS medium respond with n-DEP up to 400 kHz and then experience p-DEP up to 1 MHz. Further, the microscopic observations of DEP responses of HeLa cells were verified by performing a trapping experiment under static conditions. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
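The sign of the dielectrophoretic response described above is conventionally set by the real part of the Clausius-Mossotti factor. A hedged sketch using the homogeneous-sphere model (real cells need shell models; the material parameters below are textbook-style assumptions, not values from the paper):

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def cm_factor(eps_p, sig_p, eps_m, sig_m, freq):
    """Real part of the Clausius-Mossotti factor for a homogeneous sphere.

    eps_*: relative permittivity, sig_*: conductivity (S/m), freq: field
    frequency (Hz). Re[K] > 0 -> p-DEP (toward high field); < 0 -> n-DEP.
    Uses complex permittivities eps* = eps - i*sigma/omega.
    """
    w = 2 * math.pi * freq
    ep = complex(eps_p * EPS0, -sig_p / w)
    em = complex(eps_m * EPS0, -sig_m / w)
    return ((ep - em) / (ep + 2 * em)).real

# Polystyrene-like bead (eps ~ 2.5, sigma ~ 1e-4 S/m) in DI-water-like
# medium (eps ~ 78, sigma ~ 2e-4 S/m) at 1 MHz -> expect n-DEP
k = cm_factor(2.5, 1e-4, 78.0, 2e-4, 1e6)
```

The time-averaged DEP force then scales as 2*pi*eps_m*r^3*Re[K]*grad(|E|^2) for a sphere of radius r.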

  1. Recent advances in heterocycle generation using the efficient Ugi multiple-component condensation reaction.

    PubMed

    Tempest, Paul A

    2005-11-01

The current trend of rising research spending and falling numbers of novel chemical entities continues to drive efforts aimed at increasing efficiency in the drug discovery process. Strategic issues, such as assigning resources to poorly validated targets, have been implicated in the declining productivity of recent years. Tactical approaches employed to improve this situation include attempts to speed the discovery process toward decision points in a timely manner. Accelerating the optimization of high-throughput screening hits is a goal in streamlining the discovery process, and the use of multiple-component condensation (MCC) reactions has proved useful toward this end. MCC reactions are powerful and efficient tools for the generation of diverse compound sets. Collections of compounds can be synthesized with all of the required diversity elements included in a single synthetic step. One of the most widely investigated MCC reactions is the Ugi four-component condensation. This review highlights disclosures of the Ugi reaction published over the past two years (2003 to 2005) in three areas: (i) the Ugi reaction in conjunction with post-condensation cyclization; (ii) bifunctional condensations leading to heterocyclic cores; and (iii) general findings relating to linear products or interesting improvements in the basic Ugi reaction.

  2. Dashboard systems: implementing pharmacometrics from bench to bedside.

    PubMed

    Mould, Diane R; Upton, Richard N; Wojciechowski, Jessica

    2014-09-01

In recent years, there has been increasing interest in the development of medical decision-support tools, including dashboard systems. Dashboard systems are software packages that integrate information and calculations about therapeutics from multiple components into a single interface for use in the clinical environment. Given the high cost of medical care, and the increasing need to demonstrate positive clinical outcomes for reimbursement, dashboard systems may become an important tool for improving patient outcomes, improving clinical efficiency, and containing healthcare costs. Similarly, the costs associated with drug development are also rising. The use of model-based drug development (MBDD) has been proposed as a tool to streamline this process, facilitating the selection of appropriate doses and making informed go/no-go decisions. However, complete implementation of MBDD has not always been successful owing to a variety of factors, including the resources required to provide timely modeling and simulation updates. The application of dashboard systems in drug development reduces the resource requirement and may expedite updating models as new data are collected, allowing modeling results to be available in a timely fashion. In this paper, we present some background information on dashboard systems and propose the use of these systems both in the clinic and during drug development.

  3. Applied Cognitive Task Analysis (ACTA) Methodology

    DTIC Science & Technology

    1997-11-01

experience-based cognitive skills. The primary goal of this project was to develop streamlined methods of Cognitive Task Analysis that would fill this need...We have made important progress in this direction. We have developed streamlined methods of Cognitive Task Analysis. Our evaluation study indicates...developed a CD-based stand-alone instructional package, which will make the Applied Cognitive Task Analysis (ACTA) tools widely accessible. A survey of the

  4. Magneto-phonon polaritons of antiferromagnetic/ion-crystal superlattices

    NASA Astrophysics Data System (ADS)

    Ta, Jin-Xing; Song, Yu-Ling; Wang, Xuan-Zhang

    2010-07-01

Magnetophonon polaritons in superlattices composed of alternating antiferromagnetic and ion-crystal components are investigated with the transfer matrix method. Numerical simulations based on FeF2/TlBr superlattices show that there are four different bulk polariton bands, exhibiting both negative and positive refraction. Many surface polariton modes with various features arise around the bulk bands with negative refraction.
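A generic sketch of the transfer matrix method named above, for a one-dimensional layered period (the 2x2 propagation matrices and all parameter values are illustrative; the paper's magnetic and ion-crystal constitutive relations are not reproduced):

```python
import numpy as np

def layer_matrix(k, d):
    """2x2 transfer matrix of one homogeneous layer with wavenumber k,
    thickness d: propagates (field, field-derivative) across the layer."""
    kd = k * d
    return np.array([[np.cos(kd), np.sin(kd) / k],
                     [-k * np.sin(kd), np.cos(kd)]])

def stack_matrix(layers):
    """Total transfer matrix of one superlattice period:
    ordered product over its layers, each given as (k, d)."""
    m = np.eye(2)
    for k, d in layers:
        m = layer_matrix(k, d) @ m
    return m

# One two-layer period with hypothetical wavenumbers and 50 nm thicknesses
m = stack_matrix([(1.0e7, 50e-9), (2.0e7, 50e-9)])
```

Bulk bands follow from the Bloch condition cos(qD) = Tr(M)/2: propagating solutions exist where |Tr(M)/2| <= 1. Each layer matrix has unit determinant, so the period matrix does too.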

  5. Black raspberry genetic and genomic resources development

    USDA-ARS?s Scientific Manuscript database

    This study incorporates field and laboratory components to advance and streamline identification of a variety of traits of economic interest and to develop molecular markers for marker assisted breeding of black raspberry (Rubus occidentalis). A lack of adapted, disease resistant cultivars has led t...

  6. Developing black raspberry genetic and genomic resources

    USDA-ARS?s Scientific Manuscript database

    This study incorporates field and laboratory components to advance and streamline identification of a variety of traits of economic interest and to develop molecular markers for marker assisted breeding of black raspberry (Rubus occidentalis). A lack of adapted, disease resistant cultivars has led t...

  7. Direct numerical simulation of laminar-turbulent flow over a flat plate at hypersonic flow speeds

    NASA Astrophysics Data System (ADS)

    Egorov, I. V.; Novikov, A. V.

    2016-06-01

A method for direct numerical simulation of a laminar-turbulent flow around bodies at hypersonic flow speeds is proposed. The simulation is performed by solving the full three-dimensional unsteady Navier-Stokes equations. The method of calculation is oriented toward application on supercomputers and is based on implicit monotonic approximation schemes and a modified Newton-Raphson method for solving the nonlinear difference equations. By this method, the development of three-dimensional perturbations in the boundary layer over a flat plate and in a near-wall flow in a compression corner is studied at a free-stream Mach number of M = 5.37. In addition to pulsation characteristics, distributions of the mean coefficients of the viscous flow in the transient section of the streamlined surface are obtained, which enables one to determine the beginning of the laminar-turbulent transition and estimate the characteristics of the turbulent flow in the boundary layer.
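A minimal sketch of the Newton-Raphson iteration used for nonlinear difference equations, applied to a toy 2x2 system (illustrative only; the actual implicit Navier-Stokes residuals are vastly larger and use approximate Jacobians):

```python
import numpy as np

def newton_raphson(residual, jacobian, x0, tol=1e-12, max_iter=50):
    """Solve F(x) = 0 for a nonlinear system by Newton-Raphson iteration:
    repeatedly solve J(x) dx = -F(x) and update x."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        f = residual(x)
        if np.linalg.norm(f) < tol:
            break
        x = x - np.linalg.solve(jacobian(x), f)
    return x

# Toy nonlinear system: x^2 + y^2 = 4 and x*y = 1
f = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] * v[1] - 1.0])
jac = lambda v: np.array([[2 * v[0], 2 * v[1]], [v[1], v[0]]])
root = newton_raphson(f, jac, [2.0, 0.3])
```

In an implicit flow solver, `residual` is the discretized equations at the new time level and `jacobian` (often approximate, hence "modified" Newton) is frozen or simplified to save cost.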

  8. WebGL-enabled 3D visualization of a Solar Flare Simulation

    NASA Astrophysics Data System (ADS)

    Chen, A.; Cheung, C. M. M.; Chintzoglou, G.

    2016-12-01

The visualization of magnetohydrodynamic (MHD) simulations of astrophysical systems such as solar flares often requires specialized software packages (e.g. Paraview and VAPOR). A shortcoming of using such software packages is the inability to share our findings with the public and scientific community in an interactive and engaging manner. By using the JavaScript-based WebGL application programming interface (API) and the three.js JavaScript library, we create an online in-browser experience for rendering solar flare simulations that is interactive and accessible to the general public. The WebGL renderer displays objects such as vector flow fields, streamlines, and textured isosurfaces. This allows the user to explore the spatial relation between the solar coronal magnetic field and the thermodynamic structure of the plasma in which the magnetic field is embedded. Plans for extending the features of the renderer will also be presented.

  9. Inertial Wave Turbulence Driven by Elliptical Instability.

    PubMed

    Le Reun, Thomas; Favier, Benjamin; Barker, Adrian J; Le Bars, Michael

    2017-07-21

    The combination of elliptical deformation of streamlines and vorticity can lead to the destabilization of any rotating flow via the elliptical instability. Such a mechanism has been invoked as a possible source of turbulence in planetary cores subject to tidal deformations. The saturation of the elliptical instability has been shown to generate turbulence composed of nonlinearly interacting waves and strong columnar vortices with varying respective amplitudes, depending on the control parameters and geometry. In this Letter, we present a suite of numerical simulations to investigate the saturation and the transition from vortex-dominated to wave-dominated regimes. This is achieved by simulating the growth and saturation of the elliptical instability in an idealized triply periodic domain, adding a frictional damping to the geostrophic component only, to mimic its interaction with boundaries. We reproduce several experimental observations within one idealized local model and complement them by reaching more extreme flow parameters. In particular, a wave-dominated regime that exhibits many signatures of inertial wave turbulence is characterized for the first time. This regime is expected in planetary interiors.
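A hedged sketch of the frictional damping applied to the geostrophic component only, taking "geostrophic" to mean the vertically averaged part of a triply periodic field as described above (array shapes and the friction coefficient are illustrative assumptions):

```python
import numpy as np

def damp_geostrophic(u, fr, dt):
    """Apply linear friction to the geostrophic (z-averaged) part of u only.

    u: velocity component on a (nz, ny, nx) grid; the vertical mean stands
    in for the geostrophic component in this idealised periodic setup.
    The z-varying (wave) part is left untouched.
    """
    u_geo = u.mean(axis=0, keepdims=True)
    return u - fr * dt * u_geo

rng = np.random.default_rng(0)
u = rng.standard_normal((4, 8, 8))
u2 = damp_geostrophic(u, fr=0.1, dt=1.0)
```

This selective damping mimics the boundary friction that, in experiments, acts preferentially on the columnar vortices, allowing the wave-dominated regime to emerge.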

  10. Inertial Wave Turbulence Driven by Elliptical Instability

    NASA Astrophysics Data System (ADS)

    Le Reun, Thomas; Favier, Benjamin; Barker, Adrian J.; Le Bars, Michael

    2017-07-01

    The combination of elliptical deformation of streamlines and vorticity can lead to the destabilization of any rotating flow via the elliptical instability. Such a mechanism has been invoked as a possible source of turbulence in planetary cores subject to tidal deformations. The saturation of the elliptical instability has been shown to generate turbulence composed of nonlinearly interacting waves and strong columnar vortices with varying respective amplitudes, depending on the control parameters and geometry. In this Letter, we present a suite of numerical simulations to investigate the saturation and the transition from vortex-dominated to wave-dominated regimes. This is achieved by simulating the growth and saturation of the elliptical instability in an idealized triply periodic domain, adding a frictional damping to the geostrophic component only, to mimic its interaction with boundaries. We reproduce several experimental observations within one idealized local model and complement them by reaching more extreme flow parameters. In particular, a wave-dominated regime that exhibits many signatures of inertial wave turbulence is characterized for the first time. This regime is expected in planetary interiors.

  11. Effects of Time-Dependent Inflow Perturbations on Turbulent Flow in a Street Canyon

    NASA Astrophysics Data System (ADS)

    Duan, G.; Ngan, K.

    2017-12-01

    Urban flow and turbulence are driven by atmospheric flows with larger horizontal scales. Since building-resolving computational fluid dynamics models typically employ steady Dirichlet boundary conditions or forcing, the accuracy of numerical simulations may be limited by the neglect of perturbations. We investigate the sensitivity of flow within a unit-aspect-ratio street canyon to time-dependent perturbations near the inflow boundary. Using large-eddy simulation, time-periodic perturbations to the streamwise velocity component are incorporated via the nudging technique. Spatial averages of pointwise differences between unperturbed and perturbed velocity fields (i.e., the error kinetic energy) show a clear dependence on the perturbation period, though spatial structures are largely insensitive to the time-dependent forcing. The response of the error kinetic energy is maximized for perturbation periods comparable to the time scale of the mean canyon circulation. Frequency spectra indicate that this behaviour arises from a resonance between the inflow forcing and the mean motion around closed streamlines. The robustness of the results is confirmed using perturbations derived from measurements of roof-level wind speed.
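A minimal sketch of the nudging (Newtonian relaxation) update and the error-kinetic-energy diagnostic described above (the forcing, relaxation time, and grid size are illustrative, not the study's configuration):

```python
import numpy as np

def nudge(u, u_ref, tau, dt):
    """Newtonian-relaxation (nudging) step toward a reference value:
    du/dt = (u_ref - u) / tau, discretised with forward Euler."""
    return u + dt * (u_ref - u) / tau

def error_kinetic_energy(u_pert, u_ctrl):
    """Domain-averaged error kinetic energy between two velocity fields."""
    return 0.5 * np.mean((u_pert - u_ctrl) ** 2)

# Time-periodic inflow perturbation nudged onto a uniform streamwise wind
t, dt, tau = 0.0, 0.1, 1.0
u = np.zeros(8)
for _ in range(100):
    t += dt
    u = nudge(u, 5.0 + 0.5 * np.sin(2 * np.pi * t / 10.0), tau, dt)
eke = error_kinetic_energy(u, np.full(8, 5.0))
```

Sweeping the perturbation period while holding amplitude fixed, and monitoring the error kinetic energy against an unperturbed run, reproduces the kind of resonance diagnostic the abstract describes.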

  12. Planning, Implementation and Optimization of Future Space Missions using an Immersive Visualization Environment (IVE) Machine

    NASA Astrophysics Data System (ADS)

    Harris, E.

    Planning, Implementation and Optimization of Future Space Missions using an Immersive Visualization Environment (IVE) Machine E. N. Harris, Lockheed Martin Space Systems, Denver, CO and George W. Morgenthaler, U. of Colorado at Boulder History: A team of 3-D engineering visualization experts at the Lockheed Martin Space Systems Company has developed innovative virtual prototyping simulation solutions for ground processing and real-time visualization of design and planning of aerospace missions over the past 6 years. At the University of Colorado, a team of 3-D visualization experts is developing the science of 3-D visualization and immersive visualization at the newly founded BP Center for Visualization, which began operations in October 2001. (See IAF/IAA-01-13.2.09, "The Use of 3-D Immersive Visualization Environments (IVEs) to Plan Space Missions," G. A. Dorn and G. W. Morgenthaler.) Progressing from Today's 3-D Engineering Simulations to Tomorrow's 3-D IVE Mission Planning, Simulation and Optimization Techniques: 3-D IVEs and visualization simulation tools can be combined for efficient planning and design engineering of future aerospace exploration and commercial missions. This technology is currently being developed and will be demonstrated by Lockheed Martin in the IVE at the BP Center using virtual simulation for clearance checks, collision detection, ergonomics and reachability analyses to develop fabrication and processing flows for spacecraft and launch vehicle ground support operations and to optimize mission architecture and vehicle design subject to realistic constraints. Demonstrations: Immediate aerospace applications to be demonstrated include developing streamlined processing flows for Reusable Space Transportation Systems and Atlas Launch Vehicle operations and Mars Polar Lander visual work instructions.
Long-range goals include future international human and robotic space exploration missions such as the development of a Mars Reconnaissance Orbiter and Lunar Base construction scenarios. Innovative solutions utilizing Immersive Visualization provide the key to streamlining the mission planning and optimizing engineering design phases of future aerospace missions.

  13. Strategies for In Vivo Screening and Mitigation of Hepatotoxicity Associated with Antisense Drugs.

    PubMed

    Kamola, Piotr J; Maratou, Klio; Wilson, Paul A; Rush, Kay; Mullaney, Tanya; McKevitt, Tom; Evans, Paula; Ridings, Jim; Chowdhury, Probash; Roulois, Aude; Fairchild, Ann; McCawley, Sean; Cartwright, Karen; Gooderham, Nigel J; Gant, Timothy W; Moores, Kitty; Hughes, Stephen A; Edbrooke, Mark R; Clark, Kenneth; Parry, Joel D

    2017-09-15

Antisense oligonucleotide (ASO) gapmers downregulate gene expression by inducing enzyme-dependent degradation of targeted RNA and represent a promising therapeutic platform for addressing previously undruggable genes. Unfortunately, their therapeutic application, particularly that of the more potent chemistries (e.g., locked-nucleic-acid-containing gapmers), has been hampered by their frequent hepatotoxicity, which could be driven by hybridization-mediated interactions. An early de-risking of this liability is a crucial component of developing safe, ASO-based drugs. To rank ASOs based on their effect on the liver, we have developed an acute screen in the mouse that can be applied early in the drug development cycle. A single-dose (3-day) screen with streamlined endpoints (i.e., plasma transaminase levels and liver weights) was observed to be predictive of the ASO hepatotoxicity ranking established in a repeat-dose (15-day) study. Furthermore, to study the underlying mechanisms of liver toxicity, we applied transcriptome profiling and pathway analyses and show that adverse in vivo liver phenotypes correlate with the number of potent, hybridization-mediated off-target effects (OTEs). We propose that a combination of in silico OTE predictions, streamlined in vivo hepatotoxicity screening, and a transcriptome-wide selectivity screen is a valid approach to identifying and progressing safer compounds. Copyright © 2017 GSK R&D. Published by Elsevier Inc. All rights reserved.

  14. Passive simulation of the nonlinear port-Hamiltonian modeling of a Rhodes Piano

    NASA Astrophysics Data System (ADS)

    Falaize, Antoine; Hélie, Thomas

    2017-03-01

    This paper deals with the time-domain simulation of an electro-mechanical piano: the Fender Rhodes. A simplified description of this multi-physical system is considered. It is composed of a hammer (nonlinear mechanical component), a cantilever beam (linear damped vibrating component) and a pickup (nonlinear magneto-electronic transducer). The approach is to propose a power-balanced formulation of the complete system, from which a guaranteed-passive simulation is derived to generate physically-based realistic sound synthesis. These issues are addressed in four steps. First, a class of Port-Hamiltonian Systems is introduced: these input-to-output systems fulfill a power balance that can be decomposed into conservative, dissipative and source parts. Second, physical models are proposed for each component and are recast in the port-Hamiltonian formulation. In particular, a finite-dimensional model of the cantilever beam is derived, based on a standard modal decomposition applied to the Euler-Bernoulli model. Third, these systems are interconnected, providing a nonlinear finite-dimensional Port-Hamiltonian System of the piano. Fourth, a passive-guaranteed numerical method is proposed. This method is built to preserve the power balance in the discrete-time domain, and more precisely, its decomposition structured into conservative, dissipative and source parts. Finally, simulations are performed for a set of physical parameters, based on empirical but realistic values. They provide a variety of audio signals which are perceptively relevant and qualitatively similar to some signals measured on a real instrument.
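
    The power-balance guarantee described above can be sketched with a discrete-gradient integrator. The following is a minimal illustration, not the paper's actual scheme: a damped nonlinear oscillator written as a port-Hamiltonian system dx/dt = (J - R) grad H(x), stepped with the Itoh-Abe discrete gradient so that the stored energy can only decrease.

```python
import numpy as np

def H(q, p):
    # Hamiltonian: kinetic + nonlinear (Duffing-like) potential energy
    return 0.5 * p**2 + 0.5 * q**2 + 0.25 * q**4

def discrete_gradient(q0, p0, q1, p1):
    # Itoh-Abe coordinate-wise discrete gradient of H (exact derivatives
    # as the fallback when a coordinate does not change)
    dHq = (H(q1, p0) - H(q0, p0)) / (q1 - q0) if q1 != q0 else q0 + q0**3
    dHp = (H(q1, p1) - H(q1, p0)) / (p1 - p0) if p1 != p0 else p0
    return np.array([dHq, dHp])

def step(x, dt, c=0.1):
    # One implicit step of x' = (J - R) * gradH, solved by fixed-point iteration
    J = np.array([[0.0, 1.0], [-1.0, 0.0]])
    R = np.diag([0.0, c])
    x1 = x.copy()
    for _ in range(50):
        g = discrete_gradient(x[0], x[1], x1[0], x1[1])
        x1 = x + dt * (J - R) @ g
    return x1

x = np.array([1.0, 0.0])
energies = [H(*x)]
for _ in range(200):
    x = step(x, 0.01)
    energies.append(H(*x))

# Passivity: stored energy never increases (dissipative system, no source)
assert all(e1 <= e0 + 1e-9 for e0, e1 in zip(energies, energies[1:]))
```

    At convergence of the fixed-point iteration, the discrete chain rule gives H(x1) - H(x0) = -dt * c * (dHp)^2 <= 0 exactly, a discrete analogue of the conservative/dissipative decomposition the abstract describes.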

  15. Assessment of a human computer interface prototyping environment

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1993-01-01

    A Human Computer Interface (HCI) prototyping environment with embedded evaluation capability has been successfully assessed; it will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. The HCI prototyping environment is designed to include four components: (1) an HCI format development tool, (2) a test and evaluation simulator development tool, (3) a dynamic, interactive interface between the HCI prototype and simulator, and (4) an embedded evaluation capability to evaluate the adequacy of an HCI based on a user's performance.

  16. Composite laminate failure parameter optimization through four-point flexure experimentation and analysis

    DOE PAGES

    Nelson, Stacy; English, Shawn; Briggs, Timothy

    2016-05-06

    Fiber-reinforced composite materials offer light-weight solutions to many structural challenges. In the development of high-performance composite structures, a thorough understanding is required of the composite materials themselves as well as methods for the analysis and failure prediction of the relevant composite structures. However, the mechanical properties required for the complete constitutive definition of a composite material can be difficult to determine through experimentation. Therefore, efficient methods are necessary that can be used to determine which properties are relevant to the analysis of a specific structure and to establish a structure's response to a material parameter that can only be defined through estimation. The objectives of this paper deal with demonstrating the potential value of sensitivity and uncertainty quantification techniques during the failure analysis of loaded composite structures, and the proposed methods are applied to the simulation of the four-point flexural characterization of a carbon fiber composite material. Utilizing a recently implemented, phenomenological orthotropic material model that is capable of predicting progressive composite damage and failure, a sensitivity analysis is completed to establish which material parameters are truly relevant to a simulation's outcome. Then, a parameter study is completed to determine the effect of the relevant material properties' expected variations on the simulated four-point flexural behavior as well as to determine the value of an unknown material property. This process demonstrates the ability to formulate accurate predictions in the absence of a rigorous material characterization effort. Finally, the presented results indicate that a sensitivity analysis and parameter study can be used to streamline the material definition process as the described flexural characterization was used for model validation.
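
    The sensitivity-analysis step described above can be illustrated with a toy one-at-a-time parameter study. The surrogate model, parameter names, and values below are hypothetical stand-ins; the paper itself uses finite-element simulations with a progressive-damage material model.

```python
# Hypothetical surrogate for a simulated peak flexural load (dimensionless);
# the real study evaluates this through finite-element simulation.
def peak_load(params):
    E11, E22, G12, Xt = params["E11"], params["E22"], params["G12"], params["Xt"]
    return 0.7 * Xt * (E11 / 120e9) ** 0.5 + 0.01 * E22 / 1e9 + 0.005 * G12 / 1e9

baseline = {"E11": 120e9, "E22": 8e9, "G12": 4e9, "Xt": 2000e6}

def sensitivity(param, delta=0.10):
    # Normalized one-at-a-time sensitivity: % change in output per % change in input
    lo, hi = dict(baseline), dict(baseline)
    lo[param] *= 1 - delta
    hi[param] *= 1 + delta
    p0 = peak_load(baseline)
    return (peak_load(hi) - peak_load(lo)) / (2 * delta * p0)

# Rank parameters by influence; only the top ones would need careful
# experimental characterization, the rest can be estimated.
ranking = sorted(baseline, key=sensitivity, reverse=True)
```

    With this surrogate the longitudinal strength and modulus dominate, mirroring how a sensitivity study lets the material definition effort concentrate on the few parameters that actually drive the simulated response.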

  17. Optically Based Rapid Screening Method for Proven Optimal Treatment Strategies Before Treatment Begins

    DTIC Science & Technology

    to rapidly test/screen breast cancer therapeutics as a strategy to streamline drug development and provide individualized treatment. The results...system can therefore be used to streamline pre-clinical drug development, by reducing the number of animals, cost, and time required to screen new drugs

  18. Two inviscid computational simulations of separated flow about airfoils

    NASA Technical Reports Server (NTRS)

    Barnwell, R. W.

    1976-01-01

    Two inviscid computational simulations of separated flow about airfoils are described. The basic computational method is the line relaxation finite-difference method. Viscous separation is approximated with inviscid free-streamline separation. The point of separation is specified, and the pressure in the separation region is calculated. In the first simulation, the empiricism of constant pressure in the separation region is employed. This empiricism is easier to implement with the present method than with singularity methods. In the second simulation, acoustic theory is used to determine the pressure in the separation region. The results of both simulations are compared with experiment.
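
    As a sketch of the line relaxation finite-difference method mentioned above, the following solves a Laplace (potential-flow) problem by sweeping a tridiagonal solve along one grid line at a time. The geometry and boundary values are illustrative, not the airfoil configuration of the paper.

```python
import numpy as np

def line_relax_laplace(phi, n_sweeps=500):
    # Line (row-by-row) Gauss-Seidel for the 5-point Laplace stencil:
    # each sweep solves a tridiagonal system along one grid line at a time.
    ny, nx = phi.shape
    for _ in range(n_sweeps):
        for j in range(1, ny - 1):
            n = nx - 2
            # tridiagonal system: -phi[i-1] + 4*phi[i] - phi[i+1] = rhs[i]
            a = -np.ones(n); b = 4 * np.ones(n); c = -np.ones(n)
            rhs = phi[j - 1, 1:-1] + phi[j + 1, 1:-1]
            rhs[0] += phi[j, 0]
            rhs[-1] += phi[j, -1]
            # Thomas algorithm: forward elimination, then back-substitution
            for i in range(1, n):
                w = a[i] / b[i - 1]
                b[i] -= w * c[i - 1]
                rhs[i] -= w * rhs[i - 1]
            sol = np.empty(n)
            sol[-1] = rhs[-1] / b[-1]
            for i in range(n - 2, -1, -1):
                sol[i] = (rhs[i] - c[i] * sol[i + 1]) / b[i]
            phi[j, 1:-1] = sol
    return phi

# Dirichlet problem whose exact solution is phi = x (harmonic and linear)
nx = ny = 17
x = np.linspace(0.0, 1.0, nx)
phi = np.zeros((ny, nx))
phi[0, :] = x; phi[-1, :] = x; phi[:, 0] = 0.0; phi[:, -1] = 1.0
phi = line_relax_laplace(phi)
exact = np.tile(x, (ny, 1))
assert np.max(np.abs(phi - exact)) < 1e-6
```

    Solving a full line implicitly per sweep propagates boundary information much faster than point relaxation, which is why line relaxation was a standard workhorse for this class of inviscid flow problems.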

  19. Simulating Effects of Forest Management Practices on Pesticide.

    Treesearch

    M.C. Smith; W.G. Knisel; J.L. Michael; D.G. Neary

    1993-01-01

    The GLEAMS model pesticide component was modified to simulate up to 245 pesticides simultaneously, and the revised model was used to simulate pesticide application windows for forest site preparation and pine release. Simulations for five herbicides were made for soils representing four hydrologic soil groups in four climatic regions of the southeastern United States. Five herbicides...

  20. Design, development and clinical validation of computer-aided surgical simulation system for streamlined orthognathic surgical planning.

    PubMed

    Yuan, Peng; Mai, Huaming; Li, Jianfu; Ho, Dennis Chun-Yu; Lai, Yingying; Liu, Siting; Kim, Daeseung; Xiong, Zixiang; Alfi, David M; Teichgraeber, John F; Gateno, Jaime; Xia, James J

    2017-12-01

    There are many proven problems associated with traditional surgical planning methods for orthognathic surgery. To address these problems, we developed a computer-aided surgical simulation (CASS) system, the AnatomicAligner, to plan orthognathic surgery following our streamlined clinical protocol. The system includes six modules: image segmentation and three-dimensional (3D) reconstruction, registration and reorientation of models to neutral head posture, 3D cephalometric analysis, virtual osteotomy, surgical simulation, and surgical splint generation. The accuracy of the system was validated in a stepwise fashion: first to evaluate the accuracy of AnatomicAligner using 30 sets of patient data, then to evaluate the fitting of splints generated by AnatomicAligner using 10 sets of patient data. The industrial gold standard system, Mimics, was used as the reference. When comparing the results of segmentation, virtual osteotomy and transformation achieved with AnatomicAligner to the ones achieved with Mimics, the absolute deviation between the two systems was clinically insignificant. The average surface deviation between the two models after 3D model reconstruction in AnatomicAligner and Mimics was 0.3 mm with a standard deviation (SD) of 0.03 mm. All the average surface deviations between the two models after virtual osteotomy and transformations were smaller than 0.01 mm with an SD of 0.01 mm. In addition, the fitting of splints generated by AnatomicAligner was at least as good as the ones generated by Mimics. We successfully developed a CASS system, the AnatomicAligner, for planning orthognathic surgery following the streamlined planning protocol. The system has been proven accurate. AnatomicAligner will soon be available freely to the broader clinical and research communities.
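
    The surface-deviation metric used in the validation can be approximated on point-sampled models as a mean nearest-neighbor distance. This is a simplified stand-in with synthetic data, not the AnatomicAligner/Mimics comparison itself.

```python
import numpy as np

def mean_surface_deviation(A, B):
    # Mean nearest-neighbor distance from each vertex of model A to model B
    # (a simplified, point-based stand-in for a surface-to-surface comparison)
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return d.min(axis=1).mean()

rng = np.random.default_rng(0)
model_a = rng.uniform(0, 50, size=(300, 3))   # vertices, in mm (synthetic)
model_b = model_a + 0.01                      # second model, offset ~0.017 mm
dev = mean_surface_deviation(model_a, model_b)
```

    Real implementations typically use a spatial index (k-d tree) and true point-to-triangle distances, but the mean/SD statistics reported in the abstract are of exactly this form.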

  1. Design, development and clinical validation of computer-aided surgical simulation system for streamlined orthognathic surgical planning

    PubMed Central

    Yuan, Peng; Mai, Huaming; Li, Jianfu; Ho, Dennis Chun-Yu; Lai, Yingying; Liu, Siting; Kim, Daeseung; Xiong, Zixiang; Alfi, David M.; Teichgraeber, John F.; Gateno, Jaime

    2017-01-01

    Purpose There are many proven problems associated with traditional surgical planning methods for orthognathic surgery. To address these problems, we developed a computer-aided surgical simulation (CASS) system, the AnatomicAligner, to plan orthognathic surgery following our streamlined clinical protocol. Methods The system includes six modules: image segmentation and three-dimensional (3D) reconstruction, registration and reorientation of models to neutral head posture, 3D cephalometric analysis, virtual osteotomy, surgical simulation, and surgical splint generation. The accuracy of the system was validated in a stepwise fashion: first to evaluate the accuracy of AnatomicAligner using 30 sets of patient data, then to evaluate the fitting of splints generated by AnatomicAligner using 10 sets of patient data. The industrial gold standard system, Mimics, was used as the reference. Result When comparing the results of segmentation, virtual osteotomy and transformation achieved with AnatomicAligner to the ones achieved with Mimics, the absolute deviation between the two systems was clinically insignificant. The average surface deviation between the two models after 3D model reconstruction in AnatomicAligner and Mimics was 0.3 mm with a standard deviation (SD) of 0.03 mm. All the average surface deviations between the two models after virtual osteotomy and transformations were smaller than 0.01 mm with an SD of 0.01 mm. In addition, the fitting of splints generated by AnatomicAligner was at least as good as the ones generated by Mimics. Conclusion We successfully developed a CASS system, the AnatomicAligner, for planning orthognathic surgery following the streamlined planning protocol. The system has been proven accurate. AnatomicAligner will soon be available freely to the broader clinical and research communities. PMID:28432489

  2. Advances in Visualization of 3D Time-Dependent CFD Solutions

    NASA Technical Reports Server (NTRS)

    Lane, David A.; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Numerical simulations of complex 3D time-dependent (unsteady) flows are becoming increasingly feasible because of the progress in computing systems. Unfortunately, many existing flow visualization systems were developed for time-independent (steady) solutions and do not adequately depict solutions from unsteady flow simulations. Furthermore, most systems only handle one time step of the solutions individually and do not consider the time-dependent nature of the solutions. For example, instantaneous streamlines are computed by tracking the particles using one time step of the solution. However, for streaklines and timelines, particles need to be tracked through all time steps. Streaklines can reveal quite different information about the flow than instantaneous streamlines do. Comparisons of instantaneous streamlines with dynamic streaklines are shown. For a complex 3D flow simulation, it is common to generate a grid system with several millions of grid points and to have tens of thousands of time steps. The disk requirement for storing the flow data can easily be tens of gigabytes. Visualizing solutions of this magnitude is a challenging problem with today's computer hardware technology. Even interactive visualization of one time step of the flow data can be a problem for some existing flow visualization systems because of the size of the grid. Current approaches for visualizing complex 3D time-dependent CFD solutions are described. The flow visualization system developed at NASA Ames Research Center to compute time-dependent particle traces from unsteady CFD solutions is described. The system computes particle traces (streaklines) by integrating through the time steps. This system has been used by several NASA scientists to visualize their CFD time-dependent solutions. The flow visualization capabilities of this system are described, and visualization results are shown.
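
    The distinction drawn above between instantaneous streamlines (integrated through a frozen time step) and streaklines (particles tracked through all time steps) can be sketched as follows, using a made-up unsteady velocity field:

```python
import numpy as np

def velocity(x, y, t):
    # Hypothetical unsteady field: uniform horizontal flow with an
    # oscillating cross-stream component
    return 1.0, np.sin(2.0 * np.pi * t)

def streamline(seed, t, dt=0.01, n=100):
    # Instantaneous streamline: integrate through the field frozen at time t
    pts = [seed]
    for _ in range(n):
        u, v = velocity(pts[-1][0], pts[-1][1], t)
        pts.append((pts[-1][0] + u * dt, pts[-1][1] + v * dt))
    return np.array(pts)

def streakline(seed, t_end, dt=0.01):
    # Streakline: particles released from the seed at successive times and
    # advected through the *time-varying* field up to t_end
    particles = []
    t = 0.0
    while t < t_end:
        particles.append(list(seed))
        for p in particles:
            u, v = velocity(p[0], p[1], t)
            p[0] += u * dt; p[1] += v * dt
        t += dt
    return np.array(particles)

sl = streamline((0.0, 0.0), t=0.0)      # frozen at t=0: v=0, a straight line
sk = streakline((0.0, 0.0), t_end=1.0)  # curved: particles saw different v(t)
```

    The instantaneous streamline is misleadingly straight because it only sees one time step, while the streakline records the history of the oscillation, which is exactly the difference the abstract highlights.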

  3. Upgrades to Electronic Speckle Interferometer (ESPI) Operation and Data Analysis at NASA's Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Connelly, Joseph; Blake, Peter; Jones, Joycelyn

    2008-01-01

    The authors report operational upgrades and streamlined data analysis of a commissioned electronic speckle interferometer (ESPI) in a permanent in-house facility at NASA's Goddard Space Flight Center. Our ESPI was commercially purchased for use by the James Webb Space Telescope (JWST) development team. We have quantified and reduced systematic error sources, improved the software operability with a user-friendly graphic interface, developed an instrument simulator, streamlined data analysis for long-duration testing, and implemented a turn-key approach to speckle interferometry. We also summarize results from a test of the JWST support structure (previously published), and present new results from several pieces of test hardware at various environmental conditions.

  4. A Study of Two-Equation Turbulence Models on the Elliptic Streamline Flow

    NASA Technical Reports Server (NTRS)

    Blaisdell, Gregory A.; Qin, Jim H.; Shariff, Karim; Rai, Man Mohan (Technical Monitor)

    1995-01-01

    Several two-equation turbulence models are compared to data from direct numerical simulations (DNS) of the homogeneous elliptic streamline flow, which combines rotation and strain. The models considered include standard two-equation models and models with corrections for rotational effects. Most of the rotational corrections modify the dissipation rate equation to account for the reduced dissipation rate in rotating turbulent flows; however, the DNS data show that the production term in the turbulent kinetic energy equation is not modeled correctly by these models. Nonlinear relations for the Reynolds stresses are considered as a means of modifying the production term. Implications for the modeling of turbulent vortices will be discussed.
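
    For context, the production term the abstract refers to can be written in standard two-equation notation; this is textbook k-epsilon notation, not taken from the paper itself:

```latex
\frac{Dk}{Dt} = \mathcal{P} - \varepsilon + \text{(transport)}, \qquad
\mathcal{P} \equiv -\overline{u_i' u_j'}\,\frac{\partial U_i}{\partial x_j},
\qquad
\overline{u_i' u_j'} \approx \tfrac{2}{3}\,k\,\delta_{ij}
  - \nu_t\left(\frac{\partial U_i}{\partial x_j} + \frac{\partial U_j}{\partial x_i}\right)
```

    Under this linear (Boussinesq) closure the modeled production depends only on the mean strain, not the rotation, which is consistent with the abstract's finding that rotational corrections to the dissipation equation alone are insufficient and that nonlinear Reynolds-stress relations are needed.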

  5. P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)

    NASA Astrophysics Data System (ADS)

    Kropp, Derek L.

    2009-05-01

    One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.

  6. A new approach to integrate GPU-based Monte Carlo simulation into inverse treatment plan optimization for proton therapy.

    PubMed

    Li, Yongbao; Tian, Zhen; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun

    2017-01-07

    Monte Carlo (MC)-based spot dose calculation is highly desired for inverse treatment planning in proton therapy because of its accuracy. Recent studies on biological optimization have also indicated the use of MC methods to compute relevant quantities of interest, e.g. linear energy transfer. Although GPU-based MC engines have been developed to address inverse optimization problems, their efficiency still needs to be improved. Also, the use of a large number of GPUs in MC calculation is not favorable for clinical applications. The previously proposed adaptive particle sampling (APS) method can improve the efficiency of MC-based inverse optimization by using the computationally expensive MC simulation more effectively. This method is more efficient than the conventional approach that performs spot dose calculation and optimization in two sequential steps. In this paper, we propose a computational library to perform MC-based spot dose calculation on GPU with the APS scheme. The implemented APS method performs a non-uniform sampling of the particles from pencil beam spots during the optimization process, favoring those from the high intensity spots. The library also conducts two computationally intensive matrix-vector operations frequently used when solving an optimization problem. This library design allows a streamlined integration of the MC-based spot dose calculation into an existing proton therapy inverse planning process. We tested the developed library in a typical inverse optimization system with four patient cases. The library achieved the targeted functions by supporting inverse planning in various proton therapy schemes, e.g. single field uniform dose, 3D intensity modulated proton therapy, and distal edge tracking. The efficiency was 41.6  ±  15.3% higher than the use of a GPU-based MC package in a conventional calculation scheme. The total computation time ranged between 2 and 50 min on a single GPU card depending on the problem size.
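
    The non-uniform spot sampling at the heart of the APS scheme can be sketched as a particle-budget allocation proportional to spot intensity. The function and numbers below are illustrative, not the library's actual API:

```python
import numpy as np

def allocate_particles(spot_intensities, total_particles, floor=100):
    # Non-uniform sampling: each pencil-beam spot gets an MC particle budget
    # proportional to its current optimized intensity, with a small floor so
    # low-weight spots are still sampled (total is therefore approximate).
    w = np.asarray(spot_intensities, dtype=float)
    w = w / w.sum()
    counts = np.maximum(np.floor(w * total_particles).astype(int), floor)
    return counts

intensities = [5.0, 1.0, 0.1, 10.0]       # hypothetical spot weights
counts = allocate_particles(intensities, 1_000_000)
```

    Concentrating particles on high-intensity spots spends the expensive MC simulation where it most affects the dose, which is the efficiency argument made in the abstract.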

  7. A new approach to integrate GPU-based Monte Carlo simulation into inverse treatment plan optimization for proton therapy

    NASA Astrophysics Data System (ADS)

    Li, Yongbao; Tian, Zhen; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun

    2017-01-01

    Monte Carlo (MC)-based spot dose calculation is highly desired for inverse treatment planning in proton therapy because of its accuracy. Recent studies on biological optimization have also indicated the use of MC methods to compute relevant quantities of interest, e.g. linear energy transfer. Although GPU-based MC engines have been developed to address inverse optimization problems, their efficiency still needs to be improved. Also, the use of a large number of GPUs in MC calculation is not favorable for clinical applications. The previously proposed adaptive particle sampling (APS) method can improve the efficiency of MC-based inverse optimization by using the computationally expensive MC simulation more effectively. This method is more efficient than the conventional approach that performs spot dose calculation and optimization in two sequential steps. In this paper, we propose a computational library to perform MC-based spot dose calculation on GPU with the APS scheme. The implemented APS method performs a non-uniform sampling of the particles from pencil beam spots during the optimization process, favoring those from the high intensity spots. The library also conducts two computationally intensive matrix-vector operations frequently used when solving an optimization problem. This library design allows a streamlined integration of the MC-based spot dose calculation into an existing proton therapy inverse planning process. We tested the developed library in a typical inverse optimization system with four patient cases. The library achieved the targeted functions by supporting inverse planning in various proton therapy schemes, e.g. single field uniform dose, 3D intensity modulated proton therapy, and distal edge tracking. The efficiency was 41.6  ±  15.3% higher than the use of a GPU-based MC package in a conventional calculation scheme. The total computation time ranged between 2 and 50 min on a single GPU card depending on the problem size.

  8. A New Approach to Integrate GPU-based Monte Carlo Simulation into Inverse Treatment Plan Optimization for Proton Therapy

    PubMed Central

    Li, Yongbao; Tian, Zhen; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun

    2016-01-01

    Monte Carlo (MC)-based spot dose calculation is highly desired for inverse treatment planning in proton therapy because of its accuracy. Recent studies on biological optimization have also indicated the use of MC methods to compute relevant quantities of interest, e.g. linear energy transfer. Although GPU-based MC engines have been developed to address inverse optimization problems, their efficiency still needs to be improved. Also, the use of a large number of GPUs in MC calculation is not favorable for clinical applications. The previously proposed adaptive particle sampling (APS) method can improve the efficiency of MC-based inverse optimization by using the computationally expensive MC simulation more effectively. This method is more efficient than the conventional approach that performs spot dose calculation and optimization in two sequential steps. In this paper, we propose a computational library to perform MC-based spot dose calculation on GPU with the APS scheme. The implemented APS method performs a non-uniform sampling of the particles from pencil beam spots during the optimization process, favoring those from the high intensity spots. The library also conducts two computationally intensive matrix-vector operations frequently used when solving an optimization problem. This library design allows a streamlined integration of the MC-based spot dose calculation into an existing proton therapy inverse planning process. We tested the developed library in a typical inverse optimization system with four patient cases. The library achieved the targeted functions by supporting inverse planning in various proton therapy schemes, e.g. single field uniform dose, 3D intensity modulated proton therapy, and distal edge tracking. The efficiency was 41.6±15.3% higher than the use of a GPU-based MC package in a conventional calculation scheme. The total computation time ranged between 2 and 50 min on a single GPU card depending on the problem size. PMID:27991456

  9. Reducing the Time and Cost of Testing Engines

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Producing a new aircraft engine currently costs approximately $1 billion, with 3 years of development time for a commercial engine and 10 years for a military engine. The high development time and cost make it extremely difficult to transition advanced technologies for cleaner, quieter, and more efficient new engines. To reduce this time and cost, NASA created a vision for the future where designers would use high-fidelity computer simulations early in the design process in order to resolve critical design issues before building the expensive engine hardware. To accomplish this vision, NASA's Glenn Research Center initiated a collaborative effort with the aerospace industry and academia to develop its Numerical Propulsion System Simulation (NPSS), an advanced engineering environment for the analysis and design of aerospace propulsion systems and components. Partners estimate that using NPSS has the potential to dramatically reduce the time, effort, and expense necessary to design and test jet engines by generating sophisticated computer simulations of an aerospace object or system. These simulations will permit an engineer to test various design options without having to conduct costly and time-consuming real-life tests. By accelerating and streamlining the engine system design analysis and test phases, NPSS facilitates bringing the final product to market faster. NASA's NPSS Version (V)1.X effort was a task within the Agency's Computational Aerospace Sciences project of the High Performance Computing and Communication program, which had a mission to accelerate the availability of high-performance computing hardware and software to the U.S. aerospace community for its use in design processes. The technology brings value back to NASA by improving methods of analyzing and testing space transportation components.

  10. A VLSI Implementation of Four-Phase Lift Controller Using Verilog HDL

    NASA Astrophysics Data System (ADS)

    Kumar, Manish; Singh, Priyanka; Singh, Shesha

    2017-08-01

    With the advent of a staggering range of new technologies providing ease of mobility and transportation, elevators have become an essential component of all high-rise buildings. An elevator is a type of vertical transportation that moves people between the floors of a high-rise building. A four-phase lift controller modeled in Verilog HDL using a Finite State Machine (FSM) is presented in this paper. Verilog HDL helps in the automated analysis and simulation of the lift controller circuit. The design is based on synchronous inputs and operates at a fixed frequency. The lift motion is controlled by accepting the destination floor level as input and generating control signals as output. In the proposed design, Verilog RTL code was developed and verified. The Project Navigator of XILINX was used as the code-writing platform, and results were simulated using the ModelSim 5.4a simulator. This paper discusses the overall evolution of the design and the simulated results.
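
    The FSM-based controller logic can be sketched in Python for illustration; the paper's actual design is synchronous Verilog RTL, and the state names and transitions below are hypothetical:

```python
# Hypothetical four-state lift controller FSM, sketched in Python.
IDLE, MOVING_UP, MOVING_DOWN, DOORS_OPEN = "IDLE", "MOVING_UP", "MOVING_DOWN", "DOORS_OPEN"

def next_state(state, floor, dest):
    # Combinational next-state logic driven by current floor vs destination
    if state == DOORS_OPEN:
        return IDLE
    if floor == dest:
        return DOORS_OPEN if state in (MOVING_UP, MOVING_DOWN) else IDLE
    return MOVING_UP if dest > floor else MOVING_DOWN

def step(state, floor, dest):
    # One clock tick: update the state register, then move one floor
    # if the new state is a moving state
    state = next_state(state, floor, dest)
    if state == MOVING_UP:
        floor += 1
    elif state == MOVING_DOWN:
        floor -= 1
    return state, floor

# Simulate a request from floor 0 to floor 3
state, floor = IDLE, 0
trace = []
for _ in range(6):
    state, floor = step(state, floor, dest=3)
    trace.append((state, floor))
```

    In an RTL version, `next_state` would be a combinational `always @(*)` block and `step` the clocked `always @(posedge clk)` state register, which is the structure the paper describes.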

  11. Studying Turbulence Using Numerical Simulation Databases. Part 6; Proceedings of the 1996 Summer Program

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics considered include: New approach to turbulence modeling; Second moment closure analysis of the backstep flow database; Prediction of the backflow and recovery regions in the backward facing step at various Reynolds numbers; Turbulent flame propagation in partially premixed flames; Ensemble averaged dynamic modeling. Also included a study of the turbulence structures of wall-bounded shear flows; Simulation and modeling of the elliptic streamline flow.

  12. Tractography of Association Fibers Associated with Language Processing.

    PubMed

    Egger, K; Yang, S; Reisert, M; Kaller, C; Mader, I; Beume, L; Weiller, C; Urbach, H

    2015-10-01

    Several major association fiber tracts are known to be part of the language processing system. There is evidence that high angular diffusion-based MRI is able to separate these fascicles in a consistent way. In this study, we wanted to prove this hypothesis using a novel whole brain "global tracking" approach and to test for possible lateralization. Global tracking was performed in six healthy right-handed volunteers for the arcuate fascicle (AF), the medial longitudinal fascicle (MdLF), the inferior fronto-occipital fascicle (IFOF), and the inferior longitudinal fascicle (ILF). These fiber tracts were characterized quantitatively using the number of streamlines (SL) and the mean fractional anisotropy (FA). We were able to characterize the AF, the MdLF, the IFOF, and the ILF consistently in six healthy volunteers using global tracking. A left-sided dominance (LI > 0.2) for the AF was found in all participants. The MdLF showed a left-sided dominance in four participants (one female, three male). Regarding the FA, no lateralization (LI > 0.2) could be shown in any of the fascicles. Using a novel global tracking algorithm we confirmed that the courses of the primary language processing associated fascicles can consistently be differentiated. Additionally we were able to show a streamline-based left-sided lateralization in the AF of all right-handed healthy subjects.
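
    The abstract does not spell out its lateralization formula; the conventional streamline-count laterality index, with the 0.2 dominance threshold the authors use, would look like this:

```python
def laterality_index(left_sl, right_sl):
    # Conventional laterality index from streamline counts;
    # LI > 0.2 is read as left-sided dominance (the abstract's threshold)
    return (left_sl - right_sl) / (left_sl + right_sl)

# Illustrative counts, not data from the study
assert laterality_index(1500, 900) > 0.2        # left-dominant
assert abs(laterality_index(1000, 1000)) <= 0.2  # no lateralization
```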

  13. Computer simulation of two-dimensional unsteady flows in estuaries and embayments by the method of characteristics : basic theory and the formulation of the numerical method

    USGS Publications Warehouse

    Lai, Chintu

    1977-01-01

    Two-dimensional unsteady flows of homogeneous density in estuaries and embayments can be described by hyperbolic, quasi-linear partial differential equations involving three dependent and three independent variables. A linear combination of these equations leads to a parametric equation of characteristic form, which consists of two parts: total differentiation along the bicharacteristics and partial differentiation in space. For its numerical solution, the specified-time-interval scheme has been used. The unknown, partial space-derivative terms can be eliminated first by suitable combinations of difference equations, converted from the corresponding differential forms and written along four selected bicharacteristics and a streamline. Other unknowns are thus made solvable from the known variables on the current time plane. The computation is carried to the second-order accuracy by using the trapezoidal rule of integration. Means to handle complex boundary conditions are developed for practical application. Computer programs have been written and a mathematical model has been constructed for flow simulation. The favorable computer outputs suggest further exploration and development of the model worthwhile. (Woodard-USGS)
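
    The specified-time-interval idea, tracing characteristics back to the known time plane and interpolating there, can be illustrated on scalar 1D advection, a much simpler setting than the paper's system of bicharacteristics:

```python
import numpy as np

def advect_step(u, x, c, dt):
    # Specified-time-interval characteristics for u_t + c u_x = 0:
    # trace each grid point's characteristic back by c*dt and interpolate
    # the solution on the known time level.
    x_foot = x - c * dt               # foot of the characteristic
    return np.interp(x_foot, x, u)

x = np.linspace(0.0, 1.0, 201)
u = np.exp(-200 * (x - 0.3) ** 2)     # Gaussian pulse centered at x = 0.3
for _ in range(40):
    u = advect_step(u, x, c=0.5, dt=0.005)
# After 40 steps the pulse should be centered near x = 0.3 + 0.5*0.2 = 0.4
peak_x = x[np.argmax(u)]
```

    The paper's scheme generalizes this: the feet of four bicharacteristics and a streamline are located on the current time plane, and the trapezoidal rule raises the integration along them to second-order accuracy.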

  14. Template-Directed Instrumentation Reduces Cost and Improves Efficiency for Total Knee Arthroplasty: An Economic Decision Analysis and Pilot Study.

    PubMed

    McLawhorn, Alexander S; Carroll, Kaitlin M; Blevins, Jason L; DeNegre, Scott T; Mayman, David J; Jerabek, Seth A

    2015-10-01

    Template-directed instrumentation (TDI) for total knee arthroplasty (TKA) may streamline operating room (OR) workflow and reduce costs by preselecting implants and minimizing instrument tray burden. A decision model simulated the economics of TDI. Sensitivity analyses determined thresholds for model variables to ensure TDI success. A clinical pilot was reviewed. The accuracy of preoperative templates was validated, and 20 consecutive primary TKAs were performed using TDI. The model determined that preoperative component size estimation should be accurate to ±1 implant size for 50% of TKAs to implement TDI. The pilot showed that preoperative template accuracy exceeded 97%. There were statistically significant improvements in OR turnover time and in-room time for TDI compared to an historical cohort of TKAs. TDI reduces costs and improves OR efficiency. Copyright © 2015 Elsevier Inc. All rights reserved.
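
    The break-even logic of the decision model can be sketched as a two-outcome expected-value calculation; the cost figures below are hypothetical, not the paper's:

```python
def tdi_expected_savings(p_accurate, savings_per_case, cost_inaccurate):
    # Simple two-outcome decision model: accurate templates save tray and
    # turnover cost; misses incur the cost of opening extra trays mid-case.
    return p_accurate * savings_per_case - (1 - p_accurate) * cost_inaccurate

# With equal per-case savings and miss cost, break-even accuracy is 50%,
# matching the threshold the abstract reports; the pilot's 97% accuracy
# sits well above it.
assert tdi_expected_savings(0.5, 100.0, 100.0) == 0.0
assert tdi_expected_savings(0.97, 100.0, 100.0) > 0.0
```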

  15. Eulerian and Lagrangian methods for vortex tracking in 2D and 3D flows

    NASA Astrophysics Data System (ADS)

    Huang, Yangzi; Green, Melissa

    2014-11-01

    Coherent structures are a key component of unsteady flows in shear layers. Improvement of experimental techniques has led to larger amounts of data and requires automated procedures for vortex tracking. Many vortex criteria are Eulerian, and identify the structures by an instantaneous local swirling motion in the field, indicated by closed or spiral streamlines or pathlines in a reference frame. Alternatively, a Lagrangian Coherent Structures (LCS) analysis is a Lagrangian method based on quantities calculated along fluid particle trajectories. In the current work, vortex detection is demonstrated on data from the simulation of two cases: a 2D flow with a flat plate undergoing a 45° pitch-up maneuver and a 3D wall-bounded turbulent channel flow. Vortices are visualized and tracked by their centers and boundaries using Γ1, the Q criterion, and LCS saddle points. In the 2D case, the saddle-point trace showed a rapid acceleration of the structure, which indicates shedding from the plate. For the channel flow, the saddle-point trace shows that the average structure convection speed exhibits a similar trend as a function of wall-normal distance as the mean velocity profile, and leads to statistical quantities of vortex dynamics. Dr. Jeff Eldredge and his research group at UCLA are gratefully acknowledged for sharing the simulation database for the current research. This work was supported by the Air Force Office of Scientific Research under AFOSR Award No. FA9550-14-1-0210.
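
    Of the Eulerian criteria mentioned, the Q criterion is straightforward to sketch on gridded velocity data: Q = 0.5 * (||Omega||^2 - ||S||^2), positive where rotation dominates strain. The vortex field below is synthetic, not the simulation data used in the study:

```python
import numpy as np

def q_criterion(u, v, dx, dy):
    # Q = 0.5 * (||Omega||^2 - ||S||^2) for a 2D velocity field;
    # positive Q marks regions where rotation dominates strain
    dudy, dudx = np.gradient(u, dy, dx)   # axis 0 is y, axis 1 is x
    dvdy, dvdx = np.gradient(v, dy, dx)
    S2 = dudx**2 + dvdy**2 + 0.5 * (dudy + dvdx) ** 2   # strain-rate norm^2
    O2 = 0.5 * (dudy - dvdx) ** 2                        # rotation norm^2
    return 0.5 * (O2 - S2)

x = np.linspace(-3, 3, 121)
y = np.linspace(-3, 3, 121)
X, Y = np.meshgrid(x, y)
r2 = X**2 + Y**2
u, v = -Y * np.exp(-r2), X * np.exp(-r2)   # a single Gaussian-core vortex
Q = q_criterion(u, v, x[1] - x[0], y[1] - y[0])
# Q peaks at the vortex core (grid center) and decays away from it
```

    Tracking then reduces to thresholding Q per time step and following the centers of the connected regions, while LCS saddle points require the Lagrangian trajectory integration described in the abstract.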

  16. The 3D pore structure and fluid dynamics simulation of macroporous monoliths: High permeability due to alternating channel width.

    PubMed

    Jungreuthmayer, Christian; Steppert, Petra; Sekot, Gerhard; Zankel, Armin; Reingruber, Herbert; Zanghellini, Jürgen; Jungbauer, Alois

    2015-12-18

    Polymethacrylate-based monoliths have excellent flow properties. Flow through wide channels interconnected with narrow channels is theoretically assumed to account for their favorable permeability. Monoliths were cut into 898 slices at 50 nm intervals and visualized by serial block-face scanning electron microscopy (SBEM). A 3D structure was reconstructed and used to calculate flow profiles within the monolith and to calculate pressure drop and permeability by computational fluid dynamics (CFD). The calculated and measured permeabilities showed good agreement. Narrow channels clearly flowed into wide channels, and wide into narrow, in a repetitive manner, which supported the hypothesis explaining the favorable flow properties of these materials. This alternating property is also reflected in the fluctuating streamline velocity. These findings were corroborated by artificial monoliths composed of regular (interconnected) cells in which narrow cells followed wide cells. In the real monolith and in the artificial monoliths with interconnected flow channels, similar velocity fluctuations were observed. A two-phase flow simulation showed a lateral velocity component, which may contribute to the transport of molecules to the monolith wall. Our study showed that the interconnection of narrow and wide pores is responsible for the excellent pressure-flow properties. This study also serves as a guide for the further design of continuous porous materials with good flow properties. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Security Analysis of Smart Grid Cyber Physical Infrastructures Using Modeling and Game Theoretic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T.

    Cyber-physical computing infrastructures typically consist of a number of interconnected sites whose operation depends critically on both cyber and physical components. Both types of components are subject to attacks of different kinds and frequencies, which must be accounted for in the initial provisioning and subsequent operation of the infrastructure via information security analysis. Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified against results from game-theoretic analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the electric-sector failure scenarios and impact analyses from the NESCOR Working Group study. From the Section 5 electric-sector representative failure scenarios, we extracted four generic failure scenarios and grouped them into three specific threat categories (confidentiality, integrity, and availability). These specific failure scenarios serve as a demonstration of our simulation. The analysis using our ABGT simulation demonstrates how to model the electric-sector functional domain using a set of rationalized game-theoretic rules decomposed from the failure scenarios, in terms of how those scenarios might impact the cyber-physical infrastructure network with respect to CIA.

  18. Impingement of Droplets in 60 Deg Elbows with Potential Flow

    NASA Technical Reports Server (NTRS)

    Hacker, Paul T.; Saper, Paul G.; Kadow, Charles F.

    1956-01-01

    Trajectories were determined for water droplets or other aerosol particles in air flowing through 60° elbows especially designed for two-dimensional potential motion. The elbows were established by selecting as walls of each elbow two streamlines of a flow field produced by a complex potential function that establishes a two-dimensional flow around a 60° bend. An unlimited number of elbows with slightly different shapes can be established by selecting different pairs of streamlines as walls; some of these have a pocket on the outside wall. The elbows produced by the complex potential function are suitable for use in aircraft air-inlet ducts and have the following characteristics: (1) The resultant velocity at any point inside the elbow is always greater than zero but never exceeds the velocity at the entrance. (2) The air flow field at the entrance and exit is almost uniform and rectilinear. (3) The elbows are symmetrical with respect to the bisector of the angle of bend. These elbows should have lower pressure losses than bends of constant cross-sectional area. The droplet impingement data derived from the trajectories are presented along with equations so that the collection efficiency, area, rate, and distribution of droplet impingement can be determined for any elbow defined by any pair of streamlines within a portion of the flow field established by the complex potential function. Coordinates for some typical streamlines of the flow field and velocity components for several points along these streamlines are presented in tabular form. A comparison of the 60° elbow with previous calculations for a comparable 90° elbow indicated that the impingement characteristics of the two elbows were very similar.

  19. A psychophysiological assessment of operator workload during simulated flight missions

    NASA Technical Reports Server (NTRS)

    Kramer, Arthur F.; Sirevaag, Erik J.; Braune, Rolf

    1987-01-01

    The applicability of the dual-task event-related (brain) potential (ERP) paradigm to the assessment of an operator's mental workload and residual capacity in the complex setting of a flight mission was demonstrated using ERP measurements and subjective workload ratings of student pilots flying a fixed-base single-engine simulator. Data were collected during two separate 45-min flights differing in difficulty; flight demands were examined by dividing each flight into four segments: takeoff, straight-and-level flight, holding patterns, and landings. The P300 ERP component in particular was found to discriminate among the levels of task difficulty in a systematic manner, decreasing in amplitude with an increase in task demands. The P300 amplitude is shown to be negatively correlated with deviations from command headings across the four flight segments.

  20. Fast Automatic Segmentation of White Matter Streamlines Based on a Multi-Subject Bundle Atlas.

    PubMed

    Labra, Nicole; Guevara, Pamela; Duclap, Delphine; Houenou, Josselin; Poupon, Cyril; Mangin, Jean-François; Figueroa, Miguel

    2017-01-01

    This paper presents an algorithm for fast segmentation of white matter bundles from massive dMRI tractography datasets using a multisubject atlas. We use a distance metric to compare streamlines in a subject dataset to labeled centroids in the atlas, and label them using a per-bundle configurable threshold. In order to reduce segmentation time, the algorithm first preprocesses the data using a simplified distance metric to rapidly discard candidate streamlines in multiple stages, while guaranteeing that no false negatives are produced. The smaller set of remaining streamlines is then segmented using the original metric, thus eliminating any false positives from the preprocessing stage. As a result, a single-thread implementation of the algorithm can segment a dataset of almost 9 million streamlines in less than 6 minutes. Moreover, parallel versions of our algorithm for multicore processors and graphics processing units further reduce the segmentation time to less than 22 seconds and to 5 seconds, respectively. This performance enables the use of the algorithm in truly interactive applications for visualization, analysis, and segmentation of large white matter tractography datasets.
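
    The paper's two-stage filtering idea (discard cheaply, confirm exactly) can be sketched as follows. The specific metrics here are illustrative assumptions, not the authors' exact ones: the "exact" metric is a mean pointwise distance with flip handling (akin to the MDF distance common in tractography), and the cheap stage uses the distance between streamline centroids, which by the triangle inequality is a guaranteed lower bound, so discarding on it produces no false negatives.

```python
import numpy as np

def mdf(a, b):
    """Exact metric: mean pointwise distance between two streamlines resampled
    to the same number of points, taking the smaller of direct and flipped
    orientations (MDF-style)."""
    direct = np.linalg.norm(a - b, axis=1).mean()
    flipped = np.linalg.norm(a - b[::-1], axis=1).mean()
    return min(direct, flipped)

def centroid_lower_bound(a, b):
    """Cheap stage: ||mean(a) - mean(b)|| <= mean pointwise distance (triangle
    inequality), and it is flip-invariant, so it never rejects a true match."""
    return np.linalg.norm(a.mean(axis=0) - b.mean(axis=0))

def segment(streamlines, centroid, threshold):
    # Stage 1: rapidly discard candidates whose lower bound exceeds the threshold
    survivors = [s for s in streamlines
                 if centroid_lower_bound(s, centroid) <= threshold]
    # Stage 2: exact metric on the much smaller surviving set removes false positives
    return [s for s in survivors if mdf(s, centroid) <= threshold]

# Hypothetical data: a straight atlas centroid, one near and one far streamline
centroid = np.stack([np.linspace(0, 1, 10), np.zeros(10), np.zeros(10)], axis=1)
near, far = centroid + [0, 0.1, 0], centroid + [0, 10.0, 0]
kept = segment([near, far], centroid, threshold=1.0)
```

    In the actual algorithm the cheap stage runs in several passes of increasing cost, and the per-bundle thresholds come from the multi-subject atlas; the structure of the computation is the same.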

  1. Evaluation of the impact of carotid artery bifurcation angle on hemodynamics by use of computational fluid dynamics: a simulation and volunteer study.

    PubMed

    Saho, Tatsunori; Onishi, Hideo

    2016-07-01

    In this study, we evaluated the hemodynamics of carotid artery bifurcations with various geometries using simulated and volunteer models based on magnetic resonance imaging (MRI). Computational fluid dynamics (CFD) analyses were performed with OpenFOAM. The velocity distribution, streamlines, and wall shear stress (WSS) were evaluated in simulated models with known bifurcation angles (30°, 40°, 50°, 60°, derived from patient data) and in three-dimensional (3D) healthy-volunteer models. Separated flow was observed at the outer side of the bifurcation, and models with larger bifurcation angles showed upstream movement of the separation point. Local WSS values at the outer bifurcation [both simulated (<30 Pa) and volunteer (<50 Pa) models] were lower than those in the inner region (>100 Pa). The bifurcation angle had a significant negative correlation with the WSS value (p<0.05). The results of this study show that the carotid artery bifurcation angle is related to the WSS value, suggesting that hemodynamic stress can be estimated from carotid artery geometry. The construction of a clinical database for estimating the risk of developing atherosclerosis is warranted.

  2. Discussion of boundary-layer characteristics near the casing of an axial-flow compressor

    NASA Technical Reports Server (NTRS)

    Mager, Artur; Mahoney, John J; Budinger, Ray E

    1951-01-01

    Boundary-layer velocity profiles on the casing of an axial-flow compressor behind the guide vanes and rotor were measured and resolved into two components: along the streamline of the flow and perpendicular to it. Boundary-layer thickness and the deflection of the boundary layer at the wall were the generalizing parameters. By use of these results and the momentum-integral equations, the characteristics of the boundary layer on the walls of an axial-flow compressor are qualitatively discussed. Important parameters governing secondary flow in the boundary layer appear to be the turning of the flow and the product of boundary-layer thickness and streamline curvature outside the boundary layer. Two types of separation are shown to be possible in a three-dimensional boundary layer.

  3. Distributed Simulation as a modelling tool for the development of a simulation-based training programme for cardiovascular specialties.

    PubMed

    Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando

    2017-01-01

    Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) an initial conceptual stage for mapping the structural criteria and parameters of the simulation training framework and scenario development (n = 16); (2) training facility design using Distributed Simulation; (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback, including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for the realism of the simulation training framework, the realism of the training facility and the relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally. This corresponded with the qualitative component, which augmented the quantitative findings; trainees' feedback was used to make iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through the application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from the conceptual and development stages through implementation, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.

  4. 24 CFR 990.245 - Types of appeals.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Types of appeals. 990.245 Section... THE PUBLIC HOUSING OPERATING FUND PROGRAM Appeals § 990.245 Types of appeals. (a) Streamlined appeal. This appeal would demonstrate that the application of a specific Operating Fund formula component has a...

  5. Measuring the quantum geometric tensor in two-dimensional photonic and exciton-polariton systems

    NASA Astrophysics Data System (ADS)

    Bleu, O.; Solnyshkov, D. D.; Malpuech, G.

    2018-05-01

    We propose theoretically a method for measuring all the components of the quantum geometric tensor (the metric tensor and the Berry curvature) in a photonic system. The method is based on standard optical measurements. It applies to two-band systems, which can be mapped to a pseudospin, and to four-band systems, which can be described by two entangled pseudospins. We apply this method to several specific cases. We consider a 2D planar cavity with two polarization eigenmodes, where the pseudospin measurement can be performed via polarization-resolved photoluminescence. We also consider the s band of a staggered honeycomb lattice with polarization-degenerate modes (scalar photons), where the sublattice pseudospin can be measured by performing spatially resolved interferometric measurements. We finally consider the s band of a honeycomb lattice with polarized (spinor) photons as an example of a four-band model. We simulate realistic experimental situations in all cases. We find the photon eigenstates by solving the Schrödinger equation including pumping and finite lifetime, and then simulate the measurements to finally extract realistic mappings of the k-dependent tensor components.

  6. Mississippi Department of Transportation Research Division peer exchange II final report : September 24-26, 2002.

    DOT National Transportation Integrated Search

    2009-01-01

    A peer exchange can discuss general research management, or the agency can focus on certain parts of the research process. MDOT chose to focus on four themes: 1) Current Research Organization, 2) Identifying Improvements That Streamline the Research ...

  7. Development of the brain's structural network efficiency in early adolescence: A longitudinal DTI twin study.

    PubMed

    Koenis, Marinka M G; Brouwer, Rachel M; van den Heuvel, Martijn P; Mandl, René C W; van Soelen, Inge L C; Kahn, René S; Boomsma, Dorret I; Hulshoff Pol, Hilleke E

    2015-12-01

    The brain is a network, and our intelligence depends in part on the efficiency of this network. The network of adolescents differs from that of adults, suggesting developmental changes. However, whether the network changes over time at the individual level and, if so, how this relates to intelligence, is unresolved in adolescence. In addition, the influence of genetic factors on the developing network is not known. Therefore, in a longitudinal study of 162 healthy adolescent twins and their siblings (mean age at baseline 9.9 [range 9.0-15.0] years), we mapped local and global structural network efficiency of cerebral fiber pathways (weighted with mean FA and streamline count) and assessed intelligence over a three-year interval. We find that the efficiency of the brain's structural network is highly heritable (locally up to 74%). FA-based local and global efficiency increases during early adolescence. Streamline-count-based local efficiency both increases and decreases, and global efficiency reorganizes to a net decrease. Local FA-based efficiency was correlated with IQ. Moreover, increases in FA-based network efficiency (global and local) and decreases in streamline-count-based local efficiency are related to increases in intellectual functioning. Individual changes in intelligence and in local FA-based efficiency appear to go hand in hand in frontal and temporal areas. More widespread local decreases in streamline-count-based efficiency (frontal cingulate and occipital) are correlated with increases in intelligence. We conclude that the teenage brain is a network in progress in which individual differences in maturation relate to level of intellectual functioning. © 2015 Wiley Periodicals, Inc.
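
    For readers unfamiliar with the efficiency measures used here: global efficiency is the mean of 1/d(i, j) over all node pairs, and the local efficiency of a node is the global efficiency of the subgraph induced by its neighbors. A minimal unweighted sketch (the study itself weights edges by mean FA and streamline count, which this simplification omits):

```python
import numpy as np
from collections import deque

def shortest_paths(adj, src):
    """BFS hop distances from src on a binary adjacency matrix (list of lists)."""
    n = len(adj)
    dist = [np.inf] * n
    dist[src] = 0
    q = deque([src])
    while q:
        u = q.popleft()
        for v in range(n):
            if adj[u][v] and dist[v] == np.inf:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    """Mean of 1/d(i, j) over all ordered pairs i != j; disconnected pairs add 0."""
    n = len(adj)
    if n < 2:
        return 0.0
    total = sum(1.0 / d for i in range(n)
                for j, d in enumerate(shortest_paths(adj, i))
                if i != j and d != np.inf)
    return total / (n * (n - 1))

def local_efficiency(adj, i):
    """Global efficiency of the subgraph induced by node i's neighbors."""
    nbrs = [j for j in range(len(adj)) if adj[i][j]]
    sub = [[adj[a][b] for b in nbrs] for a in nbrs]
    return global_efficiency(sub)

# Small examples: a complete graph on 4 nodes and a 3-node path 0-1-2
k4 = [[0 if i == j else 1 for j in range(4)] for i in range(4)]
path3 = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
```

    On the complete graph both measures equal 1; on the path, global efficiency is (4·1 + 2·½)/6 = 5/6, reflecting the longer route between the endpoints.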

  8. Investigation of heat flux on aerodynamic body in supersonic gas flow with local energy deposition

    NASA Astrophysics Data System (ADS)

    Dobrov, Y. V.; Lashkov, V. A.; Mashek, I. Ch.; Khoronzhuk, R. S.

    2018-05-01

    The existence and intensive growth of heat flux on a vehicle is one of the main problems of hypersonic flight. An experimental study of the heat flux at the stagnation point of a blunt cylinder in supersonic flow was performed using a gradient heat flux sensor. It was found that a transfer function of the measuring system should be used to obtain data in fast-changing heat flux measurements. It was established that a short-term change in heat transfer from the surface of a streamlined body can be produced with the help of a microwave discharge. Numerical simulation showed that it is possible to change the nature of the flow by means of local energy deposition in the case of a streamlined wedge.

  9. Magnetic resonance electrical impedance tomography (MREIT) based on the solution of the convection equation using FEM with stabilization.

    PubMed

    Oran, Omer Faruk; Ider, Yusuf Ziya

    2012-08-21

    Most algorithms for magnetic resonance electrical impedance tomography (MREIT) concentrate on reconstructing the internal conductivity distribution of a conductive object from the Laplacian of only one component of the magnetic flux density (∇²B(z)) generated by the internal current distribution. In this study, a new algorithm is proposed to solve this ∇²B(z)-based MREIT problem which is mathematically formulated as the steady-state scalar pure convection equation. Numerical methods developed for the solution of the more general convection-diffusion equation are utilized. It is known that the solution of the pure convection equation is numerically unstable if sharp variations of the field variable (in this case conductivity) exist or if there are inconsistent boundary conditions. Various stabilization techniques, based on introducing artificial diffusion, are developed to handle such cases and in this study the streamline upwind Petrov-Galerkin (SUPG) stabilization method is incorporated into the Galerkin weighted residual finite element method (FEM) to numerically solve the MREIT problem. The proposed algorithm is tested with simulated and also experimental data from phantoms. Successful conductivity reconstructions are obtained by solving the related convection equation using the Galerkin weighted residual FEM when there are no sharp variations in the actual conductivity distribution. However, when there is noise in the magnetic flux density data or when there are sharp variations in conductivity, it is found that SUPG stabilization is beneficial.
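
    The instability described above, and its cure by artificial diffusion, can be illustrated with a simple finite-difference analogue. This is not the paper's SUPG finite-element formulation; it is a generic demonstration that central differencing of a pure advection term (loosely analogous to plain Galerkin) is unstable for sharp fronts, while upwinding, which implicitly adds diffusion along the flow direction, is stable.

```python
import numpy as np

def advect(u0, c, steps, scheme):
    """March u_t + a*u_x = 0 (a > 0) with forward Euler; c = a*dt/dx (CFL number).
    Periodic boundaries via np.roll keep the demo self-contained."""
    u = u0.copy()
    for _ in range(steps):
        if scheme == "central":   # centered space difference: unconditionally unstable
            u = u - 0.5 * c * (np.roll(u, -1) - np.roll(u, 1))
        else:                     # first-order upwind: adds artificial diffusion, stable
            u = u - c * (u - np.roll(u, 1))
    return u

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u0 = np.where(x < 0.3, 1.0, 0.0)     # sharp front, like a jump in conductivity
stable = advect(u0, c=0.5, steps=400, scheme="upwind")
unstable = advect(u0, c=0.5, steps=400, scheme="central")
```

    With CFL c ≤ 1 the upwind update is a convex combination of neighboring values, so the solution stays bounded in [0, 1]; the central scheme amplifies high-wavenumber content of the discontinuity without bound. SUPG achieves a similar stabilization within a FEM framework by weighting the residual along streamlines, without the full first-order smearing of plain upwinding.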

  10. Grid Standards and Codes | Grid Modernization | NREL

    Science.gov Websites

    simulations that take advantage of advanced concepts such as hardware-in-the-loop testing. The goal of this project is to develop streamlined and accurate methods for New York utilities to determine

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Stacy; English, Shawn; Briggs, Timothy

    Fiber-reinforced composite materials offer lightweight solutions to many structural challenges. In the development of high-performance composite structures, a thorough understanding is required of the composite materials themselves as well as of methods for the analysis and failure prediction of the relevant composite structures. However, the mechanical properties required for the complete constitutive definition of a composite material can be difficult to determine through experimentation. Therefore, efficient methods are necessary to determine which properties are relevant to the analysis of a specific structure and to establish a structure's response to a material parameter that can only be defined through estimation. This paper demonstrates the potential value of sensitivity and uncertainty quantification techniques in the failure analysis of loaded composite structures; the proposed methods are applied to the simulation of the four-point flexural characterization of a carbon fiber composite material. Utilizing a recently implemented, phenomenological orthotropic material model that is capable of predicting progressive composite damage and failure, a sensitivity analysis is completed to establish which material parameters are truly relevant to a simulation's outcome. Then, a parameter study is completed to determine the effect of the relevant material properties' expected variations on the simulated four-point flexural behavior and to determine the value of an unknown material property. This process demonstrates the ability to formulate accurate predictions in the absence of a rigorous material characterization effort. Finally, the presented results indicate that a sensitivity analysis and parameter study can be used to streamline the material definition process, as the described flexural characterization was used for model validation.

  12. Streamline Your Project: A Lifecycle Model.

    ERIC Educational Resources Information Center

    Viren, John

    2000-01-01

    Discusses one approach to project organization providing a baseline lifecycle model for multimedia/CBT development. This variation of the standard four-phase model of Analysis, Design, Development, and Implementation includes a Pre-Analysis phase, called Definition, and a Post-Implementation phase, known as Maintenance. Each phase is described.…

  13. Variability in Nose-to-Lung Aerosol Delivery

    PubMed Central

    Walenga, Ross L; Tian, Geng; Hindle, Michael; Yelverton, Joshua; Dodson, Kelley; Longest, P. Worth

    2014-01-01

    Nasal delivery of lung-targeted pharmaceutical aerosols is ideal for drugs that need to be administered during high-flow nasal cannula (HFNC) gas delivery, but based on previous studies, losses and variability through both the delivery system and the nasal cavity are expected to be high. The objective of this study was to assess the variability in aerosol delivery through the nose to the lungs with a nasal cannula interface for conventional and excipient enhanced growth (EEG) delivery techniques. A database of nasal cavity computed tomography (CT) scans was collected and analyzed, from which four models were selected to represent a wide range of adult anatomies, quantified based on the nasal surface area-to-volume ratio (SA/V). Computational fluid dynamics (CFD) methods were validated with existing in vitro data and used to predict aerosol delivery through a streamlined nasal cannula and the four nasal models at a steady-state flow rate of 30 L/min. Aerosols considered were solid particles for EEG delivery (initial 0.9 μm and 1.5 μm aerodynamic diameters) and conventional droplets (5 μm) for a control case. Use of the EEG approach was found to reduce depositional losses in the nasal cavity by an order of magnitude and to substantially reduce variability. Specifically, for aerosol deposition efficiency in the four geometries, the 95% confidence intervals (CIs) for 0.9 and 5 μm aerosols were 2.3-3.1 and 15.5-66.3%, respectively. Simulations showed that the use of EEG as opposed to conventional methods improved the delivered dose of aerosols through the nasopharynx, expressed as penetration fraction (PF), by approximately a factor of four. Variability of PF, expressed by the coefficient of variation (CV), was reduced by a factor of four with EEG delivery compared with the control case. Penetration fraction correlated well with SA/V for larger aerosols, but smaller aerosols showed some dependence on nasopharyngeal exit hydraulic diameter.
In conclusion, results indicated that the EEG technique not only improved lung aerosol delivery, but largely eliminated variability in both nasal depositional loss and lung PF in a newly developed set of nasal airway models. PMID:25308992

  14. Rapid automation of a cell-based assay using a modular approach: case study of a flow-based Varicella Zoster Virus infectivity assay.

    PubMed

    Joelsson, Daniel; Gates, Irina V; Pacchione, Diana; Wang, Christopher J; Bennett, Philip S; Zhang, Yuhua; McMackin, Jennifer; Frey, Tina; Brodbeck, Kristin C; Baxter, Heather; Barmat, Scott L; Benetti, Luca; Bodmer, Jean-Luc

    2010-06-01

    Vaccine manufacturing requires constant analytical monitoring to ensure reliable quality and a consistent safety profile of the final product. Concentration and bioactivity of active components of the vaccine are key attributes routinely evaluated throughout the manufacturing cycle and for product release and dosage. In the case of live attenuated virus vaccines, bioactivity is traditionally measured in vitro by infection of susceptible cells with the vaccine, followed by quantification of virus replication, cytopathology or expression of viral markers. These assays are typically multi-day procedures that require trained technicians and constant attention. Considering the need for high volumes of testing, automation and streamlining of these assays is highly desirable. In this study, the automation and streamlining of a complex infectivity assay for Varicella Zoster Virus (VZV)-containing test articles is presented. The automation procedure was completed using existing liquid handling infrastructure in a modular fashion, limiting custom-designed elements to a minimum to facilitate transposition. In addition, cellular senescence data provided an optimal population doubling range for long-term, reliable assay operation at high throughput. The results presented in this study demonstrate a successful automation paradigm resulting in an eightfold increase in throughput while maintaining assay performance characteristics comparable to the original assay. Copyright 2010 Elsevier B.V. All rights reserved.

  15. Streamline-based microfluidic device

    NASA Technical Reports Server (NTRS)

    Tai, Yu-Chong (Inventor); Zheng, Siyang (Inventor); Kasdan, Harvey (Inventor)

    2013-01-01

    The present invention provides a streamline-based device and a method for using the device for continuous separation of particles including cells in biological fluids. The device includes a main microchannel and an array of side microchannels disposed on a substrate. The main microchannel has a plurality of stagnation points with a predetermined geometric design, for example, each of the stagnation points has a predetermined distance from the upstream edge of each of the side microchannels. The particles are separated and collected in the side microchannels.

  16. Impact assessment: Eroding benefits through streamlining?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bond, Alan, E-mail: alan.bond@uea.ac.uk; School of Geo and Spatial Sciences, North-West University; Pope, Jenny, E-mail: jenny@integral-sustainability.net

    This paper argues that Governments have sought to streamline impact assessment in recent years (defined as the last five years) to counter concerns over the costs and potential for delays to economic development. We hypothesise that this has had some adverse consequences for the benefits that subsequently accrue from the assessments. This hypothesis is tested using a framework developed from arguments for the benefits brought by Environmental Impact Assessment made in 1982 in the face of UK Government opposition to its implementation in a time of economic recession. The particular benefits investigated are 'consistency and fairness', 'early warning', 'environment and development', and 'public involvement'. Canada, South Africa, the United Kingdom and Western Australia are the jurisdictions tested using this framework. The conclusions indicate that significant streamlining has been undertaken, with direct adverse effects on some of the benefits that impact assessment should deliver, particularly in Canada and the UK. The research has not examined whether streamlining has had implications for the effectiveness of impact assessment, but the causal link between streamlining and benefits does sound warning bells that merit further investigation. -- Highlights: • Investigation of the extent to which government has streamlined IA. • Evaluation framework was developed based on benefits of impact assessment. • Canada, South Africa, the United Kingdom, and Western Australia were examined. • Trajectory in last five years is attrition of benefits of impact assessment.

  17. Simulations in nursing practice: toward authentic leadership.

    PubMed

    Shapira-Lishchinsky, Orly

    2014-01-01

    Aim: This study explores nurses' ethical decision-making in team simulations in order to identify the benefits of these simulations for authentic leadership. Background: While previous studies have indicated that team simulations may improve ethics in the workplace by reducing the number of errors, those studies focused mainly on clinical aspects and not on nurses' ethical experiences or on the benefits of authentic leadership. Methods: Fifty nurses from 10 health institutions in central Israel participated in the study. Data about nurses' ethical experiences were collected from 10 teams. Qualitative data analysis based on Grounded Theory was applied, using the atlas.ti 5.0 software package. Findings: Simulation findings suggest four main benefits that reflect the underlying components of authentic leadership: self-awareness, relational transparency, balanced information processing and internalized moral perspective. Conclusions: Team-based simulation as a training tool may lead to authentic leadership among nurses. Implications for nursing management: Nursing management should incorporate team simulations into nursing practice to help resolve power conflicts and to develop authentic leadership in nursing. Consequently, errors will decrease, patients' safety will increase and optimal treatment will be provided. © 2012 John Wiley & Sons Ltd.

  18. Using Advanced Analysis Approaches to Complete Long-Term Evaluations of Natural Attenuation Processes on the Remediation of Dissolved Chlorinated Solvent Contamination

    DTIC Science & Technology

    2008-10-01

    and UTCHEM (Clement et al., 1998). While all four of these software packages use conservation of mass as the basic principle for tracking NAPL...simulate dissolution of a single NAPL component. UTCHEM can be used to simulate dissolution of a multiple NAPL components using either linear or first...parameters. No UTCHEM a/ 3D model, general purpose NAPL simulator. Yes Virulo a/ Probabilistic model for predicting leaching of viruses in unsaturated

  19. Rule based design of conceptual models for formative evaluation

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.; Chang, Kai; Hale, Joseph P.; Bester, Terri; Rix, Thomas; Wang, Yaowen

    1994-01-01

    A Human-Computer Interface (HCI) Prototyping Environment with embedded evaluation capability has been investigated. This environment will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. This environment, which allows for rapid prototyping and evaluation of graphical interfaces, includes the following four components: (1) an HCI development tool; (2) a low fidelity simulator development tool; (3) a dynamic, interactive interface between the HCI and the simulator; and (4) an embedded evaluator that evaluates the adequacy of an HCI based on a user's performance. The embedded evaluation tool collects data while the user is interacting with the system and evaluates the adequacy of an interface based on a user's performance. This paper describes the design of conceptual models for the embedded evaluation system using a rule-based approach.

  20. Rule based design of conceptual models for formative evaluation

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.; Chang, Kai; Hale, Joseph P.; Bester, Terri; Rix, Thomas; Wang, Yaowen

    1994-01-01

    A Human-Computer Interface (HCI) Prototyping Environment with embedded evaluation capability has been investigated. This environment will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. This environment, which allows for rapid prototyping and evaluation of graphical interfaces, includes the following four components: (1) an HCI development tool, (2) a low fidelity simulator development tool, (3) a dynamic, interactive interface between the HCI and the simulator, and (4) an embedded evaluator that evaluates the adequacy of an HCI based on a user's performance. The embedded evaluation tool collects data while the user is interacting with the system and evaluates the adequacy of an interface based on a user's performance. This paper describes the design of conceptual models for the embedded evaluation system using a rule-based approach.

  1. 3-D inelastic analysis methods for hot section components. Volume 2: Advanced special functions models

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1987-01-01

    This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of computer codes that permit more accurate and efficient three-dimensional analyses of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components.

  2. High Volume Pulsed EPC for T/R Modules in Satellite Constellation

    NASA Astrophysics Data System (ADS)

    Notarianni, Michael; Maynadier, Paul; Marin, Marc

    2014-08-01

    In the frame of the Iridium Next business, a mobile satellite service, Thales Alenia Space (TAS) has to produce more than 2400 x 65W and 162 x 250W pulsed Electronic Power Conditioners (EPC) to supply the RF transmit/receive modules that compose the active antenna of the satellites. The company has to deal with mass production constraints where cost, volume and performance are crucial factors. Compared to previous constellations realized by TAS, the overall challenge is to make further improvements in a short time: predictable electrical models; a deeper design-to-cost approach; and streamlining improvements and test coverage. As the active antenna drives the consumption of the payload, accurate performance was evaluated early owing to the combined use of simulation (based on an average model) and breadboard tests. The necessary cost reduction has been achieved through large use of COTS (Commercial Off-The-Shelf) components. In order to secure cost and schedule, each manufacturing step has been optimized to maximize test coverage and guarantee high reliability. At this time, more than 200 flight models have already been manufactured, validating this approach. This paper is focused on the 65W EPC, but the same activities have been led on the 250W EPC.

  3. LIBS data analysis using a predictor-corrector based digital signal processor algorithm

    NASA Astrophysics Data System (ADS)

    Sanders, Alex; Griffin, Steven T.; Robinson, Aaron

    2012-06-01

    There are many accepted sensor technologies for generating spectra for material classification. Once the spectra are generated, communication bandwidth limitations favor local material classification, with its attendant reduction in data transfer rates and power consumption. Sensor technologies such as Cavity Ring-Down Spectroscopy (CRDS) and Laser Induced Breakdown Spectroscopy (LIBS) therefore require effective on-board material classifiers. Recent efforts have emphasized Partial Least Squares - Discriminant Analysis (PLS-DA) and Principal Component Analysis (PCA). Implementation of these via general purpose computers is difficult in small portable sensor configurations. This paper addresses the creation of a low mass, low power, robust hardware spectra classifier for a limited set of predetermined materials in an atmospheric matrix. Crucial to this is the incorporation of PCA or PLS-DA classifiers into a predictor-corrector style implementation. The system configuration guarantees rapid convergence. Software running on multi-core Digital Signal Processors (DSPs) simulates a streamlined plasma physics model estimator, reducing Analog-to-Digital Converter (ADC) power requirements. This paper presents the results of a predictor-corrector model implemented on a low power multi-core DSP to perform substance classification. This configuration emphasizes the hardware system and software design via a predictor-corrector model that simultaneously decreases the sample rate while performing the classification.
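The paper's DSP hardware design is not reproducible from the abstract, but the PCA-projection classification it builds on can be sketched. The following is a minimal illustration, assuming a synthetic two-class spectral data set, a two-component projection, and a nearest-centroid decision rule; none of these choices come from the paper itself.

```python
import numpy as np

# Hypothetical sketch of PCA-based spectral classification: project spectra
# onto the leading principal components of the training set, then assign
# the class of the nearest class centroid in the reduced space.

def fit_pca(X, n_components):
    """Return (mean, components) for the training matrix X (samples x bins)."""
    mu = X.mean(axis=0)
    # SVD of the centered data gives principal directions in the rows of Vt.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components]

def classify(x, mu, comps, centroids):
    """Project a spectrum and return the label of the nearest class centroid."""
    z = comps @ (x - mu)
    dists = {label: np.linalg.norm(z - c) for label, c in centroids.items()}
    return min(dists, key=dists.get)

# Synthetic two-class example: class "A" peaks near bin 2, class "B" near bin 7.
rng = np.random.default_rng(0)
A = rng.normal(0, 0.1, (20, 10)); A[:, 2] += 5.0
B = rng.normal(0, 0.1, (20, 10)); B[:, 7] += 5.0
X = np.vstack([A, B])
mu, comps = fit_pca(X, n_components=2)
centroids = {
    "A": (comps @ (A - mu).T).mean(axis=1),
    "B": (comps @ (B - mu).T).mean(axis=1),
}
probe = np.zeros(10); probe[2] = 5.0  # an unseen class-"A"-like spectrum
print(classify(probe, mu, comps, centroids))
```

In a fixed-point DSP setting the projection matrix would be precomputed offline, leaving only a small matrix-vector product and distance comparison on the device.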

  4. 75 FR 9797 - Policies To Promote Rural Radio Service and To Streamline Allotment and Assignment Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Media Bureau and the Wireless Telecommunications Bureau (collectively, the ``Bureaus'') delegated... relationship with the federal government as domestic dependent nations with inherent sovereign powers over... suggested by commenters. First, the Commission will allow assignments or transfers within the four-year...

  5. Kalman Filter for Spinning Spacecraft Attitude Estimation

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis; Sedlak, Joseph E.

    2008-01-01

    This paper presents a Kalman filter using a seven-component attitude state vector comprising the angular momentum components in an inertial reference frame, the angular momentum components in the body frame, and a rotation angle. The relatively slow variation of these parameters makes this parameterization advantageous for spinning spacecraft attitude estimation. The filter accounts for the constraint that the magnitude of the angular momentum vector is the same in the inertial and body frames by employing a reduced six-component error state. Four variants of the filter, defined by different choices for the reduced error state, are tested against a quaternion-based filter using simulated data for the THEMIS mission. Three of these variants choose three of the components of the error state to be the infinitesimal attitude error angles, facilitating the computation of measurement sensitivity matrices and causing the usual 3x3 attitude covariance matrix to be a submatrix of the 6x6 covariance of the error state. These variants differ in their choice for the other three components of the error state. The variant employing the infinitesimal attitude error angles and the angular momentum components in an inertial reference frame as the error state shows the best combination of robustness and efficiency in the simulations. Attitude estimation results using THEMIS flight data are also presented.
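The error-state bookkeeping described above follows the standard Kalman measurement update. The sketch below shows only that generic update step; the 6-component error state, the assumption that the first three components are the infinitesimal attitude error angles, and all numerical values are illustrative, not the paper's filter design.

```python
import numpy as np

# Generic Kalman measurement update for a 6-component error state.
# Partitioning (attitude angles first) and numbers are assumptions for
# illustration only.

def kalman_update(x, P, z, H, R):
    """Standard update: innovation covariance S, gain K, corrected x and P."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

x = np.zeros(6)                   # error state, reset to zero each cycle
P = np.eye(6) * 0.1               # 6x6 error-state covariance
H = np.zeros((3, 6)); H[:, :3] = np.eye(3)   # only attitude angles observed
R = np.eye(3) * 0.01              # measurement noise covariance
z = np.array([0.02, -0.01, 0.005])

x_new, P_new = kalman_update(x, P, z, H, R)
print(np.round(x_new, 4))
```

The 3x3 attitude block of `P_new` is the submatrix the paper exploits when three error-state components are chosen as attitude error angles.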

  6. Orchestrating TRANSP Simulations for Interpretative and Predictive Tokamak Modeling with OMFIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grierson, B. A.; Yuan, X.; Gorelenkova, M.

    TRANSP simulations are being used in the OMFIT workflow manager to enable a machine-independent means of experimental analysis, postdictive validation, and predictive time-dependent simulations on the DIII-D, NSTX, JET and C-MOD tokamaks. The procedures for preparing the input data from plasma profile diagnostics and equilibrium reconstruction, as well as processing of the time-dependent heating and current drive sources and assumptions about the neutral recycling, vary across machines, but are streamlined by using a common workflow manager. Settings for TRANSP simulation fidelity are incorporated into the OMFIT framework, contrasting between-shot analysis, power balance, and fast-particle simulations. A previously established series of data consistency metrics are computed, such as comparison of experimental vs. calculated neutron rate, equilibrium stored energy vs. total stored energy from profile and fast-ion pressure, and experimental vs. computed surface loop voltage. Discrepancies between data consistency metrics can indicate errors in input quantities such as electron density profile or Zeff, or indicate anomalous fast-particle transport. Measures to assess the sensitivity of the verification metrics to input quantities are provided by OMFIT, including scans of the input profiles and standardized post-processing visualizations. For predictive simulations, TRANSP uses GLF23 or TGLF to predict core plasma profiles, with user defined boundary conditions in the outer region of the plasma. ITPA validation metrics are provided in post-processing to assess the transport model validity. By using OMFIT to orchestrate the steps for experimental data preparation, selection of operating mode, submission, post-processing and visualization, we have streamlined and standardized the usage of TRANSP.

  7. Orchestrating TRANSP Simulations for Interpretative and Predictive Tokamak Modeling with OMFIT

    DOE PAGES

    Grierson, B. A.; Yuan, X.; Gorelenkova, M.; ...

    2018-02-21

    TRANSP simulations are being used in the OMFIT workflow manager to enable a machine-independent means of experimental analysis, postdictive validation, and predictive time-dependent simulations on the DIII-D, NSTX, JET and C-MOD tokamaks. The procedures for preparing the input data from plasma profile diagnostics and equilibrium reconstruction, as well as processing of the time-dependent heating and current drive sources and assumptions about the neutral recycling, vary across machines, but are streamlined by using a common workflow manager. Settings for TRANSP simulation fidelity are incorporated into the OMFIT framework, contrasting between-shot analysis, power balance, and fast-particle simulations. A previously established series of data consistency metrics are computed, such as comparison of experimental vs. calculated neutron rate, equilibrium stored energy vs. total stored energy from profile and fast-ion pressure, and experimental vs. computed surface loop voltage. Discrepancies between data consistency metrics can indicate errors in input quantities such as electron density profile or Zeff, or indicate anomalous fast-particle transport. Measures to assess the sensitivity of the verification metrics to input quantities are provided by OMFIT, including scans of the input profiles and standardized post-processing visualizations. For predictive simulations, TRANSP uses GLF23 or TGLF to predict core plasma profiles, with user defined boundary conditions in the outer region of the plasma. ITPA validation metrics are provided in post-processing to assess the transport model validity. By using OMFIT to orchestrate the steps for experimental data preparation, selection of operating mode, submission, post-processing and visualization, we have streamlined and standardized the usage of TRANSP.

  8. Control method of Three-phase Four-leg converter based on repetitive control

    NASA Astrophysics Data System (ADS)

    Hui, Wang

    2018-03-01

    This research takes the magnetic levitation wind power generation system as its object. In order to address the power quality problems caused by unbalanced loads in the power supply system, the characteristics of the magnetic levitation wind power generation system were combined with the repetitive control principle, and an independent control strategy for the three-phase four-leg converter was proposed. Based on the symmetrical component method, a second-order generalized integrator is used to generate the positive- and negative-sequence signals, and decoupling control is carried out in the synchronous rotating reference frame: the positive- and negative-sequence voltages are regulated by PI double closed loops, and a PI regulator with repetitive control is introduced to eliminate the static error arising from the fundamental-frequency fluctuation of the zero-sequence component. Simulation results based on Matlab/Simulink show that the proposed control scheme can effectively suppress the disturbance caused by unbalanced loads and maintain load voltage balance. The scheme is easy to implement and remarkably improves the quality of the independent power supply system.
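The controller itself is not specified in the abstract, but the symmetrical-component (Fortescue) decomposition it relies on to separate positive-, negative-, and zero-sequence quantities can be sketched directly on phasors. This is a textbook illustration, not the paper's implementation, and the balanced test set below is an assumption.

```python
import numpy as np

# Fortescue transform: decompose an unbalanced three-phase phasor set
# (va, vb, vc) into zero-, positive-, and negative-sequence components.

a = np.exp(2j * np.pi / 3)  # 120-degree rotation operator

def symmetrical_components(va, vb, vc):
    """Return (zero, positive, negative) sequence phasors."""
    v0 = (va + vb + vc) / 3
    v1 = (va + a * vb + a**2 * vc) / 3
    v2 = (va + a**2 * vb + a * vc) / 3
    return v0, v1, v2

# A perfectly balanced set has only a positive-sequence component:
# phase b lags a by 120 degrees, phase c leads by 120 degrees.
va, vb, vc = 1.0 + 0j, a**2, a
v0, v1, v2 = symmetrical_components(va, vb, vc)
print(abs(v0), abs(v1), abs(v2))
```

In the time-domain control loop described above, the second-order generalized integrator plays the role of producing the quadrature signals needed to evaluate this decomposition on-line.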

  9. The composite load spectra project

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.; Kurth, R. E.

    1990-01-01

    Probabilistic methods and generic load models capable of simulating the load spectra that are induced in space propulsion system components are being developed. Four engine component types (the transfer ducts, the turbine blades, the liquid oxygen posts and the turbopump oxidizer discharge duct) were selected as representative hardware examples. The composite load spectra that simulate the probabilistic loads for these components are typically used as the input loads for a probabilistic structural analysis. The knowledge-based system approach used for the composite load spectra project provides an ideal environment for incremental development. The intelligent database paradigm employed in developing the expert system provides a smooth coupling between the numerical processing and the symbolic (information) processing. Large volumes of engine load information and engineering data are stored in database format and managed by a database management system. Numerical procedures for probabilistic load simulation and database management functions are controlled by rule modules. Rules were hard-wired as decision trees into rule modules to perform process control tasks. There are modules to retrieve load information and models. There are modules to select loads and models to carry out quick load calculations or make an input file for full duty-cycle time dependent load simulation. The composite load spectra expert system implemented to date is capable of performing intelligent rocket engine load spectra simulation. Further development of the expert system will provide tutorial capability for users to learn from it.

  10. Web-based learning resources - new opportunities for competency development.

    PubMed

    Moen, Anne; Nygård, Kathrine A; Gauperaa, Torunn

    2009-01-01

    Creating web-based learning environments holds great promise for on-the-job training and competence development in nursing. The web-based learning environment was designed and customized by four professional development nurses. We interviewed five RNs who pilot tested the web-based resource. Our findings give some insight into how the web-based design tool is perceived and utilized, and how content is represented in the learning environment. From a competency development perspective, practicing authentic tasks in a web-based learning environment can be useful for training skills and keeping up important routines. The approach found in this study also needs careful consideration. Emphasizing routines and skills can be important for reducing variation and ensuring more streamlined practice as part of institution-wide quality improvement efforts. How the emphasis on routines and skills plays out for the individual's overall professional development needs further careful study.

  11. Analysis of the transient behavior of rubbing components

    NASA Technical Reports Server (NTRS)

    Quezdou, M. B.; Mullen, R. L.

    1986-01-01

    Finite element equations are developed for studying the deformations and temperatures resulting from frictional heating in sliding systems. The formulation is done for linear steady state motion in two dimensions. The equations include the effect of the velocity of the moving components, which gives rise to spurious oscillations in solutions obtained by Galerkin finite element methods. A streamline upwind scheme is used to deal with this deficiency. The finite element program is then used to investigate frictional heating in gas path seals.
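The streamline upwind idea can be made concrete with the classic one-dimensional stabilization parameter often paired with Petrov-Galerkin weighting: alpha = coth(Pe) - 1/Pe, where Pe = u*h/(2*k) is the element Peclet number. This formula is the standard textbook result, not taken from the paper; at high Pe it tends toward full upwinding (alpha → 1) and at low Pe it recovers the unmodified Galerkin scheme (alpha → 0).

```python
import math

# "Optimal" 1D streamline-upwind parameter for advection-diffusion:
# alpha = coth(Pe) - 1/Pe, with the series limit alpha ~ Pe/3 near Pe = 0
# used to avoid the 0/0 evaluation.

def upwind_alpha(pe):
    if abs(pe) < 1e-8:
        return pe / 3.0          # leading term of coth(Pe) - 1/Pe
    return 1.0 / math.tanh(pe) - 1.0 / pe

# Diffusion-dominated element: almost no added upwinding.
# Advection-dominated element: close to full upwinding.
print(round(upwind_alpha(0.001), 6), round(upwind_alpha(100.0), 6))
```

The sliding-velocity term in the thermal equations above plays the same role as the advection term here, which is why the Galerkin solution oscillates without this kind of stabilization.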

  12. Multiscale combination of climate model simulations and proxy records over the last millennium

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Xing, Pei; Luo, Yong; Nie, Suping; Zhao, Zongci; Huang, Jianbin; Tian, Qinhua

    2018-05-01

    To highlight the compatibility of climate model simulation and proxy reconstruction at different timescales, a timescale separation merging method combining proxy records and climate model simulations is presented. Annual mean surface temperature anomalies for the last millennium (851-2005 AD) at various scales over the land of the Northern Hemisphere were reconstructed with 2° × 2° spatial resolution, using an optimal interpolation (OI) algorithm. All target series were decomposed using an ensemble empirical mode decomposition method followed by power spectral analysis. Four typical components were obtained at inter-annual, decadal, multidecadal, and centennial timescales. A total of 323 temperature-sensitive proxy chronologies were incorporated after screening for each component. By scaling the proxy components using variance matching and applying a localized OI algorithm to all four components point by point, we obtained merged surface temperatures. Independent validation indicates that the most significant improvement was for components at the inter-annual scale, but this became less evident with increasing timescales. In mid-latitude land areas, 10-30% of grids were significantly corrected at the inter-annual scale. By assimilating the proxy records, the merged results reduced the gap in response to volcanic forcing between a pure reconstruction and simulation. Difficulty remained in verifying the centennial information and quantifying corresponding uncertainties, so additional effort should be devoted to this aspect in future research.
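The variance-matching step used to scale the proxy components can be sketched as a simple moment-matching rescaling: the decomposed proxy series is standardized and then given the mean and standard deviation of the instrumental target over the overlap period. The synthetic series below are illustrative; the actual reconstruction applies this per component and per grid point.

```python
import numpy as np

# Variance matching: rescale a proxy series to the mean and standard
# deviation of a target (instrumental) series over a common period.

def variance_match(proxy, target):
    """Return `proxy` rescaled to the mean/std of `target` (same length)."""
    scaled = (proxy - proxy.mean()) / proxy.std()
    return scaled * target.std() + target.mean()

rng = np.random.default_rng(1)
proxy = rng.normal(5.0, 3.0, 200)    # proxy record in arbitrary units
target = rng.normal(0.2, 0.8, 200)   # temperature anomalies, K (illustrative)
matched = variance_match(proxy, target)
print(round(float(matched.mean()), 3), round(float(matched.std()), 3))
```

After this step the proxy component carries physical units and can be merged with the model-simulated component of the same timescale by the optimal interpolation algorithm.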

  13. The pillars of well-constructed simulated patient programs: A qualitative study with experienced educators.

    PubMed

    Pritchard, Shane A; Blackstock, Felicity C; Keating, Jennifer L; Nestel, Debra

    2017-11-01

    The inclusion of simulated patients (SPs) in health professional education is growing internationally. However, there is limited evidence for best practice in SP methodology. This study investigated how experienced SP educators support SPs in providing SP-based education for health professional students. Experienced SP educators were identified via relevant professional associations, peer-reviewed publications, and peer referral. Semi-structured individual interviews were conducted via telephone. Data were analyzed independently by three researchers using principles of inductive thematic analysis. Four themes were identified that represent the key structural components of SP programs considered by educators seeking to optimize learning for health professional students in SP programs: managing SPs by operationalizing an effective program, selecting SPs by rigorously screening for suitability, preparing SPs by educating for a specific scenario, and directing SPs by leading safe and meaningful interactions. Within these components, subthemes were described, with considerable variation in approaches. Key structural components to SP programs were consistently described by experienced SP educators who operationalize them. A framework has been proposed to assist educators in designing high-quality SP programs that support SPs and learners. Future research is required to evaluate and refine this framework and other evidence-based resources for SP educators.

  14. Development of Finite-Volume Methods for Three-Dimensional Transonic Flows.

    DTIC Science & Technology

    1980-08-01

    rapidly. Away from the airfoil, the streamlines spread. This type of mesh can easily be blended into a Cartesian mesh for the far field. A disadvantage...E. W., and Stern, M. A. (1980) "Simulated Transonic Flows for Aircraft with Nacelles, Pylons and Winglets ," AIAA Paper 80-0130, January. Caughey, D. A

  15. Reflective teaching of medical communication skills with DiViDU: assessing the level of student reflection on recorded consultations with simulated patients.

    PubMed

    Hulsman, R L; Harmsen, A B; Fabriek, M

    2009-02-01

    Acquisition of effective, goal-oriented communication skills requires both practicing skills and reflective thinking. Reflection is a cyclic process of perceiving and analysing communication behaviour in terms of goals and effects and designing improved actions. Based on Korthagen's ALACT reflection model, communication training on history taking was designed. The objectives were to develop rating criteria for assessment of the students' level of reflection and to collect student evaluations of the reflective cycle components in the communication training. All second year medical students recorded a consultation with a simulated patient. In DiViDU, a web-based ICT program, students reviewed the video, identified and marked three key events, attached written reflections and provided peer feedback. Students' written reflections were rated on four reflection categories. A reflection-level score was based on a frequency count of the number of categories used over three reflections. Students filled out an evaluation questionnaire on components of the communication training. Data from 304 students (90.6%) were analyzed. The four reflection categories (Observations, Motives, Effects and Goals of behaviour) were used in 7-38% of reflections. Most students phrased undirected questions for improvement (93%). The average reflection score was 2.1 (S.D. 2.0). All training components were considered instructive. Acting was preferred most. Reviewing video was considered instructive. Self-reflection was considered more difficult than providing written feedback on the reflections of peers. Reflection on communication behaviour can be systematically implemented and measured in a structured way. Reflection levels were low, probably indicating a limited notion of the goal-oriented attributes of communication skills. Early introduction of critical self-reflection facilitates acceptance of an ability that is important for physicians' continued life-long learning and for becoming mindful practitioners.
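The reflection-level score described above, a frequency count of the four categories used across a student's three reflections, can be sketched as a small scoring function. The category tags and the sample student data are assumptions for illustration; the study's actual coding scheme is not reproduced here.

```python
# Hypothetical sketch of the reflection-level score: for each of a
# student's three written reflections, count how many of the four
# categories (Observations, Motives, Effects, Goals) appear, and sum.

CATEGORIES = {"observations", "motives", "effects", "goals"}

def reflection_score(reflections):
    """Sum the number of recognised categories used in each reflection."""
    return sum(len(CATEGORIES & set(tags)) for tags in reflections)

# Illustrative student: one category in reflection 1, two in reflection 2,
# none in reflection 3 -> a total score of 3 (possible range 0-12).
student = [
    {"observations"},
    {"observations", "effects"},
    set(),
]
print(reflection_score(student))
```

On this scale the reported average of 2.1 corresponds to students typically using only about two category instances across all three reflections.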

  16. Performance investigation of an innovative Vertical Axis Hydrokinetic Turbine – Straight Blade Cascaded (VAHT-SBC) for low current speed

    NASA Astrophysics Data System (ADS)

    Hantoro, R.; Prananda, J.; Mahmashani, A. W.; Septyaningru, E.; Imanuddin, F.

    2018-05-01

    Research on the development and innovation of the Vertical Axis Hydrokinetic Turbine (VAHT) to improve its performance has been carried out. One of the important indicators affecting VAHT performance is the coefficient of performance (Cp). The theoretical Cp value for the VAT (Darrieus) turbine is 0.45. This paper presents the results of a performance investigation of an innovative Vertical Axis Hydrokinetic Turbine – Straight Blade Cascaded (VAHT-SBC), obtained by modifying the number and arrangement of blades using CFD simulation. A symmetrical NACA 0018 profile is used for this study, and each model is simulated at current speeds (U, m/s) of 0.5, 1 and 1.5. An increase in Cp is shown for the 9-blade variation (3 blades cascaded on each arm), with a Cp value of 0.396 at a TSR of 2.27, which reaches 88% of the theoretical value. Furthermore, the pressure contours, velocity streamlines and torque fluctuations are also presented in this paper to provide more in-depth information.
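The two performance numbers quoted above follow from standard definitions: the power coefficient Cp = P / (0.5 * rho * A * U^3) and the tip-speed ratio TSR = omega * R / U. The sketch below just evaluates these definitions; the density, swept area, and rotor radius values are illustrative assumptions, not the paper's geometry.

```python
# Standard hydrokinetic turbine performance definitions.

def power_coefficient(power_w, rho, area_m2, u_ms):
    """Cp: shaft power divided by the kinetic power flux through the rotor."""
    return power_w / (0.5 * rho * area_m2 * u_ms**3)

def tip_speed_ratio(omega_rad_s, radius_m, u_ms):
    """TSR: blade tip speed divided by the free-stream current speed."""
    return omega_rad_s * radius_m / u_ms

rho = 1000.0   # water density, kg/m^3
area = 1.2     # swept area, m^2 (illustrative)
u = 1.5        # current speed, m/s (one of the simulated cases)

# Shaft power implied by the reported Cp = 0.396 for this assumed geometry.
p = 0.396 * 0.5 * rho * area * u**3
print(round(power_coefficient(p, rho, area, u), 3))
```

Inverting the definition this way is a quick sanity check when comparing CFD torque output against the reported Cp.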

  17. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model.

    PubMed

    Liu, Fang; Velikina, Julia V; Block, Walter F; Kijowski, Richard; Samsonov, Alexey A

    2017-02-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of simplified treatment of tissue micro-organization adapted in previous simulators. GPU execution allowed  ∼ 200× improvement in computational speed over standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to infer quantitatively tissue composition and microstructure.

  18. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model

    PubMed Central

    Velikina, Julia V.; Block, Walter F.; Kijowski, Richard; Samsonov, Alexey A.

    2017-01-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of simplified treatment of tissue micro-organization adapted in previous simulators. GPU execution allowed ∼200× improvement in computational speed over standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to infer quantitatively tissue composition and microstructure. PMID:28113746

  19. 3-D inelastic analysis methods for hot section components (base program). [turbine blades, turbine vanes, and combustor liners

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Bak, M. J.; Nakazawa, S.; Banerjee, P. K.

    1984-01-01

    A 3-D inelastic analysis methods program consists of a series of computer codes embodying a progression of mathematical models (mechanics of materials, special finite element, boundary element) for streamlined analysis of combustor liners, turbine blades, and turbine vanes. These models address the effects of high temperatures and thermal/mechanical loadings on the local (stress/strain) and global (dynamics, buckling) structural behavior of the three selected components. These models are used to solve 3-D inelastic problems using linear approximations in the sense that stresses/strains and temperatures in generic modeling regions are linear functions of the spatial coordinates, and solution increments for load, temperature and/or time are extrapolated linearly from previous information. Three linear formulation computer codes, referred to as MOMM (Mechanics of Materials Model), MHOST (MARC-Hot Section Technology), and BEST (Boundary Element Stress Technology), were developed and are described.

  20. Smartphone apps and the nutrition care process: Current perspectives and future considerations.

    PubMed

    Chen, Juliana; Gemming, Luke; Hanning, Rhona; Allman-Farinelli, Margaret

    2018-04-01

    To provide dietitians with practical guidance on incorporating smartphone applications (apps) in the nutrition care process (NCP) to optimize patient education and counseling. The current evidence-base for mobile health (mHealth) apps was searched using PubMed and Google Scholar. Where and how apps could be implemented by dietitians across the four steps of the NCP is discussed. With functionality to automatically convert patient dietary records into nutrient components, nutrition assessment can be streamlined using nutrition apps, allowing more time for dietitians to deliver education and nutrition counseling. Dietitians could prescribe apps to provide patients with education on nutrition skills and in counseling for better adherence to behavior change. Improved patient-provider communication is also made possible through the opportunity for real-time monitoring and evaluation of patient progress via apps. A practical framework termed the 'Mobile Nutrition Care Process Grid' provides dietitians with best-practice guidance on how to use apps. Including apps into dietetic practice could enhance the efficiency and quality of nutrition care and counseling delivered by dietitians. Apps should be considered an adjunct to enable dietetic counseling and care, rather than to replace the expertise, social support and accountability provided by dietitians. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Hierarchical streamline bundles.

    PubMed

    Yu, Hongfeng; Wang, Chaoli; Shene, Ching-Kuang; Chen, Jacqueline H

    2012-08-01

    Effective 3D streamline placement and visualization play an essential role in many science and engineering disciplines. The main challenge for effective streamline visualization lies in seed placement, i.e., where to drop seeds and how many seeds should be placed. Seeding too many or too few streamlines may fail to reveal flow features and patterns: too many easily lead to visual clutter in rendering, while too few convey little information about the flow field. Not only does the number of streamlines placed matter, their spatial relationships also play a key role in understanding the flow field. Therefore, effective flow visualization requires the streamlines to be placed in the right place and in the right amount. This paper introduces hierarchical streamline bundles, a novel approach to simplifying and visualizing 3D flow fields defined on regular grids. By placing seeds and generating streamlines according to flow saliency, we produce a set of streamlines that captures important flow features near critical points without enforcing the dense seeding condition. We group spatially neighboring and geometrically similar streamlines to construct a hierarchy from which we extract streamline bundles at different levels of detail. Streamline bundles highlight multiscale flow features and patterns through clustered yet not cluttered display. This selective visualization strategy effectively reduces visual clutter while accentuating visual foci, and therefore is able to convey the desired insight into the flow data.
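The grouping step described above can be sketched with standard agglomerative clustering: streamlines resampled to a common number of points are compared by Euclidean distance, and the resulting hierarchy is cut at a chosen level of detail. A minimal illustration with synthetic streamlines (SciPy assumed; this is not the paper's saliency-based seeding or its actual similarity measure):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Synthetic "streamlines": two families of 3-D polylines, each resampled
# to 10 points, with small jitter. Flattening the points lets plain
# Euclidean distance stand in for geometric similarity.
rng = np.random.default_rng(0)
base_a = np.linspace(0, 1, 10)[:, None] * np.array([1.0, 0.0, 0.0])
base_b = np.linspace(0, 1, 10)[:, None] * np.array([0.0, 1.0, 0.0])
streamlines = ([base_a + rng.normal(0, 0.01, base_a.shape) for _ in range(5)] +
               [base_b + rng.normal(0, 0.01, base_b.shape) for _ in range(5)])
features = np.array([s.ravel() for s in streamlines])

# Build the hierarchy once, then cut it at the desired level of detail.
Z = linkage(pdist(features), method='average')
bundles = fcluster(Z, t=2, criterion='maxclust')
print(bundles)  # the first five and last five streamlines form two bundles
```

Cutting the same hierarchy at larger `t` yields finer bundles, which is the "levels of detail" idea in the abstract.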

  2. 26th JANNAF Airbreathing Propulsion Subcommittee Meeting. Volume 1

    NASA Technical Reports Server (NTRS)

    Fry, Ronald S. (Editor); Gannaway, Mary T. (Editor)

    2002-01-01

    This volume, the first of four volumes, is a collection of 28 unclassified/unlimited-distribution papers which were presented at the Joint Army-Navy-NASA-Air Force (JANNAF) 26th Airbreathing Propulsion Subcommittee (APS) meeting, held jointly with the 38th Combustion Subcommittee (CS), 20th Propulsion Systems Hazards Subcommittee (PSHS), and 2nd Modeling and Simulation Subcommittee. The meeting was held 8-12 April 2002 at the Bayside Inn at The Sandestin Golf & Beach Resort and Eglin Air Force Base, Destin, Florida. Topics covered include: scramjet and ramjet R&D program overviews; tactical propulsion; space access; NASA GTX status; PDE technology; actively cooled engine structures; modeling and simulation of complex hydrocarbon fuels and unsteady processes; and component modeling and simulation.

  3. Shear driven droplet shedding and coalescence on a superhydrophobic surface

    NASA Astrophysics Data System (ADS)

    Moghtadernejad, S.; Tembely, M.; Jadidi, M.; Esmail, N.; Dolatabadi, A.

    2015-03-01

    Interest in the shedding and coalescence of sessile droplets arises from the importance of these phenomena in various scientific problems and industrial applications such as ice formation on wind turbine blades, power lines, nacelles, and aircraft wings. It has recently been shown that one way to reduce the probability of ice accretion on industrial components is to use superhydrophobic coatings, owing to their low adhesion to water droplets. In this study, a combined experimental and numerical approach is used to investigate droplet shedding and coalescence phenomena under the influence of air shear flow on a superhydrophobic surface. Droplets with a size of 2 mm are subjected to various air speeds ranging from 5 to 90 m/s. A numerical simulation based on the Volume of Fluid method coupled with the Large Eddy Simulation turbulent model is carried out in conjunction with the validating experiments to shed more light on the coalescence of droplets and detachment phenomena through a detailed analysis of the aerodynamic forces and velocity vectors on the droplet and the streamlines around it. The results indicate a contrast in the mechanism of two-droplet coalescence and subsequent detachment with those related to the case of a single droplet shedding. At lower speeds, the two droplets coalesce by attracting each other, with successive rebounds of the merged droplet on the substrate, while at higher speeds, the detachment occurs almost instantly after coalescence, with a detachment time decreasing exponentially with the air speed. It is shown that the coalescence phenomenon assists droplet detachment from the superhydrophobic substrate at lower air speeds.

  4. Streamlined research funding using short proposals and accelerated peer review: an observational study.

    PubMed

    Barnett, Adrian G; Herbert, Danielle L; Campbell, Megan; Daly, Naomi; Roberts, Jason A; Mudge, Alison; Graves, Nicholas

    2015-02-07

    Despite the widely recognised importance of sustainable health care systems, health services research remains generally underfunded in Australia. The Australian Centre for Health Services Innovation (AusHSI) is funding health services research in the state of Queensland. AusHSI has developed a streamlined protocol for applying for and awarding funding using a short proposal and accelerated peer review. An observational study of proposals for four health services research funding rounds from May 2012 to November 2013. A short proposal of less than 1,200 words was submitted using a secure web-based portal. The primary outcome measures are: time spent preparing proposals; a simplified scoring of grant proposals (reject, revise or accept for interview) by a scientific review committee; and progressing from submission to funding outcomes within eight weeks. Proposals outside of health services research were deemed ineligible. There were 228 eligible proposals across 4 funding rounds: from 29% to 79% were shortlisted and 9% to 32% were accepted for interview. Success rates increased from 6% (in 2012) to 16% (in 2013) of eligible proposals. Applicants were notified of the outcomes within two weeks of the interview, which was a maximum of eight weeks after the submission deadline. Applicants spent 7 days on average preparing their proposal. Applicants with a ranking of reject or revise received written feedback and suggested improvements for their proposals, and resubmissions comprised one third of the 2013 rounds. The AusHSI funding scheme is a streamlined application process that has simplified the process of allocating health services research funding for both applicants and peer reviewers. The AusHSI process has minimised the time from submission to notification of funding outcomes.

  5. Dshell++: A Component Based, Reusable Space System Simulation Framework

    NASA Technical Reports Server (NTRS)

    Lim, Christopher S.; Jain, Abhinandan

    2009-01-01

    This paper describes the multi-mission Dshell++ simulation framework for high fidelity, physics-based simulation of spacecraft, robotic manipulation and mobility systems. Dshell++ is a C++/Python library which uses modern script-driven object-oriented techniques to allow component reuse and a dynamic run-time interface for complex, high-fidelity simulation of spacecraft and robotic systems. The goal of the Dshell++ architecture is to manage the inherent complexity of physics-based simulations while supporting component model reuse across missions. The framework provides several features that support a large degree of simulation configurability and usability.

  6. Alignment of Tractograms As Graph Matching.

    PubMed

    Olivetti, Emanuele; Sharmin, Nusrat; Avesani, Paolo

    2016-01-01

    The white matter pathways of the brain can be reconstructed as 3D polylines, called streamlines, through the analysis of diffusion magnetic resonance imaging (dMRI) data. The whole set of streamlines is called the tractogram and represents the structural connectome of the brain. In multiple applications, like group-analysis, segmentation, or atlasing, tractograms of different subjects need to be aligned. Typically, this is done with registration methods, which transform the tractograms in order to increase their similarity. In contrast with transformation-based registration methods, in this work we propose the concept of tractogram correspondence, whose aim is to find which streamline of one tractogram corresponds to which streamline in another tractogram, i.e., a map from one tractogram to another. As a further contribution, we propose to use the relational information of each streamline, i.e., its distances from the other streamlines in its own tractogram, as the building block to define the optimal correspondence. We provide an operational procedure to find the optimal correspondence through a combinatorial optimization problem and we discuss its similarity to the graph matching problem. In this work, we propose to represent tractograms as graphs and we adopt a recent inexact sub-graph matching algorithm to approximate the solution of the tractogram correspondence problem. On tractograms generated from the Human Connectome Project dataset, we report experimental evidence that tractogram correspondence, implemented as graph matching, provides much better alignment than affine registration and comparable if not better results than non-linear registration of volumes.
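The correspondence idea can be illustrated as an assignment problem on relational descriptors: each element is described by its sorted distances to all the others in its own set, and a minimum-cost matching between the two descriptor sets is computed. A toy sketch (single points stand in for streamlines, and the Hungarian algorithm stands in for the paper's inexact sub-graph matching):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Toy "tractograms": each streamline is reduced to a single 2-D point so
# the idea stays visible. B is a noisy, permuted copy of A.
A = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 1.0]])
B = np.array([[0.05, 0.95], [0.02, 0.01], [1.98, 0.03]])  # = A[2], A[0], A[1]

def relational(points):
    """Each element's sorted distances to all others in its own set."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    return np.sort(d, axis=1)

# Cost of mapping A[i] to B[j]: mismatch between relational descriptors.
cost = np.linalg.norm(relational(A)[:, None, :] - relational(B)[None, :, :], axis=-1)
rows, cols = linear_sum_assignment(cost)
print(cols)  # [1 2 0]: A[0]->B[1], A[1]->B[2], A[2]->B[0]
```

Because only within-set distances are compared, the matching is found without ever transforming one set into the other's coordinate frame, which is the key contrast with registration.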

  7. Collaboration-Centred Cities through Urban Apps Based on Open and User-Generated Data

    PubMed Central

    Aguilera, Unai; López-de-Ipiña, Diego; Pérez, Jorge

    2016-01-01

    This paper describes the IES Cities platform conceived to streamline the development of urban apps that combine heterogeneous datasets provided by diverse entities, namely, government, citizens, sensor infrastructure and other information data sources. This work pursues the challenge of achieving effective citizen collaboration by empowering them to prosume urban data across time. Particularly, this paper focuses on the query mapper; a key component of the IES Cities platform devised to democratize the development of open data-based mobile urban apps. This component allows developers not only to use available data, but also to contribute to existing datasets with the execution of SQL sentences. In addition, the component allows developers to create ad hoc storages for their applications, publishable as new datasets accessible by other consumers. As multiple users could be contributing and using a dataset, our solution also provides a data level permission mechanism to control how the platform manages the access to its datasets. We have evaluated the advantages brought forward by IES Cities from the developers’ perspective by describing an exemplary urban app created on top of it. In addition, we include an evaluation of the main functionalities of the query mapper. PMID:27376300

  8. Collaboration-Centred Cities through Urban Apps Based on Open and User-Generated Data.

    PubMed

    Aguilera, Unai; López-de-Ipiña, Diego; Pérez, Jorge

    2016-07-01

    This paper describes the IES Cities platform conceived to streamline the development of urban apps that combine heterogeneous datasets provided by diverse entities, namely, government, citizens, sensor infrastructure and other information data sources. This work pursues the challenge of achieving effective citizen collaboration by empowering them to prosume urban data across time. Particularly, this paper focuses on the query mapper; a key component of the IES Cities platform devised to democratize the development of open data-based mobile urban apps. This component allows developers not only to use available data, but also to contribute to existing datasets with the execution of SQL sentences. In addition, the component allows developers to create ad hoc storages for their applications, publishable as new datasets accessible by other consumers. As multiple users could be contributing and using a dataset, our solution also provides a data level permission mechanism to control how the platform manages the access to its datasets. We have evaluated the advantages brought forward by IES Cities from the developers' perspective by describing an exemplary urban app created on top of it. In addition, we include an evaluation of the main functionalities of the query mapper.
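A hypothetical sketch of the query-mapper idea described above: an app-created ad hoc dataset, contributions via SQL, and a data-level permission check before any write. All table, column, and function names are invented for illustration and are not the actual IES Cities schema or API:

```python
import sqlite3

# Hypothetical sketch of the query-mapper idea: an app-created ad hoc
# dataset, contributions via SQL, and a data-level permission check.
# Table, column, and function names are invented, not the IES Cities schema.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE noise_reports (user TEXT, level_db REAL)")
db.execute("CREATE TABLE permissions (user TEXT, dataset TEXT, can_write INTEGER)")
db.execute("INSERT INTO permissions VALUES ('alice', 'noise_reports', 1)")

def contribute(user, sql, params):
    """Execute a write on behalf of a user if the permission table allows it."""
    row = db.execute(
        "SELECT can_write FROM permissions WHERE user = ? AND dataset = ?",
        (user, 'noise_reports')).fetchone()
    if not row or not row[0]:
        raise PermissionError(f"{user} may not write to this dataset")
    db.execute(sql, params)

contribute('alice', "INSERT INTO noise_reports VALUES (?, ?)", ('alice', 62.5))
print(db.execute("SELECT COUNT(*) FROM noise_reports").fetchone()[0])  # 1
```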

  9. Theory-Based Interventions Combining Mental Simulation and Planning Techniques to Improve Physical Activity: Null Results from Two Randomized Controlled Trials.

    PubMed

    Meslot, Carine; Gauchet, Aurélie; Allenet, Benoît; François, Olivier; Hagger, Martin S

    2016-01-01

    Interventions to assist individuals in initiating and maintaining regular participation in physical activity are not always effective. Psychological and behavioral theories advocate the importance of both motivation and volition in interventions to change health behavior. Interventions adopting self-regulation strategies that foster motivational and volitional components may, therefore, have utility in promoting regular physical activity participation. We tested the efficacy of an intervention adopting motivational (mental simulation) and volitional (implementation intentions) components to promote regular physical activity in two studies. Study 1 adopted a cluster randomized design in which participants (n = 92) were allocated to one of three conditions: mental simulation plus implementation intention, implementation intention only, or control. Study 2 adopted a 2 (mental simulation vs. no mental simulation) × 2 (implementation intention vs. no implementation intention) randomized controlled design in which fitness center attendees (n = 184) were randomly allocated to one of four conditions: mental simulation only, implementation intention only, combined, or control. Physical activity behavior was measured by self-report (Study 1) or fitness center attendance (Study 2) at 4- (Studies 1 and 2) and 19- (Study 2 only) week follow-up periods. Findings revealed no statistically significant main or interactive effects of the mental simulation and implementation intention conditions on physical activity outcomes in either study. Findings are in contrast to previous research which has found pervasive effects for both intervention strategies. Findings are discussed in light of study limitations including the relatively small sample sizes, particularly for Study 1, deviations in the operationalization of the intervention components from previous research and the lack of a prompt for a goal intention. Future research should focus on ensuring uniformity in the format of the intervention components, test the effects of each component alone and in combination using standardized measures across multiple samples, and systematically explore effects of candidate moderators.

  10. Theory-Based Interventions Combining Mental Simulation and Planning Techniques to Improve Physical Activity: Null Results from Two Randomized Controlled Trials

    PubMed Central

    Meslot, Carine; Gauchet, Aurélie; Allenet, Benoît; François, Olivier; Hagger, Martin S.

    2016-01-01

    Interventions to assist individuals in initiating and maintaining regular participation in physical activity are not always effective. Psychological and behavioral theories advocate the importance of both motivation and volition in interventions to change health behavior. Interventions adopting self-regulation strategies that foster motivational and volitional components may, therefore, have utility in promoting regular physical activity participation. We tested the efficacy of an intervention adopting motivational (mental simulation) and volitional (implementation intentions) components to promote regular physical activity in two studies. Study 1 adopted a cluster randomized design in which participants (n = 92) were allocated to one of three conditions: mental simulation plus implementation intention, implementation intention only, or control. Study 2 adopted a 2 (mental simulation vs. no mental simulation) × 2 (implementation intention vs. no implementation intention) randomized controlled design in which fitness center attendees (n = 184) were randomly allocated to one of four conditions: mental simulation only, implementation intention only, combined, or control. Physical activity behavior was measured by self-report (Study 1) or fitness center attendance (Study 2) at 4- (Studies 1 and 2) and 19- (Study 2 only) week follow-up periods. Findings revealed no statistically significant main or interactive effects of the mental simulation and implementation intention conditions on physical activity outcomes in either study. Findings are in contrast to previous research which has found pervasive effects for both intervention strategies. Findings are discussed in light of study limitations including the relatively small sample sizes, particularly for Study 1, deviations in the operationalization of the intervention components from previous research and the lack of a prompt for a goal intention. Future research should focus on ensuring uniformity in the format of the intervention components, test the effects of each component alone and in combination using standardized measures across multiple samples, and systematically explore effects of candidate moderators. PMID:27899904

  11. The self streamlining wind tunnel. [wind tunnel walls

    NASA Technical Reports Server (NTRS)

    Goodyer, M. J.

    1975-01-01

    A two-dimensional test section in a low speed wind tunnel capable of producing flow conditions free from wall interference is presented. Flexible top and bottom walls, and rigid sidewalls on which models were mounted spanning the tunnel, are shown. All walls were unperforated, and the flexible walls were positioned by screw jacks. To eliminate wall interference, the wind tunnel itself supplied the information required in the streamlining process when run with the model present. Measurements taken at the flexible walls were used by the tunnel's computer to check wall contours. Suitable adjustments based on streamlining criteria were then suggested by the computer. The streamlining criterion adopted when generating infinite flowfield conditions was a matching of static pressures in the test section at a wall with pressures computed for an imaginary inviscid flowfield passing over the outside of the same wall. Aerodynamic data taken on a cylindrical model operating under high blockage conditions are presented to illustrate the operation of the tunnel in its various modes.

  12. Towards the Identification of the Keeper Erosion Cause(s): Numerical Simulations of the Plasma and Neutral Gas Using the Global Cathode Model OrCa2D-II

    NASA Technical Reports Server (NTRS)

    Mikellides, Ioannis G.; Katz, Ira; Goebel, Dan M.; Jameson, Kristina K.

    2006-01-01

    Numerical simulations with the time-dependent Orificed Cathode (OrCa2D-II) computer code show that classical enhancements of the plasma resistivity cannot account for the elevated electron temperatures and steep plasma potential gradients measured in the plume of a 25-27.5 A discharge hollow cathode. The cathode, which employs a 0.11-in diameter orifice, was operated at 5.5 sccm without an applied magnetic field using two different anode geometries. It is found that anomalous resistivity based on electron-driven instabilities improves the comparison between theory and experiment. It is also estimated that other effects, such as the Hall effect from the self-induced magnetic field, not presently included in OrCa2D-II, may contribute to the constriction of the current density streamlines, thus explaining the higher plasma densities observed along the centerline.

  13. Ethmoidectomy combined with superior meatus enlargement increases olfactory airflow

    PubMed Central

    Kondo, Kenji; Nomura, Tsutomu; Yamasoba, Tatsuya

    2017-01-01

    Objectives The relationship between a particular surgical technique in endoscopic sinus surgery (ESS) and airflow changes in the post‐operative olfactory region has not been assessed. The present study aimed to compare olfactory airflow after ESS between conventional ethmoidectomy and ethmoidectomy with superior meatus enlargement, using virtual ESS and computational fluid dynamics (CFD) analysis. Study Design Prospective computational study. Materials and Methods Nasal computed tomography images of four adult subjects were used to generate models of the nasal airway. The original preoperative model was digitally edited as virtual ESS by performing uncinectomy, ethmoidectomy, antrostomy, and frontal sinusotomy. The following two post‐operative models were prepared: conventional ethmoidectomy with normal superior meatus (ESS model) and ethmoidectomy with superior meatus enlargement (ESS‐SM model). The calculated three‐dimensional nasal geometries were confirmed using virtual endoscopy to ensure that they corresponded to the post‐operative anatomy observed in the clinical setting. Steady‐state, laminar, inspiratory airflow was simulated, and the velocity, streamline, and mass flow rate in the olfactory region were compared among the preoperative and two postoperative models. Results The mean velocity in the olfactory region, number of streamlines bound to the olfactory region, and mass flow rate were higher in the ESS‐SM model than in the other models. Conclusion We successfully used an innovative approach involving virtual ESS, virtual endoscopy, and CFD to assess postoperative outcomes after ESS. It is hypothesized that the increased airflow to the olfactory fossa achieved with ESS‐SM may lead to improved olfactory function; however, further studies are required. Level of Evidence NA. PMID:28894833

  14. Orientation program for hospital-based nurse practitioners.

    PubMed

    Bahouth, Mona N; Esposito-Herr, Mary Beth

    2009-01-01

    The transition from student to practicing clinician is often a challenging and difficult period for many nurse practitioners. Newly graduated nurse practitioners commonly describe feelings of inadequacy in assuming clinical responsibilities, lack of support by team members, unclear expectations for the orientation period, and role isolation. This article describes the formal nurse practitioner orientation program implemented at the University of Maryland Medical Center, a large urban academic medical center, to facilitate the transition of new nurse practitioners into the workforce. This comprehensive program incorporates streamlined administrative activities, baseline didactic and simulation-based critical care education, ongoing and focused peer support, access to formalized resources, and individualized clinical preceptor programs. This formalized orientation program has proven to be one of the key variables to successful integration of nurse practitioners into our acute care clinical teams.

  15. On the influences of key modelling constants of large eddy simulations for large-scale compartment fires predictions

    NASA Astrophysics Data System (ADS)

    Yuen, Anthony C. Y.; Yeoh, Guan H.; Timchenko, Victoria; Cheung, Sherman C. P.; Chan, Qing N.; Chen, Timothy

    2017-09-01

    An in-house large eddy simulation (LES) based fire field model has been developed for large-scale compartment fire simulations. The model incorporates four major components: fully coupled subgrid-scale turbulence, combustion, soot and radiation models. It is designed to simulate the temporal and fluid dynamical effects of turbulent reacting flow for non-premixed diffusion flames. Parametric studies were performed based on a large-scale fire experiment carried out in a 39-m long test hall facility. Turbulent Prandtl and Schmidt numbers ranging from 0.2 to 0.5, and Smagorinsky constants ranging from 0.18 to 0.23, were investigated. The temperature and flow field predictions were most accurate with turbulent Prandtl and Schmidt numbers both set to 0.3 and a Smagorinsky constant of 0.2. In addition, by utilising a set of numerically verified key modelling parameters, the smoke filling process was successfully captured by the present LES model.
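The Smagorinsky constant studied above enters the subgrid model through the eddy viscosity, nu_t = (C_s * Delta)^2 * |S|. A small sketch of its effect over the examined range (the filter width and strain-rate magnitude are assumed illustrative values, not the test-hall conditions):

```python
# The Smagorinsky constant C_s sets the subgrid-scale eddy viscosity,
#   nu_t = (C_s * Delta)**2 * |S|,
# where Delta is the filter width and |S| the resolved strain-rate
# magnitude. Delta and |S| below are illustrative, not test-hall values.
def eddy_viscosity(c_s, delta, strain_rate_mag):
    """Smagorinsky subgrid eddy viscosity, m^2/s for SI inputs."""
    return (c_s * delta) ** 2 * strain_rate_mag

delta = 0.1    # filter width ~ grid spacing, m (assumed)
s_mag = 50.0   # resolved strain-rate magnitude, 1/s (assumed)
for c_s in (0.18, 0.20, 0.23):
    print(c_s, eddy_viscosity(c_s, delta, s_mag))  # grows with C_s squared
```

The quadratic dependence on C_s is why the modest 0.18-0.23 range tested in the study can still shift the predicted temperature and flow fields noticeably.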

  16. Gas production and migration in landfills and geological materials.

    PubMed

    Nastev, M; Therrien, R; Lefebvre, R; Gélinas, P

    2001-11-01

    Landfill gas, originating from the anaerobic biodegradation of the organic content of waste, consists mainly of methane and carbon dioxide, with traces of volatile organic compounds. Pressure, concentration and temperature gradients that develop within the landfill result in gas emissions to the atmosphere and in lateral migration through the surrounding soils. Environmental and safety issues associated with the landfill gas require control of off-site gas migration. The numerical model TOUGH2-LGM (Transport of Unsaturated Groundwater and Heat-Landfill Gas Migration) has been developed to simulate landfill gas production and migration processes within and beyond landfill boundaries. The model is derived from the general non-isothermal multiphase flow simulator TOUGH2, to which a new equation of state module is added. It simulates the migration of five components in partially saturated media: four fluid components (water, atmospheric air, methane and carbon dioxide) and one energy component (heat). The four fluid components are present in both the gas and liquid phases. The model incorporates gas-liquid partitioning of all fluid components by means of dissolution and volatilization. In addition to advection in the gas and liquid phases, multi-component diffusion is simulated in the gas phase. The landfill gas production rate is proportional to the remaining organic substrate and is modeled as an exponentially decreasing function of time. The model is applied to Montreal's CESM landfill site, which is located in a former limestone rock quarry. Existing data were used to characterize hydraulic properties of the waste and the limestone. Gas recovery data at the site were used to define the gas production model. Simulations in one and two dimensions are presented to investigate gas production and migration in the landfill and in the surrounding limestone. The effects of a gas recovery well and landfill cover on gas migration are also discussed.
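The production model described above, with generation proportional to the organic substrate, is first-order decay: G(t) = G0 * exp(-k * t). A minimal sketch with illustrative parameters (not the values fitted from the CESM gas recovery data):

```python
import math

# First-order decay production: the generation rate is proportional to the
# remaining organic substrate, so it decreases exponentially in time,
#   G(t) = G0 * exp(-k * t).
# G0 and k below are illustrative, not values fitted to the CESM site.
def gas_production_rate(g0, k, t):
    """Landfill gas production rate at time t (units of g0)."""
    return g0 * math.exp(-k * t)

g0 = 1000.0   # generation rate at closure, m^3/yr (assumed)
k = 0.05      # decay constant, 1/yr (assumed)
rates = [gas_production_rate(g0, k, t) for t in (0, 10, 20)]
print(rates)  # decays from 1000 toward zero
```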

  17. Streamlined Archaeo-Geophysical Data Processing and Integration for DoD Field Use. Cost and Performance Report

    DTIC Science & Technology

    2012-09-01

    used a proton magnetometer to detect kiln and earth-filled pits in the United Kingdom as early as 1958 (Atkinson, 1953; Clark, 2001; Gaffney and Gater...Grad601-2 (Figure 8, upper left) is a vertical component dual sensor fluxgate gradiometer. It is designed for archaeological prospection, permits

  18. Points of View Analysis Revisited: Fitting Multidimensional Structures to Optimal Distance Components with Cluster Restrictions on the Variables.

    ERIC Educational Resources Information Center

    Meulman, Jacqueline J.; Verboon, Peter

    1993-01-01

    Points of view analysis, as a way to deal with individual differences in multidimensional scaling, was largely supplanted by the weighted Euclidean model. It is argued that the approach deserves new attention, especially as a technique to analyze group differences. A streamlined and integrated process is proposed. (SLD)

  19. Function modeling: improved raster analysis through delayed reading and function raster datasets

    Treesearch

    John S. Hogland; Nathaniel M. Anderson; J .Greg Jones

    2013-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...

  20. Issues and Strategies for Establishing Work-Integrated Learning for Multidisciplinary Teams: A Focus on Degrees in Sustainability

    ERIC Educational Resources Information Center

    Wilson, Robyn Fay

    2015-01-01

    This study was conducted to identify challenges and potential strategies to streamline work-integrated learning placements for multidisciplinary teams of students undertaking degrees in sustainability. Face-to-face interviews using a semi-structured questionnaire were conducted with 15 academics and senior university staff, from four universities…

  1. A new package in MODFLOW to simulate unconfined groundwater flow in sloping aquifers.

    PubMed

    Wang, Quanrong; Zhan, Hongbin; Tang, Zhonghua

    2014-01-01

    The nonhorizontal-model-layer (NHML) grid system is more accurate than the horizontal-model-layer grid system for describing groundwater flow in an unconfined sloping aquifer on the basis of MODFLOW-2000. However, the finite-difference scheme of NHML was based on the Dupuit-Forchheimer assumption that the streamlines are horizontal, which is acceptable for slopes less than 0.10. In this study, we present a new finite-difference scheme for NHML based on the Boussinesq assumption and develop a new package, SLOPE, which is incorporated into MODFLOW-2000 to become the MODFLOW-SP model. The accuracy of MODFLOW-SP was tested against the solution of Mac Cormack (1969). The differences between the solutions of MODFLOW-2000 and MODFLOW-SP were nearly negligible when the slope was less than 0.27; they were noticeable during the transient flow stage and vanished in steady state when the slope increased above 0.27. We established a model considering the vertical flow using COMSOL Multiphysics to test the robustness of the constraints used in MODFLOW-SP. The results showed that streamlines quickly became parallel with the aquifer base except in narrow regions near the boundaries when the initial flow was not parallel to the aquifer base. MODFLOW-SP can be used to predict the hydraulic head of an unconfined aquifer along the profile perpendicular to the aquifer base when the slope is smaller than 0.50. The errors associated with the constraints used in MODFLOW-SP were small but noticeable when the slope increased to 0.75, and became significant for a slope of 1.0. © 2013, National Ground Water Association.
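The sloping-aquifer Boussinesq equation underlying the new scheme can be sketched with a simple explicit finite-difference loop in one dimension: Sy * dh/dt = K*cos(theta) * d/dx(h * dh/dx) + K*sin(theta) * dh/dx, with x measured along the sloping base. This is a toy discretization with assumed parameters, not the actual MODFLOW-SP package:

```python
import math

# Toy explicit finite-difference solve of the 1-D Boussinesq equation for
# an unconfined aquifer on a sloping base (x along the base, theta = slope):
#   Sy*dh/dt = K*cos(theta)*d/dx(h*dh/dx) + K*sin(theta)*dh/dx
# Illustrative parameters only; this is not the MODFLOW-SP scheme itself.
K, Sy, theta = 1.0, 0.2, math.atan(0.27)   # K in m/d; slope of 0.27
dx, dt, n = 1.0, 0.001, 21
h = [1.0] * n                              # initial saturated thickness, m
h[0], h[-1] = 1.5, 1.0                     # fixed-head boundaries

for _ in range(2000):                      # march 2 days in time
    new = h[:]
    for i in range(1, n - 1):
        diff = ((h[i+1] + h[i]) / 2 * (h[i+1] - h[i]) -
                (h[i] + h[i-1]) / 2 * (h[i] - h[i-1])) / dx**2
        adv = (h[i+1] - h[i-1]) / (2 * dx)
        new[i] = h[i] + dt / Sy * (K * math.cos(theta) * diff +
                                   K * math.sin(theta) * adv)
    h = new

print(round(h[1], 3))  # head near the upstream boundary, between 1.0 and 1.5
```

Setting theta = 0 recovers the flat-base Dupuit-Forchheimer case, which is the regime where the original NHML scheme was already adequate.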

  2. Sensor Based Framework for Secure Multimedia Communication in VANET

    PubMed Central

    Rahim, Aneel; Khan, Zeeshan Shafi; Bin Muhaya, Fahad T.; Sher, Muhammad; Kim, Tai-Hoon

    2010-01-01

    Secure multimedia communication enhances the safety of passengers by providing visual pictures of accidents and danger situations. In this paper we propose a framework for secure multimedia communication in Vehicular Ad-Hoc Networks (VANETs). Our proposed framework is mainly divided into four components: redundant information, priority assignment, malicious data verification and malicious node verification. The proposed scheme has been validated with the help of the NS-2 network simulator and the Evalvid tool. PMID:22163462

  3. Evaluation of Veterinary Student Surgical Skills Preparation for Ovariohysterectomy Using Simulators: A Pilot Study.

    PubMed

    Read, Emma K; Vallevand, Andrea; Farrell, Robin M

    2016-01-01

    This paper describes the development and evaluation of training intended to enhance students' performance on their first live-animal ovariohysterectomy (OVH). Cognitive task analysis informed a seven-page lab manual, 30-minute video, and 46-item OVH checklist (categorized into nine surgery components and three phases of surgery). We compared two spay simulator models (higher-fidelity silicone versus lower-fidelity cloth and foam). Third-year veterinary students were randomly assigned to a training intervention: lab manual and video only; lab manual, video, and $675 silicone-based model; or lab manual, video, and $64 cloth and foam model. We then assessed transfer of training to a live-animal OVH. Chi-square analyses determined statistically significant differences between the interventions on four of nine surgery components, all three phases of surgery, and overall score. Odds ratio analyses indicated that training with a spay model improved the odds of attaining an excellent or good rating on 25 of 46 checklist items, six of nine surgery components, all three phases of surgery, and the overall score. Odds ratio analyses comparing the spay models indicated an advantage for the $675 silicone-based model on only 6 of 46 checklist items, three of nine surgery components, and one phase of surgery. Training with a spay model improved performance when compared to training with a manual and video only. Results suggested that training with a lower-fidelity, lower-cost model might be as effective as training with a higher-fidelity, higher-cost model. Further research is required to investigate the effects of simulator fidelity and cost on transfer of training to the operational environment.
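The reported comparisons reduce to odds ratios on 2x2 tables of outcome by training group. A minimal sketch with invented counts (not the study's data):

```python
# Each reported comparison reduces to a 2x2 table of outcome (e.g.
# excellent/good vs. fair/poor) by training group. Counts are invented
# for illustration; they are not the study's data.
def odds_ratio(a, b, c, d):
    """OR for the 2x2 table [[a, b], [c, d]]: (a/b) / (c/d)."""
    return (a * d) / (b * c)

# e.g. 18 of 24 model-trained students rated good vs. 10 of 24 controls
print(odds_ratio(18, 6, 10, 14))  # 4.2
```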

  4. Effects of complex terrain on atmospheric flow: dividing streamline observations and quantification

    NASA Astrophysics Data System (ADS)

    Thompson, Michael; Fernando, Harindra; di Sabatino, Silvana; Leo, Laura; University of Notre Dame Team

    2013-11-01

    As part of the MATERHORN field campaign on atmospheric flow in mountainous terrain, the dividing streamline concept for stratified flow over obstacles was investigated using smoke flow visualization and meteorological measurements. At small Froude numbers (Fr < 1), a stratified flow approaching a mountain either possesses enough kinetic energy to pass over the summit or else flows around the sides, with dividing streamlines separating the two scenarios. An isolated northwestern peak of Granite Mountain, approximately 60 m in height, was used for the study. Incoming flow velocities and temperature profiles were measured upstream using sonic anemometers and thermocouples mounted on a 32 m tower, while onsite measurements were taken with portable weather stations. Sufficiently strong stratification had developed by around 3:00 AM GMT, with Froude numbers in the range for dividing streamlines to exist. In the first trial, suitably placed red smoke releases were used, and in another trial white smoke was released from a 25 m crane. In both cases a well-defined dividing streamline was observed, and its vertical location was at a height of about half the mountain height, which is consistent with theoretical results based on Sheppard's formula. This research was supported by the Office of Naval Research (ONR) grant number N00014-11-1-0709.
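The reported dividing-streamline height follows from the constant-stratification form of Sheppard's formula, H_s = H(1 - Fr) with Fr = U/(NH). A minimal sketch; the wind speed and buoyancy frequency below are illustrative values, not the tower measurements:

```python
import math

def brunt_vaisala(g, theta0, dtheta_dz):
    """Buoyancy frequency N = sqrt((g / theta0) * dtheta/dz)."""
    return math.sqrt(g / theta0 * dtheta_dz)

def dividing_streamline_height(U, N, H):
    """Sheppard's formula for constant N: H_s = H * (1 - Fr), Fr = U / (N * H)."""
    Fr = U / (N * H)
    return max(0.0, H * (1.0 - Fr))

# Illustrative values for a 60 m hill: U = 1.5 m/s and N = 0.05 1/s give
# Fr = 0.5, placing the dividing streamline at half the hill height,
# consistent with the smoke observations.
Hs = dividing_streamline_height(U=1.5, N=0.05, H=60.0)  # 30.0 m
```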

  5. Planning Models for Tuberculosis Control Programs

    PubMed Central

    Chorba, Ronald W.; Sanders, J. L.

    1971-01-01

    A discrete-state, discrete-time simulation model of tuberculosis is presented, with submodels of preventive interventions. The model allows prediction of the prevalence of the disease over the simulation period. Preventive and control programs and their optimal budgets may be planned by using the model for cost-benefit analysis: costs are assigned to the program components and disease outcomes to determine the ratio of program expenditures to future savings on medical and socioeconomic costs of tuberculosis. Optimization is achieved by allocating funds in successive increments to alternative program components in simulation and identifying those components that lead to the greatest reduction in prevalence for the given level of expenditure. The method is applied to four hypothetical disease prevalence situations. PMID:4999448
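The incremental allocation procedure can be sketched as a greedy loop over budget increments; the component names, weights, and diminishing-returns response curves below are hypothetical stand-ins for the simulated prevalence reductions, not the paper's model:

```python
import math

# Hypothetical diminishing-returns effect of funding x on prevalence
# reduction for each program component (weights are illustrative).
EFFECT_WEIGHTS = {"case_finding": 3.0, "chemoprophylaxis": 2.0,
                  "vaccination": 1.5, "treatment": 2.5}

def reduction(component, funding):
    """Toy concave response: reduction grows logarithmically with funding."""
    return EFFECT_WEIGHTS[component] * math.log1p(funding)

def allocate(budget, increment):
    """Greedy incremental allocation: each budget increment goes to the
    component with the largest marginal prevalence reduction."""
    funding = {c: 0.0 for c in EFFECT_WEIGHTS}
    for _ in range(round(budget / increment)):
        best = max(funding, key=lambda c: reduction(c, funding[c] + increment)
                                          - reduction(c, funding[c]))
        funding[best] += increment
    return funding

plan = allocate(budget=100.0, increment=5.0)
```

With concave responses, the greedy loop equalizes marginal gains, so higher-weight components accumulate more funding.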

  6. BootGraph: probabilistic fiber tractography using bootstrap algorithms and graph theory.

    PubMed

    Vorburger, Robert S; Reischauer, Carolin; Boesiger, Peter

    2013-02-01

    Bootstrap methods have recently been introduced to diffusion-weighted magnetic resonance imaging to estimate the measurement uncertainty of ensuing diffusion parameters directly from the acquired data without the necessity to assume a noise model. These methods have previously been combined with deterministic streamline tractography algorithms to allow for the assessment of connection probabilities in the human brain. Thereby, the local noise-induced disturbance in the diffusion data is accumulated additively due to the incremental progression of streamline tractography algorithms. Graph-based approaches have been proposed to overcome this drawback of streamline techniques. For this reason, in the present work the bootstrap method is incorporated into a graph setup to derive a new probabilistic fiber tractography method, called BootGraph. The acquired data set is thereby converted into a weighted, undirected graph by defining a vertex in each voxel and edges between adjacent vertices. By means of the cone of uncertainty, which is derived using the wild bootstrap, a weight is thereafter assigned to each edge. Two path finding algorithms are subsequently applied to derive connection probabilities. While the first algorithm is based on the shortest path approach, the second algorithm takes all existing paths between two vertices into consideration. Tracking results are compared to an established algorithm based on the bootstrap method in combination with streamline fiber tractography and to another graph based algorithm. The BootGraph shows very good performance in crossing situations with respect to false negatives and permits incorporating additional constraints, such as a curvature threshold. 
By inheriting the advantages of the bootstrap method and graph theory, the BootGraph method provides a computationally efficient and flexible probabilistic tractography setup to compute connection probability maps and virtual fiber pathways without the drawbacks of streamline tractography algorithms or the assumption of a noise distribution. Moreover, the BootGraph can be applied to common DTI data sets without further modifications and shows a high repeatability. Thus, it is very well suited for longitudinal studies and meta-studies based on DTI. Copyright © 2012 Elsevier Inc. All rights reserved.
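The shortest-path component of such a graph setup can be illustrated with Dijkstra's algorithm: if each edge carries a probability, the most probable connection between two vertices is the path minimizing the sum of -log(p). This is a simplified stand-in; the actual BootGraph edge weights come from the wild-bootstrap cone of uncertainty:

```python
import heapq, math

def most_probable_path(edges, source, target):
    """edges: dict vertex -> list of (neighbor, probability in (0, 1]).
    Returns (path, probability) for the path maximizing the product of edge
    probabilities, found via Dijkstra on additive weights -log(p)."""
    dist = {source: 0.0}
    prev = {}
    pq = [(0.0, source)]
    while pq:
        d, v = heapq.heappop(pq)
        if v == target:
            break
        if d > dist.get(v, math.inf):
            continue  # stale queue entry
        for w, p in edges.get(v, []):
            nd = d - math.log(p)
            if nd < dist.get(w, math.inf):
                dist[w] = nd
                prev[w] = v
                heapq.heappush(pq, (nd, w))
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1], math.exp(-dist[target])

# Toy voxel graph: the two-hop route A-B-D (0.9 * 0.9 = 0.81) beats the
# direct but uncertain edge A-D (0.5).
g = {"A": [("B", 0.9), ("D", 0.5)], "B": [("D", 0.9)]}
path, prob = most_probable_path(g, "A", "D")  # (['A', 'B', 'D'], 0.81)
```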

  7. A conceptual design of shock-eliminating clover combustor for large scale scramjet engine

    NASA Astrophysics Data System (ADS)

    Sun, Ming-bo; Zhao, Yu-xin; Zhao, Guo-yan; Liu, Yuan

    2017-01-01

    A new concept of a shock-eliminating clover combustor is proposed for a large scale scramjet engine to fulfill the requirements of fuel penetration, total pressure recovery and cooling. To generate the circular-to-clover transition shape of the combustor, the streamline tracing technique is used based on an axisymmetric expansion parent flowfield calculated using the method of characteristics. The combustor is examined using inviscid and viscous numerical simulations, and a pure circular shape is calculated for comparison. The results showed that the combustor avoids shock wave generation and produces low total pressure losses over a wide range of flight conditions with various Mach numbers. The flameholding device for this combustor is briefly discussed.
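Streamline tracing through a precomputed parent flowfield amounts to integrating dx/ds = u(x) from points on the desired inlet contour. A minimal sketch using classical RK4 with an analytic velocity field standing in for the method-of-characteristics solution:

```python
def trace_streamline(velocity, x0, ds, steps):
    """Integrate dx/ds = u(x) with classical RK4; velocity maps (x, y) -> (u, v)."""
    pts = [tuple(x0)]
    x, y = x0
    for _ in range(steps):
        k1 = velocity(x, y)
        k2 = velocity(x + 0.5 * ds * k1[0], y + 0.5 * ds * k1[1])
        k3 = velocity(x + 0.5 * ds * k2[0], y + 0.5 * ds * k2[1])
        k4 = velocity(x + ds * k3[0], y + ds * k3[1])
        x += ds / 6.0 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        y += ds / 6.0 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        pts.append((x, y))
    return pts

# Sanity check on a solid-body rotation u = (-y, x): streamlines are circles,
# so the traced points should stay at constant radius from the origin.
pts = trace_streamline(lambda x, y: (-y, x), (1.0, 0.0), ds=0.01, steps=628)
```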

  8. Using Simulation to Improve Systems-Based Practices.

    PubMed

    Gardner, Aimee K; Johnston, Maximilian; Korndorffer, James R; Haque, Imad; Paige, John T

    2017-09-01

    Ensuring the safe, effective management of patients requires efficient processes of care within a smoothly operating system in which highly reliable teams of talented, skilled health care providers are able to use the vast array of high-technology resources and intensive care techniques available. Simulation can play a unique role in exploring and improving the complex perioperative system by proactively identifying latent safety threats and mitigating their damage to ensure that all those who work in this critical health care environment can provide optimal levels of patient care. A panel of five experts from a wide range of institutions was brought together to discuss the added value of simulation-based training for improving systems-based aspects of the perioperative service line. Panelists shared the ways in which simulation was used at their institutions. The themes discussed by each panel member were delineated into four avenues through which simulation-based techniques have been used. Simulation-based techniques are being used in (1) testing new clinical workspaces and facilities before they open to identify potential latent conditions; (2) practicing how to identify the deteriorating patient and escalate care in an effective manner; (3) performing prospective root cause analyses to address system weaknesses leading to sentinel events; and (4) evaluating the efficiency and effectiveness of the electronic health record in the perioperative setting. This focused review of simulation-based interventions to test and improve components of the perioperative microsystem, which includes literature that has emerged since the panel's presentation, highlights the broad-based utility of simulation-based technologies in health care. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  9. Computation of Steady-State Probability Distributions in Stochastic Models of Cellular Networks

    PubMed Central

    Hallen, Mark; Li, Bochong; Tanouchi, Yu; Tan, Cheemeng; West, Mike; You, Lingchong

    2011-01-01

    Cellular processes are “noisy”. In each cell, concentrations of molecules are subject to random fluctuations due to the small numbers of these molecules and to environmental perturbations. While noise varies with time, it is often measured at steady state, for example by flow cytometry. When interrogating aspects of a cellular network by such steady-state measurements of network components, a key need is to develop efficient methods to simulate and compute these distributions. We describe innovations in stochastic modeling coupled with approaches to this computational challenge: first, an approach to modeling intrinsic noise via solution of the chemical master equation, and second, a convolution technique to account for contributions of extrinsic noise. We show how these techniques can be combined in a streamlined procedure for evaluation of different sources of variability in a biochemical network. Evaluation and illustrations are given in analysis of two well-characterized synthetic gene circuits, as well as a signaling network underlying the mammalian cell cycle entry. PMID:22022252
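For a simple birth-death gene expression model, the steady state of the chemical master equation is available by detailed balance, and extrinsic noise can be layered on by convolving (mixing) over a distribution of rate constants. A toy sketch of both steps; the rate values and two-subpopulation mixture are illustrative, not the paper's circuits:

```python
def cme_stationary(k, gamma, nmax):
    """Steady state of the birth-death CME (production rate k, degradation
    gamma * n): detailed balance gives pi[n+1] = pi[n] * k / (gamma * (n + 1))."""
    pi = [1.0]
    for n in range(nmax):
        pi.append(pi[-1] * k / (gamma * (n + 1)))
    z = sum(pi)
    return [p / z for p in pi]

def with_extrinsic_noise(k_values, weights, gamma, nmax):
    """Mixture over cell-to-cell variation in k: p(n) = sum_j w_j * pi_kj(n)."""
    dists = [cme_stationary(k, gamma, nmax) for k in k_values]
    return [sum(w * d[n] for w, d in zip(weights, dists))
            for n in range(nmax + 1)]

# Intrinsic-only steady state is (truncated) Poisson with mean k/gamma = 10;
# mixing two subpopulations broadens the distribution without moving the mean.
p_in = cme_stationary(k=10.0, gamma=1.0, nmax=60)
p_ex = with_extrinsic_noise([5.0, 15.0], [0.5, 0.5], gamma=1.0, nmax=60)
```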

  10. Design and Fabrication of Flying Saucer Utilizing Coanda Effect

    NASA Astrophysics Data System (ADS)

    Aabid, Abdul; Khan, S. A.

    2018-05-01

    Coanda effect is used in several engineering applications with distinctive designs and structures. It is also applied in aircraft flying at low speeds for a comfortable ride. In this paper, we have designed and modelled a flying saucer based on the Coanda effect. The fabrication was done by means of structural and electronic components. An electric motor was used to drive the propeller for vertical take-off and landing (VTOL) along with hovering capability. Unlike a helicopter, the rotor disc diameter is smaller than the bulbous body, which makes the flight very stable. Control flaps were used to regulate the path by altering the flow over the streamlined body. The model was then tested with a remote control. Numerical simulation of the Tesla turbine was done using ANSYS 14.5 software, and displacements were obtained by applying different forces on the designed model. CATIA V5 was used to analyse the shaft of the model to get the minimum value of torque at which the shaft starts to deform.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarman, Sten, E-mail: sarman@ownit.nu; Wang, Yong-Lei; Laaksonen, Aatto

    The self-diffusion coefficients of nematic phases of various model systems consisting of regular convex calamitic and discotic ellipsoids and non-convex bodies such as bent-core molecules and soft ellipsoid strings have been obtained as functions of the shear rate in a shear flow. Then the self-diffusion coefficient is a second rank tensor with three different diagonal components and two off-diagonal components. These coefficients were found to be determined by a combination of two mechanisms, which previously have been found to govern the self-diffusion of shearing isotropic liquids, namely, (i) shear alignment enhancing the diffusion in the direction parallel to the streamlines and hindering the diffusion in the perpendicular directions and (ii) the distortion of the shell structure in the liquid whereby a molecule more readily can escape from a surrounding shell of nearest neighbors, so that the mobility increases in every direction. Thus, the diffusion parallel to the streamlines always increases with the shear rate since these mechanisms cooperate in this direction. In the perpendicular directions, these mechanisms counteract each other so that the behaviour becomes less regular. In the case of the nematic phases of the calamitic and discotic ellipsoids and of the bent core molecules, mechanism (ii) prevails so that the diffusion coefficients increase. However, the diffusion coefficients of the soft ellipsoid strings decrease in the direction of the velocity gradient because the broadsides of these molecules are oriented perpendicularly to this direction due to the shear alignment (i). The cross coupling coefficient relating a gradient of tracer particles in the direction of the velocity gradient and their flow in the direction of the streamlines is negative and rather large, whereas the other coupling coefficient relating a gradient in the direction of the streamlines and a flow in the direction of the velocity gradient is very small.
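The diagonal components of such a self-diffusion tensor are typically extracted from mean-square displacements via the Einstein relation, D = MSD(t) / (2t). A hedged sketch on synthetic random-walk trajectories, not the paper's molecular dynamics data:

```python
import random

def diffusion_from_msd(trajs, t, dt=1.0):
    """Einstein relation for one coordinate: D = MSD(t) / (2 * t * dt)."""
    msd = sum((tr[t] - tr[0]) ** 2 for tr in trajs) / len(trajs)
    return msd / (2.0 * t * dt)

random.seed(42)
SIGMA = 0.5  # per-step displacement std; theory gives D = SIGMA**2 / 2 for dt = 1
trajs = []
for _ in range(2000):
    x, path = 0.0, [0.0]
    for _ in range(100):
        x += random.gauss(0.0, SIGMA)
        path.append(x)
    trajs.append(path)

D_xx = diffusion_from_msd(trajs, t=100)  # close to 0.125 for SIGMA = 0.5
```

In an anisotropic sheared fluid the same estimator is applied per coordinate (and per pair of coordinates for the off-diagonal couplings), yielding the tensor components discussed above.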

  12. The Healthcare Improvement Scotland evidence note rapid review process: providing timely, reliable evidence to inform imperative decisions on healthcare.

    PubMed

    McIntosh, Heather M; Calvert, Julie; Macpherson, Karen J; Thompson, Lorna

    2016-06-01

    Rapid review has become widely adopted by health technology assessment agencies in response to demand for evidence-based information to support imperative decisions. Concern about the credibility of rapid reviews and the reliability of their findings has prompted a call for wider publication of their methods. In publishing this overview of the accredited rapid review process developed by Healthcare Improvement Scotland, we aim to raise awareness of our methods and advance the discourse on best practice. Healthcare Improvement Scotland produces rapid reviews called evidence notes using a process that has achieved external accreditation through the National Institute for Health and Care Excellence. Key components include a structured approach to topic selection, initial scoping, considered stakeholder involvement, streamlined systematic review, internal quality assurance, external peer review and updating. The process was introduced in 2010 and continues to be refined over time in response to user feedback and operational experience. Decision-makers value the responsiveness of the process and perceive it as being a credible source of unbiased evidence-based information supporting advice for NHSScotland. Many agencies undertaking rapid reviews are striving to balance efficiency with methodological rigour. We agree that there is a need for methodological guidance and that it should be informed by better understanding of current approaches and the consequences of different approaches to streamlining systematic review methods. Greater transparency in the reporting of rapid review methods is essential to enable that to happen.

  13. Updated Panel-Method Computer Program

    NASA Technical Reports Server (NTRS)

    Ashby, Dale L.

    1995-01-01

    Panel code PMARC_12 (Panel Method Ames Research Center, version 12) computes potential-flow fields around complex three-dimensional bodies such as complete aircraft models. Contains several advanced features, including internal mathematical modeling of flow, time-stepping wake model for simulating either steady or unsteady motions, capability for Trefftz-plane computation of induced drag, capability for computation of off-body and on-body streamlines, and capability for computation of boundary-layer parameters by use of two-dimensional integral boundary-layer method along surface streamlines. Investigators interested in visual representations of phenomena may want to consider obtaining program GVS (ARC-13361), General Visualization System. GVS is Silicon Graphics IRIS program created to support scientific-visualization needs of PMARC_12. GVS available separately from COSMIC. PMARC_12 written in standard FORTRAN 77, with exception of NAMELIST extension used for input.

  14. Combination HIV prevention among MSM in South Africa: results from agent-based modeling.

    PubMed

    Brookmeyer, Ron; Boren, David; Baral, Stefan D; Bekker, Linda-Gail; Phaswana-Mafuya, Nancy; Beyrer, Chris; Sullivan, Patrick S

    2014-01-01

    HIV prevention trials have demonstrated the effectiveness of a number of behavioral and biomedical interventions. HIV prevention packages are combinations of interventions and offer potential to significantly increase the effectiveness of any single intervention. Estimates of the effectiveness of prevention packages are important for guiding the development of prevention strategies and for characterizing effect sizes before embarking on large scale trials. Unfortunately, most research to date has focused on testing single interventions rather than HIV prevention packages. Here we report the results from agent-based modeling of the effectiveness of HIV prevention packages for men who have sex with men (MSM) in South Africa. We consider packages consisting of four components: antiretroviral therapy for HIV infected persons with CD4 count <350; PrEP for high risk uninfected persons; behavioral interventions to reduce rates of unprotected anal intercourse (UAI); and campaigns to increase HIV testing. We considered 163 HIV prevention packages corresponding to different intensity levels of the four components. We performed 2252 simulation runs of our agent-based model to evaluate those packages. We found that a four component package consisting of a 15% reduction in the rate of UAI, 50% PrEP coverage of high risk uninfected persons, 50% reduction in persons who never test for HIV, and 50% ART coverage over and above persons already receiving ART at baseline, could prevent 33.9% of infections over 5 years (95% confidence interval, 31.5, 36.3). The package components with the largest incremental prevention effects were UAI reduction and PrEP coverage. The impact of increased HIV testing was magnified in the presence of PrEP. 
We find that HIV prevention packages that include both behavioral and biomedical components can in combination prevent significant numbers of infections with levels of coverage, acceptance and adherence that are potentially achievable among MSM in South Africa.
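The package-evaluation setup can be sketched by enumerating intensity-level combinations of the four components and scoring each with a toy effectiveness model. The level grids, per-component effect sizes, and the independence assumption below are hypothetical, not the paper's calibrated agent-based model:

```python
import itertools

# Hypothetical intensity grids (fractional coverage / reduction) per component.
LEVELS = {
    "uai_reduction": [0.0, 0.05, 0.15],
    "prep_coverage": [0.0, 0.25, 0.50],
    "testing_increase": [0.0, 0.25, 0.50],
    "art_coverage": [0.0, 0.25, 0.50],
}
# Toy per-component effects: fraction of infections averted at full intensity.
EFFECT = {"uai_reduction": 0.8, "prep_coverage": 0.4,
          "testing_increase": 0.1, "art_coverage": 0.3}

def averted_fraction(package):
    """Assume independent components: combined effect = 1 - prod(1 - e_i * x_i)."""
    keep = 1.0
    for comp, level in package.items():
        keep *= 1.0 - EFFECT[comp] * level
    return 1.0 - keep

packages = [dict(zip(LEVELS, combo))
            for combo in itertools.product(*LEVELS.values())]
best = max(packages, key=averted_fraction)
```

With monotone effects the best package is the one with every component at its top level; in the paper's agent-based model, interactions such as the testing-PrEP synergy make the ranking less trivial.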

  15. EMU Suit Performance Simulation

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew S.; Benson, Elizabeth; Harvill, Lauren; Rajulu, Sudhakar

    2014-01-01

    Introduction: Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. To verify that new suit designs meet requirements, full prototypes must be built and tested with human subjects. However, numerous design iterations will occur before the hardware meets those requirements. Traditional draw-prototype-test paradigms for research and development are prohibitively expensive with today's shrinking Government budgets. Personnel at NASA are developing modern simulation techniques that focus on a human-centric design paradigm. These new techniques make use of virtual prototype simulations and fully adjustable physical prototypes of suit hardware. This is extremely advantageous and enables comprehensive design down-selections to be made early in the design process. Objectives: The primary objective was to test modern simulation techniques for evaluating the human performance component of two EMU suit concepts, pivoted and planar style hard upper torso (HUT). Methods: This project simulated variations in EVA suit shoulder joint design and subject anthropometry and then measured the differences in shoulder mobility caused by the modifications. These estimations were compared to human-in-the-loop test data gathered during past suited testing using four subjects (two large males, two small females). Results: Results demonstrated that EVA suit modeling and simulation are feasible design tools for evaluating and optimizing suit design based on simulated performance. The suit simulation model was found to be advantageous in its ability to visually represent complex motions and volumetric reach zones in three dimensions, giving designers a faster and deeper comprehension of suit component performance vs. human performance. 
    Suit models were able to discern differing movement capabilities between EMU HUT configurations, generic suit fit concerns, and specific suit fit concerns for crewmembers based on individual anthropometry.

  16. Reconstruction of the arcuate fasciculus for surgical planning in the setting of peritumoral edema using two-tensor unscented Kalman filter tractography.

    PubMed

    Chen, Zhenrui; Tie, Yanmei; Olubiyi, Olutayo; Rigolo, Laura; Mehrtash, Alireza; Norton, Isaiah; Pasternak, Ofer; Rathi, Yogesh; Golby, Alexandra J; O'Donnell, Lauren J

    2015-01-01

    Diffusion imaging tractography is increasingly used to trace critical fiber tracts in brain tumor patients to reduce the risk of post-operative neurological deficit. However, the effects of peritumoral edema pose a challenge to conventional tractography using the standard diffusion tensor model. The aim of this study was to present a novel technique using a two-tensor unscented Kalman filter (UKF) algorithm to track the arcuate fasciculus (AF) in brain tumor patients with peritumoral edema. Ten right-handed patients with left-sided brain tumors in the vicinity of language-related cortex and evidence of significant peritumoral edema were retrospectively selected for the study. All patients underwent 3-Tesla magnetic resonance imaging (MRI) including a diffusion-weighted dataset with 31 directions. Fiber tractography was performed using both single-tensor streamline and two-tensor UKF tractography. A two-regions-of-interest approach was applied to perform the delineation of the AF. Results from the two different tractography algorithms were compared visually and quantitatively. Using single-tensor streamline tractography, the AF appeared disrupted in four patients and contained few fibers in the remaining six patients. Two-tensor UKF tractography delineated an AF that traversed edematous brain areas in all patients. The volume of the AF was significantly larger on two-tensor UKF than on single-tensor streamline tractography (p < 0.01). Two-tensor UKF tractography provides the ability to trace a larger volume AF than single-tensor streamline tractography in the setting of peritumoral edema in brain tumor patients.

  17. Estimating FIA plot characteristics using NAIP imagery, function modeling, and the RMRS raster utility coding library

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2015-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...

  18. On Flowfield Periodicity in the NASA Transonic Flutter Cascade. Part 2; Numerical Study

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; McFarland, Eric R.; Wood, Jerry R.; Lepicovsky, Jan

    2000-01-01

    The transonic flutter cascade facility at NASA Glenn Research Center was redesigned based on a combined program of experimental measurements and numerical analyses. The objectives of the redesign were to improve the periodicity of the cascade in steady operation, and to better quantify the inlet and exit flow conditions needed for CFD predictions. Part I of this paper describes the experimental measurements, which included static pressure measurements on the blade and endwalls made using both static taps and pressure sensitive paints, cobra probe measurements of the endwall boundary layers and blade wakes, and shadowgraphs of the wave structure. Part II of this paper describes three CFD codes used to analyze the facility, including a multibody panel code, a quasi-three-dimensional viscous code, and a fully three-dimensional viscous code. The measurements and analyses both showed that the operation of the cascade was heavily dependent on the configuration of the sidewalls. Four configurations of the sidewalls were studied and the results are described. For the final configuration, the quasi-three-dimensional viscous code was used to predict the location of mid-passage streamlines for a perfectly periodic cascade. By arranging the tunnel sidewalls to approximate these streamlines, sidewall interference was minimized and excellent periodicity was obtained.

  19. Manning’s equation and two-dimensional flow analogs

    NASA Astrophysics Data System (ADS)

    Hromadka, T. V., II; Whitley, R. J.; Jordan, N.; Meyer, T.

    2010-07-01

    Summary: Two-dimensional (2D) flow models based on the well-known governing 2D flow equations are applied to floodplain analysis purposes. These 2D models numerically solve the governing flow equations simultaneously or explicitly on a discretization of the floodplain using grid tiles or similar tile cell geometry, called "elements". By use of automated information systems such as digital terrain modeling, digital elevation models, and GIS, large-scale topographic floodplain maps can be readily discretized into thousands of elements that densely cover the floodplain in an edge-to-edge form. However, the assumed principal flow directions of the flow model analog, as applied across an array of elements, typically do not align with the floodplain flow streamlines. This paper examines the mathematical underpinnings of a four-direction flow analog using an array of square elements with respect to floodplain flow streamlines that are not in alignment with the analog's principal flow directions. It is determined that application of Manning's equation to estimate the friction slope terms of the governing flow equations, in directions that are not coincident with the flow streamlines, may introduce a bias in modeling results, in the form of slight underestimation of flow depths. It is also determined that the maximum theoretical bias occurs when a single square element is rotated by about 13°, and not 45° as would be intuitively thought. The bias as a function of rotation angle for an array of square elements follows approximately the bias for a single square element. For both the theoretical single square element and an array of square elements, the bias as a function of alignment angle follows a relatively constant value from about 5° to about 85°, centered at about 45°. This bias was first noted about a decade prior to the present paper, and the magnitude of this bias was estimated then to be about 20% at about 10° misalignment. 
An adjustment of Manning's n is investigated based on a steady-state uniform flow problem, but the magnitude of the adjustment (about 20%) is on the order of the magnitude of the accepted ranges of friction factors. For usual cases where random streamline trajectory variability within the floodplain flow is greater than a few degrees from perfect alignment, the apparent bias appears to be implicitly included in the Manning's n values. It can be concluded that the array of square elements may be applied over the digital terrain model without respect to topographic flow directions.
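The roughly 20% n adjustment can be put in context with the wide-channel normal-depth form of Manning's equation, h = (n q / S^(1/2))^(3/5): because depth scales as n^(3/5), a 20% change in n shifts computed depth by only about 12%. A minimal sketch with illustrative flow values:

```python
def normal_depth(q, n, S):
    """Wide-channel Manning's equation: q = (1/n) * h**(5/3) * S**0.5,
    solved for depth h (SI units; q is unit discharge in m^2/s)."""
    return (n * q / S ** 0.5) ** 0.6

h1 = normal_depth(q=2.0, n=0.035, S=0.001)        # baseline depth
h2 = normal_depth(q=2.0, n=0.035 * 1.2, S=0.001)  # same flow, 20% larger n
ratio = h2 / h1                                   # = 1.2**0.6, about 1.116
```

This is why a misalignment-induced depth bias of similar size can be absorbed into the accepted range of friction factors.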

  20. OpenSimulator Interoperability with DRDC Simulation Tools: Compatibility Study

    DTIC Science & Technology

    2014-09-01

    into two components: (1) backend data services consisting of user accounts, login service, assets, and inventory; and (2) the simulator server which...components are combined into a single OpenSimulator process. In grid mode, the two components are separated, placing the backend services into a ROBUST... mobile devices. Potential points of compatibility between Unity and OpenSimulator include: a Unity-based desktop computer OpenSimulator viewer; a

  1. Discrete Event Simulation Modeling and Analysis of Key Leader Engagements

    DTIC Science & Technology

    2012-06-01

    to offer. GreenPlayer agents require four parameters, pC, pKLK, pTK, and pRK, which give probabilities for being corrupt, having key leader...HandleMessageRequest component. The same parameter constraints apply to these four parameters. The parameter pRK is the same parameter from the CreatePlayers component...whether the local Green player has resource critical knowledge by using the parameter pRK. It schedules an EndResourceKnowledgeRequest event, passing

  2. Micro-scale pollution mechanism of dust diffusion in a blasting driving face based on CFD-DEM coupled model.

    PubMed

    Yu, Haiming; Cheng, Weimin; Xie, Yao; Peng, Huitian

    2018-05-23

    In order to investigate the diffuse pollution mechanisms of high-concentration dusts in the blasting driving face, an airflow-dust coupled model was constructed based on a CFD-DEM coupled model; the diffusion rules of the dusts with different diameters at microscopic scale were analyzed in combination with the field-measured results. The simulation results demonstrate that single-exhaust ventilation exhibited more favorable dust suppression performance than single-forced ventilation. Under the single-exhaust ventilation condition, the motion trajectories of the dusts with diameters smaller than 20 μm were close to the airflow streamlines and these dusts were mainly distributed near the footway walls; by contrast, under the single-forced ventilation condition, the motion trajectories of the dust particles with a diameter range of 20~40 μm were close to the airflow streamlines, and a large number of dusts with diameters smaller than 20 μm accumulated in the regions 5 m and 17~25 m away from the head-on section. Moreover, under single-exhaust ventilation, the relationship between dust diameter D and negative-pressure-induced dust emission ratio P can be expressed as P = - 25.03ln(D) + 110.39, and the dust emission ratio was up to 74.36% for 7-μm dusts, and the path-dependent settling behaviors of the dusts mainly occurred around the head-on section; under the single-forced ventilation condition, the z value of the dusts with diameters over 20 μm decreased and the dusts with diameters smaller than 7 μm are particularly harmful to human health, but their settling ratios were below 22.36%. Graphical abstract The airflow-dust CFD-DEM coupling model was established. The numerical simulation results were verified. The migration laws of the airflow field were obtained in a blasting driving face. The diffusion laws of dusts were obtained after blasting.
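The fitted diameter-emission relationship under single-exhaust ventilation can be evaluated directly; the sketch below simply plugs diameters into the paper's fitted curve P = -25.03 ln(D) + 110.39 (P in %, D in μm):

```python
import math

def emission_ratio(d_um):
    """Fitted curve from the study: P(%) = -25.03 * ln(D) + 110.39, D in μm."""
    return -25.03 * math.log(d_um) + 110.39

# Emission ratio falls off steeply with particle size under exhaust ventilation.
for d in (7, 10, 20, 40):
    print(f"D = {d:2d} um -> P = {emission_ratio(d):5.1f} %")
```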

  3. Comparative Evaluation of a Four-Implant-Supported Polyetherketoneketone Framework Prosthesis: A Three-Dimensional Finite Element Analysis Based on Cone Beam Computed Tomography and Computer-Aided Design.

    PubMed

    Lee, Ki-Sun; Shin, Sang-Wan; Lee, Sang-Pyo; Kim, Jong-Eun; Kim, Jee-Hwan; Lee, Jeong-Yol

    The purpose of this pilot study was to evaluate and compare polyetherketoneketone (PEKK) with different framework materials for implant-supported prostheses by means of a three-dimensional finite element analysis (3D-FEA) based on cone beam computed tomography (CBCT) and computer-aided design (CAD) data. A geometric model that consisted of four maxillary implants supporting a prosthesis framework was constructed from CBCT and CAD data of a treated patient. Three different materials (zirconia, titanium, and PEKK) were selected, and their material properties were simulated using FEA software in the generated geometric model. In the PEKK framework (ie, low elastic modulus) group, the stress transferred to the implant and simulated adjacent tissue was reduced when compressive stress was dominant, but increased when tensile stress was dominant. This study suggests that the shock-absorbing effects of a resilient implant-supported framework are limited in some areas and that rigid framework material shows a favorable stress distribution and safety of overall components of the prosthesis.

  4. Task management skills and their deficiencies during care delivery in simulated medical emergency situation: A classification.

    PubMed

    Morineau, Thierry; Chapelain, Pascal; Quinio, Philippe

    2016-06-01

    Our objective was to develop the analysis of task management skills by proposing a framework classifying task management stages and deficiencies. Few studies of non-technical skills have detailed the components of task management skills through behavioural markers, despite their central role in care delivery. A post hoc qualitative behavioural analysis was performed on recordings of professional training sessions based upon simulated scenarios. Four sessions in a high-fidelity simulation setting were observed and recorded. Two scenarios were used (cardiac arrest and respiratory failure), with two training sessions per scenario. Four types of task management deficiencies were identified with regard to task constraints: constraint relaxation, unsatisfied constraints, additional constraints and constraint transgression. Both equipment and space constraints were also identified. The lack of prerequisite actions when preparing the environment, corequisite actions for equipment and protocol monitoring, or postrequisite actions to restore the environment was associated with task management deficiencies. Deficiencies in task management behaviours can be identified in simulated as well as actual medical emergency settings. This framework opens perspectives for both training caregivers and designing ergonomic work situations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Finite element techniques for the Navier-Stokes equations in the primitive variable formulation and the vorticity stream-function formulation

    NASA Technical Reports Server (NTRS)

    Glaisner, F.; Tezduyar, T. E.

    1987-01-01

    Finite element procedures for the Navier-Stokes equations in the primitive variable formulation and the vorticity stream-function formulation have been implemented. For both formulations, streamline-upwind/Petrov-Galerkin techniques are used for the discretization of the transport equations. The main problem associated with the vorticity stream-function formulation is the lack of boundary conditions for vorticity at solid surfaces. Here an implicit treatment of the vorticity at no-slip boundaries is incorporated in a predictor-multicorrector time integration scheme. For the primitive variable formulation, mixed finite-element approximations are used. A nine-node element and a four-node + bubble element have been implemented. The latter is shown to exhibit a checkerboard pressure mode and a numerical treatment for this spurious pressure mode is proposed. The two methods are compared from the points of view of simulating internal and external flows and the possibilities of extensions to three dimensions.
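
The streamline-upwind/Petrov-Galerkin technique mentioned above hinges on an element-level stabilization parameter. A minimal sketch of the classical optimal choice for 1D linear elements, tau = h/(2|a|)(coth(Pe) - 1/Pe); this is the textbook Brooks-Hughes form, given here as an assumption standing in for the paper's actual implementation:

```python
import math

def supg_tau(a, kappa, h):
    """Classical SUPG stabilization parameter for 1D linear elements:
    tau = h/(2|a|) * (coth(Pe) - 1/Pe), with element Peclet number
    Pe = |a|*h / (2*kappa), advection speed a, diffusivity kappa, mesh size h."""
    pe = abs(a) * h / (2.0 * kappa)
    if pe < 1e-8:
        # Diffusion-dominated limit: coth(Pe) - 1/Pe -> Pe/3, so tau -> h^2/(12*kappa)
        return h * h / (12.0 * kappa)
    return h / (2.0 * abs(a)) * (1.0 / math.tanh(pe) - 1.0 / pe)
```

In the advection-dominated limit tau approaches h/(2|a|), the amount of streamline diffusion needed to recover nodally exact solutions for the 1D steady problem.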

  6. Numerical Study Comparing RANS and LES Approaches on a Circulation Control Airfoil

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Nishino, Takafumi

    2011-01-01

    A numerical study over a nominally two-dimensional circulation control airfoil is performed using a large-eddy simulation code and two Reynolds-averaged Navier-Stokes codes. Different Coanda jet blowing conditions are investigated. In addition to investigating the influence of grid density, a comparison is made between incompressible and compressible flow solvers. The incompressible equations are found to yield negligible differences from the compressible equations up to at least a jet exit Mach number of 0.64. The effects of different turbulence models are also studied. Models that do not account for streamline curvature effects tend to predict jet separation from the Coanda surface too late, and can produce non-physical solutions at high blowing rates. Three different turbulence models that account for streamline curvature are compared with each other and with large eddy simulation solutions. All three models are found to predict the Coanda jet separation location reasonably well, but one of the models predicts specific flow field details near the Coanda surface prior to separation much better than the other two. All Reynolds-averaged Navier-Stokes computations produce higher circulation than large eddy simulation computations, with different stagnation point location and greater flow acceleration around the nose onto the upper surface. The precise reasons for the higher circulation are not clear, although it is not solely a function of predicting the jet separation location correctly.

  7. Streamlining Software Aspects of Certification: Report on the SSAC Survey

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Dorsey, Cheryl A.; Knight, John C.; Leveson, Nancy G.; McCormick, G. Frank

    1999-01-01

    The aviation system now depends on information technology more than ever before to ensure safety and efficiency. To address concerns about the efficacy of software aspects of the certification process, the Federal Aviation Administration (FAA) began the Streamlining Software Aspects of Certification (SSAC) program. The SSAC technical team was commissioned to gather data, analyze results, and propose recommendations to maximize efficiency and minimize cost and delay, without compromising safety. The technical team conducted two public workshops to identify and prioritize software approval issues, and conducted a survey to validate the most urgent of those issues. The SSAC survey, containing over two hundred questions about the FAA's software approval process, reached over four hundred industry software developers, aircraft manufacturers, and FAA designated engineering representatives. Three hundred people responded. This report presents the SSAC program rationale, survey process, preliminary findings, and recommendations.

  8. Model-based development of a fault signature matrix to improve solid oxide fuel cell systems on-site diagnosis

    NASA Astrophysics Data System (ADS)

    Polverino, Pierpaolo; Pianese, Cesare; Sorrentino, Marco; Marra, Dario

    2015-04-01

    The paper focuses on the design of a procedure for the development of an on-field diagnostic algorithm for solid oxide fuel cell (SOFC) systems. The diagnosis design phase relies on an in-depth analysis of the mutual interactions among all system components, exploiting physical knowledge of the SOFC system as a whole. This phase consists of a Fault Tree Analysis (FTA), which identifies the correlations among possible faults and their corresponding symptoms at the system component level. The main outcome of the FTA is an inferential isolation tool, the Fault Signature Matrix (FSM), which univocally links the faults to the symptoms detected during system monitoring. In this work the FTA is taken as the starting point for developing an improved FSM. Making use of a model-based investigation, a fault-to-symptoms dependency study is performed. To this purpose a dynamic model, previously developed by the authors, is exploited to simulate the system under faulty conditions. Five faults are simulated, one for the stack and four occurring at the balance-of-plant (BOP) level. Moreover, the robustness of the FSM design is increased by exploiting symptom thresholds defined for the investigation of the quantitative effects of the simulated faults on the affected variables.
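
A fault signature matrix of the kind described can be sketched as a boolean table linking faults to thresholded symptoms; the fault and symptom entries below are hypothetical placeholders, not the paper's actual variables:

```python
# Hypothetical fault signature matrix: rows are faults, columns are symptoms.
# A 1 means "this fault produces this symptom" after thresholding.
FSM = {
    "air_leakage":       (1, 0, 1, 0),
    "fuel_leakage":      (0, 1, 1, 0),
    "stack_degradation": (0, 1, 0, 1),
}

# Univocal isolation requires every fault to have a distinct signature:
assert len(set(FSM.values())) == len(FSM)

def isolate(observed):
    """Return the faults whose signature matches the observed symptom vector."""
    return [fault for fault, sig in FSM.items() if sig == tuple(observed)]

print(isolate((0, 1, 0, 1)))
```

The distinctness check above is the formal version of the paper's "univocal" requirement: if two rows were identical, the matching symptom vector could not discriminate between the two faults.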

  9. A full scale hydrodynamic simulation of pyrotechnic combustion

    NASA Astrophysics Data System (ADS)

    Kim, Bohoon; Jang, Seung-Gyo; Yoh, Jack

    2017-06-01

    A full scale hydrodynamic simulation that requires an accurate reproduction of shock-induced detonation was conducted for design of an energetic component system. A series of small scale gap tests and detailed hydrodynamic simulations were used to validate the reactive flow model for predicting the shock propagation in a train configuration and to quantify the shock sensitivity of the energetic materials. The energetic component system is composed of four main components, namely a donor unit (HNS + HMX), a bulkhead (STS), an acceptor explosive (RDX), and a propellant (BKNO3) for gas generation. The pressurized gases generated from the burning propellant were purged into a 10 cc release chamber for study of the inherent oscillatory flow induced by the interferences between shock and rarefaction waves. The pressure fluctuations measured from experiment and calculation were investigated to further validate the peculiar peak at specific characteristic frequency (ωc = 8.3 kHz). In this paper, a step-by-step numerical description of detonation of high explosive components, deflagration of propellant component, and deformation of metal component is given in order to facilitate the proper implementation of the outlined formulation into a shock physics code for a full scale hydrodynamic simulation of the energetic component system.

  10. Modeling of Water Injection into a Vacuum

    NASA Technical Reports Server (NTRS)

    Alred, John W.; Smith, Nicole L.; Wang, K. C.; Lumpkin, Forrest E.; Fitzgerald, Steven M.

    1997-01-01

    A loosely coupled two-phase vacuum water plume model has been developed. This model consists of a source flow model to describe the expansion of water vapor, and the Lagrangian equations of motion for particle trajectories. Gas/Particle interaction is modeled through the drag force induced by the relative velocities. Particles are assumed traveling along streamlines. The equations of motion are integrated to obtain particle velocity along the streamline. This model has been used to predict the mass flux in a 5 meter radius hemispherical domain resulting from the burst of a water jet of 1.5 mm in diameter, mass flow rate of 24.2 g/s, and stagnation pressure of 21.0 psia, which is the nominal Orbiter water dump condition. The result is compared with an empirical water plume model deduced from a video image of the STS-29 water dump. To further improve the model, work has begun to numerically simulate the bubble formation and bursting present in a liquid stream injected into a vacuum. The technique of smoothed particle hydrodynamics was used to formulate this simulation. A status and results of the on-going effort are presented and compared to results from the literature.
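
The gas/particle interaction described above, drag induced by the relative velocity, reduces for Stokes drag to dv/dt = (u - v)/tau along a streamline. A minimal sketch with illustrative numbers of my choosing, not the model's actual parameters:

```python
def particle_velocity(u_gas, v0, tau, dt, steps):
    """Explicit Euler integration of Stokes drag dv/dt = (u - v)/tau:
    the particle relaxes toward the local gas velocity u_gas with
    response time tau."""
    v = v0
    for _ in range(steps):
        v += (u_gas - v) / tau * dt
    return v

# A stationary particle entrained by a 100 m/s gas stream (tau = 1 ms),
# integrated for 10 ms, i.e. ten response times:
v = particle_velocity(u_gas=100.0, v0=0.0, tau=1e-3, dt=1e-5, steps=1000)
```

After several response times the particle velocity is essentially the gas velocity, which is why fine particles in such models effectively travel along streamlines while heavier particles lag.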

  11. Statistical field estimators for multiscale simulations.

    PubMed

    Eapen, Jacob; Li, Ju; Yip, Sidney

    2005-11-01

    We present a systematic approach for generating smooth and accurate fields from particle simulation data using the notions of statistical inference. As an extension to a parametric representation based on the maximum likelihood technique previously developed for velocity and temperature fields, a nonparametric estimator based on the principle of maximum entropy is proposed for particle density and stress fields. Both estimators are applied to represent molecular dynamics data on shear-driven flow in an enclosure which exhibits a high degree of nonlinear characteristics. We show that the present density estimator is a significant improvement over ad hoc bin averaging and is also free of systematic boundary artifacts that appear in the method of smoothing kernel estimates. Similarly, the velocity fields generated by the maximum likelihood estimator do not show any edge effects that can be erroneously interpreted as slip at the wall. For low Reynolds numbers, the velocity fields and streamlines generated by the present estimator are benchmarked against Newtonian continuum calculations. For shear velocities that are a significant fraction of the thermal speed, we observe a form of shear localization that is induced by the confining boundary.

  12. Streamlined Livestock Trailer

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The Bull Nose livestock trailer is one of a line of highway transport vehicles manufactured by American Trailers, Inc. The slant-side front end is a streamlining feature based on a NASA research program which investigated the aerodynamic characteristics of trailer/tractor combinations and suggested ways of reducing air resistance. Application of NASA's aerodynamic research technology to the bull nose design resulted in a 10 percent reduction in air drag, which translates into annual fuel savings of several hundred dollars.

  13. A novel framework for the local extraction of extra-axial cerebrospinal fluid from MR brain images

    NASA Astrophysics Data System (ADS)

    Mostapha, Mahmoud; Shen, Mark D.; Kim, SunHyung; Swanson, Meghan; Collins, D. Louis; Fonov, Vladimir; Gerig, Guido; Piven, Joseph; Styner, Martin A.

    2018-03-01

    The quantification of cerebrospinal fluid (CSF) in the human brain has been shown to play an important role in early postnatal brain development. Extra-axial CSF (EA-CSF), the CSF in the subarachnoid space, is promising for the early detection of children at risk for neurodevelopmental disorders. Currently, though, there is no tool to extract local EA-CSF measurements in a way that is suitable for localized analysis. In this paper, we propose a novel framework for localized, cortical-surface-based analysis of EA-CSF. In our proposed processing, we combine probabilistic brain tissue segmentation, cortical surface reconstruction, and streamline-based local EA-CSF quantification. For streamline computation, we employ the vector field generated by solving a Laplacian partial differential equation (PDE) between the cortical surface and the outer CSF hull. To achieve sub-voxel accuracy while minimizing numerical errors, fourth-order Runge-Kutta (RK4) integration was used to generate the streamlines. Finally, the local EA-CSF is computed by integrating the CSF probability along the generated streamlines. The proposed local EA-CSF extraction tool was used to study early postnatal brain development in typically developing infants. The results show that the proposed localized EA-CSF extraction pipeline can produce statistically significant regions that are not observed with previous global approaches.
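
The RK4 streamline step described above can be sketched as follows; the analytic rotational field used here is a stand-in for the Laplacian-derived vector field of the paper:

```python
def rk4_step(field, p, h):
    """One classical RK4 step of dp/ds = field(p) for a 2D point p = (x, y)."""
    k1 = field(p)
    k2 = field((p[0] + 0.5 * h * k1[0], p[1] + 0.5 * h * k1[1]))
    k3 = field((p[0] + 0.5 * h * k2[0], p[1] + 0.5 * h * k2[1]))
    k4 = field((p[0] + h * k3[0], p[1] + h * k3[1]))
    return (p[0] + h * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6.0,
            p[1] + h * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6.0)

# Stand-in field: rigid rotation, whose exact streamlines are circles.
rot = lambda p: (-p[1], p[0])

p = (1.0, 0.0)
for _ in range(1000):
    p = rk4_step(rot, p, 0.01)
# The integrated point should remain on the unit circle: RK4's fourth-order
# accuracy keeps the radius nearly constant over the whole trajectory.
```

The same stepper, fed the PDE-derived vector field sampled at sub-voxel positions, traces the surface-to-hull streamlines along which the CSF probability is integrated.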

  14. On 3-D inelastic analysis methods for hot section components. Volume 1: Special finite element models

    NASA Technical Reports Server (NTRS)

    Nakazawa, S.

    1988-01-01

    This annual status report presents the results of work performed during the fourth year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes permitting more accurate and efficient 3-D analysis of selected hot section components, i.e., combustor liners, turbine blades and turbine vanes. The computer codes embody a progression of math models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. Volume 1 of this report discusses the special finite element models developed during the fourth year of the contract.

  15. Critical review of the building downwash algorithms in AERMOD.

    PubMed

    Petersen, Ron L; Guerra, Sergio A; Bova, Anthony S

    2017-08-01

    The only documentation on the building downwash algorithm in AERMOD (American Meteorological Society/U.S. Environmental Protection Agency Regulatory Model), referred to as PRIME (Plume Rise Model Enhancements), is found in the 2000 A&WMA journal article by Schulman, Strimaitis and Scire. Recent field and wind tunnel studies have shown that AERMOD can overpredict concentrations by factors of 2 to 8 for certain building configurations. While a wind tunnel equivalent building dimension study (EBD) can be conducted to approximately correct the overprediction bias, past field and wind tunnel studies indicate that there are notable flaws in the PRIME building downwash theory. A detailed review of the theory supported by CFD (Computational Fluid Dynamics) and wind tunnel simulations of flow over simple rectangular buildings revealed the following serious theoretical flaws: enhanced turbulence in the building wake starting at the wrong longitudinal location; constant enhanced turbulence extending up to the wake height; constant initial enhanced turbulence in the building wake (does not vary with roughness or stability); discontinuities in the streamline calculations; and no method to account for streamlined or porous structures. This paper documents theoretical and other problems in PRIME along with CFD simulations and wind tunnel observations that support these findings. Although AERMOD/PRIME may provide accurate and unbiased estimates (within a factor of 2) for some building configurations, a major review and update is needed so that accurate estimates can be obtained for other building configurations where significant overpredictions or underpredictions are common due to downwash effects. This will ensure that regulatory evaluations subject to dispersion modeling requirements can be based on an accurate model. Thus, it is imperative that the downwash theory in PRIME is corrected to improve model performance and ensure that the model better represents reality.

  16. Evolution of finite-amplitude localized vortices in planar homogeneous shear flows

    NASA Astrophysics Data System (ADS)

    Karp, Michael; Shukhman, Ilia G.; Cohen, Jacob

    2017-02-01

    An analytical-based method is utilized to follow the evolution of localized, initially Gaussian disturbances in flows with homogeneous shear, in which the base velocity components are at most linear functions of the coordinates, including hyperbolic, elliptic, and simple shear. Coherent structures, including counter-rotating vortex pairs (CVPs) and hairpin vortices, are formed for the cases where the streamlines of the base flow are open (hyperbolic and simple shear). For hyperbolic base flows, the dominance of shear over rotation leads to elongation of the localized disturbance along the outlet asymptote and formation of CVPs. For simple shear, CVPs are formed from linear and nonlinear disturbances, whereas hairpins are observed only for highly nonlinear disturbances. For elliptic base flows, CVPs, hairpins, and vortex loops form initially; however, they do not last and break into various vortical structures that spread in the spanwise direction. The effect of the disturbance's initial amplitude and orientation is examined, and the optimal orientation achieving maximal growth is identified.

  17. Air flow quality analysis of modenas engine exhaust system

    NASA Astrophysics Data System (ADS)

    Shahriman A., B.; Mohamad Syafiq A., K.; Hashim, M. S. M.; Razlan, Zuradzman M.; Khairunizam W. A., N.; Hazry, D.; Afendi, Mohd; Daud, R.; Rahman, M. D. Tasyrif Abdul; Cheng, E. M.; Zaaba, S. K.

    2017-09-01

    A simulation study was conducted to compare the airflow in the original exhaust system with that in a modified exhaust system. The simulations investigate the distribution of the exhaust-gas flow, which affects engine performance, and predict the back-flow pressure in the original exhaust system. Design modifications to the exhaust port, exhaust pipe, and exhaust muffler were introduced to reduce the back-flow effect: the diameters of the exhaust port and the exhaust pipe were enlarged, and a new muffler design was created. The results show a pulsating flow forming at the original exhaust port, which increases the velocity and causes back pressure. With the new exhaust-port design, the velocity at the valve guide in the exhaust port is lower, and the streamlines of the exhaust flow through the new muffler are smoother than in the original muffler. This demonstrates that the modified exhaust system reduces back pressure and can improve engine performance.

  18. Interfacility Transfers to General Pediatric Floors: A Qualitative Study Exploring the Role of Communication.

    PubMed

    Rosenthal, Jennifer L; Okumura, Megumi J; Hernandez, Lenore; Li, Su-Ting T; Rehm, Roberta S

    2016-01-01

    Children with special health care needs often require health services that are only provided at subspecialty centers. Such children who present to nonspecialty hospitals might require a hospital-to-hospital transfer. When transitioning between medical settings, communication is an integral aspect that can affect the quality of patient care. The objectives of the study were to identify barriers and facilitators to effective interfacility pediatric transfer communication to general pediatric floors from the perspectives of referring and accepting physicians, and then to develop a conceptual model for effective interfacility transfer communication. This was a single-center qualitative study using grounded theory methodology. Referring and accepting physicians of children with special health care needs were interviewed. Four researchers coded the data using ATLAS.ti (version 7, Scientific Software Development GmbH, Berlin, Germany) in a 2-step process of open coding followed by focused coding until no new codes emerged. The research team reached consensus on the final major categories and subsequently developed a conceptual model. Eight referring and 9 accepting physicians were interviewed. Theoretical coding resulted in 3 major categories: streamlined transfer process, quality handoff and 2-way communication, and positive relationships between physicians across facilities. The conceptual model unites these categories and shows how they contribute to effective interfacility transfer communication. Proposed interventions involved standardizing the communication process and incorporating technology such as telemedicine during transfers. Communication is perceived to be an integral component of interfacility transfers. We recommend that transfer systems be re-engineered to make the process more streamlined, to improve the quality of the handoff and 2-way communication, and to facilitate positive relationships between physicians across facilities. Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  19. Generalized Fourier analyses of the advection-diffusion equation - Part I: one-dimensional domains

    NASA Astrophysics Data System (ADS)

    Christon, Mark A.; Martinez, Mario J.; Voth, Thomas E.

    2004-07-01

    This paper presents a detailed multi-methods comparison of the spatial errors associated with finite difference, finite element and finite volume semi-discretizations of the scalar advection-diffusion equation. The errors are reported in terms of non-dimensional phase and group speed, discrete diffusivity, artificial diffusivity, and grid-induced anisotropy. It is demonstrated that Fourier analysis provides an automatic process for separating the discrete advective operator into its symmetric and skew-symmetric components and characterizing the spectral behaviour of each operator. For each of the numerical methods considered, asymptotic truncation error and resolution estimates are presented for the limiting cases of pure advection and pure diffusion. It is demonstrated that streamline upwind Petrov-Galerkin and its control-volume finite element analogue, the streamline upwind control-volume method, produce both an artificial diffusivity and a concomitant phase speed adjustment in addition to the usual semi-discrete artifacts observed in the phase speed, group speed and diffusivity. The Galerkin finite element method and its streamline upwind derivatives are shown to exhibit super-convergent behaviour in terms of phase and group speed when a consistent mass matrix is used in the formulation. In contrast, the CVFEM method and its streamline upwind derivatives yield strictly second-order behaviour. In Part II of this paper, we consider two-dimensional semi-discretizations of the advection-diffusion equation and also assess the effects of grid-induced anisotropy observed in the non-dimensional phase speed, and the discrete and artificial diffusivities. Although this work can only be considered a first step in a comprehensive multi-methods analysis and comparison, it serves to identify some of the relative strengths and weaknesses of multiple numerical methods in a common analysis framework. Published in 2004 by John Wiley & Sons, Ltd.
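
The kind of Fourier (von Neumann) analysis described can be sketched for the simplest case: the semi-discrete, non-dimensional phase speed of second-order central differencing of pure advection, c*/c = sin(kh)/(kh). This 1D central-difference stencil is my illustrative example, not one of the paper's discretizations:

```python
import math

def phase_speed_ratio(kh):
    """Non-dimensional phase speed c*/c of the second-order central-difference
    semi-discretization of u_t + c*u_x = 0, for non-dimensional wavenumber
    kh in (0, pi]."""
    return math.sin(kh) / kh

# Well-resolved waves propagate at nearly the exact speed; marginally resolved
# waves (kh -> pi, two points per wavelength) are badly lagged.
for kh in (0.1, math.pi / 2, math.pi):
    print(f"kh = {kh:.3f}  c*/c = {phase_speed_ratio(kh):.4f}")
```

The same substitution of a Fourier mode into the stencil yields the group speed and discrete diffusivity, which is how the paper separates symmetric (dissipative) from skew-symmetric (dispersive) parts of each operator.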

  20. Single-cell isolation using a DVD optical pickup

    PubMed Central

    Kasukurti, A.; Potcoava, M.; Desai, S.A.; Eggleton, C.; Marr, D. W. M.

    2011-01-01

    A low-cost single-cell isolation system incorporating a digital versatile disc burner (DVD RW) optical pickup has been developed. We show that these readily available modules have the required laser power and focusing optics to provide a steady Gaussian beam capable of optically trapping micron-sized colloids and red blood cells. Utility of the pickup is demonstrated through the non-destructive isolation of such particles in a laminar-flow based microfluidic device that captures and translates single microscale objects across streamlines into designated channel exits. In this, the integrated objective lens focusing coils are used to steer the optical trap across the channel, resulting in the isolation of colloids and red blood cells using a very inexpensive off-the-shelf optical component. PMID:21643294

  1. A hypersonic aeroheating calculation method based on inviscid outer edge of boundary layer parameters

    NASA Astrophysics Data System (ADS)

    Meng, ZhuXuan; Fan, Hu; Peng, Ke; Zhang, WeiHua; Yang, HuiXin

    2016-12-01

    This article presents a rapid and accurate aeroheating calculation method for hypersonic vehicles. Its main innovation is to combine the accuracy of numerical methods with the efficiency of engineering methods, making aeroheating simulation both more precise and faster. Based on Prandtl boundary-layer theory, the flow field is divided at the outer edge of the boundary layer into an inviscid outer region and a viscous inner region. The parameters at the outer edge of the boundary layer are computed numerically under the inviscid-flow assumption. The thermodynamic parameters (constant-volume specific heat, constant-pressure specific heat, and specific heat ratio) are calculated, the streamlines on the vehicle surface are derived, and the heat flux is then obtained. Results for a double cone show that, at 0° and 10° angles of attack, the aeroheating calculation based on inviscid outer-edge boundary-layer parameters reproduces the experimental data better than the engineering method, and the proposed method's results for a flight vehicle agree well with viscous numerical results. The method thus offers a promising way to avoid the high cost of full numerical calculation while improving precision.

  2. USING THE ECLPSS SOFTWARE ENVIRONMENT TO BUILD A SPATIALLY EXPLICIT COMPONENT-BASED MODEL OF OZONE EFFECTS ON FOREST ECOSYSTEMS. (R827958)

    EPA Science Inventory

    We have developed a modeling framework to support grid-based simulation of ecosystems at multiple spatial scales, the Ecological Component Library for Parallel Spatial Simulation (ECLPSS). ECLPSS helps ecologists to build robust spatially explicit simulations of ...

  3. Designing a monitoring program to estimate estuarine survival of anadromous salmon smolts: simulating the effect of sample design on inference

    USGS Publications Warehouse

    Romer, Jeremy D.; Gitelman, Alix I.; Clements, Shaun; Schreck, Carl B.

    2015-01-01

    A number of researchers have attempted to estimate salmonid smolt survival during outmigration through an estuary. However, it is currently unclear how the design of such studies influences the accuracy and precision of survival estimates. In this simulation study we consider four patterns of smolt survival probability in the estuary, and test the performance of several different sampling strategies for estimating estuarine survival assuming perfect detection. The four survival probability patterns each incorporate a systematic component (constant, linearly increasing, increasing and then decreasing, and two pulses) and a random component to reflect daily fluctuations in survival probability. Generally, spreading sampling effort (tagging) across the season resulted in more accurate estimates of survival. All sampling designs in this simulation tended to under-estimate the variation in the survival estimates because seasonal and daily variation in survival probability are not incorporated in the estimation procedure. This under-estimation results in poorer performance of estimates from larger samples. Thus, tagging more fish may not result in better estimates of survival if important components of variation are not accounted for. The results of our simulation incorporate survival probabilities and run distribution data from previous studies to help illustrate the tradeoffs among sampling strategies in terms of the number of tags needed and distribution of tagging effort. This information will assist researchers in developing improved monitoring programs and encourage discussion regarding issues that should be addressed prior to implementation of any telemetry-based monitoring plan. We believe implementation of an effective estuary survival monitoring program will strengthen the robustness of life cycle models used in recovery plans by providing missing data on where and how much mortality occurs in the riverine and estuarine portions of smolt migration. These data could result in better informed management decisions and assist in guidance for more effective estuarine restoration projects.
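
The abstract's central point, that spreading tagging effort across the season captures the seasonal component of survival while a single release pulse does not, can be sketched with a toy Monte Carlo. The survival pattern and all numbers below are illustrative placeholders, not the study's data:

```python
import random

random.seed(42)

SEASON = 100  # days in the outmigration season

def survival_prob(day):
    """Toy seasonal pattern: survival declines linearly through the season,
    plus a small random daily fluctuation (one of the paper's pattern types)."""
    return max(0.0, min(1.0, 0.8 - 0.4 * day / SEASON + random.gauss(0, 0.02)))

def estimate(release_days, fish_per_day):
    """Release tagged fish on the given days and pool survivors into a single
    overall survival estimate, assuming perfect detection."""
    survived = 0
    for d in release_days:
        p = survival_prob(d)  # one draw of that day's survival probability
        survived += sum(random.random() < p for _ in range(fish_per_day))
    return survived / (len(release_days) * fish_per_day)

spread = estimate(range(0, SEASON, 10), 100)  # effort spread over the season
early = estimate([5], 1000)                   # same effort, one early pulse
```

With a declining seasonal pattern, the early pulse overestimates season-wide survival because it never samples the late, low-survival days; the spread design averages over the seasonal trend.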

  4. Aerodynamic Characteristics of a Refined Deep-step Planing-tail Flying-boat Hull with Various Forebody and Afterbody Shapes

    NASA Technical Reports Server (NTRS)

    Riebe, John M; Naeseth, Rodger L

    1952-01-01

    An investigation was made in the Langley 300-mph 7- by 10-foot tunnel to determine the aerodynamic characteristics of a refined deep-step planing-tail hull with various forebody and afterbody shapes and, for comparison, a streamline body simulating the fuselage of a modern transport airplane. The results of the tests indicated that the configurations incorporating a forebody with a length-beam ratio of 7 had lower minimum drag coefficients than the configurations incorporating a forebody with length-beam ratio of 5. The lowest minimum drag coefficients, which were considerably less than that of a conventional hull and slightly less than that of a streamline body, were obtained on the length-beam-ratio-7 forebody, alone and with round center boom. Drag coefficients and longitudinal- and lateral-stability parameters presented include the interference of a 21-percent-thick support wing.

  5. Improving chemical shift encoding‐based water–fat separation based on a detailed consideration of magnetic field contributions

    PubMed Central

    Ruschke, Stefan; Eggers, Holger; Meineke, Jakob; Rummeny, Ernst J.; Karampinos, Dimitrios C.

    2018-01-01

    Purpose To improve the robustness of existing chemical shift encoding‐based water–fat separation methods by incorporating a priori information of the magnetic field distortions in complex‐based water–fat separation. Methods Four major field contributions are considered: inhomogeneities of the scanner magnet, the shim field, an object‐based field map estimate, and a residual field. The former two are completely determined by spherical harmonic expansion coefficients directly available from the magnetic resonance (MR) scanner. The object‐based field map is forward simulated from air–tissue interfaces inside the field of view (FOV). The missing residual field originates from the object outside the FOV and is investigated by magnetic field simulations on a numerical whole body phantom. In vivo the spatially linear first‐order component of the residual field is estimated by measuring echo misalignments after demodulation of other field contributions resulting in a linear residual field. Gradient echo datasets of the cervical and the ankle region without and with shimming were acquired, where all four contributions were incorporated in the water–fat separation with two algorithms from the ISMRM water–fat toolbox and compared to water–fat separation with less incorporated field contributions. Results Incorporating all four field contributions as demodulation steps resulted in reduced temporal and spatial phase wraps leading to almost swap‐free water–fat separation results in all datasets. Conclusion Demodulating estimates of major field contributions reduces the phase evolution to be driven by only small differences in local tissue susceptibility, which supports the field smoothness assumption of existing water–fat separation techniques. PMID:29424458

  6. Amazon Business And GSA Advantage: A Comparative Analysis

    DTIC Science & Technology

    2017-12-01

    training for businesses or a customer-ordering guide; however, the site does offer a help center where businesses and users can submit questions...Electronic Offer FAR Federal Acquisition Regulation FAS Federal Acquisition Service FASA Federal Acquisition Streamlining Act FGO Field Grade Officer...component of GSA Advantage, is an online procurement tool that allows customers to request quotes for (1) commercial supplies and services under

  7. An Overview of Recent Developments in Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Bennett, Robert M.; Edwards, John W.

    2004-01-01

    The motivation for Computational Aeroelasticity (CA) and the elements of one type of the analysis or simulation process are briefly reviewed. The need for streamlining and improving the overall process to reduce elapsed time and improve overall accuracy is discussed. Further effort is needed to establish the credibility of the methodology, obtain experience, and to incorporate the experience base to simplify the method for future use. Experience with the application of a variety of Computational Aeroelasticity programs is summarized for the transonic flutter of two wings, the AGARD 445.6 wing and a typical business jet wing. There is a compelling need for a broad range of additional flutter test cases for further comparisons. Some existing data sets that may offer CA challenges are presented.

  8. A testing program to evaluate the effects of simulant mixed wastes on plastic transportation packaging components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nigrey, P.J.; Dickens, T.G.; Dickman, P.T.

    1997-08-01

    Based on regulatory requirements for Type A and B radioactive material packaging, a Testing Program was developed to evaluate the effects of mixed wastes on plastic materials which could be used as liners and seals in transportation containers. The plastics evaluated in this program were butadiene-acrylonitrile copolymer (Nitrile rubber), cross-linked polyethylene, epichlorohydrin, ethylene-propylene rubber (EPDM), fluorocarbons, high-density polyethylene (HDPE), butyl rubber, polypropylene, polytetrafluoroethylene, and styrene-butadiene rubber (SBR). These plastics were first screened in four simulant mixed wastes. The liner materials were screened using specific gravity measurements and seal materials by vapor transport rate (VTR) measurements. For the screening of liner materials, Kel-F, HDPE, and XLPE were found to offer the greatest resistance to the combination of radiation and chemicals. The tests also indicated that while all seal materials passed exposure to the aqueous simulant mixed waste, EPDM and SBR had the lowest VTRs. In the chlorinated hydrocarbon simulant mixed waste, only Viton passed the screening tests. In both the simulant scintillation fluid mixed waste and the ketone mixture waste, none of the seal materials met the screening criteria. Those materials which passed the screening tests were subjected to further comprehensive testing in each of the simulant wastes. The materials were exposed to four different radiation doses followed by exposure to a simulant mixed waste at three temperatures and four different exposure times (7, 14, 28, 180 days). Materials were tested by measuring specific gravity, dimensions, hardness, stress cracking, VTR, compression set, and tensile properties. The second phase of this Testing Program, comprehensive testing, has been completed for plastic liner materials and is currently in progress for seal materials.

  9. Science and Technology: The Making of the Air Force Research Laboratory

    DTIC Science & Technology

    2000-01-01

    AFRL...Air Force Research Laboratory: Before and After...United States Air Force during my tenure as chief of staff—the creation of the Air Force Research Laboratory (AFRL). As the “high technology” service...consolidate four existing laboratories into one Air Force Research Laboratory (AFRL) designed to lead to a more efficient and streamlined

  10. Impact of Redesigning a Large-Lecture Introductory Earth Science Course to Increase Student Achievement and Streamline Faculty Workload

    ERIC Educational Resources Information Center

    Kapp, Jessica L.; Slater, Timothy F.; Slater, Stephanie J.; Lyons, Daniel J.; Manhart, Kelly; Wehunt, Mary D.; Richardson, Randall M.

    2011-01-01

    A Geological Perspective is a general education survey course for non-science majors at a large southwestern research extensive university. The class has traditionally served 600 students per semester in four 150-student lectures taught by faculty, and accompanied by optional weekly study groups run by graduate teaching assistants. We radically…

  11. Clustered streamlined forms in Athabasca Valles, Mars: Evidence for sediment deposition during floodwater ponding

    USGS Publications Warehouse

    Burr, D.

    2005-01-01

    A unique clustering of layered streamlined forms in Athabasca Valles is hypothesized to reflect a significant hydraulic event. The forms, interpreted as sedimentary, are attributed to extensive sediment deposition during ponding and then streamlining of this sediment behind flow obstacles during ponded water outflow. These streamlined forms are analogous to those found in depositional basins and other loci of ponding in terrestrial catastrophic flood landscapes. These terrestrial streamlined forms can provide the best opportunity for reconstructing the history of the terrestrial flooding. Likewise, the streamlined forms in Athabasca Valles may provide the best opportunity to reconstruct the recent geologic history of this young Martian outflow channel. © 2005 Elsevier B.V. All rights reserved.

  12. Demographic and Component Allee Effects in Southern Lake Superior Gray Wolves

    PubMed Central

    Stenglein, Jennifer L.; Van Deelen, Timothy R.

    2016-01-01

    Recovering populations of carnivores suffering Allee effects risk extinction because positive population growth requires a minimum number of cooperating individuals. Conservationists seldom consider these issues in planning for carnivore recovery because of data limitations, but ignoring Allee effects could lead to overly optimistic predictions for growth and underestimates of extinction risk. We used Bayesian splines to document a demographic Allee effect in the time series of gray wolf (Canis lupus) population counts (1980–2011) in the southern Lake Superior region (SLS, Wisconsin and the upper peninsula of Michigan, USA) in each of four measures of population growth. We estimated that the population crossed the Allee threshold at roughly 20 wolves in four to five packs. Maximum per-capita population growth occurred in the mid-1990s when there were approximately 135 wolves in the SLS population. To infer mechanisms behind the demographic Allee effect, we evaluated a potential component Allee effect using an individual-based spatially explicit model for gray wolves in the SLS region. Our simulations varied the perception neighborhoods for mate-finding and the mean dispersal distances of wolves. Simulation of wolves with long-distance dispersals and reduced perception neighborhoods were most likely to go extinct or experience Allee effects. These phenomena likely restricted population growth in early years of SLS wolf population recovery. PMID:26930665
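
    A strong demographic Allee effect of the kind documented here is often written as a modified logistic model. The sketch below uses that textbook form; only the roughly 20-wolf threshold comes from the abstract, while the growth rate and carrying capacity are illustrative values:

```python
import numpy as np

def per_capita_growth(n, r=0.3, carrying_capacity=700.0, allee_threshold=20.0):
    """Per-capita growth rate dN/(N dt) for a strong Allee effect:
    negative below the Allee threshold, positive between the threshold
    and the carrying capacity K."""
    n = np.asarray(n, dtype=float)
    return r * (1.0 - n / carrying_capacity) * (n / allee_threshold - 1.0)
```

    With these numbers, per-capita growth is negative for a 10-wolf population, zero at the 20-wolf threshold, and positive at the ~135-wolf population where the abstract places maximum growth.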

  13. Demographic and Component Allee Effects in Southern Lake Superior Gray Wolves.

    PubMed

    Stenglein, Jennifer L; Van Deelen, Timothy R

    2016-01-01

    Recovering populations of carnivores suffering Allee effects risk extinction because positive population growth requires a minimum number of cooperating individuals. Conservationists seldom consider these issues in planning for carnivore recovery because of data limitations, but ignoring Allee effects could lead to overly optimistic predictions for growth and underestimates of extinction risk. We used Bayesian splines to document a demographic Allee effect in the time series of gray wolf (Canis lupus) population counts (1980-2011) in the southern Lake Superior region (SLS, Wisconsin and the upper peninsula of Michigan, USA) in each of four measures of population growth. We estimated that the population crossed the Allee threshold at roughly 20 wolves in four to five packs. Maximum per-capita population growth occurred in the mid-1990s when there were approximately 135 wolves in the SLS population. To infer mechanisms behind the demographic Allee effect, we evaluated a potential component Allee effect using an individual-based spatially explicit model for gray wolves in the SLS region. Our simulations varied the perception neighborhoods for mate-finding and the mean dispersal distances of wolves. Simulation of wolves with long-distance dispersals and reduced perception neighborhoods were most likely to go extinct or experience Allee effects. These phenomena likely restricted population growth in early years of SLS wolf population recovery.

  14. Small Engine Technology (SET) - Task 14 Axisymmetric Engine Simulation Environment

    NASA Technical Reports Server (NTRS)

    Miller, Max J.

    1999-01-01

    As part of the NPSS (Numerical Propulsion Simulation System) project, NASA Lewis has a goal of developing a U.S. industry standard for an axisymmetric engine simulation environment. In this program, AlliedSignal Engines (AE) contributed to this goal by evaluating the ENG20 software and developing support tools. ENG20 is a NASA-developed axisymmetric engine simulation tool. The project was divided into six subtasks, summarized below: (1) Evaluate the capabilities of the ENG20 code using an existing test case to see how this procedure can capture the component interactions for a full engine. (2) Link AE's compressor and turbine axisymmetric streamline curvature codes (UD0300M and TAPS) with ENG20, which will provide the necessary boundary conditions for an ENG20 engine simulation. (3) Evaluate GE's Global Data System (GDS) and attempt to use GDS to do the linking of codes described in Subtask 2 above. (4) Use a turbofan engine test case to evaluate various aspects of the system, including the linkage of UD0300M and TAPS with ENG20 and the GE data storage system; also, compare the solution results with cycle deck results, axisymmetric solutions (UD0300M and TAPS), and test data to determine the accuracy of the solution, and evaluate the order of accuracy and the convergence time for the solution. (5) Provide a monthly status report and a final formal report documenting AE's evaluation of ENG20. (6) Provide the developed interfaces that link UD0300M and TAPS with ENG20 to NASA. The interface that links UD0300M with ENG20 will be compatible with the industry version of UD0300M.

  15. Accurate determination of the geoid undulation N

    NASA Astrophysics Data System (ADS)

    Lambrou, E.; Pantazis, G.; Balodimos, D. D.

    2003-04-01

    This work, related to the activities of the CERGOP Study Group "Geodynamics of the Balkan Peninsula," presents a method for the determination of the variation ΔN and, indirectly, of the geoid undulation N with an accuracy of a few millimeters. It is based on the determination of the components ξ, η of the deflection of the vertical using modern geodetic instruments (a digital total station and a GPS receiver). An analysis of the method is given. Accuracy of the order of 0.01 arcsec in the estimated values of the astronomical coordinates Φ and Λ is achieved. The result of applying the proposed method in an area around Athens is presented. In this test application, a system is used which takes advantage of the capabilities of modern geodetic instruments. The GPS receiver permits the determination of the geodetic coordinates in a chosen reference system and, in addition, provides accurate timing information. The astronomical observations are performed with a digital total station with electronic registering of angles and time. The required accuracy of the coordinate values is achieved in about four hours of fieldwork. In addition, the instrumentation is lightweight, easily transportable, and can be set up in the field very quickly. Combined with a streamlined data reduction procedure and the use of up-to-date astrometric data, the values of the components ξ, η of the deflection of the vertical and, eventually, the changes ΔN of the geoid undulation are determined easily and accurately. In conclusion, this work demonstrates that it is quite feasible to create an accurate map of the geoid undulation, especially in areas that present large geoid variations and where other methods cannot give accurate and reliable results.
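
    The link between the measured deflection components ξ, η and the change in geoid undulation ΔN is the classical astrogeodetic levelling formula; the sketch below uses the textbook sign convention and a mean deflection over the baseline, which may differ in detail from the reduction used in this work:

```python
import math

ARCSEC_TO_RAD = math.pi / (180.0 * 3600.0)  # radians per arcsecond

def delta_n(xi_arcsec, eta_arcsec, azimuth_deg, distance_m):
    """Change in geoid undulation over a baseline of given azimuth and
    length, from mean deflection-of-the-vertical components:
        epsilon = xi*cos(alpha) + eta*sin(alpha),   Delta N = -epsilon * s
    """
    alpha = math.radians(azimuth_deg)
    epsilon = (xi_arcsec * math.cos(alpha)
               + eta_arcsec * math.sin(alpha)) * ARCSEC_TO_RAD
    return -epsilon * distance_m
```

    A 1-arcsec mean deflection along a 1 km north-going baseline changes N by about 4.8 mm, which is why 0.01-arcsec astronomical coordinates support millimetre-level ΔN determination.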

  16. Streamlining the Change-Over Protocol for the RPA Mission Intelligence Coordinator by way of Situation Awareness Oriented Design and Discrete Event Simulation

    DTIC Science & Technology

    2012-03-01

    this list adding “out-of-the-loop syndrome”, mode awareness problems, and vigilance decrements to the SA challenges faced by RPA crews. 18...Systems, Man, and Cybernetics, vol. 19, no. 3, May/June. Ouma, J., Chappelle, W., & Salinas, A. (2011) “Faces of occupational burnout among U.S

  17. A Simulation Environment for Aerodynamic Analysis and Design of Flapping Wing Micro Air Vehicles

    DTIC Science & Technology

    2010-01-01

    parametric study involving numerous configurations with multiple flight conditions must be conducted in order to determine the potential "best design...virilis Honey Bee: Apis mellifica Bumble Bee: Bombus terrestris Hummingbird: Archilochus colubris Hawkmoth: Manduca sexta Hummingbird...Streamline Numerics, Inc. SNI-CR

  18. Streamlining Simulation Development using a Commercial Game Engine

    DTIC Science & Technology

    2009-10-01

    few years. The realism is stunning and the Commercial Game Industry fuels the fire of cutting edge advances in hardware and immersive experiences...Technology applies to Military training in more than just the obvious upgrades in game engines and hardware. The increased visual realism and performance...elaborate storytelling and cinematic effects provide a more immersive and compelling experience to the player. The underlying game engine technology

  19. Creation of an active learning healthcare communications course using simulations relevant to pharmacy practice.

    PubMed

    Collier, Izabela A; Baker, David M

    2017-07-01

    The purpose of this project was to design and develop a health care communications course built around practice-like simulations and active learning in the first year of a professional pharmacy program. A three-credit health care communications course was divided into one didactic component (two hours per week) and three simulation components (one hour per week). The simulation components consisted of one written patient education pamphlet, three group presentations, and three one-on-one patient counseling sessions. This was accomplished by breaking the class of approximately 75 students into eight separate sections, each consisting of 8-10 students and one instructor. Each week, four sections were devoted to counseling role-plays: half in the role of pharmacists and half as patients. The other four sections were devoted to hour-long professional group presentations: half in the presenting group and half as the audience. The students' performance in the simulated counseling sessions and group presentations has been tracked and analyzed to determine if the simulated exercises had a positive impact on the students' active communications skills. Consistently, over the first four years of the implementation of the course, students' communications skills, as measured by faculty assessments, in both professional group presentations and one-on-one counseling sessions significantly improved. Incorporation of active-learning simulation exercises into a healthcare communications course has a positive impact on the development of students' communications skills. This creates a foundation upon which students can build over the remainder of the professional program and into their future careers. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. The anatomy of extended limbic pathways in Asperger syndrome: a preliminary diffusion tensor imaging tractography study.

    PubMed

    Pugliese, Luca; Catani, Marco; Ameis, Stephanie; Dell'Acqua, Flavio; Thiebaut de Schotten, Michel; Murphy, Clodagh; Robertson, Dene; Deeley, Quinton; Daly, Eileen; Murphy, Declan G M

    2009-08-15

    It has been suggested that people with autistic spectrum disorder (ASD) have altered development (and connectivity) of limbic circuits. However, direct evidence of anatomical differences specific to white matter pathways underlying social behaviour and emotions in ASD is lacking. We used Diffusion Tensor Imaging Tractography to compare, in vivo, the microstructural integrity and age-related differences in the extended limbic pathways between subjects with Asperger syndrome and healthy controls. Twenty-four males with Asperger syndrome (mean age 23 ± 12 years, age range: 9-54 years) and 42 age-matched male controls (mean age 25 ± 10 years, age range: 9-54 years) were studied. We quantified tract-specific diffusivity measurements as indirect indexes of microstructural integrity (e.g. fractional anisotropy, FA; mean diffusivity, MD) and tract volume (e.g. number of streamlines) of the main limbic tracts. The dissected limbic pathways included the inferior longitudinal fasciculus, inferior frontal occipital fasciculus, uncinate, cingulum and fornix. There were no significant between-group differences in FA and MD. However, compared to healthy controls, individuals with Asperger syndrome had a significantly higher number of streamlines in the right (p=.003) and left (p=.03) cingulum, and in the right (p=.03) and left (p=.04) inferior longitudinal fasciculus. In contrast, people with Asperger syndrome had a significantly lower number of streamlines in the right uncinate (p=.02). Within each group there were significant age-related differences in MD and number of streamlines, but not FA. However, the only significant age-related between-group difference was in mean diffusivity of the left uncinate fasciculus (Z(obs) = 2.05, p = .02). Our preliminary findings suggest that people with Asperger syndrome have significant differences in the anatomy, and maturation, of some (but not all) limbic tracts.

  1. Removal of furan and phenolic compounds from simulated biomass hydrolysates by batch adsorption and continuous fixed-bed column adsorption methods.

    PubMed

    Lee, Sang Cheol; Park, Sunkyu

    2016-09-01

    It has been proposed to remove all potential inhibitors and sulfuric acid in biomass hydrolysates generated from dilute-acid pretreatment of biomass, based on three steps of sugar purification process. This study focused on its first step in which furan and phenolic compounds were selectively removed from the simulated hydrolysates using activated charcoal. Batch adsorption experiments demonstrated that the affinity of activated charcoal for each component was highest in the order of vanillic acid, 4-hydroxybenzoic acid, furfural, acetic acid, sulfuric acid, and xylose. The affinity of activated charcoal for furan and phenolic compounds proved to be significantly higher than that of the other three components. Four separation strategies were conducted with a combination of batch adsorption and continuous fixed-bed column adsorption methods. It was observed that xylose loss was negligible with near complete removal of furan and phenolic compounds, when at least one fixed-bed column adsorption was implemented in the strategy. Copyright © 2016 Elsevier Ltd. All rights reserved.
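
    The batch adsorption experiments behind these affinity rankings reduce to a simple mass balance. A sketch with hypothetical numbers (the function names and example concentrations are illustrative, not values from the paper):

```python
def adsorption_uptake(c0_mg_l, ce_mg_l, volume_l, adsorbent_g):
    """Equilibrium uptake q (mg adsorbate per g activated charcoal) from a
    batch run, via the standard mass balance q = (C0 - Ce) * V / m."""
    return (c0_mg_l - ce_mg_l) * volume_l / adsorbent_g

def removal_percent(c0_mg_l, ce_mg_l):
    """Fraction of a component removed at equilibrium, in percent."""
    return 100.0 * (c0_mg_l - ce_mg_l) / c0_mg_l
```

    Computing q and the removal percentage per component is how the relative affinity of the charcoal for vanillic acid, furfural, xylose, and the other solutes would be ranked from batch data.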

  2. Universal immunogenicity validation and assessment during early biotherapeutic development to support a green laboratory.

    PubMed

    Bautista, Ami C; Zhou, Lei; Jawa, Vibha

    2013-10-01

    Immunogenicity support during nonclinical biotherapeutic development can be resource intensive if supported by conventional methodologies. A universal indirect species-specific immunoassay can eliminate the need for biotherapeutic-specific anti-drug antibody immunoassays without compromising quality. By implementing the R's of sustainability (reduce, reuse, rethink), conservation of resources and greener laboratory practices were achieved in this study. Statistical analysis across four biotherapeutics supported identification of consistent product performance standards (cut points, sensitivity and reference limits) and a streamlined universal anti-drug antibody immunoassay method implementation strategy. We propose an efficient, fit-for-purpose, scientifically and statistically supported nonclinical immunogenicity assessment strategy. Utilization of a universal method and streamlined validation, while retaining comparability to conventional immunoassays and meeting the industry recommended standards, provides environmental credits in the scientific laboratory. Collectively, individual reductions in critical material consumption, energy usage, waste and non-environment friendly consumables, such as plastic and paper, support a greener laboratory environment.

  3. Study on the variable cycle engine modeling techniques based on the component method

    NASA Astrophysics Data System (ADS)

    Zhang, Lihua; Xue, Hui; Bao, Yuhai; Li, Jijun; Yan, Lan

    2016-01-01

    Based on the structural platform of the gas turbine engine, the components of a variable cycle engine were simulated using the component method. The mathematical model of nonlinear equations corresponding to each component of the gas turbine engine was established. Based on Matlab programming, the nonlinear equations were solved using a Newton-Raphson steady-state algorithm, and the performance of the engine components was calculated. The numerical simulation results showed that the model built can describe the basic performance of the gas turbine engine, which verified the validity of the model.
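
    The steady-state matching step (Newton-Raphson on the component residual equations) can be sketched generically with a finite-difference Jacobian. Shown here in Python rather than Matlab, on a toy residual rather than an engine model:

```python
import numpy as np

def newton_raphson(residual, x0, tol=1e-9, max_iter=50, h=1e-7):
    """Solve residual(x) = 0 by Newton-Raphson with a finite-difference
    Jacobian, as in component-matching steady-state engine models."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        f = np.asarray(residual(x))
        if np.linalg.norm(f) < tol:
            return x
        n = x.size
        jac = np.empty((n, n))
        for j in range(n):
            xp = x.copy()
            xp[j] += h                      # perturb one unknown at a time
            jac[:, j] = (np.asarray(residual(xp)) - f) / h
        x = x - np.linalg.solve(jac, f)     # Newton update
    return x
```

    In an engine model the unknowns would be component matching variables (e.g. shaft speeds and mass flows) and the residuals the continuity and work-balance mismatches between components.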

  4. Control volume analyses of glottal flow using a fully-coupled numerical fluid-structure interaction model

    NASA Astrophysics Data System (ADS)

    Yang, Jubiao; Krane, Michael; Zhang, Lucy

    2013-11-01

    Vocal fold vibrations and the glottal jet are successfully simulated using the modified Immersed Finite Element method (mIFEM), a fully coupled dynamics approach to model fluid-structure interactions. A self-sustained and steady vocal fold vibration is captured given a constant pressure input at the glottal entrance. The flow rates at different axial locations in the glottis are calculated, showing small variations among them due to the vocal fold motion and deformation. To further facilitate the understanding of the phonation process, two control volume analyses, specifically with Bernoulli's equation and Newton's second law, are carried out for the glottal flow based on the simulation results. A generalized Bernoulli's equation is derived to interpret the correlations between the velocity and pressure, temporally and spatially, along the centerline, which is a streamline in a half-space model with a symmetry boundary condition. A specialized Newton's second law equation is developed and divided into terms to help understand the driving mechanism of the glottal flow.
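
    For reference, the unsteady (generalized) Bernoulli relation along a streamline for incompressible, inviscid flow, neglecting gravity, is the textbook form that such a derivation generalizes:

```latex
\int_{1}^{2} \rho\,\frac{\partial u}{\partial t}\,\mathrm{d}s
\;+\; \Bigl(p_2 + \tfrac{1}{2}\rho u_2^{2}\Bigr)
\;-\; \Bigl(p_1 + \tfrac{1}{2}\rho u_1^{2}\Bigr) \;=\; 0 ,
```

    where the integral of the local acceleration along the streamline is the unsteady term that couples the temporal and spatial velocity-pressure correlations along the glottal centerline; the paper's own derivation may carry additional terms.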

  5. Progress report on daily flow-routing simulation for the Carson River, California and Nevada

    USGS Publications Warehouse

    Hess, G.W.

    1996-01-01

    A physically based flow-routing model using Hydrological Simulation Program-FORTRAN (HSPF) was constructed for modeling streamflow in the Carson River at daily time intervals as part of the Truckee-Carson Program of the U.S. Geological Survey (USGS). Daily streamflow data for water years 1978-92 for the mainstem river, tributaries, and irrigation ditches from the East Fork Carson River near Markleeville and West Fork Carson River at Woodfords down to the mainstem Carson River at Fort Churchill upstream from Lahontan Reservoir were obtained from several agencies and were compiled into a comprehensive data base. No previous physically based flow-routing model of the Carson River has incorporated multi-agency streamflow data into a single data base and simulated flow at a daily time interval. Where streamflow data were unavailable or incomplete, hydrologic techniques were used to estimate some flows. For modeling purposes, the Carson River was divided into six segments, which correspond to those used in the Alpine Decree that governs water rights along the river. Hydraulic characteristics were defined for 48 individual stream reaches based on cross-sectional survey data obtained from field surveys and previous studies. Simulation results from the model were compared with available observed and estimated streamflow data. Model testing demonstrated that hydraulic characteristics of the Carson River are adequately represented in the models for a range of flow regimes. Differences between simulated and observed streamflow result mostly from inadequate data characterizing inflow and outflow from the river. Because irrigation return flows are largely unknown, irrigation return flow percentages were used as a calibration parameter to minimize differences between observed and simulated streamflows. 
Observed and simulated streamflow were compared for daily periods for the full modeled length of the Carson River and for two major subreaches modeled with more detailed input data. Hydrographs and statistics presented in this report describe these differences. A sensitivity analysis of four estimated components of the hydrologic system evaluated which components were significant in the model. Estimated ungaged tributary streamflow is not a significant component of the model during low runoff, but is significant during high runoff. The sensitivity analysis indicates that changes in the estimated irrigation diversion and estimated return flow create a noticeable change in the statistics. The modeling for this study is preliminary. Results of the model are constrained by current availability and accuracy of observed hydrologic data. Several inflows and outflows of the Carson River are not described by time-series data and therefore are not represented in the model.
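
    Using the return-flow percentage as a calibration parameter, as described above, amounts to a one-dimensional fit. A minimal sketch (the linear return-flow model, names, and grid search are illustrative, not the HSPF formulation):

```python
import numpy as np

def calibrate_return_flow(observed, base_sim, diversions, fractions):
    """Pick the irrigation return-flow fraction minimizing the RMSE between
    simulated and observed daily flows. base_sim is the simulated flow with
    no return flow; a fraction of the diverted flow is assumed to return."""
    observed = np.asarray(observed, dtype=float)
    base_sim = np.asarray(base_sim, dtype=float)
    diversions = np.asarray(diversions, dtype=float)
    return min(
        fractions,
        key=lambda f: np.sqrt(np.mean((base_sim + f * diversions - observed) ** 2)),
    )
```

    In practice the candidate fractions would be varied per model segment and the comparison run over the full 1978-92 daily record.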

  6. A Facility and Architecture for Autonomy Research

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Autonomy is a key enabling factor in the advancement of remote robotic exploration. There is currently a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The Mission Simulation Facility (MSF) will bridge this gap by providing a simulation framework and suite of simulation tools to support research in autonomy for remote exploration. This system will allow developers of autonomy software to test their models in a high-fidelity simulation and evaluate their system's performance against a set of integrated, standardized simulations. The Mission Simulation ToolKit (MST) uses a distributed architecture with a communication layer that is built on top of the standardized High Level Architecture (HLA). This architecture enables the use of existing high-fidelity models, allows mixing simulation components from various computing platforms, and enforces the use of a standardized high-level interface among components. The components needed to achieve a realistic simulation can be grouped into four categories: environment generation (terrain, environmental features), robotic platform behavior (robot dynamics), instrument models (camera/spectrometer/etc.), and data analysis. The MST will provide basic components in these areas but allows users to easily plug in any refined model by means of a communication protocol. Finally, a description file defines the robot and environment parameters for easy configuration and ensures that all the simulation models share the same information.

  7. Study on flow over finite wing with respect to F-22 raptor, Supermarine Spitfire, F-7 BG aircraft wing and analyze its stability performance and experimental values

    NASA Astrophysics Data System (ADS)

    Ali, Md. Nesar; Alam, Mahbubul

    2017-06-01

    A finite wing is a three-dimensional body, and consequently the flow over the finite wing is three-dimensional; that is, there is a component of flow in the span wise direction. The physical mechanism for generating lift on the wing is the existence of a high pressure on the bottom surface and a low pressure on the top surface. The net imbalance of the pressure distribution creates the lift. As a by-product of this pressure imbalance, the flow near the wing tips tends to curl around the tips, being forced from the high-pressure region just underneath the tips to the low-pressure region on top. This flow around the wing tips is shown in the front view of the wing. As a result, on the top surface of the wing, there is generally a span wise component of flow from the tip toward the wing root, causing the streamlines over the top surface to bend toward the root. On the bottom surface of the wing, there is generally a span wise component of flow from the root toward the tip, causing the streamlines over the bottom surface to bend toward the tip. Clearly, the flow over the finite wing is three-dimensional, and therefore we would expect the overall aerodynamic properties of such a wing to differ from those of its airfoil sections. The tendency for the flow to "leak" around the wing tips has another important effect on the aerodynamics of the wing. This flow establishes a circulatory motion that trails downstream of the wing; that is, a trailing vortex is created at each wing tip. The aerodynamics of finite wings is analyzed using the classical lifting line model. This simple model allows a closed-form solution that captures most of the physical effects applicable to finite wings. The model is based on the horseshoe-shaped vortex that introduces the concept of a vortex wake and wing tip vortices. The downwash induced by the wake creates an induced drag that did not exist in the two-dimensional analysis. 
Furthermore, as wingspan is reduced, the wing lift slope decreases, and the induced drag increases, reducing overall efficiency. To complement the high aspect ratio wing case, a slender wing model is formulated so that the lift and drag can be estimated for this limiting case as well. We analyze the stability performance of the F-22 Raptor, Supermarine Spitfire, and F-7 BG aircraft wings using both an experimental method and simulation software. The experimental method includes fabrication of model wings of the F-22 Raptor, Supermarine Spitfire, and F-7 BG aircraft from Gamahr wood. These model wings were tested in a wind tunnel, and the resulting data were compared with values from the analysis software for further experimentation.
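
    The lifting-line trends quoted above (lift slope falling and induced drag rising as aspect ratio shrinks) follow from two standard formulas; in this sketch the span-efficiency factor e = 0.9 is an assumed typical value:

```python
import math

def induced_drag_coeff(cl, aspect_ratio, e=0.9):
    """Induced drag from classical lifting-line theory:
    C_Di = CL^2 / (pi * e * AR)."""
    return cl ** 2 / (math.pi * e * aspect_ratio)

def finite_wing_lift_slope(a0=2 * math.pi, aspect_ratio=8.0, e=0.9):
    """Finite-wing lift-curve slope (per radian), reduced from the
    two-dimensional airfoil value a0 by the downwash of the trailing
    vortex wake."""
    return a0 / (1.0 + a0 / (math.pi * e * aspect_ratio))
```

    Halving the aspect ratio at fixed lift coefficient doubles the induced drag, and the finite-wing lift slope is always below the 2-D value of about 2π per radian.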

  8. A novel capacitive absolute positioning sensor based on time grating with nanometer resolution

    NASA Astrophysics Data System (ADS)

    Pu, Hongji; Liu, Hongzhong; Liu, Xiaokang; Peng, Kai; Yu, Zhicheng

    2018-05-01

    The present work proposes a novel capacitive absolute positioning sensor based on time grating. The sensor includes a fine incremental-displacement measurement component combined with a coarse absolute-position measurement component to obtain high-resolution absolute positioning measurements. A single row type sensor was proposed to achieve fine displacement measurement, which combines the two electrode rows of a previously proposed double-row type capacitive displacement sensor based on time grating into a single row. To achieve absolute positioning measurement, the coarse measurement component is designed as a single-row type displacement sensor employing a single spatial period over the entire measurement range. In addition, this component employs a rectangular induction electrode and four groups of orthogonal discrete excitation electrodes with half-sinusoidal envelope shapes, which were formed by alternately extending the rectangular electrodes of the fine measurement component. The fine and coarse measurement components are tightly integrated to form a compact absolute positioning sensor. A prototype sensor was manufactured using printed circuit board technology for testing and optimization of the design in conjunction with simulations. Experimental results show that the prototype sensor achieves a ±300 nm measurement accuracy with a 1 nm resolution over a displacement range of 200 mm when employing error compensation. The proposed sensor is an excellent alternative to presently available long-range absolute nanometrology sensors owing to its low cost, simple structure, and ease of manufacturing.
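
    The coarse/fine combination works like a vernier: the coarse absolute channel only has to identify which fine period the reading falls in, and the fine channel supplies the high-resolution remainder. A sketch (the pitch value and function names are assumptions, not the sensor's actual parameters):

```python
def absolute_position(coarse_m, fine_phase_frac, fine_pitch_m=0.001):
    """Combine a coarse absolute reading with a fine incremental reading.

    coarse_m        : coarse position over the full range (low accuracy)
    fine_phase_frac : fine-channel reading as a fraction [0, 1) of one pitch
    fine_pitch_m    : spatial period of the fine channel (assumed 1 mm)

    The coarse reading only needs to be accurate to within half a fine
    pitch so that the correct period index k is recovered.
    """
    k = round(coarse_m / fine_pitch_m - fine_phase_frac)
    return (k + fine_phase_frac) * fine_pitch_m
```

    This is why a modest-accuracy coarse channel plus a nanometre-resolution fine channel yields absolute positioning at the fine channel's resolution.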

  9. A Numerical Investigation of the Startup Transient in a Wave Rotor

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.

    1996-01-01

The startup process is investigated for a hypothetical four-port wave rotor, envisioned as a topping cycle for a small gas turbine engine. The investigation is conducted numerically using a multi-passage, one-dimensional CFD-based wave rotor simulation in combination with lumped-volume models for the combustor, exhaust valve plenum, and rotor center cavity components. The simulation is described, and several startup transients are presented which illustrate potential difficulties for the specific cycle design investigated. In particular, it is observed that, prior to combustor light-off or just after it, the flow through the combustor loop is reversed from the design direction. The phenomenon is demonstrated, and several possible modification techniques that avoid or overcome the problem are discussed.

  10. Seakeeping with the semi-Lagrangian particle finite element method

    NASA Astrophysics Data System (ADS)

    Nadukandi, Prashanth; Servan-Camas, Borja; Becker, Pablo Agustín; Garcia-Espinosa, Julio

    2017-07-01

    The application of the semi-Lagrangian particle finite element method (SL-PFEM) for the seakeeping simulation of the wave adaptive modular vehicle under spray generating conditions is presented. The time integration of the Lagrangian advection is done using the explicit integration of the velocity and acceleration along the streamlines (X-IVAS). Despite the suitability of the SL-PFEM for the considered seakeeping application, small time steps were needed in the X-IVAS scheme to control the solution accuracy. A preliminary proposal to overcome this limitation of the X-IVAS scheme for seakeeping simulations is presented.

  11. Subgrid or Reynolds stress-modeling for three-dimensional turbulence computations

    NASA Technical Reports Server (NTRS)

    Rubesin, M. W.

    1975-01-01

    A review is given of recent advances in two distinct computational methods for evaluating turbulence fields, namely, statistical Reynolds stress modeling and turbulence simulation, where large eddies are followed in time. It is shown that evaluation of the mean Reynolds stresses, rather than use of a scalar eddy viscosity, permits an explanation of streamline curvature effects found in several experiments. Turbulence simulation, with a new volume averaging technique and third-order accurate finite-difference computing is shown to predict the decay of isotropic turbulence in incompressible flow with rather modest computer storage requirements, even at Reynolds numbers of aerodynamic interest.

  12. Experimental and analytical investigation on metal damage suffered from simulated lightning currents

    NASA Astrophysics Data System (ADS)

    Yakun, LIU; Zhengcai, FU; Quanzhen, LIU; Baoquan, LIU; Anirban, GUHA

    2017-12-01

The damage of two typical metal materials, Al alloy 3003 and steel alloy Q235B, subjected to four representative lightning current components is investigated by laboratory and analytical studies to provide fundamental data for lightning protection. The four lightning components simulating natural lightning consist of the first return stroke, the continuing current of the interval stroke, the long continuing current, and the subsequent stroke, with amplitudes of 200 kA, 8 kA, 400 A, and 100 kA, respectively. The damage depth and area produced by the different lightning components are measured with an ultrasonic scanning system, and the temperature rise with a thermal imaging camera. The results show that, for both Al 3003 and steel Q235B, the first return stroke component produces the largest damage area, with a maximum damage depth of 0.02 mm. The long continuing current component leads to the deepest damage, 3.3 mm for Al 3003, and a much higher temperature rise than the other components. The correlation analysis between the damage results and the lightning parameters indicates that the damage depth is positively correlated with charge transfer, the damage area is mainly determined by the current amplitude, and the temperature rise increases linearly with increasing charge transfer.

  13. Control software for two dimensional airfoil tests using a self-streamlining flexible walled transonic test section

    NASA Technical Reports Server (NTRS)

    Wolf, S. W. D.; Goodyer, M. J.

    1982-01-01

Operation of the Transonic Self-Streamlining Wind Tunnel (TSWT) involved on-line data acquisition with automatic wall adjustment. A tunnel run consisted of streamlining the walls from known starting contours in iterative steps and acquiring model data. Each run performed what is described as a streamlining cycle. The associated software is presented.

  14. Population Representation in the Military Services: Fiscal Year 2002

    DTIC Science & Technology

    2004-03-01

    a more streamlined force. Although much progress has been achieved with regard to gender equity, much work remains. The representation of women has...primary reason for the difference by gender is lower retention rates among enlisted women. Chapter 4 ACTIVE COMPONENT OFFICERS The...for women. The occupational differences by gender are illustrated in Table 3.8. In FY 2002, almost half of enlisted women were in functional

  15. Adaptive Planning: Understanding Organizational Workload to Capability/ Capacity through Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Hase, Chris

    2010-01-01

    In August 2003, the Secretary of Defense (SECDEF) established the Adaptive Planning (AP) initiative [1] with the objectives of reducing the time necessary to develop and revise Combatant Commander (COCOM) contingency plans and increasing SECDEF plan visibility. In addition to reducing the traditional plan development timeline from twenty-four months to less than twelve months (with a goal of six months) [2], AP increased plan visibility to Department of Defense (DoD) leadership through In-Progress Reviews (IPRs). The IPR process, as well as the increased number of campaign and contingency plans COCOMs had to develop, increased the workload while the number of planners remained fixed. Several efforts, from collaborative planning tools to streamlined processes, were initiated to compensate for the increased workload, enabling COCOMs to better meet shorter planning timelines. This paper examines the Joint Strategic Capabilities Plan (JSCP) directed contingency planning and staffing requirements assigned to a combatant commander staff through the lens of modeling and simulation. The dynamics of developing a COCOM plan are captured with an ExtendSim [3] simulation. The resulting analysis provides a quantifiable means by which to measure a combatant commander staff's workload, associated with developing and staffing JSCP [4] directed contingency plans, against COCOM capability/capacity. Modeling and simulation offer significant opportunities for measuring the sensitivity of key variables in the assessment of workload against capability/capacity. Understanding the relationships among plan complexity, the number of plans, planning processes, the number of planners, and the time required for plan development provides valuable information to DoD leadership. Through modeling and simulation, AP leadership can gain greater insight for key decisions on where best to allocate scarce resources in an effort to meet DoD planning objectives.

  16. The CMIP6 Sea-Ice Model Intercomparison Project (SIMIP): Understanding sea ice through climate-model simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Notz, Dirk; Jahn, Alexandra; Holland, Marika

    A better understanding of the role of sea ice for the changing climate of our planet is the central aim of the diagnostic Coupled Model Intercomparison Project 6 (CMIP6)-endorsed Sea-Ice Model Intercomparison Project (SIMIP). To reach this aim, SIMIP requests sea-ice-related variables from climate-model simulations that allow for a better understanding and, ultimately, improvement of biases and errors in sea-ice simulations with large-scale climate models. This then allows us to better understand to what degree CMIP6 model simulations relate to reality, thus improving our confidence in answering sea-ice-related questions based on these simulations. Furthermore, the SIMIP protocol provides a standard for sea-ice model output that will streamline and hence simplify the analysis of the simulated sea-ice evolution in research projects independent of CMIP. To reach its aims, SIMIP provides a structured list of model output that allows for an examination of the three main budgets that govern the evolution of sea ice, namely the heat budget, the momentum budget, and the mass budget. Furthermore, we explain the aims of SIMIP in more detail and outline how its design allows us to answer some of the most pressing questions that sea ice still poses to the international climate-research community.

  17. The CMIP6 Sea-Ice Model Intercomparison Project (SIMIP): Understanding sea ice through climate-model simulations

    DOE PAGES

    Notz, Dirk; Jahn, Alexandra; Holland, Marika; ...

    2016-09-23

    A better understanding of the role of sea ice for the changing climate of our planet is the central aim of the diagnostic Coupled Model Intercomparison Project 6 (CMIP6)-endorsed Sea-Ice Model Intercomparison Project (SIMIP). To reach this aim, SIMIP requests sea-ice-related variables from climate-model simulations that allow for a better understanding and, ultimately, improvement of biases and errors in sea-ice simulations with large-scale climate models. This then allows us to better understand to what degree CMIP6 model simulations relate to reality, thus improving our confidence in answering sea-ice-related questions based on these simulations. Furthermore, the SIMIP protocol provides a standard for sea-ice model output that will streamline and hence simplify the analysis of the simulated sea-ice evolution in research projects independent of CMIP. To reach its aims, SIMIP provides a structured list of model output that allows for an examination of the three main budgets that govern the evolution of sea ice, namely the heat budget, the momentum budget, and the mass budget. Furthermore, we explain the aims of SIMIP in more detail and outline how its design allows us to answer some of the most pressing questions that sea ice still poses to the international climate-research community.

  18. Surface flow and heating distributions on a cylinder in near wake of Aeroassist Flight Experiment (AFE) configuration at incidence in Mach 10 Air

    NASA Technical Reports Server (NTRS)

    Wells, William L.

    1990-01-01

    Experimental heat transfer distributions and surface streamline directions are presented for a cylinder in the near wake of the Aeroassist Flight Experiment forebody configuration. Tests were conducted in air at a nominal free stream Mach number of 10, with post-shock Reynolds numbers based on model base height of 6,450 to 50,770, and angles of attack of 5, 0, -5, and -10 degrees. Heat transfer data were obtained with thin-film resistance gages, and surface streamline directions were obtained by the oil-flow technique. Comparisons between measured and predicted values were made using a Navier-Stokes computer code.

  19. Streamlined Genome Sequence Compression using Distributed Source Coding

    PubMed Central

    Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel

    2014-01-01

    We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require a heavyweight client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity need at the client side. Based on the variation between source and reference, our protocol adaptively picks either syndrome coding or hash coding to compress subsequences of varying code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552
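The adaptive mode selection can be illustrated with a toy sketch (not the paper's protocol; names and thresholds are hypothetical): blocks close to the reference are encoded as explicit mismatch positions, standing in for syndrome bits, while divergent blocks fall back to a short hash that the decoder checks against candidate reconstructions from its side information.

```python
import hashlib

def encode_block(src, ref, max_mismatch=2):
    """Toy mode selection for reference-based compression.

    If src is close to ref (few substitutions), emit the mismatch positions
    ('syndrome-like' correction data); otherwise emit a short hash so the
    decoder can verify a candidate reconstruction from side information.
    """
    assert len(src) == len(ref)
    diffs = [(i, c) for i, (c, r) in enumerate(zip(src, ref)) if c != r]
    if len(diffs) <= max_mismatch:
        return ("syndrome", diffs)
    return ("hash", hashlib.sha1(src.encode()).hexdigest()[:8])

def decode_syndrome(ref, diffs):
    """Reconstruct the source by patching the reference at mismatch positions."""
    seq = list(ref)
    for i, c in diffs:
        seq[i] = c
    return "".join(seq)
```

The point of the asymmetry is that the client's work is trivial in both modes; the decoder carries the computational burden, matching the low-complexity encoder requirement the abstract describes.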

  20. Fully automatic guidance and control for rotorcraft nap-of-the-Earth flight following planned profiles. Volume 1: Real-time piloted simulation

    NASA Technical Reports Server (NTRS)

    Clement, Warren F.; Gorder, Peter J.; Jewell, Wayne F.

    1991-01-01

    Developing a single-pilot, all-weather nap-of-the-earth (NOE) capability requires fully automatic NOE (ANOE) navigation and flight control. Innovative guidance and control concepts are investigated in a four-fold research effort that: (1) organizes the on-board computer-based storage and real-time updating of NOE terrain profiles and obstacles in course-oriented coordinates indexed to the mission flight plan; (2) defines a class of automatic anticipative pursuit guidance algorithms and necessary data preview requirements to follow the vertical, lateral, and longitudinal guidance commands dictated by the updated flight profiles; (3) automates a decision-making process for unexpected obstacle avoidance; and (4) provides several rapid response maneuvers. Acquired knowledge from the sensed environment is correlated with prior knowledge of the recorded environment (terrain, cultural features, threats, and targets), which is then used to determine an appropriate evasive maneuver if a nonconformity between the sensed and recorded environments is observed. This four-fold research effort was evaluated in both fixed-base and moving-base real-time piloted simulations, thereby providing a practical demonstration for evaluating pilot acceptance of the automated concepts, supervisory override, manual operation, and re-engagement of the automatic system. Volume one describes the major components of the guidance and control laws as well as the results of the piloted simulations. Volume two describes the complete mathematical model of the fully automatic guidance system for rotorcraft NOE flight following planned flight profiles.

  1. Planar Superconducting Millimeter-Wave/Terahertz Channelizing Filter

    NASA Technical Reports Server (NTRS)

    Ehsan, Negar; U-yen, Kongpop; Brown, Ari; Hsieh, Wen-Ting; Wollack, Edward; Moseley, Samuel

    2013-01-01

    This innovation is a compact, superconducting, channelizing bandpass filter on a single-crystal (0.45-μm-thick) silicon substrate, which operates from 300 to 600 GHz. This device consists of four channels with center frequencies of 310, 380, 460, and 550 GHz, with approximately 50-GHz bandwidth per channel. The filter concept is inspired by the mammalian cochlea, which is a channelizing filter that covers three decades of bandwidth and 3,000 channels in a very small physical space. By using a simplified physical cochlear model, and its electrical analog of a channelizing filter covering multiple octaves of bandwidth, a large number of output channels with high inter-channel isolation and high-order upper stopband response can be designed. A channelizing filter is a critical component used in spectrometer instruments that measure the intensity of light at various frequencies. This embodiment was designed for MicroSpec in order to increase the resolution of the instrument (with four channels, the resolution will be increased by a factor of four). MicroSpec is a revolutionary wafer-scale spectrometer that is intended for the SPICA (Space Infrared Telescope for Cosmology and Astrophysics) Mission. In addition to being a vital component of MicroSpec, the channelizing filter itself is a low-resolution spectrometer when integrated with only an antenna at its input, and a detector at each channel's output. During the design process for this filter, the available characteristic impedances, possible lumped element ranges, and fabrication tolerances were identified for design on a very thin silicon substrate. Iterations between full-wave and lumped-element circuit simulations were performed. Each channel's circuit was designed based on the availability of characteristic impedances and lumped element ranges. This design was based on a tabular type bandpass filter with no spurious harmonic response. Extensive electromagnetic modeling for each channel was performed.
Four channels, each with 50-GHz bandwidth, were designed using multiple transmission line media such as microstrip, coplanar waveguide, and quasi-lumped components on 0.45-μm-thick silicon. In the design process, modeling issues had to be overcome. Due to the extremely high frequencies, very thin Si substrate, and the superconducting metal layers, most commercially available software fails in various ways. These issues were mitigated by using alternative software that was capable of handling them at the expense of greater simulation time. The design of on-chip components for the filter characterization, such as a broadband antenna, Wilkinson power dividers, attenuators, detectors, and transitions has been completed.

  2. Particle dynamics and deposition in true-scale pulmonary acinar models.

    PubMed

    Fishler, Rami; Hofemeier, Philipp; Etzion, Yael; Dubowski, Yael; Sznitman, Josué

    2015-09-11

    Particle transport phenomena in the deep alveolated airways of the lungs (i.e. pulmonary acinus) govern deposition outcomes following inhalation of hazardous or pharmaceutical aerosols. Yet, there is still a dearth of experimental tools for resolving acinar particle dynamics and validating numerical simulations. Here, we present a true-scale experimental model of acinar structures consisting of bifurcating alveolated ducts that capture breathing-like wall motion and ensuing respiratory acinar flows. We study experimentally captured trajectories of inhaled polydispersed smoke particles (0.2 to 1 μm in diameter), demonstrating how intrinsic particle motion, i.e. gravity and diffusion, is crucial in determining dispersion and deposition of aerosols through a streamline crossing mechanism, a phenomenon paramount during flow reversal and locally within alveolar cavities. A simple conceptual framework is constructed for predicting the fate of inhaled particles near an alveolus by identifying capture and escape zones and considering how streamline crossing may shift particles between them. In addition, we examine the effect of particle size on detailed deposition patterns of monodispersed microspheres between 0.1-2 μm. Our experiments underline local modifications in the deposition patterns due to gravity for particles ≥0.5 μm compared to smaller particles, and show good agreement with corresponding numerical simulations.
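The size threshold for gravitational effects reported above is consistent with a Stokes-regime estimate of the settling velocity, v = ρ_p d² g / (18 μ), which grows with the square of the particle diameter. A back-of-the-envelope sketch (assuming unit-density spheres in room-temperature air; the slip correction is neglected):

```python
def stokes_settling_velocity(d_m, rho_p=1000.0, mu=1.8e-5, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere in air, Stokes regime.

    d_m: particle diameter in metres; rho_p: particle density (kg/m^3);
    mu: dynamic viscosity of air (Pa*s). Slip correction neglected.
    """
    return rho_p * d_m**2 * g / (18.0 * mu)

# Settling velocity scales with d**2, so a 2 um particle settles ~16x
# faster than a 0.5 um particle, while for 0.1 um particles gravity is
# negligible relative to Brownian diffusion.
v_05 = stokes_settling_velocity(0.5e-6)
v_20 = stokes_settling_velocity(2.0e-6)
```

This quadratic scaling is why the experiments see gravity modifying deposition only for particles of about 0.5 μm and larger.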

  3. Particle dynamics and deposition in true-scale pulmonary acinar models

    PubMed Central

    Fishler, Rami; Hofemeier, Philipp; Etzion, Yael; Dubowski, Yael; Sznitman, Josué

    2015-01-01

    Particle transport phenomena in the deep alveolated airways of the lungs (i.e. pulmonary acinus) govern deposition outcomes following inhalation of hazardous or pharmaceutical aerosols. Yet, there is still a dearth of experimental tools for resolving acinar particle dynamics and validating numerical simulations. Here, we present a true-scale experimental model of acinar structures consisting of bifurcating alveolated ducts that capture breathing-like wall motion and ensuing respiratory acinar flows. We study experimentally captured trajectories of inhaled polydispersed smoke particles (0.2 to 1 μm in diameter), demonstrating how intrinsic particle motion, i.e. gravity and diffusion, is crucial in determining dispersion and deposition of aerosols through a streamline crossing mechanism, a phenomenon paramount during flow reversal and locally within alveolar cavities. A simple conceptual framework is constructed for predicting the fate of inhaled particles near an alveolus by identifying capture and escape zones and considering how streamline crossing may shift particles between them. In addition, we examine the effect of particle size on detailed deposition patterns of monodispersed microspheres between 0.1–2 μm. Our experiments underline local modifications in the deposition patterns due to gravity for particles ≥0.5 μm compared to smaller particles, and show good agreement with corresponding numerical simulations. PMID:26358580

  4. Aerodynamic design on high-speed trains

    NASA Astrophysics Data System (ADS)

    Ding, San-San; Li, Qiang; Tian, Ai-Qin; Du, Jian; Liu, Jia-Li

    2016-04-01

    Compared with the traditional train, the operational speed of the high-speed train has largely improved, and the dynamic environment of the train has changed from one of mechanical domination to one of aerodynamic domination. The aerodynamic problem has become the key technological challenge of high-speed trains and significantly affects the economy, environment, safety, and comfort. In this paper, the relationships among the aerodynamic design principle, aerodynamic performance indexes, and design variables are first studied, and the research methods of train aerodynamics are proposed, including numerical simulation, a reduced-scale test, and a full-scale test. Technological schemes of train aerodynamics involve the optimization design of the streamlined head and the smooth design of the body surface. Optimization design of the streamlined head includes conception design, project design, numerical simulation, and a reduced-scale test. Smooth design of the body surface is mainly used for the key parts, such as electric-current collecting system, wheel truck compartment, and windshield. The aerodynamic design method established in this paper has been successfully applied to various high-speed trains (CRH380A, CRH380AM, CRH6, CRH2G, and the Standard electric multiple unit (EMU)) that have met expected design objectives. The research results can provide an effective guideline for the aerodynamic design of high-speed trains.

  5. Numerical Modeling of Three-Dimensional Confined Flows

    NASA Technical Reports Server (NTRS)

    Greywall, M. S.

    1981-01-01

    A three-dimensional confined flow model is presented. The flow field is computed by calculating velocity and enthalpy along a set of streamlines. The finite difference equations are obtained by applying conservation principles to streamtubes constructed around the chosen streamlines. With appropriate substitutions for the body force terms, the approach computes three-dimensional magnetohydrodynamic channel flows. A listing of a computer code based on this approach is presented in FORTRAN IV. The code computes three-dimensional compressible viscous flow through a rectangular duct, with the duct cross section specified along the axis.
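The streamtube construction can be illustrated in its simplest form: for steady flow, conservation of mass along a streamtube fixes the velocity at each station from the local cross-sectional area. A minimal incompressible sketch (the paper's code handles the compressible viscous case; names here are illustrative):

```python
def velocity_along_streamtube(v_inlet, areas):
    """March velocity along a streamtube from mass conservation.

    For steady incompressible flow, rho*v*A is constant along a streamtube,
    so v_i = v_inlet * A_inlet / A_i at each downstream station.

    v_inlet: velocity at the first station; areas: cross-sectional areas
    at successive stations along the streamtube (first entry is the inlet).
    """
    a0 = areas[0]
    return [v_inlet * a0 / a for a in areas]
```

In the compressible case the density varies as well, so the marching step couples this mass balance with the momentum and energy (enthalpy) equations rather than solving it in isolation.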

  6. MJO Signals in Latent Heating: Results from TRMM Retrievals

    NASA Technical Reports Server (NTRS)

    Zhang, Chidong; Ling, Jian; Hagos, Samson; Tao, Wei-Kuo; Lang, Steve; Takayabu, Yukari N.; Shige, Shoichi; Katsumata, Masaki; Olson, William S.; L'Ecuyer, Tristan

    2010-01-01

    The Madden-Julian Oscillation (MJO) is the dominant intraseasonal signal in the global tropical atmosphere. Almost all numerical climate models have difficulty simulating a realistic MJO. Four TRMM datasets of latent heating were diagnosed for signals in the MJO. In all four datasets, vertical structures of latent heating are dominated by two components, one deep with its peak above the melting level and one shallow with its peak below. Profiles of the two components are nearly ubiquitous in longitude, allowing a separation of the vertical and zonal/temporal variations when the latitudinal dependence is not considered. All four datasets exhibit robust MJO spectral signals in the deep component as eastward-propagating spectral peaks centered at a period of 50 days and zonal wavenumber 1, well distinguished from lower- and higher-frequency power and much stronger than the corresponding westward power. The shallow component shows similar but slightly less robust MJO spectral peaks. MJO signals were further extracted from a combination of band-pass (30 - 90 day) filtered deep and shallow components. The largest amplitudes of both deep and shallow components of the MJO are confined to the Indian and western Pacific Oceans. There is a local minimum in the deep components over the Maritime Continent. The shallow components of the MJO differ substantially among the four TRMM datasets in their detailed zonal distributions in the eastern hemisphere. In composites of the heating evolution through the life cycle of the MJO, the shallow components lead the deep ones in some datasets and at certain longitudes. In many respects, the four TRMM datasets agree well in their deep components, but not in their shallow components or in the phase relations between the deep and shallow components. These results indicate that caution must be exercised in applications of these latent heating data.
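The 30 - 90 day band-pass step can be sketched with a simple FFT filter (an illustration only; MJO diagnostics typically also isolate eastward-propagating zonal wavenumbers, which this one-dimensional sketch omits):

```python
import numpy as np

def bandpass_30_90(x, dt_days=1.0, tmin=30.0, tmax=90.0):
    """Keep only spectral components with periods between tmin and tmax days.

    x: evenly sampled daily time series; returns the filtered series.
    """
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=dt_days)  # cycles per day
    keep = (freqs >= 1.0 / tmax) & (freqs <= 1.0 / tmin)  # 30-90 day band
    return np.fft.irfft(X * keep, n=len(x))
```

Applied to a heating time series, this retains intraseasonal variability (e.g., a 50-day oscillation) while removing both the slow background and high-frequency weather noise.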

  7. The Influence of Aerosol Hygroscopicity on Precipitation Intensity During a Mesoscale Convective Event

    NASA Astrophysics Data System (ADS)

    Kawecki, Stacey; Steiner, Allison L.

    2018-01-01

    We examine how aerosol composition affects precipitation intensity using the Weather Research and Forecasting model with Chemistry (WRF-Chem, version 3.6). By changing the prescribed default hygroscopicity values to updated values from laboratory studies, we test model assumptions about the individual component hygroscopicity values of ammonium, sulfate, nitrate, and organic species. We compare a baseline simulation (BASE, using default hygroscopicity values) with four sensitivity simulations (SULF, increasing the sulfate hygroscopicity; ORG, decreasing organic hygroscopicity; SWITCH, using a concentration-dependent hygroscopicity value for ammonium; and ALL, including all three changes) to understand the role of aerosol composition on precipitation during a mesoscale convective system (MCS). Overall, the hygroscopicity changes influence the spatial patterns and intensity of precipitation. Focusing on the maximum precipitation in the model domain downwind of an urban area, we find that changing the individual component hygroscopicities leads to bulk hygroscopicity changes, especially in the ORG simulation. Reducing bulk hygroscopicity (e.g., ORG simulation) initially causes fewer activated drops, weakened updrafts in the midtroposphere, and increased precipitation from larger hydrometeors. Increasing bulk hygroscopicity (e.g., SULF simulation) simulates more numerous and smaller cloud drops and increases precipitation. In the ALL simulation, a stronger cold pool and downdrafts lead to precipitation suppression later in the MCS evolution. In this downwind region, the combined changes in hygroscopicity (ALL) reduce the overprediction of intense events (>70 mm d-1) and better capture the range of moderate intensity (30-60 mm d-1) events. The results of this single MCS analysis suggest that aerosol composition can play an important role in simulating high-intensity precipitation events.
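The bulk hygroscopicity referred to above is conventionally obtained from the volume-weighted mixing rule of κ-Köhler theory, κ_bulk = Σᵢ εᵢ κᵢ, where εᵢ are component volume fractions. A minimal sketch with illustrative component κ values (not the model's defaults):

```python
def bulk_kappa(volume_fractions, kappas):
    """Volume-weighted mixing rule for the hygroscopicity parameter kappa.

    kappa_bulk = sum_i eps_i * kappa_i, where eps_i are the component
    volume fractions (must sum to 1) and kappa_i the component values.
    """
    assert abs(sum(volume_fractions) - 1.0) < 1e-9
    return sum(e * k for e, k in zip(volume_fractions, kappas))

# Illustrative values only: sulfate kappa ~0.6, organics ~0.1. For an
# organic-rich particle, lowering the organic kappa (as in the ORG
# sensitivity run) pulls the bulk value down markedly.
k_base = bulk_kappa([0.3, 0.7], [0.6, 0.14])
k_org  = bulk_kappa([0.3, 0.7], [0.6, 0.06])
```

Because organics often dominate the particle volume, even a modest change in the organic κ shifts the bulk value, and with it droplet activation, which is the mechanism the ORG sensitivity run probes.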

  8. A weight-of-evidence approach to identify nanomaterials in consumer products: a case study of nanoparticles in commercial sunscreens.

    PubMed

    Cuddy, Michael F; Poda, Aimee R; Moser, Robert D; Weiss, Charles A; Cairns, Carolyn; Steevens, Jeffery A

    2016-01-01

    Nanoscale ingredients in commercial products represent a point of emerging environmental concern due to recent findings that correlate toxicity with small particle size. A weight-of-evidence (WOE) approach based upon multiple lines of evidence (LOE) is developed here to assess nanomaterials as they exist in consumer product formulations, providing a qualitative assessment regarding the presence of nanomaterials, along with a baseline estimate of nanoparticle concentration if nanomaterials do exist. Electron microscopy, analytical separations, and X-ray detection methods were used to identify and characterize nanomaterials in sunscreen formulations. The WOE/LOE approach as applied to four commercial sunscreen products indicated that all four contained at least 10% dispersed primary particles having at least one dimension <100 nm in size. These analyses confirmed that the constituents were composed of zinc oxide (ZnO) or titanium dioxide (TiO2). The screening approaches developed herein offer a streamlined, facile means to identify potentially hazardous nanomaterial constituents with minimal abrasive processing of the raw material.

  9. The German VR Simulation Realism Scale--psychometric construction for virtual reality applications with virtual humans.

    PubMed

    Poeschl, Sandra; Doering, Nicola

    2013-01-01

    Virtual training applications with high levels of immersion or fidelity (for example, for social phobia treatment) produce high levels of presence and therefore belong to the most successful Virtual Reality developments. Whereas display and interaction fidelity (as sub-dimensions of immersion) and their influence on presence are well researched, realism of the displayed simulation depends on the specific application and is therefore difficult to measure. We propose to measure simulation realism by using a self-report questionnaire. The German VR Simulation Realism Scale for VR training applications was developed based on a translation of the scene realism items from the Witmer-Singer Presence Questionnaire. Items for the realism of virtual humans (for example, for social phobia training applications) were added. A sample of N = 151 students rated the simulation realism of a Fear of Public Speaking application. Four factors were derived by item analysis and principal component analysis (Varimax rotation), representing Scene Realism, Audience Behavior, Audience Appearance, and Sound Realism. The scale developed can be used as a starting point for future research and measurement of simulation realism for applications including virtual humans.

  10. Terradynamically streamlined shapes in animals and robots enhance traversability through densely cluttered terrain.

    PubMed

    Li, Chen; Pullin, Andrew O; Haldane, Duncan W; Lam, Han K; Fearing, Ronald S; Full, Robert J

    2015-06-22

    Many animals, modern aircraft, and underwater vehicles use fusiform, streamlined body shapes that reduce fluid dynamic drag to achieve fast and effective locomotion in air and water. Similarly, numerous small terrestrial animals move through cluttered terrain where three-dimensional, multi-component obstacles like grass, shrubs, vines, and leaf litter also resist motion, but it is unknown whether their body shape plays a major role in traversal. Few ground vehicles or terrestrial robots have used body shape to more effectively traverse environments such as cluttered terrain. Here, we challenged forest-floor-dwelling discoid cockroaches (Blaberus discoidalis) possessing a thin, rounded body to traverse tall, narrowly spaced, vertical, grass-like compliant beams. Animals displayed high traversal performance (79 ± 12% probability and 3.4 ± 0.7 s time). Although we observed diverse obstacle traversal strategies, cockroaches primarily (48 ± 9% probability) used a novel roll maneuver, a form of natural parkour, allowing them to rapidly traverse obstacle gaps narrower than half their body width (2.0 ± 0.5 s traversal time). Reduction of body roundness by addition of artificial shells nearly inhibited roll maneuvers and decreased traversal performance. Inspired by this discovery, we added a thin, rounded exoskeletal shell to a legged robot with a nearly cuboidal body, common to many existing terrestrial robots. Without adding sensory feedback or changing the open-loop control, the rounded shell enabled the robot to traverse beam obstacles with gaps narrower than shell width via body roll. Such terradynamically 'streamlined' shapes can reduce terrain resistance and enhance traversability by assisting effective body reorientation via distributed mechanical feedback. 
Our findings highlight the need to consider body shape to improve robot mobility in real-world terrain often filled with clutter, and to develop better locomotor-ground contact models to understand interaction with 3D, multi-component terrain.

  11. Air Permitting Streamlining Techniques and Approaches for Greenhouse Gases, 2012

    EPA Pesticide Factsheets

    This report presents potential GHG permit streamlining options and observations developed by the Clean Air Act Advisory Committee (CAAAC) Permits, New Source Review and Toxics Subcommittee GHG Permit Streamlining Workgroup.

  12. A Service Delivery Model for Children with DCD Based on Principles of Best Practice.

    PubMed

    Camden, Chantal; Léger, France; Morel, Julie; Missiuna, Cheryl

    2015-01-01

    In this perspective article, we propose the Apollo model as an example of an innovative interdisciplinary, community-based service delivery model for children with Developmental Coordination Disorder (DCD), characterized by the use of graduated levels of intensity and evidence-based interventions that focus on function and participation. We describe the context that led to the creation of the Apollo model, the approach to service delivery, and the services offered. The Apollo model has 5 components: first contact, service delivery coordination, and community, group, and individual interventions. This model guided the development of a streamlined set of services offered to children with DCD, including early intake to share educational information with families, community interventions, interdisciplinary and occupational therapy groups, and individual interventions. Following implementation of the Apollo model, wait times decreased and the number of children receiving services increased, without compromising service quality. Lessons learned are shared to facilitate development of other practice models to support children with DCD.

  13. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Creech, Dennis M.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2012-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent go-to group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. 
    The objective of this paper is to describe the interfaces between the three disciplines used in the design process: weights and sizing, trajectory, and structural analysis. The tools used to perform such analysis are INtegrated Rocket Sizing (INTROS), Program to Optimize Simulated Trajectories (POST), and Launch Vehicle Analysis (LVA), respectively. The methods each discipline uses to streamline their particular part of the design process will also be discussed.

  14. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2013-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent "go-to" group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. 
    The objective of this paper is to describe the interfaces between the three disciplines used in the design process: weights and sizing, trajectory, and structural analysis. The tools used to perform such analysis are INtegrated Rocket Sizing (INTROS), Program to Optimize Simulated Trajectories (POST), and Launch Vehicle Analysis (LVA), respectively. The methods each discipline uses to streamline their particular part of the design process will also be discussed.

  15. Modelling and scale-up of chemical flooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G.A.; Lake, L.W.; Sepehrnoori, K.

    1990-03-01

    The objective of this research is to develop, validate, and apply a comprehensive chemical flooding simulator for chemical recovery processes involving surfactants, polymers, and alkaline chemicals in various combinations. This integrated program includes components of laboratory experiments, physical property modelling, scale-up theory, and numerical analysis as necessary and integral components of the simulation activity. We have continued to develop, test, and apply our chemical flooding simulator (UTCHEM) to a wide variety of laboratory and reservoir problems involving tracers, polymers, polymer gels, surfactants, and alkaline agents. Part I is an update on the Application of Higher-Order Methods in Chemical Flooding Simulation. This update focuses on the comparison of grid orientation effects for four different numerical methods implemented in UTCHEM. Part II is on Simulation Design Studies and is a continuation of Saad's Big Muddy surfactant pilot simulation study reported last year. Part III reports on the Simulation of Gravity Effects under conditions similar to those of some of the oil reservoirs in the North Sea. Part IV is on Determining Oil Saturation from Interwell Tracers, in which UTCHEM is used for large-scale interwell tracer tests. A systematic procedure for estimating oil saturation from interwell tracer data is developed and a specific example based on the actual field data provided by Sun E&P Co. is given. Part V reports on the Application of Vectorization and Microtasking for Reservoir Simulation. Part VI reports on Alkaline Simulation. The alkaline/surfactant/polymer flood compositional simulator (UTCHEM) reported last year is further extended to include reactions involving chemical species containing magnesium, aluminium and silicon as constituent elements. Part VII reports on permeability and trapping of microemulsion.

  16. Adaptive unified continuum FEM modeling of a 3D FSI benchmark problem.

    PubMed

    Jansson, Johan; Degirmenci, Niyazi Cem; Hoffman, Johan

    2017-09-01

    In this paper, we address a 3D fluid-structure interaction benchmark problem that represents important characteristics of biomedical modeling. We present a goal-oriented adaptive finite element methodology for incompressible fluid-structure interaction based on a streamline diffusion-type stabilization of the balance equations for mass and momentum for the entire continuum in the domain, which is implemented in the Unicorn/FEniCS software framework. A phase marker function and its corresponding transport equation are introduced to select the constitutive law, where the mesh tracks the discontinuous fluid-structure interface. This results in a unified simulation method for fluids and structures. We present detailed results for the benchmark problem compared with experiments, together with a mesh convergence study. Copyright © 2016 John Wiley & Sons, Ltd.
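    The phase-marker idea above, selecting the constitutive law pointwise from a single marker field, can be illustrated in a scalar toy model. The sketch below is not the Unicorn/FEniCS implementation: the function name, the 1D stress expressions, and the material parameters are all illustrative assumptions.

```python
def unified_stress(theta, strain_rate, strain, mu_f=1.0e-3, mu_s=1.0e6):
    """Blend a fluid (viscous) and a solid (elastic) stress via the phase
    marker theta: theta = 1 in the fluid, theta = 0 in the structure.
    Scalar 1D toy model; parameter values and stress forms are illustrative."""
    sigma_fluid = 2.0 * mu_f * strain_rate   # Newtonian viscous stress
    sigma_solid = 2.0 * mu_s * strain        # linear-elastic stress
    return theta * sigma_fluid + (1.0 - theta) * sigma_solid
```

    In the actual method the marker is transported with the flow and the mesh tracks the fluid-structure interface, so each quadrature point evaluates a single blended momentum balance rather than switching between two solvers.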

  17. Visualization in hydrological and atmospheric modeling and observation

    NASA Astrophysics Data System (ADS)

    Helbig, C.; Rink, K.; Kolditz, O.

    2013-12-01

    In recent years, visualization of geoscientific and climate data has become increasingly important due to challenges such as climate change, flood prediction or the development of water management schemes for arid and semi-arid regions. Models for simulations based on such data often have a large number of heterogeneous input data sets, ranging from remote sensing data and geometric information (such as GPS data) to sensor data from specific observation sites. Data integration using such information is not straightforward and a large number of potential problems may occur due to artifacts, inconsistencies between data sets or errors based on incorrectly calibrated or stained measurement devices. Algorithms to automatically detect many such problems are often numerically expensive or difficult to parameterize. In contrast, combined visualization of various data sets is often a surprisingly efficient means for an expert to detect artifacts or inconsistencies as well as to discuss properties of the data. Therefore, the development of general visualization strategies for atmospheric or hydrological data will often support researchers during assessment and preprocessing of the data for model setup. When investigating specific phenomena, visualization is vital for assessing the progress of the ongoing simulation during runtime as well as evaluating the plausibility of the results. We propose a number of such strategies based on established visualization methods that
    - are applicable to a large range of different types of data sets,
    - are computationally inexpensive enough to be applied to time-dependent data, and
    - can be easily parameterized based on the specific focus of the research.
    Examples include the highlighting of certain aspects of complex data sets using, for example, an application-dependent parameterization of glyphs, iso-surfaces or streamlines. 
    In addition, we employ basic rendering techniques allowing affine transformations, changes in opacity, and variation of transfer functions. We found that similar strategies can be applied to hydrological and atmospheric data, such as the use of streamlines to visualize wind or fluid flow, or iso-surfaces as indicators of groundwater recharge levels in the subsurface or humidity levels in the atmosphere. We applied these strategies to a wide range of hydrological and climate applications such as groundwater flow, distribution of chemicals in water bodies, development of convection cells in the atmosphere or heat flux on the Earth's surface. Results have been evaluated in discussions with experts from hydrogeology and meteorology.

  18. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
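    The unscented transform itself is a standard construction and can be sketched independently of the paper's solenoid-valve model. The sketch below propagates a mean and covariance through an arbitrary nonlinear function via deterministically chosen sigma points; the scaling parameters (alpha, beta, kappa) follow common conventions, and in the paper's setting the EOL simulation plays the role of `f`.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Approximate the mean and covariance of f(x) for x ~ N(mean, cov)
    using 2n+1 sigma points instead of a large Monte Carlo sample."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)            # scaled covariance square root
    sigma = np.vstack([mean, mean + L.T, mean - L.T])  # (2n+1, n) sigma points
    Wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))   # mean weights
    Wc = Wm.copy()                                     # covariance weights
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigma])                # transformed sigma points
    m = Wm @ Y                                         # predicted mean
    d = Y - m
    P = (Wc[:, None] * d).T @ d                        # predicted covariance
    return m, P
```

    The computational saving described in the abstract comes from the sigma-point count: only 2n+1 evaluations of the (expensive) EOL simulation are needed per prediction, rather than thousands of sampled trajectories.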

  19. Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results

    NASA Technical Reports Server (NTRS)

    Jones, Scott

    2015-01-01

    Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses, simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al. (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. 
Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial compressors and turbines at design and off-design conditions.
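    Simple radial equilibrium, the governing equation the abstract cites for spanwise property variations, balances the radial pressure gradient against the centripetal acceleration of the swirling flow: dp/dr = rho * Vt^2 / r. A minimal numerical sketch, assuming incompressible flow and a free-vortex swirl distribution (all values illustrative, not OTAC's):

```python
import numpy as np

# Simple radial equilibrium: dp/dr = rho * Vt(r)**2 / r
# Illustrative assumptions: incompressible flow, free-vortex swirl Vt = K / r.
rho = 1.2                              # density [kg/m^3]
K = 50.0                               # free-vortex constant [m^2/s]
r = np.linspace(0.3, 0.5, 200)         # hub-to-tip radii [m]
Vt = K / r                             # tangential (swirl) velocity [m/s]
dpdr = rho * Vt**2 / r                 # radial pressure gradient [Pa/m]
# Integrate hub-to-tip with the trapezoidal rule, anchored at the hub pressure.
p_hub = 101325.0
p = p_hub + np.concatenate(
    [[0.0], np.cumsum(0.5 * (dpdr[1:] + dpdr[:-1]) * np.diff(r))])
# For a free vortex this has the closed form
# p(r) - p_hub = rho * K**2 / 2 * (1/r_hub**2 - 1/r**2).
```

    Static pressure rises monotonically from hub to tip, which is the spanwise variation a streamline solver distributes across its stream tubes.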

  20. Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results

    NASA Technical Reports Server (NTRS)

    Jones, Scott M.

    2015-01-01

    Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. 
Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial compressors and turbines at design and off-design conditions.

  1. A shock-layer theory based on thirteen-moment equations and DSMC calculations of rarefied hypersonic flows

    NASA Technical Reports Server (NTRS)

    Cheng, H. K.; Wong, Eric Y.; Dogra, V. K.

    1991-01-01

    Grad's thirteen-moment equations are applied to the flow behind a bow shock under the formalism of a thin shock layer. Comparison of this version of the theory with Direct Simulation Monte Carlo calculations of flows about a flat plate at finite attack angle has lent support to the approach as a useful extension of the continuum model for studying translational nonequilibrium in the shock layer. This paper reassesses the physical basis and limitations of the development with additional calculations and comparisons. The streamline correlation principle, which allows transformation of the 13-moment based system to one based on the Navier-Stokes equations, is extended to a three-dimensional formulation. The development yields a strip theory for planar lifting surfaces at finite incidences. Examples reveal that the lift-to-drag ratio is little influenced by planform geometry and varies with altitudes according to a 'bridging function' determined by correlated two-dimensional calculations.

  2. Relationship of CogScreen-AE to flight simulator performance and pilot age.

    PubMed

    Taylor, J L; O'Hara, R; Mumenthaler, M S; Yesavage, J A

    2000-04-01

    We report on the relationship between CogScreen-Aeromedical Edition (AE) factor scores and flight simulator performance in aircraft pilots aged 50-69. Some 100 licensed, civilian aviators (average age 58 ± 5.3 yr) performed aviation tasks in a Frasca model 141 flight simulator and the CogScreen-AE battery. The aviation performance indices were: a) staying on course; b) dialing in communication frequencies; c) avoiding conflicting traffic; d) monitoring cockpit instruments; e) executing the approach; and f) a summary score, which was the mean of these scores. The CogScreen predictors were based on a factor structure reported by Kay (11), which comprised 28 CogScreen scores. Through principal components analysis of Kay's nine factors, we reduced the number of predictors to five composite CogScreen scores: Speed/Working Memory (WM), Visual Associative Memory, Motor Coordination, Tracking, and Attribute Identification. Speed/WM scores had the highest correlation with the flight summary score, Spearman rho = 0.57. A stepwise-forward multiple regression analysis indicated that four CogScreen variables could explain 45% of the variance in flight summary scores. Significant predictors, in order of entry, were: Speed/WM, Visual Associative Memory, Motor Coordination, and Tracking (p<0.05). Pilot age was found to significantly improve prediction beyond that which could be predicted by the four cognitive variables. In addition, there was some evidence for specific ability relationships between certain flight component scores and CogScreen scores, such as approach performance and tracking errors. These data support the validity of CogScreen-AE as a cognitive battery that taps skills relevant to piloting.
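    Stepwise-forward regression of the kind used in this study greedily adds, at each step, the predictor that most improves the model's R². A minimal sketch of that selection loop (the data below are hypothetical, not the CogScreen composites):

```python
import numpy as np

def forward_select(X, y, k):
    """Greedy stepwise-forward selection: at each step, add the predictor
    (column of X) that most improves the R^2 of an OLS fit with intercept."""
    n, p = X.shape
    tss = (y - y.mean()) @ (y - y.mean())      # total sum of squares
    chosen = []
    for _ in range(k):
        best_j, best_r2 = None, -np.inf
        for j in range(p):
            if j in chosen:
                continue
            A = np.column_stack([np.ones(n), X[:, chosen + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            r2 = 1.0 - (resid @ resid) / tss
            if r2 > best_r2:
                best_j, best_r2 = j, r2
        chosen.append(best_j)
    return chosen, best_r2
```

    This illustrates the selection mechanics only; published stepwise procedures typically also apply an F-test or p-value threshold before admitting each new predictor.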

  3. A variational multiscale method for particle-cloud tracking in turbomachinery flows

    NASA Astrophysics Data System (ADS)

    Corsini, A.; Rispoli, F.; Sheard, A. G.; Takizawa, K.; Tezduyar, T. E.; Venturini, P.

    2014-11-01

    We present a computational method for simulation of particle-laden flows in turbomachinery. The method is based on a stabilized finite element fluid mechanics formulation and a finite element particle-cloud tracking method. We focus on induced-draft fans used in process industries to extract exhaust gases in the form of a two-phase fluid with a dispersed solid phase. The particle-laden flow causes material wear on the fan blades, degrading their aerodynamic performance, and therefore accurate simulation of the flow would be essential in reliable computational turbomachinery analysis and design. The turbulent-flow nature of the problem is dealt with using a Reynolds-Averaged Navier-Stokes model and Streamline-Upwind/Petrov-Galerkin/Pressure-Stabilizing/Petrov-Galerkin stabilization, the particle-cloud trajectories are calculated based on the flow field and closure models for the turbulence-particle interaction, and one-way dependence is assumed between the flow field and particle dynamics. We propose a closure model utilizing the scale separation feature of the variational multiscale method, and compare that to the closure utilizing the eddy viscosity model. We present computations for axial- and centrifugal-fan configurations, and compare the computed data to those obtained from experiments, analytical approaches, and other computational methods.

  4. A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON

    PubMed Central

    King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix

    2008-01-01

    As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only a part of a chain of tools ranging from setup and simulation, through interaction with virtual environments, to analysis and visualization. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597

  5. Polycyclic aromatic hydrocarbons in the atmospheres of Titan and Jupiter

    NASA Technical Reports Server (NTRS)

    Sagan, Carl; Khare, B. N.; Thompson, W. R.; Mcdonald, G. D.; Wing, Michael R.; Bada, Jeffrey L.; Vo-Dinh, Tuan; Arakawa, E. T.

    1993-01-01

    PAHs are important components of the interstellar medium and carbonaceous chondrites, but have never been identified in the reducing atmospheres of the outer solar system. Incompletely characterized complex organic solids (tholins) produced by irradiating simulated Titan atmospheres reproduce well the observed UV/visible/IR optical constants of the Titan stratospheric haze. Titan tholin and a tholin generated in a crude simulation of the atmosphere of Jupiter are examined by two-step laser desorption/multiphoton ionization mass spectrometry. A range of two- to four-ring PAHs, some with one to four alkylation sites, are identified, with a net abundance of about 0.0001 g/g (grams per gram) of tholins produced. Synchronous fluorescence techniques confirm this detection. Titan tholins have proportionately more one- and two-ring PAHs than do Jupiter tholins, which in turn have more four-ring and larger PAHs. The four-ringed PAH chrysene, prominent in some discussions of interstellar grains, is found in Jupiter tholins.

  6. Promoting Conceptual Change for Complex Systems Understanding: Outcomes of an Agent-Based Participatory Simulation

    NASA Astrophysics Data System (ADS)

    Rates, Christopher A.; Mulvey, Bridget K.; Feldon, David F.

    2016-08-01

    Components of complex systems apply across multiple subject areas, and teaching these components may help students build unifying conceptual links. Students, however, often have difficulty learning these components, and limited research exists to understand what types of interventions may best help improve understanding. We investigated 32 high school students' understandings of complex systems components and whether an agent-based simulation could improve their understandings. Pretest and posttest essays were coded for changes in six components to determine whether students showed more expert thinking about the complex system of the Chesapeake Bay watershed. Results showed significant improvement for the components Emergence (r = .26, p = .03), Order (r = .37, p = .002), and Tradeoffs (r = .44, p = .001). Implications include that the experiential nature of the simulation has the potential to support conceptual change for some complex systems components, presenting a promising option for complex systems instruction.

  7. Performance evaluation of PCA-based spike sorting algorithms.

    PubMed

    Adamos, Dimitrios A; Kosmidis, Efstratios K; Theophilidis, George

    2008-09-01

    Deciphering the electrical activity of individual neurons from multi-unit noisy recordings is critical for understanding complex neural systems. A widely used spike sorting algorithm is being evaluated for single-electrode nerve trunk recordings. The algorithm is based on principal component analysis (PCA) for spike feature extraction. In the neuroscience literature it is generally assumed that the use of the first two or most commonly three principal components is sufficient. We estimate the optimum PCA-based feature space by evaluating the algorithm's performance on simulated series of action potentials. A number of modifications are made to the open source nev2lkit software to enable systematic investigation of the parameter space. We introduce a new metric to define clustering error considering over-clustering more favorable than under-clustering as proposed by experimentalists for our data. Both the program patch and the metric are available online. Correlated and white Gaussian noise processes are superimposed to account for biological and artificial jitter in the recordings. We report that the employment of more than three principal components is in general beneficial for all noise cases considered. Finally, we apply our results to experimental data and verify that the sorting process with four principal components is in agreement with a panel of electrophysiology experts.

  8. Streamline-curvature effect in three-dimensional boundary layers

    NASA Technical Reports Server (NTRS)

    Reed, Helen L.; Lin, Ray-Sing; Petraglia, Media M.

    1992-01-01

    The effect of including wall and streamline curvature terms in swept-wing boundary-layer stability calculations is studied. The linear disturbance equations are cast on a fixed, body-intrinsic, curvilinear coordinate system. Those nonparallel terms which contribute mainly to the streamline-curvature effect are retained in this formulation and approximated by their local finite-difference values. Convex-wall curvature has a stabilizing effect, while streamline curvature is destabilizing if the curvature exceeds a critical value.

  9. Flow characteristics in a canine aneurysm model: A comparison of 4D accelerated phase-contrast MR measurements and computational fluid dynamics simulations

    PubMed Central

    Jiang, Jingfeng; Johnson, Kevin; Valen-Sendstad, Kristian; Mardal, Kent-Andre; Wieben, Oliver; Strother, Charles

    2011-01-01

    Purpose: Our purpose was to compare quantitatively velocity fields in and around experimental canine aneurysms as measured using an accelerated 4D PC-MR angiography (MRA) method and calculated based on animal-specific CFD simulations. Methods: Two animals with a surgically created bifurcation aneurysm were imaged using an accelerated 4D PC-MRA method. Meshes were created based on the geometries obtained from the PC-MRA and simulations using “subject-specific” pulsatile velocity waveforms and geometries were then solved using a commercial CFD solver. Qualitative visual assessments and quantitative comparisons of the time-resolved velocity fields obtained from the PC-MRA measurements and the CFD simulations were performed using a defined similarity metric combining both angular and magnitude differences of vector fields. Results: PC-MRA and image-based CFD not only yielded visually consistent representations of 3D streamlines in and around both aneurysms, but also showed good agreement with regard to the spatial velocity distributions. The estimated similarity between time-resolved velocity fields from both techniques was reasonably high (mean value >0.60; one being the highest and zero being the lowest). Relative differences in inflow and outflow zones among selected planes were also reasonable (on the order of 10%–20%). The correlation between CFD-calculated and PC-MRA-measured time-averaged wall shear stresses was low (0.22 and 0.31, p < 0.001). Conclusions: In two experimental canine aneurysms, PC-MRA and image-based CFD showed favorable agreement in intra-aneurysmal velocity fields. Combining these two complementary techniques likely will further improve the ability to characterize and interpret the complex flow that occurs in human intracranial aneurysms. PMID:22047395
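    The abstract does not give the exact form of its similarity metric, only that it combines angular and magnitude differences and ranges from zero to one. The following is one plausible construction of such a pointwise score (the function name, weighting, and exact terms are assumptions):

```python
import numpy as np

def vector_similarity(u, v, eps=1e-12):
    """Pointwise similarity in [0, 1] between two vector fields, combining an
    angular term (1 = parallel, 0 = antiparallel) with a magnitude-ratio term.
    Works elementwise on arrays whose last axis holds the vector components."""
    dot = np.sum(u * v, axis=-1)
    nu = np.linalg.norm(u, axis=-1)
    nv = np.linalg.norm(v, axis=-1)
    angular = 0.5 * (1.0 + dot / (nu * nv + eps))            # direction agreement
    magnitude = np.minimum(nu, nv) / (np.maximum(nu, nv) + eps)  # speed agreement
    return angular * magnitude
```

    Averaging such a score over the voxels of the aneurysm region would yield a single mean similarity comparable to the >0.60 values reported.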

  10. View-Dependent Streamline Deformation and Exploration

    PubMed Central

    Tong, Xin; Edwards, John; Chen, Chun-Ming; Shen, Han-Wei; Johnson, Chris R.; Wong, Pak Chung

    2016-01-01

    Occlusion presents a major challenge in visualizing 3D flow and tensor fields using streamlines. Displaying too many streamlines creates a dense visualization filled with occluded structures, but displaying too few streams risks losing important features. We propose a new streamline exploration approach by visually manipulating the cluttered streamlines by pulling visible layers apart and revealing the hidden structures underneath. This paper presents a customized view-dependent deformation algorithm and an interactive visualization tool to minimize visual clutter in 3D vector and tensor fields. The algorithm is able to maintain the overall integrity of the fields and expose previously hidden structures. Our system supports both mouse and direct-touch interactions to manipulate the viewing perspectives and visualize the streamlines in depth. By using a lens metaphor of different shapes to select the transition zone of the targeted area interactively, the users can move their focus and examine the vector or tensor field freely. PMID:26600061

  11. View-Dependent Streamline Deformation and Exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tong, Xin; Edwards, John; Chen, Chun-Ming

    Occlusion presents a major challenge in visualizing 3D flow and tensor fields using streamlines. Displaying too many streamlines creates a dense visualization filled with occluded structures, but displaying too few streams risks losing important features. We propose a new streamline exploration approach by visually manipulating the cluttered streamlines by pulling visible layers apart and revealing the hidden structures underneath. This paper presents a customized view-dependent deformation algorithm and an interactive visualization tool to minimize visual clutter in visualizing 3D vector and tensor fields. The algorithm is able to maintain the overall integrity of the fields and expose previously hidden structures. Our system supports both mouse and direct-touch interactions to manipulate the viewing perspectives and visualize the streamlines in depth. By using a lens metaphor of different shapes to select the transition zone of the targeted area interactively, the users can move their focus and examine the vector or tensor field freely.

  12. View-Dependent Streamline Deformation and Exploration.

    PubMed

    Tong, Xin; Edwards, John; Chen, Chun-Ming; Shen, Han-Wei; Johnson, Chris R; Wong, Pak Chung

    2016-07-01

    Occlusion presents a major challenge in visualizing 3D flow and tensor fields using streamlines. Displaying too many streamlines creates a dense visualization filled with occluded structures, but displaying too few streams risks losing important features. We propose a new streamline exploration approach by visually manipulating the cluttered streamlines by pulling visible layers apart and revealing the hidden structures underneath. This paper presents a customized view-dependent deformation algorithm and an interactive visualization tool to minimize visual clutter in 3D vector and tensor fields. The algorithm is able to maintain the overall integrity of the fields and expose previously hidden structures. Our system supports both mouse and direct-touch interactions to manipulate the viewing perspectives and visualize the streamlines in depth. By using a lens metaphor of different shapes to select the transition zone of the targeted area interactively, the users can move their focus and examine the vector or tensor field freely.
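
    As a toy illustration of the lens metaphor, a 2D radial-displacement function might push projected points away from the lens center with falloff toward the lens boundary. The published algorithm is view-dependent, preserves streamline integrity, and supports multiple lens shapes; none of that is captured in this hypothetical sketch:

    ```python
    import math

    def lens_displace(point, lens_center, radius, strength=0.5):
        """Push a 2D projected point radially away from a circular lens
        center, with displacement falling to zero at the lens boundary.
        A toy stand-in for the paper's deformation; it ignores depth
        layering and streamline coherence.
        """
        dx = point[0] - lens_center[0]
        dy = point[1] - lens_center[1]
        d = math.hypot(dx, dy)
        if d == 0.0 or d >= radius:
            return point                      # outside the lens: unchanged
        push = strength * (1.0 - d / radius)  # strongest near the center
        scale = 1.0 + push
        return (lens_center[0] + dx * scale, lens_center[1] + dy * scale)

    inside = lens_displace((1.0, 0.0), (0.0, 0.0), radius=2.0)
    outside = lens_displace((3.0, 0.0), (0.0, 0.0), radius=2.0)
    ```

    Points inside the lens spread apart, exposing occluded streamlines beneath, while geometry outside the transition zone is left untouched.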

  13. On 3-D inelastic analysis methods for hot section components. Volume 1: Special finite element models

    NASA Technical Reports Server (NTRS)

    Nakazawa, S.

    1987-01-01

    This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes that permit more accurate and efficient three-dimensional analysis of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. This report is presented in two volumes. Volume 1 describes effort performed under Task 4B, Special Finite Element Special Function Models, while Volume 2 concentrates on Task 4C, Advanced Special Functions Models.

  14. A numerical study on combustion process in a small compression ignition engine run dual-fuel mode (diesel-biogas)

    NASA Astrophysics Data System (ADS)

    Ambarita, H.; Widodo, T. I.; Nasution, D. M.

    2017-01-01

    To reduce the fossil fuel consumption of compression ignition (CI) engines, which are widely used in transportation and heavy machinery, they can be operated in dual-fuel mode (diesel-biogas). However, the literature shows that the thermal efficiency is then lower due to incomplete combustion. To increase the efficiency, the combustion process in the combustion chamber needs to be explored. Here, a commercial CFD code is used to explore the combustion process of a small CI engine run in dual-fuel mode (diesel-biogas). The turbulent governing equations are solved using the finite volume method. A simulation of the compression and expansion strokes at an engine speed of 1000 rpm and a load of 2500 W has been carried out. The pressure and temperature distributions and streamlines are plotted. The simulation results show that at an engine power of 732.27 W the thermal efficiency is 9.05%. The experimental and simulation results show good agreement. The method developed in this study can be used to investigate the combustion process of a CI engine run in dual-fuel mode.
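
    Brake thermal efficiency in a dual-fuel engine is the brake power divided by the total chemical energy rate supplied by both fuels. A sketch with assumed fuel flow rates and typical lower heating values (not the study's measured values):

    ```python
    # Brake thermal efficiency of a dual-fuel (diesel-biogas) engine:
    # eta = brake power / (sum of fuel mass flow rate * lower heating value).
    # The flow rates and heating values below are illustrative assumptions.

    LHV_DIESEL = 42.5e6   # J/kg, typical lower heating value of diesel
    LHV_BIOGAS = 17.0e6   # J/kg, typical for biogas of ~60% methane content

    def brake_thermal_efficiency(power_w, mdot_diesel, mdot_biogas):
        """power_w in W; fuel mass flow rates in kg/s."""
        fuel_power = mdot_diesel * LHV_DIESEL + mdot_biogas * LHV_BIOGAS
        return power_w / fuel_power

    # Example: the abstract's 732.27 W brake power with assumed fuel flows.
    eta = brake_thermal_efficiency(732.27, mdot_diesel=1.0e-4, mdot_biogas=2.0e-4)
    ```

    With these assumed flows the efficiency lands in the same single-digit-percent range the abstract reports, reflecting the incomplete combustion of the biogas fraction.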

  15. A Streamlined Protocol for Molecular Testing of the DMD Gene within a Diagnostic Laboratory: A Combination of Array Comparative Genomic Hybridization and Bidirectional Sequence Analysis

    PubMed Central

    Marquis-Nicholson, Renate; Lai, Daniel; Love, Jennifer M.; Love, Donald R.

    2013-01-01

    Purpose. The aim of this study was to develop a streamlined mutation screening protocol for the DMD gene in order to confirm a clinical diagnosis of Duchenne or Becker muscular dystrophy in affected males and to clarify the carrier status of female family members. Methods. Sequence analysis and array comparative genomic hybridization (aCGH) were used to identify mutations in the dystrophin gene (DMD). We analysed genomic DNA from six individuals with a range of previously characterised mutations and from eight individuals who had not previously undergone any form of molecular analysis. Results. We successfully identified the known mutations in all six patients. A molecular diagnosis was also made in three of the four patients with a clinical diagnosis who had not undergone prior genetic screening, and testing for familial mutations was successfully completed for the remaining four patients. Conclusion. The mutation screening protocol described here meets best practice guidelines for molecular testing of the DMD gene in a diagnostic laboratory. The aCGH method is a superior alternative to more conventional assays such as multiplex ligation-dependent probe amplification (MLPA). The combination of aCGH and sequence analysis will detect mutations in 98% of patients with Duchenne or Becker muscular dystrophy. PMID:23476807

  16. Simulation of mixture microstructures via particle packing models and their direct comparison with real mixtures

    NASA Astrophysics Data System (ADS)

    Gulliver, Eric A.

    The objective of this thesis was to identify and develop techniques providing direct comparison between simulated and real packed particle mixture microstructures containing submicron-sized particles. This entailed devising techniques for simulating powder mixtures, producing real mixtures with known powder characteristics, sectioning real mixtures, interrogating mixture cross-sections, evaluating and quantifying the mixture interrogation process and for comparing interrogation results between mixtures. A drop and roll-type particle-packing model was used to generate simulations of random mixtures. The simulated mixtures were then evaluated to establish that they were not segregated and free from gross defects. A powder processing protocol was established to provide real mixtures for direct comparison and for use in evaluating the simulation. The powder processing protocol was designed to minimize differences between measured particle size distributions and the particle size distributions in the mixture. A sectioning technique was developed that was capable of producing distortion-free cross-sections of fine scale particulate mixtures. Tessellation analysis was used to interrogate mixture cross-sections, and statistical quality control charts were used to evaluate different types of tessellation analysis and to establish the importance of differences between simulated and real mixtures. The particle-packing program generated crescent shaped pores below large particles but realistic looking mixture microstructures otherwise. Focused ion beam milling was the only technique capable of sectioning particle compacts in a manner suitable for stereological analysis. Johnson-Mehl and Voronoi tessellation of the same cross-sections produced tessellation tiles with different tile-area populations. 
    Control chart analysis showed that Johnson-Mehl tessellation measurements are superior to Voronoi tessellation measurements for detecting variations in mixture microstructure, such as altered particle-size distributions or mixture composition. Control charts based on tessellation measurements were used for direct, quantitative comparisons between real and simulated mixtures. Four sets of simulated and real mixtures were examined. Data from real mixtures matched simulated data when the samples were well mixed and the particle size distributions and volume fractions of the components were identical. Analysis of mixture components that occupied less than approximately 10 vol% of the mixture was not practical unless the particle size of the component was extremely small and excellent quality high-resolution compositional micrographs of the real sample were available. These methods of analysis should allow future researchers to systematically evaluate and predict the impact and importance of variables such as component volume fraction and component particle size distribution as they pertain to the uniformity of powder mixture microstructures.
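
    As a sketch of the tessellation side of this analysis, Voronoi tile areas for a mixture cross-section can be approximated by assigning each pixel of the section to its nearest particle center. This is a hypothetical discrete stand-in for illustration; Johnson-Mehl tessellation would additionally weight seeds by nucleation time:

    ```python
    import numpy as np

    def voronoi_tile_areas(seeds, grid_n=200, extent=1.0):
        """Approximate Voronoi tile areas on a square cross-section by
        labeling each pixel with its nearest seed (particle center) and
        summing pixel areas per label.
        """
        seeds = np.asarray(seeds, dtype=float)
        xs = np.linspace(0.0, extent, grid_n)
        gx, gy = np.meshgrid(xs, xs)
        pix = np.stack([gx.ravel(), gy.ravel()], axis=1)   # (grid_n**2, 2)
        # Squared distance from every pixel to every seed; nearest wins.
        d2 = ((pix[:, None, :] - seeds[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        pixel_area = (extent / grid_n) ** 2
        return np.bincount(labels, minlength=len(seeds)) * pixel_area

    # Two seeds mirrored about the section center split it evenly.
    areas = voronoi_tile_areas([[0.25, 0.5], [0.75, 0.5]])
    ```

    The resulting tile-area population is the quantity the control charts above track to flag changes in particle-size distribution or composition.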

  17. A Distributed Approach to System-Level Prognostics

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Bregon, Anibal; Roychoudhury, Indranil

    2012-01-01

    Prognostics, which deals with predicting the remaining useful life of components, subsystems, and systems, is a key technology for systems health management that leads to improved safety and reliability with reduced costs. The prognostics problem is often approached from a component-centric view. However, in most cases, it is not specifically component lifetimes that are important, but, rather, the lifetimes of the systems in which these components reside. The system-level prognostics problem can be quite difficult due to the increased scale and scope of the prognostics problem and the relative lack of scalability and efficiency of typical prognostics approaches. In order to address these issues, we develop a distributed solution to the system-level prognostics problem, based on the concept of structural model decomposition. The system model is decomposed into independent submodels. Independent local prognostics subproblems are then formed based on these local submodels, resulting in a scalable, efficient, and flexible distributed approach to the system-level prognostics problem. We provide a formulation of the system-level prognostics problem and demonstrate the approach on a four-wheeled rover simulation testbed. The results show that the system-level prognostics problem can be accurately and efficiently solved in a distributed fashion.

  18. Computational design and in vitro characterization of an integrated maglev pump-oxygenator.

    PubMed

    Zhang, Juntao; Taskin, M Ertan; Koert, Andrew; Zhang, Tao; Gellman, Barry; Dasse, Kurt A; Gilbert, Richard J; Griffith, Bartley P; Wu, Zhongjun J

    2009-10-01

    To address the need for respiratory support for patients with acute or chronic lung diseases, a novel integrated maglev pump-oxygenator (IMPO) is being developed as a respiratory assist device. IMPO was conceptualized to combine a magnetically levitated pump/rotor with uniquely configured hollow fiber membranes to create an assembly-free, ultracompact system. IMPO is a self-contained blood pump and oxygenator assembly to enable rapid deployment for patients requiring respiratory support or circulatory support. In this study, computational fluid dynamics (CFD) and computer-aided design were conducted to design and optimize the hemodynamics, gas transfer, and hemocompatibility performances of this novel device. In parallel, in vitro experiments including hydrodynamic, gas transfer, and hemolysis measurements were conducted to evaluate the performance of IMPO. Computational results from CFD analysis were compared with experimental data collected from in vitro evaluation of the IMPO. The CFD simulation demonstrated a well-behaved and streamlined flow field in the main components of this device. The results of hydrodynamic performance, oxygen transfer, and hemolysis predicted by computational simulation, along with the in vitro experimental data, indicate that this pump-lung device can provide the total respiratory need of an adult with lung failure, with a low hemolysis rate at the targeted operating condition. These detailed CFD designs and analyses can provide valuable guidance for further optimization of this IMPO for long-term use.

  19. Large Eddy Simulation of stratified flows over structures

    NASA Astrophysics Data System (ADS)

    Fuka, V.; Brechler, J.

    2013-04-01

    We tested the ability of the LES model CLMM (Charles University Large-Eddy Microscale Model) to model stratified flow around three-dimensional hills. We compared quantities such as the height of the dividing streamline, the recirculation zone length, and the length of the lee waves with experiments by Hunt and Snyder [3] and numerical computations by Ding, Calhoun and Street [5]. The results mostly agreed with the references, but some important differences were present.

  20. Four-component numerical simulation model of radiative convective interactions in large-scale oxygen-hydrogen turbulent fire balls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Surzhikov, S.T.

    1996-12-31

    A two-dimensional radiative gas dynamics model for numerical simulation of the oxygen-hydrogen fire ball which may be generated by an explosion of a launch vehicle with cryogenic (LO{sub 2}-LH{sub 2}) fuel components is presented. The following physical-chemical processes are taken into account in the numerical model: an effective chemical reaction between the gaseous components (O{sub 2}-H{sub 2}) of the propellant, turbulent mixing and diffusion of the components, and radiative heat transfer. The results of numerical investigations of the following problems are presented: the influence of radiative heat transfer on fire ball gas dynamics during the first 13 sec after explosion, the effect of the fuel gaseous components afterburning on fire ball gas dynamics, and the effect of turbulence on fire ball gas dynamics (in the framework of an algebraic model of turbulent mixing).

  1. User's guide and description of the streamline divergence computer program. [turbulent convective heat transfer

    NASA Technical Reports Server (NTRS)

    Sulyma, P. R.; Mcanally, J. V.

    1975-01-01

    The streamline divergence program was developed to demonstrate the capability to trace inviscid surface streamlines and to calculate outflow-corrected laminar and turbulent convective heating rates on surfaces subjected to exhaust plume impingement. The analytical techniques used in formulating this program are discussed. A brief description of the streamline divergence program is given along with a user's guide. The program input and output for a sample case are also presented.

  2. Aerofoil testing in a self-streamlining flexible walled wind tunnel. Ph.D. Thesis - Jul. 1987

    NASA Technical Reports Server (NTRS)

    Lewis, Mark Charles

    1988-01-01

    Two-dimensional self-streamlining flexible walled test sections eliminate, as far as experimentally possible, the top and bottom wall interference effects in transonic airfoil testing. The test section sidewalls are rigid, while the impervious top and bottom walls are flexible and contoured to streamline shapes by a system of jacks, without reference to the airfoil model. The concept of wall contouring to eliminate or minimize test section boundary interference in 2-D testing was first demonstrated by NPL in England during the early 1940s. The transonic streamlining strategy proposed, developed and used by NPL has been compared with several modern strategies. The NPL strategy has proved to be surprisingly good at providing a wall interference-free test environment, giving model performance indistinguishable from that obtained using the modern strategies over a wide range of test conditions. In all previous investigations, the achievement of wall streamlining in flexible walled test sections has been limited to test conditions up to those resulting in the model's shock just extending to a streamlined wall. This work, however, has also successfully demonstrated the feasibility of 2-D wall streamlining at test conditions where both model shocks have reached and penetrated through their respective flexible walls. Appropriate streamlining procedures have been established and are uncomplicated, enabling flexible walled test sections to cope easily with these high transonic flows.

  3. Streamline three-dimensional thermal model of a lithium titanate pouch cell battery in extreme temperature conditions with module simulation

    NASA Astrophysics Data System (ADS)

    Jaguemont, Joris; Omar, Noshin; Martel, François; Van den Bossche, Peter; Van Mierlo, Joeri

    2017-11-01

    In this paper, the development of a three-dimensional (3D) thermal model of a lithium titanium oxide (LTO) pouch cell is presented, first to better comprehend its thermal behavior within electrified vehicle applications and also to propose a strong modeling base for future thermal management systems. Current 3D thermal models are based on electrochemical reactions, which require elaborate meshing effort and long computation times; a fast electro-thermal model that can capture voltage, current, and thermal distribution variations throughout the whole process has been lacking. The proposed thermal model is a reduced-effort temperature simulation approach that couples a 0D electrical model with a 3D thermal model, excluding electrochemical processes. The thermal model is based on heat-transfer theory, and its temperature distribution prediction incorporates internal conduction and heat generation effects as well as convection. In addition, experimental tests are conducted to validate the model. Results show that both the heat dissipation rate and surface temperature uniformity data are in agreement with simulation results, which satisfies the application requirements for electrified vehicles. Additionally, an LTO battery pack is sized, modeled, and shown to display non-uniformity among the cells under driving operation. Ultimately, the model will serve as a basis for the future development of a thermal strategy for LTO cells that operate over a large temperature range, which is a strong contribution to the existing body of scientific literature.
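
    The reduced-effort idea can be caricatured as a single thermal node driven by joule heating from a 0D electrical model, with convective loss to ambient. All cell parameters below are illustrative assumptions, not the paper's identified LTO values:

    ```python
    # Minimal lumped (0D) electro-thermal sketch:
    #   m * cp * dT/dt = I**2 * R_int  -  h * A * (T - T_amb)
    # Joule heating comes from a trivial 0D electrical model (I**2 * R),
    # bypassing electrochemical detail entirely.

    R_INT = 0.002     # ohm, internal resistance (assumed)
    MASS = 0.5        # kg, cell mass (assumed)
    CP = 1100.0       # J/(kg*K), specific heat (assumed)
    H_CONV = 10.0     # W/(m^2*K), convection coefficient (assumed)
    AREA = 0.05       # m^2, cell surface area (assumed)

    def step_temperature(T, current, T_amb, dt):
        """One explicit-Euler step of the lumped heat balance."""
        q_gen = current ** 2 * R_INT              # joule heating, W
        q_loss = H_CONV * AREA * (T - T_amb)      # convective loss, W
        return T + dt * (q_gen - q_loss) / (MASS * CP)

    # 100 A discharge for 10 minutes from a 25 C ambient:
    T = 25.0
    for _ in range(600):
        T = step_temperature(T, current=100.0, T_amb=25.0, dt=1.0)
    ```

    A 3D extension distributes the same heat balance over a conduction mesh; the 0D node here is only the fastest possible version of the electro-thermal split the paper advocates.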

  4. A diffusion tensor imaging tractography algorithm based on Navier-Stokes fluid mechanics.

    PubMed

    Hageman, Nathan S; Toga, Arthur W; Narr, Katherine L; Shattuck, David W

    2009-03-01

    We introduce a fluid mechanics based tractography method for estimating the most likely connection paths between points in diffusion tensor imaging (DTI) volumes. We customize the Navier-Stokes equations to include information from the diffusion tensor and simulate an artificial fluid flow through the DTI image volume. We then estimate the most likely connection paths between points in the DTI volume using a metric derived from the fluid velocity vector field. We validate our algorithm using digital DTI phantoms based on a helical shape. Our method segmented the structure of the phantom with less distortion than was produced using implementations of heat-based partial differential equation (PDE) and streamline based methods. In addition, our method was able to successfully segment divergent and crossing fiber geometries, closely following the ideal path through a digital helical phantom in the presence of multiple crossing tracts. To assess the performance of our algorithm on anatomical data, we applied our method to DTI volumes from normal human subjects. Our method produced paths that were consistent with both known anatomy and directionally encoded color images of the DTI dataset.
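
    The streamline-based methods this algorithm is compared against integrate a path step by step through a vector field. A minimal sketch using classical fourth-order Runge-Kutta integration follows; the toy solid-body-rotation field stands in for the DTI-derived velocity field and is purely an assumption for illustration:

    ```python
    import numpy as np

    def trace_streamline(velocity, p0, step=0.01, n_steps=500):
        """Integrate a path through a vector field with classical RK4.
        `velocity(p) -> np.ndarray` is the field; in the tractography
        setting it would be derived from the diffusion tensor.
        """
        path = [np.asarray(p0, dtype=float)]
        for _ in range(n_steps):
            p = path[-1]
            k1 = velocity(p)
            k2 = velocity(p + 0.5 * step * k1)
            k3 = velocity(p + 0.5 * step * k2)
            k4 = velocity(p + step * k3)
            path.append(p + (step / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4))
        return np.array(path)

    # Toy field: solid-body rotation about the origin, whose streamlines
    # are circles, so the traced path should keep a constant radius.
    rotation = lambda p: np.array([-p[1], p[0]])
    path = trace_streamline(rotation, [1.0, 0.0])
    radii = np.linalg.norm(path, axis=1)
    ```

    Pure streamline tracing follows the local principal direction only, which is why it struggles at crossings; the fluid-mechanics formulation above instead solves a global velocity field before extracting paths.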

  5. A Diffusion Tensor Imaging Tractography Algorithm Based on Navier-Stokes Fluid Mechanics

    PubMed Central

    Hageman, Nathan S.; Toga, Arthur W.; Narr, Katherine; Shattuck, David W.

    2009-01-01

    We introduce a fluid mechanics based tractography method for estimating the most likely connection paths between points in diffusion tensor imaging (DTI) volumes. We customize the Navier-Stokes equations to include information from the diffusion tensor and simulate an artificial fluid flow through the DTI image volume. We then estimate the most likely connection paths between points in the DTI volume using a metric derived from the fluid velocity vector field. We validate our algorithm using digital DTI phantoms based on a helical shape. Our method segmented the structure of the phantom with less distortion than was produced using implementations of heat-based partial differential equation (PDE) and streamline based methods. In addition, our method was able to successfully segment divergent and crossing fiber geometries, closely following the ideal path through a digital helical phantom in the presence of multiple crossing tracts. To assess the performance of our algorithm on anatomical data, we applied our method to DTI volumes from normal human subjects. Our method produced paths that were consistent with both known anatomy and directionally encoded color (DEC) images of the DTI dataset. PMID:19244007

  6. CASE STUDIES EXAMINING LCA STREAMLINING TECHNIQUES

    EPA Science Inventory

    Pressure is mounting for more streamlined Life Cycle Assessment (LCA) methods that allow for evaluations that are quick and simple, but accurate. As part of an overall research effort to develop and demonstrate streamlined LCA, the U.S. Environmental Protection Agency has funded ...

  7. Acquisition streamlining: A cultural change

    NASA Technical Reports Server (NTRS)

    Stewart, Jesse

    1992-01-01

    The topics are presented in viewgraph form and include the following: the defense systems management college, educational philosophy, the defense acquisition environment, streamlining initiatives, organizational streamlining types, defense law review, law review purpose, law review objectives, the Public Law Pilot Program, and cultural change.

  8. Boundary-layer processes: key findings from MATERHORN-X field campaigns

    NASA Astrophysics Data System (ADS)

    Di Sabatino, Silvana; Leo, Laura S.; Pardyjak, Eric R.; Fernando, Harindra JS

    2017-04-01

    Understanding of atmospheric boundary-layer processes in complex terrain continues to be an active area of research considering its profound implications for numerical weather prediction (WP). It is largely recognized that nocturnal circulation, non-stationary processes involved in evening and morning transitions, as well as gusty conditions near mountains, are poorly captured by current WP models. The search for novel understanding of boundary-layer phenomena, especially in critical conditions for WP models, has been one of the goals of the interdisciplinary Mountain Terrain Atmospheric Modeling and Observations (MATERHORN) program (2011-2016). The program was developed around four main pillars: modelling (MATERHORN-M), experiments (MATERHORN-X), technology (MATERHORN-T), and parameterizations (MATERHORN-P), all synergistically working to meet new scientific challenges, address them effectively through dedicated field and laboratory studies, and transfer the acquired knowledge for model improvements. Specifically, MATERHORN-X is at the core of the MATERHORN program. It was built upon two major field experiments carried out in September-October 2012 and in May 2013 at the Granite Mountain Atmospheric Science Testbed (GMAST) of the Dugway Proving Ground (DPG). In this talk we will focus on results of data analyses from MATERHORN-X with emphasis on several aspects of the nocturnal circulation under low synoptic forcing when stable stratification occurs. The first part of the talk will discuss the evolution of nocturnal flows including both evening transitions on slopes and valleys as well as the occurrence of isolated flow bursts under very stable conditions. As far as the former is concerned, we report on our latest understanding of mechanisms leading to evening transitions (e.g. shadow front, slab flow, and transitional front). 
    As far as the latter is concerned, it is hypothesized that a link exists between isolated bursts in turbulent kinetic energy and low-level jet structure, a feature which is commonly found in the first 50-100 m from the ground. The second part of the talk will discuss the interaction between an isolated hill and an approaching (undisturbed) stably-stratified flow with emphasis on the dividing streamline concept. The hill was located northwest of and close to the Granite Mountain, and was approximately 60 m in height. A suite of (smoke) flow-visualization, remote sensing and in-situ measurement assets were deployed. At small Froude numbers (Fr < 1), a stratified flow approaching the hill either possesses sufficient kinetic energy to pass over the summit or else flows around the sides, with the dividing streamline separating the two scenarios. By applying a logarithmic approach velocity profile to the well-known Sheppard's formula based on simple energetics, an explicit representation for the dividing streamline height was derived and a new set of parameters was identified to determine the dividing streamline height. The analysis shows that there will always be a dividing streamline for real atmospheric stratified shear flows. This has relevant implications for modelling airflow and dispersion in mountainous regions.
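
    Sheppard's formula has a simple closed form in the textbook case of a uniform approach speed U and constant buoyancy frequency N; the sketch below uses that classic form for illustration (the talk's contribution replaces the uniform U with a logarithmic profile, which is not reproduced here):

    ```python
    def dividing_streamline_height(h, U, N):
        """Classic Sheppard estimate of the dividing-streamline height for
        a hill of height h (m), uniform approach speed U (m/s), and
        constant buoyancy frequency N (1/s):

            H_s = h * (1 - Fr),   Fr = U / (N * h),

        clipped to [0, h].  Flow starting below H_s lacks the kinetic
        energy to surmount the hill and must go around the sides.
        """
        Fr = U / (N * h)
        return h * max(0.0, 1.0 - Fr)

    # A 60 m hill (as in the abstract), with assumed U and N giving Fr = 0.5:
    H_s = dividing_streamline_height(h=60.0, U=1.5, N=0.05)
    ```

    At Fr = 0.5 the dividing streamline sits at half the hill height; for Fr >= 1 it drops to zero and all the approaching flow passes over the summit.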

  9. Feedback loops and temporal misalignment in component-based hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Elag, Mostafa M.; Goodall, Jonathan L.; Castronova, Anthony M.

    2011-12-01

    In component-based modeling, a complex system is represented as a series of loosely integrated components with defined interfaces and data exchanges that allow the components to be coupled together through shared boundary conditions. Although the component-based paradigm is commonly used in software engineering, it has only recently been applied for modeling hydrologic and earth systems. As a result, research is needed to test and verify the applicability of the approach for modeling hydrologic systems. The objective of this work was therefore to investigate two aspects of using component-based software architecture for hydrologic modeling: (1) simulation of feedback loops between components that share a boundary condition and (2) data transfers between temporally misaligned model components. We investigated these topics using a simple case study where diffusion of mass is modeled across a water-sediment interface. We simulated the multimedia system using two model components, one for the water and one for the sediment, coupled using the Open Modeling Interface (OpenMI) standard. The results were compared with a more conventional numerical approach for solving the system where the domain is represented by a single multidimensional array. Results showed that the component-based approach was able to produce the same results obtained with the more conventional numerical approach. When the two components were temporally misaligned, we explored the use of different interpolation schemes to minimize mass balance error within the coupled system. The outcome of this work provides evidence that component-based modeling can be used to simulate complicated feedback loops between systems and guidance as to how different interpolation schemes minimize mass balance error introduced when components are temporally misaligned.
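
    The temporal-misalignment issue can be sketched with two loosely coupled box models in the spirit of the OpenMI-style coupling described above: a "water" component runs at dt while a "sediment" component runs at 2*dt, and the sediment component linearly interpolates the water history when it needs boundary values at times the water component never computed. All names and rate constants here are illustrative, not taken from the paper:

    ```python
    def step_box(c, c_boundary, k, dt):
        """Advance a well-mixed box one explicit step:
        dc/dt = k * (c_boundary - c)."""
        return c + dt * k * (c_boundary - c)

    def interpolate(t, t0, v0, t1, v1):
        """Linearly interpolate a boundary value between two timestamps
        recorded by the other component."""
        w = (t - t0) / (t1 - t0)
        return (1.0 - w) * v0 + w * v1

    dt, k = 0.1, 0.5
    water, sediment = 1.0, 0.0
    water_history = [(0.0, water)]

    # Water component: 10 fine steps against the (stale) sediment value.
    for i in range(10):
        water = step_box(water, sediment, k, dt)
        water_history.append(((i + 1) * dt, water))

    # Sediment component: 5 coarse steps, sampling the water history by
    # linear interpolation at the midpoint of each coarse step.
    for j in range(5):
        t_mid = (2 * j + 1) * dt
        t0, v0 = water_history[2 * j]
        t1, v1 = water_history[2 * j + 2]
        sediment = step_box(sediment, interpolate(t_mid, t0, v0, t1, v1), k, 2 * dt)
    ```

    A fully interleaved scheme would feed the updated sediment value back into each fine water step; the sequential sweep above is the simplest way to see how the choice of interpolation scheme introduces the mass-balance error the paper quantifies.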

  10. Coupled Aerodynamic-Thermal-Structural (CATS) Analysis

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Coupled Aerodynamic-Thermal-Structural (CATS) Analysis is a focused effort within the Numerical Propulsion System Simulation (NPSS) program to streamline multidisciplinary analysis of aeropropulsion components and assemblies. Multidisciplinary analysis of axial-flow compressor performance has been selected for the initial focus of this project. CATS will permit more accurate compressor system analysis by enabling users to include thermal and mechanical effects as an integral part of the aerodynamic analysis of the compressor primary flowpath. Thus, critical details, such as the variation of blade tip clearances and the deformation of the flowpath geometry, can be more accurately modeled and included in the aerodynamic analyses. The benefits of this coupled analysis capability are (1) performance and stall line predictions are improved by the inclusion of tip clearances and hot geometries, (2) design alternatives can be readily analyzed, and (3) higher fidelity analysis by researchers in various disciplines is possible. The goals for this project are a 10-percent improvement in stall margin predictions and a 2:1 speed-up in multidisciplinary analysis times. Working cooperatively with Pratt & Whitney, the Lewis CATS team defined the engineering processes and identified the software products necessary for streamlining these processes. The basic approach is to integrate the aerodynamic, thermal, and structural computational analyses by using data management and Non-Uniform Rational B-Splines (NURBS) based data mapping. Five software products have been defined for this task: (1) a primary flowpath data mapper, (2) a two-dimensional data mapper, (3) a database interface, (4) a blade structural pre- and post-processor, and (5) a computational fluid dynamics code for aerothermal analysis of the drum rotor. 
    Thus far (1) a cooperative agreement has been established with Pratt & Whitney, (2) a Primary Flowpath Data Mapper has been prototyped and delivered to General Electric Aircraft Engines and Pratt & Whitney for evaluation, (3) a collaborative effort has been initiated with the National Institute of Standards and Technology to develop a Standard Data Access Interface, and (4) a blade tip clearance capability has been implemented into the Structural Airfoil Blade Engineering Routine (SABER) program. We plan to continue to develop the data mappers and data management tools. As progress is made, additional efforts will be made to apply these tools to propulsion system applications.

  11. CBT for Pediatric Migraine: A Qualitative Study of Patient and Parent Experience.

    PubMed

    Kroon Van Diest, Ashley M; Ernst, Michelle M; Vaughn, Lisa; Slater, Shalonda; Powers, Scott W

    2018-03-08

    The goal of this study was to determine which cognitive behavioral therapy for headache (CBT-HA) treatment components pediatric headache patient stakeholders would report to be most helpful and essential to reducing headache frequency and related disability, in order to develop a streamlined, less burdensome treatment package that would be more accessible to patients and families. Pediatric migraine is a prevalent and disabling condition. CBT-HA has been shown to reduce headache frequency and related disability, but may not be readily available to or accepted by many migraine sufferers because of the treatment burden entailed. Research is needed to determine systematic ways of reducing barriers to CBT-HA. Qualitative interviews were conducted with 10 patients who had undergone CBT-HA and 9 of their parents. Interviews were analyzed using an inductive thematic analysis approach based on modified grounded theory. Patients were 13-17.5 years of age (M = 15.4, SD = 1.63) and had undergone CBT-HA ∼1-2 years prior to participating in the study. Overall, patients and their parents reported that CBT-HA was helpful in reducing headache frequency and related disability. Although patients provided mixed reports on the effectiveness of different CBT-HA skills, the majority indicated that the mind and body relaxation skills of CBT-HA (deep breathing, progressive muscle relaxation, and activity pacing in particular) were the most helpful and most frequently used. Patients and parents also generally reported that the treatment was easy to learn and that at least some aspect of it was enjoyable. Results from these qualitative interviews indicate that the mind and body CBT-HA relaxation skills emerged as popular and effective based on patient and parent report. Future research examining the effectiveness of streamlined nonpharmacological interventions for pediatric migraine should include these patient-preferred skills. © 2018 American Headache Society.

  12. Streamlining and Large Ancestral Genomes in Archaea Inferred with a Phylogenetic Birth-and-Death Model

    PubMed Central

    Miklós, István

    2009-01-01

    Homologous genes originate from a common ancestor through vertical inheritance, duplication, or horizontal gene transfer. Entire homolog families spawned by a single ancestral gene can be identified across multiple genomes based on protein sequence similarity. The sequences, however, do not always reveal conclusively the history of large families. To study the evolution of complete gene repertoires, we propose here a mathematical framework that does not rely on resolved gene family histories. We show that so-called phylogenetic profiles, formed by family sizes across multiple genomes, are sufficient to infer principal evolutionary trends. The main novelty in our approach is an efficient algorithm to compute the likelihood of a phylogenetic profile in a model of birth-and-death processes acting on a phylogeny. We examine known gene families in 28 archaeal genomes using a probabilistic model that involves lineage- and family-specific components of gene acquisition, duplication, and loss. The model enables us to consider all possible histories when inferring statistics about archaeal evolution. According to our reconstruction, most lineages are characterized by a net loss of gene families. Major increases in gene repertoire have occurred only a few times. Our reconstruction underlines the importance of persistent streamlining processes in shaping genome composition in Archaea. It also suggests that early archaeal genomes were as complex as typical modern ones, and even show signs, in the case of the methanogenic ancestor, of an extremely large gene repertoire. PMID:19570746
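The likelihood computation described above can be illustrated with a toy version: a truncated linear birth-and-death process (duplication, loss, and constant acquisition, with rates and branch lengths chosen arbitrarily here) propagated along a two-leaf tree by pruning over unobserved ancestral family sizes. This is only a minimal sketch of the general idea, not the paper's algorithm, which handles a full 28-genome phylogeny with lineage- and family-specific components.

```python
import numpy as np
from scipy.linalg import expm

K = 4                               # truncate family sizes to 0..K-1
lam, mu, gain = 0.3, 0.5, 0.1       # duplication, loss, acquisition rates (illustrative)

def rate_matrix(K, lam, mu, gain):
    """Generator of a linear birth-death process with constant gain, truncated at K-1."""
    Q = np.zeros((K, K))
    for n in range(K):
        if n + 1 < K:
            Q[n, n + 1] = n * lam + gain    # duplication + acquisition
        if n > 0:
            Q[n, n - 1] = n * mu            # loss
        Q[n, n] = -Q[n].sum()
    return Q

Q = rate_matrix(K, lam, mu, gain)
branch_len = {"left": 1.0, "right": 1.5}    # hypothetical branch lengths
P = {b: expm(Q * t) for b, t in branch_len.items()}

def profile_likelihood(n_left, n_right, root_prior):
    """Pruning on a two-leaf tree: sum over the unobserved ancestral family size."""
    return float(root_prior @ (P["left"][:, n_left] * P["right"][:, n_right]))

root_prior = np.full(K, 1.0 / K)            # uniform prior on the ancestral size
print(profile_likelihood(1, 2, root_prior))
```

Summing `profile_likelihood` over all (n_left, n_right) pairs returns 1, a quick correctness check on the transition matrices.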

  13. Classical Testing in Functional Linear Models.

    PubMed

    Kong, Dehan; Staicu, Ana-Maria; Maity, Arnab

    2016-01-01

    We extend four tests common in classical regression - Wald, score, likelihood ratio, and F tests - to functional linear regression, for testing the null hypothesis that there is no association between a scalar response and a functional covariate. Using functional principal component analysis, we re-express the functional linear model as a standard linear model, where the effect of the functional covariate can be approximated by a finite linear combination of the functional principal component scores. In this setting, we consider application of the four traditional tests. The proposed testing procedures are investigated theoretically for densely observed functional covariates when the number of principal components diverges. Using the theoretical distribution of the tests under the alternative hypothesis, we develop a procedure for sample size calculation in the context of functional linear regression. The four tests are further compared numerically for both densely and sparsely observed noisy functional data in simulation experiments and using two real data applications.
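The re-expression step described in the abstract can be sketched concretely. The toy below is illustrative only (the simulated curves, truncation level k, and noise scale are assumptions, not the paper's settings): each curve is reduced to its leading functional principal component scores, and the no-association null is then tested with a standard F statistic on the induced linear model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, m, k = 200, 50, 3                 # subjects, grid points, retained components
t = np.linspace(0, 1, m)

# Simulate functional covariates X_i(t) and a scalar response with a true effect
basis = np.array([np.sin(np.pi * t), np.cos(np.pi * t),
                  np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
X = rng.standard_normal((n, 4)) @ basis
beta = np.sin(np.pi * t)             # true coefficient function
y = X @ beta / m + 0.1 * rng.standard_normal(n)

# Functional PCA via SVD of the centred curves; keep the first k score vectors
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:k].T

# F-test of H0: no association, comparing the score model to the intercept-only model
Z = np.column_stack([np.ones(n), scores])
yhat = Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
rss1 = np.sum((y - yhat) ** 2)
rss0 = np.sum((y - y.mean()) ** 2)
F = ((rss0 - rss1) / k) / (rss1 / (n - k - 1))
p = stats.f.sf(F, k, n - k - 1)
print(f"F = {F:.2f}, p = {p:.3g}")
```

With a genuine association built into the simulated response, the F statistic is large and the null is rejected; setting `beta` to zero reverses this.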

  14. Classical Testing in Functional Linear Models

    PubMed Central

    Kong, Dehan; Staicu, Ana-Maria; Maity, Arnab

    2016-01-01

    We extend four tests common in classical regression - Wald, score, likelihood ratio, and F tests - to functional linear regression, for testing the null hypothesis that there is no association between a scalar response and a functional covariate. Using functional principal component analysis, we re-express the functional linear model as a standard linear model, where the effect of the functional covariate can be approximated by a finite linear combination of the functional principal component scores. In this setting, we consider application of the four traditional tests. The proposed testing procedures are investigated theoretically for densely observed functional covariates when the number of principal components diverges. Using the theoretical distribution of the tests under the alternative hypothesis, we develop a procedure for sample size calculation in the context of functional linear regression. The four tests are further compared numerically for both densely and sparsely observed noisy functional data in simulation experiments and using two real data applications. PMID:28955155

  15. 77 FR 14700 - Streamlining Inherited Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-13

    ... contains notices to the public of the proposed issuance of rules and regulations. The purpose of these... X [Docket No. CFPB-2011-0039] Streamlining Inherited Regulations AGENCY: Bureau of Consumer... the public for streamlining regulations it recently inherited from other Federal agencies (the...

  16. How runoff begins (and ends): characterizing hydrologic response at the catchment scale

    USGS Publications Warehouse

    Mirus, Benjamin B.; Loague, Keith

    2013-01-01

    Improved understanding of the complex dynamics associated with spatially and temporally variable runoff response is needed to better understand the hydrology component of interdisciplinary problems. The objective of this study was to quantitatively characterize the environmental controls on runoff generation for the range of different streamflow-generation mechanisms illustrated in the classic Dunne diagram. The comprehensive physics-based model of coupled surface-subsurface flow, InHM, is employed in a heuristic mode. InHM has been employed previously to successfully simulate the observed hydrologic response at four diverse, well-characterized catchments, which provides the foundation for this study. The C3 and CB catchments are located within steep, forested terrain; the TW and R5 catchments are located in gently sloping rangeland. The InHM boundary-value problems for these four catchments provide the corner-stones for alternative simulation scenarios designed to address the question of how runoff begins (and ends). Simulated rainfall-runoff events are used to systematically explore the impact of soil-hydraulic properties and rainfall characteristics. This approach facilitates quantitative analysis of both integrated and distributed hydrologic responses at high-spatial and temporal resolution over the wide range of environmental conditions represented by the four catchments. The results from 140 unique simulation scenarios illustrate how rainfall intensity/depth, subsurface permeability contrasts, characteristic curve shapes, and topography provide important controls on the hydrologic-response dynamics. The processes by which runoff begins (and ends) are shown, in large part, to be defined by the relative rates of rainfall, infiltration, lateral flow convergence, and storage dynamics within the variably saturated soil layers.

  17. The effect of Cardiac Arrhythmias Simulation Software on the nurses' learning and professional development.

    PubMed

    Bazrafkan, Leila; Hemmati, Mehdi

    2018-04-01

    One of the important tasks of nurses in the intensive care unit is the interpretation of ECGs. The use of training simulators is a new paradigm in the age of computers. This study was performed to evaluate the impact of cardiac arrhythmia simulator software on nurses' learning in the subspecialty Vali-Asr Hospital in 2016. The study was conducted with a quasi-experimental randomized Solomon four-group design with the participation of 120 nurses in the subspecialty Vali-Asr Hospital in Tehran, Iran in 2016, who were selected purposefully and allocated to 4 groups. With this design, confounding factors such as prior information, maturation, and the role of sex and age were controlled. A valid and reliable multiple-choice test tool was used to gather information; the validity of the test was approved by experts and its reliability was established by a Cronbach's alpha coefficient of 0.89. At first, the knowledge and skills of the participants were assessed by a pre-test in two of the four groups; following the educational intervention with the cardiac arrhythmia simulator software during 14 days in ICUs, the mentioned factors were measured again by a post-test in all four groups. Data were analyzed using two-way ANOVA. The significance level was considered as p<0.05. Based on the randomized Solomon four-group design and our test results, using the cardiac arrhythmia simulator software as an intervention was effective in the nurses' learning, since a significant difference was found between pre-test and post-test in the first group (p<0.05). Also, other comparisons by ANOVA showed that there was no interaction between pre-test and intervention in any of the three knowledge areas of cardiac arrhythmias, their treatments, and their diagnosis (p>0.05). The use of software-based simulation for cardiac arrhythmias was effective in nurses' learning in light of its attractive components and interactive method.
This intervention increased the knowledge of the nurses in cognitive domain of cardiac arrhythmias in addition to their diagnosis and treatment. Also, the package can be used for training in other areas such as continuing medical education.
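The Solomon four-group analysis described above reduces to a balanced two-way ANOVA with factors pretest exposure and intervention. The self-contained sketch below uses fabricated scores (cell means, spread, and group size are assumptions, not the study's data) to show the mechanics of the F tests for the main effects and the pretest-by-intervention interaction:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_cell = 30                          # nurses per Solomon group (illustrative)

# Posttest scores for the 2x2 layout: pretested (yes/no) x intervention (yes/no);
# the simulated intervention raises the mean score by 10 points, the pretest by nothing.
cell_mean = {(a, b): 60.0 + 10.0 * b for a in (0, 1) for b in (0, 1)}
data = {cell: mu + 5.0 * rng.standard_normal(n_cell) for cell, mu in cell_mean.items()}

grand = np.mean([x for v in data.values() for x in v])
mean_a = {a: np.concatenate([data[(a, 0)], data[(a, 1)]]).mean() for a in (0, 1)}
mean_b = {b: np.concatenate([data[(0, b)], data[(1, b)]]).mean() for b in (0, 1)}

# Balanced two-way ANOVA sums of squares (each effect has 1 df in a 2x2 design)
ss_a = 2 * n_cell * sum((mean_a[a] - grand) ** 2 for a in (0, 1))
ss_b = 2 * n_cell * sum((mean_b[b] - grand) ** 2 for b in (0, 1))
ss_ab = n_cell * sum((data[(a, b)].mean() - mean_a[a] - mean_b[b] + grand) ** 2
                     for a in (0, 1) for b in (0, 1))
ss_err = sum(((v - v.mean()) ** 2).sum() for v in data.values())
df_err = 4 * (n_cell - 1)

results = {}
for name, ss in [("pretest", ss_a), ("intervention", ss_b), ("interaction", ss_ab)]:
    F = ss / (ss_err / df_err)
    results[name] = (F, stats.f.sf(F, 1, df_err))
    print(f"{name:12s} F = {results[name][0]:8.2f}  p = {results[name][1]:.3g}")
```

A non-significant interaction, as the study reports, indicates that taking the pre-test did not itself alter the effect of the intervention.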

  18. Simulation-Based e-Learning Tools for Science, Engineering, and Technology Education (SimBeLT)

    NASA Astrophysics Data System (ADS)

    Davis, Doyle V.; Cherner, Y.

    2006-12-01

    The focus of Project SimBeLT is the research, development, testing, and dissemination of a new type of simulation-based, integrated e-learning module set for two-year college technical and engineering curricula in the areas of thermodynamics, fluid physics, and fiber optics that can also be used in secondary schools and four-year colleges. A collection of sophisticated virtual labs is the core component of the SimBeLT modules. These labs are designed to enhance the understanding of technical concepts and underlying fundamental principles of these topics, as well as to help students master certain performance-based skills online. SimBeLT software will help educators to meet the National Science Education Standard that "learning science and technology is something that students do, not something that is done to them". A major component of Project SimBeLT is the development of multi-layered, technology-oriented virtual labs that realistically mimic workplace-like environments. Dynamic data exchange between simulations will be implemented, and links with instant instructional messages and data handling tools will be realized. A second important goal of Project SimBeLT labs is to bridge technical skills and scientific knowledge by enhancing the teaching and learning of specific scientific or engineering subjects. SimBeLT builds upon research and outcomes of interactive teaching strategies and tools developed through prior NSF funding (http://webphysics.nhctc.edu/compact/index.html). (Project SimBeLT is partially supported by a grant from the National Science Foundation, DUE-0603277.)

  19. Influence of Coanda surface curvature on performance of bladeless fan

    NASA Astrophysics Data System (ADS)

    Li, Guoqi; Hu, Yongjun; Jin, Yingzi; Setoguchi, Toshiaki; Kim, Heuy Dong

    2014-10-01

    The unique Coanda surface has a great influence on the performance of a bladeless fan. However, few studies have explained the relationship between performance and Coanda surface curvature. To gain a qualitative understanding of the effect of curvature on the performance of a bladeless fan, numerical studies are performed in this paper. First, three-dimensional numerical simulation is carried out with the Fluent software. To obtain detailed information on the flow field around the Coanda surface, two-dimensional numerical simulation is also conducted. Five types of Coanda surfaces with different curvatures are designed, and their flow behaviour and performance are analyzed and compared with those of the prototype. The analysis indicates that the curvature of the Coanda surface is strongly related to blowing performance, and an optimal curvature exists among the studied models. The simulation results show a distinctive low-pressure region: with increasing curvature in the Y direction, several low-pressure regions gradually enlarge, begin to merge, and finally form a large area of low pressure. From analyses of streamlines and velocity angle, it is found that the magnitude of the curvature affects the flow direction, and a suitable curvature can induce fluid flow close to the wall. As a result, the curvature of the streamlines remains consistent with that of the Coanda surface, and the fluid moves in the most favourable direction. This study provides useful information for performance improvements of bladeless fans.

  20. Study on lockage safety of LNG-fueled ships based on FSA.

    PubMed

    Lv, Pengfei; Zhuang, Yuan; Deng, Jian; Su, Wei

    2017-01-01

    In the present study, formal safety assessment (FSA) is introduced to investigate the lockage safety of LNG-fueled ships. Risk sources during lockage of LNG-fueled ships in four typical scenarios, namely navigation between two dams, lockage, anchorage, and fueling, are identified and studied in combination with fundamental leakage probabilities of various LNG storage tank components and with simulation results of accident consequences. Some suggestions for the lockage safety management of LNG-fueled ships are then proposed. The present research results have practical significance for promoting applications of LNG-fueled ships along the Chuanjiang River and in the Three Gorges Reservoir Region.

  1. Simulation-driven machine learning: Bearing fault classification

    NASA Astrophysics Data System (ADS)

    Sobie, Cameron; Freitas, Carina; Nicolai, Mike

    2018-01-01

    Increasing the accuracy of mechanical fault detection has the potential to improve system safety and economic performance by minimizing scheduled maintenance and the probability of unexpected system failure. Advances in computational performance have enabled the application of machine learning algorithms across numerous applications including condition monitoring and failure detection. Past applications of machine learning to physical failure have relied explicitly on historical data, which limits the feasibility of this approach to in-service components with extended service histories. Furthermore, recorded failure data is often only valid for the specific circumstances and components for which it was collected. This work directly addresses these challenges for roller bearings with race faults by generating training data using information gained from high resolution simulations of roller bearing dynamics, which is used to train machine learning algorithms that are then validated against four experimental datasets. Several different machine learning methodologies are compared starting from well-established statistical feature-based methods to convolutional neural networks, and a novel application of dynamic time warping (DTW) to bearing fault classification is proposed as a robust, parameter free method for race fault detection.
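Dynamic time warping, whose application to fault classification the abstract proposes, is straightforward to sketch. The vibration signatures below are invented stand-ins (a carrier tone with short amplitude bursts standing in for a race-fault signature), not the paper's simulated bearing data, and the 1-nearest-neighbour rule is a minimal, parameter-free classifier in the spirit of the method:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW with absolute-difference local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(signal, templates):
    """1-nearest-neighbour classification: nearest template under DTW."""
    return min(templates, key=lambda label: dtw_distance(signal, templates[label]))

# Toy vibration signatures: a race fault adds short amplitude bursts to the carrier
t = np.linspace(0, 1, 100)
def fault_signal(t):
    return np.sin(2 * np.pi * 5 * t) * (1 + 0.8 * (np.sin(2 * np.pi * 20 * t) > 0.9))

templates = {"healthy": np.sin(2 * np.pi * 5 * t), "race_fault": fault_signal(t)}
measured = fault_signal(t - 0.02)    # same fault signature, slightly shifted in time
print(classify(measured, templates))
```

Because DTW aligns the series elastically, the time-shifted fault signal still matches its template closely, which is exactly the robustness property that motivates DTW over pointwise distances here.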

  2. The Perception and Costs of the Interview Process for Plastic Surgery Residency Programs: Can the Process Be Streamlined?

    PubMed

    Susarla, Srinivas M; Swanson, Edward W; Slezak, Sheri; Lifchez, Scott D; Redett, Richard J

    2017-01-01

    The purpose of this study was to assess applicant perceptions and costs associated with the interview process for plastic surgery residency positions. This was a cross-sectional survey of applicants to the integrated- and independent-track residencies at the authors' institution. All applicants who were interviewed were invited to complete a Web-based survey on costs and perceptions of various components of the interview process. Descriptive and bivariate statistics were computed to compare applicants to the two program tracks. Fifty-three applicants were interviewed for residency positions; 48 completed a survey (90.5 percent response rate). Thirty-four applicants were candidates for the integrated program; 16 applicants were candidates for the independent program. The program spent $2763 per applicant interviewed; 63 percent of applicants spent more than $5000 on the interview process. More than 70 percent of applicants missed more than 7 days of work to attend interviews. Independent applicants felt less strongly that interviews were critical to the selection process and placed less value on physically visiting the hospital and direct, in-person interaction. Applicants placed little value on program informational talks. Applicants who had experience with virtual interviews felt more positively about the format of a video interview relative to those who did not. The residency interview process is resource intensive for programs and applicants. Removing informational talks may improve the process. Making physical tours and in-person interviews optional are other alternatives that merit future study.

  3. Nutrition for Health and Performance, 2000: Nutritional Guidance for Military Operations in Temperate and Extreme Environments.

    DTIC Science & Technology

    1999-12-01

    processed, high-salt foods (read food labels). • If You Drink Alcoholic Beverages, Do So in Moderation - Alcoholic beverages supply calories but...military. Rations are made from "real foods" (commercially grown and processed). Commercial brand name foods and military ration items are often...designed to simplify and streamline the process of providing group meals in the field by integrating components of A-Rations, and Heat & Serve (H & S

  4. Bioprocess considerations for expanded-bed chromatography of crude canola extract: sample preparation and adsorbent reuse.

    PubMed

    Bai, Yun; Glatz, Charles E

    2003-03-30

    Compared to the conventional microbial and mammalian systems, transgenic plants produce proteins in a different matrix. This provides opportunities and challenges for downstream processing. In the context of the plant host Brassica napus (canola), this work addresses the bioprocessing challenges of solid fractionation, resin fouling by native plant components (e.g., oil, phenolics, etc.), hydrodynamic stability, and resin reuse for expanded bed adsorption for product capture. Plant tissue processing and subsequent protein extraction typically result in an extract with a high content of solids containing a wide particle-size distribution. Without removal of larger particles, the column inlet distributor plugged. The larger particles (>50 μm) were easily removed through centrifugal settling comparable to that attainable with a scroll decanter. The remaining solids did not affect the column performance. Less than 4% of the lipids and phenolics in the fed extract bound to STREAMLINE™ DEAE resin, and this small proportion could be satisfactorily removed using recommended clean-in-place (CIP) procedures. Hydrodynamic expansion and adsorption kinetics of the STREAMLINE™ DEAE resin were maintained throughout 10 cycles of reuse, as was the structural integrity of the resin beads. No significant accumulation of N-rich (e.g., proteins) and C/O-rich components (e.g., oil and phenolics) occurred over the same period. Copyright 2003 Wiley Periodicals Inc. Biotechnol Bioeng 81: 775-782, 2003.

  5. Pros and Cons: A Balanced View of Robotics in Knee Arthroplasty.

    PubMed

    Lonner, Jess H; Fillingham, Yale A

    2018-07-01

    In both unicompartmental knee arthroplasty (UKA) and total knee arthroplasty (TKA), robotic technology has been shown, compared with conventional techniques, to optimize the precision of bone preparation and component alignment, reducing outliers and increasing the percentage of components aligned within 2° or 3° of the target goal. In addition, soft tissue balance can be quantified through a range of motion in UKA and TKA using the various robotic technologies available. Although the presumption has been that the improved alignment associated with robotics will improve function and implant durability, there are limited data to support that notion. Based on recent and emerging data, it may be unreasonable to presume that robotics is necessary for both UKA and TKA. In fact, despite improvements in various proxy measures, the precision of robotics may be more important for UKA than TKA, although if system costs and surgical efficiencies continue to improve, streamlining perioperative processes, reducing instrument inventory, and achieving comparable outcomes in TKA may be a reasonable goal of robotic surgery. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Time for a second look at SOX compliance.

    PubMed

    Bigalke, John T; Burrill, Stephen J

    2007-08-01

    Incentives for tax-exempt healthcare organizations to comply with the Sarbanes-Oxley Act (SOX) abound from many quarters, including government, various associations, and the capital markets. New proposals from the Securities and Exchange Commission and the Public Company Accountability Oversight Board will streamline the processes of SOX compliance, even as the cost of compliance is dropping. Voluntary SOX compliance can best be achieved by adopting a four-phased control rationalization approach to implementation and maintenance.

  7. Handbook for Conducting Analysis of the Manpower, Personnel, and Training Elements for a MANPRINT Assessment

    DTIC Science & Technology

    1991-04-01

    Traditional LCSMM and the Streamlined LCSMM. The Traditional LCSMM is divided into four phases: Concept Exploration, Demonstration and Validation, Full-Scale...to go faster. Whether that person is a typist, pianist or rifleman, his accuracy is nearly always decreased.) Figure 11 provides a graphic...each). Organizations are divided into two levels: primary and secondary. 44 Table 4 EXAMPLES OF MPT QUESTIONS USED FOR ANALYSIS OF RPV MANPOWER

  8. Stabilizers influence drug–polymer interactions and physicochemical properties of disulfiram-loaded poly-lactide-co-glycolide nanoparticles

    PubMed Central

    Hoda, Muddasarul; Sufi, Shamim Akhtar; Cavuturu, Bindumadhuri; Rajagopalan, Rukkumani

    2018-01-01

    Aim: Stabilizers are known to be an integral component of polymeric nanostructures. Ideally, they manipulate physicochemical properties of nanoparticles. Based on this hypothesis, we demonstrated that disulfiram (drug) and poly-lactide-co-glycolide (polymer) interactions and physicochemical properties of their nanoparticle formulations are significantly influenced by the choice of stabilizers. Methodology: Electron microscopy, differential scanning calorimetry, X-ray diffraction, Raman spectrum analysis, isothermal titration calorimetry and in silico docking studies were performed. Results & discussion: Polysorbate 80 imparted the highest crystallinity while Triton-X 100 imparted the highest rigidity, possibly influencing drug bioavailability, blood-retention time, cellular uptake and sustained drug release. All the molecular interactions were hydrophobic in nature and entropy driven. Therefore, polymeric nanoparticles may be critically manipulated to streamline the passive targeting of drug-loaded nanoparticles. PMID:29379637

  9. Comparison of large-scale human brain functional and anatomical networks in schizophrenia.

    PubMed

    Nelson, Brent G; Bassett, Danielle S; Camchong, Jazmin; Bullmore, Edward T; Lim, Kelvin O

    2017-01-01

    Schizophrenia is a disease with disruptions in thought, emotion, and behavior. The dysconnectivity hypothesis suggests these disruptions are due to aberrant brain connectivity. Many studies have identified connectivity differences but few have been able to unify gray and white matter findings into one model. Here we develop an extension of the Network-Based Statistic (NBS) called NBSm (Multimodal Network-based statistic) to compare functional and anatomical networks in schizophrenia. Structural, resting functional, and diffusion magnetic resonance imaging data were collected from 29 chronic patients with schizophrenia and 29 healthy controls. Images were preprocessed, and average time courses were extracted for 90 regions of interest (ROI). Functional connectivity matrices were estimated by pairwise correlations between wavelet coefficients of ROI time series. Following diffusion tractography, anatomical connectivity matrices were estimated by white matter streamline counts between each pair of ROIs. Global and regional strength were calculated for each modality. NBSm was used to find significant overlap between functional and anatomical components that distinguished health from schizophrenia. Global strength was decreased in patients in both functional and anatomical networks. Regional strength was decreased in all regions in functional networks and only one region in anatomical networks. NBSm identified a distinguishing functional component consisting of 46 nodes with 113 links (p < 0.001), a distinguishing anatomical component with 47 nodes and 50 links (p = 0.002), and a distinguishing intermodal component with 26 nodes (p < 0.001). NBSm is a powerful technique for understanding network-based group differences present in both anatomical and functional data. In light of the dysconnectivity hypothesis, these results provide compelling evidence for the presence of significant overlapping anatomical and functional disruption in people with schizophrenia.
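The connectivity-strength pipeline described above can be sketched minimally, with random data standing in for the wavelet coefficients of real fMRI time series (the ROI count mirrors the abstract; the series length and everything else are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n_rois, n_obs = 90, 200

# Stand-in ROI time series (the study correlates wavelet coefficients of real data)
ts = rng.standard_normal((n_obs, n_rois))
fc = np.corrcoef(ts, rowvar=False)       # 90 x 90 functional connectivity matrix
np.fill_diagonal(fc, 0.0)                # ignore self-connections

regional_strength = np.abs(fc).sum(axis=0)   # one strength value per region
global_strength = regional_strength.mean()
print(round(global_strength, 3))
```

Comparing `regional_strength` and `global_strength` between patient and control groups, and then testing connected components of the difference network for significance, is the step the NBSm extension performs jointly across the functional and anatomical matrices.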

  10. Simulation of data safety components for corporative systems

    NASA Astrophysics Data System (ADS)

    Yaremko, Svetlana A.; Kuzmina, Elena M.; Savchuk, Tamara O.; Krivonosov, Valeriy E.; Smolarz, Andrzej; Arman, Abenov; Smailova, Saule; Kalizhanova, Aliya

    2017-08-01

    The article presents research on the design of data safety components for corporate systems by means of mathematical simulation and modern information technologies. Threat ranks are simulated based on defined values of the data components. Safety policy rules for corporate information systems are presented, and ways of implementing these rules are proposed on the basis of the stated conditions and the appropriate class of valuable data protection.

  11. Galactic Angular Momentum in Cosmological Zoom-in Simulations. I. Disk and Bulge Components and the Galaxy-Halo Connection

    NASA Astrophysics Data System (ADS)

    Sokołowska, Aleksandra; Capelo, Pedro R.; Fall, S. Michael; Mayer, Lucio; Shen, Sijing; Bonoli, Silvia

    2017-02-01

    We investigate the angular momentum evolution of four disk galaxies residing in Milky-Way-sized halos formed in cosmological zoom-in simulations with various sub-grid physics and merging histories. We decompose these galaxies, kinematically and photometrically, into their disk and bulge components. The simulated galaxies and their components lie on the observed sequences in the j*-M* diagram, relating the specific angular momentum and mass of the stellar component. We find that galaxies in low-density environments follow the relation j* ∝ M*^α past major mergers, with α ≈ 0.6 in the case of strong feedback, when bulge-to-disk ratios are relatively constant, and α ≈ 1.4 in the other cases, when secular processes operate on shorter timescales. We compute the retention factors (i.e., the ratio of the specific angular momenta of stars and dark matter) for both disks and bulges and show that they vary relatively slowly after averaging over numerous but brief fluctuations. For disks, the retention factors are usually close to unity, while for bulges, they are a few times smaller. Our simulations therefore indicate that galaxies and their halos grow in a quasi-homologous way.
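The specific angular momentum j* and the retention factor discussed above reduce to simple particle sums. The toy check below is illustrative only (units, particle setup, and the halo value are assumed, not taken from the simulations): for particles on a circular orbit of radius r and speed v, j* comes out to exactly r·v.

```python
import numpy as np

def specific_angular_momentum(m, pos, vel):
    """j = |sum_i m_i (r_i x v_i)| / sum_i m_i for a set of particles."""
    L = np.sum(m[:, None] * np.cross(pos, vel), axis=0)
    return np.linalg.norm(L) / m.sum()

# Toy check: particles on a circular orbit of radius r with speed v have j = r*v
n = 100
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
r, v = 8.0, 220.0                    # kpc and km/s (illustrative disk-like values)
pos = r * np.column_stack([np.cos(theta), np.sin(theta), np.zeros(n)])
vel = v * np.column_stack([-np.sin(theta), np.cos(theta), np.zeros(n)])
m = np.ones(n)

j_star = specific_angular_momentum(m, pos, vel)
j_halo = 2.0 * j_star                # hypothetical dark-matter value, for illustration
print(j_star, j_star / j_halo)       # the ratio is the retention factor
```

Applied to the star particles of a kinematically decomposed disk or bulge and to the dark-matter particles of its halo, the same function yields the two quantities whose ratio defines the retention factor.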

  12. Multi-faceted informatics system for digitising and streamlining the reablement care model.

    PubMed

    Bond, Raymond R; Mulvenna, Maurice D; Finlay, Dewar D; Martin, Suzanne

    2015-08-01

    Reablement is a new paradigm for increasing independence in the home among the ageing population, and it remains a challenge to design an optimal electronic system to streamline and integrate reablement into current healthcare infrastructure. Furthermore, given that reablement requires collaboration with a range of organisations (including national healthcare institutions and community/voluntary service providers), such a system needs to be co-created with all stakeholders involved. Thus, the purpose of this study is (1) to bring together stakeholder groups to elicit a comprehensive set of requirements for a digital reablement system, (2) to utilise emerging technologies to implement a system and a data model based on the requirements gathered, and (3) to involve user groups in a usability assessment of the system. In this study we employed a mixed qualitative approach that included a series of stakeholder-involved activities. Collectively, 73 subjects were recruited to participate in an ideation event, a quasi-hackathon, and a usability study. The study unveiled stakeholder-led requirements, which resulted in a novel cloud-based system that was created using emerging web technologies. The system is driven by a unique data model and includes interactive features that are necessary for streamlining the reablement care model. In summary, this system allows community-based interventions (or services) to be prescribed to occupants while also monitoring the occupant's progress toward independent living. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Visualizing whole-brain DTI tractography with GPU-based Tuboids and LoD management.

    PubMed

    Petrovic, Vid; Fallon, James; Kuester, Falko

    2007-01-01

    Diffusion Tensor Imaging (DTI) of the human brain, coupled with tractography techniques, enables the extraction of large collections of three-dimensional tract pathways per subject. These pathways and pathway bundles represent the connectivity between different brain regions and are critical for the understanding of brain related diseases. A flexible and efficient GPU-based rendering technique for DTI tractography data is presented that addresses common performance bottlenecks and image-quality issues, allowing interactive render rates to be achieved on commodity hardware. An occlusion query-based pathway LoD management system for streamlines/streamtubes/tuboids is introduced that optimizes input geometry, vertex processing, and fragment processing loads, and helps reduce overdraw. The tuboid, a fully-shaded streamtube impostor constructed entirely on the GPU from streamline vertices, is also introduced. Unlike full streamtubes and other impostor constructs, tuboids require little to no preprocessing or extra space over the original streamline data. The supported fragment processing levels of detail range from texture-based draft shading to full raycast normal computation, Phong shading, environment mapping, and curvature-correct text labeling. The presented text labeling technique for tuboids provides adaptive, aesthetically pleasing labels that appear attached to the surface of the tubes. Furthermore, an occlusion query aggregating and scheduling scheme for tuboids is described that reduces the query overhead. Results for a tractography dataset are presented, and demonstrate that LoD-managed tuboids offer benefits over traditional streamtubes both in performance and appearance.

  14. Skylab

    NASA Image and Video Library

    1971-11-01

    The Apollo Telescope Mount (ATM), designed and developed by the Marshall Space Flight Center, was one of four major components comprising the Skylab (1973-1979). The ATM housed the first manned scientific telescope in space. This photograph shows the ATM rigged for altitude and space simulation tests at the Space Environment Simulation Laboratory of the Manned Spacecraft Center (MSC). The MSC was renamed the Johnson Space Center (JSC) in early 1973.

  15. SolarTherm: A flexible Modelica-based simulator for CSP systems

    NASA Astrophysics Data System (ADS)

    Scott, Paul; Alonso, Alberto de la Calle; Hinkley, James T.; Pye, John

    2017-06-01

    Annual performance simulations provide a valuable tool for analysing the viability and overall impact of different concentrating solar power (CSP) component and system designs. However, existing tools work best with conventional systems and are difficult or impossible to adapt when novel components, configurations and operating strategies are of interest. SolarTherm is a new open source simulation tool that fulfils this need for the solar community. It includes a simulation framework and a library of flexible CSP components and control strategies that can be adapted or replaced with new designs to meet the special needs of end users. This paper provides an introduction to SolarTherm and a comparison of models for an energy-based trough system and a physical tower system to those in the well-established and widely-used simulator SAM. Differences were found in some components where the inner workings of SAM are undocumented or not well understood, while the other parts show strong agreement. These results help to validate the fundamentals of SolarTherm and demonstrate that, while at an early stage of development, it is already a useful tool for performing annual simulations.

  16. A Low-Cost Simulation Model for R-Wave Synchronized Atrial Pacing in Pediatric Patients with Postoperative Junctional Ectopic Tachycardia

    PubMed Central

    Michel, Miriam; Egender, Friedemann; Heßling, Vera; Dähnert, Ingo; Gebauer, Roman

    2016-01-01

    Background Postoperative junctional ectopic tachycardia (JET) occurs frequently after pediatric cardiac surgery. R-wave synchronized atrial (AVT) pacing is used to re-establish atrioventricular synchrony. AVT pacing is complex, with technical pitfalls. We sought to establish and to test a low-cost simulation model suitable for training and analysis in AVT pacing. Methods A simulation model was developed based on a JET simulator, a simulation doll, a cardiac monitor, and a pacemaker. A computer program simulated electrocardiograms. Ten experienced pediatric cardiologists tested the model. Their performance was analyzed using a testing protocol with 10 working steps. Results Four testers found the simulation model realistic; 6 found it very realistic. Nine claimed that the trial had improved their skills. All testers considered the model useful in teaching AVT pacing. The simulation test identified 5 working steps in which major mistakes in performance may impede safe and effective AVT pacing, thus permitting specific training. The components of the model (excluding monitor and pacemaker) cost less than $50. Assembly and training-session expenses were trivial. Conclusions A realistic, low-cost simulation model of AVT pacing is described. The model is suitable for teaching and analyzing AVT pacing technique. PMID:26943363

  17. Design, Performance, and Operation of Efficient Ramjet/Scramjet Combined Cycle Hypersonic Propulsion

    DTIC Science & Technology

    2009-10-16

    simulations, the blending of the RANS and LES portions is handled by the standard DES equations, now referred to as DES97. The one-equation Spalart...think that RANS can capture these dynamics. • Much remains to be learned about how to model chemistry-turbulence interactions in scramjet flows...BILLIG, F. S., R. BAURLE, AND C. TAM 1999 Design and Analysis of Streamline Traced Hypersonic Inlets. AIAA Paper 1999-4974. BILLIG, F.S., AND

  18. Numerical Simulation of Sintering Process in Ceramic Powder Injection Moulded Components

    NASA Astrophysics Data System (ADS)

    Song, J.; Barriere, T.; Liu, B.; Gelin, J. C.

    2007-05-01

    A phenomenological model based on a viscoplastic constitutive law is presented to describe the sintering process of ceramic components obtained by powder injection moulding. The parameters entering the model are identified through dilatometer sintering experiments using the proposed optimization method. Finite element simulations are carried out to predict the density variations and dimensional changes of the components during sintering. A simulation example on the sintering of an alumina hip implant has been conducted. The simulation results have been compared with the experimental ones, and good agreement is obtained.

  19. Virtual reality in ophthalmology training.

    PubMed

    Khalifa, Yousuf M; Bogorad, David; Gibson, Vincent; Peifer, John; Nussbaum, Julian

    2006-01-01

    Current training models are limited by an unstructured curriculum, financial costs, human costs, and time constraints. With the newly mandated resident surgical competency, training programs are struggling to find viable methods of assessing and documenting the surgical skills of trainees. Virtual-reality technologies have been used for decades in flight simulation to train and assess competency, and there has been a recent push in surgical specialties to incorporate virtual-reality simulation into residency programs. These efforts have culminated in an FDA-approved carotid stenting simulator. What role virtual reality will play in the evolution of ophthalmology surgical curriculum is uncertain. The current apprentice system has served the art of surgery for over 100 years, and we foresee virtual reality working synergistically with our current curriculum modalities to streamline and enhance the resident's learning experience.

  20. Users manual for the NASA Lewis three-dimensional ice accretion code (LEWICE 3D)

    NASA Technical Reports Server (NTRS)

    Bidwell, Colin S.; Potapczuk, Mark G.

    1993-01-01

    A description of the methodology, the algorithms, and the input and output data along with an example case for the NASA Lewis 3D ice accretion code (LEWICE3D) has been produced. The manual has been designed to help the user understand the capabilities, the methodologies, and the use of the code. The LEWICE3D code is a conglomeration of several codes for the purpose of calculating ice shapes on three-dimensional external surfaces. A three-dimensional external flow panel code is incorporated which has the capability of calculating flow about arbitrary 3D lifting and nonlifting bodies with external flow. A fourth order Runge-Kutta integration scheme is used to calculate arbitrary streamlines. An Adams type predictor-corrector trajectory integration scheme has been included to calculate arbitrary trajectories. Schemes for calculating tangent trajectories, collection efficiencies, and concentration factors for arbitrary regions of interest for single droplets or droplet distributions have been incorporated. A LEWICE 2D based heat transfer algorithm can be used to calculate ice accretions along surface streamlines. A geometry modification scheme is incorporated which calculates the new geometry based on the ice accretions generated at each section of interest. The three-dimensional ice accretion calculation is based on the LEWICE 2D calculation. Both codes calculate the flow, pressure distribution, and collection efficiency distribution along surface streamlines. For both codes the heat transfer calculation is divided into two regions, one above the stagnation point and one below the stagnation point, and solved for each region assuming a flat plate with pressure distribution. Water is assumed to follow the surface streamlines, hence starting at the stagnation zone any water that is not frozen out at a control volume is assumed to run back into the next control volume. 
After the amount of frozen water at each control volume has been calculated, the geometry is modified by adding the ice at each control volume in the surface normal direction.
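The fourth-order Runge-Kutta streamline integration the manual describes can be sketched as follows; the velocity field here is an illustrative analytic example (rigid-body rotation), not the panel-code flow solution, and the function names are hypothetical:

```python
# Sketch of fourth-order Runge-Kutta streamline integration of the kind
# LEWICE3D applies to its panel-code flow solution. The velocity field
# passed in is any callable mapping a position to a velocity vector.

def rk4_step(velocity, pos, h):
    """Advance a particle one step of size h through a steady velocity field."""
    k1 = velocity(pos)
    k2 = velocity([p + 0.5 * h * k for p, k in zip(pos, k1)])
    k3 = velocity([p + 0.5 * h * k for p, k in zip(pos, k2)])
    k4 = velocity([p + h * k for p, k in zip(pos, k3)])
    return [p + (h / 6.0) * (a + 2 * b + 2 * c + d)
            for p, a, b, c, d in zip(pos, k1, k2, k3, k4)]

def trace_streamline(velocity, seed, h=0.01, steps=100):
    """Integrate a streamline from a seed point; returns the list of positions."""
    path = [list(seed)]
    for _ in range(steps):
        path.append(rk4_step(velocity, path[-1], h))
    return path
```

For the rotational field (u, v) = (-y, x) the traced path should stay on a circle, which makes a convenient accuracy check for the scheme.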

  1. ACHP | News | Nationwide Programmatic Agreement Streamlines 106 Process for NPS

    Science.gov Websites

    On November 14, 2008, the National Park Service (NPS) executed a nationwide Programmatic Agreement (PA) with the Advisory Council on Historic Preservation that streamlines the Section 106 process for the NPS.

  2. Enhanced Performance of Streamline-Traced External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2015-01-01

    A computational design study was conducted to enhance the aerodynamic performance of streamline-traced, external-compression inlets for Mach 1.6. The current study explored a new parent flowfield for the streamline tracing and several variations of inlet design factors, including the axial displacement and angle of the subsonic cowl lip, the vertical placement of the engine axis, and the use of porous bleed in the subsonic diffuser. The performance was enhanced over that of an earlier streamline-traced inlet such as to increase the total pressure recovery and reduce total pressure distortion.

  3. Enhanced Performance of Streamline-Traced External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2015-01-01

    A computational design study was conducted to enhance the aerodynamic performance of streamline-traced, external-compression inlets for Mach 1.6. Compared to traditional external-compression, two-dimensional and axisymmetric inlets, streamline-traced inlets promise reduced cowl wave drag and sonic boom, but at the expense of reduced total pressure recovery and increased total pressure distortion. The current study explored a new parent flowfield for the streamline tracing and several variations of inlet design factors, including the axial displacement and angle of the subsonic cowl lip, the vertical placement of the engine axis, and the use of porous bleed in the subsonic diffuser. The performance was enhanced over that of an earlier streamline-traced inlet such as to increase the total pressure recovery and reduce total pressure distortion.

  4. High-throughput, 384-well, LC-MS/MS CYP inhibition assay using automation, cassette-analysis technique, and streamlined data analysis.

    PubMed

    Halladay, Jason S; Delarosa, Erlie Marie; Tran, Daniel; Wang, Leslie; Wong, Susan; Khojasteh, S Cyrus

    2011-08-01

    Here we describe a high-capacity, high-throughput, automated, 384-well CYP inhibition assay using well-known HLM-based MS probes. We provide consistently robust IC50 values at the lead optimization stage of the drug discovery process. Our method uses the Agilent Technologies/Velocity11 BioCel 1200 system, timesaving techniques for sample analysis, and streamlined data processing steps. For each experiment, we generate IC50 values for up to 344 compounds and positive controls for five major CYP isoforms (probe substrate): CYP1A2 (phenacetin), CYP2C9 ((S)-warfarin), CYP2C19 ((S)-mephenytoin), CYP2D6 (dextromethorphan), and CYP3A4/5 (testosterone and midazolam). Each compound is incubated separately at four concentrations with each CYP probe substrate under the optimized incubation condition. Each incubation is quenched with acetonitrile containing the deuterated internal standard of the respective metabolite for each probe substrate. To minimize the number of samples to be analyzed by LC-MS/MS and reduce the amount of valuable MS runtime, we utilize the timesaving techniques of cassette analysis (pooling the incubation samples at the end of each CYP probe incubation into one) and column switching (reducing the amount of MS runtime). Here we also report on the comparison of IC50 results for five major CYP isoforms using our method compared to values reported in the literature.
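As an illustration of the kind of IC50 determination the assay automates from four inhibitor concentrations, the sketch below estimates IC50 by log-linear interpolation between the two tested concentrations that bracket 50% of control activity; the function name and example data are illustrative assumptions, not the actual data-processing code:

```python
import math

def ic50_from_points(concs, pct_activity):
    """Estimate IC50 by log-linear interpolation between the two tested
    concentrations bracketing 50% remaining enzyme activity.
    concs: inhibitor concentrations, ascending (units carry to the result).
    pct_activity: percent of control CYP activity at each concentration."""
    for (c_lo, a_lo), (c_hi, a_hi) in zip(zip(concs, pct_activity),
                                          zip(concs[1:], pct_activity[1:])):
        if a_lo >= 50.0 >= a_hi:
            # Interpolate in log-concentration space, where dose-response
            # curves are closest to linear around the midpoint.
            frac = (a_lo - 50.0) / (a_lo - a_hi)
            log_ic50 = math.log10(c_lo) + frac * (math.log10(c_hi) - math.log10(c_lo))
            return 10 ** log_ic50
    return None  # 50% activity not bracketed by the tested range
```

A four-point curve such as 90/70/30/10% activity at 0.1/1/10/100 µM brackets 50% between 1 and 10 µM, giving an IC50 of about 3.2 µM. Production workflows would typically fit a four-parameter logistic model instead, but the interpolation conveys the idea.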

  5. Optimizing liquid effluent monitoring at a large nuclear complex.

    PubMed

    Chou, Charissa J; Barnett, D Brent; Johnson, Vernon G; Olson, Phil M

    2003-12-01

    Effluent monitoring typically requires a large number of analytes and samples during the initial or startup phase of a facility. Once a baseline is established, the analyte list and sampling frequency may be reduced. Although there is a large body of literature relevant to the initial design, few, if any, published papers exist on updating established effluent monitoring programs. This paper statistically evaluates four years of baseline data to optimize the liquid effluent monitoring efficiency of a centralized waste treatment and disposal facility at a large defense nuclear complex. Specific objectives were to: (1) assess temporal variability in analyte concentrations, (2) determine operational factors contributing to waste stream variability, (3) assess the probability of exceeding permit limits, and (4) streamline the sampling and analysis regime. Results indicated that the probability of exceeding permit limits was one in a million under normal facility operating conditions, sampling frequency could be reduced, and several analytes could be eliminated. Furthermore, indicators such as gross alpha and gross beta measurements could be used in lieu of more expensive specific isotopic analyses (radium, cesium-137, and strontium-90) for routine monitoring. Study results were used by the state regulatory agency to modify monitoring requirements for a new discharge permit, resulting in an annual cost savings of US$223,000. This case study demonstrates that statistical evaluation of effluent contaminant variability coupled with process knowledge can help plant managers and regulators streamline analyte lists and sampling frequencies based on detection history and environmental risk.

  6. The RAVEN Toolbox and Its Use for Generating a Genome-scale Metabolic Model for Penicillium chrysogenum

    PubMed Central

    Agren, Rasmus; Liu, Liming; Shoaie, Saeed; Vongsangnak, Wanwipa; Nookaew, Intawat; Nielsen, Jens

    2013-01-01

    We present the RAVEN (Reconstruction, Analysis and Visualization of Metabolic Networks) Toolbox: a software suite that allows for semi-automated reconstruction of genome-scale models. It makes use of published models and/or the KEGG database, coupled with extensive gap-filling and quality control features. The software suite also contains methods for visualizing simulation results and omics data, as well as a range of methods for performing simulations and analyzing the results. The software is a useful tool for system-wide data analysis in a metabolic context and for streamlined reconstruction of metabolic networks based on protein homology. The RAVEN Toolbox workflow was applied in order to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin54-1255. The model was validated in a bibliomic study of 440 references in total, and it comprises 1471 unique biochemical reactions and 1006 ORFs. It was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximization of penicillin production. PMID:23555215

  7. 3-D CFD Simulation and Validation of Oxygen-Rich Hydrocarbon Combustion in a Gas-Centered Swirl Coaxial Injector using a Flamelet-Based Approach

    NASA Technical Reports Server (NTRS)

    Richardson, Brian; Kenny, Jeremy

    2015-01-01

    Injector design is a critical part of the development of a rocket Thrust Chamber Assembly (TCA). Proper detailed injector design can maximize propulsion efficiency while minimizing the potential for failures in the combustion chamber. Traditional design and analysis methods for hydrocarbon-fuel injector elements are based heavily on empirical data and models developed from heritage hardware tests. Using this limited set of data produces challenges when trying to design a new propulsion system where the operating conditions may greatly differ from heritage applications. Time-accurate, Three-Dimensional (3-D) Computational Fluid Dynamics (CFD) modeling of combusting flows inside of injectors has long been a goal of the fluid analysis group at Marshall Space Flight Center (MSFC) and the larger CFD modeling community. CFD simulation can provide insight into the design and function of an injector that cannot be obtained easily through testing or empirical comparisons to existing hardware. However, the traditional finite-rate chemistry modeling approach utilized to simulate combusting flows for complex fuels, such as Rocket Propellant-2 (RP-2), is prohibitively expensive and time consuming even with a large amount of computational resources. MSFC has been working, in partnership with Streamline Numerics, Inc., to develop a computationally efficient, flamelet-based approach for modeling complex combusting flow applications. In this work, a flamelet modeling approach is used to simulate time-accurate, 3-D, combusting flow inside a single Gas Centered Swirl Coaxial (GCSC) injector using the flow solver, Loci-STREAM. CFD simulations were performed for several different injector geometries. Results of the CFD analysis helped guide the design of the injector from an initial concept to a tested prototype. 
The results of the CFD analysis are compared to data gathered from several hot-fire, single element injector tests performed in the Air Force Research Lab EC-1 test facility located at Edwards Air Force Base.

  8. Streamlined genetic education is effective in preparing women newly diagnosed with breast cancer for decision making about treatment-focused genetic testing: a randomized controlled noninferiority trial.

    PubMed

    Quinn, Veronica F; Meiser, Bettina; Kirk, Judy; Tucker, Kathy M; Watts, Kaaren J; Rahman, Belinda; Peate, Michelle; Saunders, Christobel; Geelhoed, Elizabeth; Gleeson, Margaret; Barlow-Stewart, Kristine; Field, Michael; Harris, Marion; Antill, Yoland C; Cicciarelli, Linda; Crowe, Karen; Bowen, Michael T; Mitchell, Gillian

    2017-04-01

    Increasingly, women newly diagnosed with breast cancer are being offered treatment-focused genetic testing (TFGT). As the demand for TFGT increases, streamlined methods of genetic education are needed. In this noninferiority trial, women aged <50 years with either a strong family history (FH+) or other features suggestive of a germ-line mutation (FH-) were randomized before definitive breast cancer surgery to receive TFGT education either as brief written materials (intervention group (IG)) or during a genetic counseling session at a familial cancer clinic (usual-care group (UCG)). Women completed self-report questionnaires at four time points over 12 months. A total of 135 women were included in the analysis, all of whom opted for TFGT. Decisional conflict about TFGT choice (primary outcome) was not inferior in the IG compared with the UCG (noninferiority margin of -10; mean difference = 2.45; 95% confidence interval -2.87 to 7.76; P = 0.36). Costs per woman counseled in the IG were significantly lower (AUD$89) compared with the UCG (AUD$173; t(115) = 6.02; P < 0.001). A streamlined model of educating women newly diagnosed with breast cancer about TFGT seems to be a cost-effective way of delivering education while ensuring that women feel informed and supported in their decision making, thus freeing resources for other women to access TFGT. Genet Med 19(4):448-456.

  9. Method based on the Laplace equations to reconstruct the river terrain for two-dimensional hydrodynamic numerical modeling

    NASA Astrophysics Data System (ADS)

    Lai, Ruixun; Wang, Min; Yang, Ming; Zhang, Chao

    2018-02-01

    The accuracy of the widely-used two-dimensional hydrodynamic numerical model depends on the quality of the river terrain model, particularly in the main channel. However, in most cases, the bathymetry of the river channel is difficult or expensive to obtain in the field, and there is a lack of available data to describe the geometry of the river channel. We introduce a method that originates from the grid generation with the elliptic equation to generate streamlines of the river channel. The streamlines are numerically solved with the Laplace equations. In the process, streamlines in the physical domain are first computed in a computational domain, and then transformed back to the physical domain. The interpolated streamlines are integrated with the surrounding topography to reconstruct the entire river terrain model. The approach was applied to a meandering reach in the Qinhe River, which is a tributary in the middle of the Yellow River, China. Cross-sectional validation and the two-dimensional shallow-water equations are used to test the performance of the river terrain generated. The results show that the approach can reconstruct the river terrain using the data from measured cross-sections. Furthermore, the created river terrain can maintain a geometrical shape consistent with the measurements, while generating a smooth main channel. Finally, several limitations and opportunities for future research are discussed.
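A minimal sketch of the core idea: solving the Laplace equation on a grid by Jacobi iteration, so that interior values (e.g. bed elevations or streamline coordinates between surveyed cross-sections) vary smoothly between fixed boundary data. The discretization and function name are assumptions for illustration, not the authors' implementation:

```python
def solve_laplace(grid, fixed, iters=500):
    """Jacobi iteration for the 2-D Laplace equation on a rectangular grid.
    grid:  list of lists of values in the computational domain.
    fixed: set of (i, j) cells held at their given values, e.g. measured
           cross-sections and bank lines; all other interior cells relax
           to the average of their four neighbours (a discrete harmonic
           function, hence smooth interpolation)."""
    ni, nj = len(grid), len(grid[0])
    for _ in range(iters):
        new = [row[:] for row in grid]
        for i in range(1, ni - 1):
            for j in range(1, nj - 1):
                if (i, j) not in fixed:
                    new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                        + grid[i][j - 1] + grid[i][j + 1])
        grid = new
    return grid
```

Because the solution of the Laplace equation has no interior extrema, values interpolated this way stay bounded by the surrounding measurements, which is why the generated channel comes out smooth.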

  10. A physics based method for combining multiple anatomy models with application to medical simulation.

    PubMed

    Zhu, Yanong; Magee, Derek; Ratnalingam, Rishya; Kessel, David

    2009-01-01

    We present a physics based approach to the construction of anatomy models by combining components from different sources; different image modalities, protocols, and patients. Given an initial anatomy, a mass-spring model is generated which mimics the physical properties of the solid anatomy components. This helps maintain valid spatial relationships between the components, as well as the validity of their shapes. Combination can be either replacing/modifying an existing component, or inserting a new component. The external forces that deform the model components to fit the new shape are estimated from Gradient Vector Flow and Distance Transform maps. We demonstrate the applicability and validity of the described approach in the area of medical simulation, by showing the processes of non-rigid surface alignment, component replacement, and component insertion.
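A mass-spring model of the kind described computes Hookean restoring forces along springs connecting surface points, which is what keeps combined components in valid spatial relationships as they deform. The following minimal 2-D sketch (names and data illustrative) shows the force computation only, not the full alignment pipeline:

```python
import math

def spring_forces(points, springs, k=1.0):
    """Hookean forces for a mass-spring mesh.
    points:  list of (x, y) node positions.
    springs: list of (i, j, rest_length) connections; a stretched spring
             pulls its endpoints together, a compressed one pushes apart.
    Returns one [fx, fy] force vector per node."""
    forces = [[0.0, 0.0] for _ in points]
    for i, j, rest in springs:
        dx = points[j][0] - points[i][0]
        dy = points[j][1] - points[i][1]
        dist = math.hypot(dx, dy)  # assumes non-coincident endpoints
        f = k * (dist - rest)      # positive = attractive (spring stretched)
        fx, fy = f * dx / dist, f * dy / dist
        forces[i][0] += fx
        forces[i][1] += fy
        forces[j][0] -= fx
        forces[j][1] -= fy
    return forces
```

In the paper's setting these internal forces would be balanced against external forces from the Gradient Vector Flow and Distance Transform maps at each integration step.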

  11. Physical habitat simulation system reference manual: version II

    USGS Publications Warehouse

    Milhous, Robert T.; Updike, Marlys A.; Schneider, Diane M.

    1989-01-01

    There are four major components of a stream system that determine the productivity of the fishery (Karr and Dudley 1978). These are: (1) flow regime, (2) physical habitat structure (channel form, substrate distribution, and riparian vegetation), (3) water quality (including temperature), and (4) energy inputs from the watershed (sediments, nutrients, and organic matter). The complex interaction of these components determines the primary production, secondary production, and fish population of the stream reach. The basic components and interactions needed to simulate fish populations as a function of management alternatives are illustrated in Figure I.1. The assessment process utilizes a hierarchical and modular approach combined with computer simulation techniques. The modular components represent the "building blocks" for the simulation. The quality of the physical habitat is a function of flow and, therefore, varies in quality and quantity over the range of the flow regime. The conceptual framework of the Incremental Methodology and guidelines for its application are described in "A Guide to Stream Habitat Analysis Using the Instream Flow Incremental Methodology" (Bovee 1982). Simulation of physical habitat is accomplished using the physical structure of the stream and streamflow. The modification of physical habitat by temperature and water quality is analyzed separately from physical habitat simulation. Temperature in a stream varies with the seasons, local meteorological conditions, stream network configuration, and the flow regime; thus, the temperature influences on habitat must be analysed on a stream system basis. Water quality under natural conditions is strongly influenced by climate and the geological materials, with the result that there is considerable natural variation in water quality. When we add the activities of man, the possible range of water quality possibilities becomes rather large. 
Consequently, water quality must also be analyzed on a stream system basis. Such analysis is outside the scope of this manual, which concentrates on simulation of physical habitat based on depth, velocity, and a channel index. The results from PHABSIM can be used alone or with a series of habitat time series programs that have been developed to generate monthly or daily habitat time series from the Weighted Usable Area versus streamflow table resulting from the habitat simulation programs and streamflow time series data. Monthly and daily streamflow time series may be obtained from USGS gages near the study site or as the output of river system management models.
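The habitat time series step described above, mapping a streamflow series through the Weighted Usable Area versus streamflow table, can be sketched with simple linear interpolation (a simplification for illustration; the actual PHABSIM programs may interpolate differently, and the table values below are invented):

```python
def habitat_time_series(wua_table, flows):
    """Map a streamflow time series to habitat values by linear
    interpolation in a WUA-versus-discharge table.
    wua_table: list of (discharge, wua) pairs sorted by discharge.
    flows:     monthly or daily discharge series (e.g. from a USGS gage)."""
    def wua_at(q):
        if q <= wua_table[0][0]:
            return wua_table[0][1]
        if q >= wua_table[-1][0]:
            return wua_table[-1][1]
        for (q0, w0), (q1, w1) in zip(wua_table, wua_table[1:]):
            if q0 <= q <= q1:
                return w0 + (w1 - w0) * (q - q0) / (q1 - q0)
    return [wua_at(q) for q in flows]
```

Note that WUA need not increase monotonically with discharge; habitat often peaks at an intermediate flow, which the piecewise table captures naturally.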

  12. RLV/X-33 operations overview

    NASA Astrophysics Data System (ADS)

    Black, Stephen T.; Eshleman, Wally

    1997-01-01

    This paper describes the VentureStar™ SSTO RLV and X-33 operations concepts. Applications of advanced technologies, automated ground support systems, and advanced aircraft and launch vehicle lessons learned have been integrated to develop a streamlined vehicle and mission processing concept necessary to meet the goals of a commercial SSTO RLV. These concepts will be validated by the X-33 flight test program, where financial and technical risk mitigation are required. The X-33 flight test program fully demonstrates the vehicle performance, technology, and efficient ground operations at the lowest possible cost. The Skunk Works' test program approach and the test site's proximity to the production plant are keys. The X-33 integrated flight and ground test program incrementally expands the knowledge base of the overall system, allowing minimum-risk progression to the next flight test program milestone. Subsequent X-33 turnaround processing flows will be performed with an aircraft operations philosophy. The differences will be based on research and development, component reliability, and flight test requirements.

  13. A two-dimensional numerical study of the flow inside the combustion chambers of a motored rotary engine

    NASA Technical Reports Server (NTRS)

    Shih, T. I. P.; Yang, S. L.; Schock, H. J.

    1986-01-01

    A numerical study was performed to investigate the unsteady, multidimensional flow inside the combustion chambers of an idealized, two-dimensional, rotary engine under motored conditions. The numerical study was based on the time-dependent, two-dimensional, density-weighted, ensemble-averaged conservation equations of mass, species, momentum, and total energy valid for two-component ideal gas mixtures. The ensemble-averaged conservation equations were closed by a K-epsilon model of turbulence. This K-epsilon model of turbulence was modified to account for some of the effects of compressibility, streamline curvature, low Reynolds number, and preferential stress dissipation. Numerical solutions to the conservation equations were obtained by the highly efficient implicit-factored method of Beam and Warming. The grid system needed to obtain solutions was generated by an algebraic grid generation technique based on transfinite interpolation. Results of the numerical study are presented in graphical form illustrating the flow patterns during intake, compression, gaseous fuel injection, expansion, and exhaust.

  14. Ub-ISAP: a streamlined UNIX pipeline for mining unique viral vector integration sites from next generation sequencing data.

    PubMed

    Kamboj, Atul; Hallwirth, Claus V; Alexander, Ian E; McCowage, Geoffrey B; Kramer, Belinda

    2017-06-17

    The analysis of viral vector genomic integration sites is an important component in assessing the safety and efficiency of patient treatment using gene therapy. Alongside this clinical application, integration site identification is a key step in the genetic mapping of viral elements in mutagenesis screens that aim to elucidate gene function. We have developed Ub-ISAP, a UNIX-based vector integration site analysis pipeline that automates integration site identification and annotation for both single and paired-end sequencing reads. Reads that contain viral sequences of interest are selected and aligned to the host genome, and unique integration sites are then classified as transcription start site-proximal, intragenic or intergenic. Ub-ISAP provides a reliable and efficient pipeline to generate large datasets for assessing the safety and efficiency of integrating vectors in clinical settings, with broader applications in cancer research. Ub-ISAP is available as an open source software package at https://sourceforge.net/projects/ub-isap/ .
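The classification step, assigning each unique integration site to transcription start site-proximal, intragenic, or intergenic, might look like the following sketch; the 5 kb window, simplified gene model, and function signature are illustrative assumptions, not Ub-ISAP's actual defaults:

```python
def classify_site(pos, genes, tss_window=5000):
    """Classify one integration site on a chromosome.
    pos:   integration coordinate on the chromosome.
    genes: list of (tss, start, end) tuples for genes on that chromosome.
    TSS proximity takes precedence over the intragenic category, matching
    the order in which the categories are usually reported."""
    # TSS-proximal: within the window of any gene's transcription start site
    for tss, start, end in genes:
        if abs(pos - tss) <= tss_window:
            return "TSS-proximal"
    # Intragenic: inside a gene body but outside every TSS window
    for tss, start, end in genes:
        if start <= pos <= end:
            return "intragenic"
    return "intergenic"
```

A real pipeline would take gene coordinates from a genome annotation (e.g. a GTF file) and handle strand; the precedence logic is the part being illustrated.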

  15. A two-dimensional numerical study of the flow inside the combustion chamber of a motored rotary engine

    NASA Technical Reports Server (NTRS)

    Shih, T. I-P.; Yang, S. L.; Schock, H. J.

    1986-01-01

    A numerical study was performed to investigate the unsteady, multidimensional flow inside the combustion chambers of an idealized, two-dimensional, rotary engine under motored conditions. The numerical study was based on the time-dependent, two-dimensional, density-weighted, ensemble-averaged conservation equations of mass, species, momentum, and total energy valid for two-component ideal gas mixtures. The ensemble-averaged conservation equations were closed by a K-epsilon model of turbulence. This K-epsilon model of turbulence was modified to account for some of the effects of compressibility, streamline curvature, low Reynolds number, and preferential stress dissipation. Numerical solutions to the conservation equations were obtained by the highly efficient implicit-factored method of Beam and Warming. The grid system needed to obtain solutions was generated by an algebraic grid generation technique based on transfinite interpolation. Results of the numerical study are presented in graphical form illustrating the flow patterns during intake, compression, gaseous fuel injection, expansion, and exhaust.

  16. Scalability of grid- and subbasin-based land surface modeling approaches for hydrologic simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tesfa, Teklu K.; Ruby Leung, L.; Huang, Maoyi

    2014-03-27

This paper investigates the relative merits of grid- and subbasin-based land surface modeling approaches for hydrologic simulations, with a focus on their scalability (i.e., abilities to perform consistently across a range of spatial resolutions) in simulating runoff generation. Simulations produced by the grid- and subbasin-based configurations of the Community Land Model (CLM) are compared at four spatial resolutions (0.125°, 0.25°, 0.5° and 1°) over the topographically diverse region of the U.S. Pacific Northwest. Using the 0.125° resolution simulation as the “reference”, statistical skill metrics are calculated and compared across simulations at 0.25°, 0.5° and 1° spatial resolutions of each modeling approach at basin and topographic region levels. Results suggest a significant scalability advantage for the subbasin-based approach compared to the grid-based approach for runoff generation. Basin-level annual average relative errors of surface runoff at 0.25°, 0.5°, and 1° compared to 0.125° are 3%, 4%, and 6% for the subbasin-based configuration and 4%, 7%, and 11% for the grid-based configuration, respectively. The scalability advantages of the subbasin-based approach are more pronounced during winter/spring and over mountainous regions. The source of runoff scalability is found to be related to the scalability of major meteorological and land surface parameters of runoff generation. More specifically, the subbasin-based approach is more consistent across spatial scales than the grid-based approach in snowfall/rainfall partitioning, which is related to air temperature and surface elevation. Scalability of a topographic parameter used in the runoff parameterization also contributes to improved scalability of the rain-driven saturated surface runoff component, particularly during winter. 
Hence this study demonstrates the importance of spatial structure for multi-scale modeling of hydrological processes, with implications for surface heat fluxes in coupled land-atmosphere modeling.
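The basin-level skill metric quoted above can be sketched as a mean absolute relative error of a coarse simulation against the fine "reference" run. This is an illustration only; the function name and runoff values below are invented, not from the paper.

```python
# Illustrative sketch (not the paper's code): basin-level annual average
# relative error of a coarse-resolution runoff simulation against a
# fine-resolution "reference", as described in the abstract.
def relative_error(coarse, reference):
    """Mean absolute relative error (%) of coarse vs. reference runoff."""
    errs = [abs(c - r) / r for c, r in zip(coarse, reference) if r != 0]
    return 100.0 * sum(errs) / len(errs)

# Hypothetical annual-average surface runoff (mm/yr) for three basins
reference = [520.0, 310.0, 150.0]   # e.g. the 0.125-degree simulation
coarse    = [540.0, 300.0, 160.0]   # e.g. a 1-degree simulation

print(round(relative_error(coarse, reference), 2))
```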

  17. Modelling carbonaceous aerosol from residential solid fuel burning with different assumptions for emissions

    NASA Astrophysics Data System (ADS)

    Ots, Riinu; Heal, Mathew R.; Young, Dominique E.; Williams, Leah R.; Allan, James D.; Nemitz, Eiko; Di Marco, Chiara; Detournay, Anais; Xu, Lu; Ng, Nga L.; Coe, Hugh; Herndon, Scott C.; Mackenzie, Ian A.; Green, David C.; Kuenen, Jeroen J. P.; Reis, Stefan; Vieno, Massimo

    2018-04-01

Evidence is accumulating that emissions of primary particulate matter (PM) from residential wood and coal combustion in the UK may be underestimated and/or spatially misclassified. In this study, different assumptions for the spatial distribution and total emission of PM from solid fuel (wood and coal) burning in the UK were tested using an atmospheric chemical transport model. Modelled concentrations of the PM components were compared with measurements from aerosol mass spectrometers at four sites in central and Greater London (ClearfLo campaign, 2012), as well as with measurements from the UK black carbon network. The two main alternative emission scenarios modelled were Base4x and combRedist. For Base4x, officially reported PM2.5 from the residential and other non-industrial combustion source sector were increased by a factor of four. For the combRedist experiment, half of the baseline emissions from this same source were redistributed by residential population density to simulate the effect of allocating some emissions to the smoke control areas (that are assumed in the national inventory to have no emissions from this source). The Base4x scenario yielded better daily and hourly correlations with measurements than the combRedist scenario for year-long comparisons of the solid fuel organic aerosol (SFOA) component at the two London sites. However, the latter scenario better captured mean measured concentrations across all four sites. A third experiment, Redist - all emissions redistributed linearly to population density - is also presented as an indicator of the maximum concentrations an assumption like this could yield. The modelled elemental carbon (EC) concentrations derived from the combRedist experiments also compared well with seasonal average concentrations of black carbon observed across the network of UK sites. 
Together, the two model scenario simulations of SFOA and EC suggest both that residential solid fuel emissions may be higher than inventory estimates and that the spatial distribution of residential solid fuel burning emissions, particularly in smoke control areas, needs re-evaluation. The model results also suggest the assumed temporal profiles for residential emissions may require review to place greater emphasis on evening (including discretionary) solid fuel burning.

  18. Characterization of Emissions from Open Burning of Meals ...

    EPA Pesticide Factsheets

Emissions from burning current and candidate Meals Ready-to-Eat (MRE) packaging and shipping containers were characterized in an effort to assuage concerns that combustive disposal of waste at forward operating bases could pose an environmental or inhalation threat. Four types of container materials, both box and liners, including the currently used fiberboard, new corrugated fiberboard with Spektrakote polymer, new fiberboard without Spektrakote polymer, and the current fiberboard without wet strength were burned in an open burn test facility that simulated the burn pit disposal methods in Iraq and Afghanistan. MREs, including both current and proposed packaging materials, were added to a single container type to examine their effect on emissions. One quarter of the food was left in the packaging to represent unused meal components. The proposed packaging, consisting of a nano-composite polymer, was added in 25% increments compared to traditional MRE packaging to create a range of usage levels. Emission factors, mass of pollutant per mass of burned material, were increased over the emission factors of the package containers themselves by the addition of the multi-component MREs, with the exception of Volatile Organic Compounds (VOCs). In general, little distinction was observed when comparing emission factors from the four container materials and when comparing the four MRE compositions. The majority of Particulate Matter (PM) emissions were of particles that

  19. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
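The dependency-graph idea described above can be sketched in a few lines: each variable declares what it depends on, and evaluation walks the graph so every dependency is computed exactly once. This is a conceptual illustration only; the class and variable names are invented, not the Arcos API.

```python
# Minimal sketch of a dependency graph connecting variables to their
# dependencies (illustrative; not the Arcos implementation).
class DependencyGraph:
    def __init__(self):
        self._evaluators = {}   # variable name -> (dependencies, function)
        self._cache = {}        # memoized results: each variable computed once

    def register(self, name, deps, func):
        self._evaluators[name] = (deps, func)

    def evaluate(self, name):
        if name not in self._cache:
            deps, func = self._evaluators[name]
            args = [self.evaluate(d) for d in deps]   # recurse into dependencies
            self._cache[name] = func(*args)
        return self._cache[name]

g = DependencyGraph()
g.register("porosity", [], lambda: 0.4)
g.register("saturation", [], lambda: 0.8)
g.register("water_content", ["porosity", "saturation"], lambda p, s: p * s)
print(g.evaluate("water_content"))  # 0.4 * 0.8
```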

  20. Assessment of zero-equation SGS models for simulating indoor environment

    NASA Astrophysics Data System (ADS)

    Taghinia, Javad; Rahman, Md Mizanur; Tse, Tim K. T.

    2016-12-01

The understanding of air-flow in enclosed spaces plays a key role in designing ventilation systems and indoor environments. The computational fluid dynamics aspects dictate that large eddy simulation (LES) offers a suitable means to analyze complex flows with recirculation and streamline curvature effects, providing more robust and accurate details than those of Reynolds-averaged Navier-Stokes simulations. This work assesses the performance of two zero-equation sub-grid scale models: the Rahman-Agarwal-Siikonen-Taghinia (RAST) model with a single grid-filter and the dynamic Smagorinsky model with grid-filter and test-filter scales. This in turn allows a cross-comparison of the effect of two different LES methods in simulating indoor air-flows with forced and mixed (natural + forced) convection. A better performance against experiments is indicated with the RAST model in wall-bounded non-equilibrium indoor air-flows; this is due to its sensitivity toward both the shear and vorticity parameters.

  1. ARC-2012-ACD12-0020-005

    NASA Image and Video Library

    2012-02-10

    Then and Now: These images illustrate the dramatic improvement in NASA computing power over the last 23 years, and its effect on the number of grid points used for flow simulations. At left, an image from the first full-body Navier-Stokes simulation (1988) of an F-16 fighter jet showing pressure on the aircraft body, and fore-body streamlines at Mach 0.90. This steady-state solution took 25 hours using a single Cray X-MP processor to solve the 500,000 grid-point problem. Investigator: Neal Chaderjian, NASA Ames Research Center At right, a 2011 snapshot from a Navier-Stokes simulation of a V-22 Osprey rotorcraft in hover. The blade vortices interact with the smaller turbulent structures. This very detailed simulation used 660 million grid points, and ran on 1536 processors on the Pleiades supercomputer for 180 hours. Investigator: Neal Chaderjian, NASA Ames Research Center; Image: Tim Sandstrom, NASA Ames Research Center

  2. Simulation of dilute polymeric fluids in a three-dimensional contraction using a multiscale FENE model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griebel, M., E-mail: griebel@ins.uni-bonn.de, E-mail: ruettgers@ins.uni-bonn.de; Rüttgers, A., E-mail: griebel@ins.uni-bonn.de, E-mail: ruettgers@ins.uni-bonn.de

The multiscale FENE model is applied to a 3D square-square contraction flow problem. For this purpose, the stochastic Brownian configuration field method (BCF) has been coupled with our fully parallelized three-dimensional Navier-Stokes solver NaSt3DGPF. The robustness of the BCF method enables the numerical simulation of high Deborah number flows for which most macroscopic methods suffer from stability issues. The results of our simulations are compared with experimental measurements from the literature and show very good agreement. In particular, flow phenomena such as a strong vortex enhancement, streamline divergence and a flow inversion for highly elastic flows are reproduced. Due to their computational complexity, our simulations require massively parallel computations. Using a domain decomposition approach with MPI, the implementation achieves excellent scale-up results for up to 128 processors.

  3. 20180312 - Uncertainty and Variability in High-Throughput Toxicokinetics for Risk Prioritization (SOT)

    EPA Science Inventory

    Streamlined approaches that use in vitro experimental data to predict chemical toxicokinetics (TK) are increasingly being used to perform risk-based prioritization based upon dosimetric adjustment of high-throughput screening (HTS) data across thousands of chemicals. However, ass...

  4. Removal and recovery of acetic acid and two furans during sugar purification of simulated phenols-free biomass hydrolysates.

    PubMed

    Lee, Sang Cheol

    2017-12-01

A cost-effective five-step sugar purification process involving simultaneous removal and recovery of fermentation inhibitors from biomass hydrolysates was first proposed here. Only the three separation steps (PB, PC and PD) in the process were investigated here. Furfural was selectively removed up to 98.4% from a simulated five-component hydrolysate in a cross-current three-stage extraction system with n-hexane. Most of the acetic acid in a simulated four-component hydrolysate was selectively removed by emulsion liquid membrane, and it could be concentrated in the stripping solution up to 4.5 times its initial concentration in the feed solution. 5-Hydroxymethylfurfural was selectively removed from a simulated three-component hydrolysate in batch and continuous fixed-bed column adsorption systems with L-493 adsorbent. Also, 5-hydroxymethylfurfural could be concentrated to about 9 times its feed concentration in the continuous adsorption system through a fixed-bed column desorption experiment with aqueous ethanol solution. These results have shown that the proposed purification process was valid. Copyright © 2017 Elsevier Ltd. All rights reserved.
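For a cross-current multistage extraction of the kind mentioned above, a standard textbook mass balance gives the solute fraction remaining in the raffinate after n ideal stages as (1 + K·S/F)^-n, where K is the distribution coefficient and S/F the solvent-to-feed ratio per stage. The sketch below uses this relation with hypothetical numbers; K and S/F are not values reported by the paper.

```python
# Hedged illustration of cross-current multistage extraction: fresh solvent
# at every stage, constant distribution coefficient K, ideal equilibrium
# stages. All parameter values are hypothetical.
def fraction_removed(K, S_over_F, stages):
    """Overall solute removal after n ideal cross-current stages."""
    remaining = (1.0 + K * S_over_F) ** (-stages)
    return 1.0 - remaining

# e.g. a furfural-like solute with K = 3.0, equal solvent and feed volumes,
# three stages (values chosen for illustration only)
print(round(100 * fraction_removed(3.0, 1.0, 3), 1))  # percent removed
```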

  5. Integrating information from disparate sources: the Walter Reed National Surgical Quality Improvement Program Data Transfer Project.

    PubMed

    Nelson, Victoria; Nelson, Victoria Ruth; Li, Fiona; Green, Susan; Tamura, Tomoyoshi; Liu, Jun-Min; Class, Margaret

    2008-11-06

    The Walter Reed National Surgical Quality Improvement Program Data Transfer web module integrates with medical and surgical information systems, and leverages outside standards, such as the National Library of Medicine's RxNorm, to process surgical and risk assessment data. Key components of the project included a needs assessment with nurse reviewers and a data analysis for federated (standards were locally controlled) data sources. The resulting interface streamlines nurse reviewer workflow by integrating related tasks and data.

  6. Numerical Simulations of Wind Accretion in Symbiotic Binaries

    NASA Astrophysics Data System (ADS)

    de Val-Borro, M.; Karovska, M.; Sasselov, D.

    2009-08-01

About half of the binary systems are close enough to each other for mass to be exchanged between them at some point in their evolution, yet the accretion mechanism in wind accreting binaries is not well understood. We study the dynamical effects of gravitational focusing by a binary companion on winds from late-type stars. In particular, we investigate the mass transfer and formation of accretion disks around the secondary in detached systems consisting of an asymptotic giant branch (AGB) mass-losing star and an accreting companion. The presence of mass outflows is studied as a function of mass-loss rate, wind temperature, and binary orbital parameters. A two-dimensional hydrodynamical model is used to study the stability of mass transfer in wind accreting symbiotic binary systems. In our simulations we use an adiabatic equation of state and a modified version of the isothermal approximation, where the temperature depends on the distance from the mass losing star and its companion. The code uses a block-structured adaptive mesh refinement method that allows us to have high resolution at the position of the secondary and resolve the formation of bow shocks and accretion disks. We explore the accretion flow between the components and formation of accretion disks for a range of orbital separations and wind parameters. Our results show the formation of stream flow between the stars and accretion disks of various sizes for certain orbital configurations. For a typical slow and massive wind from an AGB star the flow pattern is similar to a Roche lobe overflow with accretion rates of 10% of the mass loss from the primary. Stable disks with exponentially decreasing density profiles and masses of the order of 10^-4 solar masses are formed when wind acceleration occurs at several stellar radii. The disks are geometrically thin with eccentric streamlines and close to Keplerian velocity profiles. 
The formation of tidal streams and accretion disks is found to be weakly dependent on the mass loss from the AGB star. Our simulations of gravitationally focused wind accretion in symbiotic binaries show the formation of stream flows and enhanced accretion rates onto the compact component. We conclude that mass transfer through a focused wind is an important mechanism in wind accreting interacting binaries and can have a significant impact on the evolution of the binary itself and the individual components.

  7. Transport properties and equation of state for HCNO mixtures in and beyond the warm dense matter regime

    DOE PAGES

    Ticknor, Christopher; Collins, Lee A.; Kress, Joel D.

    2015-08-04

We present simulations of a four-component mixture of HCNO with orbital free molecular dynamics (OFMD). These simulations were conducted for 5–200 eV with densities ranging between 0.184 and 36.8 g/cm³. We extract the equation of state from the simulations and compare to average atom models. We found that we only need to add a cold curve model to find excellent agreement. In addition, we studied mass transport properties. We present fits to the self-diffusion and shear viscosity that are able to reproduce the transport properties over the parameter range studied. We compare these OFMD results to models based on the Coulomb coupling parameter and one-component plasmas.

  8. Orbits: Computer simulation

    NASA Technical Reports Server (NTRS)

    Muszynska, A.

    1985-01-01

In rotating machinery dynamics an orbit (Lissajous curve) represents the dynamic path of the shaft centerline motion during shaft rotation and resulting precession. The orbit can be observed with an oscilloscope connected to XY proximity probes. The orbits can also be simulated by a computer. The software for the HP computer simulates orbits for two cases: (1) Symmetric orbit with four frequency components with different radial amplitudes and relative phase angles; and (2) Nonsymmetric orbit with two frequency components with two different vertical/horizontal amplitudes and two different relative phase angles. Each orbit carries a Keyphasor mark (one-per-turn reference). The frequencies, amplitudes, and phase angles, as well as number of time steps for orbit computation, have to be chosen and introduced to the computer by the user. The orbit graphs can be observed on the computer screen.
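The symmetric-orbit case above can be sketched by summing frequency components with individual amplitudes and phases. Python stands in here for the original HP software, and the frequencies, amplitudes, and phases are arbitrary examples, not values from the program.

```python
# Sketch of a symmetric shaft-centerline orbit (Lissajous curve) built from
# four frequency components, each with its own amplitude and relative phase.
# All numerical values are illustrative.
import math

def orbit_point(t, components):
    """components: list of (frequency_hz, amplitude, phase_rad) tuples."""
    x = sum(a * math.cos(2 * math.pi * f * t + p) for f, a, p in components)
    y = sum(a * math.sin(2 * math.pi * f * t + p) for f, a, p in components)
    return x, y

# four components, e.g. 1x, 2x, 3x, 4x running speed
comps = [(1.0, 1.0, 0.0), (2.0, 0.5, 0.3), (3.0, 0.25, 0.6), (4.0, 0.1, 0.9)]
steps = 256  # number of time steps, as chosen by the user in the original
orbit = [orbit_point(i / steps, comps) for i in range(steps)]
print(len(orbit))
```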

  9. Numerical design of streamlined tunnel walls for a two-dimensional transonic test

    NASA Technical Reports Server (NTRS)

    Newman, P. A.; Anderson, E. C.

    1978-01-01

    An analytical procedure is discussed for designing wall shapes for streamlined, nonporous, two-dimensional, transonic wind tunnels. It is based upon currently available 2-D inviscid transonic and boundary layer analysis computer programs. Predicted wall shapes are compared with experimental data obtained from the NASA Langley 6 by 19 inch Transonic Tunnel where the slotted walls were replaced by flexible nonporous walls. Comparisons are presented for the empty tunnel operating at a Mach number of 0.9 and for a supercritical test of an NACA 0012 airfoil at zero lift. Satisfactory agreement is obtained between the analytically and experimentally determined wall shapes.

  10. Effect of Loss on Multiplexed Single-Photon Sources (Open Access Publisher’s Version)

    DTIC Science & Technology

    2015-04-28

lossy components on near- and long-term experimental goals, we simulate the multiplexed sources when used for many-photon state generation under various...efficient integer factorization and digital quantum simulation [7, 8], which relies critically on the development of a high-performance, on-demand photon ...(SPDC) or spontaneous four-wave mixing: parametric processes which use a pump laser in a nonlinear material to spontaneously generate photon pairs

  11. An Adverse Drug Event and Medication Error Reporting System for Ambulatory Care (MEADERS)

    PubMed Central

    Zafar, Atif; Hickner, John; Pace, Wilson; Tierney, William

    2008-01-01

The Institute of Medicine (IOM) has identified the mitigation of Adverse Drug Events (ADEs) and Medication Errors (MEs) as top national priorities. Currently available reporting tools are fraught with inefficiencies that prevent widespread adoption into busy primary care practices. Using expert panel input we designed and built a new reporting tool that could be used in these settings with a variety of information technology capabilities. We pilot tested the system in four Practice Based Research Networks (PBRNs) comprising 24 practices. Over 10 weeks we recorded 507 reports, of which 370 were MEs and 137 were ADEs. Clinicians found the system easy to use, with the average time to generate a report under 4 minutes. By using streamlined interface design techniques we were able to improve reporting rates of ADEs and MEs in these practices. PMID:18999053

  12. Streamlined bioreactor-based production of human cartilage tissues.

    PubMed

    Tonnarelli, B; Santoro, R; Adelaide Asnaghi, M; Wendt, D

    2016-05-27

Engineered tissue grafts have been manufactured using methods based predominantly on traditional labour-intensive manual benchtop techniques. These methods impart significant regulatory and economic challenges, hindering the successful translation of engineered tissue products to the clinic. Alternatively, bioreactor-based production systems have the potential to overcome such limitations. In this work, we present an innovative manufacturing approach to engineer cartilage tissue within a single bioreactor system, starting from freshly isolated human primary chondrocytes, through the generation of cartilaginous tissue grafts. The limited number of primary chondrocytes that can be isolated from a small clinically-sized cartilage biopsy could be seeded and extensively expanded directly within a 3D scaffold in our perfusion bioreactor (5.4 ± 0.9 doublings in 2 weeks), bypassing conventional 2D expansion in flasks. Chondrocytes expanded in 3D scaffolds better maintained a chondrogenic phenotype than chondrocytes expanded on plastic flasks (collagen type II mRNA, 18-fold; Sox-9, 11-fold). After this "3D expansion" phase, bioreactor culture conditions were changed to subsequently support chondrogenic differentiation for two weeks. Engineered tissues based on 3D-expanded chondrocytes were more cartilaginous than tissues generated from chondrocytes previously expanded in flasks. We then demonstrated that this streamlined bioreactor-based process could be adapted to effectively generate up-scaled cartilage grafts of a clinically relevant size (50 mm diameter). Streamlined and robust tissue engineering processes, such as the one described here, may be key for the future manufacturing of grafts for clinical applications, as they facilitate the establishment of compact and closed bioreactor-based production systems, with minimal automation requirements, lower operating costs, and increased compliance to regulatory guidelines.

  13. Two-dimensional global hybrid simulation of pressure evolution and waves in the magnetosheath

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Denton, R. E.; Lee, L. C.; Chao, J. K.

    2001-06-01

A two-dimensional hybrid simulation is carried out for the global structure of the magnetosheath. Quasi-perpendicular magnetosonic/fast mode waves with large-amplitude in-phase oscillations of the magnetic field and the ion density are seen near the bow shock transition. Alfvén/ion-cyclotron waves are observed along the streamlines in the magnetosheath, and the wave power peaks in the middle magnetosheath. Antiphase oscillations in the magnetic field and density are present away from the shock transition. Transport ratio analysis suggests that these oscillations result from mirror mode waves. Since fluid simulations are currently best able to model the global magnetosphere and the pressure in the magnetosphere is inherently anisotropic (parallel pressure p∥ ≠ perpendicular pressure p⊥), it is of some interest to see if a fluid model can be used to predict the anisotropic pressure evolution of a plasma. Here the predictions of double adiabatic theory, the bounded anisotropy model, and the double polytropic model are tested using the two-dimensional hybrid simulation of the magnetosheath. Inputs to the models from the hybrid simulation are the initial post bow shock pressures and the time-dependent density and magnetic field strength along streamlines of the plasma. The success of the models is evaluated on the basis of how well they predict the subsequent evolution of p∥ and p⊥. The bounded anisotropy model, which incorporates a bound on p⊥/p∥ due to the effect of ion cyclotron pitch angle scattering, does a very good job of predicting the evolution of p⊥; this is evidence that local transfer of energy due to waves is occurring. Further evidence is the positive identification of ion-cyclotron waves in the simulation. The lack of such a good prediction for the evolution of p∥ appears to be due to the model's lack of time dependence for the wave-particle interaction and its neglect of the parallel heat flux. 
Estimates indicate that these effects will be less significant in the real magnetosheath, though perhaps not negligible.
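For reference, the double adiabatic (Chew-Goldberger-Low) theory tested above evolves the two pressure components through two invariants carried along streamlines, where ρ is the mass density and B the magnetic field strength:

```latex
\frac{d}{dt}\!\left(\frac{p_{\perp}}{\rho B}\right) = 0,
\qquad
\frac{d}{dt}\!\left(\frac{p_{\parallel} B^{2}}{\rho^{3}}\right) = 0
```

The bounded anisotropy model keeps this structure but, as the abstract notes, additionally caps p⊥/p∥ to mimic the limiting effect of ion cyclotron pitch angle scattering.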

  14. Enhancement of a parsimonious water balance model to simulate surface hydrology in a glacierized watershed

    USGS Publications Warehouse

    Valentin, Melissa M.; Viger, Roland J.; Van Beusekom, Ashley E.; Hay, Lauren E.; Hogue, Terri S.; Foks, Nathan Leon

    2018-01-01

The U.S. Geological Survey monthly water balance model (MWBM) was enhanced with the capability to simulate glaciers in order to make it more suitable for simulating cold region hydrology. The new model, MWBMglacier, is demonstrated in the heavily glacierized and ecologically important Copper River watershed in Southcentral Alaska. Simulated water budget components compared well to satellite‐based observations and ground measurements of streamflow, evapotranspiration, snow extent, and total water storage, with differences ranging from 0.2% to 7% of the precipitation flux. Nash-Sutcliffe efficiency for simulated and observed streamflow was greater than 0.8 for six of eight stream gages. Snow extent matched satellite‐based observations with Nash-Sutcliffe efficiency values of greater than 0.89 in the four Copper River ecoregions represented. During the simulation period 1949 to 2009, glacier ice melt contributed 25% of total runoff, ranging from 12% to 45% in different tributaries, and glacierized area was reduced by 6%. Statistically significant (p < 0.05) decreasing and increasing trends in annual glacier mass balance occurred during the multidecade cool and warm phases of the Pacific Decadal Oscillation, respectively, reinforcing the link between climate perturbations and glacier mass balance change. The simulations of glaciers and total runoff for a large, remote region of Alaska provide useful data to evaluate hydrologic, cryospheric, ecologic, and climatic trends. MWBMglacier is a valuable tool to understand when, and to what extent, streamflow may increase or decrease as glaciers respond to a changing climate.

  15. Predator-prey models with component Allee effect for predator reproduction.

    PubMed

    Terry, Alan J

    2015-12-01

    We present four predator-prey models with component Allee effect for predator reproduction. Using numerical simulation results for our models, we describe how the customary definitions of component and demographic Allee effects, which work well for single species models, can be extended to predators in predator-prey models by assuming that the prey population is held fixed. We also find that when the prey population is not held fixed, then these customary definitions may lead to conceptual problems. After this discussion of definitions, we explore our four models, analytically and numerically. Each of our models has a fixed point that represents predator extinction, which is always locally stable. We prove that the predator will always die out either if the initial predator population is sufficiently small or if the initial prey population is sufficiently small. Through numerical simulations, we explore co-existence fixed points. In addition, we demonstrate, by simulation, the existence of a stable limit cycle in one of our models. Finally, we derive analytical conditions for a co-existence trapping region in three of our models, and show that the fourth model cannot possess a particular kind of co-existence trapping region. We punctuate our results with comments on their real-world implications; in particular, we mention the possibility of prey resurgence from mortality events, and the possibility of failure in a biological pest control program.
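A component Allee effect in predator reproduction, as discussed above, can be illustrated with a minimal model of our own construction (not one of the paper's four): logistic prey, and a predator reproduction term multiplied by P/(P + A), which depresses per-capita reproduction at low predator density. All parameter values are hypothetical.

```python
# Illustrative predator-prey model with a component Allee effect in predator
# reproduction (the factor P / (P + A)). Integrated with a simple explicit
# Euler step; parameters are invented for demonstration.
def simulate(N0, P0, r=1.0, K=10.0, c=0.5, e=0.4, A=2.0, m=0.3,
             dt=0.01, steps=20000):
    N, P = N0, P0
    for _ in range(steps):
        dN = r * N * (1 - N / K) - c * N * P          # logistic prey, predation loss
        dP = e * c * N * P * (P / (P + A)) - m * P    # Allee-limited reproduction
        N, P = max(N + dN * dt, 0.0), max(P + dP * dt, 0.0)
    return N, P

# Consistent with the paper's analytical result: a sufficiently small initial
# predator population dies out, since the extinction state is locally stable.
N, P = simulate(N0=5.0, P0=0.1)
print(N, P)
```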

  16. Multistep Cylindrical Structure Analysis at Normal Incidence Based on Water-Substrate Broadband Metamaterial Absorbers

    NASA Astrophysics Data System (ADS)

    Fang, Chonghua

    2018-01-01

A new multistep cylindrical structure based on water-substrate broadband metamaterial absorbers is designed to reduce the radar cross-section (RCS) of a rod-shaped object. The proposed configuration consists of two distinct parts. One of these components is formed by a four-step cylindrical metal structure, whereas the other is formed by a new water-substrate broadband metamaterial absorber. The designed structure reduces the radar cross-section by more than 10 dB from 4.58 to 18.42 GHz, an 86.5% relative bandwidth spanning C-band to 20 GHz. The measured results show reasonably good agreement with the simulated ones, which verifies the effectiveness of the proposed design.

  17. Approximate Degrees of Similarity between a User's Knowledge and the Tutorial Systems' Knowledge Base

    ERIC Educational Resources Information Center

    Mogharreban, Namdar

    2004-01-01

    A typical tutorial system functions by means of interaction between four components: the expert knowledge base component, the inference engine component, the learner's knowledge component and the user interface component. In typical tutorial systems the interaction and the sequence of presentation as well as the mode of evaluation are…

  18. The Effects of 3D Computer Simulation on Biology Students' Achievement and Memory Retention

    ERIC Educational Resources Information Center

    Elangovan, Tavasuria; Ismail, Zurida

    2014-01-01

    A quasi experimental study was conducted for six weeks to determine the effectiveness of two different 3D computer simulation based teaching methods, that is, realistic simulation and non-realistic simulation on Form Four Biology students' achievement and memory retention in Perak, Malaysia. A sample of 136 Form Four Biology students in Perak,…

  19. Dynamic simulation of a reverse Brayton refrigerator

    NASA Astrophysics Data System (ADS)

    Peng, N.; Lei, L. L.; Xiong, L. Y.; Tang, J. C.; Dong, B.; Liu, L. Q.

    2014-01-01

A test refrigerator based on the modified Reverse Brayton cycle has been developed in the Chinese Academy of Sciences recently. To study the behaviors of this test refrigerator, a dynamic simulation has been carried out. The numerical model comprises the typical components of the test refrigerator: compressor, valves, heat exchangers, expander and heater. This simulator is based on the object-oriented approach and each component is represented by a set of differential and algebraic equations. The control system of the test refrigerator is also simulated, which can be used to optimize the control strategies. This paper describes all the models and shows the simulation results. Comparisons between simulation results and experimental data are also presented. Experimental validation on the test refrigerator gives satisfactory results.

  20. Probabilistic evaluation of the water footprint of a river basin: Accounting method and case study in the Segura River Basin, Spain.

    PubMed

    Pellicer-Martínez, Francisco; Martínez-Paz, José Miguel

    2018-06-15

In the current study a method for the probabilistic accounting of the water footprint (WF) at the river basin level has been proposed and developed. It is based upon the simulation of the anthropised water cycle and combines a hydrological model and a decision support system. The methodology was carried out in the Segura River Basin (SRB) in South-eastern Spain, and four historical scenarios were evaluated (1998-2010-2015-2027). The results indicate that the WF of the river basin reached 5581 Mm³/year on average in the base scenario, with a high variability. The green component (3231 Mm³/year), mainly generated by rainfed crops (62%), was responsible for the great variability of the WF. The blue WF (1201 Mm³/year) was broken down into surface water (56%), renewable groundwater (20%) and non-renewable groundwater (24%), and it showed the generalized overexploitation of aquifers. Regarding the grey component (1150 Mm³/year), the study reveals that wastewater, especially phosphates (90%), was the main culprit producing water pollution in surface water bodies. The temporal evolution of the four scenarios highlighted the successfulness of the water treatment plans developed in the river basin, with a sharp decrease in the grey WF, as well as the stability of the WF and its three components in the future. So, the accounting of the three components of the WF in a basin was integrated into the management of water resources, it being possible to predict their evolution, their spatial characterisation and even their assessment in probabilistic terms. Then, the WF was incorporated into the set of indicators that usually is used in water resources management and hydrological planning. Copyright © 2018 Elsevier B.V. All rights reserved.
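The accounting structure described above can be reproduced arithmetically from the figures quoted in the abstract: the total WF is the sum of the green, blue, and grey components, and the blue component splits into surface, renewable-groundwater, and non-renewable-groundwater shares. (Summing the quoted rounded components gives 5582 rather than the reported 5581 Mm³/year; the 1 Mm³/year gap is rounding.)

```python
# WF accounting sketch using the abstract's base-scenario annual averages
# (Mm^3/yr). Shares for the blue sub-components are the quoted percentages.
green_wf = 3231.0
blue_wf = 1201.0
grey_wf = 1150.0
blue_shares = {"surface": 0.56, "renewable_gw": 0.20, "non_renewable_gw": 0.24}

total_wf = green_wf + blue_wf + grey_wf
blue_components = {k: share * blue_wf for k, share in blue_shares.items()}

print(total_wf)  # sum of the quoted (rounded) components
print(round(blue_components["surface"], 1))
```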

  1. Numerical Simulation of Particle Motion in a Curved Channel

    NASA Astrophysics Data System (ADS)

    Liu, Yi; Nie, Deming

    2018-01-01

    In this work the lattice Boltzmann method (LBM) is used to numerically study the motion of a circular particle in a curved channel at intermediate Reynolds numbers (Re). The effects of the Reynolds number and the initial particle position are taken into account. Numerical results include the streamlines, particle trajectories and final equilibrium positions. It has been found that the particle is likely to migrate to a similar equilibrium position irrespective of its initial position when Re is large.

  2. Self streamlining wind tunnel: Low speed testing and transonic test section design

    NASA Technical Reports Server (NTRS)

    Wolf, S. W. D.; Goodyer, M. J.

    1977-01-01

    Comprehensive aerodynamic data on an airfoil section were obtained through a wide range of angles of attack, both stalled and unstalled. Data were gathered using a self-streamlining wind tunnel and were compared with results obtained on the same section in a conventional wind tunnel. The reduction of wall interference through streamlining was demonstrated.

  3. Internal Flow Simulation of Enhanced Performance Solid Rocket Booster for the Space Transportation System

    NASA Technical Reports Server (NTRS)

    Ahmad, Rashid A.; McCool, Alex (Technical Monitor)

    2001-01-01

    An enhanced performance solid rocket booster concept for the space shuttle system has been proposed. The concept booster will have strong commonality with the existing, proven, reliable four-segment Space Shuttle Reusable Solid Rocket Motors (RSRM), with individual component designs (nozzle, insulator, etc.) optimized for a five-segment configuration. Increased performance is desirable to further enhance safety/reliability and/or increase payload capability. The performance increase will be achieved by adding a fifth propellant segment to the current four-segment booster and opening the throat to accommodate the increased mass flow while maintaining current pressure levels. One development concept under consideration is the static test of a "standard" RSRM with a fifth propellant segment inserted and appropriate minimum motor modifications. Feasibility studies are being conducted to assess the potential for any significant departure in component performance/loading from the well-characterized RSRM. An area of concern is the aft motor (submerged nozzle inlet, aft dome, etc.), where the altered internal flow resulting from the performance-enhancing features (25% increase in mass flow rate, higher Mach numbers, modified subsonic nozzle contour) may result in increased component erosion and char. To assess this issue and to define the minimum design changes required to successfully static test a fifth-segment RSRM engineering test motor, internal flow studies have been initiated. Internal aero-thermal environments were quantified in terms of conventional convective heating and discrete-phase alumina particle impact/concentration and accretion calculations via Computational Fluid Dynamics (CFD) simulation. Two sets of comparative CFD simulations of the RSRM and the five-segment (FSM) concept motor were conducted with the commercial CFD code FLUENT. The first simulation involved a two-dimensional axisymmetric model of the full motor with the initial grain RSRM. The second set of analyses included three-dimensional models of the RSRM and FSM aft motors with four-degree vectored nozzles.

  4. Investigation of combustion characteristics in a scramjet combustor using a modified flamelet model

    NASA Astrophysics Data System (ADS)

    Zhao, Guoyan; Sun, Mingbo; Wang, Hongbo; Ouyang, Hao

    2018-07-01

    In this study, the characteristics of supersonic combustion inside an ethylene-fueled scramjet combustor equipped with multi-cavities were investigated under different injection schemes. Experimental results showed that the flames concentrated in the cavity and in the separated boundary layer downstream of the cavity, and that they occupied the flow channel, further enhancing the bulk flow compression. The flame structure in the distributed injection scheme differed from that in the centralized injection scheme. In the numerical simulations, a modified flamelet model was introduced to account for the fact that the pressure distribution is far from homogeneous inside the scramjet combustor. Compared with the original flamelet model, numerical predictions based on the modified model showed better agreement with the experimental results, validating the reliability of the calculations. Based on the modified model, simulations with different injection schemes were analysed. The predicted flame structure agreed reasonably well with the experimental observations. CO mass was concentrated in the cavity and in the subsonic region adjacent to the cavity shear layer, leading to intense heat release. Compared with the centralized scheme, the higher jet mixing efficiency of the distributed scheme induced intense combustion in the posterior upper cavity and downstream of the cavity. Streamlines and isolation surfaces showed that combustion at the trailing edge of the lower cavity was suppressed, since the bulk flow downstream of the cavity is pushed down.

  5. Performance Test of Laser Velocimeter System for the Langley 16-foot Transonic Tunnel

    NASA Technical Reports Server (NTRS)

    Meyers, J. F.; Hunter, W. W., Jr.; Reubush, D. E.; Nichols, C. E., Jr.; Hepner, T. E.; Lee, J. W.

    1985-01-01

    An investigation in the Langley 16-Foot Transonic Tunnel has been conducted in which a laser velocimeter was used to measure free-stream velocities from Mach 0.1 to 1.0 and the flow velocities along the stagnating streamline of a hemisphere-cylinder model at Mach 0.8 and 1.0. The flow velocity was also measured at Mach 1.0 along the line 0.533 model diameters below the model. These tests determined the performance characteristics of the dedicated two-component laser velocimeter at flow velocities up to Mach 1.0 and the effects of the wind tunnel environment on the particle-generating system and on the resulting size of the generated particles. To determine these characteristics, the measured particle velocities along the stagnating streamline at the two Mach numbers were compared with the theoretically predicted gas and particle velocities calculated using a transonic potential flow method. Through this comparison the mean detectable particle size (2.1 micron) along with the standard deviation of the detectable particles (0.76 micron) was determined; thus the performance characteristics of the laser velocimeter were established.

  6. The experimental verification of a streamline curvature numerical analysis method applied to the flow through an axial flow fan

    NASA Technical Reports Server (NTRS)

    Pierzga, M. J.

    1981-01-01

    The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.

  7. Streamlining Appointment, Promotion, and Tenure Procedures to Promote Early-Career Faculty Success.

    PubMed

    Smith, Shannon B; Hollerbach, Ann; Donato, Annemarie Sipkes; Edlund, Barbara J; Atz, Teresa; Kelechi, Teresa J

    2016-01-01

    A critical component of the progression of a successful academic career is being promoted in rank. Early-career faculty are required to have an understanding of appointment, promotion, and tenure (APT) guidelines, but many factors often impede this understanding, thwarting a smooth and planned promotion pathway for professional advancement. This article outlines the steps taken by an APT committee to improve the promotion process from instructor to assistant professor. Six Sigma's DMAIC improvement model was selected as the guiding operational framework to remove variation in the promotion process. After faculty handbook revisions were made, several checklists were developed, and a process review rubric was implemented, recently promoted faculty were surveyed on satisfaction with the process. Faculty opinions captured in the survey suggest increased transparency in the process and perceived support offered by the APT committee. Positive outcomes include a strengthened faculty support framework, streamlined promotion processes, and improved faculty satisfaction. Changes to the APT processes resulted in an unambiguous and standardized pathway for successful promotion. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Toward robust estimation of the components of forest population change: simulation results

    Treesearch

    Francis A. Roesch

    2014-01-01

    This report presents the full simulation results of the work described in Roesch (2014), in which multiple levels of simulation were used to test the robustness of estimators for the components of forest change. In that study, a variety of spatial-temporal populations were created based on, but more variable than, an actual forest monitoring dataset, and then those...

  9. Effectiveness of and obstacles to antibiotic streamlining to amoxicillin monotherapy in bacteremic pneumococcal pneumonia.

    PubMed

    Blot, Mathieu; Pivot, Diane; Bourredjem, Abderrahmane; Salmon-Rousseau, Arnaud; de Curraize, Claire; Croisier, Delphine; Chavanet, Pascal; Binquet, Christine; Piroth, Lionel

    2017-09-01

    Antibiotic streamlining is pivotal to reduce the emergence of resistant bacteria. However, whether streamlining is frequently performed and safe in difficult situations, such as bacteremic pneumococcal pneumonia (BPP), has still to be assessed. All adult patients admitted to Dijon Hospital (France) from 2005 to 2013 who had BPP without complications, and were alive on the third day were enrolled. Clinical, biological, radiological, microbiological and therapeutic data were recorded. A first analysis was conducted to assess factors associated with being on amoxicillin on the third day. A second analysis, adjusting for a propensity score, was performed to determine whether 30-day mortality was associated with streamlining to amoxicillin monotherapy. Of the 196 patients hospitalized for BPP, 161 were still alive on the third day and were included in the study. Treatment was streamlined to amoxicillin in 60 patients (37%). Factors associated with not streamlining were severe pneumonia (OR 3.11, 95%CI [1.23-7.87]) and a first-line antibiotic combination (OR 3.08, 95%CI [1.34-7.09]). By contrast, starting with amoxicillin monotherapy correlated inversely with the risk of subsequent treatment with antibiotics other than amoxicillin (OR 0.06, 95%CI [0.01-0.30]). The Cox model adjusted for the propensity-score analysis showed that streamlining to amoxicillin during BPP was not significantly associated with a higher risk of 30-day mortality (HR 0.38, 95%CI [0.08-1.87]). Streamlining to amoxicillin is insufficiently implemented during BPP. This strategy is safe and potentially associated with ecological and economic benefits; therefore, it should be further encouraged, particularly when antibiotic combinations are started for severe pneumonia. Copyright © 2017. Published by Elsevier B.V.

  10. Sensitivity of CONUS Summer Rainfall to the Selection of Cumulus Parameterization Schemes in NU-WRF Seasonal Simulations

    NASA Technical Reports Server (NTRS)

    Iguchi, Takamichi; Tao, Wei-Kuo; Wu, Di; Peters-Lidard, Christa; Santanello, Joseph A.; Kemp, Eric; Tian, Yudong; Case, Jonathan; Wang, Weile; Ferraro, Robert; hide

    2017-01-01

    This study investigates the sensitivity of daily rainfall rates in regional seasonal simulations over the contiguous United States (CONUS) to different cumulus parameterization schemes. Daily rainfall fields were simulated at 24-km resolution using the NASA-Unified Weather Research and Forecasting (NU-WRF) Model for June-August 2000. Four cumulus parameterization schemes and two options for shallow cumulus components in a specific scheme were tested. The spread in the domain-mean rainfall rates across the parameterization schemes was generally consistent between the entire CONUS and most subregions. The selection of the shallow cumulus component in a specific scheme had more impact than that of the four cumulus parameterization schemes. Regional variability in the performance of each scheme was assessed by calculating optimally weighted ensembles that minimize full root-mean-square errors against reference datasets. The spatial pattern of the seasonally averaged rainfall was insensitive to the selection of cumulus parameterization over mountainous regions because of the topographical pattern constraint, so that the simulation errors were mostly attributed to the overall bias there. In contrast, the spatial patterns over the Great Plains regions as well as the temporal variation over most parts of the CONUS were relatively sensitive to cumulus parameterization selection. Overall, adopting a single simulation result was preferable to generating a better ensemble for the seasonally averaged daily rainfall simulation, as long as the overall biases had the same positive or negative sign. However, an ensemble of multiple simulation results was more effective in reducing errors when temporal variation was also considered.
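The optimally weighted ensembles mentioned above can be computed by ordinary least squares against a reference field. The sketch below is a minimal illustration on synthetic data with unconstrained weights; the study's exact weighting and error decomposition may differ.

```python
import numpy as np

def optimal_ensemble_weights(members, reference):
    """
    Least-squares weights that minimize the RMSE of a linear
    combination of ensemble members against a reference field.

    members   : (n_members, n_points) array of simulated rainfall fields
    reference : (n_points,) reference (observed) field
    """
    X = np.asarray(members).T          # (n_points, n_members)
    y = np.asarray(reference)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    rmse = np.sqrt(np.mean((X @ w - y) ** 2))
    return w, rmse

# Toy example: two members bracketing a synthetic "truth".
rng = np.random.default_rng(1)
truth = rng.gamma(2.0, 2.0, size=500)           # synthetic daily rainfall
m1 = truth * 1.3 + rng.normal(0, 0.2, 500)      # wet-biased member
m2 = truth * 0.8 + rng.normal(0, 0.2, 500)      # dry-biased member
w, rmse = optimal_ensemble_weights([m1, m2], truth)
```

Because the least-squares optimum is taken over all linear combinations, the weighted ensemble can never have a larger RMSE than any single member, which is why the paper's comparison of single runs versus ensembles hinges on the sign of the members' biases.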

  11. 3D Reconstruction of Chick Embryo Vascular Geometries Using Non-invasive High-Frequency Ultrasound for Computational Fluid Dynamics Studies.

    PubMed

    Tan, Germaine Xin Yi; Jamil, Muhammad; Tee, Nicole Gui Zhen; Zhong, Liang; Yap, Choon Hwai

    2015-11-01

    Recent animal studies have provided evidence that prenatal blood flow fluid mechanics may play a role in the pathogenesis of congenital cardiovascular malformations. To further this research, it is important to have an imaging technique for small animal embryos with sufficient resolution to support computational fluid dynamics studies, and that is also non-invasive and non-destructive to allow for subject-specific, longitudinal studies. In the current study, we developed such a technique, based on ultrasound biomicroscopy scans on chick embryos. Our technique included a motion cancelation algorithm to negate embryonic body motion, a temporal averaging algorithm to differentiate blood spaces from tissue spaces, and 3D reconstruction of blood volumes in the embryo. The accuracy of the reconstructed models was validated with direct stereoscopic measurements. A computational fluid dynamics simulation was performed to model fluid flow in the generated construct of a Hamburger-Hamilton (HH) stage 27 embryo. Simulation results showed that there were divergent streamlines and a low shear region at the carotid duct, which may be linked to the carotid duct's eventual regression and disappearance by HH stage 34. We show that our technique has sufficient resolution to produce accurate geometries for computational fluid dynamics simulations to quantify embryonic cardiovascular fluid mechanics.

  12. Influence of the Convection Electric Field Models on Predicted Plasmapause Positions During Magnetic Storms

    NASA Technical Reports Server (NTRS)

    Pierrard, V.; Khazanov, G.; Cabrera, J.; Lemaire, J.

    2007-01-01

    In the present work, we determine how three well-documented models of the magnetospheric electric field and two different mechanisms proposed for the formation of the plasmapause influence the radial distance, the shape and the evolution of the plasmapause during the geomagnetic storms of 28 October 2001 and of 17 April 2002. The convection electric field models considered are: McIlwain's E5D electric field model, Volland-Stern's model and Weimer's statistical model compiled from low-Earth orbit satellite data. The mechanisms for the formation of the plasmapause to be tested are: (i) the MHD theory, where the plasmapause should correspond to the last-closed-equipotential (LCE) or last-closed-streamline (LCS) if the E-field distribution is stationary or time-dependent, respectively; (ii) the interchange mechanism, where the plasmapause corresponds to streamlines tangent to a Zero-Parallel-Force surface where the field-aligned plasma distribution becomes convectively unstable during enhancements of the E-field intensity in the nightside local time sector. The results of the different time-dependent simulations are compared with concomitant EUV observations when available. The plasmatails or plumes observed after both selected geomagnetic storms are predicted in all simulations and for all E-field models. However, their shapes are quite different depending on the E-field models and the mechanisms that are used. Despite the partial success of the simulations in reproducing plumes during magnetic storms and substorms, there remains a long way to go before the detailed structures observed in the EUV observations during periods of geomagnetic activity can be accounted for very precisely by the existing E-field models. Furthermore, it cannot be excluded that the mechanisms currently identified to explain the formation of "Carpenter's knee" during substorm events will have to be revised or complemented in the cases of geomagnetic storms.

  13. SIMULATION AND MOCKUP OF SNS JET-FLOW TARGET WITH WALL JET FOR CAVITATION DAMAGE MITIGATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendel, Mark W; Geoghegan, Patrick J; Felde, David K

    2014-01-01

    Pressure waves created in liquid mercury pulsed spallation targets at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory induce cavitation damage on the stainless steel target container. The cavitation damage is thought to limit the lifetime of the target for power levels at and above 1 MW. Severe through-wall cavitation damage on an internal wall near the beam entrance window has been observed in spent targets. Surprisingly, though, there is very little damage on the walls that bound an annular mercury channel that wraps around the front and outside of the target. The mercury flow through this channel is characterized by smooth, attached streamlines. One theory to explain this lack of damage is that the uni-directional flow biases the direction of the collapsing cavitation bubble, reducing the impact pressure and subsequent damage. The theory has been reinforced by in-beam separate-effects data. For this reason, a second-generation SNS mercury target has been designed with an internal wall-jet configuration intended to protect the concave wall where damage has been observed. The wall jet mimics the annular flow channel streamlines, but since the jet is bounded on only one side, the momentum is gradually diffused by the bulk flow interactions as it progresses around the circular path of the target nose. Numerical simulations of the flow through this jet-flow target have been completed, and a water loop has been assembled with a transparent test target in order to visualize and measure the flow field. This paper presents the wall jet simulation results, as well as early experimental data from the test loop.

  14. Alternate nozzle ablative materials program

    NASA Technical Reports Server (NTRS)

    Kimmel, N. A.

    1984-01-01

    Four subscale solid rocket motor tests were conducted successfully to evaluate alternate nozzle liner, insulation, and exit cone structural overwrap components for possible application to the Space Shuttle Solid Rocket Motor (SRM) nozzle assembly. The 10,000 lb propellant motor tests simulated, as closely as practical, the configuration and operational environment of the full-scale SRM. Fifteen PAN-based and three pitch-based materials had no filler in the phenolic resin, four PAN-based materials had carbon microballoons in the resin, and the rest of the materials had carbon powder in the resin. Three nozzle insulation materials were evaluated: an aluminum oxide/silicon oxide ceramic fiber mat phenolic material with no resin filler and two E-glass fiber mat phenolic materials with no resin filler. It was concluded by MTI/WD (the fabricator and evaluator of the test nozzles) and NASA-MSFC that it was possible to design an alternate-material full-scale SRM nozzle assembly, which could provide an estimated 360 lb increased payload capability for Space Shuttle launches over that obtainable with the current qualified SRM design.

  15. SmaggIce 2D Version 1.8: Software Toolkit Developed for Aerodynamic Simulation Over Iced Airfoils

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Vickerman, Mary B.

    2005-01-01

    SmaggIce 2D version 1.8 is a software toolkit developed at the NASA Glenn Research Center that consists of tools for modeling the geometry of and generating the grids for clean and iced airfoils. Plans call for the completed SmaggIce 2D version 2.0 to streamline the entire aerodynamic simulation process--the characterization and modeling of ice shapes, grid generation, and flow simulation--and to be closely coupled with the public-domain application flow solver, WIND. Grids generated using version 1.8, however, can be used by other flow solvers. SmaggIce 2D will help researchers and engineers study the effects of ice accretion on airfoil performance, which is difficult to do with existing software tools because of complex ice shapes. Using SmaggIce 2D, when fully developed, to simulate flow over an iced airfoil will help to reduce the cost of performing flight and wind-tunnel tests for certifying aircraft in natural and simulated icing conditions.

  16. Agile

    NASA Technical Reports Server (NTRS)

    Trimble, Jay Phillip

    2013-01-01

    This is based on a previous talk on agile development. Methods for delivering software on a short cycle are described, including interactions with the customer, the effect on the team, and how to be more effective, streamlined and efficient.

  17. Formalizing Knowledge in Multi-Scale Agent-Based Simulations

    PubMed Central

    Somogyi, Endre; Sluka, James P.; Glazier, James A.

    2017-01-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused. PMID:29338063

  18. Formalizing Knowledge in Multi-Scale Agent-Based Simulations.

    PubMed

    Somogyi, Endre; Sluka, James P; Glazier, James A

    2016-10-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused.

  19. Investigating the effects of streamline-based fiber tractography on matrix scaling in brain connective network.

    PubMed

    Jan, Hengtai; Chao, Yi-Ping; Cho, Kuan-Hung; Kuo, Li-Wei

    2013-01-01

    Investigating the brain connective network using modern graph theory has been widely applied in cognitive and clinical neuroscience research. In this study, we aimed to investigate the effects of streamline-based fiber tractography on changes in network properties and established a systematic framework for determining an adequate network matrix scaling. The network properties, including degree, efficiency and betweenness centrality, show a similar tendency in both left and right hemispheres. By employing a curve-fitting process with an exponential law and measuring the residuals, the association between changes in network properties and the threshold on track numbers is found, and an adequate range for investigating the lateralization of the brain network is suggested. The proposed approach can be further applied in clinical settings to improve diagnostic sensitivity using network analysis with graph theory.
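The effect of a track-number threshold on network properties can be illustrated with a minimal sketch. The connectivity matrix below is synthetic (a stand-in for a tractography-derived streamline-count matrix), and only nodal degree is computed; the study also examines efficiency and betweenness centrality.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20
# Symmetric synthetic "streamline count" connectivity matrix
# (illustrative stand-in for a tractography-derived matrix).
counts = rng.poisson(5.0, size=(n, n))
counts = np.triu(counts, 1)
counts = counts + counts.T          # symmetric, zero diagonal

def network_degree(matrix, threshold):
    """Binarize at a streamline-count threshold and return nodal degree."""
    adj = (matrix >= threshold).astype(int)
    np.fill_diagonal(adj, 0)
    return adj.sum(axis=1)

thresholds = range(1, 15)
mean_degree = [network_degree(counts, t).mean() for t in thresholds]
# Mean degree decays monotonically as the threshold rises; the paper
# fits this kind of curve with an exponential law and inspects the
# residuals to pick an adequate matrix scaling.
```

Repeating the same sweep for efficiency and betweenness centrality, and fitting each curve, would reproduce the paper's framework for choosing the threshold range.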

  20. A Fourier method for the analysis of exponential decay curves.

    PubMed

    Provencher, S W

    1976-01-01

    A method based on the Fourier convolution theorem is developed for the analysis of data composed of random noise, plus an unknown constant "base line," plus a sum of (or an integral over a continuous spectrum of) exponential decay functions. The Fourier method's usual serious practical limitation of needing high accuracy data over a very wide range is eliminated by the introduction of convergence parameters and a Gaussian taper window. A computer program is described for the analysis of discrete spectra, where the data involves only a sum of exponentials. The program is completely automatic in that the only necessary inputs are the raw data (not necessarily in equal intervals of time); no potentially biased initial guesses concerning either the number or the values of the components are needed. The outputs include the number of components, the amplitudes and time constants together with their estimated errors, and a spectral plot of the solution. The limiting resolving power of the method is studied by analyzing a wide range of simulated two-, three-, and four-component data. The results seem to indicate that the method is applicable over a considerably wider range of conditions than nonlinear least squares or the method of moments.
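As a point of comparison for the nonlinear-least-squares alternative mentioned in the abstract, a simple baseline for two-component data can be sketched as a variable-projection grid search. This is not Provencher's Fourier method; the synthetic data, grid, and time constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 10.0, 400)
# Synthetic two-component decay: y = base + a1*exp(-t/tau1) + a2*exp(-t/tau2)
true = 0.5 + 3.0 * np.exp(-t / 0.8) + 1.5 * np.exp(-t / 4.0)
y = true + rng.normal(0.0, 0.02, t.size)

def fit_two_exponentials(t, y, taus):
    """
    Variable-projection fit: grid-search the two time constants and
    solve the amplitudes and baseline by linear least squares.
    """
    best = None
    for i, tau1 in enumerate(taus):
        for tau2 in taus[i + 1:]:
            A = np.column_stack([np.ones_like(t),
                                 np.exp(-t / tau1), np.exp(-t / tau2)])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = np.sum((A @ coef - y) ** 2)
            if best is None or resid < best[0]:
                best = (resid, tau1, tau2, coef)
    return best

resid, tau1, tau2, (base, a1, a2) = fit_two_exponentials(
    t, y, np.geomspace(0.2, 8.0, 40))
```

Like the methods the paper benchmarks against, this baseline must be told the number of components in advance; the appeal of the Fourier approach is that the number of components, the baseline, and the time constants all come out of the analysis automatically.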

  1. Computational Fluid Dynamic Analysis of Hydrodynamic forces on inundated bridge decks

    NASA Astrophysics Data System (ADS)

    Afzal, Bushra; Guo, Junke; Kerenyi, Kornel

    2010-11-01

    The hydraulic forces experienced by an inundated bridge deck are of great importance in the design of bridges. Flood flows or hurricanes add significant hydrodynamic loading on bridges, possibly resulting in failure of the bridge superstructures. The objective of the study is to establish validated computational practice, via computational fluid dynamic simulations, to address the research needs of the transportation community. Reduced-scale experiments conducted at the Turner-Fairbank Highway Research Center provide the foundation for this validation. Three bridge deck prototypes were used: a typical six-girder highway bridge deck, a three-girder deck, and a streamlined deck designed to better withstand the hydraulic forces. Results of the study showed that the streamlined deck significantly reduces the drag, lift, and moment coefficients in comparison to the other bridge deck types. The CFD results matched the experimental data in terms of the relationship between inundation ratio and force measured at the bridge. The results of the present research will provide a tool for designing new bridges and retrofitting old ones.

  2. Multiple steady solutions in a driven cavity

    NASA Astrophysics Data System (ADS)

    Osman, Kahar; McHugh, John

    2004-11-01

    The symmetric driven cavity (Farias and McHugh, Phys. Fluids, 2002) in two and three dimensions is considered. Results are obtained via numerical computations of the Navier-Stokes equations, assuming constant density. The numerical algorithm is a splitting method, using finite differences. The forcing at the top is sinusoidal, and the forcing wavelength is allowed to vary in subsequent trials. The two-dimensional results with 2, 4, and 6 oscillations in the forcing show a subcritical bifurcation to an asymmetric solution, with the Reynolds number as the important parameter. The symmetric solution is found to have vortex flow with streamlines that conform to the boundary shape. The asymmetric solution has vortex flow with streamlines that are approximately circular near the vortex center. Two-dimensional results with 8 or more oscillations in the forcing show a supercritical bifurcation to an asymmetric solution. Three-dimensional simulations show that the length ratios play a critical role, and the depth of the cavity must be large compared to the height in order to achieve the same subcritical bifurcation as in two dimensions.

  3. Stress as an order parameter for the glass transition

    NASA Astrophysics Data System (ADS)

    Visscher, P. B.; Logan, W. T.

    1990-09-01

    The stress tensor has been considered as a possible order parameter for the liquid-glass transition, and its autocorrelation matrix (elements of which are the integrands in the Green-Kubo formulas for bulk and shear viscosity) has been measured in simulations. However, only the k=0 spatial Fourier component has apparently been measured previously. We have measured four Fourier components of all matrix elements of the stress-stress correlation function, and we find that some of those with nonzero wave vector are significantly more persistent (slower decaying) than the k=0 component.
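The Green-Kubo integrands mentioned above can be illustrated for the k=0 shear component with a short sketch: compute the stress autocorrelation from a time series and integrate it. The Ornstein-Uhlenbeck signal below is an assumed stand-in for a simulated sigma_xy(t) trace, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
n_steps, dt = 20_000, 0.01

# Synthetic off-diagonal stress signal sigma_xy(t): an Ornstein-Uhlenbeck
# process standing in for an MD time series (illustrative only).
tau_relax, amp = 0.5, 1.0
s = np.zeros(n_steps)
for i in range(1, n_steps):
    s[i] = s[i - 1] * (1.0 - dt / tau_relax) + amp * np.sqrt(dt) * rng.normal()

def autocorrelation(x, max_lag):
    """Time-averaged autocorrelation <x(0) x(t)> for lags 0..max_lag-1."""
    x = x - x.mean()
    return np.array([np.mean(x[: x.size - k] * x[k:]) for k in range(max_lag)])

acf = autocorrelation(s, 500)

# Green-Kubo: shear viscosity is proportional to the time integral of the
# stress autocorrelation; integrate here with the trapezoidal rule.
gk_integral = dt * (acf.sum() - 0.5 * (acf[0] + acf[-1]))
```

The nonzero-wave-vector components the paper measures would be obtained the same way, after first Fourier-transforming the local stress field in space and correlating each k-component's time series.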

  4. JAMS - a software platform for modular hydrological modelling

    NASA Astrophysics Data System (ADS)

    Kralisch, Sven; Fischer, Christian

    2015-04-01

    Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand an ever-increasing integration of data and process knowledge in corresponding simulation models. Software frameworks that allow for a seamless creation of integrated models based on less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an Open-Source software framework that has been especially designed to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach for representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain-specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models were implemented and successfully applied during the last years. We will present the JAMS core concepts and give an overview of models, simulation components and support tools available for that framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.

  5. On the effect of galactic outflows in cosmological simulations of disc galaxies

    NASA Astrophysics Data System (ADS)

    Valentini, Milena; Murante, Giuseppe; Borgani, Stefano; Monaco, Pierluigi; Bressan, Alessandro; Beck, Alexander M.

    2017-09-01

    We investigate the impact of galactic outflow modelling on the formation and evolution of a disc galaxy, by performing a suite of cosmological simulations with zoomed-in initial conditions (ICs) of a Milky Way-sized halo. We verify how sensitive the general properties of the simulated galaxy are to the way in which stellar-feedback-triggered outflows are implemented, keeping the ICs, simulation code and star formation (SF) model all fixed. We present simulations that are based on a version of the gadget3 code where our sub-resolution model is coupled with an advanced implementation of smoothed particle hydrodynamics that ensures a more accurate fluid sampling and an improved description of gas mixing and hydrodynamical instabilities. We quantify the strong interplay between the adopted hydrodynamic scheme and the sub-resolution model describing SF and feedback. We consider four different galactic outflow models, including the one introduced by Dalla Vecchia & Schaye (2012) and a scheme that is inspired by the Springel & Hernquist (2003) model. We find that the sub-resolution prescriptions adopted to generate galactic outflows are the main factor shaping the stellar disc component at low redshift. The key requirement for a feedback model to be successful in producing a disc-dominated galaxy is the ability to regulate the high-redshift SF (responsible for the formation of the bulge component), the cosmological infall of gas from the large-scale environment, and gas fall-back within the galactic radius at low redshift, in order to avoid an excessively high SF rate at z = 0.

  6. Stability and sensitivity of ABR flow control protocols

    NASA Astrophysics Data System (ADS)

    Tsai, Wie K.; Kim, Yuseok; Chiussi, Fabio; Toh, Chai-Keong

    1998-10-01

    This tutorial paper surveys the important issues in stability and sensitivity analysis of ABR flow control of ATM networks. The stability and sensitivity issues are formulated in a systematic framework. Four main causes of instability in ABR flow control are identified: unstable control laws, temporal variations of available bandwidth with delayed feedback control, misbehaving components, and interactions between higher-layer protocols and ABR flow control. Popular rate-based ABR flow control protocols are evaluated. Stability and sensitivity are shown to be fundamental issues when the network has dynamically varying bandwidth. Simulation results confirming the theoretical studies are provided. Open research problems are discussed.

  7. Study on lockage safety of LNG-fueled ships based on FSA

    PubMed Central

    Lv, Pengfei; Zhuang, Yuan; Deng, Jian; Su, Wei

    2017-01-01

    In the present study, formal safety assessment (FSA) is introduced to investigate lockage safety of LNG-fueled ships. Risk sources during lockage of LNG-fueled ships in four typical scenarios, namely, navigation between two dams, lockage, anchorage, and fueling, are identified, and studied in combination with fundamental leakage probabilities of various components of LNG storage tanks, and simulation results of accident consequences. Some suggestions for lockage safety management of LNG-fueled ships are then proposed. The present research results have certain practical significance for promoting applications of LNG-fueled ships along Chuanjiang River and in Three Gorges Reservoir Region. PMID:28437482

  8. Feasibility and concept study to convert the NASA/AMES vertical motion simulator to a helicopter simulator

    NASA Technical Reports Server (NTRS)

    Belsterling, C. A.; Chou, R. C.; Davies, E. G.; Tsui, K. C.

    1978-01-01

    The conceptual design for converting the vertical motion simulator (VMS) to a multi-purpose aircraft and helicopter simulator is presented. A unique, high performance four degrees of freedom (DOF) motion system was developed to permanently replace the present six DOF synergistic system. The new four DOF system has the following outstanding features: (1) will integrate with the two large VMS translational modes and their associated subsystems; (2) can be converted from helicopter to fixed-wing aircraft simulation through software changes only; (3) interfaces with an advanced cab/visual display system of large dimensions; (4) makes maximum use of proven techniques, convenient materials and off-the-shelf components; (5) will operate within the existing building envelope without modifications; (6) can be built within the specified weight limit and avoid compromising VMS performance; (7) provides maximum performance with a minimum of power consumption; (8) simple design minimizes coupling between motions and maximizes reliability; and (9) can be built within existing budgetary figures.

  9. Building an Open-source Simulation Platform of Acoustic Radiation Force-based Breast Elastography

    PubMed Central

    Wang, Yu; Peng, Bo; Jiang, Jingfeng

    2017-01-01

    Ultrasound-based elastography methods, including strain elastography (SE), acoustic radiation force impulse (ARFI) imaging, point shear wave elastography (pSWE) and supersonic shear imaging (SSI), have been used to differentiate breast tumors among other clinical applications. The objective of this study is to extend a previously published virtual simulation platform built for ultrasound quasi-static breast elastography toward acoustic radiation force-based breast elastography. Consequently, the extended virtual breast elastography simulation platform can be used to validate image pixels with known underlying soft tissue properties (i.e. “ground truth”) in complex, heterogeneous media, enhancing confidence in elastographic image interpretations. The proposed virtual breast elastography system inherited four key components from the previously published virtual simulation platform: an ultrasound simulator (Field II), a mesh generator (Tetgen), a finite element solver (FEBio) and a visualization and data processing package (VTK). Using a simple message passing mechanism, functionalities have now been extended to acoustic radiation force-based elastography simulations. Examples involving three different numerical breast models with increasing complexity (one uniform model, one simple inclusion model and one virtual complex breast model derived from magnetic resonance imaging data) were used to demonstrate capabilities of this extended virtual platform. Overall, simulation results were compared with the published results. In the uniform model, the estimated shear wave speed (SWS) values were within 4% of the predetermined SWS values. In the simple inclusion and the complex breast models, SWS values of all hard inclusions in soft backgrounds were slightly underestimated, similar to what has been reported. 
The elastic contrast values and visual observations show that ARFI images have higher spatial resolution, while SSI images can provide higher inclusion-to-background contrast. In summary, our initial results were consistent with our expectations and with what has been reported in the literature. The proposed (open-source) simulation platform can serve as a single gateway for performing many elastographic simulations in a transparent manner, thereby promoting collaborative development. PMID:28075330
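    The SWS estimates discussed above are commonly obtained with a lateral time-of-flight approach. A sketch of that idea; the array names and the synthetic 2 m/s plane wave below are illustrative assumptions, not the platform's actual code:

```python
import numpy as np

def sws_time_to_peak(displacement, lateral_mm, dt_ms):
    """Estimate shear wave speed by regressing lateral position (mm) against
    the arrival time (ms) of each trace's displacement peak; the slope in
    mm/ms equals the speed in m/s."""
    t_peak = np.argmax(displacement, axis=1) * dt_ms
    return np.polyfit(t_peak, lateral_mm, 1)[0]

# Synthetic plane shear wave traveling at 2 m/s: Gaussian displacement
# pulses that arrive later at larger lateral positions
x = np.linspace(2.0, 8.0, 13)            # lateral positions, mm
dt = 0.1                                  # sampling interval, ms
t = np.arange(400) * dt                   # time axis, ms
disp = np.exp(-(((t[None, :] - x[:, None] / 2.0)) / 0.5) ** 2)
v = sws_time_to_peak(disp, x, dt)         # close to 2 m/s here
```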

  10. Building an open-source simulation platform of acoustic radiation force-based breast elastography

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Peng, Bo; Jiang, Jingfeng

    2017-03-01

    Ultrasound-based elastography methods, including strain elastography, acoustic radiation force impulse (ARFI) imaging, point shear wave elastography and supersonic shear imaging (SSI), have been used to differentiate breast tumors among other clinical applications. The objective of this study is to extend a previously published virtual simulation platform built for ultrasound quasi-static breast elastography toward acoustic radiation force-based breast elastography. Consequently, the extended virtual breast elastography simulation platform can be used to validate image pixels with known underlying soft tissue properties (i.e. ‘ground truth’) in complex, heterogeneous media, enhancing confidence in elastographic image interpretations. The proposed virtual breast elastography system inherited four key components from the previously published virtual simulation platform: an ultrasound simulator (Field II), a mesh generator (Tetgen), a finite element solver (FEBio) and a visualization and data processing package (VTK). Using a simple message passing mechanism, functionalities have now been extended to acoustic radiation force-based elastography simulations. Examples involving three different numerical breast models with increasing complexity (one uniform model, one simple inclusion model and one virtual complex breast model derived from magnetic resonance imaging data) were used to demonstrate capabilities of this extended virtual platform. Overall, simulation results were compared with the published results. In the uniform model, the estimated shear wave speed (SWS) values were within 4% of the predetermined SWS values. In the simple inclusion and the complex breast models, SWS values of all hard inclusions in soft backgrounds were slightly underestimated, similar to what has been reported. The elastic contrast values and visual observations show that ARFI images have higher spatial resolution, while SSI images can provide higher inclusion-to-background contrast. 
In summary, our initial results were consistent with our expectations and with what has been reported in the literature. The proposed (open-source) simulation platform can serve as a single gateway for performing many elastographic simulations in a transparent manner, thereby promoting collaborative development.

  11. Multi-Objective Aerodynamic Optimization of the Streamlined Shape of High-Speed Trains Based on the Kriging Model.

    PubMed

    Xu, Gang; Liang, Xifeng; Yao, Shuanbao; Chen, Dawei; Li, Zhiwei

    2017-01-01

    Minimizing the aerodynamic drag and the lift of the train coach remains a key issue for high-speed trains. With the development of computing technology and computational fluid dynamics (CFD) in the engineering field, CFD has been successfully applied to the design process of high-speed trains. However, developing a new streamlined shape for high-speed trains with excellent aerodynamic performance requires huge computational costs. Furthermore, relationships between multiple design variables and the aerodynamic loads are seldom obtained. In the present study, the Kriging surrogate model is used to perform a multi-objective optimization of the streamlined shape of high-speed trains, where the drag and the lift of the train coach are the optimization objectives. To improve the prediction accuracy of the Kriging model, the cross-validation method is used to construct the optimal Kriging model. The optimization results show that the two objectives are efficiently optimized, indicating that the optimization strategy used in the present study can greatly improve the optimization efficiency and meet the engineering requirements.
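    The surrogate strategy described here can be sketched as an ordinary-kriging-style Gaussian-process fit whose length scale is chosen by cross-validation. The squared-exponential kernel, the toy 1-D response and the candidate length scales below are assumptions for illustration, not the paper's actual CFD objectives:

```python
import numpy as np

def krige_fit(X, y, ell, noise=1e-8):
    """Fit a kriging/GP interpolant with a squared-exponential kernel."""
    K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * ell ** 2))
    return np.linalg.solve(K + noise * np.eye(len(X)), y)

def krige_predict(X, alpha, ell, Xq):
    """Predict at query points Xq from the fitted weights alpha."""
    k = np.exp(-((Xq[:, None] - X[None, :]) ** 2) / (2 * ell ** 2))
    return k @ alpha

def loo_rmse(X, y, ell):
    """Leave-one-out cross-validation RMSE for a candidate length scale."""
    errs = []
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        alpha = krige_fit(X[mask], y[mask], ell)
        errs.append(krige_predict(X[mask], alpha, ell, X[i:i + 1])[0] - y[i])
    return float(np.sqrt(np.mean(np.square(errs))))

# Toy 1-D aerodynamic "response" sampled at 8 design points
X = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * X)
candidates = [0.05, 0.1, 0.2, 0.4]
best = min(candidates, key=lambda ell: loo_rmse(X, y, ell))
```

    In a multi-objective setting one such surrogate would be fitted per objective (drag, lift) and queried by the optimizer instead of running CFD.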

  12. NextGEOSS: The Next Generation Data Hub For Earth Observations

    NASA Astrophysics Data System (ADS)

    Lilja Bye, Bente; De Lathouwer, Bart; Catarino, Nuno; Concalves, Pedro; Trijssenaar, Nicky; Grosso, Nuno; Meyer-Arnek, Julian; Goor, Erwin

    2017-04-01

    The Group on Earth Observations (GEO) has embarked on its next 10-year phase with an ambition to streamline and further develop its achievements in building the Global Earth Observing System of Systems (GEOSS). The NextGEOSS project evolves the European vision of GEOSS data exploitation for innovation and business, relying on the three main pillars of engaging communities, delivering technological developments and advocating the use of GEOSS, in order to support the creation and deployment of Earth observation based innovative research activities and commercial services. In this presentation we will present the NextGEOSS concept, a concept that revolves around providing data and resources to the user communities, together with Cloud resources, seamlessly connected to provide an integrated ecosystem for supporting applications. A central component of NextGEOSS is the strong emphasis put on engaging the communities of providers and users, and bridging the space in between.

  13. Description and testing of the Geo Data Portal: Data integration framework and Web processing services for environmental science collaboration

    USGS Publications Warehouse

    Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.

    2011-01-01

    Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.

  14. On a modified streamline curvature method for the Euler equations

    NASA Technical Reports Server (NTRS)

    Cordova, Jeffrey Q.; Pearson, Carl E.

    1988-01-01

    A modification of the streamline curvature method leads to a quasilinear second-order partial differential equation for the streamline coordinate function. The existence of a stream function is not required. The method is applied to subsonic and supersonic nozzle flow, and to axially symmetric flow with swirl. For many situations, the associated numerical method is both fast and accurate.

  15. STREAMLINED LIFE CYCLE ASSESSMENT: A FINAL REPORT FROM THE SETAC-NORTH AMERICA STREAMLINED LCA WORKGROUP

    EPA Science Inventory

    The original goal of the Streamlined LCA workgroup was to define and document a process for a shortened form of LCA. At the time, because of the large amount of data needed to do a cradle-to-grave evaluation, it was believed that in addition to such a "full" LCA approach there w...

  16. 75 FR 18219 - Statement of Organization, Functions, and Delegations of Authority

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ... programs and resources. Plans, organizes and conducts studies of organizational structures, functional... Americans to remain at home by streamlining access to community-based care and empowering older adults to... American grantees to promote the development of State and Native American-administered, community-based...

  17. Indirect estimation of absorption properties for fine aerosol particles using AATSR observations: a case study of wildfires in Russia in 2010

    NASA Astrophysics Data System (ADS)

    Rodriguez, E.; Kolmonen, P.; Virtanen, T. H.; Sogacheva, L.; Sundstrom, A.-M.; de Leeuw, G.

    2015-08-01

    The Advanced Along-Track Scanning Radiometer (AATSR) on board the ENVISAT satellite is used to study aerosol properties. The retrieval of aerosol properties from satellite data is based on the optimized fit of simulated and measured reflectances at the top of the atmosphere (TOA). The simulations are made using a radiative transfer model with a variety of representative aerosol properties. The retrieval process utilizes a combination of four aerosol components, each of which is defined by its (lognormal) size distribution and a complex refractive index: a weakly and a strongly absorbing fine-mode component, coarse-mode sea salt aerosol, and coarse-mode desert dust aerosol. These components are externally mixed to provide the aerosol model, which in turn is used to calculate the aerosol optical depth (AOD). In the AATSR aerosol retrieval algorithm, the mixing of these components is decided by minimizing the error function given by the sum of the differences between measured and calculated path radiances at 3-4 wavelengths, where the path radiances are varied by varying the aerosol component mixing ratios. The continuous variation of the fine-mode components allows for the continuous variation of the fine-mode aerosol absorption. Assuming that the correct aerosol model (i.e. the correct mixing fractions of the four components) is selected during the retrieval process, other aerosol properties can also be computed, such as the single scattering albedo (SSA). Implications of this assumption regarding the ratio of the weakly/strongly absorbing fine-mode fraction are investigated in this paper by evaluating the validity of the SSA thus obtained. The SSA is indirectly estimated for aerosol plumes with moderate-to-high AOD resulting from wildfires in Russia in the summer of 2010. Together with the AOD, the SSA provides the aerosol absorbing optical depth (AAOD). The results are compared with AERONET data, i.e. AOD level 2.0 and SSA and AAOD inversion products. 
The root mean square error (RMSE) is 0.03 for SSA and 0.02 for AAOD values below 0.05. The SSA is further evaluated by comparison with the SSA retrieved from the Ozone Monitoring Instrument (OMI). The SSAs retrieved from the two instruments show similar features, with generally lower AATSR-estimated SSA values over areas affected by wildfires.
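    The mixing-ratio selection described above amounts to a constrained least-squares fit over the four component fractions. A brute-force sketch; the component radiances and the 60/20/10/10 "measured" mixture are invented placeholders, whereas the real algorithm fits actual TOA path radiances:

```python
import numpy as np
from itertools import product

def best_mixture(R_meas, R_comp, step=0.05):
    """Grid-search the four component mixing fractions (summing to 1) that
    minimize the squared misfit between measured and simulated path
    radiances across the wavelengths."""
    fracs = np.arange(0.0, 1.0 + 1e-9, step)
    best, best_err = None, np.inf
    for w1, w2, w3 in product(fracs, repeat=3):
        w4 = 1.0 - w1 - w2 - w3
        if w4 < -1e-9:
            continue
        w = np.array([w1, w2, w3, max(w4, 0.0)])
        err = np.sum((R_meas - w @ R_comp) ** 2)
        if err < best_err:
            best, best_err = w, err
    return best, best_err

# Hypothetical component path radiances at 4 wavelengths (rows: weak fine
# mode, strong fine mode, sea salt, dust) and a "measured" 60/20/10/10 mix
R_comp = np.array([[0.10, 0.08, 0.06, 0.05],
                   [0.12, 0.07, 0.05, 0.03],
                   [0.20, 0.18, 0.15, 0.14],
                   [0.18, 0.15, 0.13, 0.12]])
truth = np.array([0.6, 0.2, 0.1, 0.1])
w, err = best_mixture(truth @ R_comp, R_comp)
```

    The recovered weak/strong fine-mode split is exactly the quantity the paper propagates into the SSA estimate.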

  18. Two-Dimensional Computational Fluid Dynamics and Conduction Simulations of Heat Transfer in Horizontal Window Frames with Internal Cavities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustavsen, Arlid; Kohler, Christian; Dalehaug, Arvid

    2008-12-01

    This paper assesses the accuracy of the simplified frame cavity conduction/convection and radiation models presented in ISO 15099 and used in software for rating and labeling window products. Temperatures and U-factors for typical horizontal window frames with internal cavities are compared; results from Computational Fluid Dynamics (CFD) simulations with detailed radiation modeling are used as a reference. Four different frames were studied. Two were made of polyvinyl chloride (PVC) and two of aluminum. For each frame, six different simulations were performed, two with a CFD code and four with a building-component thermal-simulation tool using the Finite Element Method (FEM). This FEM tool addresses convection using correlations from ISO 15099; it addresses radiation with either correlations from ISO 15099 or with a detailed, view-factor-based radiation model. Calculations were performed using the CFD code with and without fluid flow in the window frame cavities; the calculations without fluid flow were performed to verify that the CFD code and the building-component thermal-simulation tool produced consistent results. With the FEM code, the practice of subdividing small frame cavities was examined: in some cases not subdividing, in some cases subdividing cavities with interconnections smaller than five millimeters (mm) (ISO 15099), and in some cases subdividing cavities with interconnections smaller than seven mm (a breakpoint that has been suggested in other studies). For the various frames, the calculated U-factors were found to be quite comparable (the maximum difference between the reference CFD simulation and the other simulations was 13.2 percent). A maximum difference of 8.5 percent was found between the CFD simulation and the FEM simulation using ISO 15099 procedures. The ISO 15099 correlation works best for frames with high U-factors. For more efficient frames, the relative differences among various simulations are larger. 
Temperatures were also compared at selected locations on the frames; only small differences were found from model to model. Finally, the effectiveness of the ISO cavity radiation algorithms was examined by comparing results from these algorithms to detailed radiation calculations (from both programs). Our results suggest that improvements in cavity heat transfer calculations can be obtained by using detailed radiation modeling (i.e. view-factor or ray-tracing models), and that incorporation of these strategies may be more important for improving the accuracy of results than the use of CFD modeling for horizontal cavities.

  19. Mathematical modeling and simulation in animal health. Part I: Moving beyond pharmacokinetics.

    PubMed

    Riviere, J E; Gabrielsson, J; Fink, M; Mochel, J

    2016-06-01

    The application of mathematical modeling to problems in animal health has a rich history in the form of pharmacokinetic modeling applied to problems in veterinary medicine. Advances in modeling and simulation beyond pharmacokinetics have the potential to streamline and speed up drug research and development programs. To foster these goals, a series of manuscripts will be published with the following goals: (i) expand the application of modeling and simulation to issues in veterinary pharmacology; (ii) bridge the gap between the level of modeling and simulation practiced in human and veterinary pharmacology; (iii) explore how modeling and simulation concepts can be used to improve our understanding of common issues not readily addressed in human pharmacology (e.g. breed differences, tissue residue depletion, vast weight ranges among adults within a single species, interspecies differences, small animal species research where data collection is limited to sparse sampling, availability of different sampling matrices); and (iv) describe how quantitative pharmacology approaches could help in understanding key pharmacokinetic and pharmacodynamic characteristics of a drug candidate, with the goal of providing explicit, reproducible, and predictive evidence for optimizing drug development plans, enabling critical decision making, and eventually bringing safe and effective medicines to patients. This study introduces these concepts and new approaches to modeling and simulation, and clearly articulates basic assumptions and good practices. The driving force behind these activities is to create predictive models that are based on solid physiological and pharmacological principles, while adhering to the limitations that are fundamental to applying mathematical and statistical models to biological systems. © 2015 John Wiley & Sons Ltd.
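    As a baseline for the pharmacokinetic modeling the series starts from, a one-compartment model with first-order absorption (the classic Bateman equation) can be simulated in a few lines; the dose and rate constants below are illustrative only:

```python
import numpy as np

def conc_oral_1cpt(t, dose, F, ka, ke, V):
    """Plasma concentration for a one-compartment model with first-order
    absorption: C(t) = F*D*ka / (V*(ka-ke)) * (exp(-ke*t) - exp(-ka*t))."""
    return (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0, 24, 241)  # hours
# Illustrative parameters: 500 mg dose, 80% bioavailability, ka=1.2/h,
# ke=0.15/h, V=30 L
C = conc_oral_1cpt(t, dose=500, F=0.8, ka=1.2, ke=0.15, V=30)
tmax = t[np.argmax(C)]       # analytic value: ln(ka/ke) / (ka - ke)
```

    Moving "beyond pharmacokinetics", as the abstract proposes, means coupling such concentration curves to effect (pharmacodynamic) models rather than stopping here.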

  20. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
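    The flavor of the diffusion-simulation category can be conveyed by a minimal Bass-style sketch in which each agent adopts with a probability combining innovation and imitation; the parameter values are illustrative, not taken from any application in the paper:

```python
import random

def diffusion_sim(n_agents=1000, p=0.01, q=0.35, steps=50, seed=1):
    """Minimal agent-based diffusion: each step, a non-adopter adopts with
    probability p (innovation) + q * adopted_fraction (imitation)."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(steps):
        frac = sum(adopted) / n_agents
        for i in range(n_agents):
            if not adopted[i] and rng.random() < p + q * frac:
                adopted[i] = True
        history.append(sum(adopted))
    return history

curve = diffusion_sim()  # cumulative adopters per step (an S-shaped curve)
```

    Richer agent-based models differ mainly in giving each agent heterogeneous state and an explicit interaction topology instead of the global adopted fraction used here.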

  1. Simulating the component counts of combinatorial structures.

    PubMed

    Arratia, Richard; Barbour, A D; Ewens, W J; Tavaré, Simon

    2018-02-09

    This article describes and compares methods for simulating the component counts of random logarithmic combinatorial structures such as permutations and mappings. We exploit the Feller coupling for simulating permutations to provide a very fast method for simulating logarithmic assemblies more generally. For logarithmic multisets and selections, this approach is replaced by an acceptance/rejection method based on a particular conditioning relationship that represents the distribution of the combinatorial structure as that of independent random variables conditioned on a weighted sum. We show how to improve its acceptance rate. We illustrate the method by estimating the probability that a random mapping has no repeated component sizes, and establish the asymptotic distribution of the difference between the number of components and the number of distinct component sizes for a very general class of logarithmic structures. Copyright © 2018. Published by Elsevier Inc.
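    For the uniform-permutation case (θ = 1), the Feller coupling mentioned above can be sketched directly: draw independent Bernoulli(1/i) marks and read off cycle sizes as the spacings between successive 1s. This is a simplified illustration, not the authors' full implementation:

```python
import random

def feller_cycle_counts(n, rng=random):
    """Sample the cycle counts of a uniform random permutation of n items
    via the Feller coupling: xi_i ~ Bernoulli(1/i) independently (xi_1 = 1
    almost surely), and cycle sizes are the spacings between successive 1s."""
    xi = [1 if rng.random() < 1.0 / i else 0 for i in range(1, n + 1)]
    xi.append(1)  # sentinel 1 closes the last spacing
    counts, last = {}, 0
    for pos in range(1, n + 1):
        if xi[pos]:
            counts[pos - last] = counts.get(pos - last, 0) + 1
            last = pos
    return counts  # counts[j] = number of components (cycles) of size j

counts = feller_cycle_counts(100)
```

    The coupling is fast because it needs only n coin flips, with no permutation ever constructed; the component sizes always sum to n.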

  2. Improving ED efficiency to capture additional revenue.

    PubMed

    Mandavia, Sujal; Samaniego, Loretta

    2016-06-01

    An increase in the number of patients visiting emergency departments (EDs) presents an opportunity for additional revenue if hospitals take four steps to optimize resources: Streamline the patient pathway and reduce the amount of time each patient occupies a bed in the ED. Schedule staff according to the busy and light times for patient arrivals. Perform registration and triage bedside, reducing initial wait times. Create an area for patients to wait for test results so beds can be freed up for new arrivals.

  3. Theoretical analysis of incompressible flow through a radial-inlet centrifugal impeller at various weight flows

    NASA Technical Reports Server (NTRS)

    Kramer, James J; Prian, Vasily D; Wu, Chung-Hua

    1956-01-01

    A method for the solution of the incompressible nonviscous flow through a centrifugal impeller, including the inlet region, is presented. Several numerical solutions are obtained for four weight flows through an impeller at one operating speed. These solutions are refined in the leading-edge region. The results are presented in a series of figures showing streamlines and relative velocity contours. A comparison is made with the results obtained by using a rapid approximate method of analysis.

  4. Continuing Experiments on the Receptivity of Transient Disturbances to Surface Roughness and Freestream Turbulence

    DTIC Science & Technology

    2008-09-28

    rotating the spindle of the angle controller with a precision of 0.2°. The multiple-hotwire holder is designed to carry four hotwires. One hotwire is a...section and a maximum operating speed of 25 m/s. The tunnel’s design follows the recommendations of Reshotko et al. (1997) for flow quality. Operating at...This sting assembly includes a shaft that rotates in a streamlined casing and allows angular calibration of slanted hotwires. Outside the test

  5. Prioritizing pharmacokinetic drug interaction precipitants in natural products: application to OATP inhibitors in grapefruit juice

    PubMed Central

    Johnson, Emily J.; Won, Christina S.; Köck, Kathleen; Paine, Mary F.

    2017-01-01

    Natural products, including botanical dietary supplements and exotic drinks, represent an ever-increasing share of the health care market. The parallel ever-increasing popularity of self-medicating with natural products increases the likelihood of co-consumption with conventional drugs, raising concerns for unwanted natural product-drug interactions. Assessing the drug interaction liability of natural products is challenging due to the complex and variable chemical composition inherent to these products, necessitating a streamlined preclinical testing approach to prioritize individual precipitant constituents for further investigation. Such an approach was evaluated in the current work to prioritize constituents in the model natural product, grapefruit juice, as inhibitors of intestinal organic anion-transporting polypeptide (OATP)-mediated uptake. Using OATP2B1-expressing MDCKII cells and the probe substrate estrone 3-sulfate, IC50s were determined for constituents representative of the flavanone (naringin, naringenin, hesperidin), furanocoumarin (bergamottin, 6′,7′-dihydroxybergamottin), and polymethoxyflavone (nobiletin and tangeretin) classes contained in grapefruit juice. Nobiletin was the most potent (IC50, 3.7 μM); 6′,7′-dihydroxybergamottin, naringin, naringenin, and tangeretin were moderately potent (IC50, 20–50 μM); and bergamottin and hesperidin were the least potent (IC50, >300 μM) OATP2B1 inhibitors. Intestinal absorption simulations based on physicochemical properties were used to determine ratios of unbound concentration to IC50 for each constituent within enterocytes and to rank constituents against pre-defined cut-off values. This streamlined approach could be applied to other natural products that contain multiple precipitants of natural product-drug interactions. PMID:28032362
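    The prioritization step reduces to ranking constituents by their unbound concentration-to-IC50 ratio against a cut-off. In the sketch below, the IC50s for nobiletin, bergamottin and hesperidin follow the abstract, the moderate group is set to an assumed mid-range value, and all concentrations are invented placeholders:

```python
def prioritize(conc_um, ic50_um, cutoff=0.1):
    """Rank constituents by unbound enterocyte concentration / IC50 and
    flag those at or above a pre-defined cut-off for follow-up study."""
    ratios = {k: conc_um[k] / ic50_um[k] for k in ic50_um}
    ranked = sorted(ratios.items(), key=lambda kv: kv[1], reverse=True)
    return [(name, round(r, 3), r >= cutoff) for name, r in ranked]

# IC50s (uM): nobiletin and the >300 uM pair follow the abstract; the
# 20-50 uM group is set to an assumed 35 uM midpoint
ic50 = {"nobiletin": 3.7, "6',7'-DHB": 35.0, "naringin": 35.0,
        "naringenin": 35.0, "tangeretin": 35.0, "bergamottin": 300.0,
        "hesperidin": 300.0}
# Enterocyte unbound concentrations (uM): purely illustrative placeholders
conc = {"nobiletin": 1.0, "6',7'-DHB": 30.0, "naringin": 200.0,
        "naringenin": 20.0, "tangeretin": 0.5, "bergamottin": 5.0,
        "hesperidin": 50.0}
ranking = prioritize(conc, ic50)
```

    With these placeholder concentrations an abundant but only moderately potent constituent can outrank the most potent one, which is exactly why the ratio, not the IC50 alone, drives the prioritization.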

  6. Visualization of Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Gerald-Yamasaki, Michael; Hultquist, Jeff; Bryson, Steve; Kenwright, David; Lane, David; Walatka, Pamela; Clucas, Jean; Watson, Velvin; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Scientific visualization serves the dual purpose of exploration and exposition of the results of numerical simulations of fluid flow. Along with the basic visualization process which transforms source data into images, there are four additional components to a complete visualization system: Source Data Processing, User Interface and Control, Presentation, and Information Management. The requirements imposed by the desired mode of operation (i.e. real-time, interactive, or batch) and by the source data affect each of these visualization system components. The special requirements imposed by the wide variety and size of the source data provided by the numerical simulation of fluid flow present an enormous challenge to the visualization system designer. We describe the visualization system components, including specific visualization techniques, and how the mode of operation and source data requirements affect the construction of computational fluid dynamics visualization systems.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Malley, Daniel; Vesselinov, Velimir V.

    MADSpython (Model analysis and decision support tools in Python) is a Python code that streamlines the process of using data and models for analysis and decision support with the code MADS. MADS is open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). MADS can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. The Python scripts in MADSpython facilitate the generation of the input and output files needed by MADS as well as by the external simulators, which include FEHM and PFLOTRAN. MADSpython enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. MADSpython will be released under the GPL V3 license. MADSpython will be distributed as a Git repo at gitlab.com and github.com. The MADSpython manual and documentation will be posted at http://madspy.lanl.gov.

  8. Urban weather data and building models for the inclusion of the urban heat island effect in building performance simulation.

    PubMed

    Palme, M; Inostroza, L; Villacreses, G; Lobato, A; Carrasco, C

    2017-10-01

    This data article presents files supporting calculation for urban heat island (UHI) inclusion in building performance simulation (BPS). The methodology is used in the research article "From urban climate to energy consumption. Enhancing building performance simulation by including the urban heat island effect" (Palme et al., 2017) [1]. In this research, a Geographical Information System (GIS) study is done in order to statistically represent the most important urban scenarios of four South-American cities (Guayaquil, Lima, Antofagasta and Valparaíso). Then, a Principal Component Analysis (PCA) is done to obtain reference Urban Tissue Categories (UTC) to be used in urban weather simulation. The urban weather files are generated using the Urban Weather Generator (UWG) software (version 4.1 beta). Finally, BPS is run with the Transient System Simulation (TRNSYS) software (version 17). In this data paper, four sets of data are presented: 1) PCA data (excel) explaining how to group different urban samples into representative UTC; 2) UWG data (text) to reproduce the urban weather generation for the UTC used in the four cities (4 UTC in Lima, Guayaquil and Antofagasta, and 5 UTC in Valparaíso); 3) weather data (text) with the resulting rural and urban weather; 4) BPS model data (text) containing the TRNSYS models (four building models).
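    The PCA grouping step can be sketched with an SVD of the column-centered indicator matrix; the urban-form indicators and numbers below are invented placeholders, not the article's data:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples onto their leading principal components via SVD of
    the column-centered data matrix; also return the singular values."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, s

# Hypothetical urban-form indicators (rows: sample zones; columns could be
# built density, mean building height, vegetation fraction)
X = np.array([[0.8, 25.0, 0.05],
              [0.7, 22.0, 0.10],
              [0.3,  8.0, 0.40],
              [0.2,  6.0, 0.55],
              [0.5, 15.0, 0.25]])
scores, s = pca_scores(X)
```

    Zones that land close together in the score space would be grouped into the same urban tissue category and share one generated urban weather file.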

  9. Wear Scar Similarities between Retrieved and Simulator-Tested Polyethylene TKR Components: An Artificial Neural Network Approach

    PubMed Central

    2016-01-01

    The aim of this study was to determine how representative the wear scars of simulator-tested polyethylene (PE) inserts are compared with retrieved PE inserts from total knee replacement (TKR). By means of a nonparametric self-organizing feature map (SOFM), wear scar images of 21 postmortem- and 54 revision-retrieved components were compared with six simulator-tested components that were tested either in displacement or in load control according to ISO protocols. The SOFM network was first trained with the wear scar images of the postmortem-retrieved components, since those are considered well-functioning at the time of retrieval. Based on this training process, eleven clusters were established, suggesting considerable variability among wear scars despite an uncomplicated loading history inside their hosts. The remaining components (revision-retrieved and simulator-tested) were then assigned to these established clusters. Five of the six simulator-tested components were clustered together, suggesting that the network was able to identify similarities in loading history. However, the simulator-tested components ended up in a cluster at the fringe of the map containing only 10.8% of the retrieved components. This may suggest that current ISO testing protocols are not fully representative of this TKR population, and that protocols that better resemble patients' gait after TKR, containing activities other than walking, may be warranted. PMID:27597955
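    The train-then-assign pattern described above can be sketched with a tiny one-dimensional self-organizing map; this is a generic textbook SOM on feature vectors, not the authors' network or their image descriptors:

```python
import numpy as np

def train_som(data, n_nodes=5, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a 1-D self-organizing map on rows of `data`."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=(n_nodes, data.shape[1]))     # node weight vectors
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                   # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5       # shrinking neighborhood
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(w - x, axis=1))  # best-matching unit
            d = np.arange(n_nodes) - bmu                    # distance on the map
            h = np.exp(-d**2 / (2 * sigma**2))              # neighborhood weights
            w += lr * h[:, None] * (x - w)                  # pull nodes toward x
    return w

def assign(w, x):
    """Assign a new sample to its best-matching map node (cluster)."""
    return int(np.argmin(np.linalg.norm(w - x, axis=1)))
```

    Training on descriptors of the well-functioning retrievals and then calling `assign` on the simulator-tested descriptors mirrors the clustering step of the study.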

  10. Spatially-Distributed Stream Flow and Nutrient Dynamics Simulations Using the Component-Based AgroEcoSystem-Watershed (AgES-W) Model

    NASA Astrophysics Data System (ADS)

    Ascough, J. C.; David, O.; Heathman, G. C.; Smith, D. R.; Green, T. R.; Krause, P.; Kipka, H.; Fink, M.

    2010-12-01

    The Object Modeling System 3 (OMS3), currently being developed by the USDA-ARS Agricultural Systems Research Unit and Colorado State University (Fort Collins, CO), provides a component-based environmental modeling framework which allows the implementation of single- or multi-process modules that can be developed and applied as custom-tailored model configurations. OMS3 as a “lightweight” modeling framework contains four primary foundations: modeling resources (e.g., components) annotated with modeling metadata; domain specific knowledge bases and ontologies; tools for calibration, sensitivity analysis, and model optimization; and methods for model integration and performance scalability. The core is able to manage modeling resources and development tools for model and simulation creation, execution, evaluation, and documentation. OMS3 is based on the Java platform but is highly interoperable with C, C++, and FORTRAN on all major operating systems and architectures. The ARS Conservation Effects Assessment Project (CEAP) Watershed Assessment Study (WAS) Project Plan provides detailed descriptions of ongoing research studies at 14 benchmark watersheds in the United States. In order to satisfy the requirements of CEAP WAS Objective 5 (“develop and verify regional watershed models that quantify environmental outcomes of conservation practices in major agricultural regions”), a new watershed model development approach was initiated to take advantage of OMS3 modeling framework capabilities. Specific objectives of this study were to: 1) disaggregate and refactor various agroecosystem models (e.g., J2K-S, SWAT, WEPP) and implement hydrological, N dynamics, and crop growth science components under OMS3, 2) assemble a new modular watershed scale model for fully-distributed transfer of water and N loading between land units and stream channels, and 3) evaluate the accuracy and applicability of the modular watershed model for estimating stream flow and N dynamics. 
The Cedar Creek watershed (CCW) in northeastern Indiana, USA was selected for application of the OMS3-based AgroEcoSystem-Watershed (AgES-W) model. AgES-W performance for stream flow and N loading was assessed using Nash-Sutcliffe model efficiency (ENS) and percent bias (PBIAS) model evaluation statistics. Comparisons of daily and average monthly simulated and observed stream flow and N loads for the 1997-2005 simulation period resulted in PBIAS and ENS values that were similar or better than those reported in the literature for SWAT stream flow and N loading predictions at a similar scale. The results show that the AgES-W model was able to reproduce the hydrological and N dynamics of the CCW with sufficient quality, and should serve as a foundation upon which to better quantify additional water quality indicators (e.g., sediment transport and P dynamics) at the watershed scale.
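    The two model evaluation statistics used above have standard definitions; a minimal sketch with toy series (not the CCW stream flow data):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean of obs."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias: positive values indicate model underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

obs = [1.0, 2.0, 3.0, 4.0]
print(nse(obs, obs))    # perfect match -> 1.0
print(pbias(obs, obs))  # no bias -> 0.0
```

    Simulating the mean of the observations yields an NSE of exactly zero, which is why positive NSE values are read as "better than the mean predictor."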

  11. Development of Reliable and Validated Tools to Evaluate Technical Resuscitation Skills in a Pediatric Simulation Setting: Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics.

    PubMed

    Faudeux, Camille; Tran, Antoine; Dupont, Audrey; Desmontils, Jonathan; Montaudié, Isabelle; Bréaud, Jean; Braun, Marc; Fournier, Jean-Paul; Bérard, Etienne; Berlengi, Noémie; Schweitzer, Cyril; Haas, Hervé; Caci, Hervé; Gatin, Amélie; Giovannini-Chami, Lisa

    2017-09-01

    To develop a reliable and validated tool to evaluate technical resuscitation skills in a pediatric simulation setting. Four Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics (RESCAPE) evaluation tools were created, following international guidelines: intraosseous needle insertion, bag mask ventilation, endotracheal intubation, and cardiac massage. We applied a modified Delphi methodology evaluation to binary rating items. Reliability was assessed comparing the ratings of 2 observers (1 in real time and 1 after a video-recorded review). The tools were assessed for content, construct, and criterion validity, and for sensitivity to change. Inter-rater reliability, evaluated with Cohen kappa coefficients, was perfect or near-perfect (>0.8) for 92.5% of items and each Cronbach alpha coefficient was ≥0.91. Principal component analyses showed that all 4 tools were unidimensional. Significant increases in median scores with increasing levels of medical expertise were demonstrated for RESCAPE-intraosseous needle insertion (P = .0002), RESCAPE-bag mask ventilation (P = .0002), RESCAPE-endotracheal intubation (P = .0001), and RESCAPE-cardiac massage (P = .0037). Significantly increased median scores over time were also demonstrated during a simulation-based educational program. RESCAPE tools are reliable and validated tools for the evaluation of technical resuscitation skills in pediatric settings during simulation-based educational programs. They might also be used for medical practice performance evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.
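    Inter-rater reliability on binary checklist items, as reported above, uses the standard Cohen's kappa; a minimal sketch with illustrative ratings (not the RESCAPE data):

```python
import numpy as np

def cohen_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)                      # observed agreement
    pe = 0.0                                    # expected chance agreement
    for c in np.union1d(r1, r2):
        pe += np.mean(r1 == c) * np.mean(r2 == c)
    return (po - pe) / (1.0 - pe)

# Hypothetical pass/fail item ratings from a live and a video-review observer.
rater_live  = [1, 1, 0, 1, 0, 1, 1, 0]
rater_video = [1, 1, 0, 1, 0, 1, 0, 0]
print(cohen_kappa(rater_live, rater_video))  # -> 0.75
```

    Values above 0.8, as for 92.5% of the RESCAPE items, are conventionally read as near-perfect agreement.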

  12. Impact of viscosity variation and micro rotation on oblique transport of Cu-water fluid.

    PubMed

    Tabassum, Rabil; Mehmood, R; Nadeem, S

    2017-09-01

    This study inspects the influence of temperature-dependent viscosity on the oblique flow of a micropolar nanofluid. The fluid viscosity is considered to be an exponential function of temperature. The governing equations are converted into dimensionless form with the aid of suitable transformations. Outcomes of the study are shown in graphical form and discussed in detail. The results reveal that the viscosity parameter has pronounced effects on the velocity profiles, temperature distribution, micro-rotation, streamlines, shear stress and heat flux. It is found that the viscosity parameter enhances the temperature distribution, the tangential velocity profile, the normal component of micro-rotation and the shear stress at the wall, while it has a decreasing effect on the tangential component of micro-rotation and the local heat flux. Copyright © 2017 Elsevier Inc. All rights reserved.
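    An exponential viscosity-temperature law of the kind described above (often called Reynolds' model) can be written as μ(θ) = μ_ref·exp(−αθ) in dimensionless form; the reference viscosity and viscosity parameter α below are arbitrary sample values, not those of the study:

```python
import numpy as np

def viscosity(theta, mu_ref=1.0, alpha=0.5):
    """Dimensionless viscosity as an exponential function of temperature theta."""
    return mu_ref * np.exp(-alpha * theta)

theta = np.linspace(0.0, 1.0, 5)
print(viscosity(theta))  # viscosity decreases monotonically as temperature rises
```

    Larger α makes the fluid thin faster with heating, which is the mechanism behind the reported effects on the velocity and temperature profiles.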

  13. Study of parameters and entrainment of a jet in cross-flow arrangement with transition at two low Reynolds numbers

    NASA Astrophysics Data System (ADS)

    Cárdenas, Camilo; Denev, Jordan A.; Suntz, Rainer; Bockhorn, Henning

    2012-10-01

    Investigation of the mixing process is one of the main issues in chemical engineering and combustion, and the configuration of a jet in cross-flow (JCF) is often employed for this purpose. Experimental data are gained for the symmetry plane in a JCF arrangement of an air flow using a combination of particle image velocimetry (PIV) and laser-induced fluorescence (LIF). The experimental data, with thoroughly measured boundary conditions, are complemented with direct numerical simulations, which are based on idealized boundary conditions. Two similar cases are studied with a fixed jet-to-cross-flow velocity ratio of 3.5 and cross-flow Reynolds numbers equal to 4,120 and 8,240; in both cases the jet issues from the pipe at laminar conditions. This leads to a laminar-to-turbulent transition, which depends on the Reynolds number and occurs more quickly for the higher Reynolds number, in both the experiments and the simulations. It was found that the Reynolds number only slightly affects the jet trajectory, which penetrates slightly deeper for the higher Reynolds number; this is attributed to the changed boundary-layer shape of the cross-flow. Leeward streamlines bend toward the jet and are responsible for the strong entrainment of cross-flow fluid into the jet. Velocity components are compared for the two Reynolds numbers on the leeward side at positions where the strongest entrainment is present, and a pressure minimum near the jet trajectory is found. The numerical simulations showed that entrainment is higher for the higher Reynolds number, which is attributed to the earlier transition in this case. Fluid entrainment of the jet in cross-flow is more than twice as strong as for a similar flow of a jet issuing into a co-flowing stream.
This comparison is made along the trajectory of the two jets at a distance of 5.5 jet diameters downstream and is based on the results from the direct numerical simulations and recently published experiments of a straight jet into a co-flow. Mixing is further studied by means of second-order statistics of the passive scalar variance and the Reynolds fluxes. Windward and leeward sides of the jet exhibit different signs for the time-averaged streamwise Reynolds flux < v x ' c'>. The large coherent structures which contribute to this effect are investigated by means of timely correlated instantaneous PIV-LIF camera snapshots and their contribution to the average statistics of < v x ' c'> are discussed. The discussion on mixing capabilities of the jet in cross-flow is supported by simulation results showing the instantaneous three-dimensional coherent structures defined in terms of the pressure fluctuations.
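    The time-averaged Reynolds flux ⟨v_x'c'⟩ discussed above is the mean product of the fluctuating parts of the streamwise velocity and the passive scalar. A minimal sketch on synthetic correlated time series, standing in for synchronized PIV-LIF samples at one point:

```python
import numpy as np

def reynolds_flux(vx, c):
    """Time-averaged product of the fluctuating parts of vx and c."""
    vx, c = np.asarray(vx, float), np.asarray(c, float)
    return np.mean((vx - vx.mean()) * (c - c.mean()))

rng = np.random.default_rng(0)
base = rng.normal(size=1000)
vx = 2.0 + base                    # velocity fluctuating about its mean
c = 0.5 + 0.3 * base               # positively correlated scalar fluctuations
print(reynolds_flux(vx, c) > 0.0)  # positive flux for positively correlated signals
```

    The sign difference between the windward and leeward sides then simply reflects whether velocity and concentration fluctuations are positively or negatively correlated there.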

  14. Streamline coal slurry letdown valve

    DOEpatents

    Platt, Robert J.; Shadbolt, Edward A.

    1983-01-01

    A streamlined coal slurry letdown valve is featured which has a two-piece throat comprised of a seat and a seat retainer. The two-piece design allows for easy assembly and disassembly of the valve. A novel cage holds the two-piece throat together during the high-pressure letdown. The coal slurry letdown valve has a long operating life as a result of its streamlined and erosion-resistant surfaces.

  15. Streamline coal slurry letdown valve

    DOEpatents

    Platt, R.J.; Shadbolt, E.A.

    1983-11-08

    A streamlined coal slurry letdown valve is featured which has a two-piece throat comprised of a seat and a seat retainer. The two-piece design allows for easy assembly and disassembly of the valve. A novel cage holds the two-piece throat together during the high-pressure letdown. The coal slurry letdown valve has a long operating life as a result of its streamlined and erosion-resistant surfaces. 5 figs.

  16. An Experimental Study of the Flowfield on a Semispan Rectangular Wing with a Simulated Glaze Ice Accretion. Ph.D. Thesis, 1993 Final Report

    NASA Technical Reports Server (NTRS)

    Khodadoust, Abdollah

    1994-01-01

    Wind tunnel experiments were conducted in order to study the effect of a simulated glaze ice accretion on the flowfield of a semispan, reflection-plane, rectangular wing at Re = 1.5 million and M = 0.12. A laser Doppler velocimeter was used to map the flowfield on the upper surface of the model in both the clean and iced configurations at alpha = 0, 4, and 8 degrees angle of attack. At low angles of attack, the massive separation bubble aft of the leading-edge ice horn was found to behave in a manner similar to laminar separation bubbles. At alpha = 0 and 4 degrees, the locations of transition and reattachment, as deduced from momentum thickness distributions, were found to be in good agreement with transition and reattachment locations in laminar separation bubbles. These values at y/b = 0.470, the centerline measurement location, matched well with data obtained on a similar but two-dimensional model. The measured velocity profiles on the iced wing compared reasonably with the predicted profiles from Navier-Stokes computations. The ice-induced separation bubble was also found to have features similar to the recirculating region aft of rearward-facing steps. At alpha = 0 and 4 degrees, reverse flow magnitudes and turbulence intensity levels were typical of those found in the recirculating region aft of rearward-facing steps. The calculated separation streamline aft of the ice horn at alpha = 4 degrees, y/b = 0.470 coincided with the locus of the maximum Reynolds normal stress. The maximum Reynolds normal stress peaked at two locations along the separation streamline. The location of the first peak coincided with the transition location, as deduced from the momentum thickness distributions. The location of the second peak was just upstream of reattachment, in good agreement with measurements of flows over similar obstacles.
The intermittency factor in the vicinity of reattachment at alpha = 4 degrees, y/b = 0.470, revealed the time-dependent nature of the reattachment process. The size and extent of the separation bubble were found to be a function of angle of attack and the spanwise location. Three dimensional effects were found to be strongest at alpha = 8 degrees. The calculated separation and stagnation streamlines were found to vary little with spanwise location at alpha = 0 degrees. The calculated separation streamlines at alpha = 4 degrees revealed that the bubble was largest near the centerline measurement plane, whereas the tip-induced vortex flow and the model root-tunnel wall boundary-layer interaction reduced the size of the bubble. These effects were found to be most dramatic at alpha = 8 degrees.
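    The momentum thickness used above to locate transition and reattachment is the integral θ = ∫ (u/Ue)(1 − u/Ue) dy across the boundary layer. A minimal sketch with an illustrative 1/7-power-law profile, not the LDV data:

```python
import numpy as np

def momentum_thickness(y, u, u_edge):
    """Trapezoidal integral of (u/Ue)(1 - u/Ue) over the profile points (y, u)."""
    r = np.asarray(u, float) / u_edge
    f = r * (1.0 - r)
    y = np.asarray(y, float)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(y)))

y = np.linspace(0.0, 1.0, 201)       # wall-normal coordinate, edge at y = 1
u = y ** (1.0 / 7.0)                 # 1/7-power-law turbulent profile
theta = momentum_thickness(y, u, 1.0)
print(theta)                         # analytic value for this profile is 7/72
```

    Tracking how θ grows along the surface is what reveals the transition and reattachment locations cited in the study.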

  17. Failure of Anisotropic Unstructured Mesh Adaption Based on Multidimensional Residual Minimization

    NASA Technical Reports Server (NTRS)

    Wood, William A.; Kleb, William L.

    2003-01-01

    An automated anisotropic unstructured mesh adaptation strategy is proposed, implemented, and assessed for the discretization of viscous flows. The adaption criterion is based upon the minimization of the residual fluctuations of a multidimensional upwind viscous flow solver. For scalar advection, this adaption strategy has been shown to use fewer grid points than gradient-based adaption, naturally aligning mesh edges with discontinuities and characteristic lines. The adaption utilizes a compact stencil and is local in scope, with four fundamental operations: point insertion, point deletion, edge swapping, and nodal displacement. Evaluation of the solution-adaptive strategy is performed for a two-dimensional blunt-body laminar wind tunnel case at Mach 10. The results demonstrate that the strategy suffers from a lack of robustness, particularly with regard to alignment of the bow shock in the vicinity of the stagnation streamline. In general, constraining the adaption to such a degree as to maintain robustness results in negligible improvement to the solution. Because the present method fails to consistently or significantly improve the flow solution, it is rejected in favor of simple uniform mesh refinement.

  18. Vortex-Body Interactions: A Critical Assessment. Coupled Gap-Wake Instabilities/Turbulence: A Source of Noise

    NASA Technical Reports Server (NTRS)

    Rockwell, Donald

    1999-01-01

    This program has involved, first of all, a critical state-of-the-art assessment of vortex-body interactions. Efforts were then focused on an experimental investigation of coupled gap-wake instabilities and turbulence occurring in a two-cylinder system. An extensive review was undertaken on the effect of incident vortices on various types of bodies. These incident vortices have a length scale of the same order of magnitude as the scale of the body. The body can take on various forms, including, for example, a circular cylinder, a blade or a wing. The classes of vortex-body interaction that were critically assessed include: (1) periodic distortion of the incident (primary) vortex and shedding of secondary vorticity from the surface of the body; (2) modulated vortex distortion and shedding at a leading edge or surface due to the incidence of a complex system of vortices; (3) vortex distortion and shedding in the presence of body oscillation; and (4) three-dimensional vortex interaction and shedding. For all of these classes of vortex-body interaction, quantitative topologies of the vorticity distributions and streamline patterns were found to be central to a unified description of the mechanisms of vortex distortion and shedding. In most cases, it was possible to define relationships between vortex interactions and unsteady loading at the body surface. The second phase of the program was an experimental investigation of a two-cylinder system, which simulated a central aspect of a four-wheel bogie on a large-scale commercial aircraft. The overall aim of this experimental research program was to determine the crucial elements of the unsteadiness in the gap and near-wake regions as a function of time using cinema-based techniques. During the research program, various image evaluation techniques were employed, involving assessment of instantaneous velocity fields, streamline topology and patterns of vorticity.
    Experiments were performed in a large-scale water channel using a high-resolution version of digital particle image velocimetry. The program focused on the acquisition of images of velocity and vorticity for varying gap widths between the two cylinders. As a result of the analysis of a relatively large number of images, it is demonstrated that low-frequency instabilities can occur in the gap region between the cylinders. These low-frequency instabilities are hypothesized to influence the near-wake structure of the entire two-cylinder system. The nature of the unstable shear layers in the gap region involves the generation of small-scale Kelvin-Helmholtz instabilities. These unsteady shear layers then impinge upon the upper and lower surfaces of the cylinders, thereby influencing both the unsteady structure and the time-averaged patterns of the near-wake. Initial efforts focused on characterization of the patterns of instantaneous and averaged streamlines using topological concepts. The end result of this investigation is a series of documented instantaneous images. They will serve as a basis for various types of post-processing, which will lead to a fuller understanding of the instantaneous and time-averaged unstable turbulent fields in the gap region and downstream of the two-cylinder system. This further assessment is the focus of a subsequent program.

  19. Modelling exhaust plume mixing in the near field of an aircraft

    NASA Astrophysics Data System (ADS)

    Garnier, F.; Brunet, S.; Jacquin, L.

    1997-11-01

    A simplified approach has been applied to analyse the mixing and entrainment processes of the engine exhaust through its interaction with the vortex wake of an aircraft. Our investigation is focused on the near field, extending from the exit nozzle to about 30 s after the wake is generated, in the vortex phase. This study was performed using an integral model and a numerical simulation for two large civil aircraft: a two-engine Airbus 330 and a four-engine Boeing 747. The influence of the wing-tip vortices on the dilution ratio (defined as a tracer concentration) is shown. The mixing process is also affected by the buoyancy effect, but only after the jet regime, once trapping in the vortex core has occurred. In the early wake, the engine jet location (i.e. inboard or outboard engine jet) has an important influence on the mixing rate. The plume streamlines inside the vortices are subject to distortion and stretching, and the role of the descent of the vortices on the maximum tracer concentration is discussed. Qualitative comparison with a contrail photograph shows similar features. Finally, tracer concentrations along the inboard engine centreline of the B-747 are compared with other theoretical analyses and measured data.

  20. Selective Detection of Target Volatile Organic Compounds in Contaminated Humid Air Using a Sensor Array with Principal Component Analysis

    PubMed Central

    Itoh, Toshio; Akamatsu, Takafumi; Tsuruta, Akihiro; Shin, Woosuck

    2017-01-01

    We investigated selective detection of the target volatile organic compounds (VOCs) nonanal, n-decane, and acetoin for lung cancer-related VOCs, and acetone and methyl i-butyl ketone for diabetes-related VOCs, in humid air with simulated VOC contamination (total concentration: 300 μg/m3). We used six “grain boundary-response type” sensors, including four commercially available sensors (TGS 2600, 2610, 2610, and 2620) and two Pt, Pd, and Au-loaded SnO2 sensors (Pt, Pd, Au/SnO2), and two “bulk-response type” sensors, including Zr-doped CeO2 (CeZr10), i.e., eight sensors in total. We then analyzed their sensor signals using principal component analysis (PCA). Although the six “grain boundary-response type” sensors were found to be insufficient for selective detection of the target gases in humid air, the addition of two “bulk-response type” sensors improved the selectivity, even with simulated VOC contamination. To further improve the discrimination, we selected appropriate sensors from the eight sensors based on the PCA results. The selectivity to each target gas was maintained and was not affected by contamination. PMID:28753948
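    The sensor-selection step above ranks sensors by how much they contribute to the discriminating principal components. A hedged sketch on a synthetic response matrix; the values stand in for the measured signals of the eight sensors:

```python
import numpy as np

rng = np.random.default_rng(42)
responses = rng.normal(size=(30, 8))      # 30 gas exposures x 8 sensors

Xc = responses - responses.mean(axis=0)   # center each sensor channel
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                    # exposures projected onto first two PCs
loadings = Vt[:2]                         # sensor weights on each PC

# Rank sensors by their squared loading on the two discriminating components,
# mirroring the selection of an appropriate sensor subset from the PCA results.
contribution = np.sum(loadings**2, axis=0)
best = np.argsort(contribution)[::-1][:4]
print(best)                               # indices of the four strongest sensors
```

    With real data, well-separated score clusters per target gas indicate that the chosen subset preserves selectivity despite the VOC contamination.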
