Science.gov

Sample records for centre simulation model

  1. Cell-centred model for the simulation of curved cellular monolayers

    NASA Astrophysics Data System (ADS)

    Mosaffa, Payman; Asadipour, Nina; Millán, Daniel; Rodríguez-Ferran, Antonio; J Muñoz, Jose

    2015-12-01

    This paper presents a cell-centred model for the simulation of planar and curved multicellular soft tissues. We propose a computational model that includes stress relaxation due to cell reorganisation (intercellular connectivity changes) and cytoskeleton remodelling (intracellular changes). Cells are represented by their cell centres, and their mechanical interaction is modelled through active non-linear elastic laws with a dynamically changing resting length. Special attention is paid to the handling of connectivity changes between cells, and the relaxation that the tissues exhibit under these topological changes. Cell-cell connectivity is computed by resorting to a Delaunay triangulation, which is combined with a mapping technique in order to obtain triangulations on curved manifolds. Our numerical results show that even a linear elastic cell-cell interaction model may induce a global non-linear response due to the reorganisation of the cell connectivity. This plastic-like behaviour is combined with a non-linear rheological law where the resting length depends on the elastic strain, mimicking the global visco-elastic response of tissues. The model is applied to simulate the elongation of planar and curved monolayers.
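The visco-elastic mechanism described above (a linear elastic cell-cell law whose resting length remodels towards the current length) can be sketched for a single connection; the stiffness, remodelling rate and lengths below are illustrative values, not parameters from the paper:

```python
# Single cell-cell connection: linear elastic force with a remodelling
# resting length (illustrative constants, not the paper's parameters).
def step(r, L0, k=1.0, gamma=0.5, dt=0.01):
    """Return the elastic force and the updated resting length."""
    force = k * (r - L0)                 # linear elastic cell-cell force
    L0 = L0 + dt * gamma * (r - L0)      # resting length creeps toward current length
    return force, L0

# Hold the connection at a fixed extension: the force relaxes as L0
# approaches r, mimicking the visco-elastic response described above.
r, L0 = 1.2, 1.0
forces = []
for _ in range(2000):
    f, L0 = step(r, L0)
    forces.append(f)
assert forces[-1] < forces[0]            # stress relaxation under held strain
```

In the full model this relaxation acts alongside the plastic-like response from Delaunay connectivity changes; here only the rheological law is shown.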

  2. Evaluation of Satellite-based Global Hydrologic Simulation using the Distributed CREST Model and Global Runoff Data Centre Archives

    NASA Astrophysics Data System (ADS)

    Xue, X.; Hong, Y.; Gourley, J. J.; Wang, X.

    2011-12-01

Flooding is one of the most deadly natural hazards around the world. Distributed hydrologic models can provide the spatial and temporal distribution of precipitation, soil moisture, evapotranspiration and runoff. Implementation of a flood prediction and/or forecast system using a distributed hydrologic model can potentially help mitigate flood-induced hazards. In this study, we propose the use of the Coupled Routing and Excess STorage (CREST) distributed hydrological model driven by real-time rainfall forcing from TRMM-based multi-satellite products and/or precipitation forecast data from the Global Forecast System model (GFS), combined with automatic parameter optimization methods, to estimate hydrological fluxes, storages and inundated areas. Evaluations show that: 1) real-time streamflow prediction and/or forecasting at drainage outlets, and identification of inundated areas upstream, is an achievable goal even for ungauged basins; 2) a priori, physically based parameter estimates with CREST reduce the dependence on rainfall-runoff data often required to calibrate distributed hydrologic models; and 3) CREST simulations of basin discharge are skillful in several basins throughout the world.

  3. Development of a simulation and skills centre in East Africa: a Rwandan-Canadian partnership.

    PubMed

    Livingston, Patricia; Bailey, Jonathan; Ntakiyiruta, Georges; Mukwesi, Christian; Whynot, Sara; Brindley, Peter

    2014-01-01

Simulation replicates clinical experiences without patient risk; it remains uncommon in lower-income countries. We outline the creation of Rwanda's first centre for simulation and skills training. We secured funding for renovations, equipment and staff; curricula were developed, tested and refined; and local clinicians were trained to teach. In 13 months the centre provided 2,377 learning encounters and 822 hours of training to Rwandan health care professionals. Our strategy represents an adaptable model for simulation and skills centre development in low-resource settings. PMID:25328611

  4. Skills development at a paramedic accident simulation centre.

    PubMed

    Donaghy, John

    2016-02-01

Practice simulation in acute and pre-hospital care settings is a growing area of interest for clinicians and health educationalists, and there is much evidence to support its use (Pike and O'Donnell 2010). Most simulation is delivered through computer-aided software or in virtual environments; however, last year the University of Hertfordshire opened an accident simulation centre, an outdoor facility that offers pre- and post-registration paramedics the opportunity to experience a range of scenarios in a 'real life' but secure environment. This article describes how the centre enables students to apply theory to practice in complex situations, such as managing patients injured in road traffic collisions. PMID:26853672

  5. Galactic Centre hypershell model for the North Polar Spurs

    NASA Astrophysics Data System (ADS)

    Sofue, Y.; Habe, A.; Kataoka, J.; Totani, T.; Inoue, Y.; Nakashima, S.; Matsui, H.; Akita, M.

    2016-06-01

    The bipolar-hypershell (BHS) model for the North Polar Spurs (NPS-E, -W, and Loop I) and counter southern spurs (SPS-E and -W) is revisited based on numerical hydrodynamical simulations. Propagations of shock waves produced by energetic explosive events in the Galactic Centre are examined. Distributions of soft X-ray brightness on the sky at 0.25, 0.7, and 1.5 keV in the ±50° × ±50° region around the Galactic Centre are modelled by thermal emission from high-temperature plasma in the shock-compressed shell considering shadowing by the interstellar H I and H2 gases. The result is compared with the ROSAT wide field X-ray images in R2, 4, and 6 bands. The NPS and southern spurs are well reproduced by the simulation as shadowed dumbbell-shaped shock waves. We discuss the origin and energetics of the event in relation to the starburst and/or active galactic nucleus activities in the Galactic Centre.

  6. Complex Modelling Scheme Of An Additive Manufacturing Centre

    NASA Astrophysics Data System (ADS)

    Popescu, Liliana Georgeta

    2015-09-01

This paper presents a modelling scheme sustaining the development of a model of an additive manufacturing research centre and its processes. The modelling is performed using IDEF0, and the resulting process model represents the basic processes required to develop such a centre in any university. While the activities presented in this study are those generally recommended, changes may be needed to suit the specific circumstances of an existing research centre.

  7. A SDMS Model: Early Warning Coordination Centres

    NASA Astrophysics Data System (ADS)

    Santos-Reyes, Jaime

    2010-05-01

of work. If it's a tsunami, you've got to get it down to the last Joe on the beach. This is the stuff that is really very hard." Given the above, the paper argues that there is a need for a systemic approach to early warning centres. Systemic means looking upon things as a system; it means seeing pattern and inter-relationship within a complex whole, i.e., seeing events as products of the working of a system. A system may be defined as a whole which is made of parts and relationships. Given this, 'failure' may be seen as the product of a system and, within that, death, injury, property loss and so on may be seen as results of the working of systems. This paper proposes a preliminary model of 'early warning coordination centres' (EWCC); it should be highlighted that an EWCC is a subsystem of the Systemic Disaster Management System (SDMS) model.

  8. The Italian institutional accreditation model for Haemophilia Centres

    PubMed Central

    Calizzani, Gabriele; Candura, Fabio; Menichini, Ivana; Arcieri, Romano; Castaman, Giancarlo; Lamanna, Alessandro; Tamburrini, Maria R.; Fortino, Antonio; Lanzoni, Monica; Profili, Samantha; Pupella, Simonetta; Liumbruno, Giancarlo M.; Grazzini, Giuliano

    2014-01-01

    Background In Italy, basic health needs of patients with inherited bleeding disorders are met by a network of 50 haemophilia centres belonging to the Italian Association of Haemophilia Centres. Further emerging needs, due to the increased life expectancy of this patient group, require a multi-professional clinical management of the disease and provide a challenge to the organisation of centres. In order to achieve harmonised quality standards of haemophilia care across Italian Regions, an institutional accreditation model for haemophilia centres has been developed. Material and methods To develop an accreditation scheme for haemophilia centres, a panel of experts representing medical and patient bodies, the Ministry of Health and Regional Health Authorities has been appointed by the National Blood Centre. Following a public consultation, a technical proposal in the form of recommendations for Regional Health Authorities has been formally submitted to the Ministry of Health and has formed the basis for a proposal of Agreement between the Government and the Regions. Results The institutional accreditation model for Haemophilia Centres was approved as an Agreement between the Government and the Regions in March 2013. It identified 23 organisational requirements for haemophilia centres covering different areas and activities. Discussion The Italian institutional accreditation model aims to achieve harmonised quality standards across Regions and to implement continuous improvement efforts, certified by regional inspection systems. The identified requirements are considered as necessary and appropriate in order to provide haemophilia services as “basic healthcare levels” under the umbrella of the National Health Service. This model provides Regions with a flexible institutional accreditation scheme that can be potentially extended to other rare diseases. PMID:24922290

  9. How does our choice of observable influence our estimation of the centre of a galaxy cluster? Insights from cosmological simulations

    NASA Astrophysics Data System (ADS)

    Cui, Weiguang; Power, Chris; Biffi, Veronica; Borgani, Stefano; Murante, Giuseppe; Fabjan, Dunja; Knebe, Alexander; Lewis, Geraint F.; Poole, Greg B.

    2016-03-01

Galaxy clusters are an established and powerful test-bed for theories of both galaxy evolution and cosmology. Accurate interpretation of cluster observations often requires robust identification of the location of the centre. Using a statistical sample of clusters drawn from a suite of cosmological simulations in which we have explored a range of galaxy formation models, we investigate how the location of this centre is affected by the choice of observable - stars, hot gas, or the full mass distribution as can be probed by the gravitational potential. We explore several measures of the cluster centre: the minimum of the gravitational potential, which we would expect to define the centre if the cluster is in dynamical equilibrium; the peak of the density; the centre of the brightest cluster galaxy (BCG); and the peak and centroid of the X-ray luminosity. We find that the BCG centre correlates more strongly with the minimum of the gravitational potential than the X-ray defined centres do, while active galactic nuclei feedback acts to significantly enhance the offset between the peak X-ray luminosity and the minimum of the gravitational potential. These results highlight the importance of centre identification when interpreting cluster observations, in particular when comparing theoretical predictions with observational data.
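The competing centre definitions can be illustrated on a toy particle distribution; the Gaussian "cluster" and analytic potential below are assumptions for demonstration only, not the simulation data used in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy "cluster": particle positions with a Gaussian core, plus an analytic
# potential that is deepest at the origin (illustrative assumptions).
pos = rng.normal(0.0, 1.0, size=(5000, 3))
phi = -np.exp(-np.sum(pos**2, axis=1))       # toy gravitational potential

centre_potential = pos[np.argmin(phi)]       # minimum of the potential
centre_centroid = pos.mean(axis=0)           # centroid of the full distribution
offset = np.linalg.norm(centre_potential - centre_centroid)
# For a relaxed, symmetric system the two centres nearly coincide; feedback
# and substructure in real clusters are what drive such offsets apart.
```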

  10. Modelling of dynamic targeting in the Air Operations Centre

    NASA Astrophysics Data System (ADS)

    Lo, Edward H. S.; Au, T. Andrew

    2007-12-01

Air Operations Centres (AOCs) are high-stress, multitask environments for the planning and execution of theatre-wide airpower. Operators have multiple responsibilities to ensure that the orchestration of air assets is coordinated to maximum effect. AOCs utilise a dynamic targeting process to immediately prosecute time-sensitive targets. For this process to work effectively, a timely decision must be made regarding the appropriate course of action before the action is enabled. A targeting solution is typically developed using a number of inter-related processes in the kill chain - the Find, Fix, Track, Target, Engage, and Assess (F2T2EA) model. The success of making a correct decision about dynamic targeting is ultimately limited by the cognitive and cooperative skills of the team prosecuting the mission and their associated workload. This paper presents a model of human interaction and tasks within the dynamic targeting sequence. The complex network of tasks executed by the team can be analysed by simulating the model to identify possible information-processing bottlenecks and overloads. The model was subjected to various tests to generate typical outcomes, operator utilisation, durations and rates of output in the dynamic targeting process. This capability will allow future "what-if" evaluations of numerous concepts for team formation or task reallocation, complementing live exercises and experiments.

  11. Examining molecular clouds in the Galactic Centre region using X-ray reflection spectra simulations.

    NASA Astrophysics Data System (ADS)

    Walls, M.; Chernyakova, M.; Terrier, R.; Goldwurm, A.

    2016-09-01

In the centre of our galaxy lies a super-massive black hole, identified with the radio source Sagittarius A⋆. This black hole has an estimated mass of around 4 million solar masses. Although Sagittarius A⋆ is quite dim in terms of total radiated energy, with a luminosity a factor of 10^10 lower than its Eddington luminosity, there is now compelling evidence that this source was far brighter in the past. This evidence derives from the detection of reflected X-ray emission from the giant molecular clouds in the Galactic Centre region. However, the reflected emission spectra cannot be interpreted correctly without detailed modelling of the reflection process; attempts to do so without such modelling can lead to an incorrect interpretation of the data. In this paper we present the results of a Monte Carlo simulation code we developed in order to fully model the complex processes that shape the emerging reflection spectra. The simulated spectra can be compared with real data in order to derive model parameters and constrain the past activity of the black hole. In particular, we apply our code to observations of Sgr B2 in order to constrain the position and density of the cloud and the incident luminosity of the central source. The results of the code have been adapted for use in Xspec by a large community of astronomers.
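The core of such a code is photon transport through the cloud. A minimal plane-slab sketch (single-energy photons, isotropic scattering, with an illustrative optical depth and scattering albedo, far simpler than the fluorescence and Compton physics the paper models) might look like:

```python
import math
import random

# Photons enter a plane slab of total optical depth tau_max along the normal;
# at each interaction they survive with probability `albedo` and scatter
# isotropically. We count the fraction escaping back out of the lit face.
def reflected_fraction(n_photons=20000, tau_max=1.0, albedo=0.9, seed=1):
    rng = random.Random(seed)
    reflected = 0
    for _ in range(n_photons):
        tau, mu = 0.0, 1.0                        # optical depth; direction cosine
        while True:
            tau += mu * -math.log(rng.random())   # sample path to next interaction
            if tau <= 0.0:
                reflected += 1                    # escaped back towards the observer
                break
            if tau >= tau_max:
                break                             # transmitted through the cloud
            if rng.random() > albedo:
                break                             # photo-absorbed
            mu = rng.uniform(-1.0, 1.0) or 1e-9   # isotropic scatter (avoid mu == 0)
    return reflected / n_photons

# A more reflective cloud sends more photons back to the observer.
assert reflected_fraction(albedo=0.9) > reflected_fraction(albedo=0.3)
```

A real reflection code additionally tracks photon energy, fluorescent line emission and Compton down-scattering, which is what produces the diagnostic spectral shapes compared with data.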

  12. A model of patient-centred care - turning good care into patient-centred care.

    PubMed

    Scambler, S; Asimakopoulou, K

    2014-09-01

This paper builds on previous work reviewing patient-centred care in dentistry and acknowledges work that has questioned the measurement and effectiveness of patient-centredness in practice. In an attempt to move the debate from rhetoric to practice and enhance the practical utility of the concept, we present a practical hierarchy of patient-centredness that may aid the practical application of patient-centred care in clinical practice by making explicit a series of stages that a dental care professional needs to move through in order to provide care that is patient-centred. The model presented is illustrated through practical examples. The various stages inherent in it are described with the aim of making clear the perhaps automatic and taken-for-granted assumptions that are often made by dental care professionals and patients during the course of a consultation. Our aim is to encourage dental consultations to have more open, unambiguous communication, both about the risks and benefits of courses of action and about the choices available to patients. PMID:25213518

  13. Advances in stream shade modelling. Accounting for canopy overhang and off-centre view

    NASA Astrophysics Data System (ADS)

    Davies-Colley, R.; Meleason, M. A.; Rutherford, K.

    2005-05-01

Riparian shade controls the stream thermal regime and light for photosynthesis of stream plants. The quantity difn (diffuse non-interceptance), defined as the proportion of incident light received under a sky of uniform brightness, is useful for general specification of stream light exposure, having the virtue that it can be measured directly with common light sensors of appropriate spatial and spectral character. A simple model (implemented in EXCEL-VBA) (Davies-Colley & Rutherford Ecol. Engrg in press) successfully reproduces the broad empirical trend of decreasing difn at the channel centre with increasing ratio of canopy height to stream width. We have now refined this model to account for (a) foliage overhanging the channel (for trees of different canopy form), and (b) off-centre views of the shade (rather than just the view from the channel centre). We use two extreme geometries bounding real (meandering) streams: the 'canyon' model simulates an infinite straight canal, whereas the 'cylinder' model simulates a stream meandering so tightly that its geometry collapses into an isolated pool in the forest. The model has been validated using a physical 'rooftop' model of the cylinder case, with which it is possible to measure shade with different geometries.
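For a horizontal sensor at the channel centre, difn can be approximated by the sky view factor, and under that assumption the two bounding geometries admit closed forms (Oke's street-canyon formula and a cosine-weighted spherical cap). These are standard view-factor results used here for illustration, not the paper's EXCEL-VBA implementation:

```python
import math

def difn_canyon(canopy_height, stream_width):
    """Infinite straight 'canyon': Oke's street-canyon sky view factor,
    cos(beta) with beta = atan(H / (W/2)), at the channel centre."""
    beta = math.atan(canopy_height / (0.5 * stream_width))
    return math.cos(beta)

def difn_cylinder(canopy_height, stream_width):
    """Isolated circular gap: cosine-weighted spherical cap of half-angle
    theta = atan((W/2) / H) seen from the centre of the pool."""
    theta = math.atan(0.5 * stream_width / canopy_height)
    return math.sin(theta) ** 2

# difn falls as the canopy-height / stream-width ratio grows, matching the
# empirical trend, and the tightly meandering 'cylinder' is always shadier.
assert difn_canyon(0.5, 1.0) > difn_canyon(1.0, 1.0)
assert difn_cylinder(1.0, 1.0) < difn_canyon(1.0, 1.0)
```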

  14. The ESA Virtual Space Weather Modelling Centre - Phase 1

    NASA Astrophysics Data System (ADS)

    Poedts, Stefaan

The ESA ITT project (AO/1-6738/11/NL/AT) to develop Phase 1 of a Virtual Space Weather Modelling Centre has the following objectives and scope: 1. The construction of a long-term (~10 yrs) plan for the future development of a European virtual space weather modelling centre, consisting of a new 'open' and distributed framework for the coupling of physics-based models for space weather phenomena; 2. The assessment of model capabilities and of the amount of work required to make them operational by integrating them into this framework, and the identification of the computing and networking requirements to do so; 3. The design of a system enabling models and other components to be installed locally or geographically distributed, and the creation of a validation plan, including a system of metrics for testing results. The consortium that took up this challenge involves: 1) the Katholieke Universiteit Leuven (Prime Contractor, coordinator: Prof. S. Poedts); 2) the Belgian Institute for Space Aeronomy (BIRA-IASB); 3) the Royal Observatory of Belgium (ROB); 4) the Von Karman Institute (VKI); 5) DH Consultancy (DHC); 6) Space Applications Services (SAS). The project started on 14 May 2012 and will finish in May 2014. Thus, by the time of the meeting, both Phase 1A and Phase 1B (the development of the prototype) will be finished. The final report will be presented, including the architecture decisions made, the framework, the models already integrated and the model couplers installed. The prototype VSWMC will be demonstrated.

  15. Competency-based simulation assessment of resuscitation skills in emergency medicine postgraduate trainees – a Canadian multi-centred study

    PubMed Central

    Dagnone, J. Damon; Hall, Andrew K.; Sebok-Syer, Stefanie; Klinger, Don; Woolfrey, Karen; Davison, Colleen; Ross, John; McNeil, Gordon; Moore, Sean

    2016-01-01

Background The use of high-fidelity simulation is emerging as a desirable method for competency-based assessment in postgraduate medical education. We aimed to demonstrate the feasibility and validity of a multi-centre simulation-based Objective Structured Clinical Examination (OSCE) of resuscitation competence with Canadian Emergency Medicine (EM) trainees. Method EM postgraduate trainees (n=98) from five Canadian academic centres participated in a high-fidelity, 3-station simulation-based OSCE. Expert panels of three emergency physicians evaluated trainee performances at each centre using the Queen’s Simulation Assessment Tool (QSAT). Intraclass correlation coefficients were used to measure inter-rater reliability, and analysis of variance was used to measure the discriminatory validity of each scenario. A fully crossed generalizability study was also conducted for each examination centre. Results Inter-rater reliability in four of the five centres was strong, with a median absolute intraclass correlation coefficient (ICC) across centres and scenarios of 0.89 [0.65–0.97]. Discriminatory validity was also strong (p < 0.001 for scenarios 1 and 3; p < 0.05 for scenario 2). Generalizability studies found significant variations at two of the study centres. Conclusions This study demonstrates the successful pilot administration of a multi-centre, 3-station simulation-based OSCE for the assessment of resuscitation competence in post-graduate Emergency Medicine trainees. PMID:27103954

  16. A model to forecast data centre infrastructure costs.

    NASA Astrophysics Data System (ADS)

    Vernet, R.

    2015-12-01

    The computing needs in the HEP community are increasing steadily, but the current funding situation in many countries is tight. As a consequence experiments, data centres, and funding agencies have to rationalize resource usage and expenditures. CC-IN2P3 (Lyon, France) provides computing resources to many experiments including LHC, and is a major partner for astroparticle projects like LSST, CTA or Euclid. The financial cost to accommodate all these experiments is substantial and has to be planned well in advance for funding and strategic reasons. In that perspective, leveraging infrastructure expenses, electric power cost and hardware performance observed in our site over the last years, we have built a model that integrates these data and provides estimates of the investments that would be required to cater to the experiments for the mid-term future. We present how our model is built and the expenditure forecast it produces, taking into account the experiment roadmaps. We also examine the resource growth predicted by our model over the next years assuming a flat-budget scenario.
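A flat-budget scenario of this kind can be sketched with a toy capacity projection; the price/performance improvement rate, hardware lifetime and normalised budget below are hypothetical figures for illustration, not CC-IN2P3's actual model inputs:

```python
# Toy flat-budget projection: price/performance improves by `gain` per year,
# hardware is retired after `lifetime` years, and the same budget is spent
# annually (all figures hypothetical).
def flat_budget_capacity(years=10, budget=1.0, unit_cost0=1.0, gain=0.20, lifetime=4):
    purchases, capacity = [], []
    for y in range(years):
        unit_cost = unit_cost0 * (1.0 - gain) ** y   # cheaper capacity each year
        purchases.append(budget / unit_cost)
        capacity.append(sum(purchases[-lifetime:]))  # only recent kit is in service
    return capacity

cap = flat_budget_capacity()
assert cap[-1] > cap[0]   # installed capacity still grows under a flat budget
```

The interesting question such a model answers is whether that growth keeps pace with the experiment roadmaps, which is what drives the expenditure forecast.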

  17. The European ALMA Regional Centre: a model of user support

    NASA Astrophysics Data System (ADS)

    Andreani, P.; Stoehr, F.; Zwaan, M.; Hatziminaoglou, E.; Biggs, A.; Diaz-Trigo, M.; Humphreys, E.; Petry, D.; Randall, S.; Stanke, T.; van Kampen, E.; Bárta, M.; Brand, J.; Gueth, F.; Hogerheijde, M.; Bertoldi, F.; Muxlow, T.; Richards, A.; Vlemmings, W.

    2014-08-01

The ALMA Regional Centres (ARCs) form the interface between the ALMA observatory and the user community from the proposal preparation stage to the delivery of data and their subsequent analysis. The ARCs provide critical services to both the ALMA operations in Chile and to the user community. These services were split by the ALMA project into core and additional services. The core services are financed by the ALMA operations budget and are critical to the successful operation of ALMA. They are contractual obligations and must be delivered to the ALMA project. The additional services are not funded by the ALMA project and are not contractual obligations, but are critical to achieving ALMA's full scientific potential. A distributed network of ARC nodes (with ESO being the central ARC) has been set up throughout Europe at the following seven locations: Bologna, Bonn-Cologne, Grenoble, Leiden, Manchester, Ondrejov, Onsala. These ARC nodes are working together with the central node at ESO and provide both core and additional services to the ALMA user community. This paper presents the European ARC, and how it operates in Europe to support the ALMA community. This model, although complex in nature, is turning into a very successful one, providing a service to the scientific community that has so far been highly appreciated. The ARC could become a reference support model in an age where very large collaborations are required to build large facilities, and support is needed for geographically and culturally diverse communities.

  18. Aviation Safety Simulation Model

    NASA Technical Reports Server (NTRS)

    Houser, Scott; Yackovetsky, Robert (Technical Monitor)

    2001-01-01

    The Aviation Safety Simulation Model is a software tool that enables users to configure a terrain, a flight path, and an aircraft and simulate the aircraft's flight along the path. The simulation monitors the aircraft's proximity to terrain obstructions, and reports when the aircraft violates accepted minimum distances from an obstruction. This model design facilitates future enhancements to address other flight safety issues, particularly air and runway traffic scenarios. This report shows the user how to build a simulation scenario and run it. It also explains the model's output.

  19. Modeling and simulation

    SciTech Connect

    Hanham, R.; Vogt, W.G.; Mickle, M.H.

    1986-01-01

    This book presents the papers given at a conference on computerized simulation. Topics considered at the conference included expert systems, modeling in electric power systems, power systems operating strategies, energy analysis, a linear programming approach to optimum load shedding in transmission systems, econometrics, simulation in natural gas engineering, solar energy studies, artificial intelligence, vision systems, hydrology, multiprocessors, and flow models.

  20. The Yale Cost Model and cost centres: servant or master?

    PubMed

    Rigby, E

    1993-01-01

    Cost accounting describes that aspect of accounting which collects, allocates and controls the cost of producing a service. Costing information is primarily reported to management to enable control of costs and to ensure the financial viability of units, departments and divisions. As costing studies continue to produce estimates of Diagnosis Related Group (DRG) costs in New South Wales hospitals, as well as in other states, costs for different hospitals are being externally compared, using a tool which is usually related to internal management and reporting. Comparability of costs is assumed even though accounting systems differ. This paper examines the cost centre structures at five major teaching hospitals in Sydney. It describes the similarities and differences in how the cost centres were constituted, and then details the line items of expenditure that are charged to each cost centre. The results of a comparative study of a medical specialty are included as evidence of different costing methodologies in the hospitals. The picture that emerged from the study is that the hospitals are constituting their cost centres to meet their internal management needs, that is, to know the cost of running a ward or nursing unit, a medical specialty, department and so on. The rationale for the particular cost centre construction was that cost centre managers could manage and control costs and assign responsibility. There are variations in procedures for assigning costs to cost centres, and the question is asked 'Do these variations in procedures make a material difference to our ability to compare costs per Diagnosis Related Group at the various hospitals?' It is contended that the accounting information, which is produced as a result of different practices, is primarily for internal management, not external comparison. It would be better for hospitals to compare their estimated costs per Diagnosis Related Group to an internal standard cost rather than the costs from other

  1. Estimation of the vehicle's centre of gravity based on a braking model

    NASA Astrophysics Data System (ADS)

    Yue, Hongwei; Zhang, Libin; Shan, Hongying; Liu, Huanfeng; Liu, Yicai

    2015-10-01

A vehicle's centre of gravity (CG) is an important property that affects the vehicle's handling stability, ride comfort and safety. For example, a high CG may lead to a serious traffic accident due to the adverse effects it may have on roll and handling stability. In this paper, we develop a dynamic detection method to obtain the height of a vehicle's CG, using a simulation model based on a dynamic analysis of braking. Simulations show that the dynamic detection method is feasible. Experiments with three different vehicles were performed to verify the proposed method. The previously established prediction detection and lifting detection (LD) methods are used for comparison. The experimental results demonstrate that the proposed method has higher accuracy and efficiency than the LD method. Thus, the proposed method is useful for vehicle CG detection.
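The underlying idea can be sketched with the standard rigid-body load-transfer relation for steady braking, dF_front = m * a * h / L, solved for the CG height h; this is a textbook simplification for illustration, not the paper's full dynamic simulation model, and all numbers are hypothetical:

```python
G = 9.81  # m/s^2

def cg_height(mass, wheelbase, decel, front_static, front_braking):
    """CG height from front-axle load transfer: h = dF * L / (m * a).
    Loads in newtons, wheelbase in metres, deceleration in m/s^2."""
    return (front_braking - front_static) * wheelbase / (mass * decel)

# Round-trip check on a synthetic vehicle (all numbers hypothetical):
m, L, h, a = 1500.0, 2.7, 0.6, 6.0
front_static = m * G * 0.55                 # assumed static front-axle share
front_braking = front_static + m * a * h / L
assert abs(cg_height(m, L, a, front_static, front_braking) - h) < 1e-9
```

In practice the axle loads and deceleration would come from measurements during a braking test, with suspension pitch and tyre effects handled by the fuller dynamic model.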

  2. Integrated Assessment of Hadley Centre (HadCM2) Climate Change Projections on Agricultural Productivity and Irrigation Water Supply in the Conterminous United States.I. Climate change scenarios and impacts on irrigation water supply simulated with the HUMUS model.

    SciTech Connect

    Rosenberg, Norman J.; Brown, Robert A.; Izaurralde, R Cesar C.; Thomson, Allison M.

    2003-06-30

This paper describes the methodology and results of a study by researchers at PNNL contributing to the water sector study of the U.S. National Assessment of Climate Change. The vulnerability of water resources in the conterminous U.S. to climate change in 10-y periods centered on 2030 and 2095--as projected by the HadCM2 general circulation model--was modeled with HUMUS (Hydrologic Unit Model of the U.S.). HUMUS consists of a GIS that provides data on soils, land use and climate to drive the hydrology model Soil Water Assessment Tool (SWAT). The modeling was done at the scale of the 2101 8-digit USGS hydrologic unit areas (HUA). Results are aggregated to the 4-digit and 2-digit (Major Water Resource Region, MWRR) scales for various purposes. Daily records of temperature and precipitation for 1961-1990 provided the baseline climate. Water yield (WY)--the sum of surface and subsurface runoff--increases from the baseline period over most of the U.S. in 2030 and 2095. In 2030, WY increases in the western U.S. and decreases in the central and southeast regions. Notably, WY increases by 139 mm from baseline in the Pacific NW. Decreased WY is projected for the Lower Mississippi and Texas Gulf basins, driven by higher temperatures and reduced precipitation. The HadCM2 2095 scenario projects a climate significantly wetter than baseline, resulting in WY increases of 38%. WY increases are projected throughout the eastern U.S., and WY also increases in the western U.S. Climate change also affects the seasonality of the hydrologic cycle. Early snowmelt is induced in western basins, leading to dramatically increased WYs in late winter and early spring. The simulations were run at current (365 ppm) and elevated (560 ppm) atmospheric CO2 concentrations to account for the potential impacts of the CO2-fertilization effect. The effects of the climate change scenarios were considerably greater than those of elevated CO2, but the latter, overall, decreased losses and augmented increases in water yield.

  3. Computer Modeling and Simulation

    SciTech Connect

    Pronskikh, V. S.

    2014-05-09

Verification and validation of the computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimentation with them) needs to be made. Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  4. Theory Modeling and Simulation

    SciTech Connect

    Shlachter, Jack

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  5. Germinal centres seen through the mathematical eye: B-cell models on the catwalk.

    PubMed

    Meyer-Hermann, Michael; Figge, Marc Thilo; Toellner, Kai-Michael

    2009-04-01

    Germinal centres are receiving renewed attention following recent intravital multi-photon imaging studies. These data have shed new light on longstanding questions about the spatial organisation of germinal centres, B-cell migration, selection and differentiation. Mathematical models have proven invaluable in the analysis of intravital motility data, and have predicted novel B-cell selection mechanisms that are now supported by experimental findings. We argue that mathematical modelling adds a different vantage point to experimental data and provides a quantitative and systematic analysis of hypotheses and theories in immunology. Furthermore, the well-characterised nature of the germinal centre provides an excellent proving ground for mathematical modelling. PMID:19282244

  6. AGRICULTURAL SIMULATION MODEL (AGSIM)

    EPA Science Inventory

    AGSIM is a large-scale econometric simulation model of regional crop and national livestock production in the United States. The model was initially developed to analyze the aggregate economic impacts of a wide variety of issues facing agriculture, such as technological change, pest...

  7. The Rossby Centre Regional Atmospheric Climate Model part II: application to the Arctic climate.

    PubMed

    Jones, Colin G; Wyser, Klaus; Ullerstig, Anders; Willén, Ulrika

    2004-06-01

    The Rossby Centre regional climate model (RCA2) has been integrated over the Arctic Ocean as part of the international ARCMIP project. Results have been compared to observations derived from the SHEBA data set. The standard RCA2 model overpredicts cloud cover and downwelling longwave radiation during the Arctic winter. This error was reduced by introducing a new cloud parameterization, which significantly improves the annual cycle of cloud cover. Compensating biases between clear sky downwelling longwave radiation and longwave radiation emitted from cloud base were identified. Modifications have been introduced to the model radiation scheme that more accurately treat solar radiation interaction with ice crystals. This leads to a more realistic representation of cloud-solar radiation interaction. The clear sky portion of the model radiation code transmits too much solar radiation through the atmosphere, producing a positive bias at the top of the frequent boundary layer clouds. A realistic treatment of the temporally evolving albedo, of both sea-ice and snow, appears crucial for an accurate simulation of the net surface energy budget. Likewise, inclusion of a prognostic snow-surface temperature seems necessary, to accurately simulate near-surface thermodynamic processes in the Arctic. PMID:15264599

  8. Impact of Patient and Procedure Mix on Finances of Perinatal Centres – Theoretical Models for Economic Strategies in Perinatal Centres

    PubMed Central

    Hildebrandt, T.; Kraml, F.; Wagner, S.; Hack, C. C.; Thiel, F. C.; Kehl, S.; Winkler, M.; Frobenius, W.; Faschingbauer, F.; Beckmann, M. W.; Lux, M. P.

    2013-01-01

    Introduction: In Germany, cost and revenue structures of hospitals with defined treatment priorities are currently being discussed to identify uneconomic services. This discussion has also affected perinatal centres (PNCs) and represents a new economic challenge for PNCs. In addition to optimising the time spent in hospital, the hospital management needs to define the “best” patient mix based on costs and revenues. Method: Different theoretical models were proposed based on the cost and revenue structures of the University Perinatal Centre for Franconia (UPF). Multi-step marginal costing was then used to show the impact on operating profits of changes in services and bed occupancy rates. The current contribution margin accounting used by the UPF served as the basis for the calculations. The models demonstrated the impact of changes in services on costs and revenues of a level 1 PNC. Results: Contribution margin analysis was used to calculate profitable and unprofitable DRGs based on average inpatient cost per day. Nineteen theoretical models were created. The current direct costing used by the UPF and a theoretical model with a 100 % bed occupancy rate were used as reference models. Significantly higher operating profits could be achieved by doubling the number of profitable DRGs and halving the number of less profitable DRGs. Operating profits could be increased even more by changing the rates of profitable DRGs per bed occupancy. The exclusive specialisation on pathological and high-risk pregnancies resulted in operating losses. All models which increased the numbers of caesarean sections or focused exclusively on c-sections resulted in operating losses. Conclusion: These theoretical models offer a basis for economic planning. They illustrate the enormous impact potential changes can have on the operating profits of PNCs. Level 1 PNCs require high bed occupancy rates and a profitable patient mix to cover the extremely high costs incurred due to the services

  9. Sensitivity analysis of the position of the intervertebral centres of reaction in upright standing--a musculoskeletal model investigation of the lumbar spine.

    PubMed

    Zander, Thomas; Dreischarf, Marcel; Schmidt, Hendrik

    2016-03-01

    The loads between adjacent vertebrae can be generalised as a single spatial force acting at the intervertebral centre of reaction. The exact position in vivo is unknown. However, in rigid body musculoskeletal models that simulate upright standing, the position is generally assumed to be located at the discs' centres of rotation. The influence of the antero-posterior position of the centre of reaction on muscle activity and joint loads remains unknown. Thus, by using an inverse dynamic model, we varied the position of the centre of reaction at L4/L5 (i), simultaneously at all lumbar levels (ii), and by optimisation at all lumbar levels (iii). Variation of the centres of reaction can considerably influence the activities of lumbar muscles and the joint forces between vertebrae. The optimisation of the position of the centre of reaction reduced the maximum lumbar muscle activity and axial joint forces at L4/L5 from 17.5% to 1.5% of the muscle strength and from 490 N to 390 N, respectively. Thus, when studying individual postures, such as for therapeutic or preventive evaluations, potential differences between the centre of reaction and the centre of rotation might influence the study results. These differences could be taken into account by sensitivity analyses. PMID:26774670

  10. Gyrokinetic particle simulation model

    SciTech Connect

    Lee, W.W.

    1986-07-01

    A new type of particle simulation model based on the gyrophase-averaged Vlasov and Poisson equations is presented. The reduced system, in which particle gyrations are removed from the equations of motion while the finite Larmor radius effects are still preserved, is most suitable for studying low-frequency microinstabilities in magnetized plasmas. It is feasible to simulate an elongated system (L∥ >> L⊥) with a three-dimensional grid using the present model without resorting to the usual mode expansion technique, since there is essentially no restriction on the size of Δx∥ in a gyrokinetic plasma. The new approach also enables us to further separate the time and spatial scales of the simulation from those associated with global transport through the use of multiple spatial scale expansion. Thus, the model can be a very efficient tool for studying anomalous transport problems related to steady-state drift-wave turbulence in magnetic confinement devices. It can also be applied to other areas of plasma physics.

  11. Data take centre stage at disease modelling symposium.

    PubMed

    2016-04-23

    The APHA hosted its sixth annual mathematical modelling symposium in January at its headquarters in Weybridge. As Charlotte Cook of the APHA reports, this year's event, 'Animal, plant and aquatic health modelling: making best use of evidence', was the biggest yet, with 90 modellers, scientists and policymakers attending, from government agencies and academic and research institutes in the UK and elsewhere in Europe. PMID:27103690

  12. A Model for a Regional Centre of Nutrition Education

    ERIC Educational Resources Information Center

    Icaza, Susana J.

    1974-01-01

    The continuing problem of insufficiency of food and the ensuing malnutrition can be dealt with only on a transdisciplinary basis. The organization of a successful model to overcome many of these difficulties is recounted here. (Author/RH)

  13. Radiative characteristics of the Canadian Climate Centre second-generation general circulation model

    SciTech Connect

    Barker, H.W. ); Li, Zhanqing ); Blanchet, J.P. )

    1994-07-01

    Several observational datasets were used to assess the quality of the radiative characteristics of the Canadian Climate Centre (CCC) second-generation GCM. The GCM data were obtained from the Atmospheric Model Intercomparison Project (AMIP) simulation. Data corresponding to the period January 1985 through December 1988 were examined since this period of the AMIP simulation overlaps with the Earth Radiation Budget Experiment (ERBE) and the International Satellite Cloud Climatology Project (ISCCP) datasets. Attention was given to mean January and July conditions. Optical properties of surfaces, clear skies, and cloudy skies were examined. Ocean albedos are too high in the Tropics and too low in the polar regions relative to surface observations and theoretical estimates. Compared to a satellite-derived dataset, however, they are slightly underestimated. Throughout much of the Sahara and Saudi Deserts surface albedos are too low, while for much of Western Australia they are too high. Excessive amounts of snow in Southeast Asia seem to have been sustained by a localized snow albedo feedback related to inappropriate snow albedo specification and a weak masking effect by vegetation. Neglect of freshwater lakes in the Canadian Shield leads to negative and positive albedo anomalies in winter and summer, respectively. Like many GCMs, the CCC model has too little atmospheric H₂O vapor. This results in too much outgoing longwave radiation from clear skies, especially in the Tropics. Neglect of all trace gases except for CO₂, and weak H₂O vapor absorption, compound this bias. 55 refs., 13 figs., 4 tabs.

  14. Quantifying the centre of rotation pattern in a multi-body model of the lumbar spine.

    PubMed

    Abouhossein, Alireza; Weisse, Bernhard; Ferguson, Stephen J

    2013-01-01

    Understanding the kinematics of the spine provides paramount knowledge for many aspects of the clinical analysis of back pain. More specifically, visualisation of the instantaneous centre of rotation (ICR) enables clinicians to quantify joint laxity in the segments, avoiding a dependence on more inconclusive measurements based on the range of motion and excessive translations, which vary in every individual. Alternatively, it provides motion-preserving designers with an insight into where the physiological ICR of a motion-preserving prosthesis can be situated in order to restore proper load distribution across the passive and active elements of the lumbar region. Prior to the use of an unconstrained dynamic musculoskeletal model system, based on multi-body models capable of transient analysis, to estimate segmental loads, the model must be kinematically evaluated for its sensitivity to ligament properties and the initial locus of the intervertebral disc (IVD). A previously calibrated osseoligamentous model of the lumbar spine was used to evaluate the changes in ICR under variation of the ligament stiffness and initial locus of the IVD, when subjected to pure moments from 0 to 15 Nm. The ICR was quantified based on the closed solution of the unit quaternion, which improves accuracy and prevents the coordinate singularities often observed in Euler-based methods and least squares principles. The calculation of the ICR during flexion/extension revealed complexity and intrinsic nonlinearity between flexion and extension. This study revealed that, to accommodate a good agreement between in vitro data and the multi-body model predictions, in flexion more laxity is required than in extension. The results showed that the ICR location is concentrated in the posterior region of the disc, in agreement with previous experimental studies. However, the current multi-body model demonstrates a sensitivity to the initial definition of the ICR, which should be recognised as a
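
    The quaternion-based ICR computation in the paper is three-dimensional; as a simplified planar illustration of the same idea, the ICR of a rigid motion (R, t) with nonzero rotation is its fixed point. The function name and the numbers below are illustrative, not from the study.

```python
import numpy as np

def planar_icr(theta, t):
    """Fixed point of the planar rigid motion p -> R(theta) @ p + t.

    Solves (I - R) p = t, which is well conditioned only when the
    rotation angle theta is not close to zero (a pure translation
    has no finite centre of rotation).
    """
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return np.linalg.solve(np.eye(2) - R, t)

# Rotate 90 degrees about the point (1, 0); the recovered ICR is (1, 0).
theta = np.pi / 2
centre = np.array([1.0, 0.0])
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
t = centre - R @ centre        # translation consistent with that rotation
icr = planar_icr(theta, t)
```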

  15. A Foreign Model of Teacher Education and Its Local Appropriation: The English Teachers' Centres in Spain

    ERIC Educational Resources Information Center

    Groves, Tamar

    2015-01-01

    This article explores the implementation of the English model of teachers' centres in the context of 1980s Spain. Originally it was a top-down plan initiated by a national government. However, from the very beginning its fate was dependent on a bottom-up educational project carried out by pedagogical social movements. The first part of the article…

  16. How to Determine the Centre of Mass of Bodies from Image Modelling

    ERIC Educational Resources Information Center

    Dias, Marco Adriano; Carvalho, Paulo Simeão; Rodrigues, Marcelo

    2016-01-01

    Image modelling is a recent technique in physics education that includes digital tools for image treatment and analysis, such as digital stroboscopic photography (DSP) and video analysis software. It is commonly used to analyse the motion of objects. In this work we show how to determine the position of the centre of mass (CM) of objects with…

  17. Electricity Portfolio Simulation Model

    Energy Science and Technology Software Center (ESTSC)

    2005-09-01

    Stakeholders often have competing interests when selecting or planning new power plants. The purpose of developing this preliminary Electricity Portfolio Simulation Model (EPSim) is to provide a first cut, dynamic methodology and approach to this problem, that can subsequently be refined and validated, that may help energy planners, policy makers, and energy students better understand the tradeoffs associated with competing electricity portfolios. EPSim allows the user to explore competing electricity portfolios annually from 2002 to 2025 in terms of five different criteria: cost, environmental impacts, energy dependence, health and safety, and sustainability. Four additional criteria (infrastructure vulnerability, service limitations, policy needs and science and technology needs) may be added in future versions of the model. Using an analytic hierarchy process (AHP) approach, users or groups of users apply weights to each of the criteria. The default energy assumptions of the model mimic Department of Energy’s (DOE) electricity portfolio to 2025 (EIA, 2005). At any time, the user can compare alternative portfolios to this reference case portfolio.
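
    The AHP weighting step reduces, in its simplest form, to a weighted sum over criterion scores. The criterion names below match the abstract, but the weights and the per-portfolio scores are hypothetical, not values from EPSim.

```python
def portfolio_score(scores, weights):
    """Aggregate per-criterion scores (higher = better) under user weights."""
    total = sum(weights.values())
    return sum(weights[c] * scores[c] for c in scores) / total

# Hypothetical user weights over EPSim's five criteria.
weights = {"cost": 4, "environmental_impacts": 3, "energy_dependence": 1,
           "health_and_safety": 1, "sustainability": 1}

# Hypothetical normalised scores for two competing portfolios.
reference = {"cost": 0.7, "environmental_impacts": 0.5, "energy_dependence": 0.6,
             "health_and_safety": 0.8, "sustainability": 0.4}
alternative = {"cost": 0.6, "environmental_impacts": 0.8, "energy_dependence": 0.7,
               "health_and_safety": 0.8, "sustainability": 0.7}

ref_score = portfolio_score(reference, weights)   # ≈ 0.61
alt_score = portfolio_score(alternative, weights) # ≈ 0.70
```

    A cost-focused user (weight 4 on cost) can still prefer the alternative here because its advantages on the other criteria outweigh its cost penalty, which is the kind of tradeoff EPSim is meant to surface.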

  18. SSPX simulation model

    SciTech Connect

    Fowler, T K

    1999-09-20

    An analytical approximation to an R-L-C circuit representing SSPX is shown to reproduce the observed capacitor bank efficiency and gun optimization data. As in the SPICE code, the spheromak gun is represented by a fixed resistance chosen to balance energy transfer to the gun. A revised estimate of the magnetic decay time in SSPX Shot 1822 then brings our estimate of the gun efficiency itself in line with the observed spheromak magnetic field for this shot. Prompted by these successes, we present a turbulence-based theoretical model for the spheromak resistance that can be implemented in the SPICE code, of the form: R_s = κI(1 − I₀/I)², where I is the gun current, I₀ = (λ₀/μ₀)Φ with bias flux Φ and Taylor eigenvalue λ₀, and κ is a coefficient based on the magnetic turbulence model employed in Dan Hua's spheromak simulation code. The value of κ giving a good energy balance (around 0.1 mΩ/kA) implies substantial turbulence levels. Implementing our model in SPICE would provide a calibration for theoretical calculations of the turbulence. Our analytic approximation to the SPICE code provides guidance to optimize future performance in SSPX, the greatest benefit appearing to come from reducing or eliminating the protective resistor to increase bank efficiency. Eliminating the resistor altogether doubles the bank efficiency and the spheromak magnetic energy.
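
    The proposed resistance law is straightforward to evaluate. In the sketch below, κ = 0.1 mΩ/kA is the value quoted in the abstract for good energy balance, while the gun current and threshold current are hypothetical values chosen only to exercise the formula.

```python
def spheromak_resistance(current_kA, I0_kA, kappa=0.1):
    """R_s = kappa * I * (1 - I0/I)**2.

    kappa in mOhm/kA, currents in kA, result in mOhm. The law vanishes
    at I = I0 and grows roughly linearly for I >> I0.
    """
    return kappa * current_kA * (1.0 - I0_kA / current_kA) ** 2

# kappa from the abstract; the currents below are hypothetical.
R_s = spheromak_resistance(current_kA=400.0, I0_kA=100.0)  # 22.5 mOhm
```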

  19. Simulations of the Galactic Centre Stellar Discs In a Warped Disc Origin Scenario

    NASA Astrophysics Data System (ADS)

    Ulubay-Siddiki, A.; Bartko, H.

    2012-07-01

    The Galactic Center (GC) hosts a population of young stars some of which seem to form a system of mutually inclined warped discs. While the presence of young stars in the close vicinity of the massive black hole is already problematic, their orbital configuration makes the situation even more puzzling. We present a possible warped disc origin scenario for these stars, which assumes an initially flat accretion disc which develops a warp through Pringle instability, or Bardeen-Petterson Effect. By working out the critical radii and the time scales involved, we argue that disc warping is plausible for GC parameters. We construct time evolution models for such discs considering the discs' self-gravity, and the torques exerted by the surrounding old star cluster. Our simulations suggest that the best agreement for a purely self-gravitating model is obtained for a disc-to-black hole mass ratio of Md/Mbh ~ 0.001.

  20. Contrail Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Paoli, Roberto; Shariff, Karim

    2016-01-01

    There is large uncertainty in the radiative forcing induced by aircraft contrails, particularly after they transform to cirrus. It has recently become possible to simulate contrail evolution for long periods after their formation. We review the main physical processes and simulation efforts in the four phases of contrail evolution, namely the jet, vortex, vortex dissipation, and diffusion phases. Recommendations for further work are given.

  1. From Land-surface Snow Cover To Routed River Discharge In The Rossby Centre Regional Climate Model

    NASA Astrophysics Data System (ADS)

    Samuelsson, P.; Gollvik, S.; Graham, L. P.; Bringfelt, B.

    The purpose of a land-surface scheme (LSS) in a coupled atmosphere-land-ocean model system is in general to provide the atmosphere with correct turbulent and radiation fluxes and to provide the routing scheme with correct water runoff flux. "Correct" includes both amounts and time distributions. Snow is a very important process to consider in an LSS used for simulations at high latitudes. Snow has extreme radiation and heat transfer properties and it accumulates water during the winter season. Therefore, both the amount and the distribution of snow have to be carefully simulated. The LSS in the Rossby Centre Regional Climate Model (RCA) separates snow storage on open land and in forest. Both storages are single-layered, include liquid water and use the energy balance to solve for time changes in temperature and water storage. The runoff generated in each grid cell, e.g. from snow melt, is used in a routing scheme to produce river discharge for the ocean model. The routing scheme is based on the hydrological HBV model, which means that it is calibrated separately for each individual sub-basin. Validations of snow cover and river discharge simulations will be presented.
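
    The RCA snow storages solve a full energy balance; as a minimal stand-in, the sketch below uses a degree-day melt factor (a common simplification, and an assumption here, not the RCA formulation) to show how a single-layer snow store produces the runoff that feeds the routing scheme.

```python
def step_snowpack(swe, temp_c, precip_mm, ddf=3.0, t_melt=0.0):
    """One daily update of a single-layer snow store.

    swe is snow water equivalent (mm); ddf is a degree-day melt factor
    (mm per degC per day). Returns (new_swe, runoff_mm), where runoff
    is what a routing scheme such as HBV would receive for this cell.
    """
    if temp_c <= t_melt:
        return swe + precip_mm, 0.0           # precipitation accumulates as snow
    melt = min(swe, ddf * (temp_c - t_melt))  # melt is limited by storage
    return swe - melt, melt + precip_mm       # rain plus melt leave as runoff

# A warm day (4 degC) with 2 mm of rain on a 50 mm snowpack.
swe, runoff = step_snowpack(swe=50.0, temp_c=4.0, precip_mm=2.0)
```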

  2. Dedicated Space Science Education Centres Provide the Model for Effective Outreach

    NASA Astrophysics Data System (ADS)

    Brumfitt, A.

    Planetaria and science centres are traditionally successful players in engaging all levels and ages of society. They have long played a supportive role to and within education, and their value in teacher circles has always been recognised as an effective resource. Given the decline in career choices in the traditional Science, Technology, Engineering and Mathematics (STEM) fields, including astronomy and the planetary sciences, they are now more important than ever. Since their inception, the role and function of planetaria have been required to evolve to meet the changing demands of society. They are now faced with the challenge of meeting new requirements and the need for new and different resources, techniques, support and funding models to deliver effectively to new target groups. To face these challenges, these pivotal centres require new methodology in the development of programs that effectively support education: new directions specifically tailored for teacher professional development and for student studies. The changing requirements have resulted in a new kind of science centre, one specially designed around space science and dedicated to formal education across STEM activities. The space scientist plays an integral and key role in this type of centre by providing the science, the passion of discovery and the relevance of the science to the community. These programs need to be carefully aligned to flexible course requirements and objectives to ensure relevancy to the education and outreach sector. They need access to, and the support and input from, scientists and research institutions. They need real and appropriate material and resources. Scientists need effective channels through which to inform and share their work. Here is the potential for enormously effective symbiosis. This paper describes how new multi-million-dollar state-of-the-art space science centres are working with cutting edge science, research institutes, universities, government

  3. The El Nino-Southern Oscillation in the second Hadley Centre coupled model and its response to greenhouse warming

    SciTech Connect

    Collins, M.

    2000-04-01

    This paper describes El Nino-Southern Oscillation (ENSO) interannual variability simulated in the second Hadley Centre coupled model under control and greenhouse warming scenarios. The model produces a very reasonable simulation of ENSO in the control experiment--reproducing the amplitude, spectral characteristics, and phase locking to the annual cycle that are observed in nature. The mechanism for the model ENSO is shown to be a mixed SST-ocean dynamics mode that can be interpreted in terms of the ocean recharge paradigm of Jin. In experiments with increased levels of greenhouse gases, no statistically significant changes in ENSO are seen until these levels approach four times preindustrial values. In these experiments, the model ENSO has an approximately 20% larger amplitude, a frequency that is approximately double that of the current ENSO (implying more frequent El Ninos and La Ninas), and phase locks to the annual cycle at a different time of year. It is shown that the increase in the vertical gradient of temperature in the thermocline region, associated with the model's response to increased greenhouse gases, is responsible for the increase in the amplitude of ENSO, while the increase in meridional temperature gradients on either side of the equator, again associated with the model's response to increasing greenhouse gases, is responsible for the increased frequency of ENSO events.

  4. Numerical wind speed simulation model

    SciTech Connect

    Ramsdell, J.V.; Athey, G.F.; Ballinger, M.Y.

    1981-09-01

    A relatively simple stochastic model for simulating wind speed time series, usable as an alternative to time series from representative locations, is described in this report. The model incorporates systematic seasonal variation of the mean wind speed, its standard deviation, and the correlation of speeds. It also incorporates systematic diurnal variation of the mean speed and standard deviation. To demonstrate the model's capabilities, simulations were made using model parameters derived from data collected at the Hanford Meteorology Station, and the results of analyses of simulated and actual data were compared.
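
    A minimal version of such a stochastic generator is an autoregressive (AR(1)) process around a time-varying mean. The structure below, and all parameter values, are illustrative assumptions, not the Hanford-derived parameters of the report.

```python
import math
import random

def simulate_wind(hours, mean=6.0, diurnal_amp=1.5, sigma=1.2, rho=0.9, seed=1):
    """AR(1) deviations around a diurnally varying mean wind speed (m/s).

    rho sets the hour-to-hour autocorrelation of the deviations; sigma
    is their stationary standard deviation; diurnal_amp imposes the
    systematic daily cycle of the mean.
    """
    rng = random.Random(seed)
    series, dev = [], 0.0
    for h in range(hours):
        m = mean + diurnal_amp * math.sin(2.0 * math.pi * (h % 24) / 24.0)
        dev = rho * dev + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, sigma)
        series.append(max(0.0, m + dev))  # wind speed cannot be negative
    return series

speeds = simulate_wind(10 * 24)  # ten simulated days at hourly resolution
```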

  5. Hybrid modelling for ATES planning and operation in the Utrecht city centre

    NASA Astrophysics Data System (ADS)

    Jaxa-Rozen, Marc; Bloemendal, Martin; Kwakkel, Jan; Rostampour, Vahab

    2016-04-01

    Aquifer Thermal Energy Storage (ATES) systems can significantly reduce the energy use and greenhouse gas emissions of buildings in temperate climates. However, the rapid adoption of these systems has evidenced a number of emergent issues with the operation and management of urban ATES systems, which require careful spatial planning to avoid thermal interferences or conflicts with other subsurface functions. These issues have become particularly relevant in the Netherlands, which are currently the leading market for ATES (Bloemendal et al., 2015). In some urban areas of the country, the adoption of ATES technology is thus becoming limited by the available subsurface space. This scarcity is partly caused by current approaches to ATES planning; as such, static permits tend to overestimate pumping rates and yield excessive safety margins, which in turn hamper the energy savings which could be realized by new systems. These aspects are strongly influenced by time-dependent dynamics for the adoption of ATES systems by building owners and operators, and by the variation of ATES well flows under uncertain conditions for building energy demand. In order to take these dynamics into account, previous research (Jaxa-Rozen et al., 2015) introduced a hybrid simulation architecture combining an agent-based model of ATES adoption, a Matlab control design, and a MODFLOW/SEAWAT aquifer model. This architecture was first used to study an idealized case of urban ATES development. This case evidenced a trade-off between the thermal efficiency of individual systems and the collective energy savings realized by ATES systems within a given area, which had already been suggested by other research (e.g. Sommer et al., 2015). These results also indicated that current layout guidelines may be overly conservative, and limit the adoption of new systems. The present study extends this approach to a case study of ATES planning in the city centre of Utrecht, in the Netherlands. This case is

  7. Simulation Models in Higher Education.

    ERIC Educational Resources Information Center

    Morrisseau, James J.

    1973-01-01

    This paper, adapted from a Society for College and University Planning conference, discusses cost simulation models in higher education. Emphasis is placed on the art of management, mini-models vs. maxi-models, the useful model, the reporting problem, anatomy of failure, information vs. action, and words of caution. (MJM)

  8. Cosmic ray models of the ridge-like excess of gamma rays in the Galactic Centre

    NASA Astrophysics Data System (ADS)

    Macias, Oscar; Gordon, Chris; Crocker, Roland M.; Profumo, Stefano

    2015-08-01

    The High-Energy Stereoscopic System (HESS) has detected diffuse TeV emission correlated with the distribution of molecular gas along the Ridge at the Galactic Centre. Diffuse, non-thermal emission is also seen by the Fermi large area telescope (Fermi-LAT) in the GeV range and by radio telescopes in the GHz range. Additionally, there is a distinct, spherically symmetric excess of gamma rays seen by Fermi-LAT in the GeV range. A cosmic ray flare, occurring in the Galactic Centre 10^4 yr ago, has been proposed to explain the TeV Ridge. An alternative, steady-state model explaining all three data sets (TeV, GeV, and radio) invokes purely leptonic processes. We show that the flare model from the Galactic Centre also provides an acceptable fit to the GeV and radio data, provided the diffusion coefficient is energy independent. However, if Kolmogorov-type turbulence is assumed for the diffusion coefficient, we find that two flares are needed, one for the TeV data (occurring approximately 10^4 yr ago) and an older one for the GeV data (approximately 10^5 yr old). We find that the flare models we investigate do not fit the spherically symmetric GeV excess as well as the usual generalized Navarro-Frenk-White spatial profile, but are better suited to explain the Ridge. We also show that a range of single-zone, steady-state models are able to explain all three spectral data sets. Large gas densities equal to the volumetric average in the region can be accommodated by an energy-independent diffusion or streaming based steady-state model. Additionally, we investigate how the flare and steady-state models may be distinguished with future gamma-ray data looking for a spatial dependence of the gamma-ray spectral index.
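
    The role of the diffusion coefficient's energy dependence can be seen from the diffusive propagation length r ≈ sqrt(4 D t). The normalisation D0, the energies and the flare age below are generic illustrative choices, not the paper's fitted values.

```python
import math

KPC_IN_CM = 3.086e21   # one kiloparsec in cm
YR_IN_S = 3.156e7      # one year in seconds

def diffusion_length_kpc(D0_cm2_s, E_GeV, t_yr, delta):
    """Propagation length r = sqrt(4 D t), with D = D0 * E**delta."""
    D = D0_cm2_s * E_GeV ** delta
    return math.sqrt(4.0 * D * t_yr * YR_IN_S) / KPC_IN_CM

# delta = 0 (energy-independent): all energies spread equally, so one
# flare age can fit GeV and TeV data simultaneously.
# delta = 1/3 (Kolmogorov): TeV particles outrun GeV ones for the same
# flare age, which is why two flares of different ages are needed.
r_gev = diffusion_length_kpc(1e28, 10.0, 1e4, delta=1.0 / 3.0)
r_tev = diffusion_length_kpc(1e28, 1e4, 1e4, delta=1.0 / 3.0)
```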

  9. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1988-01-01

    The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to. Furthermore, this improved environment is achieved by a specification language that is more natural to the user's problem domain and to the user's way of thinking and looking at the problem. The goal of this research is to apply the concepts of automatic programming (AP) to modeling discrete event simulation system. Specific emphasis is on the design and development of simulation tools to assist the modeler define or construct a model of the system and to then automatically write the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.
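
    The core idea of writing the target simulation code automatically from a higher-level description can be sketched as a tiny template-based generator. The spec fields and the emitted single-server queue below are simplified illustrations; real GPSS/PC programs carry considerably more structure.

```python
def emit_gpss(spec):
    """Emit a GPSS-style single-server queue model from a small spec dict."""
    return "\n".join([
        f"GENERATE  {spec['interarrival']}",   # customer arrivals
        f"QUEUE     {spec['queue']}",
        f"SEIZE     {spec['server']}",         # acquire the single server
        f"DEPART    {spec['queue']}",
        f"ADVANCE   {spec['service']}",        # service time
        f"RELEASE   {spec['server']}",
        "TERMINATE 1",
    ])

# The modeler supplies only a problem-domain description; the tool
# writes the corresponding simulation-language statements.
code = emit_gpss({"interarrival": 12, "service": 10,
                  "queue": "LINE", "server": "CLERK"})
```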

  10. Simulation modeling of estuarine ecosystems

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1980-01-01

    A simulation model has been developed of the Galveston Bay, Texas, ecosystem. Secondary productivity, measured by harvestable species (such as shrimp and fish), is evaluated in terms of man-related and controllable factors, such as the quantity and quality of inlet fresh water and pollutants. This simulation model used information from an existing physical parameters model as well as pertinent biological measurements obtained by conventional sampling techniques. Predicted results from the model compared favorably with those from comparable investigations. In addition, this paper discusses remotely sensed and conventional measurements in the framework of prospective models that may be used to study estuarine processes and ecosystem productivity.

  11. TREAT Modeling and Simulation Strategy

    SciTech Connect

    DeHart, Mark David

    2015-09-01

    This report summarizes a four-phase process used to describe the strategy in developing modeling and simulation software for the Transient Reactor Test Facility. The four phases of this research and development task are identified as (1) full core transient calculations with feedback, (2) experiment modeling, (3) full core plus experiment simulation and (4) quality assurance. The document describes the four phases, the relationship between these research phases, and anticipated needs within each phase.

  12. Modeling and Simulation at NASA

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2009-01-01

    This slide presentation is composed of two topics. The first reviews the use of modeling and simulation (M&S), particularly as it relates to the Constellation program and discrete event simulation (DES). DES is defined as a process and system analysis, through time-based and resource-constrained probabilistic simulation models, that provides insight into operational system performance. The DES shows that the cycle for a launch, from manufacturing and assembly through launch and recovery, is about 45 days and that approximately 4 launches per year are practicable. The second topic reviews a NASA Standard for Modeling and Simulation. The Columbia Accident Investigation Board made some recommendations related to models and simulations. Some of the ideas inherent in the new standard are the documentation of M&S activities, an assessment of credibility, and reporting to decision makers, which should include the analysis of the results, a statement as to the uncertainty in the results, and the credibility of the results. There is also discussion of verification and validation (V&V) of models and of the different types of models and simulations.
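The discrete event simulation described above can be sketched as a minimal event-queue loop. This is an illustrative assumption, not NASA's actual model: the 45-day cycle is taken from the abstract, while the event names and handler shape are invented for the sketch.

```python
import heapq

def run_des(events, horizon):
    """Minimal discrete event simulation loop: repeatedly pop the earliest
    scheduled event and execute its handler, which may schedule new events."""
    heap = list(events)
    heapq.heapify(heap)
    log = []
    while heap:
        time, name, handler = heapq.heappop(heap)
        if time > horizon:
            break
        log.append((time, name))
        for new_event in handler(time):
            heapq.heappush(heap, new_event)
    return log

# Hypothetical 45-day processing cycle: each launch schedules the next one.
def launch(t):
    return [(t + 45.0, "launch", launch)]

log = run_des([(0.0, "launch", launch)], horizon=365.0)
print(len(log))  # launches at t = 0, 45, ..., 360 days -> 9 events
```

A real model would add resource constraints and stochastic task durations, which is what brings the practicable rate down to the ~4 launches per year cited above.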

  13. Advanced Space Shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1982-01-01

    A non-recursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed. It provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes, entitled Shuttle Simulation Turbulence Tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 120,000 meters. A description of the turbulence generation procedure is provided. The results of validating the simulated turbulence are described. Conclusions and recommendations are presented. One-dimensional von Karman spectra are tabulated, and a discussion of the minimum frequency simulated is provided. The results of spectral and statistical analyses of the SSTT are presented.

  14. Infrared simulation model SENSAT-2.

    PubMed

    Richter, R

    1987-06-15

    The computer model SENSAT-2 has been developed for remote sensing applications of passive sensors in the 1-28-μm infrared spectral region. The model calculates the IR signature of up to three homogeneous objects in the instantaneous field of view of the sensor. For the atmospheric part, model LOWTRAN-6 is used within SENSAT-2. Model SENSAT-2 can be used for mission analysis of sensors on different platforms: ground-based, aircraft, or satellite. It is a useful design tool for simulating and assessing the radiometric relations that are indispensable in designing sensors. Further uses include the comparison of measurements with simulation results and the radiometric correction of measurements. PMID:20489878

  15. Earthquake and failure forecasting in real-time: A Forecasting Model Testing Centre

    NASA Astrophysics Data System (ADS)

    Filgueira, Rosa; Atkinson, Malcolm; Bell, Andrew; Main, Ian; Boon, Steven; Meredith, Philip

    2013-04-01

    Across Europe there are a large number of rock deformation laboratories, each of which runs many experiments. Similarly there are a large number of theoretical rock physicists who develop constitutive and computational models both for rock deformation and for changes in geophysical properties. Here we consider how to open up opportunities for sharing experimental data in a way that is integrated with multiple hypothesis testing. We present a prototype for a new forecasting model testing centre based on e-infrastructures for capturing and sharing data and models to accelerate Rock Physics (RP) research. This proposal is triggered by our work on data assimilation in the NERC EFFORT (Earthquake and Failure Forecasting in Real Time) project, using data provided by the NERC CREEP 2 experimental project as a test case. EFFORT is a multi-disciplinary collaboration between geoscientists, rock physicists and computer scientists. Brittle failure of the crust is likely to play a key role in controlling the timing of a range of geophysical hazards, such as volcanic eruptions, yet the predictability of brittle failure is unknown. Our aim is to provide a facility for developing and testing models to forecast brittle failure in experimental and natural data. Model testing is performed in real-time, verifiably prospective mode, in order to avoid the selection biases that are possible in retrospective analyses. The project will ultimately quantify the predictability of brittle failure, and how this predictability scales from simple, controlled laboratory conditions to the complex, uncontrolled real world. Experimental data are collected from controlled laboratory experiments, including data from the UCL laboratory and from the CREEP 2 project, which will undertake experiments in a deep-sea laboratory. 
We illustrate the properties of the prototype testing centre by streaming and analysing realistically noisy synthetic data, as an aid to generating and improving testing methodologies in

  16. Stochastic models: theory and simulation.

    SciTech Connect

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
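As a minimal illustration of generating samples of a stochastic process, the sketch below draws one Euler-Maruyama sample path of an Ornstein-Uhlenbeck process. The parameter values and function name are arbitrary assumptions for the illustration; the report's own algorithms may differ.

```python
import random

def ou_path(theta=1.0, mu=0.0, sigma=0.5, x0=2.0, dt=0.01, n=1000, seed=42):
    """Euler-Maruyama sample path of an Ornstein-Uhlenbeck process:
    dX = theta*(mu - X) dt + sigma dW, a mean-reverting Gaussian process."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n):
        dw = rng.gauss(0.0, dt ** 0.5)  # Brownian increment, std sqrt(dt)
        x += theta * (mu - x) * dt + sigma * dw
        path.append(x)
    return path

path = ou_path()
print(len(path), path[0])  # 1001 samples, starting from x0 = 2.0
```

Each call with a fresh seed yields an independent sample of the process, which is exactly what downstream deterministic simulation codes consume as random inputs or boundary conditions.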

  17. Hybrid guiding-centre/full-orbit simulations in non-axisymmetric magnetic geometry exploiting general criterion for guiding-centre accuracy

    NASA Astrophysics Data System (ADS)

    Pfefferlé, D.; Graves, J. P.; Cooper, W. A.

    2015-05-01

    To identify under what conditions guiding-centre or full-orbit tracing should be used, an estimation of the spatial variation of the magnetic field is proposed, not only taking into account gradient and curvature terms but also parallel currents and the local shearing of field-lines. The criterion is derived for general three-dimensional magnetic equilibria including stellarator plasmas. Details are provided on how to implement it in cylindrical coordinates and in flux coordinates that rely on the geometric toroidal angle. A means of switching between guiding-centre and full-orbit equations at first order in Larmor radius with minimal discrepancy is shown. Techniques are applied to a MAST (mega amp spherical tokamak) helical core equilibrium in which the inner kinked flux-surfaces are tightly compressed against the outer axisymmetric mantle and where the parallel current peaks at the nearly rational surface. This is put in relation with the simpler situation B(x, y, z) = B0[sin(kx)ey + cos(kx)ez], for which full orbits and lowest order drifts are obtained analytically. In the kinked equilibrium, the full orbits of NBI fast ions are solved numerically and shown to follow helical drift surfaces. This result partially explains the off-axis redistribution of neutral beam injection fast particles in the presence of MAST long-lived modes (LLM).
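The analytic test field B(x, y, z) = B0[sin(kx)e_y + cos(kx)e_z] quoted above is easy to integrate numerically. Below is a hedged full-orbit sketch using the standard Boris scheme with E = 0, which exactly preserves the particle speed in a pure magnetic field; the parameter values and function names are illustrative, not taken from the paper's code.

```python
import math

def cross(a, b):
    # Vector cross product a x b for 3-tuples.
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def bfield(x, B0=1.0, k=1.0):
    # Sheared test field from the paper: B = B0 [sin(kx) e_y + cos(kx) e_z].
    return (0.0, B0 * math.sin(k * x[0]), B0 * math.cos(k * x[0]))

def boris_push(x, v, dt, qm=1.0):
    """One Boris step with E = 0: a pure rotation of v about the local B."""
    t = tuple(qm * dt / 2 * b for b in bfield(x))
    t2 = sum(c * c for c in t)
    s = tuple(2 * c / (1 + t2) for c in t)
    vxt = cross(v, t)
    vp = tuple(v[i] + vxt[i] for i in range(3))
    vpxs = cross(vp, s)
    v_new = tuple(v[i] + vpxs[i] for i in range(3))
    x_new = tuple(x[i] + v_new[i] * dt for i in range(3))
    return x_new, v_new

x, v = (0.0, 0.0, 0.0), (0.1, 0.0, 0.0)
for _ in range(2000):
    x, v = boris_push(x, v, 0.01)
speed = math.sqrt(sum(c * c for c in v))
print(speed)  # the Boris rotation preserves |v| in a magnetic field
```

Comparing such full orbits against the analytic lowest-order drifts is the kind of check the paper uses to validate its switching criterion.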

  18. Tree Modeling and Dynamics Simulation

    NASA Astrophysics Data System (ADS)

    Tian-shuang, Fu; Yi-bing, Li; Dong-xu, Shen

    This paper introduces the theory of tree modeling and dynamic motion simulation in computer graphics. After comparing many methods we choose geometry-based rendering. The tree is decomposed into branches and leaves; using rotation and quaternion methods we realize the tree animation and avoid the gimbal lock of Euler rotations. We take Ogre 3D as the render engine, which has good graphics programming capability. In the end we realize the tree modeling and dynamic motion simulation, achieving realistic visual quality with little computation cost.
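Quaternion rotation, used above to avoid the gimbal lock of Euler angles, can be sketched as follows. This is a generic illustration of the v' = q·v·q⁻¹ sandwich product, not the paper's Ogre 3D implementation.

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation by `angle` about `axis`."""
    n = math.sqrt(sum(a * a for a in axis))
    s = math.sin(angle / 2) / n
    return (math.cos(angle / 2), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(q, r):
    # Hamilton product of two quaternions.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(q, v):
    """Rotate vector v by unit quaternion q via q * v * q^-1."""
    qv = (0.0, *v)
    qc = (q[0], -q[1], -q[2], -q[3])  # conjugate = inverse for unit q
    return quat_mul(quat_mul(q, qv), qc)[1:]

# Rotate a branch direction 90 degrees about the z axis.
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print(rotate(q, (1.0, 0.0, 0.0)))  # approximately (0, 1, 0)
```

Because the rotation axis is arbitrary and interpolation between quaternions is smooth, chained branch rotations never hit the degenerate configurations that Euler-angle composition can.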

  19. Supporting families with Cancer: A patient centred survivorship model of care.

    PubMed

    Craft, Emily Victoria; Billington, Caron; O'Sullivan, Rory; Watson, Wendy; Suter-Giorgini, Nicola; Singletary, Joanne; King, Elizabeth; Perfirgines, Matthew; Cashmore, Annette; Barwell, Julian

    2015-12-01

    In 2011, the Leicestershire Clinical Genetics Department in collaboration with Macmillan Cancer Support initiated a project called Supporting Families with Cancer (SFWC). The project aimed to raise awareness of inherited cancers amongst both healthcare professionals and the general public and develop a patient-centred collaborative approach to cancer treatment and support services. This paper describes the project's development of a range of community outreach events and a training scheme for primary healthcare professionals designed to improve familial cancer referral rates in Leicester. Following consultation with patients and support groups, a series of interactive 'medical supermarket' events were held in Leicester. These events focused on providing patients with a forum for sharing research data, information about diagnosis and treatments and access to support groups and other allied healthcare services with additional information being made available digitally via SFWC webpages and a series of short videos available on a YouTube channel. Qualitative and quantitative data presented here indicate that the SFWC medical supermarket model has been well received by patients and offers a patient-centred, holistic approach to cancer treatment. PMID:26077135

  20. Models for institutional and professional accreditation of haemophilia centres in Italy.

    PubMed

    Calizzani, G; Vaglio, S; Arcieri, R; Menichini, I; Tagliaferri, A; Antoncecchi, S; Carloni, M T; Breda, A; Santagostino, E; Ghirardini, A; Tamburrini, M R; Morfini, M; Mannucci, P M; Grazzini, G

    2013-07-01

    The Health Commission of the Conference between the Italian State and Regions recognized the need to establish an institutional accreditation model for Haemophilia Centres (HCs), to be implemented by 21 Regions, in order to provide patients with haemophilia and allied inherited coagulation disorders with high and uniform standards of care. The Italian National Blood Centre, on behalf of the Commission, convened a panel of clinicians, patients, experts, and representatives from the Regions and the Ministry of Health. The agreed methodology included: systematic literature review and best practice collection, analysis of provisions and regulations of currently available services, priority setting, and definition of principles and criteria for the development of recommendations on the optimal requirements for HCs. The result was the formulation of two sets of recommendations. The first concerns regional policy planning, in which the following aspects of comprehensive haemophilia care should be considered for implementation: monitoring and auditing, a multidisciplinary approach to clinical care, protocols for emergency management, home treatment and its monitoring, patient registries, drug availability and procurement, and recruitment and training of health care professionals. The second set concerns the accreditation process and lists 23 organizational requirements for level 1 HCs and 4 additional requirements for level 2 HCs. These recommendations help to provide Italian Regional Health Authorities with an organizational framework for the provision of comprehensive care to patients with inherited coagulation disorders based on current scientific evidence. PMID:23556420

  1. Describing team development within a novel GP-led urgent care centre model: a qualitative study

    PubMed Central

    Igantowicz, Agnieszka; Gnani, Shamini; Greenfield, Geva

    2016-01-01

    Objective Urgent care centres (UCCs) co-located within an emergency department were developed to reduce the numbers of inappropriate emergency department admissions. Since then various UCC models have developed, including a novel general practitioner (GP)-led UCC that incorporates both GPs and emergency nurse practitioners (ENPs). Traditionally these two groups do not work alongside each other within an emergency setting. Although good teamwork is crucial to better patient outcomes, there is little within the literature about the development of a team consisting of different healthcare professionals in a novel healthcare setting. Our aim was therefore to describe staff members' perspectives of team development within the GP-led UCC model. Design Open-ended semistructured interviews, analysed using thematic content analysis. Setting GP-led urgent care centres in two academic teaching hospitals in London. Participants 15 UCC staff members including six GPs, four ENPs, two receptionists and three managers. Results Overall participants were positive about the interprofessional team that had developed and recognised that this process had taken time. Hierarchy within the UCC setting has diminished with time, although some residual hierarchical beliefs do appear to remain. Staff appreciated that interdisciplinary collaboration was likely to improve patient care. Eight key facilitating factors for the team were identified: appointment of leaders, perception of fair workload, education on roles/skill sets and development of these, shared professional understanding, interdisciplinary working, ED collaboration, clinical guidelines and social interactions. Conclusions A strong interprofessional team has evolved within the GP-led UCCs over time, breaking down traditional professional divides. Future implementation of UCC models should proactively incorporate the eight facilitating factors identified from the outset, to enable effective teams to develop more quickly. 
PMID:27338875

  2. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler define the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS) system; (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.

  3. Economic Analysis. Computer Simulation Models.

    ERIC Educational Resources Information Center

    Sterling Inst., Washington, DC. Educational Technology Center.

    A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…

  4. University Macro Analytic Simulation Model.

    ERIC Educational Resources Information Center

    Baron, Robert; Gulko, Warren

    The University Macro Analytic Simulation System (UMASS) has been designed as a forecasting tool to help university administrators make budgeting decisions. Alternative budgeting strategies can be tested on a computer model and then an operational alternative can be selected on the basis of the most desirable projected outcome. UMASS uses readily…

  5. The European ALMA Regional Centre Network: A Geographically Distributed User Support Model

    NASA Astrophysics Data System (ADS)

    Hatziminaoglou, E.; Zwaan, M.; Andreani, P.; Barta, M.; Bertoldi, F.; Brand, J.; Gueth, F.; Hogerheijde, M.; Maercker, M.; Massardi, M.; Muehle, S.; Muxlow, Th.; Richards, A.; Schilke, P.; Tilanus, R.; Vlemmings, W.; Afonso, J.; Messias, H.

    2015-12-01

    In recent years there has been a paradigm shift from centralised to geographically distributed resources. Individual entities are no longer able to host or afford the necessary expertise in-house, and, as a consequence, society increasingly relies on widespread collaborations. Although such collaborations are now the norm for scientific projects, more technical structures providing support to a distributed scientific community without direct financial or other material benefits are scarce. The network of European ALMA Regional Centre (ARC) nodes is an example of such an internationally distributed user support network. It is an organised effort to provide the European ALMA user community with uniform expert support to enable optimal usage and scientific output of the ALMA facility. The network model for the European ARC nodes is described in terms of its organisation, communication strategies and user support.

  6. How to determine the centre of mass of bodies from image modelling

    NASA Astrophysics Data System (ADS)

    Adriano Dias, Marco; Simeão Carvalho, Paulo; Rodrigues, Marcelo

    2016-03-01

    Image modelling is a recent technique in physics education that includes digital tools for image treatment and analysis, such as digital stroboscopic photography (DSP) and video analysis software. It is commonly used to analyse the motion of objects. In this work we show how to determine the position of the centre of mass (CM) of objects with either isotropic or anisotropic mass density, by video analysis as a video-based experimental activity (VBEA). Strobe imaging is also presented from an educational point of view, helping students to visualize the complex motion of a rigid body with a heterogeneous structure. As an example, we present a hammer tossed with translation and rotation. The technique shown here is valid for almost any kind of object and it is very useful for working with the concept of CM.
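The centre-of-mass definition underlying the technique, r_CM = Σ mᵢrᵢ / Σ mᵢ, can be checked numerically. The hammer masses and positions below are made-up illustrative values, not measurements from the paper.

```python
def centre_of_mass(parts):
    """parts: list of (mass, (x, y)) pairs; returns the mass-weighted mean
    position, i.e. the centre of mass of the composite body."""
    total = sum(m for m, _ in parts)
    x = sum(m * p[0] for m, p in parts) / total
    y = sum(m * p[1] for m, p in parts) / total
    return x, y

# Hypothetical hammer: 0.5 kg head at y = 0.30 m, 0.2 kg uniform handle
# whose own CM sits at its midpoint, y = 0.15 m.
parts = [(0.5, (0.0, 0.30)), (0.2, (0.0, 0.15))]
print(centre_of_mass(parts))  # (0.0, ~0.257): the CM sits close to the head
```

This is why a tossed hammer appears to tumble about a point near the head: that point, the CM, is the one following the simple parabolic trajectory in the strobe images.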

  7. Modeling and Simulation for Safeguards

    SciTech Connect

    Swinhoe, Martyn T.

    2012-07-26

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R&D and introduce you to (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) calculating amounts of material (plant modeling); (2) calculating signatures of nuclear material, etc. (source terms); and (3) assessing detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and amounts of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can then determine the measurement accuracy required to achieve a certain performance.
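The MUF (material unaccounted for) evaluation mentioned above follows the material-balance identity MUF = (beginning inventory + additions) − (removals + ending inventory), with the balance uncertainty obtained by combining independent measurement errors in quadrature. The numbers below are invented for illustration, not from any real facility.

```python
import math

def muf(begin, additions, removals, end):
    """Material unaccounted for: book inventory minus measured physical inventory."""
    return (begin + additions) - (removals + end)

def muf_sigma(sigmas):
    """Combined 1-sigma uncertainty of the balance, assuming the four
    inventory measurements have independent errors (quadrature sum)."""
    return math.sqrt(sum(s * s for s in sigmas))

# Hypothetical balance period (all masses in kg of nuclear material):
m = muf(begin=100.0, additions=40.0, removals=35.0, end=104.5)
s = muf_sigma([0.2, 0.1, 0.1, 0.2])
print(m, s)  # MUF = 0.5 kg with sigma ~0.32 kg
```

Comparing the observed MUF against a multiple of this sigma is what lets the analyst decide whether an imbalance is measurement noise or a potential diversion, and conversely what measurement accuracy is needed for a target detection goal.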

  8. Multiscale Stochastic Simulation and Modeling

    SciTech Connect

    James Glimm; Xiaolin Li

    2006-01-10

    Acceleration-driven instabilities of fluid mixing layers include the classical cases of Rayleigh-Taylor instability, driven by a steady acceleration, and Richtmyer-Meshkov instability, driven by an impulsive acceleration. Our program starts with high-resolution methods for the numerical simulation of two (or more) distinct fluids, continues with analytic analysis of these solutions, and the derivation of averaged equations. A striking achievement has been the systematic agreement we obtained between simulation and experiment by using a high-resolution numerical method and improved physical modeling, with surface tension. Our study is accompanied by analysis using stochastic modeling and averaged equations for the multiphase problem. We have quantified the error and uncertainty using statistical modeling methods.

  9. The HWVP availability simulation model

    SciTech Connect

    Reisdorf, J.; Sienko, F.; Melville, D.; Gogg, T.

    1994-12-31

    This report describes the Hanford Waste Vitrification Plant (HWVP) availability simulation model. The model was utilized to simulate the performance and repair of remote handling equipment at the vitrification plant. The simulation model demonstrates that the HWVP has an availability of ~85%. It also shows that both the MC and CDC cranes have a high utilization factor of ~70%. This means that the cranes' idle time of ~30% may not be sufficient to meet off-normal events such as canister rework. A study is recommended to optimize the crane operations in these areas. The ST/ET crane's utilization factor is 16%, indicating that it can meet upset conditions. The analysis also shows that the canyon crane has a utilization factor of 29%, or it is idle 61% of the time. This large amount of inactive time demonstrates that the crane can service failed equipment without affecting production.

  10. Assessment of Molecular Modeling & Simulation

    SciTech Connect

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  11. Guiding-centre and full-Lorentz orbit solving in 3D magnetic coordinates for fast particle simulations

    NASA Astrophysics Data System (ADS)

    Cooper, Wilfred A.; Pfefferle, David; Graves, Jonathan P.

    2014-10-01

    Designed to accurately solve the motion of energetic particles in the presence of 3D magnetic fields, the VENUS-LEVIS code leans on a non-canonical general-coordinate Lagrangian formulation of the equations of motion. It switches between full-orbit particle following and guiding-centre tracing by verifying the perpendicular variation of the magnetic vector field, not only including gradient and curvature terms but also the shearing of field-lines. The criterion is particularly relevant for the study of fast ion redistribution in the kinked core of hybrid plasmas, where the compression of flux-surfaces against the axisymmetric outer mantle creates strongly varying magnetic field-lines and large parallel currents. Slowing-down simulations of NBI fast ions show that co-passing particles helically align on the opposite side of the plasma deformation whereas counter-passing particles are barely affected by the kinked structure. Results are compared with experimental neutron camera traces and FIDA measurements during long-lived modes (LLM).

  12. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  13. Modelling the formation of the circumnuclear ring in the Galactic centre

    NASA Astrophysics Data System (ADS)

    Mapelli, Michela; Trani, Alessandro A.

    2016-01-01

    Several thousand solar masses of molecular, atomic, and ionized gas lie in the innermost ~10 pc of our Galaxy. The most relevant structure of molecular gas is the circumnuclear ring (CNR), a dense and clumpy ring surrounding the supermassive black hole (SMBH), with a radius of ~2 pc. We propose that the CNR formed through the tidal disruption of a molecular cloud, and we investigate this scenario by means of N-body smoothed-particle hydrodynamics simulations. We ran a grid of simulations, varying cloud mass (4 × 10⁴, 1.3 × 10⁵ M⊙), initial orbital velocity (vin = 0.2-0.5 vesc, where vesc is the escape velocity from the SMBH), and impact parameter (b = 8, 26 pc). The disruption of the molecular cloud leads to the formation of very dense and clumpy gas rings containing most of the initial cloud mass. If the initial orbital velocity of the cloud is sufficiently low (vin < 0.4 vesc, for b = 26 pc) or the impact parameter is sufficiently small (b ≲ 10 pc, for vin > 0.5 vesc), at least two rings form around the SMBH: an inner ring (with radius ~0.4 pc) and an outer ring (with radius ~2-4 pc). The inner ring forms from low-angular momentum material that engulfs the SMBH during the first periapsis passage, while the outer ring forms later, during the subsequent periapsis passages of the disrupted cloud. The inner and outer rings are misaligned by ~24 degrees because they form from different gas streamers, which are affected by the SMBH gravitational focusing in different ways. The outer ring matches several properties (mass, rotation velocity, temperature, clumpiness) of the CNR in our Galactic centre. We speculate that the inner ring might account for the neutral gas observed in the central cavity.

  14. Simulating spin models on GPU

    NASA Astrophysics Data System (ADS)

    Weigel, Martin

    2011-09-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.
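As a CPU reference for the kind of spin-model simulation discussed, here is a minimal serial Metropolis sweep of the 2D Ising model; GPU implementations parallelise the same update via a checkerboard (red/black) decomposition so that simultaneously updated spins never neighbour each other. The lattice size, temperature and sweep count are arbitrary choices for the sketch.

```python
import math
import random

def metropolis_ising(L=16, beta=0.3, sweeps=200, seed=1):
    """Serial Metropolis sweeps of the 2D Ising model (J = 1, periodic
    boundaries); returns the magnetisation per spin after `sweeps` sweeps."""
    rng = random.Random(seed)
    spin = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps):
        for i in range(L):
            for j in range(L):
                nb = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
                      + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
                dE = 2.0 * spin[i][j] * nb  # energy cost of flipping (i, j)
                if dE <= 0 or rng.random() < math.exp(-beta * dE):
                    spin[i][j] = -spin[i][j]
    return sum(sum(row) for row in spin) / (L * L)

m = metropolis_ising()
print(abs(m))  # small in the disordered phase (beta below beta_c ~ 0.4407)
```

On a GPU, each colour class of the checkerboard maps naturally onto thousands of threads, which is the source of the large speed-ups the article reports over single-core CPU runs.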

  15. Standard for Models and Simulations

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2016-01-01

    This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure essential requirements are applied to the design, development, and use of models and simulations (M&S), while ensuring acceptance criteria are defined by the program/project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which M&S may be developed, accepted, and used in support of NASA activities. As the M&S disciplines employed and application areas involved are broad, the common aspects of M&S across all NASA activities are addressed. The discipline-specific details of a given M&S should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with M&S-influenced decisions by ensuring the complete communication of the credibility of M&S results.

  16. Business Models of High Performance Computing Centres in Higher Education in Europe

    ERIC Educational Resources Information Center

    Eurich, Markus; Calleja, Paul; Boutellier, Roman

    2013-01-01

    High performance computing (HPC) service centres are a vital part of the academic infrastructure of higher education organisations. However, despite their importance for research and the necessary high capital expenditures, business research on HPC service centres is mostly missing. From a business perspective, it is important to find an answer to…

  17. The perioperative surgical home: An innovative, patient-centred and cost-effective perioperative care model.

    PubMed

    Desebbe, Olivier; Lanz, Thomas; Kain, Zeev; Cannesson, Maxime

    2016-02-01

    Contrary to the intraoperative period, the current perioperative environment is known to be fragmented and expensive. One of the potential solutions to this problem is the newly proposed perioperative surgical home (PSH) model of care. The PSH is a patient-centred micro healthcare system, which begins at the time the decision for surgery is made, is continuous through the perioperative period and concludes 30 days after discharge from the hospital. The model is based on multidisciplinary involvement: coordination of care, consistent application of best evidence/best practice protocols, full transparency with continuous monitoring and reporting of safety, quality, and cost data to optimize and decrease variation in care practices. To reduce said variation in care, the entire continuum of the perioperative process must evolve into a unique care environment handled by one perioperative team and coordinated by a leader. Anaesthesiologists are ideally positioned to lead this new model and thus significantly contribute to the highest standards in transitional medicine. The unique characteristics that place Anaesthesiologists in this framework include their systematic role in hospitals (as coordinators between patients/medical staff and institutions), the culture of safety and health care metrics innate to the specialty, and a significant role in the preoperative evaluation and counselling process, making them ideal leaders in perioperative medicine. PMID:26613678

  18. Rule-based simulation models

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Seraphine, Kathleen M.

    1991-01-01

    Procedural modeling systems, rule-based modeling systems, and a method for converting a procedural model to a rule-based model are described. Simulation models are used to represent real-time engineering systems. A real-time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling system languages are based on FORTRAN or some other procedural language; therefore, they must be enhanced with a reaction capability. Rule-based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule-based system can be generated by a knowledge acquisition tool or a source-level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule-based system. Neural models can provide the high-capacity data manipulation required by the most complex real-time models.
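The conversion described above can be sketched concretely. The `Network` and rule names below are invented for illustration (they are not from the paper): each rule re-fires whenever all of its inputs are known and any of them changes, which is what makes the translated model reactive rather than procedural.

```python
# Hypothetical sketch: a procedural calculation decomposed into a reactive
# network of rules. Names (Network, add_rule) are illustrative only.

class Network:
    def __init__(self):
        self.values = {}
        self.rules = []                      # (output, inputs, function)

    def add_rule(self, output, inputs, fn):
        self.rules.append((output, inputs, fn))

    def set(self, name, value):
        self.values[name] = value
        self._propagate()

    def _propagate(self):
        changed = True
        while changed:                       # fire rules until quiescent
            changed = False
            for out, ins, fn in self.rules:
                if all(i in self.values for i in ins):
                    new = fn(*(self.values[i] for i in ins))
                    if self.values.get(out) != new:
                        self.values[out] = new
                        changed = True

# The procedural statement y = (a + b) * c becomes two elementary rules:
net = Network()
net.add_rule("s", ["a", "b"], lambda a, b: a + b)
net.add_rule("y", ["s", "c"], lambda s, c: s * c)
net.set("a", 2); net.set("b", 3); net.set("c", 4)
```

Changing any input afterwards (e.g. `net.set("a", 5)`) automatically re-propagates through the network, mimicking the reaction capability the abstract describes.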

  19. Development of a Mathematical Dynamic Simulation Model for the New Motion Simulator Used for the Large Space Simulator at ESTEC

    NASA Astrophysics Data System (ADS)

    Messing, Rene

    2012-07-01

    To simulate environmental space conditions for spacecraft qualification testing, the European Space Agency ESA uses a Large Space Simulator (LSS) in its Test Centre in Noordwijk, the Netherlands. In the LSS a motion system is used to orient a spacecraft of up to five tons with respect to an artificial solar beam. The existing motion simulator will be replaced by a new motion system. The new motion system shall be able to orient a spacecraft to an attitude defined by its elevation and azimuth angles and provide an eclipse simulation (continuous spinning) around the spacecraft rotation axis. The development of the new motion system has been contracted to APCO Technologies in Switzerland. In addition to the design development done by the contractor, the Engineering section of the ESTEC Test Centre is developing, in parallel, a mathematical model simulating the dynamic behaviour of the system. During the preliminary design, the model shall serve to verify the selection of the drive units and to define the specimen trajectory speed and acceleration profiles. In the further design phase it shall verify the dynamic response, at the spacecraft mounting interface of the unloaded system, against the requirements. In the future it shall predict the dynamic responses of the implemented system for different spacecraft mounted and operated on the system. The paper gives a brief description of the investment history and design developments of the new motion system for the LSS, and then describes the different development steps which are foreseen and which have already been implemented in the mathematical simulation model.

  20. The community care model of the Intercountry Centre for Oral Health at Chiangmai, Thailand.

    PubMed

    Anumanrajadhon, T; Rajchagool, S; Nitisiri, P; Phantumvanit, P; Songpaisan, Y; Barmes, D E; Sardo-Infirri, J; Davies, G N; Møller, I J; Pilot, T

    1996-08-01

    The Intercountry Centre for Oral Health opened in Chiangmai, Thailand, in November 1981. In 1984, as part of its mandate to promote new approaches to the delivery of oral health care, it initiated a demonstration project known as the Community Care Model for Oral Health. Logistic, financial and organisational difficulties prevented the full implementation of the original plan. Nevertheless, consideration of the strengths and weaknesses of the Model has provided valuable suggestions for adoption by national and international health agencies interested in adopting a primary health care approach to the delivery of oral health services. Important features which could be appropriate for disadvantaged communities include: integration into the existing health service infrastructure; emphasis on health promotion and prevention; minimal clinical interventions; an in-built monitoring and evaluation system based on epidemiological principles; full community participation in planning and implementation; the establishment of specific targets and goals; the instruction of all health personnel, teachers and senior students in the basic principles of the recognition, prevention and control of oral diseases and conditions; the application of relevant principles of Performance Logic to training; and the provision of a clear career path for all health personnel. PMID:9147120

  1. Deformation twinning in small-sized face-centred cubic single crystals: Experiments and modelling

    NASA Astrophysics Data System (ADS)

    Liang, Z. Y.; Huang, M. X.

    2015-12-01

    Small-sized crystals generally show deformation behaviour distinct from their bulk counterparts. In addition to dislocation slip, deformation twinning in small-sized face-centred cubic (FCC) single crystals has been reported to follow a different mechanism, which involves coherent emission of partial dislocations on successive {111} planes from the free surface. The present work employed a twinning-induced plasticity (TWIP) steel with a low stacking fault energy to systematically investigate twin evolution in small-sized FCC single crystals. Micrometre-sized single-crystal pillars of TWIP steel were fabricated by focused ion beam milling and then strained to different levels in compression experiments. Detailed transmission electron microscopy characterization was carried out to obtain a quantitative evaluation of the deformation twins, which contribute most of the plastic strain. Emissions of partial dislocations from the free surface (surface sources) and from pre-existing perfect dislocations inside the pillar (inner sources) are found to be the essential processes for the formation of deformation twins. Accordingly, a physically-based model, which integrates source introduction methods and source activation criteria for partial dislocation emission, is developed to quantitatively predict the twin evolution. The model is able to reproduce the experimental twin evolution in terms of the total twin formation, the twin morphology and the occurrence of twinning bursts.

  2. Electricity Generation Cost Simulation Model

    SciTech Connect

    2003-04-25

    The Electricity Generation Cost Simulation Model (GENSIM) is a user-friendly, high-level dynamic simulation model that calculates electricity production costs for a variety of electricity generation technologies, including: pulverized coal, gas combustion turbine, gas combined cycle, nuclear, solar (PV and thermal), and wind. The model allows the user to quickly conduct sensitivity analyses on key variables, including: capital, O&M, and fuel costs; interest rates; construction time; heat rates; and capacity factors. The model also includes consideration of a wide range of externality costs and pollution control options for carbon dioxide, nitrogen oxides, sulfur dioxide, and mercury. Two different data sets are included in the model: one from the U.S. Department of Energy (DOE) and the other from Platt's Research Group. Likely users of this model include executives and staff in the Congress, the Administration and private industry (power plant builders, industrial electricity users and electric utilities). The model seeks to improve understanding of the economic viability of various generating technologies and their emission trade-offs. The base-case results using the DOE data indicate that, in the absence of externality costs or renewable tax credits, pulverized coal and gas combined cycle plants are the least-cost alternatives at 3.7 and 3.5 cents/kWh, respectively. A complete sensitivity analysis on fuel, capital, and construction time shows that the results for coal and gas are much more sensitive to assumptions about fuel prices than to capital costs or construction times. The results also show that making nuclear competitive with coal or gas requires significant reductions in capital costs, to the $1000/kW level, if no other changes are made.
For renewables, the results indicate that wind is now competitive with the nuclear option, and is competitive with coal and gas for grid-connected applications only if one includes the federal production tax credit.
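As a rough illustration of the kind of production-cost arithmetic GENSIM automates, the sketch below levelizes capital, O&M and fuel costs into cents/kWh. The formula is a generic capital-recovery-factor calculation and every input number is an invented placeholder, not GENSIM's actual implementation or data.

```python
# Generic levelized-cost sketch over the same kinds of inputs GENSIM exposes
# (capital cost, O&M, fuel cost, interest rate, plant life, heat rate,
# capacity factor). All numbers below are made-up placeholders.

def lcoe_cents_per_kwh(capital_per_kw, fixed_om_per_kw_yr, fuel_per_mmbtu,
                       heat_rate_btu_per_kwh, interest, life_yr,
                       capacity_factor):
    # Capital recovery factor annualizes the up-front investment.
    crf = interest * (1 + interest) ** life_yr / ((1 + interest) ** life_yr - 1)
    kwh_per_kw_yr = 8760 * capacity_factor
    capital = capital_per_kw * crf / kwh_per_kw_yr       # $/kWh
    fixed_om = fixed_om_per_kw_yr / kwh_per_kw_yr        # $/kWh
    fuel = fuel_per_mmbtu * heat_rate_btu_per_kwh / 1e6  # $/kWh
    return 100.0 * (capital + fixed_om + fuel)           # cents/kWh

# Placeholder gas combined-cycle inputs (invented):
gas_cc = lcoe_cents_per_kwh(600, 15, 3.5, 7000, 0.08, 30, 0.85)
```

With these toy inputs the result lands in the low-3 cents/kWh range, in the same ballpark as the base-case figures quoted in the abstract; sensitivity analysis amounts to re-running the function while sweeping one input at a time.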

  3. Electricity Generation Cost Simulation Model

    Energy Science and Technology Software Center (ESTSC)

    2003-04-25

    The Electricity Generation Cost Simulation Model (GENSIM) is a user-friendly, high-level dynamic simulation model that calculates electricity production costs for a variety of electricity generation technologies, including: pulverized coal, gas combustion turbine, gas combined cycle, nuclear, solar (PV and thermal), and wind. The model allows the user to quickly conduct sensitivity analyses on key variables, including: capital, O&M, and fuel costs; interest rates; construction time; heat rates; and capacity factors. The model also includes consideration of a wide range of externality costs and pollution control options for carbon dioxide, nitrogen oxides, sulfur dioxide, and mercury. Two different data sets are included in the model: one from the U.S. Department of Energy (DOE) and the other from Platt's Research Group. Likely users of this model include executives and staff in the Congress, the Administration and private industry (power plant builders, industrial electricity users and electric utilities). The model seeks to improve understanding of the economic viability of various generating technologies and their emission trade-offs. The base-case results using the DOE data indicate that, in the absence of externality costs or renewable tax credits, pulverized coal and gas combined cycle plants are the least-cost alternatives at 3.7 and 3.5 cents/kWh, respectively. A complete sensitivity analysis on fuel, capital, and construction time shows that the results for coal and gas are much more sensitive to assumptions about fuel prices than to capital costs or construction times. The results also show that making nuclear competitive with coal or gas requires significant reductions in capital costs, to the $1000/kW level, if no other changes are made.
For renewables, the results indicate that wind is now competitive with the nuclear option, and is competitive with coal and gas for grid-connected applications only if one includes the federal production tax credit.

  4. Towards Sustainable Research Capacity Development and Research Ownership for Academic Institutes in Developing Countries: The Malawian Research Support Centre Model

    ERIC Educational Resources Information Center

    Gomo, Exnevia; Kalilani, Linda; Mwapasa, Victor; Trigu, Chifundo; Phiri, Kamija; Schmidt, Joann; van Hensbroek, Michael Boele

    2011-01-01

    In lesser-developed African countries, the lack of institutionalised support for research, combined with limited career opportunities and poor remuneration, have contributed to weak research infrastructure and capacity, and a continuing brain drain to developed countries. Malawi's Research Support Centre (RSC) model is novel in that it provides a…

  5. SEMI Modeling and Simulation Roadmap

    SciTech Connect

    Hermina, W.L.

    2000-10-02

    With the exponential growth in the power of computing hardware and software, modeling and simulation is becoming a key enabler for the rapid design of reliable Microsystems. One vision of the future microsystem design process would include the following primary software capabilities: (1) The development of 3D part design, through standard CAD packages, with automatic design rule checks that guarantee the manufacturability and performance of the microsystem. (2) Automatic mesh generation, for 3D parts as manufactured, that permits computational simulation of the process steps, and the performance and reliability analysis for the final microsystem. (3) Computer generated 2D layouts for process steps that utilize detailed process models to generate the layout and process parameter recipe required to achieve the desired 3D part. (4) Science-based computational tools that can simulate the process physics, and the coupled thermal, fluid, structural, solid mechanics, electromagnetic and material response governing the performance and reliability of the microsystem. (5) Visualization software that permits the rapid visualization of 3D parts including cross-sectional maps, performance and reliability analysis results, and process simulation results. In addition to these desired software capabilities, a desired computing infrastructure would include massively parallel computers that enable rapid high-fidelity analysis, coupled with networked compute servers that permit computing at a distance. We now discuss the individual computational components that are required to achieve this vision. There are three primary areas of focus: design capabilities, science-based capabilities and computing infrastructure. Within each of these areas, there are several key capability requirements.

  6. Muscle contributions to centre of mass acceleration during turning gait in typically developing children: A simulation study.

    PubMed

    Dixon, Philippe C; Jansen, Karen; Jonkers, Ilse; Stebbins, Julie; Theologis, Tim; Zavatsky, Amy B

    2015-12-16

    Turning while walking requires substantial joint kinematic and kinetic adaptations compared to straight walking in order to redirect the body centre of mass (COM) towards the new walking direction. The role of muscles and external forces in controlling and redirecting the COM during turning remains unclear. The aim of this study was to compare the contributors to COM medio-lateral acceleration during 90° pre-planned turns about the inside limb (spin) and straight walking in typically developing children. Simulations of straight walking and turning gait based on experimental motion data were implemented in OpenSim. The contributors to COM global medio-lateral acceleration during the approach (outside limb) and turn (inside limb) stance phase were quantified via an induced acceleration analysis. Changes in medio-lateral COM acceleration occurred during both turning phases, compared to straight walking (p<0.001). During the approach, outside limb plantarflexors (soleus and medial gastrocnemius) contribution to lateral (away from the turn side) COM acceleration was reduced (p<0.001), whereas during the turn, inside limb plantarflexors (soleus and gastrocnemii) contribution to lateral acceleration (towards the turn side) increased (p≤0.013) and abductor (gluteus medius and minimus) contribution medially decreased (p<0.001), compared to straight walking, together helping accelerate the COM towards the new walking direction. Knowledge of the changes in muscle contributions required to modulate the COM position during turning improves our understanding of the control mechanisms of gait and may be used clinically to guide the management of gait disorders in populations with restricted gait ability. PMID:26555714

  7. Simulated annealing model of acupuncture

    NASA Astrophysics Data System (ADS)

    Shang, Charles; Szu, Harold

    2015-05-01

    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can perturb a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) than perturbation at non-singular points (placebo control points). This difference diminishes as the number of perturbed points increases, due to the wider distribution of the limited self-organizing activity. The model explains the following facts from systematic reviews of acupuncture trials: (1) a properly chosen single-acupoint treatment for a certain disorder can lead to highly repeatable efficacy above placebo; (2) when multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old, frail and have multiple concurrent disorders, as the number of local optima or comorbidities increases; (3) as the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity, severity and patient age. This is the first biological-physical model of acupuncture that can predict and guide clinical acupuncture research.
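For readers unfamiliar with the algorithm the analogy rests on, here is a minimal simulated-annealing sketch: a higher starting temperature (the "stronger excitation" above) lets the search escape poor local optima before the system cools and settles. The objective function and all parameters are toy choices, not part of the model.

```python
# Minimal simulated-annealing sketch on a toy double-well objective.
import math, random

def anneal(f, x, temp=2.0, cooling=0.95, steps=500, seed=0):
    rng = random.Random(seed)
    fx = best_f = f(x)
    best_x = x
    for _ in range(steps):
        x_new = x + rng.gauss(0.0, 0.5)
        f_new = f(x_new)
        # Always accept downhill moves; accept uphill with Boltzmann probability.
        if f_new < fx or rng.random() < math.exp((fx - f_new) / max(temp, 1e-9)):
            x, fx = x_new, f_new
            if fx < best_f:
                best_x, best_f = x, fx
        temp *= cooling           # gradual cooling narrows the search
    return best_x, best_f

# Double-well objective: shallow optimum near x = 2, deeper one near x = -1.
f = lambda x: (x - 2) ** 2 * (x + 1) ** 2 + 0.5 * x
x_best, f_best = anneal(f, 2.0)
```

Started at the shallow well (x = 2), the search can still end up at or below that well's value precisely because early high-temperature moves accept some uphill steps, which is the behaviour the abstract maps onto acupoint perturbation.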

  8. The MAST-edge centred lumped scheme for the flow simulation in variably saturated heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Aricò, Costanza; Sinagra, Marco; Tucciarelli, Tullio

    2012-02-01

    A novel methodology is proposed for the solution of the flow equation in a variably saturated heterogeneous porous medium. The computational domain is discretized using triangular meshes, and the governing PDEs are discretized using a numerical technique lumped at the edge centres. The dependent unknown variable of the problem is the piezometric head. A fractional time-step methodology is applied for the solution of the original system, solving consecutively a prediction and a correction problem. A scalar potential of the flow field exists, and in the prediction step a MArching in Space and Time (MAST) formulation is applied for the sequential solution of the ordinary differential equations of the cells, ordered according to their potential value computed at the beginning of the time step. In the correction step, the solution of a large linear system with order equal to the number of edges is required. A semi-analytical procedure is also proposed for the solution of the prediction step. The computational performance, the order of convergence and the mass balance error have been estimated in several tests and compared with the results of other literature models.
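The marching idea can be illustrated schematically: cells are ordered by a scalar potential and their storage ODEs are solved one at a time, so each cell receives the already-computed outflow of its upstream neighbours. The linear-reservoir network below is an invented illustration of that ordering, not the paper's actual scheme.

```python
# Schematic MAST-style prediction step over a network of linear reservoirs.
# The cell names, topology and reservoir law are invented for illustration.
import math

def mast_step(potential, downstream, storage, k, dt):
    """One marching step: solve each cell's ODE in descending-potential order."""
    order = sorted(storage, key=lambda c: potential[c], reverse=True)
    inflow = {c: 0.0 for c in storage}
    outflow = {}
    for c in order:                        # march from high to low potential
        q_in, s0 = inflow[c], storage[c]
        # Exact solution of ds/dt = q_in - k*s over one time step.
        s1 = q_in / k + (s0 - q_in / k) * math.exp(-k * dt)
        q_out = q_in - (s1 - s0) / dt      # mean outflow conserves mass
        storage[c], outflow[c] = s1, q_out
        for d in downstream.get(c, []):    # feed already-computed outflow
            inflow[d] += q_out
    return outflow

# Three cells in a chain a -> b -> c, ordered by a scalar potential:
potential = {"a": 3.0, "b": 2.0, "c": 1.0}
downstream = {"a": ["b"], "b": ["c"]}
storage = {"a": 1.0, "b": 0.0, "c": 0.0}
out = mast_step(potential, downstream, storage, k=1.0, dt=0.1)
```

Because each cell's ODE is solved with its upstream inflow already known, the sweep is sequential rather than globally coupled; mass is conserved exactly, with the only net loss being the outflow of the lowest-potential cell.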

  9. Modelling and simulation of radiotherapy

    NASA Astrophysics Data System (ADS)

    Kirkby, Norman F.

    2007-02-01

    In this paper, models are described which have been developed to model both the way in which a population of cells respond to radiation and the way in which a population of patients respond to radiotherapy to assist the conduct of clinical trials in silico. Population balance techniques have been used to simulate the age distribution of tumour cells in the cell cycle. Sensitivity to radiation is not constant round the cell cycle and a single fraction of radiation changes the age distribution. Careful timing of further fractions of radiation can be used to maximize the damage delivered to the tumour while minimizing damage to normal tissue. However, tumour modelling does not necessarily predict patient outcome. A separate model has been established to predict the course of a brain cancer called glioblastoma multiforme (GBM). The model considers the growth of the tumour and its effect on the normal brain. A simple representation is included of the health status of the patient and hence the type of treatment offered. It is concluded that although these and similar models have a long way yet to be developed, they are beginning to have an impact on the development of clinical practice.

  10. Uterine Contraction Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Liu, Miao; Belfore, Lee A.; Shen, Yuzhong; Scerbo, Mark W.

    2010-01-01

    Building a training system for medical personnel to properly interpret fetal heart rate tracings requires developing accurate models that can relate various signal patterns to certain pathologies. In addition to modeling the fetal heart rate signal itself, the change of uterine pressure, which bears a strong relation to fetal heart rate and provides indications of maternal and fetal status, should also be considered. In this work, we have developed a group of parametric models to simulate uterine contractions during labor and delivery. Through analysis of real patient records, we propose to model uterine contraction signals by three major components: regular contractions, impulsive noise caused by fetal movements, and low-amplitude noise invoked by maternal breathing and the measuring apparatus. The regular contractions are modeled by an asymmetric generalized Gaussian function, and least squares estimation is used to compute the parameter values of the asymmetric generalized Gaussian function based on uterine contractions of real patients. Regular contractions are detected based on thresholding and derivative analysis of uterine contractions. Impulsive noise caused by fetal movements and low-amplitude noise by maternal breathing and measuring apparatus are modeled by rational polynomial functions and Perlin noise, respectively. Experimental results show the synthesized uterine contractions can mimic real uterine contractions realistically, demonstrating the effectiveness of the proposed algorithm.
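A minimal sketch of the asymmetric generalized Gaussian component is shown below: separate width and shape parameters (alpha, beta) apply on the rising and falling sides of the peak, which is what makes the pulse asymmetric. Every parameter value is invented for illustration, not fitted from patient data.

```python
# Asymmetric generalized Gaussian pulse: amplitude A, peak time t0, and
# side-specific width/shape parameters. All numbers below are invented.
import math

def asym_gen_gaussian(t, A, t0, alpha_rise, beta_rise, alpha_fall, beta_fall):
    alpha, beta = (alpha_rise, beta_rise) if t < t0 else (alpha_fall, beta_fall)
    return A * math.exp(-((abs(t - t0) / alpha) ** beta))

# A toy contraction peaking at t0 = 30 s with a slower falling side:
curve = [asym_gen_gaussian(t, 50.0, 30.0, 8.0, 2.0, 12.0, 1.5)
         for t in range(61)]
```

In a fitting setting, the six parameters would be estimated per contraction by least squares against the recorded pressure trace, as the abstract describes.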

  11. Plasma disruption modeling and simulation

    SciTech Connect

    Hassanein, A.

    1994-07-01

    Disruptions in tokamak reactors are considered a limiting factor to successful operation and a reliable design. The behavior of plasma-facing components during a disruption is critical to the overall integrity of the reactor. Erosion of plasma-facing material (PFM) surfaces due to the thermal energy dump during a disruption can severely limit the lifetime of these components and thus diminish the economic feasibility of the reactor. Initially, the incident plasma particles deposit their energy directly on the PFM surface, heating it to a very high temperature where ablation occurs. Models for plasma-material interactions have been developed and used to predict material thermal evolution during the disruption. Within a few microseconds after the start of the disruption, enough material is vaporized to intercept most of the incoming plasma particles. Models for plasma-vapor interactions are necessary to predict vapor cloud expansion and hydrodynamics. Continuous heating of the vapor cloud above the material surface by the incident plasma particles will excite, ionize, and cause vapor atoms to emit thermal radiation. Accurate models for radiation transport in the vapor are essential for calculating the net radiated flux to the material surface, which determines the final erosion thickness and consequently the component lifetime. A comprehensive model that takes into account the various stages of plasma-material interaction has been developed and used to predict erosion rates during reactor disruptions, as well as during induced disruptions in laboratory experiments. Differences between various simulation experiments and reactor conditions are discussed. A two-dimensional radiation transport model has been developed specifically to simulate the effect of the small test samples used in laboratory disruption experiments.

  12. A modular BLSS simulation model

    NASA Technical Reports Server (NTRS)

    Rummel, John D.; Volk, Tyler

    1987-01-01

    A bioregenerative life support system (BLSS) for extraterrestrial use will be faced with coordination problems more acute than those in any ecosystem found on Earth. A related problem in BLSS design is providing an interface between the various life support processors, one that will allow for their coordination while still allowing for system expansion. A modular model is presented of a BLSS that interfaces system processors only with the material storage reservoirs, allowing those reservoirs to act as the principal buffers in the system and thus minimizing difficulties with processor coordination. The modular nature of the model allows independent development of the detailed submodels that exist within the model framework. Using this model, BLSS dynamics were investigated under normal conditions and under various failure modes. Partial and complete failures of various components, such as the waste processors or the plants themselves, drive transient responses in the model system, allowing the examination of the effectiveness of the system reservoirs as buffers. The results from simulations help to determine control strategies and BLSS design requirements. An evolved version could be used as an interactive control aid in a future BLSS.

  13. Ubiquitin: molecular modeling and simulations.

    PubMed

    Ganoth, Assaf; Tsfadia, Yossi; Wiener, Reuven

    2013-11-01

    The synthesis and destruction of proteins are imperative for maintaining their cellular homeostasis. In the 1970s, Aaron Ciechanover, Avram Hershko, and Irwin Rose discovered that certain proteins are tagged by ubiquitin before degradation, a discovery that awarded them the 2004 Nobel Prize in Chemistry. Compelling data gathered during the last several decades show that ubiquitin plays a vital role not only in protein degradation but also in many cellular functions including DNA repair processes, cell cycle regulation, cell growth, immune system functionality, hormone-mediated signaling in plants, vesicular trafficking pathways, regulation of histone modification and viral budding. Due to the involvement of ubiquitin in such a large number of diverse cellular processes, flaws and impairments in the ubiquitin system were found to be linked to cancer, neurodegenerative diseases, genetic disorders, and immunological disorders. Hence, deciphering the dynamics and complexity of the ubiquitin system is of significant importance. In addition to experimental techniques, computational methodologies have been gaining increasing influence in protein research and are used to uncover the structure, stability, folding, mechanism of action and interactions of proteins. Notably, molecular modeling and molecular dynamics simulations have become powerful tools that bridge the gap between structure and function while providing dynamic insights and illustrating essential mechanistic characteristics. In this study, we present an overview of molecular modeling and simulations of ubiquitin and the ubiquitin system, evaluate the status of the field, and offer our perspective on future progress in this area of research. PMID:24113788

  14. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

    The Aerospace Systems Design Laboratory at the School of Aerospace Engineering in Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. This propulsion system design environment is created with commercially available software called iSIGHT, which is a generic computational framework, and with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and facilitate information transfer amongst the various codes. The resulting modeling and simulation (M&S) environment, in conjunction with the response surface method, provides the designer/decision-maker an analytical means to examine the entire design space from either a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.

  15. A neural-network reinforcement-learning model of domestic chicks that learn to localize the centre of closed arenas.

    PubMed

    Mannella, Francesco; Baldassarre, Gianluca

    2007-03-29

    Previous experiments have shown that when domestic chicks (Gallus gallus) are first trained to locate food elements hidden at the centre of a closed square arena and then are tested in a square arena of double the size, they search for food both at its centre and at a distance from walls similar to the distance of the centre from the walls experienced during training. This paper presents a computational model that successfully reproduces these behaviours. The model is based on a neural-network implementation of the reinforcement-learning actor-critic architecture (in this architecture the 'critic' learns to evaluate perceived states in terms of predicted future rewards, while the 'actor' learns to increase the probability of selecting the actions that lead to higher evaluations). The analysis of the model suggests which type of information and cognitive mechanisms might underlie chicks' behaviours: (i) the tendency to explore the area at a specific distance from walls might be based on the processing of the height of walls' horizontal edges, (ii) the capacity to generalize the search at the centre of square arenas independently of their size might be based on the processing of the relative position of walls' vertical edges on the horizontal plane (equalization of walls' width), and (iii) the whole behaviour exhibited in the large square arena can be reproduced by assuming the existence of an attention process that, at each time, focuses chicks' internal processing on either one of the two previously discussed information sources. The model also produces testable predictions regarding the generalization capabilities that real chicks should exhibit if trained in circular arenas of varying size. The paper also highlights the potentialities of the model to address other experiments on animals' navigation and analyses its strengths and weaknesses in comparison to other models. PMID:17255019
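The actor-critic loop described above can be sketched on a toy one-state task (invented rewards; this is not the chick model itself): the critic's temporal-difference (TD) error both updates the value estimate and nudges the actor's action preferences, so actions that beat the critic's prediction become more probable.

```python
# Minimal actor-critic sketch: one state, two actions, invented rewards.
import math, random

def softmax(prefs):
    m = max(prefs)
    exps = [math.exp(p - m) for p in prefs]
    return [e / sum(exps) for e in exps]

rng = random.Random(0)
prefs = [0.0, 0.0]        # actor: preferences over the two actions
value = 0.0               # critic: value estimate of the single state
alpha, beta = 0.1, 0.1    # critic / actor learning rates
reward = [0.0, 1.0]       # action 1 is the rewarding one

for _ in range(2000):
    probs = softmax(prefs)
    a = 0 if rng.random() < probs[0] else 1
    td_error = reward[a] - value            # one-step, terminal task
    value += alpha * td_error               # critic update
    for i in range(2):                      # policy-gradient actor update
        grad = (1.0 if i == a else 0.0) - probs[i]
        prefs[i] += beta * td_error * grad
```

After training, the softmax policy concentrates on the rewarding action while the critic's value converges towards the expected reward under that policy.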

  16. Lessons learned from family-centred models of treatment for children living with HIV: current approaches and future directions

    PubMed Central

    2010-01-01

    Background Despite strong global interest in family-centred HIV care models, no reviews exist that detail the current approaches to family-centred care and their impact on the health of children with HIV. A systematic review of family-centred HIV care programmes was conducted in order to describe both programme components and paediatric cohort characteristics. Methods We searched online databases, including PubMed and the International AIDS Society abstract database, using systematic criteria. Data were extracted regarding programme setting, staffing, services available and enrolment methods, as well as cohort demographics and paediatric outcomes. Results The search yielded 25 publications and abstracts describing 22 separate cohorts. These contained between 43 and 657 children, and varied widely in terms of staffing, services provided, enrolment methods and cohort demographics. Data on clinical outcomes were limited but generally positive. Excellent adherence, retention in care, and low mortality and/or loss to follow-up were documented. Conclusions The family-centred model of care addresses many needs of infected patients and other household members. Major reported obstacles involved recruiting one or more types of family members into care, early diagnosis and treatment of infected children, preventing mortality during children's first six months of highly active antiretroviral therapy, and staffing and infrastructural limitations. Recommendations include: developing interventions to enrol hard-to-reach populations; identifying high-risk patients at treatment initiation and providing specialized care; and designing and implementing evidence-based care packages. Increased research on family-centred care, and better documentation of interventions and outcomes, is also critical. PMID:20573285

  17. An Evaluation of the Plant Density Estimator the Point-Centred Quarter Method (PCQM) Using Monte Carlo Simulation

    PubMed Central

    Khan, Md Nabiul Islam; Hijbeek, Renske; Berger, Uta; Koedam, Nico; Grueters, Uwe; Islam, S. M. Zahirul; Hasan, Md Asadul; Dahdouh-Guebas, Farid

    2016-01-01

    Background In the Point-Centred Quarter Method (PCQM), the mean distance of the first nearest plants in each quadrant of a number of random sample points is converted to plant density. It is a quick method for plant density estimation. In recent publications, the estimator equations of simple PCQM (PCQM1) and higher order ones (PCQM2 and PCQM3, which use the distances of the second and third nearest plants, respectively) show discrepancies. This study attempts to review PCQM estimators in order to find the most accurate equation form. We tested the accuracy of different PCQM equations using Monte Carlo simulations in simulated plant populations (having ‘random’, ‘aggregated’ and ‘regular’ spatial patterns) and empirical ones. Principal Findings PCQM requires at least 50 sample points to ensure a desired level of accuracy. PCQM with a corrected estimator is more accurate than with a previously published estimator. The published PCQM versions (PCQM1, PCQM2 and PCQM3) show significant differences in accuracy of density estimation, i.e. the higher order PCQM provides higher accuracy. However, the corrected PCQM versions show no significant differences among them as tested in various spatial patterns except in plant assemblages with a strong repulsion (plant competition). If N is the number of sample points and R is distance, the corrected estimator of PCQM1 is 4(4N − 1)/(π ∑ R²) but not 12N/(π ∑ R²), of PCQM2 is 4(8N − 1)/(π ∑ R²) but not 28N/(π ∑ R²) and of PCQM3 is 4(12N − 1)/(π ∑ R²) but not 44N/(π ∑ R²) as published. Significance If the spatial pattern of a plant association is random, PCQM1 with a corrected equation estimator and over 50 sample points would be sufficient to provide accurate density estimation. PCQM using just the nearest tree in each quadrant is therefore sufficient, which facilitates sampling of trees, particularly in areas with just a few hundred trees per hectare. PCQM3 provides the best density estimations for all
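    The corrected PCQM1 estimator quoted above, density = 4(4N − 1)/(π ∑ R²), can be exercised on a simulated random pattern. The sketch below is illustrative, not the paper's code: the unit-square arena, the toroidal wrapping used to sidestep edge effects, and the plant and sample counts are all assumptions.

```python
import math
import random

def pcqm1_density(points, samples):
    """Corrected PCQM1: density = 4(4N - 1) / (pi * sum(R^2)), where R is
    the distance to the nearest plant in each quadrant of each sample
    point. Distances wrap around a unit torus to avoid edge bias."""
    n = len(samples)
    sum_r2 = 0.0
    for sx, sy in samples:
        nearest = [None, None, None, None]        # squared distance per quadrant
        for px, py in points:
            dx = (px - sx + 0.5) % 1.0 - 0.5      # wrapped displacement
            dy = (py - sy + 0.5) % 1.0 - 0.5
            q = (0 if dx >= 0 else 1) + (0 if dy >= 0 else 2)
            d2 = dx * dx + dy * dy
            if nearest[q] is None or d2 < nearest[q]:
                nearest[q] = d2
        sum_r2 += sum(d for d in nearest if d is not None)
    return 4 * (4 * n - 1) / (math.pi * sum_r2)

rng = random.Random(42)
true_density = 400                                # plants per unit area
plants = [(rng.random(), rng.random()) for _ in range(true_density)]
sample_pts = [(rng.random(), rng.random()) for _ in range(60)]  # > 50 points
estimate = pcqm1_density(plants, sample_pts)
```

    With 60 sample points (240 quadrant distances) on a random pattern, the estimate typically lands within a few percent of the true density, consistent with the abstract's finding that 50-plus sample points suffice for random spatial patterns.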

  18. An approximate model for pulsar navigation simulation

    NASA Astrophysics Data System (ADS)

    Jovanovic, Ilija; Enright, John

    2016-02-01

    This paper presents an approximate model for the simulation of pulsar-aided navigation systems. High fidelity simulations of these systems are computationally intensive and impractical for simulating periods of a day or more. Simulation of yearlong missions is done by abstracting navigation errors as periodic Gaussian noise injections. This paper presents an intermediary approximate model to simulate position errors for periods of several weeks, useful for building more accurate Gaussian error models. This is done by abstracting photon detection and binning, replacing them with a simple deterministic process. The approximate model enables faster computation of error injection models, allowing the error model to be inexpensively updated throughout a simulation. Testing of the approximate model revealed optimistic performance predictions for non-millisecond pulsars, with more accurate predictions for pulsars in the millisecond regime. This performance gap was attributed to noise which is not present in the approximate model but can be predicted and added to improve accuracy.

  19. 3D modelling of soil texture: mapping and incertitude estimation in centre-France

    NASA Astrophysics Data System (ADS)

    Ciampalini, Rossano; Martin, Manuel P.; Saby, Nicolas P. A.; Richer de Forges, Anne C.; Nehlig, Pierre; Martelet, Guillaume; Arrouays, Dominique

    2014-05-01

    Soil texture is an important component of all soil physical-chemical processes. The spatial variability of soil texture plays a crucial role in the evaluation and modelling of all distributed processes. The objective of this study is to determine the spatial variation of soil granulometric fractions (i.e., clay, silt, sand) in the region "Centre" of France in relation to the main controlling factors, and to create extended maps of these properties following GlobalSoilMap specifications. For this purpose we used 2487 soil profiles of the French soil database (IGCS - Inventory Management and Soil Conservation), and continuous depth values of the properties within the soil profiles have been calculated with a quadratic-splines methodology, optimising the spline parameters in each soil profile. We used environmental covariates to predict soil properties within the region at depth intervals 0-5, 5-15, 15-30, 30-60, 60-100, and 100-200 cm. Concerning environmental covariates, we used SRTM and ASTER DEMs with 90 m and 30 m resolution, respectively, to generate terrain parameters and topographic indexes. Other covariates used are Gamma Ray maps, Corine land cover, and the available geological and soil maps of the region at scales 1M, 250k and 50k. Soil texture is modelled by applying compositional data analysis theory, namely the alr-transform (Aitchison, 1986), which accounts in the statistical calculations for the complementary dependence between the different granulometric classes (i.e. the 100% constraint). The prediction models of the alr-transformed variables have been developed using boosted regression trees (BRT) and then a LMM - Linear Mixed Model - that separates a fixed effect from a random effect related to the continuous spatially correlated variation of the property. In this case, the LMM is applied to the two co-regionalized properties (clay and sand alr-transforms). Model uncertainty mapping represents a practical way to describe efficiency and limits of
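    The alr-transform used above is straightforward to sketch for a three-part texture composition. The function names, the choice of sand as the reference part, and the example percentages are illustrative assumptions, not taken from the study.

```python
import math

def alr(composition):
    """Additive log-ratio transform (Aitchison, 1986): map a D-part
    composition to D-1 unconstrained coordinates, using the last part
    (here: sand) as the reference denominator."""
    *head, ref = composition
    return [math.log(x / ref) for x in head]

def alr_inverse(coords, total=100.0):
    """Back-transform alr coordinates to a composition summing to `total`."""
    expd = [math.exp(c) for c in coords] + [1.0]
    z = sum(expd)
    return [total * e / z for e in expd]

texture = [25.0, 35.0, 40.0]   # clay, silt, sand (%): sums to 100
coords = alr(texture)          # two unconstrained values, safe to model freely
recovered = alr_inverse(coords)
```

    The point of the transform is exactly the "100% constraint" mentioned in the abstract: regression models (BRT, LMM) can predict the two alr coordinates without any constraint, and the back-transform always yields fractions that are positive and sum to 100%.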

  20. The cultural route of present and lost landscapes in the centre of Bucharest - digital model

    NASA Astrophysics Data System (ADS)

    Bostenaru-Dan, Maria

    2015-04-01

    We are developing a digital model of the Magheru boulevard in central Bucharest. This N-S axis in the centre of the city is a unique encounter with interwar architecture. It is a protected area in the city, with buildings listed individually or as groups of monuments, and also with protection at the urban planning level. At the same time, however, the landscape does not facilitate the building of urban routes between monuments. A GIS model of the area exists, but does not yet take into account the heritage value of the buildings, having been developed in a civil engineering environment. It is also one of the few partial 3D models of Bucharest. It allows datascapes of various building characteristics. At the same time, a 3D model which covers all items in an area equally is resource-expensive. Hence, similarly to strategic planning, we propose a Kevin Lynch-type selection. Landmarks will be identified as nodes of the routes, and the remaining area treated as a zone. Ways connect the nodes, and, as we will show, we paid special attention to their landscape. We developed a concept on how to build further on the idea of layers in GIS to include the issue of scale. As such, floor plans can provide strategic points for the nodes of the route, as in Nolli or Sitte plans. Cooperation between GIS and Google Earth is envisaged, since Google Earth allows for detailing in SketchUp for the interior space. In this way we developed an alternative digital model to the levels of detail of CityGML, the classical standard for 3D city models. The route itself is to be analysed with the method of Space Syntax. While this part of the research focused on the built heritage, on culture, we also included issues of landscape. First, the landscape of the boulevard has to be shaped so as to build the route between these nodes. Our concept includes the creation of pocket parks and of links between the pocket parks through vegetal and mineral elements to connect them. Existing urban spaces and empty plots are

  1. Maximizing the Impact of Telepractice through a Multifaceted Service Delivery Model at The Shepherd Centre, Australia

    ERIC Educational Resources Information Center

    Davis, Aleisha; Hopkins, Tracy; Abrahams, Yetta

    2012-01-01

    The Shepherd Centre is a nonprofit early intervention program in New South Wales, Australia, providing listening and spoken language services through an interdisciplinary team approach to children with hearing loss and their families. The program has been providing distance services to families in rural and remote areas of Australia and in other…

  2. Student Learning Centre (SLC) Embraces the New Melbourne Model of Teaching: Facilitating Collaborative Learning

    ERIC Educational Resources Information Center

    Ball, Sarah

    2010-01-01

    Learning is about discovery and change. As schools and universities look to the future, it is fundamental that they provide environments that facilitate collaborative learning and act as points for interaction and social activity. The redevelopment of the existing Engineering Library into a Student Learning Centre (SLC) embraces the new Melbourne…

  3. A Child-Centred Evaluation Model: Gaining the Children's Perspective in Evaluation Studies in China

    ERIC Educational Resources Information Center

    Fleer, Marilyn; Li, Liang

    2016-01-01

    In recent times there has been a major international push for giving voice to children in the provision of services for early education and development particularly among researchers and non-government organisations. However, what has been missing from this body of literature and activity is the children's perspective when centres and services are…

  4. The ICT Centre Model in Andalusia (Spain): Results of a Resolute Educational Policy

    ERIC Educational Resources Information Center

    Aguaded, J. Ignacio; Fandos, M.; Perez, M. Amor

    2009-01-01

    This paper displays some results from research carried out in Andalusia (Spain) to evaluate the impact of the educational innovation policy developed by the regional government through widely introducing Information and Communication Technologies (ICT) in primary and secondary schools (ICT Centres). Specifically, it analysed the effect of the…

  5. Aeroacoustic simulation for phonation modeling

    NASA Astrophysics Data System (ADS)

    Irwin, Jeffrey; Hanford, Amanda; Craven, Brent; Krane, Michael

    2011-11-01

    The phonation process occurs as air expelled from the lungs creates a pressure drop and a subsequent air flow across the larynx. The fluid-structure interaction between the turbulent air flow and oscillating vocal folds, combined with additional resonance in the oral and nasal cavities, creates much of what we hear in the human voice. As many voice-related disorders can be traced to irregular vocal tract shape or motion, it is important to understand in detail the physics involved in the phonation process. To numerically compute the physics of phonation, a solver must be able to accurately model acoustic airflow through a moving domain. The open-source CFD package OpenFOAM is currently being used to evaluate existing solvers against simple acoustic test cases, including an open-ended resonator and an expansion chamber, both of which utilize boundary conditions simulating acoustic sources as well as anechoic termination. Results of these test cases will be presented and compared with theory, and the future development of a three-dimensional vocal tract model and custom-mode acoustic solver will be discussed. We acknowledge the support of NIH grant 5R01DC005642 and the ARL E&F program.

  6. Modeling the Patient Journey from Injury to Community Reintegration for Persons with Acute Traumatic Spinal Cord Injury in a Canadian Centre

    PubMed Central

    Santos, Argelio; Gurling, James; Dvorak, Marcel F.; Noonan, Vanessa K.; Fehlings, Michael G.; Burns, Anthony S.; Lewis, Rachel; Soril, Lesley; Fallah, Nader; Street, John T.; Bélanger, Lise; Townson, Andrea; Liang, Liping; Atkins, Derek

    2013-01-01

    Background A patient’s journey through the health care system is influenced by clinical and system processes across the continuum of care. Methods To inform optimized access to care and patient flow for individuals with traumatic spinal cord injury (tSCI), we developed a simulation model that can examine the full impact of therapeutic or systems interventions across the care continuum for patients with traumatic spinal cord injuries. The objective of this paper is to describe the detailed development of this simulation model for a major trauma and a rehabilitation centre in British Columbia (BC), Canada, as part of the Access to Care and Timing (ACT) project and is referred to as the BC ACT Model V1.0. Findings To demonstrate the utility of the simulation model in clinical and administrative decision-making we present three typical scenarios that illustrate how an investigator can track the indirect impact(s) of medical and administrative interventions, both upstream and downstream along the continuum of care. For example, the model was used to estimate the theoretical impact of a practice that reduced the incidence of pressure ulcers by 70%. This led to a decrease in acute and rehabilitation length of stay of 4 and 2 days, respectively and a decrease in bed utilization of 9% and 3% in acute and rehabilitation. Conclusion The scenario analysis using the BC ACT Model V1.0 demonstrates the flexibility and value of the simulation model as a decision-making tool by providing estimates of the effects of different interventions and allowing them to be objectively compared. Future work will involve developing a generalizable national Canadian ACT Model to examine differences in care delivery and identify the ideal attributes of SCI care delivery. PMID:24023623

  7. An introduction to enterprise modeling and simulation

    SciTech Connect

    Ostic, J.K.; Cannon, C.E.

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  8. Structured building model reduction toward parallel simulation

    SciTech Connect

    Dobbs, Justin R.; Hencey, Brondon M.

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  9. Space station models, mockups and simulators

    NASA Technical Reports Server (NTRS)

    Miller, K. H.; Osgood, A.

    1985-01-01

    Schematic outlines for space station models, mockups, and simulators are presented. The types of Boeing models, mockups, and simulators are given along with the classes and characteristics. The use of models in the 767 program is briefly given. Computerized human factors tools are outlined. The use of computer aided design and computer aided manufacturing in the approach for the space station is advocated.

  10. Survey of models/simulations at RADC

    NASA Astrophysics Data System (ADS)

    Denz, M. L.

    1982-11-01

    A survey was conducted to evaluate the current state of the art and technology of model/simulation capabilities at Rome Air Development Center, Griffiss AFB, NY. This memo presents a tabulation of 28 such models/simulations. These models/simulations are being used within RADC in the development and evaluation of Command, Control, Communications and Intelligence (C3I) technology. The results of this survey are incorporated in this memo.

  11. Theory, Modeling, and Simulation of Semiconductor Lasers

    NASA Technical Reports Server (NTRS)

    Ning, Cun-Zheng; Saini, Subbash (Technical Monitor)

    1998-01-01

    Semiconductor lasers play very important roles in many areas of information technology. In this talk, I will first give an overview of semiconductor laser theory. This will be followed by a description of different models and their shortcomings in modeling and simulation. Our recent efforts in constructing a fully space and time resolved simulation model will then be described. Simulation results based on our model will be presented. Finally the effort towards a self-consistent and comprehensive simulation capability for the opto-electronics integrated circuits (OEICs) will be briefly reviewed.

  12. Evaluating uncertainty in stochastic simulation models

    SciTech Connect

    McKay, M.D.

    1998-02-01

    This paper discusses fundamental concepts of uncertainty analysis relevant to both stochastic simulation models and deterministic models. A stochastic simulation model, called a simulation model, is a stochastic mathematical model that incorporates random numbers in the calculation of the model prediction. Queuing models are familiar simulation models in which random numbers are used for sampling interarrival and service times. Another example of simulation models is found in probabilistic risk assessments, where atmospheric dispersion submodels are used to calculate the movement of material. For these models, randomness comes not from the sampling of times but from the sampling of weather conditions, which are described by a frequency distribution of atmospheric variables like wind speed and direction as a function of height above ground. A common characteristic of simulation models is that single predictions, based on one interarrival time or one weather condition, for example, are not nearly as informative as the probability distribution of possible predictions induced by sampling the simulation variables like time and weather condition. The language of model analysis is often general and vague, with terms having mostly intuitive meaning. The definitions and motivations for some of the commonly used terms and phrases offered in this paper lead to an analysis procedure based on prediction variance. In the following mathematical abstraction the authors present a setting for model analysis, relate practical objectives to mathematical terms, and show how two reasonable premises lead to a viable analysis strategy.
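    The queuing example above, random interarrival and service times producing a distribution of predictions rather than a single one, can be sketched with a minimal single-server queue. All rates, run lengths, and names below are illustrative assumptions.

```python
import random
import statistics

def replication(n_customers=200, arrival_rate=1.0, service_rate=1.25, rng=None):
    """One run of a single-server queue via the Lindley recursion
    W_{k+1} = max(0, W_k + S_k - A_{k+1}); returns the mean wait."""
    rng = rng or random.Random()
    wait, total_wait = 0.0, 0.0
    for _ in range(n_customers):
        service = rng.expovariate(service_rate)       # sampled service time
        interarrival = rng.expovariate(arrival_rate)  # sampled interarrival time
        total_wait += wait
        wait = max(0.0, wait + service - interarrival)
    return total_wait / n_customers

rng = random.Random(7)
predictions = [replication(rng=rng) for _ in range(100)]  # 100 replications
mean_wait = statistics.mean(predictions)
spread = statistics.stdev(predictions)
```

    The spread across replications is the point of the abstract's argument: any single replication is just one draw from the prediction distribution, and it is the variance of that distribution that an uncertainty analysis must characterise.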

  13. Multidecadal simulation of coastal fog with a regional climate model

    NASA Astrophysics Data System (ADS)

    O'Brien, Travis A.; Sloan, Lisa C.; Chuang, Patrick Y.; Faloona, Ian C.; Johnstone, James A.

    2013-06-01

    In order to model stratocumulus clouds and coastal fog, we have coupled the University of Washington boundary layer model to the regional climate model, RegCM (RegCM-UW). By comparing fog occurrences observed at various coastal airports in the western United States, we show that RegCM-UW has success at modeling the spatial and temporal (diurnal, seasonal, and interannual) climatology of northern California coastal fog. The quality of the modeled fog estimate depends on whether coast-adjacent ocean or land grid cells are used; for the model runs shown here, the oceanic grid cells seem to be most appropriate. The interannual variability of oceanic northern California summertime fog, from a multi-decadal simulation, has a high and statistically significant correlation with the observed interannual variability (r = 0.72), which indicates that RegCM-UW is capable of investigating the response of fog to long-term climatological forcing. While RegCM-UW has a number of aspects that would benefit from further investigation and development, RegCM-UW is a new tool for investigating the climatology of coastal fog and the physical processes that govern it. We expect that with appropriate physical parameterizations and moderate horizontal resolution, other climate models should be capable of simulating coastal fog. The source code for RegCM-UW is publicly available, under the GNU license, through the International Centre for Theoretical Physics.

  14. A Generic Multibody Parachute Simulation Model

    NASA Technical Reports Server (NTRS)

    Neuhaus, Jason Richard; Kenney, Patrick Sean

    2006-01-01

    Flight simulation of dynamic atmospheric vehicles with parachute systems is a complex task that is not easily modeled in many simulation frameworks. In the past, the performance of vehicles with parachutes was analyzed by simulations dedicated to parachute operations, which were generally not used for any other portion of the vehicle flight trajectory. This approach required multiple simulation resources to completely analyze the performance of the vehicle. Recently, improved software engineering practices and increased computational power have allowed a single simulation to model the entire flight profile of a vehicle employing a parachute.

  15. SSA Modeling and Simulation with DIRSIG

    NASA Astrophysics Data System (ADS)

    Bennett, D.; Allen, D.; Dank, J.; Gartley, M.; Tyler, D.

    2014-09-01

    We describe and demonstrate a robust, physics-based modeling system to simulate ground and space-based observations of both LEO and GEO objects. With the DIRSIG radiometry engine at its core, our system exploits STK, adaptive optics modeling, and detector effects to produce high fidelity simulated images and radiometry. Key to generating quantitative simulations is our ability to attribute engineering-quality, faceted CAD models with reflective and emissive properties derived from laboratory measurements, including the spatial structure of such difficult materials as MLI. In addition to simulated video imagery, we will demonstrate a computational procedure implementing a position-based dynamics approach to shrink wrap MLI around space components.

  16. VHDL simulation with access to transistor models

    NASA Technical Reports Server (NTRS)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  17. Crop Simulation Models and Decision Support Systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The first computer simulation models for agricultural systems were developed in the 1970s. These early models simulated potential production for major crops as a function of weather conditions, especially temperature and solar radiation. At a later stage, the water component was added to be able to ...

  18. Resist profile simulation with fast lithography model

    NASA Astrophysics Data System (ADS)

    He, Yan-Ying; Chou, Chih-Shiang; Tang, Yu-Po; Huang, Wen-Chun; Liu, Ru-Gun; Gau, Tsai-Sheng

    2014-03-01

    A traditional approach to construct a fast lithographic model is to match wafer top-down SEM images, contours and/or gauge CDs with a TCC model plus some simple resist representation. This modeling method has been proven and is extensively used for OPC modeling. As the technology moves forward, this traditional approach has become insufficient in regard to lithography weak point detection, etching bias prediction, etc. The drawback of this approach is from metrology and simulation. First, top-down SEM is only good for acquiring planar CD information. Some 3D metrology such as cross-section SEM or AFM is necessary to obtain the true resist profile. Second, the TCC modeling approach is only suitable for planar image simulation. In order to model the resist profile, full 3D image simulation is needed. Even though there are many rigorous simulators capable of catching the resist profile very well, none of them is feasible for full-chip application due to the tremendous consumption of computational resource. The authors have proposed a quasi-3D image simulation method in the previous study [1], which is suitable for full-chip simulation with the consideration of sidewall angles, to improve the model accuracy of planar models. In this paper, the quasi-3D image simulation is extended to directly model the resist profile with AFM and/or cross-section SEM data. Resist weak points detected by the model generated with this 3D approach are verified on the wafer.

  19. Development of a One Health National Capacity in Africa : the Southern African Centre for Infectious Disease Surveillance (SACIDS) One Health Virtual Centre Model.

    PubMed

    Rweyemamu, Mark; Kambarage, Dominic; Karimuribo, Esron; Wambura, Philemon; Matee, Mecky; Kayembe, Jean-Marie; Mweene, Aaron; Neves, Luis; Masumu, Justin; Kasanga, Christopher; Hang'ombe, Bernard; Kayunze, Kim; Misinzo, Gerald; Simuunza, Martin; Paweska, Janusz T

    2013-01-01

    Among the many challenges to health, infectious diseases stand out for their ability to have a profound impact on humans and animals. Recent years have witnessed an increasing number of novel infectious diseases. The numerous examples of infections which originated from animals suggest that the zoonotic pool is an important and potentially rich source of emerging diseases. Since the emergence and re-emergence of pathogens, and particularly zoonotic agents, occur at unpredictable rates in animal and human populations, infectious diseases will constitute a significant challenge for the public health and animal health communities in the twenty-first century. The African continent suffers from one of the highest burdens of infectious diseases of humans and animals in the world but has the least capacity for their detection, identification and monitoring. Lessons learnt from recent zoonotic epidemics in Africa and elsewhere clearly indicate the need for coordinated research, interdisciplinary centres, response systems and infrastructures, integrated surveillance systems and workforce development strategies. More and stronger partnerships across national and international sectors (human health, animal health, environment) and disciplines (natural and social sciences) involving public, academic and private organisations and institutions will be required to meet the present and future challenges of infectious diseases. In order to strengthen the efficiency of early warning systems, trend monitoring, disease prediction and timely outbreak interventions for the benefit of the national and international community, it is essential that each nation improves its own capacity in disease recognition and laboratory competence. The SACIDS, a One Health African initiative linking southern African academic and research institutions in smart partnership with centres of science excellence in industrialised countries as well as international research centres, strives to strengthen

  20. Protein Simulation Data in the Relational Model.

    PubMed

    Simms, Andrew M; Daggett, Valerie

    2012-10-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server. PMID:23204646

  1. Protein Simulation Data in the Relational Model

    PubMed Central

    Simms, Andrew M.; Daggett, Valerie

    2011-01-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost—significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server. PMID:23204646

  2. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement. PMID:19668066

  3. SIMULATION MODELING OF GASTROINTESTINAL ABSORPTION

    EPA Science Inventory

    Mathematical dosimetry models incorporate mechanistic determinants of chemical disposition in a living organism to describe relationships between exposure concentration and the internal dose needed for PBPK models and human health risk assessment. Because they rely on determini...

  4. An Extensible Reduced Order Model Builder for Simulation and Modeling

    SciTech Connect

    2012-09-28

    REVEAL is a software framework for building reduced order models (surrogate models) for high fidelity complex scientific simulations. REVEAL is designed to do reduced order modeling and sensitivity analysis for scientific simulations. REVEAL incorporates a range of sampling and regression methods. It provides a complete user environment and is adaptable to new simulators, runs jobs on any computing platform of choice, automatically post-processes simulation results and provides a range of data analysis tools. The software is generic and can easily be extended to incorporate new methods and simulators.
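    The sampling-plus-regression idea behind a reduced order (surrogate) model can be sketched generically. This is not REVEAL's actual machinery; the stand-in "expensive" simulator, the quadratic basis, and all names below are illustrative assumptions.

```python
import random

def fit_quadratic(xs, ys):
    """Least-squares fit y ~ a + b*x + c*x^2 by solving the 3x3
    normal equations with Gaussian elimination (partial pivoting)."""
    m = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    v = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))  # pivot row
        m[col], m[piv] = m[piv], m[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):                              # eliminate below
            f = m[r][col] / m[col][col]
            for c in range(col, 3):
                m[r][c] -= f * m[col][c]
            v[r] -= f * v[col]
    coeffs = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                                          # back-substitute
        coeffs[r] = (v[r] - sum(m[r][c] * coeffs[c]
                                for c in range(r + 1, 3))) / m[r][r]
    return coeffs

def expensive_simulation(x):
    """Stand-in for a high-fidelity simulator (in reality: minutes or hours)."""
    return 2.0 + 0.5 * x - 0.3 * x * x

rng = random.Random(1)
xs = [rng.uniform(-2, 2) for _ in range(20)]   # sampled design points
ys = [expensive_simulation(x) for x in xs]     # expensive evaluations
a, b, c = fit_quadratic(xs, ys)                # cheap surrogate coefficients
```

    Once fitted, the surrogate `a + b*x + c*x*x` can be evaluated millions of times for sensitivity analysis at negligible cost, which is the workflow a framework of this kind automates across samplers and regression methods.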

  5. An Extensible Reduced Order Model Builder for Simulation and Modeling

    Energy Science and Technology Software Center (ESTSC)

    2012-09-28

    REVEAL is a software framework for building reduced order models (surrogate models) for high fidelity complex scientific simulations. REVEAL is designed to do reduced order modeling and sensitivity analysis for scientific simulations. REVEAL incorporates a range of sampling and regression methods. It provides a complete user environment and is adaptable to new simulators, runs jobs on any computing platform of choice, automatically post-processes simulation results and provides a range of data analysis tools. The software is generic and can easily be extended to incorporate new methods and simulators.

  6. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform post-mortem assessments.

  7. Simulation of the great plains low-level jet and associated clouds by general circulation models

    SciTech Connect

    Ghan, S.J.; Bian, X.; Corsetti, L.

    1996-07-01

The low-level jet frequently observed in the Great Plains of the United States forms preferentially at night and apparently influences the timing of the thunderstorms in the region. The authors have found that both the European Centre for Medium-Range Weather Forecasts general circulation model and the National Center for Atmospheric Research Community Climate Model simulate the low-level jet rather well, although the spatial distributions of the jet frequency simulated by the two GCMs differ considerably. Sensitivity experiments have demonstrated that the simulated low-level jet is surprisingly robust, with similar simulations at much coarser horizontal and vertical resolutions. However, both GCMs fail to simulate the observed relationship between clouds and the low-level jet. The pronounced nocturnal maximum in thunderstorm frequency associated with the low-level jet is not simulated well by either GCM, with only weak evidence of a nocturnal maximum in the Great Plains. 36 refs., 20 figs.

  8. Simulation modeling and analysis with Arena

    SciTech Connect

    Tayfur Altiok; Benjamin Melamed

    2007-06-15

This textbook treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation, and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. Section 13.3.3 covers coal loading operations on barges/tugboats.

  9. Predicting Species Distributions Using Record Centre Data: Multi-Scale Modelling of Habitat Suitability for Bat Roosts

    PubMed Central

    Bellamy, Chloe; Altringham, John

    2015-01-01

    Conservation increasingly operates at the landscape scale. For this to be effective, we need landscape scale information on species distributions and the environmental factors that underpin them. Species records are becoming increasingly available via data centres and online portals, but they are often patchy and biased. We demonstrate how such data can yield useful habitat suitability models, using bat roost records as an example. We analysed the effects of environmental variables at eight spatial scales (500 m – 6 km) on roost selection by eight bat species (Pipistrellus pipistrellus, P. pygmaeus, Nyctalus noctula, Myotis mystacinus, M. brandtii, M. nattereri, M. daubentonii, and Plecotus auritus) using the presence-only modelling software MaxEnt. Modelling was carried out on a selection of 418 data centre roost records from the Lake District National Park, UK. Target group pseudoabsences were selected to reduce the impact of sampling bias. Multi-scale models, combining variables measured at their best performing spatial scales, were used to predict roosting habitat suitability, yielding models with useful predictive abilities. Small areas of deciduous woodland consistently increased roosting habitat suitability, but other habitat associations varied between species and scales. Pipistrellus were positively related to built environments at small scales, and depended on large-scale woodland availability. The other, more specialist, species were highly sensitive to human-altered landscapes, avoiding even small rural towns. The strength of many relationships at large scales suggests that bats are sensitive to habitat modifications far from the roost itself. The fine resolution, large extent maps will aid targeted decision-making by conservationists and planners. We have made available an ArcGIS toolbox that automates the production of multi-scale variables, to facilitate the application of our methods to other taxa and locations. Habitat suitability modelling has

  10. Rabi multi-sector reservoir simulation model

    SciTech Connect

Bruijnzeels, C.; O'Halloran, C.

    1995-12-31

To ensure optimum ultimate recovery of the 46-meter-thick oil rim of the Rabi Field in Gabon, a full-field simulation model was required. Due to its size and complexity, with local cusping, coning, and geological circumstances dominating individual well behavior, a single full-field model would be too large for existing hardware. A method was developed to simulate the full field with 5 separate sector models, whilst allowing the development in one sector model to have an effect on the boundary conditions of another sector. In this manner, the 13 x 4.5 km field could be simulated with a horizontal well spacing down to 175 meters. This paper focuses on the method used to attach single 3-phase tank cells to a sector simulation grid in order to represent non-simulated parts of the field. It also describes the history matching methodology and how to run a multisector model in forecasting mode. This method can be used for any reservoir where size and complexity require large reservoir simulation models that normally could not be modeled within the constraints of available computer facilities. Detailed studies can be conducted on specific parts of a field, whilst allowing for dynamic flow and pressure effects caused by the rest of the field.

  11. Structural model uncertainty in stochastic simulation

    SciTech Connect

    McKay, M.D.; Morrison, J.D.

    1997-09-01

    Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that a usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.
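The hierarchy described here is often handled with a two-level Monte Carlo loop: the outer loop samples input/parameter uncertainty, and the inner loop samples stochastic variability given those parameters. The example below is hypothetical, not from the report; the uniform parameter distribution and exponential outcomes are assumptions chosen for illustration:

```python
import random
import statistics

def nested_uncertainty(n_outer=200, n_inner=100, seed=42):
    """Two-level Monte Carlo: outer draws sample parameter uncertainty,
    inner draws sample stochastic variability for fixed parameters."""
    rng = random.Random(seed)
    conditional_means = []
    for _ in range(n_outer):
        # Parameter uncertainty: the 'true' rate is itself uncertain.
        rate = rng.uniform(0.5, 1.5)
        # Stochastic variability: outcomes scatter even for a fixed rate.
        outcomes = [rng.expovariate(rate) for _ in range(n_inner)]
        conditional_means.append(statistics.fmean(outcomes))
    # The spread of conditional means reflects mainly parameter uncertainty,
    # as distinct from the within-run stochastic variability.
    return statistics.fmean(conditional_means), statistics.pstdev(conditional_means)

mean, spread = nested_uncertainty()
```

Structural model uncertainty, the top of the hierarchy, would correspond to repeating this whole analysis under alternative model forms, which is exactly where the usual input-uncertainty paradigm breaks down.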

  12. Theory, modeling, and simulation annual report, 1992

    SciTech Connect

    Not Available

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  13. DEVELOPMENT OF THE ADVANCED UTILITY SIMULATION MODEL

    EPA Science Inventory

The paper discusses the development of the Advanced Utility Simulation Model (AUSM), developed for the National Acid Precipitation Assessment Program (NAPAP), to forecast air emissions of pollutants from electric utilities. AUSM integrates generating unit engineering detail with d...

  14. A Simulation To Model Exponential Growth.

    ERIC Educational Resources Information Center

    Appelbaum, Elizabeth Berman

    2000-01-01

    Describes a simulation using dice-tossing students in a population cluster to model the growth of cancer cells. This growth is recorded in a scatterplot and compared to an exponential function graph. (KHR)
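The classroom activity can be mimicked in a few lines. This is an illustrative sketch; the divide-on-a-1-or-2 rule (probability 1/3 per toss) is an assumption, not necessarily the rule used in the article:

```python
import random

def dice_growth(rounds, start=6, seed=7):
    """Dice-toss sketch of exponential growth: each round, every 'cell'
    rolls a die and divides on a 1 or 2, so the population should track
    start * (4/3)**round on average."""
    rng = random.Random(seed)
    counts = [start]
    population = start
    for _ in range(rounds):
        population += sum(1 for _ in range(population) if rng.randint(1, 6) <= 2)
        counts.append(population)
    return counts

# Simulated counts versus the smooth exponential curve they approximate;
# plotting both, as in the scatterplot exercise, shows the fit.
counts = dice_growth(10)
expected = [6 * (4 / 3) ** t for t in range(11)]
```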

  15. MODELING CONCEPTS FOR BMP/LID SIMULATION

    EPA Science Inventory

    Enhancement of simulation options for stormwater best management practices (BMPs) and hydrologic source control is discussed in the context of the EPA Storm Water Management Model (SWMM). Options for improvement of various BMP representations are presented, with emphasis on inco...

  16. Mathematical Model Development and Simulation Support

    NASA Technical Reports Server (NTRS)

    Francis, Ronald C.; Tobbe, Patrick A.

    2000-01-01

    This report summarizes the work performed in support of the Contact Dynamics 6DOF Facility and the Flight Robotics Lab at NASA/ MSFC in the areas of Mathematical Model Development and Simulation Support.

  17. LAKE WATER TEMPERATURE SIMULATION MODEL

    EPA Science Inventory

Functional relationships to describe surface wind mixing, vertical turbulent diffusion, convective heat transfer, and radiation penetration based on data from lakes in Minnesota have been developed. These relationships have been introduced by regressing model parameters found eith...

  18. Modelling metal centres, acid sites and reaction mechanisms in microporous catalysts.

    PubMed

    O'Malley, Alexander J; Logsdail, A J; Sokol, A A; Catlow, C R A

    2016-07-01

    We discuss the role of QM/MM (embedded cluster) computational techniques in catalytic science, in particular their application to microporous catalysis. We describe the methodologies employed and illustrate their utility by briefly summarising work on metal centres in zeolites. We then report a detailed investigation into the behaviour of methanol at acidic sites in zeolites H-ZSM-5 and H-Y in the context of the methanol-to-hydrocarbons/olefins process. Studying key initial steps of the reaction (the adsorption and subsequent methoxylation), we probe the effect of framework topology and Brønsted acid site location on the energetics of these initial processes. We find that although methoxylation is endothermic with respect to the adsorbed system (by 17-56 kJ mol(-1) depending on the location), there are intriguing correlations between the adsorption/reaction energies and the geometries of the adsorbed species, of particular significance being the coordination of methyl hydrogens. These observations emphasise the importance of adsorbate coordination with the framework in zeolite catalysed conversions, and how this may vary with framework topology and site location, particularly suited to investigation by QM/MM techniques. PMID:27136967

  19. Minimum-complexity helicopter simulation math model

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

An example of a minimal-complexity helicopter simulation math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of the features which are to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  20. Modeling: The Role of Atomistic Simulations

    SciTech Connect

    Aga, Rachel S; Morris, James R

    2007-01-01

    A major advantage of atomistic simulations is that a detailed picture of the model under investigation is available, and so they have been very instrumental in explaining the connection of macroscopic properties to the atomic scale. Simulations play a significant role in the development and testing of theories. For example, simulations have been extensively used to test the mode-coupling theory (MCT). The theory predicts that at some critical temperature Tc, known as the mode-coupling temperature, the supercooled liquid undergoes a structural arrest, prohibiting the system from accessing all possible states, thus, essentially undergoing an ergodic to nonergodic transition. It gives definite predictions on various correlation functions that can be calculated directly in simulations. Simulations and MCT have played a tremendous role in elucidating a majority of what we now understand about the dynamics of glass-forming systems. Simulations can also be used to compare with experimental results to validate the model, so that one can use simulation results to measure properties not accessible to experiments. In many cases, as will be illustrated in the next sections, results of simulations motivate experimental investigations. Part of the goal of this chapter is to examine the contributions of atomic simulations to the current state of understanding of metallic glasses.

  1. Modeling of transformers using circuit simulators

    SciTech Connect

    Archer, W.E.; Deveney, M.F.; Nagel, R.L.

    1994-07-01

Transformers of two different designs, an unencapsulated pot core and an encapsulated toroidal core, have been modeled for circuit analysis with circuit simulation tools. We selected MicroSim's PSPICE and Analogy's SABER as the simulation tools and used experimental B-H loop and network analyzer measurements to generate the needed input data. The models are compared for accuracy and convergence using the circuit simulators. Results are presented which demonstrate the effects on circuit performance of magnetic core losses, eddy currents, and mechanical stress on the magnetic cores.

  2. Intelligent Simulation Model To Facilitate EHR Training

    PubMed Central

    Mohan, Vishnu; Scholl, Gretchen; Gold, Jeffrey A.

    2015-01-01

    Despite the rapid growth of EHR use, there are currently no standardized protocols for EHR training. A simulation EHR environment may offer significant advantages with respect to EHR training, but optimizing the training paradigm requires careful consideration of the simulation model itself, and how it is to be deployed during training. In this paper, we propose Six Principles that are EHR-agnostic and provide the framework for the development of an intelligent simulation model that can optimize EHR training by replicating real-world clinical conditions and appropriate cognitive loads. PMID:26958229

  3. JAK/STAT signalling--an executable model assembled from molecule-centred modules demonstrating a module-oriented database concept for systems and synthetic biology.

    PubMed

    Blätke, Mary Ann; Dittrich, Anna; Rohr, Christian; Heiner, Monika; Schaper, Fred; Marwan, Wolfgang

    2013-06-01

Mathematical models of molecular networks regulating biological processes in cells or organisms are most frequently designed as sets of ordinary differential equations. Various modularisation methods have been applied to reduce the complexity of models, to analyse their structural properties, to separate biological processes, or to reuse model parts. Taking the JAK/STAT signalling pathway with the extensive combinatorial cross-talk of its components as a case study, we take a natural approach to modularisation by creating one module for each biomolecule. Each module consists of a Petri net and associated metadata and is organised in a database publicly accessible through a web interface (). The Petri net describes the reaction mechanism of a given biomolecule and its functional interactions with other components, including relevant conformational states. The database is designed to support the curation, documentation, version control, and update of individual modules, and to assist the user in automatically composing complex models from modules. Biomolecule-centred modules, associated metadata, and database support together allow the automatic creation of models by considering differential gene expression in given cell types or under certain physiological conditions or states of disease. Modularity also facilitates exploring the consequences of alternative molecular mechanisms by comparative simulation of automatically created models, even for users without mathematical skills. Models may be selectively executed as ODE systems, stochastic models, or qualitative models, or as hybrids, and exported in the SBML format. The fully automated generation of models of redesigned networks by metadata-guided modification of modules representing biomolecules with mutated function or specificity is proposed. PMID:23443149

  4. Binary black hole simulations for surrogate modeling

    NASA Astrophysics Data System (ADS)

    Hemberger, Daniel; SXS Collaboration

    2016-03-01

    Analytic or data-driven models of binary black hole coalescences are used to densely cover the full parameter space, because it is computationally infeasible to do so using numerical relativity (NR). However, these models still need input from NR, either for calibration, or because the model is agnostic to the underlying physics. We use the Spectral Einstein Code (SpEC) to provide a large number of simulations to aid the construction of a NR surrogate model in a 5-dimensional subset of the parameter space. I will present an analysis of the simulations that were used to construct the surrogate model. I will also describe the infrastructure that was needed to efficiently perform a large number of simulations across many computational resources.

  5. An assessment of CSIRO Conformal Cubic Atmospheric Model simulations over Sri Lanka

    NASA Astrophysics Data System (ADS)

    Thevakaran, A.; McGregor, J. L.; Katzfey, J.; Hoffmann, P.; Suppiah, R.; Sonnadara, D. U. J.

    2016-03-01

In this study, we present an assessment of the Conformal Cubic Atmospheric Model (CCAM) 50 km simulations forced by the sea surface temperature and sea ice concentration of six global climate models (GCMs) (ACCESS1-0, CCSM4, GFDL-CM3, NorESM, MPI-ESM and CNRM-CM5) from the Coupled Model Intercomparison Project Phase 5 (CMIP5) over South Asia, centred on Sri Lanka. The model simulations were compared with the data provided by the Asian Precipitation Highly Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE) project and ERA-Interim from the European Centre for Medium-Range Weather Forecasts (ECMWF) over a broad region centred on Sri Lanka. This broad region includes South Asia and the northern Indian Ocean. Statistical measures such as pattern correlations, mean biases, and root mean square errors were calculated separately for the four seasons. Results based on statistical tests indicate that the current CCAM simulations capture the spatial patterns of 10 m wind speed, mean sea level pressure, temperature, and rainfall over a broad region of South Asia fairly well. The annual cycles of temperature and rainfall were also compared against observations over the northern and southern regions of Sri Lanka by taking the field average of each model and of the observed data. The characteristics of the observed annual variations of rainfall and temperature over the smaller domains are not very well captured by the CCAM simulations. There are differences in the magnitudes of the temperature and rainfall among the six-member CCAM simulations. Comparatively, the two CCAM simulations forced by CNRM-CM5 and GFDL-CM3 show slightly better agreement over the Sri Lankan region.

  6. Kuroshio Extension dynamics from satellite altimetry and a model simulation

    NASA Astrophysics Data System (ADS)

    Mitchell, J. L.; Teague, W. J.; Jacobs, G. A.; Hurlburt, H. E.

    1996-01-01

    Altimeter data from the Geosat Exact Repeat Mission (ERM) are analyzed with the aid of a simulation from an eddy-resolving primitive equation model of the North Pacific basin in the region of the Kuroshio and Kuroshio Extension. The model domain covers the Pacific Ocean north of 20°S and has a resolution of 0.125° latitude and 0.176° longitude. The model is synoptically driven by daily 1000-mbar winds from the European Centre for Medium-Range Weather Forecasts (ECMWF) which encompass the Geosat time period. Model output is sampled along Geosat ground tracks for the period of the ERM. Additionally, the model and the Geosat data are compared with climatological hydrography and satellite IR frontal position analyses. Analyses compared include maps of sea surface height (SSH) mean and variability, eddy kinetic energy (EKE), seasonal transport anomaly, and time-longitude plots of SSH anomaly. The model simulation provides annual mean SSH fields for 1987 and 1988 which reproduce the four quasi-permanent meanders seen in hydrographic climatology (cyclonic at 138°E and anticyclonic at 144°E, 150°E, and 160°E). These are linked to the bottom topography. In the model simulation, Geosat altimeter data, and climatology, we observe four peaks in SSH variability associated with meander activity and two peaks in EKE, with the strongest about 3200 cm2 s-2 along the mean Kuroshio path in the Geosat data. The local maxima in SSH variability tend to occur where relatively strong, topographically steered meridional abyssal currents intersect the zonally oriented Kuroshio Extension. Westward propagation of SSH anomalies at phase speeds of 2 to 3 cm s-1 in the region east of 155°E is observed in the model simulation and Geosat observations. A late summer maximum in the upper ocean transport anomaly of the Kuroshio Extension is inferred from changes in the cross-stream differential in SSH from the simulation and Geosat observations.

  7. River system environmental modeling and simulation methodology

    SciTech Connect

    Rao, N.B.

    1981-01-01

Several computer models have been built to examine pollution in rivers. However, the current state of the art in this field emphasizes problem solving using specific programs; a general methodology for building and simulating models of river systems is lacking. Thus, the purpose of this research was to develop a methodology that can be used to conceptualize, visualize, construct, and analyze, through simulation, models of pollution in river systems. The conceptualization and visualization of these models was facilitated through a network representation. The implementation of the models was accomplished using the capabilities of an existing simulation language, GASP V. The methodology also provides data management facilities for model outputs through the use of the Simulation Data Language (SDL), and high-quality plotting facilities through the use of the graphics package DISSPLA (Display Integrated Software System and Plotting Language). Using this methodology, a river system is modeled as consisting of certain elements, namely reaches, junctions, dams, reservoirs, withdrawals, and pollutant sources. All these elements of the river system are described in a standard form which has been implemented on a computer. This model, when executed, produces spatial and temporal distributions of the pollutants in the river system. Furthermore, these outputs can be stored in a database and used to produce high-quality plots. The result of this research is a methodology for building, implementing, and examining the results of models of pollution in river systems.

  8. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.

  9. Atmospheric model intercomparison project: Monsoon simulations

    SciTech Connect

    Sperber, K.R.; Palmer, T.N.

    1994-06-01

The simulation of monsoons, in particular the Indian summer monsoon, has proven to be a critical test of a general circulation model's ability to simulate tropical climate and variability. The Monsoon Numerical Experimentation Group has begun to address questions regarding the predictability of monsoon extremes, in particular conditions associated with El Niño and La Niña events that tend to be associated with drought and flood conditions over the Indian subcontinent, through a series of seasonal integrations using analyzed initial conditions from successive days in 1987 and 1988. In this paper the authors present an analysis of simulations associated with the Atmospheric Model Intercomparison Project (AMIP), a coordinated effort to simulate the 1979-1988 decade using standardized boundary conditions with approximately 30 atmospheric general circulation models. The 13 models analyzed to date are listed. Using monthly mean data from these simulations, the authors have calculated indices of precipitation and wind shear in an effort to assess the performance of the models over the course of the AMIP decade.

  10. Revolutions in energy through modeling and simulation

    SciTech Connect

    Tatro, M.; Woodard, J.

    1998-08-01

The development and application of energy technologies for all aspects, from generation to storage, have improved dramatically with the advent of advanced computational tools, particularly modeling and simulation. Modeling and simulation are not new to energy technology development, and have been used extensively ever since the first commercial computers were available. However, recent advances in computing power and access have broadened their extent and use, and, through the increased fidelity (i.e., accuracy) of models made possible by greatly enhanced computing power, the increased reliance on modeling and simulation has shifted the balance point between modeling and experimentation. The complex nature of energy technologies has motivated researchers to use these tools to better understand performance, reliability, and cost issues related to energy. The tools originated in sciences such as the strength of materials (nuclear reactor containment vessels); physics, heat transfer, and fluid flow (oil production); chemistry, physics, and electronics (photovoltaics); and geosciences and fluid flow (oil exploration and reservoir storage). Other tools include mathematics, such as statistics, for assessing project risks. This paper describes a few advancements made possible by these tools and explores the benefits and costs of their use, particularly as they relate to the acceleration of energy technology development. The computational complexity ranges from basic spreadsheets to complex numerical simulations, using hardware ranging from personal computers (PCs) to Cray computers. In all cases, the benefits of using modeling and simulation relate to lower risks, accelerated technology development, or lower-cost projects.

  11. Towards a High Resolution Cellular Model for Coastal Simulation (CEMCOS)

    NASA Astrophysics Data System (ADS)

    Dearing, J.; Plater, A. J.; Richmond, N. C.

    2004-12-01

    The aim of this research is to develop a cellular model for coastal simulation in response to changing climate and sea-level, as a contribution to the UK Tyndall Centre's Research Theme 4: Sustaining the Coastal Zone. The modelling approach uses simple cell-based rules of sediment erosion, transport and deposition operating between adjacent cells. This enables the model to include the full range of processes and properties of the coastal environment, including nonlinear behaviour, using only local interactions at discrete time intervals. Tide propagation and wave action drive sediment transport, which is further conditioned by erosion thresholds related to grain size and vegetation growth. Here, we report an overview of this one-year project and details on model design and validation. This includes tide and wave parameterisation, resulting in sediment transport over a 3-D grid of cells representing estuary morphology and bathymetry. The model (CEMCOS) is being designed to be fully generic and exportable to different coastal areas, with initial testing and validation being conducted using published bathymetric and cartographic data over the last c.150 years for the Blackwater Estuary in eastern England.

  12. Electrical Load Modeling and Simulation

    SciTech Connect

    Chassin, David P.

    2013-01-01

Electricity consumer demand response and load control are playing an increasingly important role in the development of a smart grid. Smart grid load management technologies such as Grid Friendly™ controls and real-time pricing are making their way into the conventional model of grid planning and operations. However, the behavior of load both affects, and is affected by, the load control strategies that are designed to support electric grid planning and operations. This chapter discusses the natural behavior of electric loads, how it interacts with various load control and demand response strategies, what the consequences are for new grid operation concepts, and the computing issues these new technologies raise.

  13. Non-linear transformer modeling and simulation

    SciTech Connect

    Archer, W.E.; Deveney, M.F.; Nagel, R.L.

    1994-08-01

Transformer models for simulation with PSpice and Analogy's Saber are being developed using experimental B-H loop and network analyzer measurements. The models are evaluated for accuracy and convergence using several test circuits. Results are presented which demonstrate the effects on circuit performance of magnetic core losses, eddy currents, and mechanical stress on the magnetic cores.

  14. Rotor systems research aircraft simulation mathematical model

    NASA Technical Reports Server (NTRS)

    Houck, J. A.; Moore, F. L.; Howlett, J. J.; Pollock, K. S.; Browne, M. M.

    1977-01-01

An analytical model developed for evaluating and verifying advanced rotor concepts is discussed. The model was used in both open-loop and real-time man-in-the-loop simulation during the rotor systems research aircraft design. Future applications include pilot training, preflight of test programs, and the evaluation of promising concepts before their implementation on the flight vehicle.

  15. Estimating solar radiation for plant simulation models

    NASA Technical Reports Server (NTRS)

    Hodges, T.; French, V.; Leduc, S.

    1985-01-01

Five algorithms producing daily solar radiation surrogates from daily temperatures and rainfall were evaluated using measured solar radiation data for seven U.S. locations. The algorithms were compared both in terms of the accuracy of the daily solar radiation estimates and in terms of the response when used in a plant growth simulation model (CERES-Wheat). Requirements for the accuracy of solar radiation in plant growth simulation models are discussed. One algorithm is recommended as best suited for use in these models when neither measured nor satellite-estimated solar radiation values are available.
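The abstract does not name the five algorithms. As a representative example of the temperature-based surrogate approach, the Hargreaves-Samani form estimates daily radiation from the diurnal temperature range; the coefficient and inputs below are illustrative assumptions, not values from the paper:

```python
import math

def hargreaves_radiation(tmax, tmin, ra, kt=0.17):
    """Estimate daily solar radiation (MJ/m^2/day) from the diurnal
    temperature range and extraterrestrial radiation `ra`.
    `kt` is an empirical coefficient (roughly 0.16-0.19 inland)."""
    return kt * math.sqrt(max(tmax - tmin, 0.0)) * ra

# A clear summer day with a wide temperature range (hypothetical values)
rs = hargreaves_radiation(tmax=32.0, tmin=18.0, ra=41.0)
```

The logic behind such surrogates is that a large day-night temperature swing implies clear skies and therefore high insolation, which is why temperature and rainfall alone carry usable radiation information.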

  16. Modeling and simulation of metal forming equipment

    NASA Astrophysics Data System (ADS)

    Frazier, W. G.; Medina, E. A.; Malas, J. C.; Irwin, R. D.

    1997-04-01

    The demand for components made from hard-to-form materials is growing, as is the need to better understand and improve the control of metal forming equipment. Techniques are presented for developing accurate models and computer simulations of metal forming equipment for the purpose of improving metal forming process design. Emphasis is placed on modeling the dynamic behavior of hydraulic vertical forge presses, although similar principles apply to other types of metal forming equipment. These principles are applied to modeling and simulation of a 1000 ton forge press in service at Wright-Patterson Air Force Base, Ohio, along with experimental verification.

  17. Molecular simulation and modeling of complex I.

    PubMed

    Hummer, Gerhard; Wikström, Mårten

    2016-07-01

    Molecular modeling and molecular dynamics simulations play an important role in the functional characterization of complex I. With its large size and complicated function, linking quinone reduction to proton pumping across a membrane, complex I poses unique modeling challenges. Nonetheless, simulations have already helped in the identification of possible proton transfer pathways. Simulations have also shed light on the coupling between electron and proton transfer, thus pointing the way in the search for the mechanistic principles underlying the proton pump. In addition to reviewing what has already been achieved in complex I modeling, we aim here to identify pressing issues and to provide guidance for future research to harness the power of modeling in the functional characterization of complex I. This article is part of a Special Issue entitled Respiratory complex I, edited by Volker Zickermann and Ulrich Brandt. PMID:26780586

  18. International linking of research and development on the model of Laser Centre Hanover

    NASA Astrophysics Data System (ADS)

    Nowitzki, Klaus-Dieter; Boedecker, Olaf

    2005-10-01

Asia is becoming one of the most important regions in the world from the political, economic and scientific points of view. Germany believes that it is becoming increasingly necessary to cooperate with certain Asian countries, especially for scientific and technological reasons. Above and beyond exchanges of scientists, the scientific and technological cooperation will be organized to cover projects with specific targets and to find solutions to important problems. International economic development is characterized by a mixture of competition and cooperation within the context of growing globalization. Germany, being one of the world's largest exporting nations, must therefore combine its active role in cooperation with these countries in the fields of education, research and innovation with economic cooperation. The Laser Centre Hanover (LZH) pursues the goal of establishing and operating a Chinese-German centre for training and further education in laser technology, and of setting up a joint platform for long-term German-Chinese cooperation in laser technology. An optimized training infrastructure, combined with modern production processes, consequently supports German businesses in China over the long term and secures their market shares. LZH is establishing laser academies for skilled workers and technical decision-makers in Shanghai and Changchun, together with local universities and German partners. Owing to the economic growth that Russia has recorded for more than two years, the economic conditions for cooperation between Germany and Russia are improving step by step. The main goal of Russian science policy is to stabilize an efficient scientific-technical potential with better chances in the global competition. German-Russian scientific and technological cooperation plays an important role in this context. It has increased considerably in recent years in breadth and depth and at present includes virtually all areas of science and technology.
The region around Moscow is regarded

  19. A queuing model for road traffic simulation

    SciTech Connect

    Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

    2015-03-10

We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.
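The demand/supply construction that the abstract describes can be sketched for a triangular fundamental diagram, a common choice; the diagram shape and the numerical values below are assumptions for illustration, not the paper's:

```python
def demand(rho, rho_c, q_max):
    """Upstream traffic demand: rising branch of a triangular
    fundamental diagram, saturating at capacity q_max."""
    return q_max * rho / rho_c if rho < rho_c else q_max

def supply(rho, rho_c, rho_max, q_max):
    """Downstream traffic supply: capacity below the critical density,
    then the falling (congested) branch."""
    if rho <= rho_c:
        return q_max
    return q_max * (rho_max - rho) / (rho_max - rho_c)

def godunov_flux(rho_up, rho_down, rho_c, rho_max, q_max):
    """Godunov flux between two concatenated road sections:
    the minimum of upstream demand and downstream supply."""
    return min(demand(rho_up, rho_c, q_max),
               supply(rho_down, rho_c, rho_max, q_max))

# Densities in veh/km, flows in veh/h (hypothetical values)
free_flow = godunov_flux(20.0, 10.0, rho_c=40.0, rho_max=200.0, q_max=2000.0)
congested = godunov_flux(60.0, 180.0, rho_c=40.0, rho_max=200.0, q_max=2000.0)
```

In free flow the interface passes whatever the upstream section sends; in congestion the nearly jammed downstream section throttles the flux, which is exactly the behaviour the demand/supply formulation captures.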

  20. PIXE simulation: Models, methods and technologies

    SciTech Connect

    Batic, M.; Pia, M. G.; Saracco, P.; Weidenspointner, G.

    2013-04-19

    The simulation of PIXE (Particle Induced X-ray Emission) is discussed in the context of general-purpose Monte Carlo systems for particle transport. Dedicated PIXE codes are mainly concerned with the application of the technique to elemental analysis, but they lack the capability of dealing with complex experimental configurations. General-purpose Monte Carlo codes provide powerful tools to model the experimental environment in great detail, but so far they have provided limited functionality for PIXE simulation. This paper reviews recent developments that have endowed the Geant4 simulation toolkit with advanced capabilities for PIXE simulation, and related efforts for quantitative validation of cross sections and other physical parameters relevant to PIXE simulation.

  1. Mars Smart Lander Parachute Simulation Model

    NASA Technical Reports Server (NTRS)

    Queen, Eric M.; Raiszadeh, Ben

    2002-01-01

    A multi-body flight simulation for the Mars Smart Lander has been developed that includes six degree-of-freedom rigid-body models for both the supersonically-deployed and subsonically-deployed parachutes. This simulation is designed to be incorporated into a larger simulation of the entire entry, descent and landing (EDL) sequence. The complete end-to-end simulation will provide attitude history predictions of all bodies throughout the flight as well as loads on each of the connecting lines. Other issues such as recontact with jettisoned elements (heat shield, back shield, parachute mortar covers, etc.), design of parachute and attachment points, and desirable line properties can also be addressed readily using this simulation.

  2. Power electronics system modeling and simulation

    SciTech Connect

    Lai, Jih-Sheng

    1994-12-31

This paper introduces the control-system design software packages SIMNON and MATLAB/SIMULINK for power electronics system simulation. A complete power electronics system typically consists of a rectifier bridge with its smoothing capacitor, an inverter, and a motor. The system components, whether discrete or continuous, linear or nonlinear, are modeled with mathematical equations. Inverter control methods, such as pulse-width modulation and hysteresis current control, are expressed either as computer algorithms or as digital circuits. After describing the component models and control methods, computer programs are developed for complete-system simulation. Simulation results are mainly used for studying system performance, such as input and output current harmonics, torque ripple, and speed response. Key computer programs and simulation results are demonstrated for educational purposes.

  3. Five forest harvesting simulation models, part 1: modeling characteristics

    SciTech Connect

    Goulet, D.V.; Iff, R.H.; Sirois, D.L.

    1980-01-01

This paper is the first of two describing the conclusions from a study to determine the state of the art in timber harvesting computer simulation modeling. Five models were evaluated -- Forest Harvesting Simulation Model (FHSM), Full Tree Field Chipping (FTFC), Harvesting System Simulator (HSS), Simulation Applied to Logging Systems (SAPLOS), and Timber Harvesting and Transport Simulator (THATS) -- for their potential use in southern forest harvesting operations. In Part I, modeling characteristics and overall model philosophy are identified and illustrated. This includes a detailed discussion of the wood flow process in each model, accounting strategies for productive/non-productive times, performance variables, and the different types of harvesting systems modelable. In Part II we discuss user implementation problems, dealing in detail with: what questions can be asked of the model; what the modeling tradeoffs are, and how they impact the analysis; what computer skills are necessary to work effectively with the model; what computer support is needed; and whether the models are operational. The results provide a good picture of the state of the art in timber harvesting computer simulation. Much learning has occurred in the generation of these models, and many modeling and implementation problems have been uncovered, some of which remain unsolved. Hence, the user needs to examine closely the model and the intended application so that results will represent usable, valid data. It is recommended that the development of timber harvesting computer simulation modeling continue, so that existing and proposed timber harvesting strategies can be adequately evaluated. A set of design criteria is proposed. (Refs. 21).

  4. Modeling and simulation of plasma processing equipment

    NASA Astrophysics Data System (ADS)

    Kim, Heon Chang

Currently, plasma processing technology is utilized in a wide range of applications, including advanced Integrated Circuit (IC) fabrication. Traditionally, plasma processing equipment has been empirically designed and optimized at great expense of development time and cost. This research proposes the development of a first-principles-based, multidimensional plasma process simulator with the aim of enhancing the equipment design procedure. The proposed simulator accounts for nonlinear interactions among various plasma chemistry and physics, neutral chemistry and transport, and dust transport phenomena. A three-moment modeling approach is employed that shows good predictive capabilities at reasonable computational expense. For numerical efficiency, various versions of explicit and implicit Essentially Non-Oscillatory (ENO) algorithms are employed. For the rapid evaluation of time-periodic steady-state solutions, a feedback control approach is employed. Two-dimensional simulation results of capacitively coupled rf plasmas show that ion bombardment uniformity can be improved through simulation-based design of the plasma process. Through self-consistent simulations of an rf triode, it is also shown that the effects of secondary rf voltage and frequency on ion bombardment energy can be accurately captured. These results prove that scaling relations among important process variables can be identified through the three-moment modeling and simulation approach. Through coupling of the plasma model with a neutral chemistry and transport model, spatiotemporal distributions of both charged and uncharged species, including metastables, are predicted for an oxygen plasma. Furthermore, the simulation results also verify the existence of a double layer in this electronegative plasma. Through Lagrangian simulation of dust in a plasma reactor, it is shown that small particles accumulate near the center or the radial sheath boundary depending on their initial positions, while large

  5. Pressurized Cadaver Model in Cardiothoracic Surgical Simulation.

    PubMed

    Greene, Christina L; Minneti, Michael; Sullivan, Maura E; Baker, Craig J

    2015-09-01

    Simulation is increasingly recognized as an integral aspect of thoracic surgery education. A number of simulators have been introduced to teach component cardiothoracic skills; however, no good model exists for numerous essential skills including redo sternotomy and internal mammary artery takedown. These procedures are often relegated to thoracic surgery residents but have significant negative implications if performed incorrectly. Fresh tissue dissection is recognized as the gold standard for surgical simulation, but the lack of circulating blood volume limits surgical realism. Our aim is to describe the technique of the pressurized cadaver for use in cardiothoracic surgical procedures, focusing on internal mammary artery takedown. PMID:26354651

  6. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  7. Incorporation of RAM techniques into simulation modeling

    NASA Astrophysics Data System (ADS)

    Nelson, S. C., Jr.; Haire, M. J.; Schryver, J. C.

    1995-01-01

This work concludes that reliability, availability, and maintainability (RAM) analytical techniques can be incorporated into computer network simulation modeling to yield an important new analytical tool. This paper describes the incorporation of failure and repair information into network simulation to build a stochastic computer model of the RAM performance of two vehicles being developed for the US Army: the Advanced Field Artillery System (AFAS) and the Future Armored Resupply Vehicle (FARV). The AFAS is the US Army's next-generation self-propelled cannon artillery system. The FARV is a resupply vehicle for the AFAS. Both vehicles utilize automation technologies to improve operational performance and reduce manpower. The network simulation model used in this work is task based. The model programmed in this application represents a typical battle mission and the failures and repairs that occur during that battle. Each task that the FARV performs (upload, travel to the AFAS, refuel, perform tactical/survivability moves, return to logistic resupply, etc.) is modeled. Such a model reproduces operational phenomena (e.g., failures and repairs) that are likely to occur in actual performance. Simulation tasks are modeled as discrete chronological steps; after the completion of each task, decisions are programmed that determine the next path to be followed. The result is a complex logic diagram, or network. The network simulation model is developed within a hierarchy of vehicle systems, subsystems, and equipment, and includes failure management subnetworks. RAM information and other performance measures that have an impact on design requirements are collected. Design changes are evaluated through 'what if' questions, sensitivity studies, and battle scenario changes.
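The task-based failure/repair logic can be approximated with a small Monte Carlo sketch. The task list, MTBF, and MTTR values below are invented for illustration, and the original model was built in a network simulation tool rather than in code like this:

```python
import random

def simulate_mission(tasks, mtbf, mttr, rng):
    """Run one mission as a chronological sequence of task durations (hours).
    Failures arrive exponentially (mean `mtbf`); each failure inserts an
    exponential repair delay (mean `mttr`) before the task resumes."""
    t = 0.0
    for duration in tasks:
        remaining = duration
        while remaining > 0:
            to_failure = rng.expovariate(1.0 / mtbf)
            if to_failure >= remaining:
                t += remaining            # task completes without failure
                remaining = 0.0
            else:
                t += to_failure + rng.expovariate(1.0 / mttr)
                remaining -= to_failure   # resume the interrupted task
    return t

# Hypothetical mission: upload, travel, resupply (2 h + 1 h + 3 h)
rng = random.Random(1)
times = [simulate_mission([2.0, 1.0, 3.0], mtbf=50.0, mttr=0.5, rng=rng)
         for _ in range(1000)]
mean_time = sum(times) / len(times)
```

Averaging over many simulated missions yields exactly the kind of RAM-conditioned performance measure (expected mission duration including repairs) that the abstract says feeds back into design requirements.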

  8. Analyzing Strategic Business Rules through Simulation Modeling

    NASA Astrophysics Data System (ADS)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

Service Oriented Architecture (SOA) holds promise for business agility, since it allows business processes to change to meet new customer demands or market needs without causing a cascade of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose using simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal aims to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study in which a simulation model is built to support business decision-making in a context where finding a good configuration for the different business parameters and performance measures is too complex to analyze by trial and error.

  9. Simulation of 18O in precipitation by the regional circulation model REMOiso

    NASA Astrophysics Data System (ADS)

    Sturm, Kristof; Hoffmann, Georg; Langmann, Bärbel; Stichler, Willibald

    2005-11-01

The first results of the regional circulation model REMOiso, fitted with water isotope diagnostics, are compared with various isotope series from central Europe. A 2 year case study is conducted from March 1997 to February 1999, centred over Europe, analysing daily and monthly measurements. Isotope signals over Europe are dominated by the typical isotopic effects, such as the temperature, continental and altitude effects, on both annual and seasonal scales. These well-known isotopic effects are successfully reproduced by REMOiso, using two different boundary data sets. In the first simulation, the European Centre for Medium-range Weather Forecasts (ECMWF) analyses serve as boundary conditions, with water isotopes parameterized by a simple temperature dependence. In the second simulation, boundary conditions for both climatic and isotopic variables are taken from the ECHAMiso general circulation model output. The comparison of the two simulations shows a very high sensitivity of the simulated 18O signal to boundary conditions. The ECMWF-nested simulation shows an average offset of -4.5 in mean 18O values and an exaggerated seasonal amplitude. The ECHAM-nested simulation correctly represents the observed mean 18O values, although with a dampened seasonality. REMOiso's isotope module is further validated against daily 18O measurements at selected stations in Germany (Nordeney, Arkona and Hohenpeissenberg).

  10. Distributed earth model/orbiter simulation

    NASA Technical Reports Server (NTRS)

    Geisler, Erik; Mcclanahan, Scott; Smith, Gary

    1989-01-01

Distributed Earth Model/Orbiter Simulation (DEMOS) is a network based application developed for the UNIX environment that visually monitors or simulates the Earth and any number of orbiting vehicles. Its purpose is to provide Mission Control Center (MCC) flight controllers with a visually accurate three dimensional (3D) model of the Earth, Sun, Moon and orbiters, driven by real time or simulated data. The project incorporates a graphical user interface, 3D modelling employing state-of-the-art hardware, and simulation of orbital mechanics in a networked/distributed environment. The user interface is based on the X Window System and the X Ray toolbox. The 3D modelling utilizes the Programmer's Hierarchical Interactive Graphics System (PHIGS) standard and Raster Technologies hardware for rendering/display performance. The simulation of orbiting vehicles uses two methods of vector propagation implemented with standard UNIX/C for portability. Each part is a distinct process that can run on separate nodes of a network, exploiting each node's unique hardware capabilities. The client/server communication architecture of the application can be reused for a variety of distributed applications.

  11. Common modeling system for digital simulation

    NASA Technical Reports Server (NTRS)

    Painter, Rick

    1994-01-01

The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-Window-based, open-systems architecture; an object-based/oriented methodology; and a standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis, Cost Effectiveness Analysis (COEA) tradeoffs, etc. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts towards commonality in DOD M&S. In the typical scenario, an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, with the result that unique maintenance was required. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g., user interface, database/database management system, data journaling/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.

  12. Decision- rather than scenario-centred downscaling: Towards smarter use of climate model outputs

    NASA Astrophysics Data System (ADS)

    Wilby, Robert L.

    2013-04-01

    Climate model output has been used for hydrological impact assessments for at least 25 years. Scenario-led methods raise awareness about risks posed by climate variability and change to the security of supplies, performance of water infrastructure, and health of freshwater ecosystems. However, it is less clear how these analyses translate into actionable information for adaptation. One reason is that scenario-led methods typically yield very large uncertainty bounds in projected impacts at regional and river catchment scales. Consequently, there is growing interest in vulnerability-based frameworks and strategies for employing climate model output in decision-making contexts. This talk begins by summarising contrasting perspectives on climate models and principles for testing their utility for water sector applications. Using selected examples it is then shown how water resource systems may be adapted with varying levels of reliance on climate model information. These approaches include the conventional scenario-led risk assessment, scenario-neutral strategies, safety margins and sensitivity testing, and adaptive management of water systems. The strengths and weaknesses of each approach are outlined and linked to selected water management activities. These cases show that much progress can be made in managing water systems without dependence on climate models. Low-regret measures such as improved forecasting, better inter-agency co-operation, and contingency planning, yield benefits regardless of the climate outlook. Nonetheless, climate model scenarios are useful for evaluating adaptation portfolios, identifying system thresholds and fixing weak links, exploring the timing of investments, improving operating rules, or developing smarter licensing regimes. The most problematic application remains the climate change safety margin because of the very low confidence in extreme precipitation and river flows generated by climate models. 
In such cases, it is necessary to

  13. The effect of two cognitive aid designs on team functioning during intra-operative anaphylaxis emergencies: a multi-centre simulation study.

    PubMed

    Marshall, S D; Sanderson, P; McIntosh, C A; Kolawole, H

    2016-04-01

    This multi-centre repeated measures study was undertaken to determine how contrasting designs of cognitive aids affect team performance during simulated intra-operative anaphylaxis crises. A total of 24 teams consisting of a consultant anaesthetist, an anaesthetic trainee and anaesthetic assistant managed three simulated intra-operative anaphylaxis emergencies. Each team was assigned at random to a counterbalanced order of: no cognitive aid; a linear cognitive aid; and a branched cognitive aid, and scored for team functioning. Scores were significantly higher with a linear compared with either a branched version of the cognitive aid or no cognitive aid for 'Team Overall Behavioural Performance', difference between study groups (F-value) 5.8, p = 0.01. Aggregate scores were higher with the linear compared with the branched aid design (p = 0.03). Cognitive aids improve co-ordination of the team's activities and support team members to verbalise their actions. A linear design of cognitive aid improves team functioning more than a branched design. PMID:26792648

  14. Battery thermal models for hybrid vehicle simulations

    NASA Astrophysics Data System (ADS)

    Pesaran, Ahmad A.

This paper summarizes battery thermal modeling capabilities for (1) an advanced vehicle simulator (ADVISOR) and (2) battery module and pack thermal design. The National Renewable Energy Laboratory's (NREL's) ADVISOR is developed in the Matlab/Simulink environment. There are several battery models in ADVISOR for various chemistry types. Each of these models requires a thermal model to predict the temperature change that could affect battery performance parameters such as resistance, capacity, and state of charge. A lumped-capacitance battery thermal model was developed in the Matlab/Simulink environment and linked with the ADVISOR battery performance models. For thermal evaluation and design of battery modules and packs, NREL has been using various computer-aided engineering tools, including commercial finite element analysis software. This paper discusses the ADVISOR battery thermal model and its results, along with the results of finite element modeling that were presented at the workshop on "Development of Advanced Battery Engineering Models" in August 2001.
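A lumped-capacitance model treats the whole module as a single thermal mass with an energy balance m·cp·dT/dt = Q_gen − hA·(T − T_air). A minimal forward-Euler sketch of that balance follows; all parameter values are hypothetical, and the actual ADVISOR model lives in Matlab/Simulink, not Python:

```python
def battery_temperature(q_gen, t_air, t0, m, cp, h_a, dt, steps):
    """Integrate the lumped-capacitance energy balance
    m*cp*dT/dt = q_gen - h_a*(T - t_air) with forward Euler.
    q_gen: heat generation (W); h_a: convective conductance hA (W/K);
    m, cp: module mass (kg) and specific heat (J/kg/K)."""
    t = t0
    history = [t]
    for _ in range(steps):
        t += dt * (q_gen - h_a * (t - t_air)) / (m * cp)
        history.append(t)
    return history

# 1 kg module, 20 W of losses, 25 C air: steady state is 25 + 20/2 = 35 C
temps = battery_temperature(q_gen=20.0, t_air=25.0, t0=25.0,
                            m=1.0, cp=800.0, h_a=2.0, dt=1.0, steps=3600)
```

With a thermal time constant m·cp/hA of 400 s, an hour of simulated driving brings the module within a few millikelvin of its 35 °C steady state, which is the kind of temperature trajectory the drive-cycle simulator needs.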

  15. Modeling surgical skill learning with cognitive simulation.

    PubMed

    Park, Shi-Hyun; Suh, Irene H; Chien, Jung-hung; Paik, Jaehyon; Ritter, Frank E; Oleynikov, Dmitry; Siu, Ka-Chun

    2011-01-01

We used a cognitive architecture (ACT-R) to explore the procedural learning of surgical tasks and then to understand the process of perceptual motor learning and skill decay in surgical skill performance. The ACT-R cognitive model simulates declarative memory processes during motor learning. In this ongoing study, four surgical tasks (bimanual carrying, peg transfer, needle passing, and suture tying) were performed using the da Vinci® surgical system. Preliminary results revealed that an ACT-R model produced similar learning effects. Cognitive simulation can be used to demonstrate and optimize perceptual motor learning and skill decay in surgical skill training. PMID:21335834

  16. The Peter Brojde Lung Cancer Centre: a model of integrative practice

    PubMed Central

    Grossman, M.; Agulnik, J.; Batist, G.

    2012-01-01

    Background The generally poor prognosis and poor quality of life for lung cancer patients have highlighted the need for a conceptual model of integrative practice. Although the philosophy of integrative oncology is well described, conceptual models that could guide the implementation and scientific evaluation of integrative practice are lacking. Purpose The present paper describes a conceptual model of integrative practice in which the philosophical underpinnings derive mainly from integrative oncology, with important contributions from Traditional Chinese Medicine (tcm) and the discipline of nursing. The conceptual model is described in terms of its purpose, values, concepts, dynamic components, scientific evidence, clinical approach, and theoretical underpinnings. The model argues that these components delineate the initial scope and orientation of integrative practice. They serve as the needed context for evaluating and interpreting the effectiveness of clinical interventions in enhancing patient outcomes in lung cancer at various phases of the illness. Furthermore, the development of relevant and effective integrative clinical interventions requires new research methods based on whole-systems research. An initial focus would be the identification of interrelationship patterns among variables that influence clinical interventions and their targeted patient outcomes. PMID:22670104

  17. How well do climate models simulate precipitation?

    NASA Astrophysics Data System (ADS)

    Schaller, Nathalie; Mahlstein, Irina; Knutti, Reto; Cermak, Jan

    2010-05-01

This study compares three different methods of evaluating the ability of Atmosphere Ocean General Circulation Models (AOGCMs) to simulate precipitation. Currently, AOGCMs are the most powerful tool for investigating the future climate, but how to evaluate them is a relatively new research field. Thus, no standardized metric of a climate model's skill has been established so far. The common way to proceed is to evaluate the model simulations against observations using statistical measures. However, precipitation is highly variable on both spatial and temporal scales. We therefore suspect that metrics representing regional features of the modelled precipitation response to climate change are more suitable for identifying the good models than statistical measures defined on a global scale. Here, we compare three different ways of ranking the climate models: (a) biases in a broad range of climate variables, (b) only biases in global precipitation, and (c) regional features of modelled precipitation in areas where future changes are expected to be pronounced. Surprisingly, the multimodel mean performs only average in the feature-based ranking, while it outperforms all single models in the two bias-based rankings. In the feature-based ranking, the models performing best can differ for each region or zonal band considered, and identifying them anew for each purpose may allow for more reliable projections. Further, this study reveals that many models have similar biases and that the observation datasets are often located at one end of the model range. Our results suggest that weighting the models according to their ability to simulate the present climate might lead to more reliable projections than the "one model, one vote" approach that has been favored so far.
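A bias-based ranking such as method (b) reduces to computing one skill score per model against observations; the sketch below uses RMSE as the score, with entirely made-up model names and precipitation fields:

```python
def rmse(sim, obs):
    """Root-mean-square error of a simulated field against observations."""
    return (sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs)) ** 0.5

def rank_models(models, obs):
    """Order (name, field) pairs from smallest to largest bias."""
    return sorted(models, key=lambda item: rmse(item[1], obs))

# Hypothetical gridded precipitation, arbitrary units
obs = [2.0, 5.0, 1.0, 3.0]
models = {"A": [2.1, 4.8, 1.2, 3.1], "B": [3.0, 6.5, 0.2, 2.0]}
ranking = [name for name, _ in rank_models(models.items(), obs)]
```

The feature-based ranking (method c) would replace `rmse` with a score computed only over selected regions, which is why the two approaches can order the same models differently.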

  18. Ion selective transistor modelling for behavioural simulations.

    PubMed

    Daniel, M; Janicki, M; Wroblewski, W; Dybko, A; Brzozka, Z; Napieralski, A

    2004-01-01

Computer aided design and simulation of complex silicon microsystems oriented towards environment monitoring requires efficient and accurate models of ion selective sensors that are compatible with existing behavioural simulators. This paper concerns sensors based on back-side contact Ion Sensitive Field Effect Transistors (ISFETs). ISFETs with a silicon nitride gate are sensitive to the hydrogen ion concentration. When the transistor gate is additionally covered with a special ion selective membrane, selectivity to ions other than hydrogen can be achieved. Such sensors are especially suitable for flow analysis of solutions containing various ions. The problem of ion selective sensor modelling is illustrated here with a practical example of an ammonium sensitive membrane. The membrane is investigated in the presence of some interfering ions, and the appropriate selectivity coefficients are determined. Then, a model of the whole sensor is created and used in subsequent electrical simulations. Provided that the appropriate selectivity coefficients are known, the proposed model is applicable to any membrane and can be straightforwardly implemented for behavioural simulation of water monitoring microsystems. The model has already been applied in a real on-line water pollution monitoring system for the detection of various contaminants. PMID:15685987
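Membrane selectivity with interfering ions is conventionally described by the Nikolsky-Eisenman equation, E = E0 + (RT/zᵢF)·ln(aᵢ + Σ Kᵢⱼ·aⱼ^(zᵢ/zⱼ)); a sketch assuming that form follows (the activities and the selectivity coefficient are made-up numbers, not the paper's measured values):

```python
import math

R, F = 8.314, 96485.0  # gas constant (J/mol/K), Faraday constant (C/mol)

def electrode_potential(a_i, z_i, interferents=(), e0=0.0, temp=298.15):
    """Nikolsky-Eisenman membrane potential (V).
    `interferents` is an iterable of (k_ij, a_j, z_j) tuples, where k_ij
    is the selectivity coefficient and a_j, z_j the interferent's
    activity and charge."""
    s = a_i + sum(k_ij * a_j ** (z_i / z_j) for k_ij, a_j, z_j in interferents)
    return e0 + (R * temp) / (z_i * F) * math.log(s)

# Ammonium electrode with potassium interference (hypothetical values)
e_pure = electrode_potential(a_i=1e-3, z_i=1)
e_mixed = electrode_potential(a_i=1e-3, z_i=1, interferents=[(0.1, 1e-3, 1)])
```

For a monovalent ion at 25 °C the model reproduces the expected Nernstian slope of about 59 mV per decade of activity, and an interfering ion shifts the potential upward as if the primary-ion activity were higher, which is the error term flow-analysis systems must account for.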

  19. Damage modeling for Taylor impact simulations

    NASA Astrophysics Data System (ADS)

    Anderson, C. E., Jr.; Chocron, I. S.; Nicholls, A. E.

    2006-08-01

    G. I. Taylor showed that dynamic material properties could be deduced from the impact of a projectile against a rigid boundary. The Taylor anvil test became very useful with the advent of numerical simulations and has been used to infer and/or to validate material constitutive constants. A new experimental facility has been developed to conduct Taylor anvil impacts to support validation of constitutive constants used in simulations. Typically, numerical simulations are conducted assuming 2-D cylindrical symmetry, but such computations cannot hope to capture the damage observed in higher velocity experiments. A computational study was initiated to examine the ability to simulate damage and subsequent deformation of the Taylor specimens. Three-dimensional simulations, using the Johnson-Cook damage model, were conducted with the nonlinear Eulerian wavecode CTH. The results of the simulations are compared to experimental deformations of 6061-T6 aluminum specimens as a function of impact velocity, and conclusions regarding the ability to simulate fracture and reproduce the observed deformations are summarized.
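
    The Johnson-Cook damage model used in these simulations accumulates damage per plastic-strain increment against an equivalent failure strain; a schematic sketch of that bookkeeping (the D1-D5 constants and loading state below are placeholders, not calibrated values for 6061-T6 aluminum):

```python
import math

def jc_failure_strain(triaxiality, rate_ratio, T_star, D1, D2, D3, D4, D5):
    """Johnson-Cook equivalent failure strain:
    eps_f = [D1 + D2*exp(D3*sigma*)] * [1 + D4*ln(eps_dot*)] * [1 + D5*T*],
    where sigma* is the stress triaxiality, eps_dot* the normalized strain
    rate and T* the homologous temperature."""
    return ((D1 + D2 * math.exp(D3 * triaxiality))
            * (1.0 + D4 * math.log(max(rate_ratio, 1e-12)))
            * (1.0 + D5 * T_star))

def accumulate_damage(strain_increments, **consts):
    """Damage D = sum(d_eps_p / eps_f); an element is treated as failed
    once D reaches 1."""
    D = 0.0
    for d_eps in strain_increments:
        D += d_eps / jc_failure_strain(**consts)
        if D >= 1.0:
            break
    return D
```

    In a hydrocode such as CTH the failure strain is re-evaluated every cycle from the local stress state, so the increments are weighted by conditions that change along the loading path.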

  20. Observation simulation experiments with regional prediction models

    NASA Technical Reports Server (NTRS)

    Diak, George; Perkey, Donald J.; Kalb, Michael; Robertson, Franklin R.; Jedlovec, Gary

    1990-01-01

    Research efforts in FY 1990 included studies employing regional scale numerical models as aids in evaluating potential contributions of specific satellite observing systems (current and future) to numerical prediction. One study involves Observing System Simulation Experiments (OSSEs) which mimic operational initialization/forecast cycles but incorporate simulated Advanced Microwave Sounding Unit (AMSU) radiances as input data. The objective of this and related studies is to anticipate the potential value of data from these satellite systems, and develop applications of remotely sensed data for the benefit of short range forecasts. Techniques are also being used that rely on numerical model-based synthetic satellite radiances to interpret the information content of various types of remotely sensed image and sounding products. With this approach, evolution of simulated channel radiance image features can be directly interpreted in terms of the atmospheric dynamical processes depicted by a model. Progress is being made in a study using the internal consistency of a regional prediction model to simplify the assessment of forced diabatic heating and moisture initialization in reducing model spinup times. Techniques for model initialization are being examined, with focus on implications for potential applications of remote microwave observations, including AMSU and Special Sensor Microwave Imager (SSM/I), in shortening model spinup time for regional prediction.

  1. Applying the OSPM model to the calculation of PM10 concentration levels in the historical centre of the city of Thessaloniki

    NASA Astrophysics Data System (ADS)

    Assael, M. J.; Delaki, M.; Kakosimos, K. E.

    In this paper, the OSPM model is employed for the calculation of the PM10 concentration levels in the historical centre of the city of Thessaloniki (Greece). Although measurements of the background concentration are available at a suburban station, and a few measurements of PM10 concentrations do exist at particular areas inside the historical city centre, further assumptions had to be made (e.g., for the traffic load) in order to implement OSPM. To validate this approach, NOx and NO2 measurements were employed in addition to data for PM10. The good agreement observed allowed the prediction of PM10 concentrations in all streets in the historical city centre. The very high PM10 concentration levels obtained in almost all streets are indicative of the city's situation today. Finally, developments in vehicle technology are invoked to model possible future scenarios.

  2. Modeling and Simulation of Nuclear Fuel Materials

    SciTech Connect

    Devanathan, Ram; Van Brutzel, Laurent; Tikare, Veena; Bartel, Timothy; Besmann, Theodore M; Stan, Marius; Van Uffelen, Paul

    2010-01-01

    We review the state of modeling and simulation of nuclear fuels with emphasis on the most widely used nuclear fuel, UO2. The hierarchical scheme presented represents a science-based approach to modeling nuclear fuels by progressively passing information in several stages from ab initio to continuum levels. Such an approach is essential to overcome the challenges posed by radioactive materials handling, experimental limitations in modeling extreme conditions and accident scenarios, and the small time and distance scales of fundamental defect processes. When used in conjunction with experimental validation, this multiscale modeling scheme can provide valuable guidance to development of fuel for advanced reactors to meet rising global energy demand.

  3. Modeling and Simulation of Nuclear Fuel Materials

    SciTech Connect

    Devanathan, Ramaswami; Van Brutzel, Laurent; Chartier, Alan; Gueneau, Christine; Mattsson, Ann E.; Tikare, Veena; Bartel, Timothy; Besmann, T. M.; Stan, Marius; Van Uffelen, Paul

    2010-10-01

    We review the state of modeling and simulation of nuclear fuels with emphasis on the most widely used nuclear fuel, UO2. The hierarchical scheme presented represents a science-based approach to modeling nuclear fuels by progressively passing information in several stages from ab initio to continuum levels. Such an approach is essential to overcome the challenges posed by radioactive materials handling, experimental limitations in modeling extreme conditions and accident scenarios, and the small time and distance scales of fundamental defect processes. When used in conjunction with experimental validation, this multiscale modeling scheme can provide valuable guidance to development of fuel for advanced reactors to meet rising global energy demand.

  4. Models, Simulations, and Games: A Survey.

    ERIC Educational Resources Information Center

    Shubik, Martin; Brewer, Garry D.

    A Rand evaluation of activity and products of gaming, model-building, and simulation carried out under the auspices of the Defense Advanced Research Projects Agency aimed not only to assess the usefulness of gaming in military-political policymaking, but also to contribute to the definition of common standards and the refinement of objectives for…

  5. Center for Advanced Modeling and Simulation Intern

    SciTech Connect

    Gertman, Vanessa

    2010-01-01

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  6. Love Kills: Simulations in Penna Ageing Model

    NASA Astrophysics Data System (ADS)

    Stauffer, Dietrich; Cebrat, Stanisław; Penna, T. J. P.; Sousa, A. O.

    The standard Penna ageing model with sexual reproduction is enlarged by adding additional bit-strings for love: Marriage happens only if the male love strings are sufficiently different from the female ones. We simulate at what level of required difference the population dies out.

  7. Teaching Environmental Systems Modelling Using Computer Simulation.

    ERIC Educational Resources Information Center

    Moffatt, Ian

    1986-01-01

    A computer modeling course in environmental systems and dynamics is presented. The course teaches senior undergraduates to analyze a system of interest, construct a system flow chart, and write computer programs to simulate real world environmental processes. An example is presented along with a course evaluation, figures, tables, and references.…

  8. Center for Advanced Modeling and Simulation Intern

    ScienceCinema

    Gertman, Vanessa

    2013-05-28

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  9. Adaptive System Modeling for Spacecraft Simulation

    NASA Technical Reports Server (NTRS)

    Thomas, Justin

    2011-01-01

    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how someone can create accurate system models from sensor (telemetry) data. The purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models. The software also provides the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: Creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior. Automatically updates/calibrates system models using the latest streaming sensor data. Creates device specific models that capture the exact behavior of devices of the same type. Adapts to evolving systems. Can reduce computational complexity (faster simulations).

  10. Twitter's tweet method modelling and simulation

    NASA Astrophysics Data System (ADS)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    This paper proposes the concept of Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper has leveraged the system dynamics paradigm to model Twitter marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The following models have been developed for a Twitter marketing agent/company and tested in real circumstances with real numbers. These models were finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The paper also addresses the methods best suited to organized promotion through targeting on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are authenticated by the management of the company organization. The paper implements system dynamics concepts of Twitter marketing method modelling and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.

  11. Numerical simulations and modeling of turbulent combustion

    NASA Astrophysics Data System (ADS)

    Cuenot, B.

    Turbulent combustion is the basic physical phenomenon responsible for efficient energy release in any internal combustion engine. However, it is accompanied by other, undesirable phenomena such as noise, pollutant species emission or damaging instabilities that may even lead to the destruction of the system. It is therefore crucial to control this phenomenon, to understand all its mechanisms and to master it in industrial systems. For a long time turbulent combustion was explored only through theory and experiment, but the rapid increase in computing power over recent years has allowed an important development of numerical simulation, which has become an essential tool for research and technical design. Direct numerical simulation has allowed rapid progress in the knowledge of turbulent flame structures, leading to new models for steady averaged simulations. Recently, large eddy simulation has made a further step forward by refining the description of complex and unsteady flames. The main problem that arises when performing numerical simulation of turbulent combustion is the description of the flame front. Although very thin, it cannot be reduced to a simple interface, as it is the location of intense chemical transformation and of strong variations in thermodynamic quantities. Since capturing the internal structure of a zone about 0.1 mm thick is impossible in a computation whose mesh step is 10 times larger, it is necessary to model the turbulent flame. Models depend on the chemical structure of the flame, on the ambient turbulence, on the combustion regime (flamelets, distributed combustion, etc.) and on the reactant injection mode (premixed or not). One therefore finds a large class of models, from the simplest algebraic model with one-step chemical kinetics to the most complex models involving probability density functions, cross-correlations and multiple-step or fully complex chemical kinetics.

  12. Advances in NLTE modeling for integrated simulations

    NASA Astrophysics Data System (ADS)

    Scott, H. A.; Hansen, S. B.

    2010-01-01

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different atomic species for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly-excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with sufficient accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short time steps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

  13. Computational Spectrum of Agent Model Simulation

    SciTech Connect

    Perumalla, Kalyan S

    2010-01-01

    The study of human social behavioral systems is finding renewed interest in military, homeland security and other applications. Simulation is the most generally applied approach to studying complex scenarios in such systems. Here, we outline some of the important considerations that underlie the computational aspects of simulation-based study of human social systems. The fundamental imprecision underlying questions and answers in social science makes it necessary to carefully distinguish among different simulation problem classes and to identify the most pertinent set of computational dimensions associated with those classes. We identify a few such classes and present their computational implications. The focus is then shifted to the most challenging combinations in the computational spectrum, namely, large-scale entity counts at moderate to high levels of fidelity. Recent developments in furthering the state-of-the-art in these challenging cases are outlined. A case study of large-scale agent simulation is provided in simulating large numbers (millions) of social entities at real-time speeds on inexpensive hardware. Recent computational results are identified that highlight the potential of modern high-end computing platforms to push the envelope with respect to speed, scale and fidelity of social system simulations. Finally, the problem of shielding the modeler or domain expert from the complex computational aspects is discussed and a few potential solution approaches are identified.

  14. Modeling of protein loops by simulated annealing.

    PubMed Central

    Collura, V.; Higo, J.; Garnier, J.

    1993-01-01

    A method is presented to model loops of proteins for use in homology modeling. This method employs the ESAP program of Higo et al. (Higo, J., Collura, V., & Garnier, J., 1992, Biopolymers 32, 33-43) and is based on a fast Monte Carlo simulation and a simulated annealing algorithm. The method is tested on different loops or peptide segments from immunoglobulin, bovine pancreatic trypsin inhibitor, and bovine trypsin. The predicted structure is obtained from the ensemble average of the coordinates of the Monte Carlo simulation at 300 K, which exhibits the lowest internal energy. The starting conformation of the loop prior to modeling is chosen to be completely extended, and a closing harmonic potential is applied to N, CA, C, and O atoms of the terminal residues. A rigid geometry potential of Robson and Platt (1986, J. Mol. Biol. 188, 259-281) with a united atom representation is used. We demonstrate that this yields a loop structure with good hydrogen bonding and torsion angles in the allowed regions of the Ramachandran map. The average accuracy of the modeling, evaluated on the eight modeled loops, is 1 Å root mean square deviation (rmsd) for the backbone atoms and 2.3 Å rmsd for all heavy atoms. PMID:8401234
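
    The Metropolis/simulated-annealing core of such a protocol can be sketched generically (a toy stand-in with an arbitrary energy function and move set, not the ESAP force field or its actual moves):

```python
import math
import random

def simulated_annealing(energy, x0, step, T0=1.0, T_end=1e-3, n_steps=5000, seed=0):
    """Generic Metropolis simulated annealing: accept uphill moves with
    probability exp(-dE/T) while the temperature is cooled geometrically,
    and keep track of the lowest-energy conformation seen."""
    rng = random.Random(seed)
    x, e = list(x0), energy(x0)
    best_x, best_e = list(x), e
    cool = (T_end / T0) ** (1.0 / n_steps)
    T = T0
    for _ in range(n_steps):
        cand = step(x, rng)
        e_cand = energy(cand)
        if e_cand <= e or rng.random() < math.exp(-(e_cand - e) / T):
            x, e = cand, e_cand
            if e < best_e:
                best_x, best_e = list(x), e
        T *= cool
    return best_x, best_e
```

    In the loop-modelling setting, `x` would hold the loop's torsion angles and `energy` the rigid-geometry potential plus the closing harmonic restraint on the terminal residues.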

  15. Robust three-body water simulation model

    NASA Astrophysics Data System (ADS)

    Tainter, C. J.; Pieniazek, P. A.; Lin, Y.-S.; Skinner, J. L.

    2011-05-01

    The most common potentials used in classical simulations of liquid water assume a pairwise additive form. Although these models have been very successful in reproducing many properties of liquid water at ambient conditions, none is able to describe accurately water throughout its complicated phase diagram. The primary reason for this is the neglect of many-body interactions. To this end, a simulation model with explicit three-body interactions was introduced recently [R. Kumar and J. L. Skinner, J. Phys. Chem. B 112, 8311 (2008), 10.1021/jp8009468]. This model was parameterized to fit the experimental O-O radial distribution function and diffusion constant. Herein we reparameterize the model, fitting to a wider range of experimental properties (diffusion constant, rotational correlation time, density for the liquid, liquid/vapor surface tension, melting point, and the ice Ih density). The robustness of the model is then verified by comparing simulation to experiment for a number of other quantities (enthalpy of vaporization, dielectric constant, Debye relaxation time, temperature of maximum density, and the temperature-dependent second and third virial coefficients), with good agreement.

  16. Fault diagnosis based on continuous simulation models

    NASA Technical Reports Server (NTRS)

    Feyock, Stefan

    1987-01-01

    The results are described of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like that? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.

  17. Electronic continuum model for molecular dynamics simulations.

    PubMed

    Leontyev, I V; Stuchebrukhov, A A

    2009-02-28

    A simple model for accounting for electronic polarization in molecular dynamics (MD) simulations is discussed. In this model, called molecular dynamics electronic continuum (MDEC), the electronic polarization is treated explicitly in terms of the electronic continuum (EC) approximation, while the nuclear dynamics is described with a fixed-charge force field. In such a force field, all atomic charges are scaled to reflect the screening effect of the electronic continuum. The MDEC model is similar but not equivalent to standard nonpolarizable force fields; the differences are discussed. Of particular interest is the calculation of the electrostatic part of the solvation energy using standard nonpolarizable MD simulations. In a low-dielectric environment, such as a protein, the standard MD approach produces qualitatively wrong results. The difficulty lies in the mistreatment of the electronic polarizability. We show how the results can be much improved using the MDEC approach. We also show how the dielectric constant of the medium obtained in an MD simulation with a nonpolarizable force field is related to the static (total) dielectric constant, which includes both the nuclear and electronic relaxation effects. Using the MDEC model, we discuss recent calculations of dielectric constants of alcohols and alkanes, and show that the MDEC results are comparable with those obtained with the polarizable Drude oscillator model. The applicability of the method to calculations of dielectric properties of proteins is discussed. PMID:19256627
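
    The charge scaling at the heart of the MDEC idea can be sketched in a few lines (the reduced units and the value of the electronic dielectric constant below are illustrative):

```python
import math

EPS_EL = 2.0  # electronic (high-frequency) dielectric constant, roughly 2 for many media

def mdec_scale_charges(charges, eps_el=EPS_EL):
    """Scale fixed force-field charges as q -> q / sqrt(eps_el) so that
    pairwise Coulomb energies are screened by the electronic continuum."""
    s = 1.0 / math.sqrt(eps_el)
    return [q * s for q in charges]

def coulomb_energy(q1, q2, r):
    """Pairwise Coulomb energy in reduced units (q1*q2/r)."""
    return q1 * q2 / r
```

    With both charges scaled, the interaction energy becomes q1*q2/(eps_el*r), i.e. the vacuum value divided by eps_el, which is how a nonpolarizable force field can mimic electronic screening.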

  18. Flight Simulation Model Exchange. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce

    2011-01-01

    The NASA Engineering and Safety Center Review Board sponsored an assessment of the draft Standard, Flight Dynamics Model Exchange Standard, BSR/ANSI-S-119-201x (S-119) that was conducted by simulation and guidance, navigation, and control engineers from several NASA Centers. The assessment team reviewed the conventions and formats spelled out in the draft Standard and the actual implementation of two example aerodynamic models (a subsonic F-16 and the HL-20 lifting body) encoded in the Extensible Markup Language grammar. During the implementation, the team kept records of lessons learned and provided feedback to the American Institute of Aeronautics and Astronautics Modeling and Simulation Technical Committee representative. This document contains the appendices to the main report.

  19. Flight Simulation Model Exchange. Volume 1

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce

    2011-01-01

    The NASA Engineering and Safety Center Review Board sponsored an assessment of the draft Standard, Flight Dynamics Model Exchange Standard, BSR/ANSI-S-119-201x (S-119) that was conducted by simulation and guidance, navigation, and control engineers from several NASA Centers. The assessment team reviewed the conventions and formats spelled out in the draft Standard and the actual implementation of two example aerodynamic models (a subsonic F-16 and the HL-20 lifting body) encoded in the Extensible Markup Language grammar. During the implementation, the team kept records of lessons learned and provided feedback to the American Institute of Aeronautics and Astronautics Modeling and Simulation Technical Committee representative. This document contains the results of the assessment.

  20. Atmospheric Modeling And Sensor Simulation (AMASS) study

    NASA Technical Reports Server (NTRS)

    Parker, K. G.

    1984-01-01

    The capabilities of the atmospheric modeling and sensor simulation (AMASS) system were studied in order to enhance them. This system is used in processing atmospheric measurements which are utilized in the evaluation of sensor performance, conducting design-concept simulation studies, and also in the modeling of the physical and dynamical nature of atmospheric processes. The study tasks proposed in order to both enhance the AMASS system utilization and to integrate the AMASS system with other existing equipment to facilitate the analysis of data for modeling and image processing are enumerated. The following array processors were evaluated for anticipated effectiveness and/or improvements in throughput by attachment of the device to the P-e: (1) Floating Point Systems AP-120B; (2) Floating Point Systems 5000; (3) CSP, Inc. MAP-400; (4) Analogic AP500; (5) Numerix MARS-432; and (6) Star Technologies, Inc. ST-100.

  1. Computational model for protein unfolding simulation

    NASA Astrophysics Data System (ADS)

    Tian, Xu-Hong; Zheng, Ye-Han; Jiao, Xiong; Liu, Cai-Xing; Chang, Shan

    2011-06-01

    The protein folding problem is one of the fundamental and important questions in molecular biology. However, the all-atom molecular dynamics studies of protein folding and unfolding are still computationally expensive and severely limited by the time scale of simulation. In this paper, a simple and fast protein unfolding method is proposed based on the conformational stability analyses and structure modeling. In this method, two structure-based conditions are considered to identify the unstable regions of proteins during the unfolding processes. The protein unfolding trajectories are mimicked through iterative structure modeling according to conformational stability analyses. Two proteins, chymotrypsin inhibitor 2 (CI2) and α-spectrin SH3 domain (SH3) were simulated by this method. Their unfolding pathways are consistent with the previous molecular dynamics simulations. Furthermore, the transition states of the two proteins were identified in unfolding processes and the theoretical Φ values of these transition states showed significant correlations with the experimental data (the correlation coefficients are >0.8). The results indicate that this method is effective in studying protein unfolding. Moreover, we analyzed and discussed the influence of parameters on the unfolding simulation. This simple coarse-grained model may provide a general and fast approach for the mechanism studies of protein folding.

  2. Compressible homogeneous shear: Simulation and modeling

    NASA Technical Reports Server (NTRS)

    Sarkar, S.; Erlebacher, G.; Hussaini, M. Y.

    1992-01-01

    Compressibility effects were studied on turbulence by direct numerical simulation of homogeneous shear flow. A primary observation is that the growth of the turbulent kinetic energy decreases with increasing turbulent Mach number. The sinks provided by compressible dissipation and the pressure dilatation, along with reduced Reynolds shear stress, are shown to contribute to the reduced growth of kinetic energy. Models are proposed for these dilatational terms and verified by direct comparison with the simulations. The differences between the incompressible and compressible fields are brought out by the examination of spectra, statistical moments, and structure of the rate of strain tensor.

  3. Blast furnace on-line simulation model

    NASA Astrophysics Data System (ADS)

    Saxén, Henrik

    1990-10-01

    A mathematical model of the ironmaking blast furnace (BF) is presented. The model describes the steady-state operation of the furnace in one spatial dimension using real process data sampled at the steelworks. The measurement data are reconciled by an interface routine which yields boundary conditions obeying the conservation laws of atoms and energy. The simulation model, which provides a picture of the internal conditions of the BF, can be used to evaluate the current state of the process and to predict the effect of operating actions on the performance of the furnace.

  4. Enhancing Social Competence and the Child-Teacher Relationship Using a Child-Centred Play Training Model in Hong Kong Preschools

    ERIC Educational Resources Information Center

    Leung, Chi-hung

    2015-01-01

    The purpose of this study was to examine whether a child-centred play training model, filial play therapy, enhances child-teacher relationship and thereby reduces children's internalising problems (such as anxiety/depression and withdrawal) and externalising problems (such as aggressive and destructive behaviour). Sixty teachers (n = 60) and 60…

  5. GLAST Burst Monitor Instrument Simulation and Modeling

    SciTech Connect

    Hoover, A. S.; Kippen, R. M.; Wallace, M. S.; Pendleton, G. N.; Fishman, G. J.; Meegan, C. A.; Kouveliotou, C.; Wilson-Hodge, C. A.; Bhat, P. N.; Briggs, M. S.; Connaughton, V.; Paciesas, W. S.; Preece, R. D.

    2008-05-22

    The GLAST Burst Monitor (GBM) is designed to provide wide field of view observations of gamma-ray bursts and other fast transient sources in the energy range 10 keV to 30 MeV. The GBM is composed of several unshielded and uncollimated scintillation detectors (twelve NaI and two BGO) that are widely dispersed about the GLAST spacecraft. As a result, reconstructing source locations, energy spectra, and temporal properties from GBM data requires detailed knowledge of the detectors' response to both direct radiation as well as that scattered from the spacecraft and Earth's atmosphere. This full GBM instrument response will be captured in the form of a response function database that is derived from computer modeling and simulation. The simulation system is based on the GEANT4 Monte Carlo radiation transport simulation toolset.

  6. Facebook's personal page modelling and simulation

    NASA Astrophysics Data System (ADS)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we try to define the utility of Facebook's Personal Page marketing method. This tool, which Facebook provides, is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper has leveraged the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The following model has been developed for a social media marketing agent/company, oriented to the Facebook platform and tested in real circumstances. This model was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are authenticated by the management of the company organization. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which is to attract new customers, keep the interest of existing customers and deliver traffic to its website.

  7. Optimisation Strategies for Modelling and Simulation

    NASA Astrophysics Data System (ADS)

    Louchet, Jean

    2007-12-01

    Progress in computation techniques has been dramatically reducing the gap between modeling and simulation. Simulation, as the natural outcome of modeling, is used as a tool to predict the behavior of natural or artificial systems, a tool to validate models, and a tool to build and refine models - in particular, to identify model internal parameters. In this paper we concentrate upon the latter, model building and identification using modern optimization techniques, through application examples taken from the digital imaging field. The first example is given by image processing, with retrieval of known patterns in an image. The second example is taken from synthetic image animation: we show how it is possible to learn a model's internal physical parameters from actual trajectory examples, using Darwin-inspired evolutionary algorithms. In the third example, we demonstrate how it is possible, when the problem cannot easily be handled by a reasonably simple optimization technique, to split the problem into simpler elements which can be efficiently evolved by an evolutionary optimization algorithm - an approach now called "Parisian Evolution". The "Fly algorithm" is a real-time stereovision algorithm which skips the conventional preliminary stages of image processing, now applied in mobile robotics and medical imaging. The main question remaining is: to what degree is it possible to delegate to a computer a part of the physicist's role, which is to collect examples and build general laws from these examples?

  8. Towards Better Coupling of Hydrological Simulation Models

    NASA Astrophysics Data System (ADS)

    Penton, D.; Stenson, M.; Leighton, B.; Bridgart, R.

    2012-12-01

    Standards for model interoperability and scientific workflow software provide techniques and tools for coupling hydrological simulation models. However, model builders are yet to realize the benefits of these and continue to write ad hoc implementations and scripts. Three case studies demonstrate different approaches to coupling models, the first using tight interfaces (OpenMI), the second using a scientific workflow system (Trident) and the third using a tailored execution engine (Delft Flood Early Warning System - Delft-FEWS). No approach was objectively better than any other approach. The foremost standard for coupling hydrological models is the Open Modeling Interface (OpenMI), which defines interfaces for models to interact. An implementation of the OpenMI standard involves defining interchange terms and writing a .NET/Java wrapper around the model. An execution wrapper such as OatC.GUI or Pipistrelle executes the models. The team built two OpenMI implementations for eWater Source river system models. Once built, it was easy to swap river system models. The team encountered technical challenges with versions of the .Net framework (3.5 calling 4.0) and with the performance of the execution wrappers when running daily simulations. By design, the OpenMI interfaces are general, leaving significant decisions around the semantics of the interfaces to the implementer. Increasingly, scientific workflow tools such as Kepler, Taverna and Trident are able to replace custom scripts. These tools aim to improve the provenance and reproducibility of processing tasks. In particular, Taverna and the myExperiment website have had success making many bioinformatics workflows reusable and sharable. The team constructed Trident activities for hydrological software including IQQM, REALM and eWater Source. They built an activity generator for model builders to build activities for particular river systems. The models were linked at a simulation level, without any daily time

  9. Theory, modeling and simulation: Annual report 1993

    SciTech Connect

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  10. eShopper modeling and simulation

    NASA Astrophysics Data System (ADS)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce gives an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as the Blue Martini server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross-selling are arriving. The key problem is what kind of information we need to achieve these goals, or in other words, how do we model the customer? The paper is devoted to customer modeling and simulation. The focus is on modeling an individual customer. The model is based on the customer's transaction data, click-stream data, and demographics. The model includes the hierarchical profile of a customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, price sensitivity, etc. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast the shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for on-line direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating the customer's features.

  11. The University of the First Age Extended Learning Centres: A Model of Study Support for the New Millennium?

    ERIC Educational Resources Information Center

    Burgess, Sarah

    2000-01-01

    In Britain, University of the First Age Extended Learning Centres use multiple intelligence theory and brain research to provide enrichment activities beyond school hours. Multisensory learning environments, a broad-based learning team, and peer tutoring are featured. (SK)

  12. Modelling and simulation of virtual Mars scene

    NASA Astrophysics Data System (ADS)

    Sun, Si-liang; Chen, Ren; Sun, Li; Yan, Jie

    2011-08-01

    Human knowledge of the universe remains limited. Motivated by the impending need for Mars exploration in the near future, and starting from a three-dimensional (3D) Mars model, a Mars texture based on several real photographs was drawn, and the bump mapping technique was used to enhance the realism of the rendering. To improve simulation fidelity, the composition of the Martian atmosphere was discussed, the cause of atmospheric scattering was investigated, and the scattering algorithm was studied and computed. The reasons why "red storms" frequently appear on Mars were detailed, since these factors inevitably alter the planet's appearance. To address this problem, two methods depending on the position of the viewpoint (a universe viewpoint and a terrestrial viewpoint) were proposed. In the first, the 3D model was divided into separate meshes to simulate the storm effect, and a formula allowing a mesh to rotate about an arbitrary axis was derived; to a certain extent this model guarantees the rendering result when Mars (with a red storm) is viewed from space. In the second, a 3D Martian terrain scene was built from imagery obtained from "Google Mars", a particle system was used to simulate the storm effect, and the billboard technique was then applied for colour correction and rendering compensation. Finally, a star-field simulation based on multiple texture blending is given. Experimental results show that these methods not only give a substantial increase in fidelity but also guarantee real-time rendering. They can be widely used in simulations of space battlefields and exploration tasks.

  13. Automatic determination of the transition between successive control mechanisms in upright stance assessed by modelling of the centre of pressure.

    PubMed

    Rougier, P

    1999-02-01

    A recently introduced concept models the trajectory of the centre of pressure as a fractional Brownian motion and reveals that two successive scaling regimes, acting hypothetically as open- and closed-loop mechanisms, are implicated in posture control. Objectivity is obviously required in the determination of the transition point, i.e. the point at which an open-loop control mechanism would switch to a closed-loop one, in order to provide reproducibility and automatism in the processing of data. In the method proposed herein, the transition point corresponds to the maximal distance separating a diffusion curve in a double logarithmic plot (mean square distances MSD calculated on each axis versus increasing time intervals Δt) from a straight line characterising a pure stochastic behaviour. In closed-eye conditions, the switch appears medio-laterally in a 0.26-0.52 s range for Δt, the corresponding MSD being in the range of 1.86-10.50 mm(2). In the forward-backward direction, the transition is in a 0.28-0.42 s range and the corresponding MSD is between 3.60 and 15.17 mm(2). Finally, these co-ordinates induce scaling exponents over 0.50 for the shortest Δt, thus suggesting open-loop control, whereas those of the longest Δt, ranging between 0 and 0.20, give evidence of closed-loop control. These data are compared with previous data based upon empirical methods. PMID:10455557
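The transition-point criterion described in this record (the point of the log-log diffusion curve farthest from a straight line) is straightforward to implement. The sketch below is a minimal illustration in Python, assuming a uniformly sampled one-axis centre-of-pressure trace; the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def transition_point(cop, dt=0.01, max_lag_s=1.0):
    """Locate the hypothesised open-/closed-loop transition in a
    centre-of-pressure trace: on the log-log diffusion plot of mean
    square displacement versus time interval, take the point of
    maximal distance from the chord joining the curve's endpoints.
    Returns (transition time in s, MSD at the transition)."""
    lags = np.arange(1, int(max_lag_s / dt))
    msd = np.array([np.mean((cop[k:] - cop[:-k]) ** 2) for k in lags])
    x, y = np.log(lags * dt), np.log(msd)
    # perpendicular distance of each point from the endpoint chord
    x0, y0, x1, y1 = x[0], y[0], x[-1], y[-1]
    d = np.abs((y1 - y0) * x - (x1 - x0) * y + x1 * y0 - y1 * x0)
    d /= np.hypot(y1 - y0, x1 - x0)
    k = int(np.argmax(d))
    return lags[k] * dt, msd[k]
```

In practice the same computation would be applied separately to the medio-lateral and antero-posterior axes, as in the abstract.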

  14. Simulation model of clastic sedimentary processes

    SciTech Connect

    Tetzlaff, D.M.

    1987-01-01

    This dissertation describes SEDSIM, a computer model that simulates erosion, transport, and deposition of clastic sediments by free-surface flow in natural environments. SEDSIM is deterministic and is applicable to sedimentary processes in rivers, deltas, continental shelves, submarine canyons, and turbidite fans. The model is used to perform experiments in clastic sedimentation. Computer experimentation is limited by the computing power available, but is free from the scaling problems associated with laboratory experiments. SEDSIM responds to information provided to it at the outset of a simulation experiment, including topography, subsurface configuration, physical parameters of fluid and sediment, and characteristics of sediment sources. Extensive computer graphics are incorporated in SEDSIM. The user can display the three-dimensional geometry of simulated deposits in the form of successions of contour maps, perspective diagrams, vector plots of current velocities, and vertical sections of any azimuth orientation. The sections show both sediment age and composition. SEDSIM works realistically with processes involving channel shifting and topographic changes. Example applications include simulation of an ancient submarine canyon carved into a Cretaceous sequence in the National Petroleum Reserve in Alaska, known mainly from seismic sections, and of a sequence of Tertiary age in the Golden Meadow oil field of Louisiana, known principally from well logs.

  15. Consequence modeling using the fire dynamics simulator.

    PubMed

    Ryder, Noah L; Sutula, Jason A; Schemel, Christopher F; Hamer, Andrew J; Van Brunt, Vincent

    2004-11-11

    The use of Computational Fluid Dynamics (CFD) and in particular Large Eddy Simulation (LES) codes to model fires provides an efficient tool for the prediction of large-scale effects that include plume characteristics, combustion product dispersion, and heat effects to adjacent objects. This paper illustrates the strengths of the Fire Dynamics Simulator (FDS), an LES code developed by the National Institute of Standards and Technology (NIST), through several small and large-scale validation runs and process safety applications. The paper presents two fire experiments--a small room fire and a large (15 m diameter) pool fire. The model results are compared to experimental data and demonstrate good agreement between the models and data. The validation work is then extended to demonstrate applicability to process safety concerns by detailing a model of a tank farm fire and a model of the ignition of a gaseous fuel in a confined space. In this simulation, a room was filled with propane, given time to disperse, and was then ignited. The model yields accurate results for the dispersion of the gas throughout the space. This information can be used to determine flammability and explosive limits in a space and can be used in subsequent models to determine the pressure and temperature waves that would result from an explosion. The model dispersion results were compared to an experiment performed by Factory Mutual. Using the above examples, this paper will demonstrate that FDS is ideally suited to build realistic models of process geometries in which large-scale explosion and fire failure risks can be evaluated, with several distinct advantages over more traditional CFD codes. Namely, transient solutions to fire and explosion growth can be produced with less sophisticated hardware (lower cost) than needed for traditional CFD codes (PC-type computer versus UNIX workstation) and can be solved for longer time histories (on the order of hundreds of seconds of computed time) with

  16. Dementia service centres in Austria: A comprehensive support and early detection model for persons with dementia and their caregivers – theoretical foundations and model description

    PubMed Central

    Span, Edith; Reisberg, Barry

    2015-01-01

    Despite the highly developed social services in Austria, the County of Upper Austria, one of the nine counties of Austria, had only very limited specialized services for persons with dementia and their caregivers in 2001. Support groups existed in which the desire for more specialized services was voiced. In response to this situation, funding was received to develop a new structure for early disease detection and long-term support for both the person with dementia and their caregivers. This article describes the development of the model of the Dementia Service Centres (DSCs) and the successes and difficulties encountered in the process of implementing the model in six different rural regions of Upper Austria. The DSC was described in the First Austrian Dementia Report as one of the potential service models for the future. PMID:24339114

  17. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S Environments and Infrastructure.

  18. High-Fidelity Roadway Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Wang, Jie; Papelis, Yiannis; Shen, Yuzhong; Unal, Ozhan; Cetin, Mecit

    2010-01-01

    Roads are an essential feature in our daily lives. With the advances in computing technologies, 2D and 3D road models are employed in many applications, such as computer games and virtual environments. Traditional road models were generated manually by professional artists using modeling software tools such as Maya and 3ds Max. This approach requires both highly specialized and sophisticated skills and massive manual labor. Automatic road generation based on procedural modeling can create road models using specially designed computer algorithms or procedures, dramatically reducing the tedious manual editing needed for road modeling. But most existing procedural modeling methods for road generation put emphasis on the visual effects of the generated roads, not their geometrical and architectural fidelity. This limitation seriously restricts the applicability of the generated road models. To address this problem, this paper proposes a high-fidelity roadway generation method that takes into account road design principles practiced by civil engineering professionals; as a result, the generated roads can support not only general applications such as games and simulations, in which roads are used as 3D assets, but also demanding civil engineering applications, which require accurate geometrical models of roads. The inputs to the proposed method include road specifications, civil engineering road design rules, terrain information, and the surrounding environment. The proposed method then generates, in real time, 3D roads that have both high visual and geometrical fidelity. This paper discusses in detail the procedures that convert 2D roads specified in shape files into 3D roads, as well as the civil engineering road design principles involved. The proposed method can be used in many applications that have stringent requirements on high-precision 3D models, such as driving simulations and road design prototyping. Preliminary results demonstrate the effectiveness of the proposed method.
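The core geometric step in such a pipeline, sweeping a road cross-section along a 2D centerline draped over terrain, can be sketched as follows. This is an illustrative simplification under stated assumptions (constant width, terrain height given by a callable), not the authors' implementation; all names and inputs are assumptions.

```python
import numpy as np

def road_mesh(centerline, width, elevation):
    """Build triangle-strip vertices for a 3D road surface from a 2D
    centerline. `centerline` is an (N, 2) array of (x, y) points and
    `elevation(x, y)` samples the terrain height at a point."""
    c = np.asarray(centerline, dtype=float)
    # unit tangents via central differences, then left-hand normals
    t = np.gradient(c, axis=0)
    t /= np.linalg.norm(t, axis=1, keepdims=True)
    n = np.stack([-t[:, 1], t[:, 0]], axis=1)
    left = c + 0.5 * width * n
    right = c - 0.5 * width * n
    z = lambda pts: np.array([elevation(x, y) for x, y in pts])
    lv = np.column_stack([left, z(left)])
    rv = np.column_stack([right, z(right)])
    # interleave left/right vertices so consecutive triples form triangles
    verts = np.empty((2 * len(c), 3))
    verts[0::2], verts[1::2] = lv, rv
    return verts
```

A production pipeline would additionally apply superelevation, curvature limits, and the other civil engineering design rules the abstract mentions; this sketch covers only the centerline-to-mesh conversion.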

  19. Simulation of model swimmers near ciliated surfaces

    NASA Astrophysics Data System (ADS)

    Shum, Henry; Tripathi, Anurag; Yeomans, Julia; Balazs, Anna

    2013-03-01

    Biofouling by micro-organisms is problematic on scales from microfluidic devices to the largest ships in the ocean. One solution found in nature for clearing undesired material from surfaces is to employ active cilia, for example, in the respiratory tract. It is feasible to fabricate surfaces covered with artificial cilia actuated by an externally imposed field. Using numerical simulation, we investigate the interactions between these artificial cilia and self-propelled model swimmers. One of the key aims is to explore the possibility of steering swimmers to influence their trajectories through the flow field produced by the cilia. In our simulations, the fluid dynamics is solved using the lattice Boltzmann method while the cilia and model swimmers are governed by elastic internal mechanics. We implement an immersed boundary approach to couple the solid and fluid dynamics.

  20. Model parameters for simulation of physiological lipids.

    PubMed

    Hills, Ronald D; McGlinchey, Nicholas

    2016-05-01

    Coarse grain simulation of proteins in their physiological membrane environment can offer insight across timescales, but requires a comprehensive force field. Parameters are explored for multicomponent bilayers composed of the unsaturated lipids DOPC and DOPE, the mixed-saturation lipids POPC and POPE, and anionic lipids found in bacteria: POPG and cardiolipin. A nonbond representation obtained from multiscale force matching is adapted for these lipids and combined with an improved bonding description of cholesterol. Equilibrating the area per lipid yields robust bilayer simulations and properties for common lipid mixtures, with the exception of pure DOPE, which has a known tendency to form a nonlamellar phase. The models maintain consistency with an existing lipid-protein interaction model, making the force field of general utility for studying membrane proteins in physiologically representative bilayers. © 2016 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:26864972

  1. Refined Transistor Model For Simulation Of SEU

    NASA Technical Reports Server (NTRS)

    Zoutendyk, John A.; Benumof, Reuben

    1988-01-01

    Equivalent base resistance added. Theoretical study develops equations for parameters of Gummel-Poon model of bipolar junction transistor: includes saturation current, amplification factors, charging times, knee currents, capacitances, and resistances. Portion of study concerned with base region goes beyond Gummel-Poon analysis to provide more complete understanding of transistor behavior. Extended theory useful in simulation of single-event upset (SEU) caused in logic circuits by cosmic rays or other ionizing radiation.

  2. Theory, Modeling and Simulation Annual Report 2000

    SciTech Connect

    Dixon, David A; Garrett, Bruce C; Straatsma, TP; Jones, Donald R; Studham, Scott; Harrison, Robert J; Nichols, Jeffrey A

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM and S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  3. Theory, Modeling and Simulation Annual Report 2000

    SciTech Connect

    Dixon, David A.; Garrett, Bruce C.; Straatsma, Tp; Jones, Donald R.; Studham, Ronald S.; Harrison, Robert J.; Nichols, Jeffrey A.

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  4. Simulation and modeling of homogeneous, compressed turbulence

    NASA Astrophysics Data System (ADS)

    Wu, C. T.; Ferziger, J. H.; Chapman, D. R.

    1985-05-01

    Low Reynolds number homogeneous turbulence undergoing low Mach number isotropic and one-dimensional compression was simulated by numerically solving the Navier-Stokes equations. The numerical simulations were performed on a CYBER 205 computer using a 64 x 64 x 64 mesh. A spectral method was used for spatial differencing and the second-order Runge-Kutta method for time advancement. A variety of statistical information was extracted from the computed flow fields. These include three-dimensional energy and dissipation spectra, two-point velocity correlations, one-dimensional energy spectra, turbulent kinetic energy and its dissipation rate, integral length scales, Taylor microscales, and Kolmogorov length scale. Results from the simulated flow fields were used to test one-point closure, two-equation models. A new one-point-closure, three-equation turbulence model which accounts for the effect of compression is proposed. The new model accurately calculates four types of flows (isotropic decay, isotropic compression, one-dimensional compression, and axisymmetric expansion flows) for a wide range of strain rates.

  5. Simulation and modeling of homogeneous, compressed turbulence

    NASA Technical Reports Server (NTRS)

    Wu, C. T.; Ferziger, J. H.; Chapman, D. R.

    1985-01-01

    Low Reynolds number homogeneous turbulence undergoing low Mach number isotropic and one-dimensional compression was simulated by numerically solving the Navier-Stokes equations. The numerical simulations were performed on a CYBER 205 computer using a 64 x 64 x 64 mesh. A spectral method was used for spatial differencing and the second-order Runge-Kutta method for time advancement. A variety of statistical information was extracted from the computed flow fields. These include three-dimensional energy and dissipation spectra, two-point velocity correlations, one-dimensional energy spectra, turbulent kinetic energy and its dissipation rate, integral length scales, Taylor microscales, and Kolmogorov length scale. Results from the simulated flow fields were used to test one-point closure, two-equation models. A new one-point-closure, three-equation turbulence model which accounts for the effect of compression is proposed. The new model accurately calculates four types of flows (isotropic decay, isotropic compression, one-dimensional compression, and axisymmetric expansion flows) for a wide range of strain rates.
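Among the diagnostics listed in these two records is the three-dimensional energy spectrum, which can be computed from a periodic velocity field by shell-averaging the kinetic energy in Fourier space. The sketch below is a minimal illustration, assuming an N^3 grid on a periodic unit box with integer wavenumbers; normalisation conventions vary between codes, and this is not the authors' implementation.

```python
import numpy as np

def energy_spectrum(u, v, w):
    """Shell-averaged 3D kinetic energy spectrum E(k) for a periodic
    velocity field on an N^3 grid. spectrum[k] holds the energy in
    the spherical shell of wavenumbers rounding to integer k, and the
    spectrum sums to the mean kinetic energy (Parseval)."""
    n = u.shape[0]
    uh, vh, wh = (np.fft.fftn(f) / n**3 for f in (u, v, w))
    e = 0.5 * (np.abs(uh)**2 + np.abs(vh)**2 + np.abs(wh)**2)
    k = np.fft.fftfreq(n, d=1.0 / n)  # integer wavenumbers 0..n/2, -n/2..-1
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    kmag = np.rint(np.sqrt(kx**2 + ky**2 + kz**2)).astype(int)
    return np.bincount(kmag.ravel(), weights=e.ravel())
```

The dissipation spectrum and integral/Taylor length scales quoted in the abstract follow from moments of E(k) in the same framework.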

  6. Progress in Modeling and Simulation of Batteries

    SciTech Connect

    Turner, John A

    2016-01-01

    Modeling and simulation of batteries, in conjunction with theory and experiment, are important research tools that offer opportunities for advancement of technologies that are critical to electric vehicles. The development of data from the application of these tools can provide the basis for managerial and technical decision-making. Together, these will continue to transform batteries for electric vehicles. This collection of nine papers presents the modeling and simulation of batteries and the continuing contribution being made to this impressive progress, including topics that cover: * Thermal behavior and characteristics * Battery management system design and analysis * Moderately high-fidelity 3D capabilities * Optimization techniques and durability As electric vehicles continue to gain interest from manufacturers and consumers alike, improvements in economy and affordability, as well as the adoption of alternative fuel sources to meet government mandates, are driving battery research and development. Progress in modeling and simulation will continue to contribute to battery improvements that deliver increased power, energy storage, and durability to further enhance the appeal of electric vehicles.

  7. Computer Models Simulate Fine Particle Dispersion

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  8. Qualitative simulation for process modeling and control

    NASA Technical Reports Server (NTRS)

    Dalle Molle, D. T.; Edgar, T. F.

    1989-01-01

    A qualitative model is developed for a first-order system with a proportional-integral controller, without precise knowledge of the process or controller parameters. Simulation of the qualitative model yields all of the solutions to the system equations. In developing the qualitative model, a necessary condition for the occurrence of oscillatory behavior is identified. Initializations that cannot exhibit oscillatory behavior produce a finite set of behaviors. When the phase-space trajectory of the oscillatory behavior is properly constrained, these initializations produce an infinite but comprehensible set of asymptotically stable behaviors. While the predictions include all possible behaviors of the real system, a class of spurious behaviors has been identified. When limited numerical information is included in the model, the number of predictions is significantly reduced.

  9. Biomedical Simulation Models of Human Auditory Processes

    NASA Technical Reports Server (NTRS)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models that explore noise propagation mechanisms associated with noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data which provides explicit external ear, ear canal, middle ear ossicular bones and cochlea geometry. Results from these studies have enabled a greater understanding of hearing protector to flesh dynamics as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast related impulse noise on human auditory mechanisms and brain tissue.

  10. Integrating Visualizations into Modeling NEST Simulations

    PubMed Central

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W.

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate different data modalities in order to investigate them effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach through common use cases we encountered in our collaborative work. PMID:26733860

  11. Atomistic modeling and simulation of nanopolycrystalline solids

    NASA Astrophysics Data System (ADS)

    Yang, Zidong

    In the past decades, nanostructured materials have opened new and fascinating avenues for research. Nanopolycrystalline solids, which consist of nano-sized crystalline grains and significant volume fractions of amorphous grain boundaries, are believed to respond to thermal-mechanical-electric-magnetic loads substantially differently than single-crystalline materials do. Nanopolycrystalline materials are expected to play a key role in the next generation of smart materials. This research presents a framework (1) to generate full atomistic models, (2) to perform non-equilibrium molecular dynamics simulations, and (3) to study multi-physics phenomena of nanopolycrystalline solids. The work begins with the physical model and mathematical representation within the framework of molecular dynamics. In addition to the latest theories and techniques of molecular dynamics simulation, it implements the principle of objectivity and incorporates multi-physics features. Further, a database of empirical interatomic potentials is established and the combination scheme for potentials is revisited, which enables investigation of a broad spectrum of chemical elements across the periodic table and of compounds (such as rocksalt, perovskite, wurtzite, diamond, etc.). The configurational model of nanopolycrystalline solids consists of two spatial components: (1) crystalline grains, which can be obtained through crystal structure optimization, and (2) amorphous grain boundaries, which can be obtained through an amorphization process. A multi-grain, multi-phase nanopolycrystalline material system can therefore be constructed by partitioning the space into grains and then filling the inter-grain space with amorphous grain boundaries. Computational simulations are performed on several representative crystalline materials and their mixtures, such as rocksalt, perovskite and diamond. Problems of relaxation, mechanical loading, thermal stability, heat conduction
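The non-equilibrium molecular dynamics machinery described above rests on a pair potential and a time integrator. As a hedged illustration (not the author's code; the Lennard-Jones potential, reduced units, and all parameter values below are arbitrary assumptions), a minimal velocity-Verlet integrator might look like:

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces for a small set of particles."""
    n = len(pos)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = np.dot(r, r)
            inv6 = (sigma**2 / d2) ** 3
            # force magnitude divided by separation, from dV/dr of the LJ potential
            fmag = 24 * eps * (2 * inv6**2 - inv6) / d2
            f[i] += fmag * r
            f[j] -= fmag * r        # Newton's third law
    return f

def velocity_verlet(pos, vel, dt=1e-3, steps=100):
    """Advance Newton's equations with the velocity-Verlet scheme."""
    f = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f         # half kick
        pos += dt * vel             # drift
        f = lj_forces(pos)
        vel += 0.5 * dt * f         # half kick
    return pos, vel

# two particles slightly outside the LJ equilibrium separation (2**(1/6) * sigma)
pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0]])
vel = np.zeros_like(pos)
pos, vel = velocity_verlet(pos, vel)
```

Because the pairwise forces are antisymmetric, total momentum is conserved to floating-point precision, which is a useful sanity check on any such integrator.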

  12. Evaluating Global Streamflow Simulations by a Physically-based Routing Model Coupled with the Community Land Model

    SciTech Connect

    Li, Hongyi; Leung, Lai-Yung R.; Getirana, Augusto; Huang, Maoyi; Wu, Huan; Xu, Yubin; Guo, Jiali; Voisin, Nathalie

    2015-04-15

    Accurately simulating hydrological processes such as streamflow is important in land surface modeling because they can influence other land surface processes, such as carbon cycle dynamics, through various interaction pathways. This study aims to evaluate the global application of a recently developed Model for Scale Adaptive River Transport (MOSART) coupled with the Community Land Model, version 4 (CLM4). To support the global implementation of MOSART, a comprehensive global hydrography dataset has been derived at multiple resolutions from different sources. The simulated runoff fields are first evaluated against the composite runoff map from the Global Runoff Data Centre (GRDC). The simulated streamflow is then shown to reproduce reasonably well the observed daily and monthly streamflow at over 1600 of the world's major river stations in terms of annual, seasonal, and daily flow statistics. The impacts of model structure complexity are evaluated, and results show that the spatial and temporal variability of river velocity simulated by MOSART is necessary for capturing streamflow seasonality and annual maximum flood. Other sources of the simulation bias include uncertainties in the atmospheric forcing, as revealed by simulations driven by four different climate datasets, and human influences, based on a classification framework that quantifies the impact levels of large dams on the streamflow worldwide.

  13. LISP based simulation generators for modeling complex space processes

    NASA Technical Reports Server (NTRS)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  14. [Modeling and Simulation of Spectral Polarimetric BRDF].

    PubMed

    Ling, Jin-jiang; Li, Gang; Zhang, Ren-bin; Tang, Qian; Ye, Qiu

    2016-01-01

    Under polarized illumination, the reflection from an object's surface is affected by many factors, such as refractive index, surface roughness, and angle of incidence. Because rough surfaces exhibit different polarized reflection characteristics at different wavelengths, a spectral polarimetric BRDF based on Kirchhoff theory is proposed. The spectral model of the complex refractive index combines spectral models of refractive index and extinction coefficient, obtained from known complex refractive index values at different wavelengths. A spectral model of surface roughness is then derived from the classical surface roughness measurement method combined with the Fresnel reflection function. Substituting the spectral models of refractive index and roughness into the BRDF model yields the spectral polarimetric BRDF model. Comparing simulation results for the cases where the refractive index varies with wavelength while roughness is constant, where both refractive index and roughness vary with wavelength, and for the original model from other papers shows that the spectral polarimetric BRDF model represents the polarization characteristics of the surface accurately, and can provide a reliable basis for applications in polarization remote sensing and in the classification of substances. PMID:27228737
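The Fresnel reflection function referenced above is where the complex refractive index n + ik enters such polarimetric BRDF models. The following is a minimal sketch of the s- and p-polarized power reflectances at a smooth interface, not the paper's full rough-surface model; the function name and all numeric values are illustrative assumptions:

```python
import cmath
import math

def fresnel_reflectance(theta_i, n, k, n1=1.0):
    """s- and p-polarized power reflectances at a smooth interface
    between a medium of index n1 and one of complex index n + i*k."""
    n2 = complex(n, k)
    cos_i = math.cos(theta_i)
    sin_t = n1 * math.sin(theta_i) / n2   # Snell's law, complex-valued
    cos_t = cmath.sqrt(1 - sin_t**2)
    rs = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)
    rp = (n2 * cos_i - n1 * cos_t) / (n2 * cos_i + n1 * cos_t)
    return abs(rs)**2, abs(rp)**2

# dielectric (glass-like) surface at 45 degrees
Rs, Rp = fresnel_reflectance(math.radians(45), n=1.5, k=0.0)
dop = (Rs - Rp) / (Rs + Rp)   # degree of polarization of the reflected light
```

The difference between Rs and Rp is exactly what gives reflected light its partial polarization, which is the quantity a polarimetric BRDF propagates to the sensor.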

  15. Interannual tropical rainfall variability in general circulation model simulations associated with the atmospheric model intercomparison project

    SciTech Connect

    Sperber, K.R.; Palmer, T.N.

    1996-11-01

    The interannual variability of rainfall over the Indian subcontinent, the African Sahel, and the Nordeste region of Brazil has been evaluated in 32 models for the period 1979-88 as part of the Atmospheric Model Intercomparison Project (AMIP). The interannual variations of Nordeste rainfall are the most readily captured, owing to the intimate link with Pacific and Atlantic sea surface temperatures. The precipitation variations over India and the Sahel are less well simulated. Additionally, an Indian monsoon wind shear index was calculated for each model; the subset of models that better captured its observed variability also had a rainfall climatology in better agreement with observations, indicating a link between systematic model error and the ability to simulate interannual variations. A suite of six European Centre for Medium-Range Weather Forecasts (ECMWF) AMIP runs (differing only in their initial conditions) have also been examined. As observed, all-India rainfall was enhanced in 1988 relative to 1987 in each of these realizations. All-India rainfall variability during other years showed little or no predictability, possibly due to internal chaotic dynamics associated with intraseasonal monsoon fluctuations and/or unpredictable land surface process interactions. The interannual variations of Nordeste rainfall were best represented. The State University of New York at Albany/National Center for Atmospheric Research Genesis model was run in five initial condition realizations. In this model, the Nordeste rainfall variability was also best reproduced. However, for all regions the skill was less than that of the ECMWF model. The relationships of the all-India and Sahel rainfall/SST teleconnections with horizontal resolution, convection scheme closure, and numerics have been evaluated. 64 refs., 13 figs., 3 tabs.

  16. Best Practices for Crash Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Jackson, Karen E.

    2002-01-01

    Aviation safety can be greatly enhanced by the expeditious use of computer simulations of crash impact. Unlike automotive impact testing, which is now routine, experimental crash tests of even small aircraft are expensive and complex due to the high cost of the aircraft and the myriad of crash impact conditions that must be considered. Ultimately, the goal is to utilize full-scale crash simulations of aircraft for design evaluation and certification. The objective of this publication is to describe "best practices" for modeling aircraft impact using explicit nonlinear dynamic finite element codes such as LS-DYNA, DYNA3D, and MSC.Dytran. Although "best practices" is somewhat relative, it is hoped that the authors' experience will help others to avoid some of the common pitfalls in modeling that are not documented in one single publication. In addition, a discussion of experimental data analysis, digital filtering, and test-analysis correlation is provided. Finally, some examples of aircraft crash simulations are described in several appendices following the main report.

  17. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required to develop that model is then determined; we conclude that six hours of training are required to teach those skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year of an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of desktop modeling and simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
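The fundamental functionality of a discrete-event simulation that the paper refers to, a clock advanced by a time-ordered event list, can be sketched compactly. The single-server queue below is an illustrative Python stand-in for the Excel model described above; the arrival and service rates, customer count, and seed are arbitrary assumptions:

```python
import heapq
import random

def simulate_queue(lam, mu, n_customers, seed=42):
    """Minimal discrete-event simulation of a single-server queue.
    A time-ordered event list drives the simulation clock; returns
    the mean time customers spend waiting before service."""
    rng = random.Random(seed)
    events = [(rng.expovariate(lam), "arrival")]   # (time, kind) heap
    waiting = []                                   # arrival times of queued customers
    server_busy = False
    arrivals = 0
    waits = []
    while events:
        clock, kind = heapq.heappop(events)        # jump to the next event
        if kind == "arrival":
            arrivals += 1
            if arrivals < n_customers:             # schedule the next arrival
                heapq.heappush(events, (clock + rng.expovariate(lam), "arrival"))
            if server_busy:
                waiting.append(clock)
            else:
                server_busy = True
                waits.append(0.0)                  # served immediately
                heapq.heappush(events, (clock + rng.expovariate(mu), "departure"))
        else:  # departure frees the server
            if waiting:
                waits.append(clock - waiting.pop(0))
                heapq.heappush(events, (clock + rng.expovariate(mu), "departure"))
            else:
                server_busy = False
    return sum(waits) / len(waits)

mean_wait = simulate_queue(lam=0.8, mu=1.0, n_customers=2000)
```

The same event-list logic is what a spreadsheet implementation must encode, which is why the skill set transfers across tools.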

  18. Simulation Assisted Risk Assessment: Blast Overpressure Modeling

    NASA Technical Reports Server (NTRS)

    Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael

    2006-01-01

    A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwind effects.

  19. A Simple Memristor Model for Circuit Simulations

    NASA Astrophysics Data System (ADS)

    Fullerton, Farrah-Amoy; Joe, Aaleyah; Gergel-Hackett, Nadine; Department of Chemistry; Physics Team

    This work describes the development of a model for the memristor, a novel nanoelectronic technology. The model was designed to replicate the real-world electrical characteristics of previously fabricated memristor devices, but was constructed with basic circuit elements in a free, widely available circuit simulator, LT Spice. The modeled memristors were then used to construct a circuit that performs material implication. Material implication is a digital logic that can be used to perform all of the same basic functions as traditional CMOS gates, but with fewer nanoelectronic devices. This memristor-based digital logic could enable memristors' use in new paradigms of computer architecture with advantages in size, speed, and power over traditional computing circuits. Additionally, the ability to model the real-world electrical characteristics of memristors in a free circuit simulator using its standard library of elements could enable not only the development of memristor material implication, but also the development of a virtually unlimited array of other memristor-based circuits.
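The paper builds its model inside LT Spice; as a rough stand-in for readers without a circuit simulator, the widely cited HP linear-drift memristor equations can be integrated directly. This sketch is an assumption-laden illustration (simple Euler integration, arbitrary device parameters), not the authors' SPICE model:

```python
import math

def simulate_memristor(v_amp=1.0, freq=1.0, r_on=100.0, r_off=16000.0,
                       d=10e-9, mu_v=1e-14, steps=20000, cycles=2):
    """Euler integration of the HP linear-drift memristor model under a
    sinusoidal drive. Returns voltage and current samples whose v-i plot
    traces the characteristic pinched hysteresis loop."""
    w = 0.1 * d                       # initial doped-region width (state variable)
    dt = cycles / (freq * steps)
    vs, cs = [], []
    for k in range(steps):
        t = k * dt
        v = v_amp * math.sin(2 * math.pi * freq * t)
        # resistance interpolates between r_on and r_off with the state w
        r = r_on * (w / d) + r_off * (1 - w / d)
        i = v / r
        w += mu_v * (r_on / d) * i * dt     # linear dopant drift
        w = min(max(w, 0.0), d)             # state is bounded in [0, d]
        vs.append(v)
        cs.append(i)
    return vs, cs

vs, cs = simulate_memristor()
```

Because the current depends on the accumulated charge history through w, the same voltage produces different currents on the rising and falling halves of the drive, which is the memory effect the device name refers to.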

  20. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with the application of modeling and simulation tools to the optimization of business processes, especially to optimizing signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of visual models of production and distribution processes.

  1. Simulation and Modeling of Homogeneous, Compressed Turbulence.

    NASA Astrophysics Data System (ADS)

    Wu, Chung-Teh

    Low Reynolds number homogeneous turbulence undergoing low Mach number isotropic and one-dimensional compression has been simulated by numerically solving the Navier-Stokes equations. The numerical simulations were carried out on a CYBER 205 computer using a 64 x 64 x 64 mesh. A spectral method was used for spatial differencing and the second-order Runge-Kutta method for time advancement. A variety of statistical information was extracted from the computed flow fields. These include three-dimensional energy and dissipation spectra, two-point velocity correlations, one-dimensional energy spectra, turbulent kinetic energy and its dissipation rate, integral length scales, Taylor microscales, and Kolmogorov length scale. It was found that the ratio of the turbulence time scale to the mean-flow time scale is an important parameter in these flows. When this ratio is large, the flow is immediately affected by the mean strain in a manner similar to that predicted by rapid distortion theory. When this ratio is small, the flow retains the character of decaying isotropic turbulence initially; only after the strain has been applied for a long period does the flow accumulate a significant reflection of the effect of mean strain. In these flows, the Kolmogorov length scale decreases rapidly with increasing total strain, due to the density increase that accompanies compression. Results from the simulated flow fields were used to test one-point-closure, two-equation turbulence models. The two-equation models perform well only when the compression rate is small compared to the eddy turn-over rate. A new one-point-closure, three-equation turbulence model which accounts for the effect of compression is proposed. The new model accurately calculates four types of flows (isotropic decay, isotropic compression, one-dimensional compression, and axisymmetric expansion flows) for a wide range of strain rates.
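The spectral spatial differencing mentioned above amounts to multiplying Fourier coefficients by ik and transforming back. A minimal one-dimensional illustration (not the CYBER 205 code; grid size and the test function are arbitrary) of FFT-based differentiation:

```python
import numpy as np

def spectral_derivative(u, length=2 * np.pi):
    """Differentiate a periodic sample u(x) by multiplying its Fourier
    coefficients by i*k, the spatial discretisation used in spectral codes."""
    n = len(u)
    k = np.fft.fftfreq(n, d=length / n) * 2 * np.pi   # angular wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
du = spectral_derivative(np.sin(x))   # should closely match cos(x)
```

For smooth periodic fields this differentiation is accurate to machine precision, which is why spectral methods are favoured for homogeneous turbulence on periodic domains.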

  2. Closed loop models for analyzing the effects of simulator characteristics. [digital simulation of human operators

    NASA Technical Reports Server (NTRS)

    Baron, S.; Muralidharan, R.; Kleinman, D. L.

    1978-01-01

    The optimal control model of the human operator is used to develop closed loop models for analyzing the effects of (digital) simulator characteristics on predicted performance and/or workload. Two approaches are considered: the first utilizes a continuous approximation to the discrete simulation in conjunction with the standard optimal control model; the second involves a more exact discrete description of the simulator in a closed loop multirate simulation in which the optimal control model simulates the pilot. Both models predict that simulator characteristics can have significant effects on performance and workload.

  3. Models for naturally fractured, carbonate reservoir simulations

    SciTech Connect

    Tuncay, K.; Park, A.; Ozkan, G.; Zhan, X.; Ortoleva, P.; Hoak, T.; Sundberg, K.

    1998-12-31

    This report outlines the need for new tools for the simulation of fractured carbonate reservoirs. Several problems are identified that call for the development of new reservoir simulation physical models and numerical techniques. These include: karst and vuggy media wherein Darcy's and traditional multi-phase flow laws do not apply; the need for predicting the preproduction state of fracturing and stress so that the later response of effective stress-dependent reservoirs can be predicted; and methods for predicting the fracturing and collapse of vuggy and karst reservoirs in response to draw-down pressure created during production. Specific research directions for addressing each problem are outlined and preliminary results are noted.

  4. Geometric Modeling, Radiation Simulation, Rendering, Analysis Package

    Energy Science and Technology Software Center (ESTSC)

    1995-01-17

    RADIANCE is intended to aid lighting designers and architects by predicting the light levels and appearance of a space prior to construction. The package includes programs for modeling and translating scene geometry, luminaire data and material properties, all of which are needed as input to the simulation. The lighting simulation itself uses ray tracing techniques to compute radiance values (i.e. the quantity of light passing through a specific point in a specific direction), which are typically arranged to form a photographic quality image. The resulting image may be analyzed, displayed and manipulated within the package, and converted to other popular image file formats for export to other packages, facilitating the production of hard copy output.

  5. Explaining low energy γ-ray excess from the galactic centre using a two-component dark matter model

    NASA Astrophysics Data System (ADS)

    Biswas, Anirban

    2016-06-01

    Over the past few years, there has been a hint of a γ-ray excess observed by the Fermi-LAT satellite-borne telescope from the regions surrounding the galactic centre (GC) in the energy range of ~1-3 GeV. The nature of this excess γ-ray spectrum is found to be consistent with the γ-ray emission expected from dark matter (DM) annihilation at the GC, while disfavouring other known astrophysical sources as the possible origin of this phenomenon. It is also reported that the spectrum and morphology of this γ-ray excess can be well explained by DM particles with mass in the range 30-40 GeV annihilating significantly into the bb̄ final state with an annihilation cross section σv ~ (1.4-2.0) × 10⁻²⁶ cm³ s⁻¹ at the GC. In this work, we propose a two-component DM model in which two different types of DM particles, namely a complex scalar and a Dirac fermion, are considered. The stability of both dark-sector particles is maintained by virtue of an additional local U(1)_X gauge symmetry. We find that our proposed scenario can provide a viable explanation for this anomalous γ-ray excess while satisfying all the relevant existing theoretical, experimental and observational bounds from the LHC, PLANCK and LUX collaborations. The allowed range of the 'effective annihilation cross section' of the lighter DM particle for the bb̄ annihilation channel is finally compared with the limits reported by the Fermi-LAT and DES collaborations using data from various dwarf spheroidal galaxies.

  6. The Cooperative Research Centre for Living with Autism (Autism CRC) Conceptual Model to Promote Mental Health for Adolescents with ASD.

    PubMed

    Shochet, Ian M; Saggers, Beth R; Carrington, Suzanne B; Orr, Jayne A; Wurfl, Astrid M; Duncan, Bonnie M; Smith, Coral L

    2016-06-01

    Despite an increased risk of mental health problems in adolescents with autism spectrum disorder (ASD), there is limited research on effective prevention approaches for this population. Funded by the Cooperative Research Centre for Living with Autism, a theoretically and empirically supported school-based preventative model has been developed to alter the negative trajectory and promote wellbeing and positive mental health in adolescents with ASD. This conceptual paper provides the rationale, theoretical, empirical and methodological framework of a multilayered intervention targeting the school, parents and adolescents on the spectrum. Two important interrelated protective factors have been identified in community adolescent samples, namely the sense of belonging (connectedness) to school and the capacity for self and affect regulation in the face of stress (i.e. resilience). We describe how a confluence of theories from social psychology, developmental psychology and family systems theory, along with empirical evidence (including emerging neurobiological evidence), supports the interrelationships between these protective factors and many indices of wellbeing. However, the characteristics of ASD (including social and communication difficulties, and frequently difficulties with changes and transitions, and diminished optimism and self-esteem) impair access to these vital protective factors. The paper describes how evidence-based interventions at the school level for promoting inclusive schools (using the Index for Inclusion) and interventions for adolescents and parents to promote resilience and belonging [using the Resourceful Adolescent Program (RAP)] are adapted and integrated for adolescents with ASD. This multisite proof-of-concept study will confirm whether this multilevel school-based intervention is promising, feasible and sustainable. PMID:27072681

  7. A Flexible Microarray Data Simulation Model

    PubMed Central

    Dembélé, Doulaye

    2013-01-01

    Microarray technology allows monitoring of gene expression profiling at the genome level. This is useful in order to search for genes involved in a disease. The performance of the methods used to select interesting genes is most often judged after other analyses (qPCR validation, searches in databases, etc.), which are also subject to error. A good evaluation of gene selection methods is possible with data whose characteristics are known, that is to say, synthetic data. We propose a model to simulate microarray data with characteristics similar to the data commonly produced by current platforms. The parameters used in this model are described to allow the user to generate data with varying characteristics. In order to show the flexibility of the proposed model, a commented example is given and illustrated. An R package is available for immediate use.
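A microarray simulation model of this kind boils down to drawing per-gene baseline expression levels and shifting a known subset of genes between conditions, so that gene-selection methods can be scored against ground truth. The sketch below is a deliberately simplified Python stand-in for the paper's R package; the distributions, parameter names, and values are all illustrative assumptions:

```python
import random

def simulate_microarray(n_genes=1000, n_samples=8, de_fraction=0.1,
                        effect=2.0, sigma=0.4, seed=0):
    """Generate a synthetic two-condition expression matrix (log2 scale).
    A known subset of genes is shifted by `effect` in the second condition,
    providing ground truth for evaluating gene-selection methods."""
    rng = random.Random(seed)
    half = n_samples // 2
    de_genes = set(rng.sample(range(n_genes), int(de_fraction * n_genes)))
    data = []
    for g in range(n_genes):
        base = rng.gauss(7.0, 1.0)                           # baseline log2 expression
        row = [rng.gauss(base, sigma) for _ in range(half)]  # condition 1
        shift = effect if g in de_genes else 0.0
        row += [rng.gauss(base + shift, sigma) for _ in range(half)]  # condition 2
        data.append(row)
    return data, de_genes

data, truth = simulate_microarray()
```

Running a differential-expression test on `data` and comparing the selected genes with `truth` gives exactly the kind of controlled benchmark the abstract argues for.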

  8. Locally refined block-centred finite-difference groundwater models: Evaluation of parameter sensitivity and the consequences for inverse modelling

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2002-01-01

    Models with local grid refinement, as often required in groundwater models, pose special problems for model calibration. This work investigates the calculation of sensitivities and the performance of regression methods using two existing methods of grid refinement and one new one. The existing local grid refinement methods considered are: (a) a variably spaced grid in which the grid spacing becomes smaller near the area of interest and larger where such detail is not needed, and (b) telescopic mesh refinement (TMR), which uses the hydraulic heads or fluxes of a regional model to provide the boundary conditions for a locally refined model. The new method has a feedback between the regional and local grids using shared nodes, and thereby, unlike the TMR methods, balances heads and fluxes at the interfacing boundary. Results for sensitivities are compared for the three methods, and the effect of the accuracy of sensitivity calculations is evaluated by comparing inverse modelling results. For the cases tested, results indicate that the inaccuracies of the sensitivities calculated using the TMR approach can cause the inverse model to converge to an incorrect solution.

  9. Implications of Simulation Conceptual Model Development for Simulation Management and Uncertainty Assessment

    NASA Technical Reports Server (NTRS)

    Pace, Dale K.

    2000-01-01

    A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.

  10. Perceptual centres in speech - an acoustic analysis

    NASA Astrophysics Data System (ADS)

    Scott, Sophie Kerttu

    Perceptual centres, or P-centres, represent the perceptual moments of occurrence of acoustic signals - the 'beat' of a sound. P-centres underlie the perception and production of rhythm in perceptually regular speech sequences. P-centres have been modelled in both speech and non-speech (music) domains. The three aims of this thesis were (a) to test current P-centre models to determine which best accounted for the experimental data; (b) to identify a candidate parameter onto which P-centres can be mapped (a local approach), as opposed to the previous global models which rely upon the whole signal to determine the P-centre; and (c) to develop a model of P-centre location which could be applied to both speech and non-speech signals. The first aim was investigated by a series of experiments in which (a) speech from different speakers was examined to determine whether the different models could account for variation between speakers; (b) the amplitude-time plot of a speech signal was rendered to determine whether this affects the P-centre of the signal; and (c) the amplitude at the offset of a speech signal was increased to determine whether this alters P-centres in the production and perception of speech. The second aim was pursued by (a) manipulating the rise time of different speech signals to determine whether the P-centre was affected, and whether the type of speech sound ramped affected the P-centre shift; (b) manipulating the rise and decay times of a synthetic vowel to determine whether the onset alteration had more effect on the P-centre than the offset manipulation; and (c) varying the duration of a vowel, with other attributes (amplitude, spectral content) held constant, to determine whether duration affected the P-centre. The third aim - modelling P-centres - built on these results. The Frequency-dependent Amplitude Increase Model of P-centre location (FAIM) was developed using a modelling protocol, the APU GammaTone Filterbank and the speech from different speakers.
The P-centres of the stimuli corpus were highly predicted by attributes of