Sample records for reliable simulation model

  1. Reliable results from stochastic simulation models

    Treesearch

    Gochenour, Donald L., Jr.; Johnson, Leonard R.

    1973-01-01

    Development of a computer simulation model is usually done without fully considering how long the model should run (e.g., computer time) before the results are reliable. However, construction of confidence intervals (CI) about critical output parameters from the simulation model makes it possible to determine the point where model results are reliable. If the results are...
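
    The stopping rule sketched in this abstract can be made concrete: extend the simulation batch by batch until the confidence interval around the output mean is narrow enough. Below is a minimal Python sketch, assuming an invented simulate_batch stand-in for the model, a normal-approximation 95% CI, and a 5% relative-precision target; none of these specifics come from the paper.

      import math
      import random

      def simulate_batch(n=1000):
          """Hypothetical stand-in for one batch of stochastic simulation output."""
          return [random.expovariate(1.0) for _ in range(n)]

      def run_until_reliable(rel_tol=0.05, z=1.96, min_batches=10, max_batches=500):
          means = []
          while len(means) < max_batches:
              batch = simulate_batch()
              means.append(sum(batch) / len(batch))             # batch mean
              if len(means) >= min_batches:
                  m = sum(means) / len(means)                   # grand mean
                  var = sum((x - m) ** 2 for x in means) / (len(means) - 1)
                  half_width = z * math.sqrt(var / len(means))  # approx. 95% CI half-width
                  if half_width < rel_tol * abs(m):             # relative precision met
                      return m, half_width, len(means)
          raise RuntimeError("CI did not narrow within max_batches")

      mean, hw, batches = run_until_reliable()
      print(f"estimate {mean:.3f} +/- {hw:.3f} after {batches} batches")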

  2. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.

  3. A simulation model for risk assessment of turbine wheels

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Hage, Richard T.

    1991-01-01

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  4. A simulation model for risk assessment of turbine wheels

    NASA Astrophysics Data System (ADS)

    Safie, Fayssal M.; Hage, Richard T.

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  5. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    NASA Technical Reports Server (NTRS)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating the numerical approximation error, computational-model-induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.

  6. Simulation-Based Training for Colonoscopy

    PubMed Central

    Preisler, Louise; Svendsen, Morten Bo Søndergaard; Nerup, Nikolaj; Svendsen, Lars Bo; Konge, Lars

    2015-01-01

    The aim of this study was to create simulation-based tests with credible pass/fail standards for 2 different fidelities of colonoscopy models. Only competent practitioners should perform colonoscopy. Reliable and valid simulation-based tests could be used to establish basic competency in colonoscopy before practicing on patients. Twenty-five physicians (10 consultants with endoscopic experience and 15 fellows with very little endoscopic experience) were tested on 2 different simulator models: a virtual-reality simulator and a physical model. Tests were repeated twice on each simulator model. Metrics with discriminatory ability were identified for both modalities, and reliability was determined. The contrasting-groups method was used to create pass/fail standards and the consequences of these were explored. The consultants performed significantly faster and scored significantly higher than the fellows on both models (P < 0.001). Reliability analysis showed Cronbach α = 0.80 and 0.87 for the virtual-reality and the physical model, respectively. The established pass/fail standards failed one of the consultants (virtual-reality simulator) and allowed one fellow to pass (physical model). The 2 tested simulation-based modalities provided reliable and valid assessments of competence in colonoscopy, and credible pass/fail standards were established for both tests. We propose to use these standards in simulation-based training programs before proceeding to supervised training on patients. PMID:25634177
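
    The reliability figures quoted above (Cronbach α = 0.80 and 0.87) come from a standard internal-consistency statistic that is easy to reproduce. The sketch below computes Cronbach's alpha from a participants-by-items score matrix; the toy scores are invented for illustration and are not the study's data.

      def cronbach_alpha(scores):
          """scores: list of rows, one row per participant, one column per test item."""
          k = len(scores[0])                  # number of items
          def var(xs):                        # sample variance
              m = sum(xs) / len(xs)
              return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
          item_vars = [var([row[j] for row in scores]) for j in range(k)]
          total_var = var([sum(row) for row in scores])
          return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

      # Invented 5-participant, 3-item score matrix for demonstration only.
      toy = [[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [4, 4, 5]]
      print(round(cronbach_alpha(toy), 2))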

  7. Remotely piloted vehicle: Application of the GRASP analysis method

    NASA Technical Reports Server (NTRS)

    Andre, W. L.; Morris, J. B.

    1981-01-01

    The application of General Reliability Analysis Simulation Program (GRASP) to the remotely piloted vehicle (RPV) system is discussed. The model simulates the field operation of the RPV system. By using individual component reliabilities, the overall reliability of the RPV system is determined. The results of the simulations are given in operational days. The model represented is only a basis from which more detailed work could progress. The RPV system in this model is based on preliminary specifications and estimated values. The use of GRASP from basic system definition, to model input, and to model verification is demonstrated.

  8. A System for Integrated Reliability and Safety Analyses

    NASA Technical Reports Server (NTRS)

    Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Coumeri, Marc; Scheidler, Peter, Jr.; Bonesteel, Charles

    1999-01-01

    We present an integrated reliability and aviation safety analysis tool. The reliability models for selected infrastructure components of the air traffic control system are described. The results of this model are used to evaluate the likelihood of seeing outcomes predicted by simulations with failures injected. We discuss the design of the simulation model, and the user interface to the integrated toolset.

  9. Reliability Estimation in Modeling Watershed Runoff With Uncertainties

    NASA Astrophysics Data System (ADS)

    Melching, Charles S.; Yen, Ben Chie; Wenzel, Harry G., Jr.

    1990-10-01

    The reliability of simulation results produced by watershed runoff models is a function of uncertainties in nature, data, model parameters, and model structure. A framework is presented here for using a reliability analysis method (such as first-order second-moment techniques or Monte Carlo simulation) to evaluate the combined effect of the uncertainties on the reliability of output hydrographs from hydrologic models. For a given event the prediction reliability can be expressed in terms of the probability distribution of the estimated hydrologic variable. The peak discharge probability for a watershed in Illinois using the HEC-1 watershed model is given as an example. The study of the reliability of predictions from watershed models provides useful information on the stochastic nature of output from deterministic models subject to uncertainties and identifies the relative contribution of the various uncertainties to unreliability of model predictions.
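
    A minimal sketch of the Monte Carlo variant of this framework: sample the uncertain inputs, push each sample through the runoff model, and read prediction reliability off the empirical distribution of the output. The toy peak-discharge relation and the lognormal parameter choices below are assumptions for illustration; the paper itself used the HEC-1 watershed model.

      import random

      def peak_discharge(c, i, a):
          """Toy rational-method relation Q = c * i * a (a stand-in, not HEC-1)."""
          return c * i * a

      samples = []
      for _ in range(10_000):
          c = min(random.lognormvariate(-0.7, 0.2), 1.0)  # runoff coefficient (assumed)
          i = random.lognormvariate(1.0, 0.3)             # rainfall intensity (assumed)
          a = 12.0                                        # drainage area, taken as known
          samples.append(peak_discharge(c, i, a))

      samples.sort()
      q90 = samples[int(0.9 * len(samples))]              # 90th-percentile prediction
      exceed_prob = sum(q > 50.0 for q in samples) / len(samples)
      print(f"90th-percentile peak discharge: {q90:.1f}")
      print(f"P(peak discharge > 50): {exceed_prob:.4f}")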

  10. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance-oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves, one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production-quality simulation-based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  11. Automatic training and reliability estimation for 3D ASM applied to cardiac MRI segmentation

    NASA Astrophysics Data System (ADS)

    Tobon-Gomez, Catalina; Sukno, Federico M.; Butakoff, Constantine; Huguet, Marina; Frangi, Alejandro F.

    2012-07-01

    Training active shape models requires collecting manual ground-truth meshes in a large image database. While shape information can be reused across multiple imaging modalities, intensity information needs to be imaging modality and protocol specific. In this context, this study has two main purposes: (1) to test the potential of using intensity models learned from MRI simulated datasets and (2) to test the potential of including a measure of reliability during the matching process to increase robustness. We used a population of 400 virtual subjects (XCAT phantom), and two clinical populations of 40 and 45 subjects. Virtual subjects were used to generate simulated datasets (MRISIM simulator). Intensity models were trained both on simulated and real datasets. The trained models were used to segment the left ventricle (LV) and right ventricle (RV) from real datasets. Segmentations were also obtained with and without reliability information. Performance was evaluated with point-to-surface and volume errors. Simulated intensity models obtained average accuracy comparable to inter-observer variability for LV segmentation. The inclusion of reliability information reduced volume errors in hypertrophic patients (EF errors from 17 ± 57% to 10 ± 18%; LV MASS errors from -27 ± 22 g to -14 ± 25 g), and in heart failure patients (EF errors from -8 ± 42% to -5 ± 14%). The RV model of the simulated images needs further improvement to better resemble image intensities around the myocardial edges. Both for real and simulated models, reliability information increased segmentation robustness without penalizing accuracy.

  12. Automatic training and reliability estimation for 3D ASM applied to cardiac MRI segmentation.

    PubMed

    Tobon-Gomez, Catalina; Sukno, Federico M; Butakoff, Constantine; Huguet, Marina; Frangi, Alejandro F

    2012-07-07

    Training active shape models requires collecting manual ground-truth meshes in a large image database. While shape information can be reused across multiple imaging modalities, intensity information needs to be imaging modality and protocol specific. In this context, this study has two main purposes: (1) to test the potential of using intensity models learned from MRI simulated datasets and (2) to test the potential of including a measure of reliability during the matching process to increase robustness. We used a population of 400 virtual subjects (XCAT phantom), and two clinical populations of 40 and 45 subjects. Virtual subjects were used to generate simulated datasets (MRISIM simulator). Intensity models were trained both on simulated and real datasets. The trained models were used to segment the left ventricle (LV) and right ventricle (RV) from real datasets. Segmentations were also obtained with and without reliability information. Performance was evaluated with point-to-surface and volume errors. Simulated intensity models obtained average accuracy comparable to inter-observer variability for LV segmentation. The inclusion of reliability information reduced volume errors in hypertrophic patients (EF errors from 17 ± 57% to 10 ± 18%; LV MASS errors from -27 ± 22 g to -14 ± 25 g), and in heart failure patients (EF errors from -8 ± 42% to -5 ± 14%). The RV model of the simulated images needs further improvement to better resemble image intensities around the myocardial edges. Both for real and simulated models, reliability information increased segmentation robustness without penalizing accuracy.

  13. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world; thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost-effective manner. The concept of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data, which were then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.

  14. Cost-effective solutions to maintaining smart grid reliability

    NASA Astrophysics Data System (ADS)

    Qin, Qiu

    As aging power systems increasingly work closer to their capacity and thermal limits, maintaining sufficient reliability has been of great concern to government agencies, utility companies, and users. This dissertation focuses on improving the reliability of transmission and distribution systems. Based on wide area measurements, multiple model algorithms are developed to diagnose transmission line three-phase short-to-ground faults in the presence of protection misoperations. The multiple model algorithms utilize the electric network dynamics to provide prompt and reliable diagnosis outcomes. Computational complexity of the diagnosis algorithm is reduced by using a two-step heuristic. The multiple model algorithm is incorporated into a hybrid simulation framework, which consists of both continuous state simulation and discrete event simulation, to study the operation of transmission systems. With hybrid simulation, a line switching strategy for enhancing the tolerance to protection misoperations is studied based on the concept of a security index, which involves the faulted mode probability and stability coverage. Local measurements are used to track the generator state, and faulty mode probabilities are calculated in the multiple model algorithms. FACTS devices are considered as controllers for the transmission system. The placement of FACTS devices into power systems is investigated with a criterion of maintaining a prescribed level of control reconfigurability. Control reconfigurability measures the small-signal combined controllability and observability of a power system with an additional requirement on fault tolerance. For the distribution systems, a hierarchical framework, including a high-level recloser allocation scheme and a low-level recloser placement scheme, is presented. The impacts of recloser placement on the reliability indices are analyzed. Evaluation of reliability indices in the placement process is carried out via discrete event simulation. The reliability requirements are described with probabilities and evaluated from the empirical distributions of reliability indices.

  15. Reliability evaluation of microgrid considering incentive-based demand response

    NASA Astrophysics Data System (ADS)

    Huang, Ting-Cheng; Zhang, Yong-Jun

    2017-07-01

    Incentive-based demand response (IBDR) can guide customers to adjust their electricity usage behaviour and actively curtail load. Meanwhile, distributed generation (DG) and energy storage systems (ESS) can provide time for the implementation of IBDR. This paper focuses on the reliability evaluation of a microgrid considering IBDR. Firstly, the mechanism of IBDR and its impact on power supply reliability are analysed. Secondly, the IBDR dispatch model considering the customer's comprehensive assessment and the customer response model are developed. Thirdly, a reliability evaluation method considering IBDR based on Monte Carlo simulation is proposed. Finally, the validity of the above models and method is studied through numerical tests on the modified RBTS Bus6 test system. Simulation results demonstrate that IBDR can improve the reliability of a microgrid.

  16. Probabilistic Finite Element Analysis & Design Optimization for Structural Designs

    NASA Astrophysics Data System (ADS)

    Deivanayagam, Arumugam

    This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in the existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering designs through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis, focusing on probabilistic distribution models that characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and a user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on results than a single deterministic simulation does. The next part of the research is to implement the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, even though a deterministic optimization can produce cost-effective structures, the design becomes highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints. This part of the research starts with an introduction to reliability analysis, such as first-order and second-order reliability analyses, followed by simulation techniques performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation with sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, implementations of the reliability analysis concepts and RBDO in finite element 2D truss problems and a planar beam problem are presented and discussed.
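
    For a linear limit state g = R - S with independent normal resistance R and load S, the first-order reliability analysis mentioned above reduces to a closed-form reliability index. A small sketch follows, with illustrative numbers rather than the dissertation's Kevlar data.

      import math
      from statistics import NormalDist

      mu_R, sig_R = 120.0, 12.0   # resistance mean / std deviation (assumed)
      mu_S, sig_S = 80.0, 15.0    # load mean / std deviation (assumed)

      # Reliability index for the linear limit state g = R - S
      beta = (mu_R - mu_S) / math.hypot(sig_R, sig_S)
      pf = NormalDist().cdf(-beta)   # first-order probability of failure
      print(f"beta = {beta:.2f}, P_f = {pf:.2e}")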

  17. Soldier Dimensions in Combat Models

    DTIC Science & Technology

    1990-05-07

    and performance. Questionnaires, SQTs, and ARTEPs were often used. Many scales had estimates of reliability but few had validity data. Most studies...pending its validation. Research plans were provided for applications in simulated combat and with simulation devices, for data previously gathered...regarding reliability and validity. Lack of information following an instrument indicates neither reliability nor validity information was provided by the

  18. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  19. Hardware and software reliability estimation using simulations

    NASA Technical Reports Server (NTRS)

    Swern, Frederic L.

    1994-01-01

    The simulation technique is used to explore the validation of both hardware and software. It was concluded that simulation is a viable means for validating both hardware and software and associating a reliability number with each. This is useful in determining the overall probability of system failure of an embedded processor unit, and improving both the code and the hardware where necessary to meet reliability requirements. The methodologies were proved using some simple programs, and simple hardware models.

  20. Modeling and experimental characterization of electromigration in interconnect trees

    NASA Astrophysics Data System (ADS)

    Thompson, C. V.; Hau-Riege, S. P.; Andleigh, V. K.

    1999-11-01

    Most modeling and experimental characterization of interconnect reliability is focused on simple straight lines terminating at pads or vias. However, laid-out integrated circuits often have interconnects with junctions and wide-to-narrow transitions. In carrying out circuit-level reliability assessments it is important to be able to assess the reliability of these more complex shapes, generally referred to as 'trees.' An interconnect tree consists of continuously connected high-conductivity metal within one layer of metallization. Trees terminate at diffusion barriers at vias and contacts, and, in the general case, can have more than one terminating branch when they include junctions. We have extended the understanding of 'immortality,' demonstrated and analyzed for straight stud-to-stud lines, to trees of arbitrary complexity. This leads to a hierarchical approach for identifying immortal trees for specific circuit layouts and models for operation. To complete a circuit-level reliability analysis, it is also necessary to estimate the lifetimes of the mortal trees. We have developed simulation tools that allow modeling of stress evolution and failure in arbitrarily complex trees. We are testing our models and simulations through comparisons with experiments on simple trees, such as lines broken into two segments with different currents in each segment. Models, simulations, and early experimental results on the reliability of interconnect trees are shown to be consistent.

  21. Development of a Conservative Model Validation Approach for Reliable Analysis

    DTIC Science & Technology

    2015-01-01

    CIE 2015 August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982 DEVELOPMENT OF A CONSERVATIVE MODEL VALIDATION APPROACH FOR RELIABLE...obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the...3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account

  22. Modeling and Simulation Reliable Spacecraft On-Board Computing

    NASA Technical Reports Server (NTRS)

    Park, Nohpill

    1999-01-01

    The proposed project will investigate modeling and simulation-driven testing and fault tolerance schemes for Spacecraft On-Board Computing, thereby achieving reliable spacecraft telecommunication. A spacecraft communication system has inherent capabilities of providing multipoint and broadcast transmission, connectivity between any two distant nodes within a wide-area coverage, quick network configuration/reconfiguration, rapid allocation of space segment capacity, and distance-insensitive cost. To realize the capabilities mentioned above, both the size and cost of the ground-station terminals have to be reduced by using a reliable, high-throughput, fast, and cost-effective on-board computing system, which has been known to be a critical contributor to the overall performance of space mission deployment. Controlled vulnerability of mission data (measured in sensitivity), improved performance (measured in throughput and delay), and fault tolerance (measured in reliability) are some of the most important features of these systems. The system should be thoroughly tested and diagnosed before employing fault tolerance into the system. Testing and fault tolerance strategies should be driven by accurate performance models (i.e., throughput, delay, reliability, and sensitivity) to find an optimal solution in terms of reliability and cost. The modeling and simulation tools will be integrated with a system architecture module, a testing module, and a module for fault tolerance, all of which interact through a centered graphical user interface.

  23. General Monte Carlo reliability simulation code including common mode failures and HARP fault/error-handling

    NASA Technical Reports Server (NTRS)

    Platt, M. E.; Lewis, E. E.; Boehm, F.

    1991-01-01

    A Monte Carlo Fortran computer program was developed that uses two variance reduction techniques for computing system reliability, applicable to solving very large, highly reliable fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault-error handling models. This new capability is called MC-HARP, which efficiently solves reliability models with non-constant failure rates (Weibull). Common mode failure modeling is also a specialty.
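
    A bare-bones illustration of the kind of Monte Carlo reliability estimation described above, minus MC-HARP's variance reduction techniques and HARP fault/error handling: draw Weibull component lifetimes and count how often the system structure survives the mission. The 2-out-of-3 structure and parameter values are illustrative assumptions, not from the code's documentation.

      import random

      def system_survives(t_mission=1000.0):
          # 2-out-of-3 redundant system: survives if at least 2 components outlive
          # the mission. Weibull scale 5000, shape 1.5 are assumed values.
          lifetimes = [random.weibullvariate(5000.0, 1.5) for _ in range(3)]
          return sum(lt > t_mission for lt in lifetimes) >= 2

      n = 100_000
      reliability = sum(system_survives() for _ in range(n)) / n
      print(f"Estimated mission reliability: {reliability:.4f}")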

  24. Enhancing Flood Prediction Reliability Using Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Merwade, V.

    2017-12-01

    Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to reliance on prediction from one model simulation, using an ensemble of predictions that considers uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to obtain reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness, and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction performs better than global-based BMA (BMA_G) prediction, which is superior to the ensemble mean prediction. Additionally, the high-frequency flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
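
    Real BMA fits member weights and variances with an EM algorithm; the simplified sketch below only captures the combination step, weighting each ensemble member by its Gaussian likelihood against training observations and averaging the predictions. The data values are invented, and this is an illustrative approximation rather than the study's procedure.

      import math

      def gaussian_loglik(preds, obs, sigma):
          return sum(-0.5 * math.log(2 * math.pi * sigma**2)
                     - (p - o) ** 2 / (2 * sigma**2) for p, o in zip(preds, obs))

      obs = [2.1, 2.9, 3.8, 3.2]                   # observed stages (toy)
      ensemble = {                                 # member -> predicted stages (toy)
          "model_a": [2.0, 3.0, 4.0, 3.0],
          "model_b": [2.5, 3.5, 4.5, 3.6],
          "model_c": [1.5, 2.5, 3.1, 2.8],
      }
      logliks = {m: gaussian_loglik(p, obs, sigma=0.3) for m, p in ensemble.items()}
      zmax = max(logliks.values())                 # shift for numerical stability
      weights = {m: math.exp(ll - zmax) for m, ll in logliks.items()}
      s = sum(weights.values())
      weights = {m: w / s for m, w in weights.items()}
      bma_pred = [sum(weights[m] * ensemble[m][t] for m in ensemble)
                  for t in range(len(obs))]
      print({m: round(w, 3) for m, w in weights.items()})
      print([round(x, 2) for x in bma_pred])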

  25. Bacterial transformation and biodegradation processes simulation in horizontal subsurface flow constructed wetlands using CWM1-RETRASO.

    PubMed

    Llorens, Esther; Saaltink, Maarten W; Poch, Manel; García, Joan

    2011-01-01

    The performance and reliability of the CWM1-RETRASO model for simulating processes in horizontal subsurface flow constructed wetlands (HSSF CWs), and the relative contribution of different microbial reactions to organic matter (COD) removal in an HSSF CW treating urban wastewater, were evaluated. Various approaches with diverse influent configurations were simulated. According to the simulations, anaerobic processes were more widespread in the simulated wetland and contributed to a higher COD removal rate [72-79%] than anoxic [0-1%] and aerobic reactions [20-27%] did. In all the cases tested, the reaction that most contributed to COD removal was methanogenesis [58-73%]. All results provided by the model were consonant with the literature and experimental field observations, suggesting good performance and reliability of CWM1-RETRASO. Given the good simulation predictions, CWM1-RETRASO is the first mechanistic model able to successfully simulate the processes described by the CWM1 model in HSSF CWs.

  26. System reliability of randomly vibrating structures: Computational modeling and laboratory testing

    NASA Astrophysics Data System (ADS)

    Sundar, V. S.; Ammanagi, S.; Manohar, C. S.

    2015-09-01

    The problem of determining the system reliability of randomly vibrating structures arises in many application areas of engineering. We discuss in this paper approaches based on Monte Carlo simulations and laboratory testing to tackle problems of time-variant system reliability estimation. The strategy we adopt is based on the application of Girsanov's transformation to the governing stochastic differential equations, which enables estimation of the probability of failure with significantly fewer samples than are needed in a direct simulation study. Notably, we show that the ideas from Girsanov's transformation based Monte Carlo simulations can be extended to conduct laboratory testing to assess the system reliability of engineering structures with a reduced number of samples and hence reduced testing times. Illustrative examples include computational studies on a 10-degree-of-freedom nonlinear system model and laboratory/computational investigations on the road load response of an automotive system tested on a four-post test rig.
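
    Girsanov's transformation biases the sample paths of a stochastic differential equation toward failure and corrects the estimate with likelihood-ratio weights. Below is a minimal static analogue of the same variance-reduction idea, estimating a small Gaussian exceedance probability by mean-shifting the sampling density; the threshold and shift are illustrative, and this is not the paper's SDE formulation.

      import math
      import random

      def failure_prob(threshold=4.0, shift=4.0, n=20_000):
          total = 0.0
          for _ in range(n):
              x = random.gauss(shift, 1.0)      # sample from the shifted density
              if x > threshold:                 # failure event
                  # likelihood ratio N(0,1)/N(shift,1) evaluated at x
                  total += math.exp(-shift * x + 0.5 * shift**2)
          return total / n

      print(f"Importance-sampling estimate: {failure_prob():.2e}")
      # Crude Monte Carlo would need millions of samples to see any
      # exceedances of a 4-sigma threshold; the shifted sampler sees many.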

  27. A general software reliability process simulation technique

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.

  28. A particle swarm model for estimating reliability and scheduling system maintenance

    NASA Astrophysics Data System (ADS)

    Puzis, Rami; Shirtz, Dov; Elovici, Yuval

    2016-05-01

    Modifying data and information system components may introduce new errors and deteriorate the reliability of the system. Reliability can be efficiently regained with reliability centred maintenance, which requires reliability estimation for maintenance scheduling. A variant of the particle swarm model is used to estimate reliability of systems implemented according to the model view controller paradigm. Simulations based on data collected from an online system of a large financial institute are used to compare three component-level maintenance policies. Results show that appropriately scheduled component-level maintenance greatly reduces the cost of upholding an acceptable level of reliability by reducing the need in system-wide maintenance.

  29. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Presented are the numerous factors that potentially have a degrading effect on system reliability, and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  30. Pediatric laryngeal simulator using 3D printed models: A novel technique.

    PubMed

    Kavanagh, Katherine R; Cote, Valerie; Tsui, Yvonne; Kudernatsch, Simon; Peterson, Donald R; Valdez, Tulio A

    2017-04-01

    Simulation to acquire and test technical skills is an essential component of medical education and residency training in both surgical and nonsurgical specialties. High-quality simulation education relies on the availability, accessibility, and reliability of models. The objective of this work was to describe a practical pediatric laryngeal model for use in otolaryngology residency training. Ideally, this model would be low-cost, have tactile properties resembling human tissue, and be reliably reproducible. Pediatric laryngeal models were developed using two manufacturing methods: direct three-dimensional (3D) printing of anatomical models and casted anatomical models using 3D-printed molds. Polylactic acid, acrylonitrile butadiene styrene, and high-impact polystyrene (HIPS) were used for the directly printed models, whereas a silicone elastomer (SE) was used for the casted models. The models were evaluated for anatomic quality, ease of manipulation, hardness, and cost of production. A tissue likeness scale was created to validate the simulation model. Fleiss' Kappa rating was performed to evaluate interrater agreement, and analysis of variance was performed to evaluate differences among the materials. The SE provided the most anatomically accurate models, with the tactile properties allowing for surgical manipulation of the larynx. Direct 3D printing was more cost-effective than the SE casting method but did not possess the material properties and tissue likeness necessary for surgical simulation. The SE models of the pediatric larynx created from a casting method demonstrated high quality anatomy, tactile properties comparable to human tissue, and easy manipulation with standard surgical instruments. Their use in a reliable, low-cost, accessible, modular simulation system provides a valuable training resource for otolaryngology residents.

  31. Simulated Students and Classroom Use of Model-Based Intelligent Tutoring

    NASA Technical Reports Server (NTRS)

    Koedinger, Kenneth R.

    2008-01-01

    Two educational uses of models and simulations: 1) students create models and use simulations; and 2) researchers create models of learners to guide development of reliably effective materials. Cognitive tutors simulate and support tutoring; data are crucial to creating an effective model. Pittsburgh Science of Learning Center: resources for modeling, authoring, and experimentation; repository of data and theory. Examples of advanced modeling efforts: SimStudent learns a rule-based model. Help-seeking model: tutors metacognition. Scooter uses machine-learning detectors of student engagement.

  32. Do downscaled general circulation models reliably simulate historical climatic conditions?

    USGS Publications Warehouse

    Bock, Andrew R.; Hay, Lauren E.; McCabe, Gregory J.; Markstrom, Steven L.; Atkinson, R. Dwight

    2018-01-01

    The accuracy of statistically downscaled (SD) general circulation model (GCM) simulations of monthly surface climate for historical conditions (1950–2005) was assessed for the conterminous United States (CONUS). The SD monthly precipitation (PPT) and temperature (TAVE) from 95 GCMs from phases 3 and 5 of the Coupled Model Intercomparison Project (CMIP3 and CMIP5) were used as inputs to a monthly water balance model (MWBM). Distributions of MWBM input (PPT and TAVE) and output [runoff (RUN)] variables derived from gridded station data (GSD) and historical SD climate were compared using the Kolmogorov–Smirnov (KS) test. For all three variables considered, the KS test results showed that variables simulated using CMIP5 generally are more reliable than those derived from CMIP3, likely due to improvements in PPT simulations. At most locations across the CONUS, the largest differences between GSD and SD PPT and RUN occurred in the lowest part of the distributions (i.e., low-flow RUN and low-magnitude PPT). Results indicate that for the majority of the CONUS, there are downscaled GCMs that can reliably simulate historical climatic conditions. But, in some geographic locations, none of the SD GCMs replicated historical conditions for two of the three variables (PPT and RUN) based on the KS test, with a significance level of 0.05. In these locations, improved GCM simulations of PPT are needed to more reliably estimate components of the hydrologic cycle. Simple metrics and statistical tests, such as those described here, can provide an initial set of criteria to help simplify GCM selection.
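
    The screening idea above is straightforward to reproduce: a two-sample Kolmogorov-Smirnov test at the 0.05 level decides whether a downscaled series is distributionally consistent with the observation-based series. In the sketch below, the two synthetic gamma-distributed monthly precipitation samples are invented stand-ins for the GSD and SD data.

      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(0)
      # 672 months = 56 years (1950-2005); gamma parameters are assumed.
      gsd_ppt = rng.gamma(shape=2.0, scale=40.0, size=672)  # gridded station data
      sd_ppt = rng.gamma(shape=2.2, scale=38.0, size=672)   # downscaled GCM output

      res = ks_2samp(gsd_ppt, sd_ppt)
      reliable = res.pvalue >= 0.05   # fail to reject: distributions consistent
      print(f"KS statistic={res.statistic:.3f}, p={res.pvalue:.3f}, reliable={reliable}")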

  33. Reliability of analog quantum simulation

    DOE PAGES

    Sarovar, Mohan; Zhang, Jun; Zeng, Lishan

    2017-01-03

    Analog quantum simulators (AQS) will likely be the first nontrivial application of quantum technology for predictive simulation. However, there remain questions regarding the degree of confidence that can be placed in the results of AQS since they do not naturally incorporate error correction. Specifically, how do we know whether an analog simulation of a quantum model will produce predictions that agree with the ideal model in the presence of inevitable imperfections? At the same time there is a widely held expectation that certain quantum simulation questions will be robust to errors and perturbations in the underlying hardware. Resolving these two points of view is a critical step in making the most of this promising technology. In this paper we formalize the notion of AQS reliability by determining the sensitivity of AQS outputs to underlying parameters, and formulate conditions for robust simulation. Our approach naturally reveals the importance of model symmetries in dictating the robust properties. Finally, to demonstrate the approach, we characterize the robust features of a variety of quantum many-body models.

  34. Can one trust quantum simulators?

    PubMed

    Hauke, Philipp; Cucchietti, Fernando M; Tagliacozzo, Luca; Deutsch, Ivan; Lewenstein, Maciej

    2012-08-01

    Various fundamental phenomena of strongly correlated quantum systems such as high-Tc superconductivity, the fractional quantum-Hall effect and quark confinement are still awaiting a universally accepted explanation. The main obstacle is the computational complexity of solving even the most simplified theoretical models which are designed to capture the relevant quantum correlations of the many-body system of interest. In his seminal 1982 paper (Feynman 1982 Int. J. Theor. Phys. 21 467), Richard Feynman suggested that such models might be solved by 'simulation' with a new type of computer whose constituent parts are effectively governed by a desired quantum many-body dynamics. Measurements on this engineered machine, now known as a 'quantum simulator,' would reveal some unknown or difficult to compute properties of a model of interest. We argue that a useful quantum simulator must satisfy four conditions: relevance, controllability, reliability and efficiency. We review the current state of the art of digital and analog quantum simulators. Whereas so far the majority of the focus, both theoretically and experimentally, has been on controllability of relevant models, we emphasize here the need for a careful analysis of reliability and efficiency in the presence of imperfections. We discuss how disorder and noise can impact these conditions, and illustrate our concerns with novel numerical simulations of a paradigmatic example: a disordered quantum spin chain governed by the Ising model in a transverse magnetic field. We find that disorder can decrease the reliability of an analog quantum simulator of this model, although large errors in local observables are introduced only for strong levels of disorder. We conclude that the answer to the question 'Can we trust quantum simulators?' is … to some extent.

  35. Review of Dynamic Modeling and Simulation of Large Scale Belt Conveyor System

    NASA Astrophysics Data System (ADS)

    He, Qing; Li, Hong

    Belt conveyors are among the most important devices for transporting bulk solid material over long distances. Dynamic analysis is key to deciding whether a design is technically rational, safe and reliable in operation, and economically feasible. Studying dynamic properties is essential for improving efficiency and productivity and for guaranteeing safe, reliable, and stable conveyor operation. The dynamic research on and applications of large scale belt conveyors are discussed. The main research topics and the state of the art of dynamic research on belt conveyors are analyzed. Future work will focus on dynamic analysis, modeling, and simulation of the main components and the whole system, on nonlinear modeling and simulation, and on vibration analysis of large scale conveyor systems.

  36. A simulated training model for laparoscopic pyloromyotomy: Is 3D printing the way of the future?

    PubMed

    Williams, Andrew; McWilliam, Morgan; Ahlin, James; Davidson, Jacob; Quantz, Mackenzie A; Bütter, Andreana

    2018-05-01

    Hypertrophic pyloric stenosis (HPS) is a common neonatal condition treated with open or laparoscopic pyloromyotomy. 3D-printed organs offer realistic simulations to practice surgical techniques. The purpose of this study was to validate a 3D HPS stomach model and assess model reliability and surgical realism. Medical students, general surgery residents, and adult and pediatric general surgeons were recruited from a single center. Participants were videotaped three times performing a laparoscopic pyloromyotomy using box trainers and 3D-printed stomachs. Attempts were graded independently by three reviewers using GOALS and Task Specific Assessments (TSA). Participants were surveyed using the Index of Agreement of Assertions on Model Accuracy (IAAMA). Participants reported their experience levels as novice (22%), inexperienced (26%), intermediate (19%), and experienced (33%). Interrater reliability was similar for overall average GOALS and TSA scores. There was a significant improvement in GOALS (p<0.0001) and TSA scores (p=0.03) between attempts and overall. Participants felt the model accurately simulated a laparoscopic pyloromyotomy (82%) and would be a useful tool for beginners (100%). A 3D-printed stomach model for simulated laparoscopic pyloromyotomy is a useful training tool for learners to improve laparoscopic skills. The GOALS and TSA provide reliable technical skills assessments.

  37. Mathematical Capture of Human Crowd Behavioral Data for Computational Model Building, Verification, and Validation

    DTIC Science & Technology

    2011-03-21

    throughout the experimental runs. Reliable and validated measures of anxiety (Spielberger, 1983), as well as custom-constructed questionnaires about...Crowd modeling and simulation technologies. Transactions on modeling and computer simulation, 20(4). Spielberger, C. D. (1983

  38. Big data analytics for the Future Circular Collider reliability and availability studies

    NASA Astrophysics Data System (ADS)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.

  39. Reliability Analysis of Sealing Structure of Electromechanical System Based on Kriging Model

    NASA Astrophysics Data System (ADS)

    Zhang, F.; Wang, Y. M.; Chen, R. W.; Deng, W. W.; Gao, Y.

    2018-05-01

    The sealing performance of an aircraft electromechanical system has a great influence on flight safety, and the reliability of its typical seal structure is analyzed by researchers. In this paper, we take a reciprocating seal structure as the research object for studying structural reliability. Based on the finite element numerical simulation method, the contact stress between the rubber sealing ring and the cylinder wall is calculated, the relationship between the contact stress and the pressure of the hydraulic medium is established, and the friction forces under different working conditions are compared. Through co-simulation, an adaptive Kriging model obtained with the EFF learning mechanism is used to describe the failure probability of the seal ring, so as to evaluate the reliability of the sealing structure. This article proposes a new approach to numerical evaluation for the reliability analysis of sealing structures, and also provides a theoretical basis for the optimal design of such structures.

  40. ETARA PC version 3.3 user's guide: Reliability, availability, maintainability simulation model

    NASA Technical Reports Server (NTRS)

    Hoffman, David J.; Viterna, Larry A.

    1991-01-01

    A user's manual describing an interactive, menu-driven, personal computer based Monte Carlo reliability, availability, and maintainability simulation program called event time availability reliability (ETARA) is discussed. Given a reliability block diagram representation of a system, ETARA simulates the behavior of the system over a specified period of time using Monte Carlo methods to generate block failure and repair intervals as a function of exponential and/or Weibull distributions. Availability parameters such as equivalent availability, state availability (percentage of time as a particular output state capability), continuous state duration and number of state occurrences can be calculated. Initial spares allotment and spares replenishment on a resupply cycle can be simulated. The number of block failures are tabulated both individually and by block type, as well as total downtime, repair time, and time waiting for spares. Also, maintenance man-hours per year and system reliability, with or without repair, at or above a particular output capability can be calculated over a cumulative period of time or at specific points in time.
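
    The core of such a reliability/availability/maintainability simulation is easy to sketch: alternate failure and repair intervals drawn from the assumed distributions and accumulate uptime. The single block with exponential times and invented MTBF/MTTR values below stands in for a full reliability block diagram; ETARA itself also handles Weibull distributions, spares, and state tabulations, which are omitted here.

      import random

      def simulate_availability(mtbf=500.0, mttr=24.0, horizon=8760.0):
          """One Monte Carlo history of a single block over a one-year horizon (hours)."""
          t, uptime = 0.0, 0.0
          while t < horizon:
              ttf = random.expovariate(1.0 / mtbf)     # time to failure
              uptime += min(ttf, horizon - t)
              t += ttf
              if t < horizon:
                  t += random.expovariate(1.0 / mttr)  # repair interval (downtime)
          return uptime / horizon

      runs = [simulate_availability() for _ in range(2000)]
      print(f"Mean availability over one year: {sum(runs) / len(runs):.4f}")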

  41. A Topology Control Strategy with Reliability Assurance for Satellite Cluster Networks in Earth Observation

    PubMed Central

    Chen, Qing; Zhang, Jinxiu; Hu, Ze

    2017-01-01

    This article investigates the dynamic topology control problem of satellite cluster networks (SCNs) in Earth observation (EO) missions by applying a novel metric of stability for inter-satellite links (ISLs). The properties of the periodicity and predictability of satellites’ relative position are involved in the link cost metric which is to give a selection criterion for choosing the most reliable data routing paths. Also, a cooperative work model with reliability is proposed for the situation of emergency EO missions. Based on the link cost metric and the proposed reliability model, a reliability assurance topology control algorithm and its corresponding dynamic topology control (RAT) strategy are established to maximize the stability of data transmission in the SCNs. The SCNs scenario is tested through some numeric simulations of the topology stability of average topology lifetime and average packet loss rate. Simulation results show that the proposed reliable strategy applied in SCNs significantly improves the data transmission performance and prolongs the average topology lifetime. PMID:28241474

  42. A Topology Control Strategy with Reliability Assurance for Satellite Cluster Networks in Earth Observation.

    PubMed

    Chen, Qing; Zhang, Jinxiu; Hu, Ze

    2017-02-23

    This article investigates the dynamic topology control problem of satellite cluster networks (SCNs) in Earth observation (EO) missions by applying a novel metric of stability for inter-satellite links (ISLs). The properties of the periodicity and predictability of satellites' relative position are involved in the link cost metric, which is to give a selection criterion for choosing the most reliable data routing paths. Also, a cooperative work model with reliability is proposed for the situation of emergency EO missions. Based on the link cost metric and the proposed reliability model, a reliability assurance topology control algorithm and its corresponding dynamic topology control (RAT) strategy are established to maximize the stability of data transmission in the SCNs. The SCNs scenario is tested through some numeric simulations of the topology stability of average topology lifetime and average packet loss rate. Simulation results show that the proposed reliable strategy applied in SCNs significantly improves the data transmission performance and prolongs the average topology lifetime.

  43. Can one trust quantum simulators?

    NASA Astrophysics Data System (ADS)

    Hauke, Philipp; Cucchietti, Fernando M.; Tagliacozzo, Luca; Deutsch, Ivan; Lewenstein, Maciej

    2012-08-01

    Various fundamental phenomena of strongly correlated quantum systems such as high-Tc superconductivity, the fractional quantum-Hall effect and quark confinement are still awaiting a universally accepted explanation. The main obstacle is the computational complexity of solving even the most simplified theoretical models which are designed to capture the relevant quantum correlations of the many-body system of interest. In his seminal 1982 paper (Feynman 1982 Int. J. Theor. Phys. 21 467), Richard Feynman suggested that such models might be solved by ‘simulation’ with a new type of computer whose constituent parts are effectively governed by a desired quantum many-body dynamics. Measurements on this engineered machine, now known as a ‘quantum simulator,’ would reveal some unknown or difficult to compute properties of a model of interest. We argue that a useful quantum simulator must satisfy four conditions: relevance, controllability, reliability and efficiency. We review the current state of the art of digital and analog quantum simulators. Whereas so far the majority of the focus, both theoretically and experimentally, has been on controllability of relevant models, we emphasize here the need for a careful analysis of reliability and efficiency in the presence of imperfections. We discuss how disorder and noise can impact these conditions, and illustrate our concerns with novel numerical simulations of a paradigmatic example: a disordered quantum spin chain governed by the Ising model in a transverse magnetic field. We find that disorder can decrease the reliability of an analog quantum simulator of this model, although large errors in local observables are introduced only for strong levels of disorder. We conclude that the answer to the question ‘Can we trust quantum simulators?’ is … to some extent.

  44. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may be from many causes depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation analyses allow us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation analyses to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimation methods perform better than maximum likelihood. The sensitivity analyses show some sensitivity to shifts of the prior locations, and also show the robustness of the Bayesian analysis within the range between the true value and the maximum likelihood estimate.
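
    The competing-risk data generation step described above is compact: each unit fails from whichever independent Weibull cause occurs first, and both the failure time and the failure cause are recorded for later estimation. Parameter values in this sketch are illustrative, not the paper's.

      import random

      CAUSES = {"wear": (2000.0, 2.0), "shock": (5000.0, 1.0)}  # cause -> (scale, shape)

      def draw_failure():
          """Draw one lifetime per independent cause; the earliest one wins."""
          times = {c: random.weibullvariate(a, b) for c, (a, b) in CAUSES.items()}
          cause = min(times, key=times.get)
          return times[cause], cause

      data = [draw_failure() for _ in range(1000)]
      share = sum(1 for _, c in data if c == "wear") / len(data)
      print(f"Fraction of failures due to wear: {share:.2f}")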

  5. Evaluation of ceramics for stator application: Gas turbine engine report

    NASA Technical Reports Server (NTRS)

    Trela, W.; Havstad, P. H.

    1978-01-01

    Current ceramic materials, component fabrication processes, and reliability prediction capability for ceramic stators in an automotive gas turbine engine environment are assessed. Simulated engine duty cycle testing of stators conducted at temperatures up to 1093 °C is discussed. Materials evaluated are SiC and Si3N4 fabricated by two near-net-shape processes: slip casting and injection molding. Stators for durability cycle evaluation, test specimens for material property characterization, and a reliability prediction model prepared to predict stator performance in the simulated engine environment are considered. The status and description of the work performed on the reliability prediction modeling, stator fabrication, material property characterization, and ceramic stator evaluation efforts are reported.

  6. Probabilistic simulation of the human factor in structural reliability

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Chamis, Christos C.

    1991-01-01

    Structural failures have occasionally been attributed to human factors in engineering design, analysis, maintenance, and fabrication processes. Every facet of the engineering process is heavily governed by human factors and the degree of uncertainty associated with them. Societal, physical, professional, psychological, and many other factors introduce uncertainties that significantly influence the reliability of human performance. Quantifying human factors and the associated uncertainties in structural reliability requires: (1) identification of the fundamental factors that influence human performance, and (2) models to describe the interaction of these factors. An approach is being developed to quantify the uncertainties associated with human performance. It consists of a multifactor model in conjunction with direct Monte Carlo simulation.

  7. Cotton irrigation scheduling using a crop growth model and FAO-56 methods: Field and simulation studies

    USDA-ARS?s Scientific Manuscript database

    Crop growth simulation models can address a variety of agricultural problems, but their use to directly assist in-season irrigation management decisions is less common. Confidence in model reliability can be increased if models are shown to provide improved in-season management recommendations, whi...

  8. Simple, stable and reliable modeling of gas properties of organic working fluids in aerodynamic designs of turbomachinery for ORC and VCC

    NASA Astrophysics Data System (ADS)

    Kawakubo, T.

    2016-05-01

    A simple, stable and reliable model of the real gas nature of the working fluid is required for the aerodynamic designs of the turbine in the Organic Rankine Cycle and of the compressor in the Vapor Compression Cycle. Although many modern Computational Fluid Dynamics tools are capable of incorporating real gas models, simulations with such a gas model tend to be more time-consuming than those with a perfect gas model and can even become unstable near the saturation boundary. A perfect gas approximation therefore remains an attractive option for conducting design simulations stably and swiftly. In this paper, an effective method of CFD simulation with a perfect gas approximation is discussed. A method is presented that represents the performance of the centrifugal compressor or the radial-inflow turbine by a set of non-dimensional performance parameters and translates the fictitious perfect gas result to the actual real gas performance.

  9. Predicting pedestrian flow: a methodology and a proof of concept based on real-life data.

    PubMed

    Davidich, Maria; Köster, Gerta

    2013-01-01

    Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For instance, our analysis of pedestrian motion under natural conditions at a major German railway station reveals that the values for free-flow velocities and the flow-density relationship differ significantly from widely used literature values. It is thus necessary to calibrate and validate the model against relevant real-life data to make it capable of reproducing and predicting real-life scenarios. In this work we aim at constructing such a realistic pedestrian stream simulation. Based on the analysis of real-life data, we present a methodology that identifies key parameters and interdependencies that enable us to properly calibrate the model. The success of the approach is demonstrated for a benchmark model, a cellular automaton. We show that the proposed approach significantly improves the reliability of the simulation and hence the potential prediction accuracy. The simulation is validated by comparing the local density evolution of the measured data to that of the simulated data. We find that for our model the most sensitive parameters are: the source-target distribution of the pedestrian trajectories, the schedule of pedestrian appearances in the scenario and the mean free-flow velocity. Our results emphasize the need for real-life data extraction and analysis to enable predictive simulations.

  10. Modeling and simulation of reliability of unmanned intelligent vehicles

    NASA Astrophysics Data System (ADS)

    Singh, Harpreet; Dixit, Arati M.; Mustapha, Adam; Singh, Kuldip; Aggarwal, K. K.; Gerhart, Grant R.

    2008-04-01

    Unmanned ground vehicles have a large number of scientific, military and commercial applications. A convoy of such vehicles can exhibit collaboration and coordination. For the movement of such a convoy, it is important to predict the reliability of the system. A number of approaches are available in the literature that describe techniques for determining the reliability of a system. Graph theoretic approaches are popular in determining terminal reliability and system reliability. In this paper we propose to exploit Fuzzy and Neuro-Fuzzy approaches for predicting the node and branch reliabilities of the system, while Boolean algebra approaches are used to determine terminal reliability and system reliability. Hence a combination of intelligent approaches (Fuzzy and Neuro-Fuzzy) and Boolean approaches is used to predict the overall system reliability of a convoy of vehicles. The node reliabilities may correspond to the collaboration of vehicles, while branch reliabilities determine the terminal reliabilities between different nodes. An algorithm is proposed for determining the system reliabilities of a convoy of vehicles, and simulation of the overall system is proposed. Such simulation should be helpful to a commander in taking appropriate action depending on the predicted reliability in different terrain and environmental conditions. It is hoped that the results of this paper will lead to further techniques for maintaining a reliable convoy of vehicles in a battlefield.
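
    To make the Boolean-algebra step concrete, here is a hedged sketch (not the paper's algorithm) that computes terminal (s-t) reliability of a small hypothetical convoy communication network by exhaustive enumeration of link states. Node reliabilities, which the paper obtains from Fuzzy and Neuro-Fuzzy models, are omitted; the network and its link reliabilities are invented.

        from itertools import product

        # Terminal (s-t) reliability by exhaustive state enumeration: sum the
        # probability of every subset of working links that still connects s to t.
        # Exponential in the number of links, so only for small illustrative nets.
        def terminal_reliability(nodes, links, s, t):
            # links: {(u, v): probability that the u-v link works}
            edges = list(links)
            total = 0.0
            for state in product([True, False], repeat=len(edges)):
                # probability of this exact up/down pattern
                p = 1.0
                for up, e in zip(state, edges):
                    p *= links[e] if up else 1.0 - links[e]
                # connectivity check over the working links (undirected DFS)
                adj = {n: [] for n in nodes}
                for up, (u, v) in zip(state, edges):
                    if up:
                        adj[u].append(v)
                        adj[v].append(u)
                seen, stack = {s}, [s]
                while stack:
                    for w in adj[stack.pop()]:
                        if w not in seen:
                            seen.add(w)
                            stack.append(w)
                if t in seen:
                    total += p
            return total

        # Hypothetical 4-vehicle convoy with per-link communication reliabilities
        links = {("A", "B"): 0.95, ("B", "C"): 0.90, ("A", "C"): 0.80, ("C", "D"): 0.97}
        print(terminal_reliability("ABCD", links, "A", "D"))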

  11. Reimplementation of the Biome-BGC model to simulate successional change.

    PubMed

    Bond-Lamberty, Ben; Gower, Stith T; Ahl, Douglas E; Thornton, Peter E

    2005-04-01

    Biogeochemical process models are increasingly employed to simulate current and future forest dynamics, but most simulate only a single canopy type. This limitation means that mixed stands, canopy succession and understory dynamics cannot be modeled, severe handicaps in many forests. The goals of this study were to develop a version of Biome-BGC that supported multiple, interacting vegetation types, and to assess its performance and limitations by comparing modeled results to published data from a 150-year boreal black spruce (Picea mariana (Mill.) BSP) chronosequence in northern Manitoba, Canada. Model data structures and logic were modified to support an arbitrary number of interacting vegetation types; an explicit height calculation was necessary to prioritize radiation and precipitation interception. Two vegetation types, evergreen needle-leaf and deciduous broadleaf, were modeled based on site-specific meteorological and physiological data. The new version of Biome-BGC reliably simulated observed changes in leaf area, net primary production and carbon stocks, and should be useful for modeling the dynamics of mixed-species stands and ecological succession. We discuss the strengths and limitations of Biome-BGC for this application, and note areas in which further work is necessary for reliable simulation of boreal biogeochemical cycling at a landscape scale.

  12. Design Strategy for a Formally Verified Reliable Computing Platform

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.

    1991-01-01

    This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis as well as the "correctness" models.

  13. Nodal failure index approach to groundwater remediation design

    USGS Publications Warehouse

    Lee, J.; Reeves, H.W.; Dowding, C.H.

    2008-01-01

    Computer simulations often are used to design and to optimize groundwater remediation systems. We present a new computationally efficient approach that calculates the reliability of remedial design at every location in a model domain with a single simulation. The estimated reliability and other model information are used to select a best remedial option for given site conditions, conceptual model, and available data. To evaluate design performance, we introduce the nodal failure index (NFI) to determine the number of nodal locations at which the probability of success is below the design requirement. The strength of the NFI approach is that selected areas of interest can be specified for analysis and the best remedial design determined for this target region. An example application of the NFI approach using a hypothetical model shows how the spatial distribution of reliability can be used for a decision support system in groundwater remediation design. © 2008 ASCE.

  14. Developing Cognitive Models for Social Simulation from Survey Data

    NASA Astrophysics Data System (ADS)

    Alt, Jonathan K.; Lieberman, Stephen

    The representation of human behavior and cognition continues to challenge the modeling and simulation community. The use of survey and polling instruments to inform belief states, issue stances and action choice models provides a compelling means of developing models and simulations with empirical data. Using these types of data to populate social simulations can greatly enhance the feasibility of validation efforts, the reusability of social and behavioral modeling frameworks, and the testable reliability of simulations. We provide a case study demonstrating these effects, document the use of survey data to develop cognitive models, and suggest future paths forward for social and behavioral modeling.

  15. Flow Channel Influence of a Collision-Based Piezoelectric Jetting Dispenser on Jet Performance

    PubMed Central

    Deng, Guiling; Li, Junhui; Duan, Ji’an

    2018-01-01

    To improve the jet performance of a bi-piezoelectric jet dispenser, mathematical and simulation models were established according to the operating principle. In order to improve the accuracy and reliability of the simulation calculation, a viscosity model of the fluid was fitted to a fifth-order function of shear rate based on rheological test data, and the needle displacement model was fitted to a ninth-order function of time based on real-time displacement test data. The results show that jet performance is related to the diameter of the nozzle outlet and the cone angle of the nozzle, and the impacts of the flow channel structure were confirmed. The numerical simulation approach is confirmed by droplet-volume test results. It will provide a reliable simulation platform for mechanical collision-based jet dispensing and a theoretical basis for micro jet valve design and improvement. PMID:29677140
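
    As a hedged illustration of the curve-fitting step (the rheological data below are invented, not the paper's measurements), a fifth-order polynomial fit of viscosity against shear rate can be obtained with numpy:

        import numpy as np

        # Hypothetical rheological test data: shear rate (1/s) vs. viscosity (Pa·s)
        shear_rate = np.array([1, 10, 50, 100, 500, 1000, 5000], dtype=float)
        viscosity = np.array([85.0, 62.0, 40.0, 31.0, 15.0, 9.5, 3.2])

        # Fifth-order polynomial fit of viscosity as a function of shear rate,
        # mirroring the fitting step described in the abstract.
        x = shear_rate / shear_rate.max()   # scale for numerical conditioning
        coeffs = np.polyfit(x, viscosity, deg=5)
        visc_model = np.poly1d(coeffs)

        # Predicted viscosity at a 200 1/s shear rate (same scaling applied)
        print(visc_model(200.0 / shear_rate.max()))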

  16. Numerical simulations of atmospheric dispersion of iodine-131 by different models.

    PubMed

    Leelőssy, Ádám; Mészáros, Róbert; Kovács, Attila; Lagzi, István; Kovács, Tibor

    2017-01-01

    Nowadays, several dispersion models are available to simulate the transport processes of air pollutants and toxic substances, including radionuclides, in the atmosphere. The reliability of atmospheric transport models has been demonstrated in several recent cases from local to global scale; however, very few actual emission data are available to evaluate model results in real-life cases. In this study, the atmospheric dispersion of 131I emitted to the atmosphere during an industrial process was simulated with different models, namely the WRF-Chem Eulerian online coupled model and the HYSPLIT and RAPTOR Lagrangian models. Although only limited 131I detection data have been available, the accuracy of the modeled plume direction could be evaluated in complex late-autumn weather situations. For the studied cases, the general reliability of the models was demonstrated. However, serious uncertainties arise related to low-level inversions, especially in the case of an emission event on 4 November 2011, when a significant wind shear caused a large difference between simulated and real transport directions. Results underline the importance of prudent interpretation of dispersion model results and of identifying weather conditions with the potential to cause large model errors.

  17. [Animal experimentation, computer simulation and surgical research].

    PubMed

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During the research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  18. SIERRA - A 3-D device simulator for reliability modeling

    NASA Astrophysics Data System (ADS)

    Chern, Jue-Hsien; Arledge, Lawrence A., Jr.; Yang, Ping; Maeda, John T.

    1989-05-01

    SIERRA is a three-dimensional general-purpose semiconductor-device simulation program which serves as a foundation for investigating integrated-circuit (IC) device and reliability issues. This program solves the Poisson and continuity equations in silicon under dc, transient, and small-signal conditions. Executing on a vector/parallel minisupercomputer, SIERRA utilizes a matrix solver based on an incomplete LU (ILU) preconditioned conjugate gradient squared (CGS, BCG) method. The ILU-CGS method provides a good compromise between memory size and convergence rate. The authors have observed a 5x to 7x speedup over standard direct methods in simulations of transient problems containing highly coupled Poisson and continuity equations such as those found in reliability-oriented simulations. The application of SIERRA to parasitic CMOS latchup and dynamic random-access memory single-event-upset studies is described.
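
    A modern, hedged sketch of the same numerical idea using SciPy (not SIERRA's implementation): an incomplete LU factorization serves as the preconditioner for a conjugate gradient squared solve on a sparse Poisson-like test matrix standing in for the discretized device equations.

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        # Sparse 2-D Poisson-like test operator; SIERRA's actual matrices differ.
        n = 50
        A = sp.diags([-1, -1, 4, -1, -1], [-n, -1, 0, 1, n],
                     shape=(n * n, n * n), format="csc")
        b = np.ones(n * n)

        # Incomplete LU factorization used as a preconditioner
        ilu = spla.spilu(A, drop_tol=1e-4)
        M = spla.LinearOperator(A.shape, ilu.solve)

        # Conjugate gradient squared with the ILU preconditioner
        x, info = spla.cgs(A, b, M=M)
        print("converged" if info == 0 else f"cgs returned info={info}",
              "residual:", np.linalg.norm(b - A @ x))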

  20. Space Vehicle Reliability Modeling in DIORAMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tornga, Shawn Robert

    When modeling system performance of space based detection systems it is important to consider spacecraft reliability. As space vehicles age the components become prone to failure for a variety of reasons such as radiation damage. Additionally, some vehicles may lose the ability to maneuver once they exhaust fuel supplies. Typically failure is divided into two categories: engineering mistakes and technology surprise. This document will report on a method of simulating space vehicle reliability in the DIORAMA framework.

  1. Model of load balancing using reliable algorithm with multi-agent system

    NASA Astrophysics Data System (ADS)

    Afriansyah, M. F.; Somantri, M.; Riyadi, M. A.

    2017-04-01

    Massive technology development parallels the growth of internet users, which increases network traffic activity and the load on the system. The use of a reliable algorithm and mobile agents for distributed load balancing is a viable solution to handle the load on a large-scale system. A mobile agent collects resource information and can migrate according to a given task. We propose a reliable load balancing algorithm using least time first byte (LFB) combined with information from the mobile agent. In the system overview, the methodology consisted of defining the identification system, the specification requirements, the network topology and the design of the system infrastructure. The simulation sent 1800 requests over 10 s from users to the servers and collected the data for analysis. The software simulation was based on Apache JMeter, observing the response time and reliability of each server and comparing them with an existing method. Results of the performed simulation show that the LFB method with a mobile agent can perform load balancing efficiently across all backend servers without bottlenecks, with low risk of server overload, and reliably.
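
    A minimal, hedged sketch of a least-time-first-byte selection policy (not the paper's implementation, which additionally relies on mobile-agent load reports): keep an exponentially weighted moving average of each backend's observed time-to-first-byte and dispatch each request to the current minimum. Backend names and measurements below are invented.

        import random

        class LFBBalancer:
            def __init__(self, backends, alpha=0.3):
                self.ttfb = {b: 0.05 for b in backends}   # seed estimates (seconds)
                self.alpha = alpha

            def report(self, backend, measured_ttfb):
                # EWMA update from a fresh time-to-first-byte observation
                old = self.ttfb[backend]
                self.ttfb[backend] = (1 - self.alpha) * old + self.alpha * measured_ttfb

            def choose(self):
                # dispatch to the backend with the smallest estimated TTFB
                return min(self.ttfb, key=self.ttfb.get)

        random.seed(2)
        lb = LFBBalancer(["srv-a", "srv-b", "srv-c"])
        for _ in range(1800):                          # mirror the 1800-request experiment
            b = lb.choose()
            lb.report(b, random.uniform(0.02, 0.12))   # stand-in TTFB measurement
        print(lb.ttfb)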

  2. Evaluation of East Asian climatology as simulated by seven coupled models

    NASA Astrophysics Data System (ADS)

    Jiang, Dabang; Wang, Huijun; Lang, Xianmei

    2005-07-01

    Using observation and reanalysis data for 1961-1990, the East Asian surface air temperature, precipitation and sea level pressure climatology as simulated by seven fully coupled atmosphere-ocean models, namely CCSR/NIES, CGCM2, CSIRO-Mk2, ECHAM4/OPYC3, GFDL-R30, HadCM3, and NCAR-PCM, is systematically evaluated in this study. The models can successfully reproduce the annual and seasonal surface air temperature and precipitation climatology in East Asia, with relatively good performance for boreal autumn and the annual mean. The models simulate surface air temperature more reliably than precipitation. In addition, the models dependably capture the geographical distribution pattern of annual, boreal winter, spring and autumn sea level pressure in East Asia. In contrast, relatively large simulation errors appear when simulated boreal summer sea level pressure is compared with reanalysis data in East Asia. The simulation errors for surface air temperature, precipitation and sea level pressure are generally large over and around the Tibetan Plateau. No individual model is best in every aspect. As a whole, the ECHAM4/OPYC3 and HadCM3 performances are much better, whereas CGCM2 is relatively poorer in East Asia. Additionally, the seven-model ensemble mean usually shows relatively high reliability.

  3. Advanced reliability modeling of fault-tolerant computer-based systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1982-01-01

    Two methodologies for the reliability assessment of fault tolerant digital computer based systems are discussed. The computer-aided reliability estimation 3 (CARE 3) and gate logic software simulation (GLOSS) are assessment technologies that were developed to mitigate a serious weakness in the design and evaluation process of ultrareliable digital systems. The weak link is based on the unavailability of a sufficiently powerful modeling technique for comparing the stochastic attributes of one system against others. Some of the more interesting attributes are reliability, system survival, safety, and mission success.

  4. Optimization of life support systems and their systems reliability

    NASA Technical Reports Server (NTRS)

    Fan, L. T.; Hwang, C. L.; Erickson, L. E.

    1971-01-01

    The identification, analysis, and optimization of life support systems and subsystems have been investigated. For each system or subsystem considered, the procedure involves the establishment of a set of system equations (or mathematical model) based on theory and experimental evidence; the analysis and simulation of the model; the optimization of the operation, control, and reliability; analysis of the sensitivity of the system based on the model; and, if possible, experimental verification of the theoretical and computational results. Research activities include: (1) modeling of air flow in a confined space; (2) review of several different gas-liquid contactors utilizing centrifugal force; (3) review of carbon dioxide reduction contactors in space vehicles and other enclosed structures; (4) application of modern optimal control theory to environmental control of confined spaces; (5) optimal control of a class of nonlinear diffusional distributed parameter systems; (6) optimization of system reliability of life support systems and subsystems; (7) modeling, simulation and optimal control of the human thermal system; and (8) analysis and optimization of the water-vapor electrolysis cell.

  5. Assessing physician leadership styles: application of the situational leadership model to transitions in patient acuity.

    PubMed

    Skog, Alexander; Peyre, Sarah E; Pozner, Charles N; Thorndike, Mary; Hicks, Gloria; Dellaripa, Paul F

    2012-01-01

    The situational leadership model suggests that an effective leader adapts leadership style depending on the followers' level of competency. We assessed the applicability and reliability of the situational leadership model when observing residents in simulated hospital floor-based scenarios. Resident teams engaged in clinical simulated scenarios. Video recordings were divided into clips based on Emergency Severity Index v4 acuity scores. Situational leadership styles were identified in clips by two physicians. Interrater reliability was determined through descriptive statistical data analysis. There were 114 participants recorded in 20 sessions, and 109 clips were reviewed and scored. There was a high level of interrater reliability (weighted kappa r = .81) supporting situational leadership model's applicability to medical teams. A suggestive correlation was found between frequency of changes in leadership style and the ability to effectively lead a medical team. The situational leadership model represents a unique tool to assess medical leadership performance in the context of acuity changes.

  6. Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1998-01-01

    Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time-to-failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support a Weibull failure distribution with decreasing failure rate and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.
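
    To make the combination of techniques concrete, here is a hedged sketch (not from the presentation) that estimates the small probability that a triple-redundant unit with Weibull time-to-failure components loses all three channels before a mission time. Importance sampling draws from a Weibull with a reduced scale so failures become common, then reweights by the likelihood ratio; all parameter values are invented.

        import numpy as np

        rng = np.random.default_rng(7)
        shape, scale = 1.8, 5000.0        # Weibull time-to-failure per channel (hours)
        t_mission = 500.0                 # mission time of interest
        n = 100_000

        def weibull_pdf(t, k, lam):
            return (k / lam) * (t / lam) ** (k - 1) * np.exp(-((t / lam) ** k))

        # Importance-sampling proposal: same shape, much smaller scale, so that
        # pre-mission failures are no longer rare events.
        scale_is = 700.0
        t = scale_is * rng.weibull(shape, size=(n, 3))   # three redundant channels

        # Per-sample likelihood ratio of target density over proposal density
        w = np.prod(weibull_pdf(t, shape, scale) / weibull_pdf(t, shape, scale_is),
                    axis=1)
        fail = np.all(t < t_mission, axis=1)             # system fails if all 3 fail

        p_hat = np.mean(fail * w)
        # Analytic check: p = P(T < t_mission)**3 for independent channels
        p_exact = (1 - np.exp(-((t_mission / scale) ** shape))) ** 3
        print(p_hat, p_exact)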

  7. Test Reliability at the Individual Level

    PubMed Central

    Hu, Yueqin; Nesselroade, John R.; Erbacher, Monica K.; Boker, Steven M.; Burt, S. Alexandra; Keel, Pamela K.; Neale, Michael C.; Sisk, Cheryl L.; Klump, Kelly

    2016-01-01

    Reliability has a long history as one of the key psychometric properties of a test. However, a given test might not measure people equally reliably. Test scores from some individuals may have considerably greater error than others. This study proposed two approaches using intraindividual variation to estimate test reliability for each person. A simulation study suggested that the parallel tests approach and the structural equation modeling approach recovered the simulated reliability coefficients. Then in an empirical study, where forty-five females were measured daily on the Positive and Negative Affect Schedule (PANAS) for 45 consecutive days, separate estimates of reliability were generated for each person. Results showed that reliability estimates of the PANAS varied substantially from person to person. The methods provided in this article apply to tests measuring changeable attributes and require repeated measures across time on each individual. This article also provides a set of parallel forms of PANAS. PMID:28936107

  8. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    NASA Astrophysics Data System (ADS)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for a distribution grid with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on power system production cost simulation, probability discretization and linearized power flow, an optimal power flow problem with the objective of minimizing the cost of conventional power generation is solved. A reliability assessment for the distribution grid is thus implemented quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast method calculates the reliability indices much faster than the Monte Carlo method while preserving accuracy.
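
    As a hedged illustration of the probabilistic inputs and indices named above, the sketch below computes LOLP and EENS with plain Monte Carlo, the baseline the paper accelerates; the power curve, capacities and all numbers are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000          # Monte Carlo hours

        # Wind speed ~ Weibull, solar irradiance ~ Beta (per the abstract's models)
        v = 8.0 * rng.weibull(2.0, n)                 # m/s, scale 8, shape 2
        g = rng.beta(2.0, 3.0, n)                     # normalized irradiance in [0, 1]

        # Simple illustrative wind-turbine power curve (MW)
        def wind_power(v, rated=50.0, v_in=3.0, v_r=12.0, v_out=25.0):
            p = np.clip((v - v_in) / (v_r - v_in), 0.0, 1.0) * rated
            return np.where((v < v_in) | (v > v_out), 0.0, p)

        solar = 30.0 * g                               # 30 MW solar park
        conventional = 60.0                            # firm conventional capacity
        load = rng.normal(100.0, 15.0, n).clip(min=0)  # stochastic local load

        supply = wind_power(v) + solar + conventional
        shortfall = np.clip(load - supply, 0.0, None)

        lolp = np.mean(shortfall > 0)                  # Loss Of Load Probability
        eens = shortfall.mean() * 8760                 # Expected Energy Not Supplied (MWh/yr)
        print(f"LOLP = {lolp:.4f}, EENS = {eens:.0f} MWh/year")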

  9. Preliminary Results Obtained in Integrated Safety Analysis of NASA Aviation Safety Program Technologies

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.

    2003-01-01

    The goal of the NASA Aviation Safety Program (AvSP) is to develop and demonstrate technologies that contribute to a reduction in the aviation fatal accident rate by a factor of 5 by the year 2007 and by a factor of 10 by the year 2022. Integrated safety analysis of day-to-day operations and risks within those operations will provide an understanding of the Aviation Safety Program portfolio. Safety benefits analyses are currently being conducted. Preliminary results for the Synthetic Vision Systems (SVS) and Weather Accident Prevention (WxAP) projects of the AvSP have been completed by the Logistics Management Institute under a contract with the NASA Glenn Research Center. These analyses include both a reliability analysis and a computer simulation model. The integrated safety analysis method comprises two principal components: a reliability model and a simulation model. In the reliability model, the results indicate how different technologies and systems will perform in normal, degraded, and failed modes of operation. In the simulation, an operational scenario is modeled. The primary purpose of the SVS project is to improve safety by providing visual-flightlike situation awareness during instrument conditions. The current analyses are an estimate of the benefits of SVS in avoiding controlled flight into terrain. The scenario modeled has an aircraft flying directly toward a terrain feature. When the flight crew determines that the aircraft is headed toward an obstruction, the aircraft executes a level turn at speed. The simulation is ended when the aircraft completes the turn.

  10. Using Model Replication to Improve the Reliability of Agent-Based Models

    NASA Astrophysics Data System (ADS)

    Zhong, Wei; Kim, Yushim

    The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the artificial society and simulation community due to the challenges of model verification and validation. Illustrating the replication of an ABM representing fraudulent behavior in a public service delivery system, originally developed in the Java-based MASON toolkit and reimplemented in NetLogo by a different author, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, replication helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.

  11. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    NASA Astrophysics Data System (ADS)

    Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2015-03-01

    We perform a land surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies between six modern stand-alone land surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indexes). There is good agreement (99-135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the best current observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1-128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and from the surface frost index), while permafrost distribution using the air temperature based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0°C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperature and its seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and the surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in permafrost distribution can be made for the Tibetan Plateau.

  12. Uncertainty quantification and reliability assessment in operational oil spill forecast modeling system.

    PubMed

    Hou, Xianlong; Hodges, Ben R; Feng, Dongyu; Liu, Qixiao

    2017-03-15

    As oil transport increases in the Texas bays, greater risk of ship collisions becomes a challenge, with oil spill accidents as a consequence. To minimize the ecological damage and optimize rapid response, emergency managers need to be informed of how fast and where oil will spread as soon as possible after a spill. State-of-the-art operational oil spill forecast modeling systems have improved oil spill response into a new stage. However, uncertainty due to predicted data inputs often compromises the reliability of the forecast result, leading to misdirection in contingency planning. Understanding forecast uncertainty and reliability thus becomes significant. In this paper, Monte Carlo simulation is implemented to provide parameters to generate forecast probability maps. The oil spill forecast uncertainty is thus quantified by comparing the forecast probability map with the associated hindcast simulation. A HyosPy-based simple statistical model is developed to assess the reliability of an oil spill forecast in terms of belief degree. The technologies developed in this study create a prototype for uncertainty and reliability analysis in numerical oil spill forecast modeling systems, helping emergency managers improve the capability of real-time operational oil spill response and impact assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
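
    A minimal, hedged sketch (not the HyosPy implementation) of how a perturbed-input ensemble turns into a forecast probability map: each ensemble member advects a spill under a randomly perturbed drift, and each grid cell's probability is the fraction of members that ever reach it. All parameters are invented.

        import numpy as np

        rng = np.random.default_rng(3)
        n_runs, n_steps = 500, 48          # ensemble size, hourly steps
        nx, ny = 100, 100                  # grid cells
        hits = np.zeros((nx, ny))

        for _ in range(n_runs):
            # Perturb the wind-driven drift (the uncertain predicted input)
            drift = np.array([0.8, 0.3]) + rng.normal(0, 0.2, 2)
            pos = np.array([10.0, 50.0])   # spill site (grid coordinates)
            visited = np.zeros((nx, ny), dtype=bool)
            for _ in range(n_steps):
                pos = pos + drift + rng.normal(0, 0.5, 2)   # advection + diffusion
                i, j = np.clip(pos.astype(int), 0, [nx - 1, ny - 1])
                visited[i, j] = True
            hits += visited                # each run marks cells it ever reached

        prob_map = hits / n_runs           # P(cell is oiled within 48 h)
        print("cells with >50% oiling probability:", int((prob_map > 0.5).sum()))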

  13. Parallelizing Timed Petri Net simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1993-01-01

    The possibility of using parallel processing to accelerate the simulation of Timed Petri Nets (TPN's) was studied. It was recognized that complex system development tools often transform system descriptions into TPN's or TPN-like models, which are then simulated to obtain information about system behavior. Viewed this way, it was important that the parallelization of TPN's be as automatic as possible, to admit the possibility of the parallelization being embedded in the system design tool. Later years of the grant were devoted to examining the problem of joint performance and reliability analysis, to explore whether both types of analysis could be accomplished within a single framework. In this final report, the results of our studies are summarized. We believe that the problem of parallelizing TPN's automatically for MIMD architectures has been almost completely solved for a large and important class of problems. Our initial investigations into joint performance/reliability analysis are two-fold; it was shown that Monte Carlo simulation, with importance sampling, offers promise of joint analysis in the context of a single tool, and methods for the parallel simulation of general Continuous Time Markov Chains, a model framework within which joint performance/reliability models can be cast, were developed. However, very much more work is needed to determine the scope and generality of these approaches. The results obtained in our two studies, future directions for this type of work, and a list of publications are included.
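
    For readers unfamiliar with the sequential baseline being parallelized, the following is a hedged sketch (not from the report) of a timed Petri net discrete-event loop with simple race semantics and token reservation; the two-machine failure/repair net and its rates are invented.

        import heapq
        import random

        def simulate_tpn(places, transitions, horizon):
            # places: dict place -> token count.
            # transitions: dict name -> (input_places, output_places, delay_sampler).
            # A transition fires by consuming its input tokens immediately and
            # depositing its output tokens after the sampled delay.
            t, events, seq = 0.0, [], 0
            while t < horizon:
                # Start every transition that is currently enabled
                started = True
                while started:
                    started = False
                    for ins, outs, delay in transitions.values():
                        if all(places[p] > 0 for p in ins):
                            for p in ins:
                                places[p] -= 1                    # reserve inputs
                            heapq.heappush(events, (t + delay(), seq, outs))
                            seq += 1                              # tie-breaker
                            started = True
                if not events:
                    break
                t, _, outs = heapq.heappop(events)                # next completion
                for p in outs:
                    places[p] += 1
            return t, places

        # Two machines that fail and get repaired (exponential delays, rates invented)
        random.seed(1)
        marking = {"up": 2, "down": 0}
        net = {"fail":   (["up"],   ["down"], lambda: random.expovariate(0.1)),
               "repair": (["down"], ["up"],   lambda: random.expovariate(1.0))}
        print(simulate_tpn(marking, net, horizon=1000.0))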

  14. Determining minimum staffing levels during snowstorms using an integrated simulation, regression, and reliability model.

    PubMed

    Kunkel, Amber; McLay, Laura A

    2013-03-01

    Emergency medical services (EMS) provide life-saving care and hospital transport to patients with severe trauma or medical conditions. Severe weather events, such as snow events, may lead to adverse patient outcomes by increasing call volumes and service times. Adequate staffing levels during such weather events are critical for ensuring that patients receive timely care. To determine staffing levels that depend on weather, we propose a model that uses a discrete event simulation of a reliability model to identify minimum staffing levels that provide timely patient care, with regression used to provide the input parameters. The system is said to be reliable if there is a high degree of confidence that ambulances can immediately respond to a given proportion of patients (e.g., 99 %). Four weather scenarios capture varying levels of snow falling and snow on the ground. An innovative feature of our approach is that we evaluate the mitigating effects of different extrinsic response policies and intrinsic system adaptation. The models use data from Hanover County, Virginia to quantify how snow reduces EMS system reliability and necessitates increasing staffing levels. The model and its analysis can assist in EMS preparedness by providing a methodology to adjust staffing levels during weather events. A key observation is that when it is snowing, intrinsic system adaptation has similar effects on system reliability as one additional ambulance.
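
    As a hedged illustration of the reliability question posed above (not the authors' integrated model, which also uses regression-derived, weather-dependent inputs), the sketch below estimates by discrete-event Monte Carlo the fraction of calls that find an ambulance free, and searches for the minimum staffing meeting a 99% requirement; all rates are invented.

        import heapq
        import random

        def immediate_response_rate(n_ambulances, call_rate, mean_service, hours, seed=0):
            """Fraction of calls that find an ambulance free (no queueing modeled)."""
            rng = random.Random(seed)
            t, busy_until, served, total = 0.0, [], 0, 0
            while t < hours:
                t += rng.expovariate(call_rate)            # next call (Poisson arrivals)
                while busy_until and busy_until[0] <= t:   # release finished ambulances
                    heapq.heappop(busy_until)
                total += 1
                if len(busy_until) < n_ambulances:         # a unit is free: respond now
                    served += 1
                    heapq.heappush(busy_until, t + rng.expovariate(1 / mean_service))
            return served / total

        # Find the minimum staffing meeting a 99% immediate-response requirement
        # under an invented snow-event scenario (higher call rate, longer services).
        for n in range(1, 10):
            r = immediate_response_rate(n, call_rate=2.0, mean_service=1.5, hours=50_000)
            print(n, round(r, 4))
            if r >= 0.99:
                break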

  15. Predicting the difficulty of pure, strict, epistatic models: metrics for simulated model selection.

    PubMed

    Urbanowicz, Ryan J; Kiralis, Jeff; Fisher, Jonathan M; Moore, Jason H

    2012-09-26

    Algorithms designed to detect complex genetic disease associations are initially evaluated using simulated datasets. Typical evaluations vary constraints that influence the correct detection of underlying models (i.e. number of loci, heritability, and minor allele frequency). Such studies neglect to account for model architecture (i.e. the unique specification and arrangement of penetrance values comprising the genetic model), which alone can influence the detectability of a model. In order to design a simulation study which efficiently takes architecture into account, a reliable metric is needed for model selection. We evaluate three metrics as predictors of relative model detection difficulty derived from previous works: (1) Penetrance table variance (PTV), (2) customized odds ratio (COR), and (3) our own Ease of Detection Measure (EDM), calculated from the penetrance values and respective genotype frequencies of each simulated genetic model. We evaluate the reliability of these metrics across three very different data search algorithms, each with the capacity to detect epistatic interactions. We find that a model's EDM and COR are each stronger predictors of model detection success than heritability. This study formally identifies and evaluates metrics which quantify model detection difficulty. We utilize these metrics to intelligently select models from a population of potential architectures. This allows for an improved simulation study design which accounts for differences in detection difficulty attributed to model architecture. We implement the calculation and utilization of EDM and COR into GAMETES, an algorithm which rapidly and precisely generates pure, strict, n-locus epistatic models.

  16. Application of an Integrated HPC Reliability Prediction Framework to HMMWV Suspension System

    DTIC Science & Technology

    2010-09-13

    model number M966 (TOW Missile Carrier, Basic Armor without weapons), since they were available. Tires used for all simulations were the bias-type...vehicle fleet, including consideration of all kinds of uncertainty, especially including model uncertainty. The end result will be a tool to use...building an adequate vehicle reliability prediction framework for military vehicles is the accurate modeling of the integration of various types of

  17. ECO-DRIVING MODELING ENVIRONMENT

    DOT National Transportation Integrated Search

    2015-11-01

    This research project aims to examine the eco-driving modeling capabilities of different traffic modeling tools available and to develop a driver-simulator-based eco-driving modeling tool to evaluate driver behavior and to reliably estimate or measur...

  18. Reliability Assessment for Low-cost Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Freeman, Paul Michael

    Existing low-cost unmanned aerospace systems are unreliable, and engineers must blend reliability analysis with fault-tolerant control in novel ways. This dissertation introduces the University of Minnesota unmanned aerial vehicle flight research platform, a comprehensive simulation and flight test facility for reliability and fault-tolerance research. An industry-standard reliability assessment technique, the failure modes and effects analysis, is performed for an unmanned aircraft. Particular attention is afforded to the control surface and servo-actuation subsystem. Maintaining effector health is essential for safe flight; failures may lead to loss of control incidents. Failure likelihood, severity, and risk are qualitatively assessed for several effector failure modes. Design changes are recommended to improve aircraft reliability based on this analysis. Most notably, the control surfaces are split, providing independent actuation and dual-redundancy. The simulation models for control surface aerodynamic effects are updated to reflect the split surfaces using a first-principles geometric analysis. The failure modes and effects analysis is extended by using a high-fidelity nonlinear aircraft simulation. A trim state discovery is performed to identify the achievable steady, wings-level flight envelope of the healthy and damaged vehicle. Tolerance of elevator actuator failures is studied using familiar tools from linear systems analysis. This analysis reveals significant inherent performance limitations for candidate adaptive/reconfigurable control algorithms used for the vehicle. Moreover, it demonstrates how these tools can be applied in a design feedback loop to make safety-critical unmanned systems more reliable. Control surface impairments that do occur must be quickly and accurately detected. This dissertation also considers fault detection and identification for an unmanned aerial vehicle using model-based and model-free approaches and applies those algorithms to experimental faulted and unfaulted flight test data. Flight tests are conducted with actuator faults that affect the plant input and sensor faults that affect the vehicle state measurements. A model-based detection strategy is designed and uses robust linear filtering methods to reject exogenous disturbances, e.g. wind, while providing robustness to model variation. A data-driven algorithm is developed to operate exclusively on raw flight test data without physical model knowledge. The fault detection and identification performance of these complementary but different methods is compared. Together, enhanced reliability assessment and multi-pronged fault detection and identification techniques can help to bring about the next generation of reliable low-cost unmanned aircraft.

  19. Using subject-specific three-dimensional (3D) anthropometry data in digital human modelling: case study in hand motion simulation.

    PubMed

    Tsao, Liuxing; Ma, Liang

    2016-11-01

    Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation could be limited due to the percentile-based approach used in constructing the digital human model. To enhance the accuracy of the size and shape of the models, we proposed a framework to generate digital human models using three-dimensional (3D) anthropometric data. The 3D scan data from specific subjects' hands were segmented based on the estimated centres of rotation. The segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were then verified, thereby validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be utilised to guide product design and workspace arrangement. Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.

  20. Investigating Ground Swarm Robotics Using Agent Based Simulation

    DTIC Science & Technology

    2006-12-01

    Incorporation of virtual pheromones as a shared memory map is modeled as an additional capability that is found to enhance the robustness and reliability of the swarm.

  1. Enterprise Systems Analysis

    DTIC Science & Technology

    2016-03-14

    flows, or continuous state changes, with feedback loops and lags modeled in the flow system. Agent based simulations operate using a discrete event...DeLand, S. M., Rutherford, B. M., Diegert, K. V., & Alvin, K. F. (2002). Error and uncertainty in modeling and simulation. Reliability Engineering...intrinsic complexity of the underlying social systems fundamentally limits the ability to make

  2. Structural reliability analysis under evidence theory using the active learning kriging model

    NASA Astrophysics Data System (ADS)

    Yang, Xufeng; Liu, Yongshou; Ma, Panke

    2017-11-01

    Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirements of evidence-theory-based reliability analysis. Accordingly, a method based on an active learning kriging model that only needs to predict the sign of the performance function correctly is proposed. Interval Monte Carlo simulation and a modified optimization method based on the Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of the failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
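
    A minimal sketch of the generic active-learning kriging loop the paper builds on, using a scikit-learn Gaussian process and the familiar U-type sign-uncertainty criterion on a plain Monte Carlo population; it omits the paper's evidence-theory intervals and KKT-based optimization, and the limit state below is invented.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def g(x):  # example performance function; failure when g(x) <= 0
            return x[:, 0] ** 2 + x[:, 1] + 3.0 - 2.0 * x[:, 0]

        rng = np.random.default_rng(4)
        pool = rng.normal(size=(5000, 2))          # Monte Carlo population
        idx = rng.choice(len(pool), 12, replace=False)
        X, y = pool[idx], g(pool[idx])

        gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
        for _ in range(40):
            gp.fit(X, y)
            mu, sd = gp.predict(pool, return_std=True)
            # U-function: small where the sign of the prediction is uncertain
            u = np.abs(mu) / np.maximum(sd, 1e-12)
            if u.min() >= 2.0:                     # all signs reliable at ~2 sigma
                break
            k = int(np.argmin(u))                  # most sign-ambiguous sample
            X, y = np.vstack([X, pool[k]]), np.append(y, g(pool[[k]]))

        pf = np.mean(gp.predict(pool) <= 0)        # failure probability estimate
        print("estimated Pf:", pf)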

  3. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of fault-tolerant system architectures, and it is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  4. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Wilson, Larry W.

    1989-01-01

    The long-term goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and the production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews, with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validation Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they correctly simulate and were asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data, and replication of data was recommended.
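
    For context on the appended investigation, here is a hedged sketch (not the study's code) of the Jelinski-Moranda model: with N initial faults and per-fault hazard phi, the i-th inter-failure time is exponential with rate phi(N - i + 1). The sketch simulates such data and recovers (N, phi) by profile maximum likelihood; all values are invented.

        import numpy as np

        rng = np.random.default_rng(11)

        # Jelinski-Moranda: N initial faults, each contributing hazard phi; the
        # i-th inter-failure time is exponential with rate phi * (N - i + 1).
        N_true, phi_true, n_obs = 60, 0.02, 40
        t = np.array([rng.exponential(1.0 / (phi_true * (N_true - i)))
                      for i in range(n_obs)])           # i failures seen so far

        def profile_loglik(N, t):
            i = np.arange(1, len(t) + 1)
            remaining = N - i + 1
            phi_hat = len(t) / np.sum(remaining * t)    # MLE of phi for this N
            ll = (len(t) * np.log(phi_hat) + np.sum(np.log(remaining))
                  - phi_hat * np.sum(remaining * t))
            return ll, phi_hat

        # Profile over integer N (must be at least the observed failure count)
        ll = [(profile_loglik(N, t), N) for N in range(n_obs, 300)]
        (best_ll, phi_hat), N_hat = max(ll)
        print(f"true (N, phi) = ({N_true}, {phi_true}), "
              f"MLE = ({N_hat}, {phi_hat:.4f})")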

  5. Reliability of the North America CORDEX and NARCCAP simulations in the context of uncertainty in regional climate change projections

    NASA Astrophysics Data System (ADS)

    Karmalkar, A.

    2017-12-01

    Ensembles of dynamically downscaled climate change simulations are routinely used to capture uncertainty in projections at regional scales. I assess the reliability of two such ensembles for North America - NARCCAP and NA-CORDEX - by investigating the impact of model selection on representing uncertainty in regional projections, and the ability of the regional climate models (RCMs) to provide reliable information. These aspects - discussed for the six regions used in the US National Climate Assessment - provide an important perspective on the interpretation of downscaled results. I show that selecting general circulation models (GCMs) for downscaling based on their equilibrium climate sensitivities is a reasonable choice, but the six models chosen for NA-CORDEX do a poor job of representing uncertainty in winter temperature and precipitation projections in many parts of the eastern US, leading to overconfident projections. RCM performance is highly variable across models, regions, and seasons, and the ability of the RCMs to improve on the seasonal mean performance of their parent GCMs appears limited in both ensembles. Additionally, the ability of the RCMs to simulate historical climates is not strongly related to their ability to simulate climate change across the ensemble. This finding suggests limited use of models' historical performance to constrain their projections. Given these challenges in dynamical downscaling, RCM results should not be used in isolation. Information on how well the RCM ensembles represent known uncertainties in regional climate change projections, as discussed here, needs to be communicated clearly to inform management decisions.

  6. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via the exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on the development of a language for the specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.

  7. Assessment of wheelchair driving performance in a virtual reality-based simulator

    PubMed Central

    Mahajan, Harshal P.; Dicianno, Brad E.; Cooper, Rory A.; Ding, Dan

    2013-01-01

    Objective To develop a virtual reality (VR)-based simulator that can assist clinicians in performing standardized wheelchair driving assessments. Design A completely within-subjects repeated measures design. Methods Participants drove their wheelchairs along a virtual driving circuit modeled after the Power Mobility Road Test (PMRT) and in a hallway of decreasing width. The virtual simulator was displayed on a computer screen and on VR screens, and participants interacted with it using a set of instrumented rollers and a wheelchair joystick. Driving performances of participants were estimated and compared using quantitative metrics from the simulator. Qualitative ratings from two experienced clinicians were used to estimate intra- and inter-rater reliability. Results Ten regular wheelchair users (seven men, three women; mean age ± SD, 39.5 ± 15.39 years) participated. The virtual PMRT scores from the two clinicians show high inter-rater reliability (78–90%) and high intra-rater reliability (71–90%) for all test conditions. More research is required to explore user preferences and the effectiveness of the two control methods (rollers and mathematical model) and the display screens. Conclusions The virtual driving simulator seems to be a promising tool for wheelchair driving assessment that clinicians can use to supplement their real-world evaluations. PMID:23820148
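
    For reference, inter-rater agreement of the kind reported above can be computed from two raters' ordinal scores with a weighted kappa; the ratings below are invented and scikit-learn's cohen_kappa_score is used.

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical ordinal driving-performance ratings (1-5) from two
        # clinicians for the same set of recorded trials.
        rater_a = [4, 3, 5, 2, 4, 4, 3, 5, 1, 4, 3, 2]
        rater_b = [4, 3, 4, 2, 4, 5, 3, 5, 2, 4, 3, 2]

        # Weighted kappa penalizes larger disagreements more than near-misses,
        # which suits ordinal clinical scales.
        kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
        print(f"weighted kappa = {kappa:.2f}")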

  8. A New Reliability Analysis Model of the Chegongzhuang Heat-Supplying Tunnel Structure Considering the Coupling of Pipeline Thrust and Thermal Effect

    PubMed Central

    Zhang, Jiawen; He, Shaohui; Wang, Dahai; Liu, Yangpeng; Yao, Wenbo; Liu, Xiabing

    2018-01-01

    Based on the operating Chegongzhuang heat-supplying tunnel in Beijing, the reliability of its lining structure under the action of large pipeline thrust and thermal effects is studied. According to the service characteristics of a heat-supplying tunnel, a three-dimensional numerical analysis model was established based on mechanical tests of in-situ specimens. The stress and strain of the tunnel structure were obtained before and after the start of operation, and the rationality of the model was verified against field monitoring data. After extracting the internal force of the lining structure for use in the performance function, an improved subset simulation method was proposed to calculate the reliability of the main control section of the tunnel. In contrast to the traditional calculation method, the analytic relationship between the sample numbers required by the subset simulation method and by the Monte Carlo method was given. The results indicate that the lining structure is greatly influenced by the coupling within a range of six meters from the fixed brackets, especially at the tunnel floor. The improved subset simulation method greatly reduces computation time and improves computational efficiency while ensuring the accuracy of calculation. It is well suited to reliability calculations in tunnel engineering because the lower the failure probability, the more efficient the calculation. PMID:29401691
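
    As a hedged illustration of the subset simulation idea referenced above (a generic Au-Beck-style sketch, not the paper's improved variant or its tunnel performance function), the code estimates a small failure probability by introducing intermediate failure levels, each hit with conditional probability p0, so the sample count grows with log(1/p) rather than 1/p as in crude Monte Carlo. The toy limit state is invented.

        import numpy as np

        def subset_simulation(g, dim, n=2000, p0=0.1, max_levels=12, seed=0):
            """Estimate P(g(X) <= 0) for standard normal X via subset simulation."""
            rng = np.random.default_rng(seed)
            x = rng.standard_normal((n, dim))
            gx = np.array([g(xi) for xi in x])
            prob = 1.0
            for _ in range(max_levels):
                thresh = np.quantile(gx, p0)       # intermediate failure threshold
                if thresh <= 0.0:                  # target event reached at this level
                    return prob * np.mean(gx <= 0.0)
                prob *= p0
                # Seeds: the samples already inside the intermediate failure domain
                keep = gx <= thresh
                cur_x, cur_g = x[keep], gx[keep]
                chains, xs, gs = len(cur_x), [cur_x], [cur_g]
                for _ in range(n // chains - 1):   # grow each chain by Metropolis steps
                    cand = cur_x + 0.8 * rng.standard_normal(cur_x.shape)
                    # Accept w.r.t. the standard normal density...
                    log_r = 0.5 * (np.sum(cur_x**2, 1) - np.sum(cand**2, 1))
                    acc = np.log(rng.random(chains)) < log_r
                    g_cand = np.array([g(c) for c in cand])
                    # ...and only if the candidate stays in the failure domain
                    move = acc & (g_cand <= thresh)
                    cur_x = np.where(move[:, None], cand, cur_x)
                    cur_g = np.where(move, g_cand, cur_g)
                    xs.append(cur_x.copy())
                    gs.append(cur_g.copy())
                x, gx = np.vstack(xs), np.concatenate(gs)
            return prob * np.mean(gx <= 0.0)

        # Toy limit state with a rare failure event, P ~ Phi(-4.5) ~ 3.4e-6
        g = lambda z: 4.5 - (z[0] + z[1]) / np.sqrt(2.0)
        print(subset_simulation(g, dim=2))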

  9. Warranty optimisation based on the prediction of costs to the manufacturer using neural network model and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Stamenkovic, Dragan D.; Popovic, Vladimir M.

    2015-02-01

    Warranty is a powerful marketing tool, but it always involves additional costs to the manufacturer. In order to reduce these costs and make use of warranty's marketing potential, the manufacturer needs to master techniques for warranty cost prediction based on the reliability characteristics of the product. In this paper a combined free-replacement and pro-rata warranty policy is analysed as the warranty model for one type of light bulb. Since operating conditions have a great impact on product reliability, they need to be considered in such an analysis. A neural network model is used to predict light bulb reliability characteristics based on data from tests of light bulbs under various operating conditions. Compared with a linear regression model used in the literature for similar tasks, the neural network model proved to be a more accurate method for such prediction. Reliability parameters obtained in this way are then used in a Monte Carlo simulation to predict the times to failure needed for the warranty cost calculation. The results of the analysis make it possible for the manufacturer to choose the optimal warranty policy based on expected product operating conditions. In this way, the manufacturer can lower costs and increase profit.
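
    As a hedged sketch of the Monte Carlo step described above: given Weibull reliability parameters (which in the paper would come from the neural network for the expected operating conditions), the expected per-unit warranty cost under a combined free-replacement/pro-rata policy can be estimated as below. The function name, the linear pro-rata rebate, and the omission of repeat claims within the warranty period are simplifying assumptions.

    ```python
    import numpy as np

    def warranty_cost_mc(shape, scale, w_frw, w_total, unit_cost,
                         n_sims=100_000, seed=0):
        """Expected per-unit warranty cost under a combined policy: free
        replacement before w_frw, linearly declining pro-rata rebate between
        w_frw and w_total. Repeat claims within the warranty are ignored."""
        rng = np.random.default_rng(seed)
        t = scale * rng.weibull(shape, n_sims)   # simulated times to failure
        cost = np.zeros(n_sims)
        cost[t < w_frw] = unit_cost              # free replacement
        prorata = (t >= w_frw) & (t < w_total)
        cost[prorata] = unit_cost * (w_total - t[prorata]) / (w_total - w_frw)
        return cost.mean()

    # Illustrative numbers: Weibull life in hours, 1000 h free replacement,
    # pro-rata coverage to 2000 h, 5 currency units per bulb.
    print(warranty_cost_mc(shape=2.0, scale=3000.0, w_frw=1000.0,
                           w_total=2000.0, unit_cost=5.0))
    ```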

  10. Science Based Human Reliability Analysis: Using Digital Nuclear Power Plant Simulators for Human Reliability Research

    NASA Astrophysics Data System (ADS)

    Shirley, Rachel Elizabeth

    Nuclear power plant (NPP) simulators are proliferating in academic research institutions and national laboratories in response to the availability of affordable, digital simulator platforms. Accompanying the new research facilities is a renewed interest in using data collected in NPP simulators for Human Reliability Analysis (HRA) research. An experiment conducted in The Ohio State University (OSU) NPP Simulator Facility develops data collection methods and analytical tools to improve use of simulator data in HRA. In the pilot experiment, student operators respond to design basis accidents in the OSU NPP Simulator Facility. Thirty-three undergraduate and graduate engineering students participated in the research. Following each accident scenario, student operators completed a survey about perceived simulator biases and watched a video of the scenario. During the video, they periodically recorded their perceived strength of significant Performance Shaping Factors (PSFs) such as Stress. This dissertation reviews three aspects of simulator-based research using the data collected in the OSU NPP Simulator Facility: First, a qualitative comparison of student operator performance to computer simulations of expected operator performance generated by the Information Decision Action Crew (IDAC) HRA method. Areas of comparison include procedure steps, timing of operator actions, and PSFs. Second, development of a quantitative model of the simulator bias introduced by the simulator environment. Two types of bias are defined: Environmental Bias and Motivational Bias. This research examines Motivational Bias--that is, the effect of the simulator environment on an operator's motivations, goals, and priorities. A bias causal map is introduced to model motivational bias interactions in the OSU experiment. Data collected in the OSU NPP Simulator Facility are analyzed using Structural Equation Modeling (SEM). Data include crew characteristics, operator surveys, and time to recognize and diagnose the accident in the scenario. These models estimate how the effects of the scenario conditions are mediated by simulator bias, and demonstrate how to quantify the strength of the simulator bias. Third, development of a quantitative model of subjective PSFs based on objective data (plant parameters, alarms, etc.) and PSF values reported by student operators. The objective PSF model is based on the PSF network in the IDAC HRA method. The final model is a mixed effects Bayesian hierarchical linear regression model. The subjective PSF model includes three factors: The Environmental PSF, the simulator Bias, and the Context. The Environmental Bias is mediated by an operator sensitivity coefficient that captures the variation in operator reactions to plant conditions. The data collected in the pilot experiments are not expected to reflect professional NPP operator performance, because the students are still novice operators. However, the models used in this research and the methods developed to analyze them demonstrate how to consider simulator bias in experiment design and how to use simulator data to enhance the technical basis of a complex HRA method. The contributions of the research include a framework for discussing simulator bias, a quantitative method for estimating simulator bias, a method for obtaining operator-reported PSF values, and a quantitative method for incorporating the variability in operator perception into PSF models. 
    The research demonstrates applications of Structural Equation Modeling and hierarchical Bayesian linear regression models in HRA. Finally, the research demonstrates the benefits of using student operators as a test platform for HRA research.

  11. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    NASA Astrophysics Data System (ADS)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial-intelligence-based models are most often used for this purpose, trained on predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain, which limits the applicability of such approximation surrogates. In our study we develop a surrogate-model-based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem with two conflicting objectives. Hydraulic conductivity and aquifer recharge are treated as uncertain. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping rates and uncertain parameters, generating input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives are considered: maximizing total pumping from beneficial wells and minimizing total pumping from barrier wells used for hydraulic control of saltwater intrusion. The salinity levels resulting at strategic locations due to this pumping are predicted using the ensemble surrogates and are constrained to be within pre-specified levels. Different realizations of the concentration values are obtained from the ensemble predictions corresponding to each candidate pumping solution. Reliability is incorporated as the percentage of surrogate models in the ensemble that satisfy the imposed constraints. The methodology was applied to a realistic coastal aquifer system in the Burdekin delta area in Australia. It was found that all optimal solutions corresponding to a reliability level of 0.99 satisfy all the constraints, and that constraint violations increase as the reliability level is reduced. Thus ensemble-surrogate-based simulation-optimization was found to be useful in deriving multi-objective optimal pumping strategies for coastal aquifers under parameter uncertainty.
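
    The reliability measure described above reduces to a simple count over the ensemble. In the hypothetical sketch below, `surrogates` stands in for the trained genetic programming models (assumed to expose a `predict` method), and a candidate pumping strategy is deemed reliable in proportion to the ensemble members whose predicted salinities meet every limit.

    ```python
    import numpy as np

    def ensemble_reliability(surrogates, pumping, salinity_limits):
        """Reliability of a candidate pumping strategy = fraction of ensemble
        surrogates whose predicted salinities at all strategic locations stay
        within the pre-specified limits."""
        ok = [np.all(m.predict(pumping) <= salinity_limits) for m in surrogates]
        return float(np.mean(ok))
    ```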

  12. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

    The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system- and component-level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test-data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, material behavior models, and the interaction of different components and their respective disciplines such as structures, materials, fluids, thermal, mechanical, electrical, etc. In addition, these models are based on the available test data and can be updated, and the analysis refined, as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has demonstrated itself to be superior to conventional reliability approaches that utilize failure rates derived from similar equipment or simply expert judgment.

  13. Modeling of unit operating considerations in generating-capacity reliability evaluation. Volume 1. Mathematical models, computing methods, and results. Final report. [GENESIS, OPCON and OPPLAN]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Singh, C.

    1982-07-01

    Existing methods for generating capacity reliability evaluation do not explicitly recognize a number of operating considerations which may have important effects on system reliability performance. Thus, current methods may yield estimates of system reliability which differ appreciably from actual observed reliability. Further, current methods offer no means of accurately studying or evaluating alternatives which may differ in one or more operating considerations. Operating considerations which are considered to be important in generating capacity reliability evaluation include: unit duty cycles as influenced by load cycle shape, reliability performance of other units, unit commitment policy, and operating reserve policy; unit start-up failures distinct from unit running failures; unit start-up times; and unit outage postponability and the management of postponable outages. A detailed Monte Carlo simulation computer model called GENESIS and two analytical models called OPCON and OPPLAN have been developed which are capable of incorporating the effects of many operating considerations including those noted above. These computer models have been used to study a variety of actual and synthetic systems and are available from EPRI. The new models are shown to produce system reliability indices which differ appreciably from index values computed using traditional models which do not recognize operating considerations.

  14. A new statistical model for subgrid dispersion in large eddy simulations of particle-laden flows

    NASA Astrophysics Data System (ADS)

    Muela, Jordi; Lehmkuhl, Oriol; Pérez-Segarra, Carles David; Oliva, Asensi

    2016-09-01

    Dispersed multiphase turbulent flows are present in many industrial and commercial applications such as internal combustion engines, turbofans, dispersion of contaminants, and steam turbines. Therefore, there is a clear interest in the development of models and numerical tools capable of performing detailed and reliable simulations of these kinds of flows. Large Eddy Simulation offers good accuracy and reliable results together with reasonable computational requirements, making it an attractive basis for numerical tools for particle-laden turbulent flows. Nonetheless, in dispersed multiphase flows an additional difficulty arises in LES: the effect of the unresolved scales of the continuous phase on the dispersed phase is lost due to the filtering procedure. To solve this issue, a model able to reconstruct the subgrid velocity seen by the particles is required. In this work a new model for the reconstruction of subgrid-scale effects on the dispersed phase is presented and assessed. This methodology is based on the reconstruction of statistics via probability density functions (PDFs).
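
    The paper's contribution is a PDF-based statistical reconstruction of the subgrid velocity seen by particles; the sketch below shows only the crude isotropic Gaussian baseline that such models refine, assuming the subgrid kinetic energy k_sgs is available from the LES closure (all names are illustrative).

    ```python
    import numpy as np

    def subgrid_velocity_seen(u_filtered, k_sgs, rng=None):
        """Velocity seen by a particle = filtered LES velocity plus an isotropic
        Gaussian subgrid fluctuation with variance 2*k_sgs/3 per component."""
        rng = rng or np.random.default_rng()
        sigma = np.sqrt(2.0 * k_sgs / 3.0)
        return np.asarray(u_filtered) + sigma * rng.standard_normal(3)
    ```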

  15. Simulations of stress evolution and the current density scaling of electromigration-induced failure times in pure and alloyed interconnects

    NASA Astrophysics Data System (ADS)

    Park, Young-Joon; Andleigh, Vaibhav K.; Thompson, Carl V.

    1999-04-01

    An electromigration model is developed to simulate the reliability of Al and Al-Cu interconnects. A polynomial expression for the free energy of solution by Murray [Int. Met. Rev. 30, 211 (1985)] was used to calculate the chemical potential for Al and Cu while the diffusivities were defined based on a Cu-trapping model by Rosenberg [J. Vac. Sci. Technol. 9, 263 (1972)]. The effects of Cu on stress evolution and lifetime were investigated in all-bamboo and near-bamboo stud-to-stud structures. In addition, the significance of the effect of mechanical stress on the diffusivity of both Al and Cu was determined in all-bamboo and near-bamboo lines. The void nucleation and growth process was simulated in 200 μm stud-to-stud lines. Current density scaling behavior for void-nucleation-limited failure and void-growth-limited failure modes was simulated in long, stud-to-stud lines. Current density exponents of n=2 for the void nucleation and n=1 for the void growth failure mode were found in both pure Al and Al-Cu lines. Limitations of the most widely used current density scaling law (Black's equation) in the analysis of the reliability of stud-to-stud lines are discussed. By modifying the input materials properties used in this model (when they are known), this model can be adapted to predict the reliability of other interconnect materials such as pure Cu and Cu alloys.

  16. Simulation and Modeling Capability for Standard Modular Hydropower Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Kevin M.; Smith, Brennan T.; Witt, Adam M.

    Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.

  17. Improving Water Level and Soil Moisture Over Peatlands in a Global Land Modeling System

    NASA Technical Reports Server (NTRS)

    Bechtold, M.; De Lannoy, G. J. M.; Roose, D.; Reichle, R. H.; Koster, R. D.; Mahanama, S. P.

    2017-01-01

    A new model structure for peatlands results in improved skill metrics (without any parameter calibration). Simulated surface soil moisture is strongly affected by the new model, but reliable soil moisture data are lacking for validation.

  18. Mars Exploration Rover Terminal Descent Mission Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.

    2004-01-01

    Because of NASA's added reliance on simulation for successful interplanetary missions, the MER mission has developed a detailed EDL (Entry, Descent, and Landing) trajectory model and simulation. This paper summarizes how the MER EDL sequence of events is modeled, verification of the methods used, and the inputs. The simulation is built upon a multibody parachute trajectory simulation tool developed in POST II that accurately simulates the trajectory of multiple vehicles in flight with interacting forces. In this model the parachute and the suspended bodies are treated as six-degree-of-freedom (6 DOF) bodies. The terminal descent phase of the mission consists of several EDL events, such as parachute deployment, heatshield separation, deployment of the lander from the backshell, deployment of the airbags, RAD firings, TIRS firings, etc. For an accurate, reliable simulation these events need to be modeled seamlessly and robustly so that the simulations remain numerically stable during Monte Carlo runs. This paper also summarizes how the events have been modeled, the numerical issues, and the modeling challenges.

  19. Fog-computing concept usage as means to enhance information and control system reliability

    NASA Astrophysics Data System (ADS)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-05-01

    This paper focuses on the reliability of information and control systems (ICS). The authors propose using elements of the fog-computing concept to enhance the reliability function. The key idea of fog computing is to shift computations to the fog layer of the network, and thus to decrease the workload of the communication environment and the data processing components. In an ICS, the workload can likewise be distributed among sensors, actuators, and network infrastructure facilities near the sources of data. The authors simulated typical workload distribution situations for the “traditional” ICS architecture and for one using elements of the fog-computing concept. The paper contains the models, selected simulation results, and conclusions about the prospects of fog computing as a means to enhance ICS reliability.

  20. Road simulation for four-wheel vehicle whole input power spectral density

    NASA Astrophysics Data System (ADS)

    Wang, Jiangbo; Qiang, Baomin

    2017-05-01

    The vibration of a running vehicle comes mainly from the road and influences ride performance, so simulation of the road roughness power spectral density is of great significance for analyzing automobile suspension vibration system parameters and evaluating ride comfort. First, based on the mathematical model of road roughness power spectral density, an integrated white-noise method for generating random road profiles was established. Then, in the MATLAB/Simulink environment, following the usual progression in automobile suspension research from a simple two-degree-of-freedom single-wheel vehicle model to a complex multi-degree-of-freedom full-vehicle model, a simple single-excitation input simulation model was built. Finally, a spectrum matrix was used to build a whole-vehicle excitation input simulation model. This simulation method is based on reliable and accurate mathematical theory and can be applied to random road simulation for any specified spectrum, providing a pavement excitation model and a foundation for vehicle ride performance research and vibration simulation.
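
    A minimal sketch of the integrated (filtered) white-noise road model mentioned above, using one common first-order shaping-filter form and illustrative ISO 8608 class-B-like parameter values; the exact filter constants vary across the literature.

    ```python
    import numpy as np

    def road_profile(v, length, dt=0.001, G0=6.4e-5, n0=0.1, f0=0.0628, seed=0):
        """Random road height profile z(t) at vehicle speed v (m/s), from white
        noise passed through the first-order shaping filter
        z' = -2*pi*f0*z + 2*pi*n0*sqrt(G0*v)*w(t)."""
        rng = np.random.default_rng(seed)
        n = int(length / (v * dt))                 # samples to cover the distance
        w = rng.standard_normal(n) / np.sqrt(dt)   # white noise with unit PSD
        z = np.zeros(n)
        for k in range(n - 1):
            dz = -2 * np.pi * f0 * z[k] + 2 * np.pi * n0 * np.sqrt(G0 * v) * w[k]
            z[k + 1] = z[k] + dz * dt
        return z

    profile = road_profile(v=20.0, length=1000.0)  # ~class-B-like road at 72 km/h
    ```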

  1. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    USGS Publications Warehouse

    Wang, A.; Moore, J.C.; Cui, Xingquan; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D.M.; McGuire, A.D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-01-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, air and surface frost indexes). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and the seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and the surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future permafrost distribution can be made for the Tibetan Plateau.
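
    Of the five diagnostics, the most demanding one named above (ground at or below 0 °C for 24 consecutive months) is easy to state in code; the sketch below assumes a monthly ground-temperature array with time as the leading axis, and all names are illustrative.

    ```python
    import numpy as np

    def permafrost_mask(soil_temp_monthly):
        """Diagnose permafrost from a (n_months, ...) array of simulated ground
        temperatures: permafrost where T <= 0 degC for >= 24 consecutive months."""
        frozen = soil_temp_monthly <= 0.0
        run = np.zeros(frozen.shape[1:], dtype=int)      # current frozen streak
        longest = np.zeros_like(run)                     # longest streak per cell
        for month in frozen:                             # scan along the time axis
            run = np.where(month, run + 1, 0)
            longest = np.maximum(longest, run)
        return longest >= 24
    ```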

  2. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    NASA Astrophysics Data System (ADS)

    Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-02-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, air and surface frost indexes). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and the seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and the surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future permafrost distribution can be made for the Tibetan Plateau.

  3. A dynamic Thurstonian item response theory of motive expression in the picture story exercise: solving the internal consistency paradox of the PSE.

    PubMed

    Lang, Jonas W B

    2014-07-01

    The measurement of implicit or unconscious motives using the picture story exercise (PSE) has long been a target of debate in the psychological literature. Most debates have centered on the apparent paradox that PSE measures of implicit motives typically show low internal consistency reliability on common indices like Cronbach's alpha but nevertheless predict behavioral outcomes. I describe a dynamic Thurstonian item response theory (IRT) model that builds on dynamic system theories of motivation, theorizing on the PSE response process, and recent advancements in Thurstonian IRT modeling of choice data. To assess the models' capability to explain the internal consistency paradox, I first fitted the model to archival data (Gurin, Veroff, & Feld, 1957) and then simulated data based on bias-corrected model estimates from the real data. Simulation results revealed that the average squared correlation reliability for the motives in the Thurstonian IRT model was .74 and that Cronbach's alpha values were similar to the real data (<.35). These findings suggest that PSE motive measures have long been reliable and increase the scientific value of extant evidence from motivational research using PSE motive measures. (c) 2014 APA, all rights reserved.

  4. Improving Hydrological Simulations by Incorporating GRACE Data for Parameter Calibration

    NASA Astrophysics Data System (ADS)

    Bai, P.

    2017-12-01

    Hydrological model parameters are commonly calibrated by observed streamflow data. This calibration strategy is questioned when the modeled hydrological variables of interest are not limited to streamflow. Well-performed streamflow simulations do not guarantee the reliable reproduction of other hydrological variables. One of the reasons is that hydrological model parameters are not reasonably identified. The Gravity Recovery and Climate Experiment (GRACE) satellite-derived total water storage change (TWSC) data provide an opportunity to constrain hydrological model parameterizations in combination with streamflow observations. We constructed a multi-objective calibration scheme based on GRACE-derived TWSC and streamflow observations, with the aim of improving the parameterizations of hydrological models. The multi-objective calibration scheme was compared with the traditional single-objective calibration scheme, which is based only on streamflow observations. Two monthly hydrological models were employed on 22 Chinese catchments with different hydroclimatic conditions. The model evaluation was performed using observed streamflows, GRACE-derived TWSC, and evapotranspiration (ET) estimates from flux towers and from the water balance approach. Results showed that the multi-objective calibration provided more reliable TWSC and ET simulations than the single-objective calibration, without significant deterioration in the accuracy of streamflow simulations. In addition, the improvements of TWSC and ET simulations were more significant in relatively dry catchments than in relatively wet catchments. This study highlights the importance of including additional constraints besides streamflow observations in the parameter estimation to improve the performance of hydrological models.
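
    A minimal sketch of such a multi-objective (here weighted-sum) calibration objective, assuming a hypothetical `model(params)` that returns simulated streamflow and TWSC series; the NSE metric and the 50/50 weighting are illustrative choices, not the study's exact formulation.

    ```python
    import numpy as np

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency (1 is a perfect fit)."""
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def combined_objective(params, model, q_obs, twsc_obs, w=0.5):
        """Weighted-sum calibration objective (to be minimized) trading off
        streamflow fit and GRACE TWSC fit; `model(params)` is a hypothetical
        hydrological model returning simulated (q, twsc) series."""
        q_sim, twsc_sim = model(params)
        return -(w * nse(q_sim, q_obs) + (1.0 - w) * nse(twsc_sim, twsc_obs))
    ```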

  5. Validation databases for simulation models: aboveground biomass and net primary productivity (NPP) estimation using eastwide FIA data

    Treesearch

    Jennifer C. Jenkins; Richard A. Birdsey

    2000-01-01

    As interest grows in the role of forest growth in the carbon cycle, and as simulation models are applied to predict future forest productivity at large spatial scales, the need for reliable and field-based data for evaluation of model estimates is clear. We created estimates of potential forest biomass and annual aboveground production for the Chesapeake Bay watershed...

  6. Review on applications of artificial intelligence methods for dam and reservoir-hydro-environment models.

    PubMed

    Allawi, Mohammed Falah; Jaafar, Othman; Mohamad Hamzah, Firdaus; Abdullah, Sharifah Mastura Syed; El-Shafie, Ahmed

    2018-05-01

    Efficacious operation of dam and reservoir systems can not only provide a defense against natural hazards but also identify rules to meet water demand. Successful operation of dam and reservoir systems to ensure optimal use of water resources is unattainable without accurate and reliable simulation models. Given the highly stochastic nature of hydrologic parameters, developing accurate predictive models that efficiently mimic such complex patterns is a growing domain of research. During the last two decades, artificial intelligence (AI) techniques have been used extensively to achieve robust modeling of such stochastic hydrological parameters. AI techniques have also shown considerable progress in finding optimal rules for reservoir operation. This review explores the history of developing AI for reservoir inflow forecasting and for prediction of evaporation from a reservoir, as the major components of reservoir simulation. In addition, a critical assessment of the advantages and disadvantages of integrating AI simulation methods with optimization methods is reported. Future research on the potential of new innovative AI-based methods for reservoir simulation and optimization models is also discussed. Finally, a new mathematical procedure is proposed to accomplish a realistic evaluation of whole-optimization-model performance (reliability, resilience, and vulnerability indices).
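
    The closing recommendation refers to the classic reliability-resilience-vulnerability indices (commonly attributed to Hashimoto et al.); a minimal sketch of computing them from a simulated supply series against demand follows, using one common variant of the definitions.

    ```python
    import numpy as np

    def rrv_indices(supply, demand):
        """Reliability, resilience, and vulnerability of a simulated supply
        series against demand (one common variant of the definitions)."""
        deficit = np.maximum(demand - supply, 0.0)
        fail = deficit > 0
        reliability = 1.0 - fail.mean()                    # share of satisfied periods
        recoveries = np.sum(fail[:-1] & ~fail[1:])         # failure -> success steps
        resilience = recoveries / max(fail[:-1].sum(), 1)  # P(recover | failing)
        vulnerability = deficit[fail].mean() if fail.any() else 0.0
        return reliability, resilience, vulnerability
    ```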

  7. A Report on Simulation-Driven Reliability and Failure Analysis of Large-Scale Storage Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, Lipeng; Wang, Feiyi; Oral, H. Sarp

    High-performance computing (HPC) storage systems provide data availability and reliability using various hardware and software fault tolerance techniques. Usually, reliability and availability are calculated at the subsystem or component level using limited metrics such as mean time to failure (MTTF) or mean time to data loss (MTTDL). This often means settling on simple and disconnected failure models (such as an exponential failure rate) to achieve tractable, closed-form solutions. However, such models have been shown to be insufficient for assessing end-to-end storage system reliability and availability. We propose a generic simulation framework aimed at analyzing the reliability and availability of storage systems at scale, and at investigating what-if scenarios. The framework is designed for an end-to-end storage system, accommodating the various components and subsystems, their interconnections, and failure patterns and propagation, and performs dependency analysis to capture a wide range of failure cases. We evaluate the framework against a large-scale storage system that is in production and analyze its failure projections toward and beyond the end of its lifecycle. We also examine the potential operational impact by studying how different types of components affect the overall system reliability and availability, and present preliminary results.

  8. The GRASP 3: Graphical Reliability Analysis Simulation Program. Version 3: A users' manual and modelling guide

    NASA Technical Reports Server (NTRS)

    Phillips, D. T.; Manseur, B.; Foster, J. W.

    1982-01-01

    Alternate definitions of system failure create complex analyses for which analytic solutions are available only for simple, special cases. The GRASP methodology is a computer simulation approach for solving all classes of problems in which both failure and repair events are modeled according to the probability laws of the individual components of the system.
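
    In the spirit of the GRASP approach (though not its actual code), a minimal alternating failure/repair Monte Carlo sketch is shown below for a single repairable component; the exponential failure and lognormal repair distributions and all parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def availability_mc(mttf, mttr, horizon, n_runs=10_000, seed=0):
        """Average availability of one repairable component over a horizon,
        alternating exponential times-to-failure and lognormal repair times."""
        rng = np.random.default_rng(seed)
        up_total = 0.0
        for _ in range(n_runs):
            t, up = 0.0, 0.0
            while t < horizon:
                ttf = rng.exponential(mttf)
                up += min(ttf, horizon - t)            # credit uptime until failure
                t += ttf
                if t >= horizon:
                    break
                t += rng.lognormal(np.log(mttr), 0.5)  # downtime while repairing
            up_total += up / horizon
        return up_total / n_runs

    print(availability_mc(mttf=1000.0, mttr=24.0, horizon=8760.0))
    ```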

  9. Consistent data-driven computational mechanics

    NASA Astrophysics Data System (ADS)

    González, D.; Chinesta, F.; Cueto, E.

    2018-05-01

    We present a novel method, within the realm of data-driven computational mechanics, to obtain reliable and thermodynamically sound simulations from experimental data. We thus avoid the need to fit any phenomenological model in the construction of the simulation model. This kind of technique opens unprecedented possibilities in the framework of data-driven application systems and, particularly, in the paradigm of Industry 4.0.

  10. Aerodynamic force measurement on a large-scale model in a short duration test facility

    NASA Astrophysics Data System (ADS)

    Tanno, H.; Kodera, M.; Komuro, T.; Sato, K.; Takahasi, M.; Itoh, K.

    2005-03-01

    A force measurement technique has been developed for large-scale aerodynamic models with a short test time. The technique is based on direct acceleration measurements, with miniature accelerometers mounted on a test model suspended by wires. By measuring acceleration at two different locations, the technique can eliminate oscillations from the natural vibration of the model. The technique was used for drag force measurements on a 3 m long supersonic combustor model in the HIEST free-piston driven shock tunnel. A time resolution of 350 μs is guaranteed during measurements, which is sufficient for the millisecond-order test times in HIEST. To evaluate measurement reliability and accuracy, measured values were compared with results from a three-dimensional Navier-Stokes numerical simulation. The difference between measured values and numerical simulation values was less than 5%. We conclude that this measurement technique is sufficiently reliable for measuring aerodynamic force within test durations of 1 ms.

  11. Work-in-Progress Presented at the Army Symposium on Solid Mechanics, 1980 - Designing for Extremes: Environment, Loading, and Structural Behavior Held at Cape Cod, Massachusetts, 29 September-2 October 1980

    DTIC Science & Technology

    1980-09-01

    A probabilistic reliability model for the XM 753 projectile rocket motor-to-bulkhead joint under extreme loading conditions is constructed.

  12. Identification of the contribution of the ankle and hip joints to multi-segmental balance control

    PubMed Central

    2013-01-01

    Background Human stance involves multiple segments, including the legs and trunk, and requires coordinated actions of both. A novel method was developed that reliably estimates the contribution of the left and right leg (i.e., the ankle and hip joints) to the balance control of individual subjects. Methods The method was evaluated using simulations of a double-inverted pendulum model and the applicability was demonstrated with an experiment with seven healthy and one Parkinsonian participant. Model simulations indicated that two perturbations are required to reliably estimate the dynamics of a double-inverted pendulum balance control system. In the experiment, two multisine perturbation signals were applied simultaneously. The balance control system dynamic behaviour of the participants was estimated by Frequency Response Functions (FRFs), which relate ankle and hip joint angles to joint torques, using a multivariate closed-loop system identification technique. Results In the model simulations, the FRFs were reliably estimated, also in the presence of realistic levels of noise. In the experiment, the participants responded consistently to the perturbations, indicated by low noise-to-signal ratios of the ankle angle (0.24), hip angle (0.28), ankle torque (0.07), and hip torque (0.33). The developed method could detect that the Parkinson patient controlled his balance asymmetrically, that is, the right ankle and hip joints produced more corrective torque. Conclusion The method allows for a reliable estimate of the multisegmental feedback mechanism that stabilizes stance, of individual participants and of separate legs. PMID:23433148
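
    For the core idea of estimating an FRF from perturbation and response signals, a textbook single-input H1 estimator is sketched below; the actual study uses a multivariate closed-loop identification technique, so this open-loop analogue and its names are only illustrative.

    ```python
    import numpy as np
    from scipy.signal import csd, welch

    def estimate_frf(perturbation, response, fs, nperseg=1024):
        """Non-parametric H1 FRF estimate H(f) = S_ur(f) / S_uu(f) between a
        multisine perturbation u and a measured response r."""
        f, s_uu = welch(perturbation, fs=fs, nperseg=nperseg)   # input auto-spectrum
        _, s_ur = csd(perturbation, response, fs=fs, nperseg=nperseg)  # cross-spectrum
        return f, s_ur / s_uu
    ```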

  13. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. To quantify the uncertainty, it is most important to analyze how the uncertainties arise and propagate, and how the simulations develop from benchmark models to new models. Based on the practical needs of engineering and the technology of verification & validation, a framework for QU (quantification of uncertainty) is put forward for the case in which simulation is used on detonation systems for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  14. Simulation supported POD for RT test case-concept and modeling

    NASA Astrophysics Data System (ADS)

    Gollwitzer, C.; Bellon, C.; Deresch, A.; Ewert, U.; Jaenisch, G.-R.; Zscherpel, U.; Mistral, Q.

    2012-05-01

    Within the framework of the European project PICASSO, the radiographic simulator aRTist (analytical Radiographic Testing inspection simulation tool) developed by BAM has been extended for reliability assessment of film and digital radiography. NDT of safety-relevant components in the aerospace industry requires proof of the probability of detection (POD) of the inspection. Modeling tools can reduce the expense of such extended, time-consuming NDT trials if the simulation results fit the experiment. Our analytic simulation tool consists of three modules for the description of the radiation source, the interaction of radiation with test pieces and flaws, and the detection process, with special focus on film and digital industrial radiography. It features high processing speed with near-interactive frame rates and a high level of realism. A concept has been developed, together with a software extension for reliability investigations, completed by a user interface for planning automatic simulations with varying parameters and defects. Furthermore, an automatic image analysis procedure is included to evaluate defect visibility. Radiographic models generated from 3D CAD of aero-engine components and quality test samples are compared as a precondition for real trials. This enables the evaluation and optimization of film replacement for the application of modern digital equipment for economical NDT and a defined POD.
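
    POD analysis of the simulated inspections ultimately comes down to fitting a POD curve to hit/miss outcomes versus defect size. The sketch below fits a generic log-logistic hit/miss model by maximum likelihood and reports the a90 size; this is a standard textbook formulation, not aRTist's actual analysis, and all names are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def fit_pod_a90(sizes, hits):
        """Maximum-likelihood fit of a log-logistic hit/miss POD curve
        POD(a) = 1 / (1 + exp(-(ln a - mu) / s)) and the a90 defect size."""
        sizes, hits = np.asarray(sizes, float), np.asarray(hits, float)

        def nll(p):
            mu, log_s = p
            z = (np.log(sizes) - mu) / np.exp(log_s)
            pod = np.clip(1.0 / (1.0 + np.exp(-z)), 1e-9, 1 - 1e-9)
            return -np.sum(hits * np.log(pod) + (1 - hits) * np.log(1 - pod))

        res = minimize(nll, x0=[np.log(sizes.mean()), 0.0])
        mu, s = res.x[0], np.exp(res.x[1])
        return np.exp(mu + s * np.log(9.0))   # POD = 0.9 when z = ln 9
    ```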

  15. Safety and reliability analysis in a polyvinyl chloride batch process using dynamic simulator-case study: Loss of containment incident.

    PubMed

    Rizal, Datu; Tani, Shinichi; Nishiyama, Kimitoshi; Suzuki, Kazuhiko

    2006-10-11

    In this paper, a novel methodology for batch plant safety and reliability analysis is proposed using a dynamic simulator. A batch process involves several safety objects (e.g. sensors, controllers, valves, etc.) that are activated during the operational stage. The performance of the safety objects is evaluated by dynamic simulation and a fault propagation model is generated. Using the fault propagation model, an improved fault tree analysis (FTA) method using switching signal mode (SSM) is developed for estimating the probability of failures. Time-dependent failures can be treated as unavailability of safety objects, which can cause accidents in a plant. Finally, the ranking of safety objects is formulated as a performance index (PI) and can be estimated using importance measures. The PI shows the prioritization of safety objects that should be investigated in the safety improvement program of the plant. The output of this method can be used for optimal policies in safety object improvement and maintenance. The dynamic simulator was constructed using Visual Modeler (VM, the plant simulator developed by Omega Simulation Corp., Japan). A case study focuses on a loss of containment (LOC) incident in a polyvinyl chloride (PVC) batch process, which consumes the hazardous material vinyl chloride monomer (VCM).

  16. A numerical insight into elastomer normally closed micro valve actuation with cohesive interfacial cracking modelling

    NASA Astrophysics Data System (ADS)

    Wang, Dongyang; Ba, Dechun; Hao, Ming; Duan, Qihui; Liu, Kun; Mei, Qi

    2018-05-01

    Pneumatic NC (normally closed) valves are widely used in high-density microfluidic systems. To improve actuation reliability, the actuation pressure needs to be reduced. In this work, we utilize 3D FEM (finite element method) modelling to gain insight into the valve actuation process numerically. Specifically, the progressive debonding process at the elastomer interface is simulated with the CZM (cohesive zone model) method. To minimize the actuation pressure, a V-shape design has been investigated and compared with a normal straight design. The geometrical effects of valve shape have been elaborated in terms of valve actuation pressure. Based on our simulation results, we formulate the main concerns for micro valve design and fabrication, which are significant for minimizing actuation pressures and ensuring reliable operation.

  17. An Investigation of the Impact of Guessing on Coefficient α and Reliability

    PubMed Central

    2014-01-01

    Guessing is known to influence the test reliability of multiple-choice tests. Although there are many studies that have examined the impact of guessing, they used rather restrictive assumptions (e.g., parallel test assumptions, homogeneous inter-item correlations, homogeneous item difficulty, and homogeneous guessing levels across items) to evaluate the relation between guessing and test reliability. Based on the item response theory (IRT) framework, this study investigated the extent of the impact of guessing on reliability under more realistic conditions where item difficulty, item discrimination, and guessing levels actually vary across items with three different test lengths (TL). By accommodating multiple item characteristics simultaneously, this study also focused on examining interaction effects between guessing and other variables entered in the simulation to be more realistic. The simulation of the more realistic conditions and calculations of reliability and classical test theory (CTT) item statistics were facilitated by expressing CTT item statistics, coefficient α, and reliability in terms of IRT model parameters. In addition to the general negative impact of guessing on reliability, results showed interaction effects between TL and guessing and between guessing and test difficulty.
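
    The IRT-based simulation logic can be illustrated compactly: simulate responses under a 3PL model whose guessing parameters lift the lower asymptote, then compute coefficient alpha from the simulated score matrix. The sketch below uses illustrative item parameters, not the study's conditions.

    ```python
    import numpy as np

    def simulate_3pl(a, b, c, n_persons=5000, seed=0):
        """Simulate dichotomous item responses under the 3PL IRT model; the
        guessing parameters c lift the lower asymptote of the item curves."""
        rng = np.random.default_rng(seed)
        theta = rng.standard_normal((n_persons, 1))           # person abilities
        p = c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))  # response probabilities
        return (rng.uniform(size=p.shape) < p).astype(int)

    def cronbach_alpha(x):
        """Coefficient alpha from an (n_persons, n_items) score matrix."""
        k = x.shape[1]
        return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum()
                              / x.sum(axis=1).var(ddof=1))

    # Illustrative comparison: guessing (c = 0.25) should depress alpha.
    a, b = np.full(30, 1.2), np.linspace(-2, 2, 30)
    print(cronbach_alpha(simulate_3pl(a, b, np.zeros(30))))
    print(cronbach_alpha(simulate_3pl(a, b, np.full(30, 0.25))))
    ```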

  18. Watershed Models for Decision Support for Inflows to Potholes Reservoir, Washington

    USGS Publications Warehouse

    Mastin, Mark C.

    2009-01-01

    A set of watershed models for four basins (Crab Creek, Rocky Ford Creek, Rocky Coulee, and Lind Coulee), draining into Potholes Reservoir in east-central Washington, was developed as part of a decision support system to aid the U.S. Department of the Interior, Bureau of Reclamation, in managing water resources in east-central Washington State. The project is part of the U.S. Geological Survey and Bureau of Reclamation collaborative Watershed and River Systems Management Program. A conceptual model of hydrology is outlined for the study area that highlights the significant processes that must be captured to accurately simulate discharge under a wide range of conditions. The conceptual model identified the following factors as significant for accurate discharge simulations: (1) influence of frozen ground on peak discharge, (2) evaporation and ground-water flow as major pathways in the system, (3) channel losses, and (4) influence of irrigation practices on reducing or increasing discharge. The Modular Modeling System was used to create a watershed model for the four study basins by combining standard Precipitation Runoff Modeling System modules with modified modules from a previous study and newly modified modules. The model proved unreliable in simulating peak-flow discharge because the index used to track frozen-ground conditions was not reliable. Mean monthly and mean annual discharges were simulated more reliably. Data from seven USGS streamflow-gaging stations were used for comparison with simulated discharge for model calibration and evaluation. Mean annual differences between simulated and observed discharge varied from 1.2 to 13.8 percent for all stations used in the comparisons except one station on a regional ground-water discharge stream. Two-thirds of the mean monthly percent differences between the simulated mean and the observed mean discharge for these six stations were between -20 and 240 percent, or in absolute terms, between -0.8 and 11 cubic feet per second. A graphical user interface was developed for the user to easily run the model, make runoff forecasts, and evaluate the results. The models, however, are not reliable for managing short-term operations because of their demonstrated inability to match individual storm peaks and individual monthly discharge values. Short-term forecasting may be improved with real-time monitoring of the extent of frozen ground and the snow-water equivalent in the basin. Despite the models' unreliability for short-term runoff forecasts, they are useful in providing long-term, time-series discharge data where no observed data exist.

  19. Residual stress investigation of via-last through-silicon via by polarized Raman spectroscopy measurement and finite element simulation

    NASA Astrophysics Data System (ADS)

    Feng, Wei; Watanabe, Naoya; Shimamoto, Haruo; Aoyagi, Masahiro; Kikuchi, Katsuya

    2018-07-01

    The residual stresses induced around through-silicon vias (TSVs) by the fabrication process are among the major reliability concerns. We propose a methodology to investigate the residual stress in a via-last TSV. First, radial and axial thermal stresses were measured by polarized Raman spectroscopy. The agreement between the simulated stress levels and the measured results validated the detailed simulation model. The validated simulation model was then used to study the residual stress via element death/birth methods. The residual stress at room temperature concentrates at the passivation layers owing to the high fabrication process temperatures of 420 °C for the SiN film and 350 °C for the SiO2 films. In the Si substrate, high stress was observed near potential device locations, which requires attention to address reliability concerns in stress-sensitive devices. This methodology of residual stress analysis can be adopted to investigate the residual stress in other devices.

  20. Building Blocks for Reliable Complex Nonlinear Numerical Simulations

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Mansour, Nagi N. (Technical Monitor)

    2002-01-01

    This talk describes some of the building blocks to ensure a higher level of confidence in the predictability and reliability (PAR) of numerical simulation of multiscale complex nonlinear problems. The focus is on relating PAR of numerical simulations with complex nonlinear phenomena of numerics. To isolate sources of numerical uncertainties, the possible discrepancy between the chosen partial differential equation (PDE) model and the real physics and/or experimental data is set aside. The discussion is restricted to how well numerical schemes can mimic the solution behavior of the underlying PDE model for finite time steps and grid spacings. The situation is complicated by the fact that the available theory for the understanding of nonlinear behavior of numerics is not at a stage to fully analyze the nonlinear Euler and Navier-Stokes equations. The discussion is based on the knowledge gained for nonlinear model problems with known analytical solutions to identify and explain the possible sources and remedies of numerical uncertainties in practical computations. Examples relevant to turbulent flow computations are included.

  1. Building Blocks for Reliable Complex Nonlinear Numerical Simulations

    NASA Technical Reports Server (NTRS)

    Yee, H. C.

    2005-01-01

    This chapter describes some of the building blocks to ensure a higher level of confidence in the predictability and reliability (PAR) of numerical simulation of multiscale complex nonlinear problems. The focus is on relating PAR of numerical simulations with complex nonlinear phenomena of numerics. To isolate sources of numerical uncertainties, the possible discrepancy between the chosen partial differential equation (PDE) model and the real physics and/or experimental data is set aside. The discussion is restricted to how well numerical schemes can mimic the solution behavior of the underlying PDE model for finite time steps and grid spacings. The situation is complicated by the fact that the available theory for the understanding of nonlinear behavior of numerics is not at a stage to fully analyze the nonlinear Euler and Navier-Stokes equations. The discussion is based on the knowledge gained for nonlinear model problems with known analytical solutions to identify and explain the possible sources and remedies of numerical uncertainties in practical computations.

  2. Building Blocks for Reliable Complex Nonlinear Numerical Simulations. Chapter 2

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Mansour, Nagi N. (Technical Monitor)

    2001-01-01

    This chapter describes some of the building blocks to ensure a higher level of confidence in the predictability and reliability (PAR) of numerical simulation of multiscale complex nonlinear problems. The focus is on relating PAR of numerical simulations with complex nonlinear phenomena of numerics. To isolate sources of numerical uncertainties, the possible discrepancy between the chosen partial differential equation (PDE) model and the real physics and/or experimental data is set aside. The discussion is restricted to how well numerical schemes can mimic the solution behavior of the underlying PDE model for finite time steps and grid spacings. The situation is complicated by the fact that the available theory for the understanding of nonlinear behavior of numerics is not at a stage to fully analyze the nonlinear Euler and Navier-Stokes equations. The discussion is based on the knowledge gained for nonlinear model problems with known analytical solutions to identify and explain the possible sources and remedies of numerical uncertainties in practical computations. Examples relevant to turbulent flow computations are included.

  3. Determination of Turboprop Reduction Gearbox System Fatigue Life and Reliability

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.; Lewicki, David G.; Savage, Michael; Vlcek, Brian L.

    2007-01-01

    Two computational models to determine the fatigue life and reliability of a commercial turboprop gearbox are compared with each other and with field data. These models are (1) Monte Carlo simulation of randomly selected lives of individual bearings and gears comprising the system and (2) two-parameter Weibull distribution function for bearings and gears comprising the system using strict-series system reliability to combine the calculated individual component lives in the gearbox. The Monte Carlo simulation included the virtual testing of 744,450 gearboxes. Two sets of field data were obtained from 64 gearboxes that were first-run to removal for cause, were refurbished and placed back in service, and then were second-run until removal for cause. A series of equations were empirically developed from the Monte Carlo simulation to determine the statistical variation in predicted life and Weibull slope as a function of the number of gearboxes failed. The resultant L(sub 10) life from the field data was 5,627 hr. From strict-series system reliability, the predicted L(sub 10) life was 774 hr. From the Monte Carlo simulation, the median value for the L(sub 10) gearbox lives equaled 757 hr. Half of the gearbox L(sub 10) lives will be less than this value and the other half more. The resultant L(sub 10) life of the second-run (refurbished) gearboxes was 1,334 hr. The apparent load-life exponent p for the roller bearings is 5.2. Were the bearing lives to be recalculated with a load-life exponent p equal to 5.2, the predicted L(sub 10) life of the gearbox would be equal to the actual life obtained in the field. The component failure distribution of the gearbox from the Monte Carlo simulation was nearly identical to that using the strict-series system reliability analysis, proving the compatibility of these methods.
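
    A minimal sketch of the Monte Carlo side of such a comparison: each virtual gearbox fails at the minimum of Weibull-distributed component lives (strict series), and the L10 life is read off as the 10th percentile of the simulated system lives. The Weibull parameters here are placeholders, not the paper's values.

    ```python
    import numpy as np

    def gearbox_l10_mc(etas, betas, n_boxes=100_000, seed=0):
        """Monte Carlo L10 life of a strict-series system: each virtual gearbox
        fails at the minimum of its Weibull-distributed component lives."""
        rng = np.random.default_rng(seed)
        lives = np.array([eta * rng.weibull(beta, n_boxes)
                          for eta, beta in zip(etas, betas)])
        system_life = lives.min(axis=0)        # first component failure ends the box
        return np.percentile(system_life, 10)  # 10% of boxes fail by this time

    # Placeholder characteristic lives (hr) and Weibull slopes for a few
    # bearings and gears -- not the paper's values.
    print(gearbox_l10_mc(etas=[9000, 12000, 15000, 20000],
                         betas=[1.5, 1.5, 2.0, 2.5]))
    ```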

  4. Functionalization of MEMS cantilever beams for interconnect reliability investigation: development practice

    NASA Astrophysics Data System (ADS)

    Bieniek, T.; Janczyk, G.; Dobrowolski, R.; Wojciechowska, K.; Malinowska, A.; Panas, A.; Nieprzecki, M.; Kłos, H.

    2016-11-01

    This paper covers research results on the development of cantilever beam test structures for interconnect reliability and robustness investigation. The presented results include the design, modelling, simulation, optimization, and finally the fabrication stage, performed on 4-inch Si wafers using the ITE microfabrication facility. The paper also covers experimental results from the characterization of the test structures.

  5. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    NASA Astrophysics Data System (ADS)

    Gromek, Katherine Emily

    A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e., PPoF-based) modeling. This concept originated in artificial intelligence (AI) as a leading intelligent computational inference approach in the modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and the modeling of interacting failure mechanisms within a dynamic engineering system. The autonomy property of intelligent agents is defined as an agent's ability to self-activate, deactivate, or completely redefine its role in the analysis. This property of agents, together with the ability to model interacting failure mechanisms of the system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  6. Validation of A Global Hydrological Model

    NASA Astrophysics Data System (ADS)

    Doell, P.; Lehner, B.; Kaspar, F.; Vassolo, S.

    Freshwater availability has been recognized as a global issue, and its consistent quantification not only in individual river basins but also at the global scale is required to support the sustainable use of water. The Global Hydrology Model WGHM, which is a submodel of the global water use and availability model WaterGAP 2, computes surface runoff, groundwater recharge and river discharge at a spatial resolution of 0.5°. WGHM is based on the best global data sets currently available, including a newly developed drainage direction map and a data set of wetlands, lakes and reservoirs. It calculates both natural and actual discharge by simulating the reduction of river discharge by human water consumption (as computed by the water use submodel of WaterGAP 2). WGHM is calibrated against observed discharge at 724 gauging stations (representing about 50% of the global land area) by adjusting a parameter of the soil water balance. It not only computes the long-term average water resources but also water availability indicators that take into account the interannual and seasonal variability of runoff and discharge. The reliability of the model results is assessed by comparing observed and simulated discharges at the calibration stations and at selected other stations. We conclude that reliable results can be obtained for basins of more than 20,000 km2. In particular, the 90% reliable monthly discharge is simulated well. However, there is the tendency that semi-arid and arid basins are modeled less satisfactorily than humid ones, which is partially due to neglecting river channel losses and evaporation of runoff from small ephemeral ponds in the model. Also, the hydrology of highly developed basins with large artificial storages, basin transfers and irrigation schemes cannot be simulated well. The seasonality of discharge in snow-dominated basins is overestimated by WGHM, and if the snow-dominated basin is uncalibrated, discharge is likely to be underestimated due to precipitation measurement errors. Even though the explicit modeling of wetlands and lakes leads to a much improved modeling of both the vertical water balance and the lateral transport of water, not enough information is included in WGHM to accurately capture the hydrology of these water bodies. Certainly, the reliability of model results is highest at the locations at which WGHM was calibrated. The validation indicates that reliability for cells inside calibrated basins is satisfactory if the basin is relatively homogeneous. Analyses of the few available stations outside of calibrated basins indicate a reasonably high model reliability, particularly in humid regions.

  7. Investigating a self-scoring interview simulation for learning and assessment in the medical consultation.

    PubMed

    Bruen, Catherine; Kreiter, Clarence; Wade, Vincent; Pawlikowska, Teresa

    2017-01-01

    Experience with simulated patients supports undergraduate learning of medical consultation skills, and adaptive simulations are being introduced into this environment. The authors investigate whether adaptive simulation can underpin valid and reliable assessment by conducting a generalizability analysis using IT data analytics from the interaction of medical students (in psychiatry) with adaptive simulations, to explore the feasibility of adaptive simulations for supporting automated learning and assessment. The generalizability (G) study was focused on two clinically relevant variables: clinical decision points and communication skills. While the G study on the communication skills score yielded low levels of true score variance, the decision points, which reflect clinical decision-making and confirm user knowledge of the Calgary-Cambridge model of consultation, produced reliability levels similar to what might be expected with rater-based scoring. The findings indicate that adaptive simulations have potential as a teaching and assessment tool for medical consultations.

  8. Markov chains for testing redundant software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1988-01-01

    A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
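    As a rough illustration of the approach described above, the following sketch estimates transition probabilities of a small Markov chain from an observed state sequence and propagates them to a failure probability over many control cycles. The three states and the test sequence are hypothetical, and the actual experiment attaches confidence intervals to the estimated probabilities rather than using point estimates.

    ```python
    import numpy as np

    # Hypothetical states: 0 = all versions agree, 1 = recoverable disagreement,
    # 2 = uncontrolled system failure (absorbing). One simulated test run:
    observed = [0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 2]

    n_states = 3
    counts = np.zeros((n_states, n_states))
    for a, b in zip(observed[:-1], observed[1:]):
        counts[a, b] += 1

    # Point estimates of the transition probabilities from the observed runs.
    rows = counts.sum(axis=1, keepdims=True)
    P = np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)
    P[2, 2] = 1.0                                   # failure is absorbing

    # Unreliability over k control cycles = probability of absorption by step k.
    k = 50
    p = np.zeros(n_states)
    p[0] = 1.0
    for _ in range(k):
        p = p @ P
    print(f"estimated P(system failure within {k} cycles) = {p[2]:.3f}")
    ```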

  9. Simulated training in colonoscopic stenting of colonic strictures: validation of a cadaver model.

    PubMed

    Iordache, F; Bucobo, J C; Devlin, D; You, K; Bergamaschi, R

    2015-07-01

    There are currently no available simulation models for training in colonoscopic stent deployment. The aim of this study was to validate a cadaver model for simulation training in colonoscopy with stent deployment for colonic strictures. This was a prospective study enrolling surgeons at a single institution. Participants performed colonoscopic stenting on a cadaver model. Their performance was assessed by two independent observers. Measurements were performed for quantitative analysis (time to identify stenosis, time for deployment, accuracy) and a weighted score was devised for assessment. The Mann-Whitney U-test and Student's t-test were used for nonparametric and parametric data, respectively. Cohen's kappa coefficient was used for reliability. Twenty participants performed a colonoscopy with deployment of a self-expandable metallic stent in two cadavers (groups A and B) with 20 strictures overall. The median time was 206 s. The model was able to differentiate between experts and novices (P = 0.013). The results showed a good consensus estimate of reliability, with kappa = 0.571 (P < 0.0001). The cadaver model described in this study has content, construct and concurrent validity for simulation training in colonoscopic deployment of self-expandable stents for colonic strictures. Further studies are needed to evaluate the predictive validity of this model in terms of skill transfer to clinical practice. Colorectal Disease © 2014 The Association of Coloproctology of Great Britain and Ireland.

  10. Two-bead polarizable water models combined with a two-bead multipole force field (TMFF) for coarse-grained simulation of proteins.

    PubMed

    Li, Min; Zhang, John Z H

    2017-03-08

    The development of polarizable water models at coarse-grained (CG) levels is of much importance to CG molecular dynamics simulations of large biomolecular systems. In this work, we combined the newly developed two-bead multipole force field (TMFF) for proteins with the two-bead polarizable water models to carry out CG molecular dynamics simulations for benchmark proteins. In our simulations, two different two-bead polarizable water models are employed, the RTPW model representing five water molecules by Riniker et al. and the LTPW model representing four water molecules. The LTPW model is developed in this study based on the Martini three-bead polarizable water model. Our simulation results showed that the combination of TMFF with the LTPW model significantly stabilizes the protein's native structure in CG simulations, while the use of the RTPW model gives better agreement with all-atom simulations in predicting the residue-level fluctuation dynamics. Overall, the TMFF coupled with the two-bead polarizable water models enables one to perform an efficient and reliable CG dynamics study of the structural and functional properties of large biomolecules.

  11. Design process for applying the nonlocal thermal transport iSNB model to a Polar-Drive ICF simulation

    NASA Astrophysics Data System (ADS)

    Cao, Duc; Moses, Gregory; Delettrez, Jacques; Collins, Timothy

    2014-10-01

    A design process is presented for the nonlocal thermal transport iSNB (implicit Schurtz, Nicolai, and Busquet) model to provide reliable nonlocal thermal transport in polar-drive ICF simulations. Results from the iSNB model are known to be sensitive to changes in the SNB "mean free path" formula, and the latter's original form required modification to obtain realistic preheat levels. In the presented design process, SNB mean free paths are first modified until the model can match temperatures from Goncharov's thermal transport model in 1D temperature relaxation simulations. Afterwards the same mean free paths are tested in a 1D polar-drive surrogate simulation to match adiabats from Goncharov's model. After passing the two previous steps, the model can then be run in a full 2D polar-drive simulation. This research is supported by the University of Rochester Laboratory for Laser Energetics.

  12. The impact of pharmacophore modeling in drug design.

    PubMed

    Guner, Osman F

    2005-07-01

    With the reliable use of computer simulations in scientific research, it is possible to achieve significant increases in productivity as well as a reduction in research costs compared with experimental approaches. For example, computer simulation can substantially enhance productivity by steering the scientist toward better, more informed choices, while also driving the 'fail-early' concept, resulting in a significant reduction in cost. Pharmacophore modeling is a reliable computer-aided design tool used in the discovery of new classes of compounds for a given therapeutic category. This commentary will briefly review the benefits and applications of this technology in drug discovery and design, and will also highlight its historical evolution. The two most commonly used approaches for pharmacophore model development will be discussed, and several examples of how this technology was successfully applied to identify new potent leads will be provided. The article concludes with a brief outline of the controversial issue of patentability of pharmacophore models.

  13. Modeling and simulation of different and representative engineering problems using Network Simulation Method

    PubMed Central

    2018-01-01

    Mathematical models simulating different and representative engineering problems (atomic dry friction, moving-front problems, and elastic and solid mechanics) are presented in the form of a set of non-linear, coupled or uncoupled differential equations. For different values of the parameters that influence the solution, the problem is numerically solved by the network method, which provides all the variables of the problem. Although the model is extremely sensitive to the above parameters, no assumptions are made regarding the linearization of the variables. The design of the models, which are run on standard electrical circuit simulation software, is explained in detail. The network model results are compared with common numerical methods or experimental data published in the scientific literature to show the reliability of the model. PMID:29518121

  14. Modeling and simulation of different and representative engineering problems using Network Simulation Method.

    PubMed

    Sánchez-Pérez, J F; Marín, F; Morales, J L; Cánovas, M; Alhama, F

    2018-01-01

    Mathematical models simulating different and representative engineering problems (atomic dry friction, moving-front problems, and elastic and solid mechanics) are presented in the form of a set of non-linear, coupled or uncoupled differential equations. For different values of the parameters that influence the solution, the problem is numerically solved by the network method, which provides all the variables of the problem. Although the model is extremely sensitive to the above parameters, no assumptions are made regarding the linearization of the variables. The design of the models, which are run on standard electrical circuit simulation software, is explained in detail. The network model results are compared with common numerical methods or experimental data published in the scientific literature to show the reliability of the model.

  15. Research on support effectiveness modeling and simulating of aviation materiel autonomic logistics system

    NASA Astrophysics Data System (ADS)

    Zhou, Yan; Zhou, Yang; Yuan, Kai; Jia, Zhiyu; Li, Shuo

    2018-05-01

    Aiming at the demonstration of the autonomic logistics system to be used with the new generation of aviation materiel in our country, methods for modeling and simulating aviation materiel support effectiveness under autonomic logistics are studied. Firstly, this paper introduces the idea of JSF autonomic logistics and analyzes the influence of autonomic logistics on support effectiveness from the aspects of reliability, false alarm rate, troubleshooting time, support delay time, and maintenance level. On this basis, the paper studies the modeling and simulation methods of support effectiveness considering autonomic logistics, and puts forward a maintenance support simulation process that accounts for autonomic logistics. Finally, taking a typical aviation materiel as an example, this paper analyzes and verifies the above-mentioned support effectiveness modeling and simulation method for aviation materiel considering autonomic logistics.

  16. Technical Reliability Studies. EOS/ESD Technology Abstracts

    DTIC Science & Technology

    1982-01-01

    [Garbled index listing; recoverable entries include: electrostatic-discharge-resistant bipolar transistor design and its applications to linear integrated circuits; a module electrostatic discharge simulator; static discharge modeling techniques for evaluation of integrated (FET) circuit destruction; and evaluation of plastic LSI circuits.]

  17. Modeling Imperfect Generator Behavior in Power System Operation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krad, Ibrahim

    A key component in power system operations is the use of computer models to quickly study and analyze different operating conditions and futures in an efficient manner. The outputs of these models are sensitive to the data used in them as well as to the assumptions made during their execution. One typical assumption is that generators and load assets perfectly follow operator control signals. While this is a valid simulation assumption, generators may not always accurately follow control signals. This imperfect response of generators could impact cost and reliability metrics. This paper proposes a generator model that captures this imperfect behavior and examines its impact on production costs and reliability metrics using a steady-state power system operations model. Preliminary analysis shows that while costs remain relatively unchanged, there could be significant impacts on reliability metrics.
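    A minimal sketch of the idea, assuming a simple ramp-limited, noisy response to dispatch setpoints (the paper's actual generator model is not specified here); all numbers are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical 5-minute setpoint trajectory (MW) from a dispatch model.
    setpoint = np.array([100, 110, 125, 120, 130, 128, 140, 135], dtype=float)

    ramp_limit = 8.0   # MW per interval: physical ramping capability
    noise_sd = 2.0     # MW: imperfect tracking of the control signal

    actual = np.empty_like(setpoint)
    actual[0] = setpoint[0]
    for t in range(1, len(setpoint)):
        desired_move = setpoint[t] - actual[t - 1]
        move = np.clip(desired_move, -ramp_limit, ramp_limit)  # ramp-limited response
        actual[t] = actual[t - 1] + move + rng.normal(0.0, noise_sd)

    # A simple reliability-style metric: the imbalance the rest of the fleet must cover.
    imbalance = setpoint - actual
    print("mean abs deviation from signal:", np.abs(imbalance).mean().round(2), "MW")
    ```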

  18. Stress Rupture Life Reliability Measures for Composite Overwrapped Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Thesken, John C.; Phoenix, S. Leigh; Grimes-Ledesma, Lorie

    2007-01-01

    Composite Overwrapped Pressure Vessels (COPVs) are often used for storing pressurant gases onboard spacecraft. Kevlar (DuPont), glass, carbon and other more recent fibers have all been used as overwraps. Because overwraps are subjected to sustained loads for an extended period during a mission, stress rupture failure is a major concern. It is therefore important to ascertain the reliability of these vessels by analysis, since the testing of each flight design cannot be completed on a practical time scale. The present paper examines specifically a Weibull statistics based stress rupture model and considers the various uncertainties associated with the model parameters. The paper also examines several reliability estimate measures that would be of use for the purpose of recertification and for qualifying the flight worthiness of these vessels. Specifically, deterministic values for a point estimate, a mean estimate, and 90/95 percent confidence estimates of the reliability are all examined for a typical flight quality vessel under constant stress. The mean and the 90/95 percent confidence estimates are computed using Monte Carlo simulation techniques, assuming distribution statistics of model parameters based on simulation and on the available data, especially the sample sizes represented in the data. The data for the stress rupture model are obtained from the Lawrence Livermore National Laboratories (LLNL) stress rupture testing program, carried out for the past 35 years. Deterministic as well as probabilistic sensitivities are examined.
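    To make the flavor of such an analysis concrete, the following sketch propagates hypothetical parameter uncertainty through an assumed Weibull stress-rupture form by Monte Carlo, yielding a mean reliability estimate and a conservative percentile; none of the distributions or constants are the LLNL values.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Assumed Weibull stress-rupture form with a power-law stress-life effect.
    t_mission = 10.0 * 365 * 24          # 10-year sustained load, hours
    stress_ratio = 0.5                   # applied stress / short-term strength

    n = 200_000
    beta = rng.normal(1.2, 0.15, n)      # Weibull shape, uncertain (hypothetical)
    rho = rng.normal(20.0, 2.0, n)       # stress-life exponent, uncertain (hypothetical)
    t_ref = 1.0e4                        # reference scale (h) at stress ratio 1

    # The effective scale grows rapidly as the stress ratio drops below 1.
    scale = t_ref * stress_ratio ** (-rho)
    reliability = np.exp(-(t_mission / scale) ** beta)

    print("mean (point) estimate :", reliability.mean().round(6))
    print("5th percentile        :", np.quantile(reliability, 0.05).round(6))
    ```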

  19. Assessments of a Turbulence Model Based on Menter's Modification to Rotta's Two-Equation Model

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.

    2013-01-01

    The main objective of this paper is to construct a turbulence model with a more reliable second equation for simulating the turbulent length scale. In the present paper, we assess the length scale equation based on Menter's modification to Rotta's two-equation model. Rotta shows that a reliable second equation can be formed in an exact transport equation from the turbulent length scale L and kinetic energy. Rotta's equation is well suited for term-by-term modeling and shows some interesting features compared to other approaches. The most important difference is that the formulation leads to a natural inclusion of higher order velocity derivatives into the source terms of the scale equation, which has the potential to enhance the capability of Reynolds-averaged Navier-Stokes (RANS) methods to simulate unsteady flows. The model is implemented in the PAB3D solver with complete formulation, usage methodology, and validation examples to demonstrate its capabilities. The detailed studies include grid convergence. Near-wall and shear-flow cases are documented and compared with experimental and Large Eddy Simulation (LES) data. The results from this formulation are as good as or better than those of the well-known SST turbulence model and much better than k-epsilon results. Overall, the study provides useful insights into the model's capability in predicting attached and separated flows.

  20. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated, using the perturbation method, the response surface method, the Edgeworth series, and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis based finite element modeling in engineering practice.

  1. Modeling, Simulation and Analysis of Public Key Infrastructure

    NASA Technical Reports Server (NTRS)

    Liu, Yuan-Kwei; Tuey, Richard; Ma, Paul (Technical Monitor)

    1998-01-01

    Security is an essential part of network communication. The advances in cryptography have provided solutions to many of the network security requirements. Public Key Infrastructure (PKI) is the foundation of cryptography applications. The main objective of this research is to design a model to simulate a reliable, scalable, manageable, and high-performance public key infrastructure. We build a model to simulate the NASA public key infrastructure by using SimProcess and MatLab software. The simulation spans from the top level all the way down to the computation needed for encryption, decryption, digital signatures, and a secure web server. The secure web server application could be utilized in wireless communications. The results of the simulation are analyzed and confirmed by using queueing theory.

  2. Monte Carlo simulation of Ray-Scan 64 PET system and performance evaluation using GATE toolkit

    NASA Astrophysics Data System (ADS)

    Li, Suying; Zhang, Qiushi; Vuletic, Ivan; Xie, Zhaoheng; Yang, Kun; Ren, Qiushi

    2017-02-01

    In this study, we aimed to develop a GATE model for the simulation of the Ray-Scan 64 PET scanner and to model its performance characteristics. A detailed implementation of the system geometry and physical processes was included in the simulation model. We then modeled the performance characteristics of the Ray-Scan 64 PET system for the first time, based on the National Electrical Manufacturers Association (NEMA) NU-2 2007 protocols, and validated the model against experimental measurements, including spatial resolution, sensitivity, counting rates and noise equivalent count rate (NECR). Moreover, an accurate dead time module was investigated to simulate the counting rate performance. Overall results showed reasonable agreement between simulation and experimental data. The validation results showed the reliability and feasibility of the GATE model for evaluating the major performance characteristics of the Ray-Scan 64 PET system. It provides a useful tool for a wide range of research applications.
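    As one concrete example of the kind of counting-rate behavior such a dead-time module must reproduce, the sketch below applies a standard non-paralyzable dead-time correction to true event rates; the dead-time constant is an illustrative assumption, not a Ray-Scan 64 calibration value.

    ```python
    # Non-paralyzable dead-time model: observed = true / (1 + true * tau).
    def observed_rate(true_rate_cps: float, tau_s: float = 2e-6) -> float:
        return true_rate_cps / (1.0 + true_rate_cps * tau_s)

    # Saturation becomes visible as the true rate approaches 1/tau.
    for n in (1e4, 1e5, 5e5, 1e6):
        print(f"true {n:>9.0f} cps -> observed {observed_rate(n):,.0f} cps")
    ```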

  3. Finite element simulation of cutting grey iron HT250 by self-prepared Si3N4 ceramic insert

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Wang, Li; Zhang, Enguang

    2017-04-01

    The finite element method is able to simulate and solve practical machining problems while achieving the required accuracy and high reliability. In this paper, simulation models based on the material properties of the self-prepared Si3N4 insert and HT250 were created. Using these models, results for cutting force, cutting temperature and tool wear rate were obtained, and the tool wear mode was predicted after the cutting simulation. These approaches may develop into a new method for testing cutting-tool materials, shortening the development cycle and reducing costs.

  4. Development and validation of the Simulation Learning Effectiveness Inventory.

    PubMed

    Chen, Shiah-Lian; Huang, Tsai-Wei; Liao, I-Chen; Liu, Chienchi

    2015-10-01

    To develop and psychometrically test the Simulation Learning Effectiveness Inventory. High-fidelity simulation helps students develop clinical skills and competencies, yet reliable instruments measuring learning outcomes are scant. A descriptive cross-sectional survey was used to validate the psychometric properties of the instrument measuring students' perception of simulation learning effectiveness. A purposive sample of 505 nursing students who had taken simulation courses was recruited from a department of nursing of a university in central Taiwan from January to June 2010. The study was conducted in two phases. In Phase I, question items were developed based on the literature review and the preliminary psychometric properties of the inventory were evaluated using exploratory factor analysis. Phase II was conducted to evaluate the reliability and validity of the finalized inventory using confirmatory factor analysis. The results of exploratory and confirmatory factor analyses revealed that the instrument was composed of seven factors, named course arrangement, equipment resource, debriefing, clinical ability, problem-solving, confidence and collaboration. A further second-order analysis showed comparable fit between a three-factor second-order model (preparation, process and outcome) and the seven-factor first-order model. Internal consistency was supported by adequate Cronbach's alphas and composite reliability. Convergent and discriminant validities were also supported by confirmatory factor analysis. The study provides evidence that the Simulation Learning Effectiveness Inventory is reliable and valid for measuring student perception of learning effectiveness. The instrument is helpful in building evidence-based knowledge of the effect of simulation teaching on students' learning outcomes. © 2015 John Wiley & Sons Ltd.
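    Internal consistency of the kind reported here is typically summarized with Cronbach's alpha. The sketch below computes it from a hypothetical item-score matrix; the subscale name and scores are invented for illustration.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: (n_respondents, n_items) matrix of Likert-type scores."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars / total_var)

    # Hypothetical responses from 6 students on a 4-item "debriefing" subscale.
    scores = np.array([
        [4, 5, 4, 5],
        [3, 3, 4, 3],
        [5, 5, 5, 4],
        [2, 3, 2, 3],
        [4, 4, 5, 5],
        [3, 4, 3, 4],
    ])
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.3f}")
    ```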

  5. Specific spice modeling of microcrystalline silicon TFTs

    NASA Astrophysics Data System (ADS)

    Moustapha, O.; Bui, V. D.; Bonnassieux, Y.; Parey, J. Y.

    2008-03-01

    In this paper we present a specific SPICE static and dynamic model of microcrystalline silicon (μc-Si) thin film transistors (TFTs), taking into account the access resistances and the capacitor contributions. The previously existing models of amorphous silicon and polysilicon TFTs were not completely suited, so we combined them to build a new specific model of μc-Si TFTs. The reliability of the model is then checked by comparing experimental measurements to simulations and by simulating the characteristics of some electronic devices (OLED pixels, inverters, and so on).

  6. Comparison of Varied Precipitation and Soil Data Types for Use in Watershed Modeling.

    EPA Science Inventory

    The accuracy of water quality and quantity models depends on calibration to ensure reliable simulations of streamflow, which in turn requires accurate climatic forcing data. Precipitation is widely acknowledged to be the largest source of uncertainty in watershed modeling, and so...

  7. Developing a novel hierarchical approach for multiscale structural reliability predictions for ultra-high consequence applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, John M.; Coffin, Peter; Robbins, Brian A.

    Microstructural variabilities are among the predominant sources of uncertainty in structural performance and reliability. We seek to develop efficient algorithms for multiscale calculations for polycrystalline alloys such as aluminum alloy 6061-T6 in environments where ductile fracture is the dominant failure mode. Our approach employs concurrent multiscale methods, but does not focus on their development. They are a necessary but not sufficient ingredient to multiscale reliability predictions. We have focused on how to efficiently use concurrent models for forward propagation because practical applications cannot include fine-scale details throughout the problem domain due to exorbitant computational demand. Our approach begins with a low-fidelity prediction at the engineering scale that is subsequently refined with multiscale simulation. The results presented in this report focus on plasticity and damage at the meso-scale, efforts to expedite Monte Carlo simulation with microstructural considerations, modeling aspects regarding geometric representation of grains and second-phase particles, and contrasting algorithms for scale coupling.

  8. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 3: HARP Graphics Oriented (GO) input user's guide

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Mittal, Nitin; Koppen, Sandra Howell

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of highly reliable fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical preprocessor Graphics Oriented (GO) program. GO is a graphical user interface for the HARP engine that enables the drawing of reliability/availability models on a monitor. A mouse is used to select fault tree gates or Markov graphical symbols from a menu for drawing.

  9. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis

    PubMed Central

    Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina

    2015-01-01

    Because wind turbines are used for prolonged periods, they must be characterized by high reliability. This can be achieved through rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing the failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on a Monte Carlo simulation model is developed which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of the simulation results has been focused on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental research consists of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine designed and patented for the climatic conditions of Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed. PMID:26167524

  10. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis.

    PubMed

    Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina

    2015-01-01

    Because wind turbines are used for prolonged periods, they must be characterized by high reliability. This can be achieved through rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing the failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on a Monte Carlo simulation model is developed which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of the simulation results has been focused on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental research consists of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine designed and patented for the climatic conditions of Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed.
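    The triangular-distribution Monte Carlo described in both records above is easy to reproduce in outline. The sketch below samples wind speed and a power coefficient from triangular distributions and builds the relative-frequency histogram and ogive of daily energy output; every (min, mode, max) triple and turbine constant is a hypothetical placeholder, not the authors' measured data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n = 100_000
    wind_speed = rng.triangular(2.0, 5.0, 12.0, n)     # m/s (hypothetical)
    cp = rng.triangular(0.25, 0.32, 0.40, n)           # power coefficient (hypothetical)
    rho, area = 1.225, 4.0                             # air density, swept area

    power_w = 0.5 * rho * area * cp * wind_speed ** 3  # instantaneous power, W
    daily_kwh = power_w * 24 / 1000

    # Relative-frequency histogram and ogive (cumulative curve) of daily output.
    hist, edges = np.histogram(daily_kwh, bins=10)
    ogive = np.cumsum(hist) / n
    for e, c in zip(edges[1:], ogive):
        print(f"P(daily energy <= {e:7.1f} kWh) = {c:.3f}")
    ```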

  11. Development of a biosphere hydrological model considering vegetation dynamics and its evaluation at basin scale under climate change

    NASA Astrophysics Data System (ADS)

    Li, Qiaoling; Ishidaira, Hiroshi

    2012-01-01

    The biosphere and hydrosphere are intrinsically coupled. The scientific question is: if there is a substantial change in one component, such as vegetation cover, how will the other components, such as transpiration and runoff generation, respond, especially under climate change conditions? Stand-alone hydrological models have a detailed description of hydrological processes but do not sufficiently parameterize vegetation as a dynamic component. Dynamic global vegetation models (DGVMs) are able to simulate transient structural changes in major vegetation types but do not simulate runoff generation reliably. Therefore, both hydrological models and DGVMs have their limitations as well as advantages for addressing this question. In this study a biosphere hydrological model (LPJH) is developed by coupling a prominent DGVM (the Lund-Potsdam-Jena model, referred to as LPJ) with a stand-alone hydrological model (HYMOD), with the objective of analyzing the role of vegetation in hydrological processes at the basin scale and evaluating the impact of vegetation change on hydrological processes under climate change. The application and validation of the LPJH model in four basins representing a variety of climate and vegetation conditions shows that the performance of LPJH is much better than that of the original LPJ and is similar to that of stand-alone hydrological models for monthly and daily runoff simulation at the basin scale. It is argued that the LPJH model gives a more reasonable hydrological simulation since it considers both the spatial variability of soil moisture and vegetation dynamics, which make the runoff generation mechanism more reliable. As an example, it is shown that changing atmospheric CO2 content alone would result in runoff increases in humid basins and decreases in arid basins. These changes are mainly attributable to changes in transpiration driven by vegetation dynamics, which are not simulated in stand-alone hydrological models. Therefore LPJH potentially provides a powerful tool for simulating vegetation response to climate changes in the biosphere hydrological cycle.

  12. Representing pump-capacity relations in groundwater simulation models

    USGS Publications Warehouse

    Konikow, Leonard F.

    2010-01-01

    The yield (or discharge) of constant-speed pumps varies with the total dynamic head (or lift) against which the pump is discharging. The variation in yield over the operating range of the pump may be substantial. In groundwater simulations that are used for management evaluations or other purposes, where predictive accuracy depends on the reliability of future discharge estimates, model reliability may be enhanced by including the effects of head-capacity (or pump-capacity) relations on the discharge from the well. A relatively simple algorithm has been incorporated into the widely used MODFLOW groundwater flow model that allows a model user to specify head-capacity curves. The algorithm causes the model to automatically adjust the pumping rate each time step to account for the effect of drawdown in the cell and changing lift, and will shut the pump off if lift exceeds a critical value. The algorithm is available as part of a new multinode well package (MNW2) for MODFLOW.
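    A minimal sketch of the head-capacity logic described above (not the actual MNW2 source): a user-supplied curve is interpolated to obtain the achievable discharge at the current lift, and the pump shuts off above a critical lift. The curve values and elevations are hypothetical.

    ```python
    import numpy as np

    # Hypothetical head-capacity curve: yield falls as total dynamic head rises.
    lift_points = np.array([10.0, 20.0, 30.0, 40.0])   # m of total dynamic head
    q_points = np.array([95.0, 80.0, 50.0, 0.0])       # m3/d yield at each lift
    critical_lift = 38.0                               # pump shutoff threshold, m

    def pump_discharge(water_level: float, discharge_elev: float) -> float:
        lift = discharge_elev - water_level
        if lift >= critical_lift:
            return 0.0                                 # pump off
        return float(np.interp(lift, lift_points, q_points))

    # Each time step the simulated cell head supplies a new water level, so the
    # pumping rate is re-evaluated as drawdown increases the lift.
    for head in [95.0, 90.0, 85.0, 66.0]:
        print(head, "->", pump_discharge(head, discharge_elev=105.0), "m3/d")
    ```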

  13. A continuous optimization approach for inferring parameters in mathematical models of regulatory networks.

    PubMed

    Deng, Zhimin; Tian, Tianhai

    2014-07-29

    The advances of systems biology have given rise to a large number of sophisticated mathematical models for describing the dynamic properties of complex biological systems. One of the major steps in developing mathematical models is to estimate unknown parameters of the model based on experimentally measured quantities. However, experimental conditions limit the amount of data that is available for mathematical modelling. The number of unknown parameters in mathematical models may be larger than the number of observed data points. The imbalance between the number of experimental data and the number of unknown parameters makes reverse-engineering problems particularly challenging. To address the issue of inadequate experimental data, we propose a continuous optimization approach for making reliable inference of model parameters. This approach first uses a spline interpolation to generate continuous functions of system dynamics as well as the first and second order derivatives of continuous functions. The expanded dataset is the basis to infer unknown model parameters using various continuous optimization criteria, including the error of simulation only, error of both simulation and the first derivative, or error of simulation as well as the first and second derivatives. We use three case studies to demonstrate the accuracy and reliability of the proposed new approach. Compared with the corresponding discrete criteria using experimental data at the measurement time points only, numerical results of the ERK kinase activation module show that the continuous absolute-error criteria using both function and high order derivatives generate estimates with better accuracy. This result is also supported by the second and third case studies for the G1/S transition network and the MAP kinase pathway, respectively. This suggests that the continuous absolute-error criteria lead to more accurate estimates than the corresponding discrete criteria. We also study the robustness property of these three models to examine the reliability of estimates. Simulation results show that the models with estimated parameters using continuous fitness functions have better robustness properties than those using the corresponding discrete fitness functions. The inference studies and robustness analysis suggest that the proposed continuous optimization criteria are effective and robust for estimating unknown parameters in mathematical models.
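    The core of the proposed approach, fitting parameters by matching model derivatives to spline derivatives of the data, can be sketched on a toy one-parameter model. In the sketch below, the model dx/dt = -k*x, the noise level, and the data are all invented for illustration; the paper's criteria also weight higher-order derivatives and simulation error.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.optimize import least_squares

    # Synthetic noisy observations of x(t) = 2*exp(-0.7*t).
    t_obs = np.linspace(0.0, 5.0, 12)
    rng = np.random.default_rng(3)
    x_obs = 2.0 * np.exp(-0.7 * t_obs) + rng.normal(0, 0.02, t_obs.size)

    # Spline interpolation provides an expanded, continuous dataset.
    spline = CubicSpline(t_obs, x_obs)
    t_dense = np.linspace(0.0, 5.0, 200)
    x_dense = spline(t_dense)
    dx_dense = spline(t_dense, 1)              # first derivative of the spline

    def residual(theta):
        k = theta[0]
        # Continuous criterion: model derivative must match the spline derivative.
        return dx_dense - (-k * x_dense)

    fit = least_squares(residual, x0=[0.1])
    print(f"estimated k = {fit.x[0]:.3f} (true 0.7)")
    ```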

  14. Multi-Site λ-dynamics for simulated Structure-Activity Relationship studies

    PubMed Central

    Knight, Jennifer L.; Brooks, Charles L.

    2011-01-01

    Multi-Site λ-dynamics (MSλD) is a new free energy simulation method based on λ-dynamics. It has been developed to enable multiple substituents at multiple sites on a common ligand core to be modeled simultaneously and their free energies assessed. The efficacy of MSλD for estimating relative hydration free energies and relative binding affinities is demonstrated using three test systems. Model compounds representing multiple identical benzene, dihydroxybenzene and dimethoxybenzene molecules show that total combined MSλD trajectory lengths of ~1.5 ns are sufficient to reliably achieve relative hydration free energy estimates within 0.2 kcal/mol, and that these estimates are not very sensitive to the number of trajectories used to generate them, for hybrid ligands containing up to ten substituents modeled at a single site or five substituents modeled at each of two sites. Relative hydration free energies among six benzene derivatives calculated from MSλD simulations are in very good agreement with those from alchemical free energy simulations (with average unsigned differences of 0.23 kcal/mol and R2=0.991) and experiment (with average unsigned errors of 1.8 kcal/mol and R2=0.959). Estimates of the relative binding affinities among 14 inhibitors of HIV-1 reverse transcriptase obtained from MSλD simulations are in reasonable agreement with those from traditional free energy simulations and experiment (average unsigned errors of 0.9 kcal/mol and R2=0.402). For the same level of accuracy and precision, MSλD simulations run ~20–50 times faster than traditional free energy simulations and thus, with reliable force field parameters, can be used effectively to screen tens to hundreds of compounds in structure-based drug design applications. PMID:22125476

  15. Free Energy and Heat Capacity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurata, Masaki; Devanathan, Ramaswami

    2015-10-13

    Free energy and heat capacity of actinide elements and compounds are important properties for the evaluation of the safety and reliable performance of nuclear fuel. They are essential inputs for models that describe complex phenomena that govern the behaviour of actinide compounds during nuclear fuel fabrication and irradiation. This chapter introduces various experimental methods to measure free energy and heat capacity to serve as inputs for models and to validate computer simulations. This is followed by a discussion of computer simulation of these properties, and recent simulations of thermophysical properties of nuclear fuel are briefly reviewed.

  16. Estimation and enhancement of real-time software reliability through mutation analysis

    NASA Technical Reports Server (NTRS)

    Geist, Robert; Offutt, A. J.; Harris, Frederick C., Jr.

    1992-01-01

    A simulation-based technique for obtaining numerical estimates of the reliability of N-version, real-time software is presented. An extended stochastic Petri net is employed to represent the synchronization structure of N versions of the software, where dependencies among versions are modeled through correlated sampling of module execution times. Test results utilizing specifications for NASA's planetary lander control software indicate that mutation-based testing could hold greater potential for enhancing reliability than the desirable but perhaps unachievable goal of independence among N versions.

  17. SHEDS-Multimedia Model Version 3 (a) Technical Manual; (b) User Guide; and (c) Executable File to Launch SAS Program and Install Model

    EPA Science Inventory

    Reliable models for assessing human exposures are important for understanding health risks from chemicals. The Stochastic Human Exposure and Dose Simulation model for multimedia, multi-route/pathway chemicals (SHEDS-Multimedia), developed by EPA’s Office of Research and Developm...

  18. Boeing's Dart and Starliner Parachute System Test

    NASA Image and Video Library

    2018-02-22

    Boeing conducted the first in a series of reliability tests of its CST-100 Starliner flight drogue and main parachute system by releasing a long, dart-shaped test vehicle from a C-17 aircraft over Yuma, Arizona. Two more tests are planned using the dart module, as well as three similar reliability tests using a high-fidelity capsule simulator designed to simulate the CST-100 Starliner capsule’s exact shape and mass. In both the dart and capsule simulator tests, the test spacecraft are released at various altitudes to test the parachute system at different deployment speeds, aerodynamic loads, and/or weight demands. Data collected from each test are fed into computer models to more accurately predict parachute performance and to verify consistency from test to test.

  19. Modeling of unit operating considerations in generating-capacity reliability evaluation. Volume 2. Computer-program documentation. Final report. [GENESIS, OPCON and OPPLAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Singh, C.

    1982-07-01

    This report describes the structure and operation of prototype computer programs developed for a Monte Carlo simulation model, GENESIS, and for two analytical models, OPCON and OPPLAN. It includes input data requirements and sample test cases.

  20. How much detail is needed in modeling a transcranial magnetic stimulation figure-8 coil: Measurements and brain simulations

    PubMed Central

    Mandija, Stefano; Sommer, Iris E. C.; van den Berg, Cornelis A. T.; Neggers, Sebastiaan F. W.

    2017-01-01

    Background: Despite the wide adoption of TMS, its spatial and temporal patterns of neuronal effects are not well understood. Although progress has been made in predicting induced currents in the brain using realistic finite element models (FEM), there is little consensus on how the magnetic field of a typical TMS coil should be modeled. Empirical validation of such models is limited and subject to several limitations. Methods: We evaluate and empirically validate models of a figure-of-eight TMS coil that are commonly used in published modeling studies, of increasing complexity: a simple circular coil model; a coil with in-plane spiral winding turns; and finally one with stacked spiral winding turns. We assess the electric fields induced by all three coil models in the motor cortex using a computer FEM model. Biot-Savart models of discretized wires were used to approximate the three coil models of increasing complexity. We use a tailored MR-based phase mapping technique to obtain a full 3D validation of the incident magnetic field induced in a cylindrical phantom by our TMS coil. FEM-based simulations on a meshed 3D brain model consisting of five tissue types were performed, using two orthogonal coil orientations. Results: Substantial differences in the induced currents are observed, both theoretically and empirically, between highly idealized coils and coils with correctly modeled spiral winding turns. The thickness of the coil winding turns affects the induced electric field minimally and does not influence the predicted activation. Conclusion: TMS coil models used in FEM simulations should include the in-plane coil geometry in order to make reliable predictions of the incident field. Modeling the in-plane coil geometry is important to correctly simulate the induced electric field and to make reliable predictions of neuronal activation. PMID:28640923
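    The Biot-Savart treatment of discretized coil windings mentioned in the methods can be sketched compactly. The geometry below (two counter-wound in-plane spirals) and the pulse current are hypothetical stand-ins for a real figure-of-eight coil, and connecting leads between windings are ignored.

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi

    def biot_savart(points: np.ndarray, field_pt: np.ndarray, current: float) -> np.ndarray:
        """Field (T) at field_pt from an open polyline of wire points (m)."""
        b = np.zeros(3)
        for p0, p1 in zip(points[:-1], points[1:]):
            dl = p1 - p0
            r = field_pt - 0.5 * (p0 + p1)      # midpoint-rule segment contribution
            b += MU0 * current * np.cross(dl, r) / (4 * np.pi * np.linalg.norm(r) ** 3)
        return b

    def spiral(cx, r_in, r_out, turns, sense, n=2000):
        """In-plane spiral winding centered at x = cx in the z = 0 plane."""
        th = np.linspace(0.0, 2 * np.pi * turns, n)
        r = np.linspace(r_in, r_out, n)
        return np.column_stack([cx + r * np.cos(sense * th),
                                r * np.sin(sense * th),
                                np.zeros(n)])

    # Hypothetical figure-8: two counter-wound 10-turn wings, 1-4 cm radii, 5 kA.
    left = spiral(-0.045, 0.01, 0.04, 10, +1)
    right = spiral(+0.045, 0.01, 0.04, 10, -1)
    pt = np.array([0.0, 0.0, 0.02])             # 2 cm below the coil plane
    b = biot_savart(left, pt, 5000.0) + biot_savart(right, pt, 5000.0)
    print("B at 2 cm (T):", np.round(b, 3))
    ```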

  1. Determination of output factors for small proton therapy fields.

    PubMed

    Fontenot, Jonas D; Newhauser, Wayne D; Bloch, Charles; White, R Allen; Titt, Uwe; Starkschall, George

    2007-02-01

    Current protocols for the measurement of proton dose focus on measurements under reference conditions; methods for measuring dose under patient-specific conditions have not been standardized. In particular, it is unclear whether dose in patient-specific fields can be determined more reliably with or without the presence of the patient-specific range compensator. The aim of this study was to quantitatively assess the reliability of two methods for measuring dose per monitor unit (D/MU) values for small-field treatment portals: one with the range compensator and one without the range compensator. A Monte Carlo model of the Proton Therapy Center-Houston double-scattering nozzle was created, and estimates of D/MU values were obtained from 14 simulated treatments of a simple geometric patient model. Field-specific D/MU calibration measurements were simulated with a dosimeter in a water phantom with and without the range compensator. D/MU values from the simulated calibration measurements were compared with D/MU values from the corresponding treatment simulation in the patient model. To evaluate the reliability of the calibration measurements, six metrics and four figures of merit were defined to characterize accuracy, uncertainty, the standard deviations of accuracy and uncertainty, worst agreement, and maximum uncertainty. Measuring D/MU without the range compensator provided superior results for five of the six metrics and for all four figures of merit. The two techniques yielded different results primarily because of high-dose gradient regions introduced into the water phantom when the range compensator was present. Estimated uncertainties (approximately 1 mm) in the position of the dosimeter in these regions resulted in large uncertainties and high variability in D/MU values. When the range compensator was absent, these gradients were minimized and D/MU values were less sensitive to dosimeter positioning errors. We conclude that measuring D/MU without the range compensator present provides more reliable results than measuring it with the range compensator in place.

  2. Numerical simulation of three-component multiphase flows at high density and viscosity ratios using lattice Boltzmann methods

    NASA Astrophysics Data System (ADS)

    Haghani Hassan Abadi, Reza; Fakhari, Abbas; Rahimian, Mohammad Hassan

    2018-03-01

    In this paper, we propose a multiphase lattice Boltzmann model for numerical simulation of ternary flows at high density and viscosity ratios free from spurious velocities. The proposed scheme, which is based on the phase-field modeling, employs the Cahn-Hilliard theory to track the interfaces among three different fluid components. Several benchmarks, such as the spreading of a liquid lens, binary droplets, and head-on collision of two droplets in binary- and ternary-fluid systems, are conducted to assess the reliability and accuracy of the model. The proposed model can successfully simulate both partial and total spreadings while reducing the parasitic currents to the machine precision.

  3. Comparison of N2O Emissions from Soils at Three Temperate Agricultural Sites

    NASA Technical Reports Server (NTRS)

    Frolking, S. E.; Moiser, A. R.; Ojima, D. S.; Li, C.; Parton, W. J.; Potter, C. S.; Priesack, E.; Stenger, R.; Haberbosch, C.; Dorsch, P.

    1997-01-01

    Nitrous oxide (N2O) flux simulations by four models were compared with year-round field measurements from five temperate agricultural sites in three countries. The field sites included an unfertilized, semi-arid rangeland with low N2O fluxes in eastern Colorado, USA; two fertilizer treatments (urea and nitrate) on a fertilized grass ley cut for silage in Scotland; and two fertilized, cultivated crop fields in Germany where N2O loss during the winter was quite high. The models used were daily trace gas versions of the CENTURY model, DNDC, ExpertN, and the NASA-Ames version of the CASA model. These models included similar components (soil physics, decomposition, plant growth, and nitrogen transformations), but in some cases used very different algorithms for these processes. All models generated similar results for the general cycling of nitrogen through the agro-ecosystems, but simulated nitrogen trace gas fluxes were quite different. In most cases the simulated N2O fluxes were within a factor of about 2 of the observed annual fluxes, but even when models produced similar N2O fluxes they often produced very different estimates of gaseous N loss as nitric oxide (NO), dinitrogen (N2), and ammonia (NH3). Accurate simulation of soil moisture appears to be a key requirement for reliable simulation of N2O emissions. All models simulated the general pattern of low background fluxes with high fluxes following fertilization at the Scottish sites, but they could not (or were not designed to) accurately capture the observed effects of different fertilizer types on N2O flux. None of the models were able to reliably generate large pulses of N2O during brief winter thaws that were observed at the two German sites. All models except DNDC simulated very low N2O fluxes for the dry site in Colorado. The US Trace Gas Network (TRAGNET) has provided a mechanism for this model and site intercomparison. Additional intercomparisons are needed with these and other models and additional data sets; these should include both tropical agro-ecosystems and new agricultural management techniques designed for sustainability.

  4. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 4: HARP Output (HARPO) graphics display user's guide

    NASA Technical Reports Server (NTRS)

    Sproles, Darrell W.; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of highly reliable fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.

  5. Relations between winter precipitation and atmospheric circulation simulated by the Geophysical Fluid Dynamics Laboratory general circulation model

    USGS Publications Warehouse

    McCabe, G.J.; Dettinger, M.D.

    1995-01-01

    General circulation model (GCM) simulations of atmospheric circulation are more reliable than GCM simulations of temperature and precipitation. In this study, temporal correlations between 700 hPa height anomalies and simulated winter precipitation at eight locations in the conterminous United States are compared with corresponding correlations in observations. The objectives are to (i) characterize the relations between atmospheric circulation and winter precipitation simulated by the GFDL GCM for selected locations in the conterminous USA, (ii) determine whether these relations are similar to those found in observations of the actual climate system, and (iii) determine whether GFDL-simulated precipitation is forced by the same circulation patterns as in the real atmosphere. -from Authors
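    The basic comparison, computing the temporal correlation between a circulation anomaly and winter precipitation separately for observations and for the GCM, can be sketched with synthetic series standing in for the real records; the coefficients below are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Synthetic 40-winter records: a 700 hPa height-anomaly index and precipitation.
    years = 40
    index_obs = rng.normal(size=years)                      # observed height anomaly
    precip_obs = -0.6 * index_obs + rng.normal(0, 0.8, years)
    index_gcm = rng.normal(size=years)                      # GCM height anomaly
    precip_gcm = -0.4 * index_gcm + rng.normal(0, 0.9, years)

    # Temporal correlations, computed the same way for model and observations.
    r_obs = np.corrcoef(index_obs, precip_obs)[0, 1]
    r_gcm = np.corrcoef(index_gcm, precip_gcm)[0, 1]
    print(f"observed r = {r_obs:.2f}, GCM r = {r_gcm:.2f}")
    ```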

  6. Investigations of FAK inhibitors: a combination of 3D-QSAR, docking, and molecular dynamics simulations studies.

    PubMed

    Cheng, Peng; Li, Jiaojiao; Wang, Juan; Zhang, Xiaoyun; Zhai, Honglin

    2018-05-01

    Focal adhesion kinase (FAK) is one kind of tyrosine kinases that modulates integrin and growth factor signaling pathways, which is a promising therapeutic target because of involving in cancer cell migration, proliferation, and survival. To investigate the mechanism between FAK and triazinic inhibitors and design high activity inhibitors, a molecular modeling integrated with 3D-QSAR, molecular docking, molecular dynamics simulations, and binding free energy calculations was performed. The optimum CoMFA and CoMSIA models showed good reliability and satisfactory predictability (with Q 2  = 0.663, R 2  = 0.987, [Formula: see text] = 0.921 and Q 2  = 0.670, R 2  = 0.981, [Formula: see text] = 0.953). Its contour maps could provide structural features to improve inhibitory activity. Furthermore, a good consistency between contour maps, docking, and molecular dynamics simulations strongly demonstrates that the molecular modeling is reliable. Based on it, we designed several new compounds and their inhibitory activities were validated by the molecular models. We expect our studies could bring new ideas to promote the development of novel inhibitors with higher inhibitory activity for FAK.

  7. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Weatherbee, J. E.; Taylor, D. S.

    1972-01-01

    A deterministic digital simulation model is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Use of the model as a tool in configuring a minimum computer system for a typical mission is demonstrated. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources, i.e., the configuration derived is a minimal one. Other considerations such as increased reliability through the use of standby spares would be taken into account in the definition of a practical system for a given mission.

  8. Physical activity into the meal glucose-insulin model of type 1 diabetes: in silico studies.

    PubMed

    Man, Chiara Dalla; Breton, Marc D; Cobelli, Claudio

    2009-01-01

    A simulation model of a glucose-insulin system accounting for physical activity is needed to reliably simulate normal life conditions, thus accelerating the development of an artificial pancreas. In fact, exercise causes a transient increase of insulin action and may lead to hypoglycemia. However, physical activity is difficult to model. In the past, it was described indirectly as a rise in insulin. Recently, a new parsimonious model of exercise effect on glucose homeostasis has been proposed that links the change in insulin action and glucose effectiveness to heart rate (HR). The aim of this study was to plug this exercise model into our recently proposed large-scale simulation model of glucose metabolism in type 1 diabetes to better describe normal life conditions. The exercise model describes changes in glucose-insulin dynamics in two phases: a rapid on-and-off change in insulin-independent glucose clearance and a rapid-on/slow-off change in insulin sensitivity. Three candidate models of glucose effectiveness and insulin sensitivity as a function of HR have been considered, both during exercise and recovery after exercise. By incorporating these three models into the type 1 diabetes model, we simulated different levels (from mild to moderate) and duration of exercise (15 and 30 minutes), both in steady-state (e.g., during euglycemic-hyperinsulinemic clamp) and in nonsteady state (e.g., after a meal) conditions. One candidate exercise model was selected as the most reliable. A type 1 diabetes model also describing physical activity is proposed. The model represents a step forward to accurately describe glucose homeostasis in normal life conditions; however, further studies are needed to validate it against data. © Diabetes Technology Society
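    A hedged sketch of the rapid-on/slow-off behavior described above: a first-order filter on a heart-rate excess drives an insulin-sensitivity multiplier, with a short time constant during exercise and a long one in recovery. The functional form and every constant are assumptions for illustration, not the published model.

    ```python
    import numpy as np

    dt = 1.0                      # min
    t_end = 180
    hr_rest, hr_ex = 70.0, 120.0

    def heart_rate(t):            # hypothetical 30-minute bout starting at t = 60 min
        return hr_ex if 60 <= t < 90 else hr_rest

    si_gain = 0.0
    for t in np.arange(0, t_end, dt):
        above = max(heart_rate(t) - hr_rest, 0.0) / hr_rest
        tau = 5.0 if above > 0 else 60.0   # rapid-on, slow-off time constants (assumed)
        si_gain += dt * (above - si_gain) / tau
        if t % 30 == 0:
            print(f"t={t:5.0f} min  insulin-sensitivity multiplier = {1 + si_gain:.2f}")
    ```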

  9. Hydrological simulation of Sperchios River basin in Central Greece using the MIKE SHE model and geographic information systems

    NASA Astrophysics Data System (ADS)

    Paparrizos, Spyridon; Maris, Fotios

    2017-05-01

    The MIKE SHE model is able to simulate the entire stream flow, including both direct runoff and base flow; many models either do not simulate the base flow or use simplistic methods to determine it. The MIKE SHE model takes into account a wide range of hydrological data. Since this study was directed towards the simulation of surface runoff and of infiltration into the saturated and unsaturated zones, MIKE SHE is an appropriate model for drawing reliable conclusions. In the current research, the MIKE SHE model was used to simulate runoff in the Sperchios River basin. Meteorological data from eight rainfall stations within the basin were used as inputs. Vegetation and geological data were used to perform the calibration and validation of the physical processes of the model. Additionally, the ArcGIS program was used. The results indicated that the model was able to simulate the surface runoff satisfactorily, representing all the hydrological data adequately. Some minor discrepancies appeared, which can be eliminated with appropriate adjustments guided by the researcher's experience.

  10. An assessment of the reliability of quantitative genetics estimates in study systems with high rate of extra-pair reproduction and low recruitment.

    PubMed

    Bourret, A; Garant, D

    2017-03-01

    Quantitative genetics approaches, and particularly animal models, are widely used to assess the genetic (co)variance of key fitness-related traits and to infer the adaptive potential of wild populations. Despite the importance of precision and accuracy in genetic variance estimates, and their potential sensitivity to various ecological and population-specific factors, their reliability is rarely tested explicitly. Here, we used simulations and empirical data collected from an 11-year study on the tree swallow (Tachycineta bicolor), a species showing a high rate of extra-pair paternity and a low recruitment rate, to assess the importance of identity errors and of the structure and size of the pedigree for quantitative genetic estimates in our dataset. Our simulations revealed an important lack of precision in heritability and genetic-correlation estimates for most traits, a low power to detect significant effects and important identifiability problems. We also observed a large bias in heritability estimates when using the social pedigree instead of the genetic one (deflated heritabilities) or when not accounting for an important cause of resemblance among individuals (for example, a permanent environment or brood effect) in model parameterizations for some traits (inflated heritabilities). We discuss the causes underlying the low reliability observed here and why they are likely to occur in other study systems as well. Altogether, our results re-emphasize the difficulty of reliably generalizing quantitative genetic estimates from one study system to another and the importance of reporting simulation analyses to evaluate these important issues.

  11. Reliability analysis using an exponential power model with bathtub-shaped failure rate function: a Bayes study.

    PubMed

    Shehla, Romana; Khan, Athar Ali

    2016-01-01

    Models with a bathtub-shaped hazard function have been widely accepted in the fields of reliability and medicine and are particularly useful in reliability-related decision making and cost analysis. In this paper, the exponential power model, capable of assuming an increasing as well as a bathtub-shaped hazard rate, is studied. This article makes a Bayesian study of the model and simultaneously shows how posterior simulations based on Markov chain Monte Carlo algorithms can be straightforward and routine in R. The study is carried out for complete as well as censored data, under the assumption of weakly informative priors for the parameters. In addition, inferential interest focuses on the posterior distribution of non-linear functions of the parameters. The model has also been extended to include continuous explanatory variables, and the R code is well illustrated. Two real data sets are considered for illustrative purposes.
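
    A minimal sketch of such a Bayesian fit, assuming the Smith-Bain form of the exponential power model with survival S(t) = exp(1 - exp((λt)^α)) (bathtub-shaped hazard for α < 1), hypothetical failure-time data and hypothetical weakly informative priors; the paper itself works in R, while this sketch uses Python:

    ```python
    # Sketch: random-walk Metropolis for the exponential power model.
    # Data, priors and proposal scales are all illustrative assumptions.
    import numpy as np

    t = np.array([0.1, 0.2, 1.0, 1.5, 2.1, 2.8, 3.0, 3.4, 3.9, 4.2])  # hypothetical

    def log_post(a, lam):
        if a <= 0 or lam <= 0:
            return -np.inf
        u = (lam * t) ** a
        loglik = np.sum(np.log(a) + a * np.log(lam) + (a - 1) * np.log(t)
                        + u + 1 - np.exp(u))
        # weakly informative half-normal priors on a and lam (an assumption)
        return loglik - 0.5 * (a / 10) ** 2 - 0.5 * (lam / 10) ** 2

    rng = np.random.default_rng(1)
    a, lam, samples = 1.0, 0.5, []
    lp = log_post(a, lam)
    for i in range(20000):
        a_p, lam_p = a + 0.1 * rng.normal(), lam + 0.05 * rng.normal()
        lp_p = log_post(a_p, lam_p)
        if np.log(rng.uniform()) < lp_p - lp:        # Metropolis accept/reject
            a, lam, lp = a_p, lam_p, lp_p
        if i >= 5000:                                # discard burn-in
            samples.append((a, lam))

    post = np.array(samples)
    print("posterior means: a = %.2f, lam = %.2f" % tuple(post.mean(axis=0)))
    ```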

  12. Thermal–hydraulic–mechanical modeling of a large-scale heater test to investigate rock salt and crushed salt behavior under repository conditions for heat-generating nuclear waste

    DOE PAGES

    Blanco-Martín, Laura; Wolters, Ralf; Rutqvist, Jonny; ...

    2016-04-28

    The Thermal Simulation for Drift Emplacement heater test is modeled with two simulators for coupled thermal-hydraulic-mechanical processes. Results from the two simulators are in very good agreement. The comparison between measurements and numerical results is also very satisfactory, regarding temperature, drift closure and rock deformation. Concerning backfill compaction, a parameter calibration through inverse modeling was performed due to insufficient data on crushed salt reconsolidation, particularly at high temperatures. We conclude that the two simulators investigated have the capabilities to reproduce the data available, which increases confidence in their use to reliably investigate disposal of heat-generating nuclear waste in saliferous geosystems.

  13. Thermal–hydraulic–mechanical modeling of a large-scale heater test to investigate rock salt and crushed salt behavior under repository conditions for heat-generating nuclear waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanco-Martín, Laura; Wolters, Ralf; Rutqvist, Jonny

    The Thermal Simulation for Drift Emplacement heater test is modeled with two simulators for coupled thermal-hydraulic-mechanical processes. Results from the two simulators are in very good agreement. The comparison between measurements and numerical results is also very satisfactory, regarding temperature, drift closure and rock deformation. Concerning backfill compaction, a parameter calibration through inverse modeling was performed due to insufficient data on crushed salt reconsolidation, particularly at high temperatures. We conclude that the two simulators investigated have the capabilities to reproduce the data available, which increases confidence in their use to reliably investigate disposal of heat-generating nuclear waste in saliferous geosystems.

  14. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors differs significantly from that in power reactors, a time-oriented HRA model (a reliability physics model) was applied to estimate the human error probability (HEP) of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. Response surfaces and direct Monte Carlo simulation with Latin hypercube sampling were applied to estimate the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was selected by statistical goodness-of-fit tests, and the HEP for the core relocation was estimated from the two competing quantities. The sensitivity of each probability distribution in the human reliability estimation was investigated. To quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected because of its capability of incorporating uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach, and both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty through sensitivity analysis in the PRA model.
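
    The core of the reliability physics model described above is the probability that the operators' performance time exceeds the phenomenological time. A minimal Monte Carlo sketch, with both distributions and all parameter values as hypothetical placeholders for the dissertation's fitted values:

    ```python
    # Sketch: HEP = P(performance time > phenomenological time) by Monte Carlo.
    # Distribution choices and parameters are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    # phenomenological time: time until core damage becomes unavoidable (min)
    t_phen = rng.normal(loc=45.0, scale=8.0, size=n)
    # operator performance time for core relocation (lognormal, assumed)
    t_perf = rng.lognormal(mean=np.log(25.0), sigma=0.5, size=n)

    hep = np.mean(t_perf > t_phen)
    se = np.sqrt(hep * (1 - hep) / n)          # Monte Carlo standard error
    print(f"HEP = {hep:.4f} +/- {se:.4f}")
    ```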

  15. Engine System Model Development for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Nelson, Karl W.; Simpson, Steven P.

    2006-01-01

    In order to design, analyze, and evaluate conceptual Nuclear Thermal Propulsion (NTP) engine systems, an improved NTP design and analysis tool has been developed. The NTP tool utilizes the Rocket Engine Transient Simulation (ROCETS) system tool and many of the routines from the Enabler reactor model found in the Nuclear Engine System Simulation (NESS). Improved non-nuclear component models and an external shield model were added to the tool. With the addition of a nearly complete system reliability model, the tool will provide performance, sizing, and reliability data for NERVA-derived NTP engine systems. A new detailed reactor model is also being developed to replace Enabler; the new model will allow more flexibility in reactor geometry and include detailed thermal hydraulics and neutronics models. A description of the reactor, component, and reliability models is provided. Another key feature of the modeling process is the use of comprehensive spreadsheets for each engine case. The spreadsheets include individual worksheets for each subsystem with data, plots, and scaled figures, making the output very useful to each engineering discipline. Sample performance and sizing results with the Enabler reactor model are provided, including sensitivities. Before selecting an engine design, all figures of merit must be considered, including the overall impacts on the vehicle and mission. Evaluations of these results, and of results with the new reactor model, will be performed based on the key figures of merit; the impacts of clustering and external shielding will also be addressed. Over time, the reactor model will be upgraded to design and analyze other NTP concepts with CERMET and carbide fuel cores.

  16. Superior model for fault tolerance computation in designing nano-sized circuit systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, N. S. S., E-mail: narinderjit@petronas.com.my; Muthuvalu, M. S., E-mail: msmuthuvalu@gmail.com; Asirvadam, V. S., E-mail: vijanth-sagayan@petronas.com.my

    2014-10-24

    As CMOS technology scales nano-metrically, reliability turns out to be a decisive subject in the design methodology of nano-sized circuit systems. As a result, several computational approaches have been developed to compute and evaluate the reliability of desired nano-electronic circuits. The process of computing reliability becomes very troublesome and time consuming as the computational complexity builds up with the desired circuit size. Therefore, being able to measure reliability quickly and accurately is fast becoming necessary in designing modern logic integrated circuits. For this purpose, the paper first looks into the development of an automated reliability evaluation tool based on the generalization of the Probabilistic Gate Model (PGM) and Boolean Difference-based Error Calculator (BDEC) models. The Matlab-based tool allows users to significantly speed up the task of reliability analysis for very large numbers of nano-electronic circuits. Second, using the developed automated tool, the paper presents a comparative study of reliability computation and evaluation by the PGM and BDEC models for different implementations of same-functionality circuits. Based on the reliability analysis, BDEC gives exact and transparent reliability measures, but as the complexity of the same-functionality circuits with respect to gate error increases, the reliability measure by BDEC tends to be lower than that by PGM. The lower reliability measure by BDEC is explained in this paper using the distribution of different signal input patterns over time for same-functionality circuits. Simulation results conclude that the reliability measure by BDEC depends not only on faulty gates but also on circuit topology, the probability of input signals being one or zero, and the probability of error on signal lines.
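
    A minimal sketch of the Probabilistic Gate Model (PGM) idea referred to above: each gate output is flipped with some error probability, signal-'1' probabilities are propagated through the netlist, and reliability for a fixed input vector is the probability that the noisy output matches the fault-free output. The two-gate circuit and the error probability are hypothetical:

    ```python
    # Sketch: PGM-style reliability of NAND(NAND(x1,x2), x3), averaged over all
    # input vectors. The circuit and per-gate error probability are assumptions.
    from itertools import product

    EPS = 0.05                                  # per-gate error probability, assumed

    def nand(p1, p2, eps):
        """'1'-probability of a noisy NAND given independent input '1'-probabilities."""
        p = 1.0 - p1 * p2                       # ideal NAND
        return (1.0 - eps) * p + eps * (1.0 - p)  # output flipped with prob eps

    def out_prob(x1, x2, x3, eps):
        return nand(nand(x1, x2, eps), x3, eps)

    rel = 0.0
    for bits in product([0.0, 1.0], repeat=3):  # average over all input vectors
        ideal = out_prob(*bits, eps=0.0)        # fault-free output: exactly 0 or 1
        noisy = out_prob(*bits, eps=EPS)
        rel += (noisy if ideal == 1.0 else 1.0 - noisy) / 8.0
    print(f"mean circuit reliability = {rel:.4f}")
    ```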

  17. Improving rice models for more reliable prediction of responses of rice yield to CO2 and temperature elevation

    USDA-ARS?s Scientific Manuscript database

    Materials and Methods: The simulation exercise and model improvement were implemented phase-wise. In the first modelling activities, the model sensitivities were evaluated for given CO2 concentrations varying from 360 to 720 µmol mol-1 at intervals of 90 µmol mol-1 and air temperature increments...

  18. Spatial surplus production modeling of Atlantic tunas and billfish.

    PubMed

    Carruthers, Thomas R; McAllister, Murdoch K; Taylor, Nathan G

    2011-10-01

    We formulate and simulation-test a spatial surplus production model that provides a basis with which to undertake multispecies, multi-area stock assessment. Movement between areas is parameterized using a simple gravity model that includes a "residency" parameter that determines the degree of stock mixing among areas. The model is deliberately simple in order to (1) accommodate nontarget species that typically have fewer available data and (2) minimize computational demand to enable simulation evaluation of spatial management strategies. Using this model, we demonstrate that careful consideration of spatial catch and effort data can provide the basis for simple yet reliable spatial stock assessments. If simple spatial dynamics can be assumed, tagging data are not required to reliably estimate spatial distribution and movement. When applied to eight stocks of Atlantic tuna and billfish, the model tracks regional catch data relatively well by approximating local depletions and exchange among high-abundance areas. We use these results to investigate and discuss the implications of using spatially aggregated stock assessment for fisheries in which the distribution of both the population and fishing vary over time.
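
    A minimal sketch of a gravity movement model with a residency parameter of the kind described above: fish redistribute in proportion to area attractiveness, with extra weight on remaining in the current area. The attractiveness values and residency weight are illustrative, not the paper's estimates:

    ```python
    # Sketch: row-stochastic movement matrix from gravity weights plus residency.
    import numpy as np

    g = np.array([1.0, 2.0, 0.5, 1.5])         # hypothetical area attractiveness
    resid = 4.0                                # residency weight (>0 = stickier)

    n = len(g)
    W = np.tile(g, (n, 1))                     # weight of moving from area i to j
    W[np.diag_indices(n)] += resid             # extra weight on staying in area i
    P = W / W.sum(axis=1, keepdims=True)       # row-stochastic movement matrix

    N = np.full(n, 100.0)                      # initial numbers per area
    for _ in range(50):                        # iterate towards mixed equilibrium
        N = N @ P
    print(np.round(N, 1))
    ```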

  19. Reliability of analog quantum simulation

    NASA Astrophysics Data System (ADS)

    Sarovar, Mohan; Zhang, Jun; Zeng, Lishan

    Analog quantum simulators (AQS) will likely be the first nontrivial application of quantum technology for predictive simulation. However, questions remain regarding the degree of confidence that can be placed in the results of AQS, since they do not naturally incorporate error correction. We formalize the notion of AQS reliability with respect to calibration errors by determining the sensitivity of AQS outputs to underlying parameters, and we formulate conditions for robust simulation. Our approach connects to the notion of parameter space compression in statistical physics and naturally reveals the importance of model symmetries in dictating the robust properties. This work was supported by the Laboratory Directed Research and Development program at Sandia National Laboratories. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the United States Department of Energy's National Nuclear Security Administration under Contract No. DE-AC04-94AL85000.

  20. Coupling model of aerobic waste degradation considering temperature, initial moisture content and air injection volume.

    PubMed

    Ma, Jun; Liu, Lei; Ge, Sai; Xue, Qiang; Li, Jiangshan; Wan, Yong; Hui, Xinminnan

    2018-03-01

    A quantitative description of aerobic waste degradation is important in evaluating landfill waste stability and economical management. This research aimed to develop a coupling model to predict the degree of aerobic waste degradation. On the basis of a first-order kinetic equation and the law of conservation of mass, we developed a coupling model of aerobic waste degradation that considers temperature, initial moisture content and air injection volume to simulate and predict the chemical oxygen demand (COD) in the leachate. Three different laboratory experiments on aerobic waste degradation were simulated to test the model's applicability, and parameter sensitivity analyses were conducted to evaluate the reliability of the parameters. The coupling model can simulate aerobic waste degradation, and the simulations agreed with the corresponding experimental results. The comparison of experiment and simulation demonstrated that the coupling model is a new approach to predicting aerobic waste degradation and can serve as a basis for selecting an economical air injection volume and appropriate management in the future.
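
    An illustrative sketch (not the paper's calibrated equations) of a first-order degradation model whose rate constant is modulated by temperature, moisture and air supply, predicting leachate COD decay; all correction factors and parameter values are assumptions:

    ```python
    # Sketch: dCOD/dt = -k(T, w, Q) * COD, with multiplicative correction factors.
    k20 = 0.02            # base first-order rate at 20 C (1/day), assumed
    theta = 1.06          # Arrhenius-type temperature correction, assumed

    def rate(T, moisture, q_air):
        f_T = theta ** (T - 20.0)                   # temperature factor
        f_w = moisture / (0.1 + moisture)           # Monod-type moisture factor
        f_a = q_air / (5.0 + q_air)                 # air-supply limitation factor
        return k20 * f_T * f_w * f_a

    cod, dt = 20000.0, 0.25                         # mg/L, day
    for _ in range(int(60 / dt)):                   # 60-day aerobic run (Euler)
        cod -= dt * rate(T=45.0, moisture=0.55, q_air=10.0) * cod
    print(f"COD after 60 d: {cod:.0f} mg/L")
    ```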

  1. Probabilistic modelling of overflow, surcharge and flooding in urban drainage using the first-order reliability method and parameterization of local rain series.

    PubMed

    Thorndahl, S; Willems, P

    2008-01-01

    Failure of urban drainage systems may occur due to surcharge or flooding at specific manholes in the system, or due to overflows from combined sewer systems to receiving waters. To quantify the probability or return period of failure, standard approaches make use of the simulation of design storms or long historical rainfall series in a hydrodynamic model of the urban drainage system. In this paper, an alternative probabilistic method is investigated: the first-order reliability method (FORM). To apply this method, a long rainfall time series was divided into rainstorms (rain events), and each rainstorm was conceptualized as a synthetic hyetograph of Gaussian shape with three parameters: rainstorm depth, duration and peak intensity. Probability distributions were calibrated for these three parameters and used as the basis for the failure probability estimation, together with a hydrodynamic simulation model that determines the failure conditions for each parameter set. The method takes into account the uncertainties involved in the rainstorm parameterization. Comparison is made between the failure probability results of the FORM method, the standard method using long-term simulations, and alternative methods based on random sampling (direct Monte Carlo sampling and importance sampling). It is concluded that, without crucial loss of modelling accuracy, FORM is very applicable as an alternative to traditional long-term simulations of urban drainage systems.
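
    A minimal FORM sketch in the spirit of the method described above: the three rainstorm parameters are mapped from standard normal space to physical space, a design point is found by minimizing ||u|| on the failure surface, and the failure probability follows from the reliability index β. The limit-state function stands in for the calibrated hydrodynamic model, and all distribution parameters are hypothetical:

    ```python
    # Sketch: first-order reliability method (FORM) for a surcharge limit state.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    # lognormal rainstorm depth (mm), duration (min), peak intensity (mm/h)
    mu = np.array([2.5, 4.0, 2.0])      # means of the log-variables, assumed
    sig = np.array([0.6, 0.5, 0.7])     # std devs of the log-variables, assumed

    def g(u):
        """Limit state: failure (surcharge) when g < 0. Stand-in for the model."""
        depth, dur, peak = np.exp(mu + sig * u)   # standard normal -> physical
        return 55.0 - (0.4 * depth + 0.05 * peak * np.sqrt(dur))

    # design point: minimize ||u||^2 subject to g(u) = 0
    res = minimize(lambda u: u @ u, x0=np.zeros(3),
                   constraints={"type": "eq", "fun": g})
    beta = np.sqrt(res.fun)             # Hasofer-Lind reliability index
    print(f"beta = {beta:.2f}, Pf = {norm.cdf(-beta):.2e} per rainstorm")
    ```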

  2. 7 CFR 400.705 - Contents required for a new submission or changes to a previously approved submission.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... information from market research, producers or producer groups, agents, lending institutions, and other... reliability of the data; (5) An analysis of the results of simulations or modeling showing the performance of proposed rates and commodity prices, as applicable, based on one or more of the following (Such simulations...

  3. 7 CFR 400.705 - Contents required for a new submission or changes to a previously approved submission.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... information from market research, producers or producer groups, agents, lending institutions, and other... reliability of the data; (5) An analysis of the results of simulations or modeling showing the performance of proposed rates and commodity prices, as applicable, based on one or more of the following (Such simulations...

  4. 7 CFR 400.705 - Contents required for a new submission or changes to a previously approved submission.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... information from market research, producers or producer groups, agents, lending institutions, and other... reliability of the data; (5) An analysis of the results of simulations or modeling showing the performance of proposed rates and commodity prices, as applicable, based on one or more of the following (Such simulations...

  5. 7 CFR 400.705 - Contents required for a new submission or changes to a previously approved submission.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... information from market research, producers or producer groups, agents, lending institutions, and other... reliability of the data; (5) An analysis of the results of simulations or modeling showing the performance of proposed rates and commodity prices, as applicable, based on one or more of the following (Such simulations...

  6. High Performance Parallel Computational Nanotechnology

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Craw, James M. (Technical Monitor)

    1995-01-01

    At a recent press conference, NASA Administrator Dan Goldin encouraged NASA Ames Research Center to take a lead role in promoting research and development of advanced, high-performance computer technology, including nanotechnology. Manufacturers of leading-edge microprocessors currently perform large-scale simulations in the design and verification of semiconductor devices and microprocessors. Recently, the need for this intensive simulation and modeling analysis has greatly increased, due in part to the ever-increasing complexity of these devices, as well as the lessons of experiences such as the Pentium fiasco. Simulation, modeling, testing, and validation will be even more important for designing molecular computers because of the complex specification of millions of atoms and thousands of assembly steps, as well as the simulation and modeling needed to ensure reliable, robust and efficient fabrication of the molecular devices. The software for this capability does not exist today, but it can be extrapolated from the software currently used in molecular modeling for other applications: semi-empirical methods, ab initio methods, self-consistent field methods, Hartree-Fock methods, molecular mechanics, and simulation methods for diamondoid structures. Inasmuch as it seems clear that the application of such methods in nanotechnology will require highly powerful computing systems, this talk will discuss techniques and issues for performing these types of computations on parallel systems. We will describe system design issues (memory, I/O, mass storage, operating system requirements, special user interface issues, interconnects, bandwidths, and programming languages) involved in parallel methods for scalable classical, semiclassical, quantum, molecular mechanics, and continuum models; molecular nanotechnology computer-aided design (NanoCAD) techniques; visualization of structural models and assembly sequences using virtual reality techniques; software required to control mini robotic manipulators for positional control; and scalable numerical algorithms for reliability, verification and testability. There appears to be no fundamental obstacle to simulating molecular compilers and molecular computers on high performance parallel computers, just as the Boeing 777 was simulated on a computer before manufacturing it.

  7. Using Modeling and Simulation to Complement Testing for Increased Understanding of Weapon Subassembly Response.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Michael K.; Davidson, Megan

    As part of Sandia's nuclear deterrence mission, the B61-12 Life Extension Program (LEP) aims to modernize the aging weapon system. Modernization requires requalification, and Sandia is using high performance computing to perform advanced computational simulations to better understand, evaluate, and verify weapon system performance in conjunction with limited physical testing. The Nose Bomb Subassembly (NBSA) of the B61-12 is responsible for producing a fuzing signal upon ground impact. The fuzing signal depends on electromechanical impact sensors producing valid electrical fuzing signals at impact. Computer-generated models were used to assess the timing between the impact sensors' response to the deceleration of impact and damage to major components and system subassemblies. The modeling and simulation team worked alongside the physical test team to design a large-scale reverse ballistic test, not only to assess system performance but also to validate their computational models. The reverse ballistic test conducted at Sandia's sled test facility sent a rocket sled with a representative target into a stationary B61-12 NBSA to characterize the nose crush and functional response of NBSA components. Data obtained from data recorders and high-speed photometrics were integrated with previously generated computer models in order to refine and validate the models' ability to reliably simulate real-world effects. Large-scale tests are impractical to conduct for every single impact scenario. By creating reliable computer models, we can perform simulations that identify trends and produce estimates of outcomes over the entire range of required impact conditions. Sandia's HPCs enable geometric resolution that was unachievable before, allowing for more fidelity and detail, and creating simulations that can provide insight to support evaluation of requirements and performance margins. As computing resources continue to improve, researchers at Sandia are hoping to improve these simulations so they provide increasingly credible analysis of the system response and performance over the full range of conditions.

  8. Research on Novel Algorithms for Smart Grid Reliability Assessment and Economic Dispatch

    NASA Astrophysics Data System (ADS)

    Luo, Wenjin

    In this dissertation, several studies of methods for assessing electric power system reliability and economy are presented; more precisely, several algorithms for evaluating power system reliability and economy are studied, two novel algorithms are applied to this field, and their simulation results are compared with conventional results. As electrical power systems develop towards extra-high voltage, long transmission distances, large capacity and regional networking, a number of new technical equipment items and the electricity market system have gradually been established, and the consequences of power outages have become more and more serious. The electrical power system requires the highest possible reliability because of its complexity and security demands. In this dissertation, the Boolean logic Driven Markov Process (BDMP) method is studied and applied to evaluate power system reliability. This approach has several benefits: it allows complex dynamic models to be defined while remaining as readable as conventional methods. The method has been applied to evaluate the IEEE reliability test system, and the simulation results obtained are close to the IEEE experimental data, which means it could be used for future studies of system reliability. Besides reliability, a modern power system is expected to be more economical. This dissertation therefore presents a novel evolutionary algorithm, the quantum evolutionary membrane algorithm (QEPS), which combines the concepts and theory of quantum-inspired evolutionary algorithms and membrane computing, to solve the economic dispatch problem in a renewable power system with onshore and offshore wind farms. A case derived from real data is used for simulation tests, and a conventional evolutionary algorithm is used to solve the same problem for comparison. The experimental results show that the proposed method quickly and accurately obtains the optimal solution, that is, the minimum cost of electricity supplied by the wind farm system.

  9. Reliability analysis in interdependent smart grid systems

    NASA Astrophysics Data System (ADS)

    Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong

    2018-06-01

    Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems, studying the underlying network model, the interactions and relationships among components, and how cascading failures occur in interdependent smart grid systems, and we propose a practical model for such systems using complex network theory. Based on percolation theory, we also study the effect of cascading failures and present a detailed mathematical analysis of failure propagation in these systems. We analyze the reliability of our proposed model under random attacks or failures by calculating the size of the giant functioning component in the interdependent smart grid systems. Our simulation results show that there exists a threshold for the proportion of faulty nodes beyond which the smart grid systems collapse, and we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
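
    A minimal percolation sketch of the kind of experiment described above: remove a random fraction f of nodes and track the relative size of the giant component, which collapses beyond a critical threshold. A single Erdős-Rényi layer stands in for the full interdependent smart grid model:

    ```python
    # Sketch: random-attack percolation on a single network layer.
    import random
    import networkx as nx

    random.seed(0)
    G0 = nx.gnp_random_graph(2000, 4.0 / 2000, seed=0)   # mean degree ~4

    for f in [0.0, 0.3, 0.5, 0.7, 0.8, 0.9]:
        G = G0.copy()
        G.remove_nodes_from(random.sample(list(G0.nodes),
                                          int(f * G0.number_of_nodes())))
        giant = max(nx.connected_components(G), key=len) if G.number_of_nodes() else set()
        print(f"f = {f:.1f}: giant component fraction = {len(giant) / 2000:.3f}")
    ```

    For an Erdős-Rényi layer with mean degree 4, the giant component vanishes near f = 0.75; coupling two such layers, as in the paper's interdependent model, makes the collapse more abrupt.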

  10. Probabilistic Design of a Wind Tunnel Model to Match the Response of a Full-Scale Aircraft

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Stroud, W. Jefferson; Krishnamurthy, T.; Spain, Charles V.; Naser, Ahmad S.

    2005-01-01

    An approach is presented for carrying out the reliability-based design of a plate-like wing that is part of a wind tunnel model. The goal is to design the wind tunnel model to match the stiffness characteristics of the wing box of a flight vehicle while satisfying strength-based risk/reliability requirements that prevent damage to the wind tunnel model and fixtures. The flight vehicle is a modified F/A-18 aircraft. The design problem is solved using reliability-based optimization techniques. The objective function to be minimized is the difference between the displacements of the wind tunnel model and the corresponding displacements of the flight vehicle. The design variables control the thickness distribution of the wind tunnel model. Displacements of the wind tunnel model change with the thickness distribution, while displacements of the flight vehicle are a set of fixed data. The only constraint imposed is that the probability of failure be less than a specified value. Failure is assumed to occur if the stress caused by aerodynamic pressure loading is greater than the specified strength allowable. Two uncertain quantities are considered: the allowable stress and the thickness distribution of the wind tunnel model. Reliability is calculated using Monte Carlo simulation with response surfaces that provide approximate values of stresses. The response surface equations are, in turn, computed from finite element analyses of the wind tunnel model at specified design points. Because the response surface approximations are fit over a small region centered about the current design, the response surfaces are refit periodically as the design variables change. Coarse-grained parallelism is used to simultaneously perform multiple finite element analyses. Studies carried out in this paper demonstrate that this scheme of moving response surfaces and coarse-grained computational parallelism reduces the execution time of the Monte Carlo simulation enough to make the design problem tractable. The results of the reliability-based designs performed in this paper show that large decreases in the probability of stress-based failure can be realized with only small sacrifices in the ability of the wind tunnel model to represent the displacements of the full-scale vehicle.
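
    A minimal sketch of the response-surface Monte Carlo idea described above: a quadratic surface is fitted to a handful of expensive stress evaluations near the current design, and the failure probability is then estimated by sampling the cheap surface. The stress function and random-variable parameters are hypothetical stand-ins for the finite element model:

    ```python
    # Sketch: Monte Carlo failure probability on a quadratic response surface.
    import numpy as np

    rng = np.random.default_rng(2)

    def fe_stress(thickness):
        """Stand-in for an expensive FE analysis: stress vs. thickness."""
        return 180.0 / thickness ** 2 + 5.0 * thickness

    # fit a quadratic response surface from a few "FE runs" near the design point
    t_pts = np.linspace(0.8, 1.2, 7)
    coef = np.polyfit(t_pts, fe_stress(t_pts), deg=2)

    # Monte Carlo over the two uncertain quantities: thickness and allowable
    n = 500_000
    t = rng.normal(1.0, 0.03, n)                # thickness tolerance, assumed
    allow = rng.normal(260.0, 15.0, n)          # allowable stress, assumed
    pf = np.mean(np.polyval(coef, t) > allow)   # failure: stress > allowable
    print(f"P(failure) = {pf:.2e}")
    ```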

  11. Fault trees for decision making in systems analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Howard E.

    1975-10-09

    The application of fault tree analysis (FTA) to system safety and reliability is presented within the framework of system safety analysis. The concepts and techniques involved in manual and automated fault tree construction are described and their differences noted. The theory of mathematical reliability pertinent to FTA is presented with emphasis on engineering applications. An outline of the quantitative reliability techniques of the Reactor Safety Study is given. Concepts of probabilistic importance are presented within the fault tree framework and applied to the areas of system design, diagnosis and simulation. The computer code IMPORTANCE ranks basic events and cut sets according to a sensitivity analysis. A useful feature of the IMPORTANCE code is that it can accept relative failure data as input. The output of the IMPORTANCE code can assist an analyst in finding weaknesses in system design and operation, suggest the most optimal course of system upgrade, and determine the optimal location of sensors within a system. A general simulation model of system failure in terms of fault tree logic is described. The model is intended for efficient diagnosis of the causes of system failure in the event of a system breakdown. It can also be used to assist an operator in making decisions, under a time constraint, regarding the future course of operations. The model is well suited for computer implementation. New results incorporated in the simulation model include an algorithm to generate repair checklists on the basis of fault tree logic and a one-step-ahead optimization procedure that minimizes the expected time to diagnose system failure.
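
    A minimal sketch of importance ranking in the spirit of the IMPORTANCE code: the top-event probability is approximated from minimal cut sets (rare-event approximation) and basic events are ranked by Birnbaum importance, dP(top)/dp_i. The fault tree and failure probabilities are hypothetical:

    ```python
    # Sketch: cut-set top-event probability and Birnbaum importance ranking.
    p = {"A": 1e-3, "B": 5e-3, "C": 2e-2, "D": 1e-2}     # basic-event probabilities
    cut_sets = [{"A"}, {"B", "C"}, {"C", "D"}]           # minimal cut sets

    def top(prob):
        """Rare-event approximation: sum of cut-set probabilities."""
        total = 0.0
        for cs in cut_sets:
            q = 1.0
            for e in cs:
                q *= prob[e]
            total += q
        return total

    for e in sorted(p):
        # Birnbaum importance: P(top | e failed) - P(top | e working)
        birnbaum = top({**p, e: 1.0}) - top({**p, e: 0.0})
        print(f"event {e}: Birnbaum importance = {birnbaum:.3e}")
    print(f"P(top) ~ {top(p):.3e}")
    ```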

  12. The reliability of molecular dynamics simulations of the multidrug transporter P-glycoprotein in a membrane environment

    PubMed Central

    Condic-Jurkic, Karmen; Subramanian, Nandhitha; Mark, Alan E.

    2018-01-01

    Despite decades of research, the mechanism of action of the ABC multidrug transporter P-glycoprotein (P-gp) remains elusive. Due to experimental limitations, many researchers have turned to molecular dynamics simulation studies in order to investigate different aspects of P-gp function. However, such studies are challenging and caution is required when interpreting the results. P-gp is highly flexible and the time scale on which it can be simulated is limited. There is also uncertainty regarding the accuracy of the various crystal structures available, let alone the structure of the protein in a physiologically relevant environment. In this study, three alternative structural models of mouse P-gp (3G5U, 4KSB, 4M1M), all resolved to 3.8 Å, were used to initiate sets of simulations of P-gp in a membrane environment in order to determine: a) the sensitivity of the results to differences in the starting configuration; and b) the extent to which converged results could be expected on the time scales commonly simulated for this system. The simulations suggest that the arrangement of the nucleotide binding domains (NBDs) observed in the crystal structures is not stable in a membrane environment. In all simulations, the NBDs rapidly associated (within 10 ns) and changes within the transmembrane helices were observed. The secondary structure within the transmembrane domain was best preserved in the 4M1M model under the simulation conditions used. However, the extent to which replicate simulations diverged on a 100 to 200 ns timescale meant that it was not possible to draw definitive conclusions as to which structure overall was most stable, or to obtain converged and reliable results for any of the properties examined. The work brings into question the reliability of conclusions made in regard to the nature of specific interactions inferred from previous simulation studies on this system involving similar sampling times. It also highlights the need to demonstrate the statistical significance of any results obtained in simulations of large flexible proteins, especially where the initial structure is uncertain. PMID:29370310

  13. The reliability of molecular dynamics simulations of the multidrug transporter P-glycoprotein in a membrane environment.

    PubMed

    Condic-Jurkic, Karmen; Subramanian, Nandhitha; Mark, Alan E; O'Mara, Megan L

    2018-01-01

    Despite decades of research, the mechanism of action of the ABC multidrug transporter P-glycoprotein (P-gp) remains elusive. Due to experimental limitations, many researchers have turned to molecular dynamics simulation studies in order to investigate different aspects of P-gp function. However, such studies are challenging and caution is required when interpreting the results. P-gp is highly flexible and the time scale on which it can be simulated is limited. There is also uncertainty regarding the accuracy of the various crystal structures available, let alone the structure of the protein in a physiologically relevant environment. In this study, three alternative structural models of mouse P-gp (3G5U, 4KSB, 4M1M), all resolved to 3.8 Å, were used to initiate sets of simulations of P-gp in a membrane environment in order to determine: a) the sensitivity of the results to differences in the starting configuration; and b) the extent to which converged results could be expected on the time scales commonly simulated for this system. The simulations suggest that the arrangement of the nucleotide binding domains (NBDs) observed in the crystal structures is not stable in a membrane environment. In all simulations, the NBDs rapidly associated (within 10 ns) and changes within the transmembrane helices were observed. The secondary structure within the transmembrane domain was best preserved in the 4M1M model under the simulation conditions used. However, the extent to which replicate simulations diverged on a 100 to 200 ns timescale meant that it was not possible to draw definitive conclusions as to which structure overall was most stable, or to obtain converged and reliable results for any of the properties examined. The work brings into question the reliability of conclusions made in regard to the nature of specific interactions inferred from previous simulation studies on this system involving similar sampling times. It also highlights the need to demonstrate the statistical significance of any results obtained in simulations of large flexible proteins, especially where the initial structure is uncertain.

  14. 77 FR 26954 - 1-Naphthaleneacetic acid; Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-08

    ... for which there is reliable information.'' This includes exposure through drinking water and in... exposure from drinking water. The Agency used screening level water exposure models in the dietary exposure analysis and risk assessment for NAA in drinking water. These simulation models take into account data on...

  15. Performance and Reliability of Bonded Interfaces for High-temperature Packaging: Annual Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeVoto, Douglas J.

    2017-10-19

    As maximum device temperatures approach 200 °C in continuous operation, sintered silver materials promise to maintain bonds at these high temperatures without excessive degradation rates. A detailed characterization of the thermal performance and reliability of sintered silver materials and processes has been initiated for the next year. Future steps in crack modeling include efforts to simulate crack propagation directly using the extended finite element method (X-FEM), a numerical technique that uses the partition of unity method for modeling discontinuities such as cracks in a system.

  16. Simulation: Moving from Technology Challenge to Human Factors Success

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gould, Derek A., E-mail: dgould@liv.ac.uk; Chalmers, Nicholas; Johnson, Sheena J.

    2012-06-15

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  17. Construction simulation analysis of 120m continuous rigid frame bridge based on Midas Civil

    NASA Astrophysics Data System (ADS)

    Shi, Jing-xian; Ran, Zhi-hong

    2018-03-01

    In this paper, a three-dimensional finite element model of a continuous rigid frame bridge with a main span of 120 m is established using the Midas Civil simulation software. The deflection and stress of the main beam in each construction stage of the continuous beam bridge are simulated and analyzed, which provides a reliable technical guarantee for the safe construction of the bridge.

  18. Space-flight simulations of calcium metabolism using a mathematical model of calcium regulation

    NASA Technical Reports Server (NTRS)

    Brand, S. N.

    1985-01-01

    The results of a series of simulation studies of the calcium metabolic changes recorded during human exposure to bed rest and space flight are presented. Space flight and bed rest data demonstrate losses of total body calcium during exposure to hypogravic environments. These losses are evidenced by higher than normal rates of urine calcium excretion and by negative calcium balances; in addition, intestinal absorption rates and bone mineral content are assumed to decrease. The bed rest and space flight simulations were executed on a mathematical model of the calcium metabolic system. The purpose of the simulations is to theoretically test hypotheses and predict the system responses occurring during a given experimental stress, in this case hypogravity, through the comparison of simulation and experimental data and through the analysis of model structure and system responses. The model reliably simulates the responses of selected bed rest and space flight parameters. Where experimental data are available, the simulated skeletal responses and the regulatory factors involved in those responses agree with space flight data collected on rodents. In addition, areas within the model that need improvement are identified.

  19. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.

    PubMed

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
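
    A minimal sketch of the waveform-relaxation idea described above: each rate unit is integrated over a communication window using the previous iterate's trajectories as input, and the window is repeated until the waveforms converge. The two-unit network and all parameters are illustrative only:

    ```python
    # Sketch: waveform relaxation for two coupled rate units, tau*x' = -x + W*f(x) + I.
    import numpy as np

    dt, T, tau = 0.1, 5.0, 1.0
    steps = int(T / dt)
    W = np.array([[0.0, 0.8], [0.6, 0.0]])        # coupling between the two units
    inp = np.array([1.0, 0.0])                    # external input to unit 0
    x = np.zeros((steps + 1, 2))                  # initial guess for the waveforms

    for it in range(50):                          # waveform-relaxation iterations
        x_new = np.zeros_like(x)
        for k in range(steps):                    # Euler step; coupling from old iterate
            drive = W @ np.tanh(x[k]) + inp
            x_new[k + 1] = x_new[k] + dt * (-x_new[k] + drive) / tau
        done = np.max(np.abs(x_new - x)) < 1e-10  # waveforms converged?
        x = x_new
        if done:
            break

    print(f"converged after {it + 1} iterations, x(T) = {np.round(x[-1], 3)}")
    ```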

  20. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator

    PubMed Central

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation. PMID:28596730

  1. Assessing communication quality of consultations in primary care: initial reliability of the Global Consultation Rating Scale, based on the Calgary-Cambridge Guide to the Medical Interview.

    PubMed

    Burt, Jenni; Abel, Gary; Elmore, Natasha; Campbell, John; Roland, Martin; Benson, John; Silverman, Jonathan

    2014-03-06

    To investigate initial reliability of the Global Consultation Rating Scale (GCRS: an instrument to assess the effectiveness of communication across an entire doctor-patient consultation, based on the Calgary-Cambridge guide to the medical interview), in simulated patient consultations. Multiple ratings of simulated general practitioner (GP)-patient consultations by trained GP evaluators. UK primary care. 21 GPs and six trained GP evaluators. GCRS score. 6 GP raters used GCRS to rate randomly assigned video recordings of GP consultations with simulated patients. Each of the 42 consultations was rated separately by four raters. We considered whether a fixed difference between scores had the same meaning at all levels of performance. We then examined the reliability of GCRS using mixed linear regression models. We augmented our regression model to also examine whether there were systematic biases between the scores given by different raters and to look for possible order effects. Assessing the communication quality of individual consultations, GCRS achieved a reliability of 0.73 (95% CI 0.44 to 0.79) for two raters, 0.80 (0.54 to 0.85) for three and 0.85 (0.61 to 0.88) for four. We found an average difference of 1.65 (on a 0-10 scale) in the scores given by the least and most generous raters: adjusting for this evaluator bias increased reliability to 0.78 (0.53 to 0.83) for two raters; 0.85 (0.63 to 0.88) for three and 0.88 (0.69 to 0.91) for four. There were considerable order effects, with later consultations (after 15-20 ratings) receiving, on average, scores more than one point higher on a 0-10 scale. GCRS shows good reliability with three raters assessing each consultation. We are currently developing the scale further by assessing a large sample of real-world consultations.
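
    The reported reliabilities behave much like the Spearman-Brown relation for averaging k raters, R_k = k·r / (1 + (k-1)·r). A quick check under that assumption (the paper's estimates come from mixed linear regression models, not from this formula) recovers the reported progression from a single-rater reliability of about 0.575:

    ```python
    # Sketch: Spearman-Brown prophecy for k raters, checked against the
    # reported two-/three-/four-rater reliabilities.
    def spearman_brown(r1, k):
        return k * r1 / (1 + (k - 1) * r1)

    r1 = 0.575  # single-rater reliability implied by R_2 ~ 0.73
    for k in (2, 3, 4):
        print(f"{k} raters: R = {spearman_brown(r1, k):.2f}")
    # -> 0.73, 0.80, 0.84 -- close to the reported 0.73 / 0.80 / 0.85
    ```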

  2. Payload maintenance cost model for the space telescope

    NASA Technical Reports Server (NTRS)

    White, W. L.

    1980-01-01

    An optimum maintenance cost model for the space telescope over a fifteen-year mission cycle was developed. Failure rates and configurations were compiled from various documents and subsequently updated. The reliability of the space telescope at one year, two and one half years, and five years was determined using these failure rates and configurations. The failure rates and configurations were also used in the maintenance simulation computer model, which simulates the failure patterns over the fifteen-year mission life of the space telescope. Cost algorithms associated with the maintenance options indicated by the failure patterns were developed and integrated into the model.

  3. Towards improving software security by using simulation to inform requirements and conceptual design

    DOE PAGES

    Nutaro, James J.; Allgood, Glenn O.; Kuruganti, Teja

    2015-06-17

    We illustrate the use of modeling and simulation early in the system life-cycle to improve security and reduce costs. The models that we develop for this illustration are inspired by problems in reliability analysis and supervisory control, for which similar models are used to quantify failure probabilities and rates. In the context of security, we propose that models of this general type can be used to understand trades between risk and cost while writing system requirements and during conceptual design, and thereby significantly reduce the need for expensive security corrections after a system enters operation.

  4. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.

  5. Data flow modeling techniques

    NASA Technical Reports Server (NTRS)

    Kavi, K. M.

    1984-01-01

    There have been a number of simulation packages developed for the purpose of designing, testing and validating computer systems, digital systems and software systems. Complex analytical tools based on Markov and semi-Markov processes have been designed to estimate the reliability and performance of simulated systems. Petri nets have received wide acceptance for modeling complex and highly parallel computers. In this research, data flow models for computer systems are investigated. Data flow models can be used to simulate both software and hardware in a uniform manner. Data flow simulation techniques provide the computer systems designer with a CAD environment which enables highly parallel complex systems to be defined, evaluated at all levels and finally implemented in either hardware or software. Inherent in the data flow concept is the hierarchical handling of complex systems. In this paper we describe how data flow can be used to model computer systems.

  6. RAM simulation model for SPH/RSV systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, J.C.; Primm, A.H.; Nelson, S.C.

    1995-12-31

    The US Army's Project Manager, Crusader is sponsoring the development of technologies that apply to the Self-Propelled Howitzer (SPH), formerly the Advanced Field Artillery System (AFAS), and Resupply Vehicle (RSV), formerly the Future Armored Resupply Vehicle (FARV), weapon system. Oak Ridge National Laboratory (ORNL) is currently performing developmental work in support of the SPH/RSV Crusader system. Supporting analyses of reliability, availability, and maintainability (RAM) aspects were also performed for the SPH/RSV effort. During FY 1994 and FY 1995, ORNL conducted a feasibility study to demonstrate the application of simulation modeling for RAM analysis of the Crusader system. Following completion of the feasibility study, a full-scale RAM simulation model of the Crusader system was developed for both the SPH and the RSV. This report provides documentation for the simulation model as well as instructions for the proper execution and utilization of the model in conducting RAM analyses.

  7. A human body model for efficient numerical characterization of UWB signal propagation in wireless body area networks.

    PubMed

    Lim, Hooi Been; Baumann, Dirk; Li, Er-Ping

    2011-03-01

    Wireless body area network (WBAN) is a new enabling system with promising applications in areas such as remote health monitoring and interpersonal communication. Reliable and optimum design of a WBAN system relies on a good understanding and in-depth studies of the wave propagation around a human body. However, the human body is a very complex structure and is computationally demanding to model. This paper aims to investigate the effects of the numerical model's structure complexity and feature details on the simulation results. Depending on the application, a simplified numerical model that meets desired simulation accuracy can be employed for efficient simulations. Measurements of ultra wideband (UWB) signal propagation along a human arm are performed and compared to the simulation results obtained with numerical arm models of different complexity levels. The influence of the arm shape and size, as well as tissue composition and complexity is investigated.

  8. The Application of Voltage Transformer Simulator in Electrical Test Training

    NASA Astrophysics Data System (ADS)

    Li, Nan; Zhang, Jun; Chai, Ziqi; Wang, Jingpeng; Yang, Baowei

    2018-02-01

    The voltage transformer test is an important means of monitoring its operating state, and the accuracy and reliability of the test data are directly related to the test skill level of the operator. However, improper operation during transformer test training risks damage to the test instruments and to the equipment under test, as well as electric shock to the operator. In this paper, a simulation device for voltage transformers is set up, and a simulation model is built for the most common 500 kV capacitor voltage transformer (CVT). The simulation model can realize several CVT test items by combining a teaching guidance platform, simulation instruments, a complete set of system software and auxiliary equipment in Changchun. Many successful applications show that the simulation device has good practical value and wide application prospects.

  9. Land surface models systematically overestimate the intensity, duration and magnitude of seasonal-scale evaporative droughts

    DOE PAGES

    Ukkola, A. M.; De Kauwe, M. G.; Pitman, A. J.; ...

    2016-10-13

    Land surface models (LSMs) must accurately simulate observed energy and water fluxes during droughts in order to provide reliable estimates of future water resources. We evaluated eight different LSMs (14 model versions) for simulating evapotranspiration (ET) during periods of evaporative drought (Edrought) across six flux tower sites. Using an empirically defined Edrought threshold (a decline in ET below the observed 15th percentile), we show that LSMs simulated 58 Edrought days per year, on average, across the six sites, about three times the observed 20 days. The simulated Edrought magnitude was about eight times greater than observed, and the events were twice as intense. Our findings point to systematic biases across LSMs when simulating water and energy fluxes under water-stressed conditions. The overestimation of key Edrought characteristics undermines our confidence in the models' capability to simulate realistic drought responses to climate change and has wider implications for phenomena sensitive to soil moisture, including heat waves.
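
    A minimal sketch of the Edrought diagnosis described above: days with modelled ET below the observed 15th percentile are flagged, then counted and grouped into events to yield frequency, duration and magnitude. The daily ET series here is synthetic, standing in for flux-tower observations:

    ```python
    # Sketch: evaporative-drought days, events and magnitude from an ET series.
    import numpy as np

    rng = np.random.default_rng(3)
    et_obs = np.clip(rng.normal(3.0, 1.2, 365), 0, None)   # mm/day, synthetic
    et_mod = et_obs * rng.uniform(0.4, 1.1, 365)           # a "model" with dry bias

    thresh = np.percentile(et_obs, 15)                     # observed 15th percentile
    dry = et_mod < thresh                                  # Edrought-day mask
    events = np.flatnonzero(np.diff(np.r_[0, dry.astype(int)]) == 1)  # event starts
    magnitude = np.sum(thresh - et_mod[dry])               # cumulative deficit (mm)

    print(f"Edrought days: {dry.sum()}, events: {len(events)}, "
          f"magnitude: {magnitude:.1f} mm")
    ```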

  10. Coupling the WRF model with a temperature index model based on remote sensing for snowmelt simulations in a river basin in the Altay Mountains, northwest China

    NASA Astrophysics Data System (ADS)

    Wu, X.; Shen, Y.; Wang, N.; Pan, X.; Zhang, W.; He, J.; Wang, G.

    2017-12-01

    Snowmelt water is an important freshwater resource in the Altay Mountains in northwest China, and it is also crucial for the local ecosystem and for sustainable economic and social development; however, a warming climate and rapid spring snowmelt can cause floods that endanger both the eco-environment and public safety and property. This study simulates snowmelt in the Kayiertesi River catchment using a temperature-index model based on remote sensing, coupled with high-resolution meteorological data obtained from NCEP reanalysis fields that were downscaled using the Weather Research and Forecasting (WRF) model and then bias-corrected using a statistical downscaling model. Validation of the forcing data revealed that the high-resolution meteorological fields derived from the downscaled NCEP reanalysis were reliable for driving the snowmelt model. Parameters of the temperature-index model were calibrated for spring 2014, and model performance was validated using MODIS snow cover and snow observations from spring 2012. The results show that the temperature-index model based on remote sensing performed well, with a simulation mean relative error of 6.7% and a Nash-Sutcliffe efficiency of 0.98 in spring 2012 in the study basin. Based on the reliable distributed snow water equivalent simulation, daily snowmelt runoff was calculated for spring 2012 in the basin. In the study catchment, spring snowmelt runoff accounts for 72% of spring runoff and 21% of annual runoff. Snowmelt is the main source of runoff for the catchment and should be managed and utilized effectively. The results provide a basis for snowmelt runoff predictions, so as to prevent snowmelt-induced floods, and also provide a generalizable approach that can be applied to other remote locations where high-density, long-term observational data are lacking.
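
    For orientation, the core of any temperature-index scheme is the degree-day relation sketched below; the study's remote-sensing-based model adds spatially distributed, satellite-constrained parameters, so the constants here are purely illustrative:

      def degree_day_melt(t_mean, ddf=4.0, t_crit=0.0):
          # Daily melt (mm water equivalent) from daily mean air temperature
          # (degC); ddf is the degree-day factor (mm w.e. per degC per day).
          return [ddf * max(t - t_crit, 0.0) for t in t_mean]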

  11. Recalibration and predictive reliability of a solute-transport model of an irrigated stream-aquifer system

    USGS Publications Warehouse

    Person, M.; Konikow, Leonard F.

    1986-01-01

    A solute-transport model of an irrigated stream-aquifer system was recalibrated because of discrepancies between prior predictions of ground-water salinity trends during 1971-1982 and the observed outcome in February 1982. The original model was calibrated with a 1-year record of data collected during 1971-1972 in an 18-km reach of the Arkansas River Valley in southeastern Colorado. The model is improved by incorporating additional hydrologic processes (salt transport through the unsaturated zone) and through reexamination of the reliability of some input data (regression relationship used to estimate salinity from specific conductance data). Extended simulations using the recalibrated model are made to investigate the usefulness of the model for predicting long-term trends of salinity and water levels within the study area. Predicted ground-water levels during 1971-1982 are in good agreement with the observed, indicating that the original 1971-1972 study period was sufficient to calibrate the flow model. However, long-term simulations using the recalibrated model based on recycling the 1971-1972 data alone yield an average ground-water salinity for 1982 that is too low by about 10%. Simulations that incorporate observed surface-water salinity variations yield better results, in that the calculated average ground-water salinity for 1982 is within 3% of the observed value. Statistical analysis of temporal salinity variations of the applied surface water indicates that at least a 4-year sampling period is needed to accurately calibrate the transport model. © 1986.

  12. Simulation of Voltage SET Operation in Phase-Change Random Access Memories with Heater Addition and Ring-Type Contactor for Low-Power Consumption by Finite Element Modeling

    NASA Astrophysics Data System (ADS)

    Gong, Yue-Feng; Song, Zhi-Tang; Ling, Yun; Liu, Yan; Li, Yi-Jin

    2010-06-01

    A three-dimensional finite element model of phase-change random access memory is established to simulate the electric, thermal and phase state distribution during SET operation. The model is applied to simulate the SET behaviors of the heater addition structure (HS) and the ring-type contact in the bottom electrode (RIB) structure. The simulation results indicate that a small bottom electrode contactor (BEC) is beneficial for heat efficiency and reliability in the HS cell, and a bottom electrode contactor with size Fx = 80 nm is a good choice for the RIB cell. Also shown is that the appropriate SET pulse time is 100 ns for low power consumption and fast operation.

  13. Numerical simulation of hull curved plate forming by electromagnetic force assisted line heating

    NASA Astrophysics Data System (ADS)

    Wang, Ji; Wang, Shun; Liu, Yujun; Li, Rui; Liu, Xiao

    2017-11-01

    Line heating is a common method in shipyards for forming hull curved plates. Aluminum alloy plate is widely used in shipbuilding. To solve the problem of forming thick aluminum alloy plates with complex curved surfaces, a new technology named electromagnetic force assisted line heating (EFALH) is proposed in this paper. The FEM model of EFALH was established, and the effect of electromagnetic force assisted forming was verified with self-developed equipment. Firstly, the solution approach for the numerical simulation of EFALH is illustrated. Then, the coupled multi-physics numerical simulation model is established. Lastly, the reliability of the numerical simulation model is verified by comparison with experimental data. This paper lays a foundation for solving the forming problems of thick aluminum alloy curved plates in shipbuilding.

  14. A Simulation Model for Setting Terms for Performance-Based Contracts

    DTIC Science & Technology

    2010-05-01

    torpedo self-noise and the use of ruggedized, embedded, digital microprocessors. The latter capability made it possible for digitally controlled...inventories are: System Reliability, Product Reliability, Operational Availability, Mean Time to Repair (MTTR), Mean Time to Failure (MTTF), Mean Logistics Delay Time (MLDT), Mean Supply Response Time (MSRT); dependent metrics: Mean Accumulated Down Time (MADT)...

  15. Small Wind Research Turbine: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corbus, D.; Meadors, M.

    2005-10-01

    The Small Wind Research Turbine (SWRT) project was initiated to provide reliable test data for model validation of furling wind turbines and to help understand small wind turbine loads. This report will familiarize the user with the scope of the SWRT test and support the use of these data. In addition to describing all the testing details and results, the report presents an analysis of the test data and compares the SWRT test data to simulation results from the FAST aeroelastic simulation model.

  16. A 1-D Model of the 4 Bed Molecular Sieve of the Carbon Dioxide Removal Assembly

    NASA Technical Reports Server (NTRS)

    Coker, Robert; Knox, Jim

    2015-01-01

    Developments to improve system efficiency and reliability for water and carbon dioxide separation systems on crewed vehicles combine sub-scale systems testing and multi-physics simulations. This paper describes the development of COMSOL simulations in support of the Life Support Systems (LSS) project within NASA's Advanced Exploration Systems (AES) program. Specifically, we model the 4 Bed Molecular Sieve (4BMS) of the Carbon Dioxide Removal Assembly (CDRA) operating on the International Space Station (ISS).

  17. Observer-based monitoring of heat exchangers.

    PubMed

    Astorga-Zaragoza, Carlos-Manuel; Alvarado-Martínez, Víctor-Manuel; Zavala-Río, Arturo; Méndez-Ocaña, Rafael-Maxim; Guerrero-Ramírez, Gerardo-Vicente

    2008-01-01

    The goal of this work is to provide a method for monitoring performance degradation in counter-flow double-pipe heat exchangers. The overall heat transfer coefficient is estimated by an adaptive observer and monitored in order to infer when the heat exchanger needs preventive or corrective maintenance. A simplified mathematical model is used to synthesize the adaptive observer and a more complex model is used for simulation. The reliability of the proposed method was demonstrated via numerical simulations and laboratory experiments with a bench-scale pilot plant.
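
    A minimal sketch of the idea, assuming a single-node lumped surrogate of the cold stream rather than the paper's counter-flow model: an output-injection observer tracks the measured outlet temperature while a gradient adaptation law drifts the estimated overall heat-transfer coefficient UA toward the value that explains the data; a falling UA estimate then flags the need for maintenance. All gains and parameters below are illustrative.

      import numpy as np

      m_c, dt = 2000.0, 1.0          # lumped capacitance (J/K), time step (s)
      L, gamma = 0.2, 1e-4           # observer gain, adaptation gain
      T, T_hat, UA_hat = 20.0, 20.0, 30.0
      for k in range(40000):
          T_hot = 80.0 + 10.0 * np.sin(2 * np.pi * k * dt / 600.0)  # excitation
          UA_true = 50.0 - 10.0 * min(k * dt / 30000.0, 1.0)        # slow fouling
          T += dt * UA_true / m_c * (T_hot - T)                     # "plant"
          err = T - T_hat                                           # innovation
          T_hat += dt * (UA_hat / m_c * (T_hot - T_hat) + L * err)  # observer
          UA_hat += gamma * (T_hot - T_hat) * err                   # adaptation
      print(f"estimated UA = {UA_hat:.1f} W/K")   # compare to a maintenance threshold

    Note the time-varying hot-side temperature: without such excitation the regressor vanishes at thermal equilibrium and the parameter estimate stops updating.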

  18. Comparison of Two Global Sensitivity Analysis Methods for Hydrologic Modeling over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Hameed, M.; Demirel, M. C.; Moradkhani, H.

    2015-12-01

    The Global Sensitivity Analysis (GSA) approach helps identify the influence of model parameters or inputs and thus provides essential information about model performance. In this study, the effects of the Sacramento Soil Moisture Accounting (SAC-SMA) model parameters, forcing data, and initial conditions are analysed by using two GSA methods: Sobol' and the Fourier Amplitude Sensitivity Test (FAST). The simulations are carried out over five sub-basins within the Columbia River Basin (CRB) for three different periods: one year, four years, and seven years. Four factors are considered and evaluated by using the two sensitivity analysis methods: the simulation length, the parameter range, the model initial conditions, and the reliability of the GSA methods themselves. The reliability of the sensitivity analysis results is compared based on 1) the agreement between the two methods (Sobol' and FAST) in highlighting the same parameters or inputs as the most influential and 2) how the methods cohere in ranking these sensitive parameters under the same conditions (sub-basins and simulation length). The results show coherence between the Sobol' and FAST sensitivity analysis methods. Additionally, it is found that the FAST method is sufficient to evaluate the main effects of the model parameters and inputs. Another conclusion of this study is that the smaller the parameter or initial-condition ranges, the more consistent and coherent the results of the two sensitivity analysis methods.
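
    The Sobol'-versus-FAST comparison described here can be reproduced in miniature with the open-source SALib package (assuming SALib is available; the toy response and parameter names below merely stand in for SAC-SMA):

      import numpy as np
      from SALib.sample import saltelli, fast_sampler
      from SALib.analyze import sobol, fast

      def toy_model(x):                     # stand-in for a hydrologic response
          return x[:, 0] ** 2 + 5.0 * x[:, 1] + 0.1 * x[:, 2]

      problem = {"num_vars": 3,
                 "names": ["capacity", "recession", "init_storage"],  # hypothetical
                 "bounds": [[10, 300], [0.01, 0.5], [0, 100]]}

      X = saltelli.sample(problem, 1024)                # Sobol' design
      s1_sobol = sobol.analyze(problem, toy_model(X))["S1"]
      Xf = fast_sampler.sample(problem, 1025)           # FAST design
      s1_fast = fast.analyze(problem, toy_model(Xf))["S1"]
      print(s1_sobol, s1_fast)    # agreement in ranking = the "coherence" above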

  19. Adaptive vehicle motion estimation and prediction

    NASA Astrophysics Data System (ADS)

    Zhao, Liang; Thorpe, Chuck E.

    1999-01-01

    Accurate motion estimation and reliable maneuver prediction enable an automated car to react quickly and correctly to the rapid maneuvers of the other vehicles, and so allow safe and efficient navigation. In this paper, we present a car tracking system which provides motion estimation, maneuver prediction and detection of the tracked car. The three strategies employed - adaptive motion modeling, adaptive data sampling, and adaptive model switching probabilities - result in an adaptive interacting multiple model algorithm (AIMM). The experimental results on simulated and real data demonstrate that our tracking system is reliable, flexible, and robust. The adaptive tracking makes the system intelligent and useful in various autonomous driving tasks.

  20. Analysis of Food Hub Commerce and Participation Using Agent-Based Modeling: Integrating Financial and Social Drivers.

    PubMed

    Krejci, Caroline C; Stone, Richard T; Dorneich, Michael C; Gilbert, Stephen B

    2016-02-01

    Factors influencing long-term viability of an intermediated regional food supply network (food hub) were modeled using agent-based modeling techniques informed by interview data gathered from food hub participants. Previous analyses of food hub dynamics focused primarily on financial drivers rather than social factors and have not used mathematical models. Based on qualitative and quantitative data gathered from 22 customers and 11 vendors at a midwestern food hub, an agent-based model (ABM) was created with distinct consumer personas characterizing the range of consumer priorities. A comparison study determined if the ABM behaved differently than a model based on traditional economic assumptions. Further simulation studies assessed the effect of changes in parameters, such as producer reliability and the consumer profiles, on long-term food hub sustainability. The persona-based ABM model produced different and more resilient results than the more traditional way of modeling consumers. Reduced producer reliability significantly reduced trade; in some instances, a modest reduction in reliability threatened the sustainability of the system. Finally, a modest increase in price-driven consumers at the outset of the simulation quickly resulted in those consumers becoming a majority of the overall customer base. Results suggest that social factors, such as desire to support the community, can be more important than financial factors. An ABM of food hub dynamics, based on human factors data gathered from the field, can be a useful tool for policy decisions. Similar approaches can be used for modeling customer dynamics with other sustainable organizations. © 2015, Human Factors and Ergonomics Society.

  1. PV Systems Reliability Final Technical Report: Ground Fault Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lavrova, Olga; Flicker, Jack David; Johnson, Jay

    We have examined ground faults in photovoltaic (PV) arrays and the efficacy of fuses, residual current detection (RCD), current sense monitoring/relays (CSM), isolation/insulation (Riso) monitoring, and Ground Fault Detection and Isolation (GFID), using simulations based on a Simulation Program with Integrated Circuit Emphasis (SPICE) ground fault circuit model, experimental ground faults installed on real arrays, and theoretical equations.

  2. The FireBGCv2 landscape fire and succession model: a research simulation platform for exploring fire and vegetation dynamics

    Treesearch

    Robert E. Keane; Rachel A. Loehman; Lisa M. Holsinger

    2011-01-01

    Fire management faces important emergent issues in the coming years such as climate change, fire exclusion impacts, and wildland-urban development, so new, innovative means are needed to address these challenges. Field studies, while preferable and reliable, will be problematic because of the large time and space scales involved. Therefore, landscape simulation...

  3. Sustainability of transport structures - some aspects of the nonlinear reliability assessment

    NASA Astrophysics Data System (ADS)

    Pukl, Radomír; Sajdlová, Tereza; Strauss, Alfred; Lehký, David; Novák, Drahomír

    2017-09-01

    Efficient techniques for nonlinear numerical analysis of concrete structures and advanced stochastic simulation methods have been combined in order to offer an advanced tool for the assessment of realistic behaviour, failure and safety of transport structures. The approach is based on randomization of the nonlinear finite element analysis of the structural models. Degradation aspects such as carbonation of concrete can be accounted for in order to predict the durability of the investigated structure and its sustainability. Results can serve as a rational basis for performance and sustainability assessment based on advanced nonlinear computer analysis of structures of transport infrastructure such as bridges or tunnels. In the stochastic simulation, the input material parameters obtained from material tests, including their randomness and uncertainty, are represented as random variables or fields. Appropriate identification of material parameters is crucial for the virtual failure modelling of structures and structural elements. An inverse analysis approach using artificial neural networks and virtual stochastic simulations is applied to determine the fracture-mechanical parameters of the structural material and its numerical model. Structural response, reliability and sustainability have been investigated on different types of transport structures made from various materials using the above-mentioned methodology and tools.

  4. GRIZZLY Model of Multi-Reactive Species Diffusion, Moisture/Heat Transfer and Alkali-Silica Reaction for Simulating Concrete Aging and Degradation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Hai; Spencer, Benjamin W.; Cai, Guowei

    Concrete is widely used in the construction of nuclear facilities because of its structural strength and its ability to shield radiation. The use of concrete in nuclear power plants for containment and shielding of radiation and radioactive materials has made its performance crucial for the safe operation of the facility. As such, when life extension is considered for nuclear power plants, it is critical to have accurate and reliable predictive tools to address concerns related to various aging processes of concrete structures and the capacity of structures subjected to age-related degradation. The goal of this report is to document the progress of the development and implementation of a fully coupled thermo-hydro-mechanical-chemical model in GRIZZLY code with the ultimate goal to reliably simulate and predict long-term performance and response of aged NPP concrete structures subjected to a number of aging mechanisms including external chemical attacks and volume-changing chemical reactions within concrete structures induced by alkali-silica reactions and long-term exposure to irradiation. Based on a number of survey reports of concrete aging mechanisms relevant to nuclear power plants and recommendations from researchers in the concrete community, we implemented three modules during FY15 in GRIZZLY code: (1) a multi-species reactive diffusion model within cement materials; (2) a coupled moisture and heat transfer model in concrete; and (3) an anisotropic, stress-dependent, alkali-silica reaction induced swelling model. The multi-species reactive diffusion model was implemented with the objective to model aging of concrete structures subjected to aggressive external chemical attacks (e.g., chloride attack, sulfate attack, etc.). It considers multiple processes relevant to external chemical attacks such as diffusion of ions in aqueous phase within pore spaces, equilibrium chemical speciation reactions and kinetic mineral dissolution/precipitation. The moisture/heat transfer module was implemented to simulate long-term spatial and temporal evolutions of the moisture and temperature fields within concrete structures at both room and elevated temperatures. The ASR swelling model implemented in GRIZZLY code can simulate anisotropic expansions of ASR gel under uniaxial, biaxial and triaxial stress states, and can be run simultaneously with the moisture/heat transfer model and coupled with various elastic/inelastic solid mechanics models that were implemented in GRIZZLY code previously. This report provides detailed descriptions of the governing equations, constitutive equations and numerical algorithms of the three modules implemented in GRIZZLY during FY15, simulation results of example problems and model validation results obtained by comparing simulations with available experimental data reported in the literature. The close match between the experiments and simulations clearly demonstrates the potential of GRIZZLY code for reliable evaluation and prediction of long-term performance and response of aged concrete structures in nuclear power plants.

  5. Modelling Pollutant Dispersion in a Street Network

    NASA Astrophysics Data System (ADS)

    Salem, N. Ben; Garbero, V.; Salizzoni, P.; Lamaison, G.; Soulhac, L.

    2015-04-01

    This study constitutes a further step in the analysis of the performances of a street network model to simulate atmospheric pollutant dispersion in urban areas. The model, named SIRANE, is based on the decomposition of the urban atmosphere into two sub-domains: the urban boundary layer, whose dynamics is assumed to be well established, and the urban canopy, represented as a series of interconnected boxes. Parametric laws govern the mass exchanges between the boxes under the assumption that the pollutant dispersion within the canopy can be fully simulated by modelling three main bulk transfer phenomena: channelling along street axes, transfers at street intersections, and vertical exchange between street canyons and the overlying atmosphere. Here, we aim to evaluate the reliability of the parametrizations adopted to simulate these phenomena, by focusing on their possible dependence on the external wind direction. To this end, we test the model against concentration measurements within an idealized urban district whose geometrical layout closely matches the street network represented in SIRANE. The analysis is performed for an urban array with a fixed geometry and a varying wind incidence angle. The results show that the model provides generally good results with the reference parametrizations adopted in SIRANE and that its performances are quite robust for a wide range of the model parameters. This proves the reliability of the street network approach in simulating pollutant dispersion in densely built city districts. The results also show that the model performances may be improved by considering a dependence of the wind fluctuations at street intersections and of the vertical exchange velocity on the direction of the incident wind. This opens the way for further investigations to clarify the dependence of these parameters on wind direction and street aspect ratios.

  6. Pitfalls and important issues in testing reliability using intraclass correlation coefficients in orthopaedic research.

    PubMed

    Lee, Kyoung Min; Lee, Jaebong; Chung, Chin Youb; Ahn, Soyeon; Sung, Ki Hyuk; Kim, Tae Won; Lee, Hui Jong; Park, Moon Seok

    2012-06-01

    Intraclass correlation coefficients (ICCs) provide a statistical means of testing reliability. However, their interpretation is not well documented in the orthopedic field. The purpose of this study was to investigate the use of ICCs in the orthopedic literature and to demonstrate pitfalls regarding their use. First, orthopedic articles that used ICCs were retrieved from the Pubmed database, and journal demography, ICC models and concurrent statistics used were evaluated. Second, a reliability test was performed on three common physical examinations in cerebral palsy, namely, the Thomas test, the Staheli test, and popliteal angle measurement. Thirty patients were assessed by three orthopedic surgeons to explore the statistical methods for testing reliability. Third, the factors affecting ICC values were examined by simulating data sets based on the physical examination data, where the ranges, slopes, and interobserver variability were modified. Of the 92 orthopedic articles identified, 58 articles (63%) did not clarify the ICC model used, and only 5 articles (5%) described all models, types, and measures. In reliability testing, although the popliteal angle showed a larger mean absolute difference than the Thomas test and the Staheli test, the ICC of the popliteal angle was higher, which was believed to be contrary to the context of measurement. In addition, the ICC values were affected by the model, type, and measures used. In the simulated data sets, the ICC showed higher values when the range of the data sets was larger, the slopes of the data sets were parallel, and the interobserver variability was smaller. Care should be taken when interpreting absolute ICC values, i.e., a higher ICC does not necessarily mean less variability, because ICC values can be affected by various factors. The authors recommend that researchers clarify the ICC model used and interpret ICC values in the context of measurement.
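
    The dependence of the ICC on model choice and data range is easy to demonstrate. The sketch below implements one common form, ICC(2,1) (two-way random effects, absolute agreement, single measure), and shows that widening the measurement range raises the ICC even when rater disagreement is unchanged (illustrative simulated data):

      import numpy as np

      def icc_2_1(x):                    # x: (subjects, raters) score matrix
          n, k = x.shape
          grand = x.mean()
          ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
          ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
          resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
          ms_e = (resid ** 2).sum() / ((n - 1) * (k - 1))             # residual
          return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

      rng = np.random.default_rng(0)
      noise = rng.normal(0, 2.0, (30, 3))               # same interobserver error
      narrow = rng.uniform(0, 10, 30)[:, None] + noise  # narrow measurement range
      wide = rng.uniform(0, 50, 30)[:, None] + noise    # wide measurement range
      print(icc_2_1(narrow), icc_2_1(wide))             # wider range -> higher ICC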

  7. A Jones matrix formalism for simulating three-dimensional polarized light imaging of brain tissue.

    PubMed

    Menzel, M; Michielsen, K; De Raedt, H; Reckfort, J; Amunts, K; Axer, M

    2015-10-06

    The neuroimaging technique three-dimensional polarized light imaging (3D-PLI) provides a high-resolution reconstruction of nerve fibres in human post-mortem brains. The orientations of the fibres are derived from birefringence measurements of histological brain sections assuming that the nerve fibres—consisting of an axon and a surrounding myelin sheath—are uniaxial birefringent and that the measured optic axis is oriented in the direction of the nerve fibres (macroscopic model). Although experimental studies support this assumption, the molecular structure of the myelin sheath suggests that the birefringence of a nerve fibre can be described more precisely by multiple optic axes oriented radially around the fibre axis (microscopic model). In this paper, we compare the use of the macroscopic and the microscopic model for simulating 3D-PLI by means of the Jones matrix formalism. The simulations show that the macroscopic model ensures a reliable estimation of the fibre orientations as long as the polarimeter does not resolve structures smaller than the diameter of single fibres. In the case of fibre bundles, polarimeters with even higher resolutions can be used without losing reliability. When taking the myelin density into account, the derived fibre orientations are considerably improved. © 2015 The Author(s).
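
    A compact way to see the macroscopic model at work is to chain Jones matrices for a rotating-polarizer measurement. In the sketch below (numpy, illustrative parameters, not the paper's simulation code) the quarter-wave-plate/analyser pair is condensed into a single circular-analyser projection, and the phase of the resulting sinusoidal intensity profile recovers the in-plane fibre direction:

      import numpy as np

      def rot(t):
          c, s = np.cos(t), np.sin(t)
          return np.array([[c, -s], [s, c]])

      def retarder(phi, delta):     # linear retarder, fast axis at angle phi
          return rot(phi) @ np.diag([1.0, np.exp(1j * delta)]) @ rot(-phi)

      phi, alpha = np.deg2rad(30.0), np.deg2rad(40.0)  # fibre direction, inclination
      delta = 0.8 * np.cos(alpha) ** 2                 # retardance ~ cos^2(inclination)
      circ = np.array([1.0, -1j]) / np.sqrt(2.0)       # circular analyser (row vector)

      rhos = np.linspace(0.0, np.pi, 18, endpoint=False)   # polarizer angles
      I = np.array([abs(circ @ retarder(phi, delta)
                        @ np.array([np.cos(r), np.sin(r)])) ** 2 for r in rhos])
      # Fourier phase of I(rho) = (1/2) * (1 + sin(2(rho - phi)) sin(delta)):
      direction = 0.5 * np.arctan2(-(I * np.cos(2 * rhos)).sum(),
                                   (I * np.sin(2 * rhos)).sum())
      print(np.rad2deg(direction))   # ~30 degrees, the assumed fibre direction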

  8. A review on vegetation models and applicability to climate simulations at regional scale

    NASA Astrophysics Data System (ADS)

    Myoung, Boksoon; Choi, Yong-Sang; Park, Seon Ki

    2011-11-01

    The lack of accurate representations of biospheric components and their biophysical and biogeochemical processes is a great source of uncertainty in current climate models. The interactions between terrestrial ecosystems and the climate include exchanges not only of energy, water and momentum, but also of carbon and nitrogen. Reliable simulations of these interactions are crucial for predicting the potential impacts of future climate change and anthropogenic intervention on terrestrial ecosystems. In this paper, two biogeographical (Neilson's rule-based model and BIOME), two biogeochemical (BIOME-BGC and PnET-BGC), and three dynamic global vegetation models (Hybrid, LPJ, and MC1) were reviewed and compared in terms of their biophysical and physiological processes. The advantages and limitations of the models were also addressed. Lastly, the applications of the dynamic global vegetation models to regional climate simulations have been discussed.

  9. 75 FR 35689 - System Personnel Training Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-23

    ... using realistic simulations. ... Reliability Standard PER-002-0. ... In Order No. ... development process to: (1) include formal training requirements for reliability coordinators similar to those ... simulation technology such as a simulator, virtual technology, or other technology in their emergency ...

  10. The Design and Semi-Physical Simulation Test of Fault-Tolerant Controller for Aero Engine

    NASA Astrophysics Data System (ADS)

    Liu, Yuan; Zhang, Xin; Zhang, Tianhong

    2017-11-01

    A new fault-tolerant control method for aero engines is proposed, which can accurately diagnose sensor faults with Kalman filter banks and reconstruct the signal with a real-time on-board adaptive model that combines a simplified real-time model and an improved Kalman filter. In order to verify the feasibility of the proposed method, a semi-physical simulation experiment was carried out. Besides the real I/O interfaces, controller hardware and the virtual plant model, the semi-physical simulation system also contains a real fuel system. Compared with hardware-in-the-loop (HIL) simulation, a semi-physical simulation system provides a higher degree of confidence. In order to meet the needs of semi-physical simulation, a rapid prototyping controller with fault-tolerant control capability based on the NI CompactRIO platform was designed and verified on the semi-physical simulation test platform. The result shows that the controller can control the aero engine safely and reliably, with little influence on controller performance in the event of a sensor fault.

  11. Evaluating the Sensitivity of Agricultural Model Performance to Different Climate Inputs: Supplemental Material

    NASA Technical Reports Server (NTRS)

    Glotter, Michael J.; Ruane, Alex C.; Moyer, Elisabeth J.; Elliott, Joshua W.

    2015-01-01

    Projections of future food production necessarily rely on models, which must themselves be validated through historical assessments comparing modeled and observed yields. Reliable historical validation requires both accurate agricultural models and accurate climate inputs. Problems with either may compromise the validation exercise. Previous studies have compared the effects of different climate inputs on agricultural projections but either incompletely or without a ground truth of observed yields that would allow distinguishing errors due to climate inputs from those intrinsic to the crop model. This study is a systematic evaluation of the reliability of a widely used crop model for simulating U.S. maize yields when driven by multiple observational data products. The parallelized Decision Support System for Agrotechnology Transfer (pDSSAT) is driven with climate inputs from multiple sources (reanalysis, reanalysis that is bias corrected with observed climate, and a control dataset) and compared with observed historical yields. The simulations show that model output is more accurate when driven by any observation-based precipitation product than when driven by non-bias-corrected reanalysis. The simulations also suggest, in contrast to previous studies, that biased precipitation distribution is significant for yields only in arid regions. Some issues persist for all choices of climate inputs: crop yields appear to be oversensitive to precipitation fluctuations but undersensitive to floods and heat waves. These results suggest that the most important issue for agricultural projections may be not climate inputs but structural limitations in the crop models themselves.

  12. Evaluating the sensitivity of agricultural model performance to different climate inputs

    PubMed Central

    Glotter, Michael J.; Moyer, Elisabeth J.; Ruane, Alex C.; Elliott, Joshua W.

    2017-01-01

    Projections of future food production necessarily rely on models, which must themselves be validated through historical assessments comparing modeled to observed yields. Reliable historical validation requires both accurate agricultural models and accurate climate inputs. Problems with either may compromise the validation exercise. Previous studies have compared the effects of different climate inputs on agricultural projections, but either incompletely or without a ground truth of observed yields that would allow distinguishing errors due to climate inputs from those intrinsic to the crop model. This study is a systematic evaluation of the reliability of a widely-used crop model for simulating U.S. maize yields when driven by multiple observational data products. The parallelized Decision Support System for Agrotechnology Transfer (pDSSAT) is driven with climate inputs from multiple sources – reanalysis, reanalysis bias-corrected with observed climate, and a control dataset – and compared to observed historical yields. The simulations show that model output is more accurate when driven by any observation-based precipitation product than when driven by un-bias-corrected reanalysis. The simulations also suggest, in contrast to previous studies, that biased precipitation distribution is significant for yields only in arid regions. However, some issues persist for all choices of climate inputs: crop yields appear oversensitive to precipitation fluctuations but undersensitive to floods and heat waves. These results suggest that the most important issue for agricultural projections may be not climate inputs but structural limitations in the crop models themselves. PMID:29097985

  13. Reliability Stress-Strength Models for Dependent Observations with Applications in Clinical Trials

    NASA Technical Reports Server (NTRS)

    Kushary, Debashis; Kulkarni, Pandurang M.

    1995-01-01

    We consider the applications of stress-strength models in studies involving clinical trials. When studying the effects and side effects of certain procedures (treatments), it is often the case that observations are correlated due to subject effects, repeated measurements and the observation of many characteristics simultaneously. We develop the maximum likelihood estimator (MLE) and the uniform minimum variance unbiased estimator (UMVUE) of the reliability, which in clinical trial studies could be considered as the chance of increased side effects due to a particular procedure compared to another. The results developed apply to both univariate and multivariate situations. Also, for the univariate situation we develop simple-to-use lower confidence bounds for the reliability. Further, we consider the case when both stress and strength constitute time-dependent processes. We define the future reliability and obtain methods of constructing lower confidence bounds for this reliability. Finally, we conduct simulation studies to evaluate all the procedures developed and also to compare the MLE and the UMVUE.
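
    For the simplest univariate case the quantities involved are easy to simulate. The sketch below (illustrative normal models, not the paper's estimators) contrasts the independence-based reliability R = P(stress < strength) with a Monte Carlo estimate under correlated observations:

      import numpy as np
      from scipy.stats import norm, multivariate_normal

      mu_x, mu_y, sd_x, sd_y = 10.0, 7.0, 2.0, 1.5   # strength X, stress Y
      r_indep = norm.cdf((mu_x - mu_y) / np.hypot(sd_x, sd_y))   # closed form

      rho = 0.6                                      # subject-induced correlation
      cov = [[sd_x**2, rho * sd_x * sd_y], [rho * sd_x * sd_y, sd_y**2]]
      xy = multivariate_normal([mu_x, mu_y], cov).rvs(200000, random_state=1)
      r_dep = (xy[:, 1] < xy[:, 0]).mean()
      print(r_indep, r_dep)    # ignoring the correlation biases the estimate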

  14. Implementation of a combined algorithm designed to increase the reliability of information systems: simulation modeling

    NASA Astrophysics Data System (ADS)

    Popov, A.; Zolotarev, V.; Bychkov, S.

    2016-11-01

    This paper examines the results of experimental studies of a previously submitted combined algorithm designed to increase the reliability of information systems. Data illustrating the organization and conduct of the studies are provided. As part of the study, the experimental data from simulation modeling were compared with data from the functioning of a real information system. The hypothesis of the homogeneity of the logical structure of information systems was formulated, making it possible to reconfigure the algorithm presented, more specifically, to transform it into a model for the analysis and prediction of arbitrary information systems. The results presented can be used for further research in this direction. The ability to predict the functioning of information systems can be used for strategic and economic planning. The algorithm can also be used as a means of providing information security.

  15. Development of robust flexible OLED encapsulations using simulated estimations and experimental validations

    NASA Astrophysics Data System (ADS)

    Lee, Chang-Chun; Shih, Yan-Shin; Wu, Chih-Sheng; Tsai, Chia-Hao; Yeh, Shu-Tang; Peng, Yi-Hao; Chen, Kuang-Jung

    2012-07-01

    This work analyses the overall stress/strain characteristics of flexible encapsulations with organic light-emitting diode (OLED) devices. A robust methodology composed of a mechanical model of multiple thin films under bending loads and related stress simulations based on nonlinear finite element analysis (FEA) is proposed and validated against related experimental data. With various geometrical combinations of cover plate, stacked thin films and plastic substrate, the position of the neutral axis (NA) plane, which is regarded as a key design parameter for minimizing the stress impact on the OLED devices of concern, is acquired using the present methodology. The results point out that both the thickness and the mechanical properties of the cover plate help determine the NA location. In addition, several concave and convex bending radii are applied to examine the mechanical tolerance and to provide insight into the estimated reliability of foldable OLED encapsulations.
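
    The neutral-axis position that drives this design can be sketched with the classical modulus-weighted (transformed-section) formula for a layered plate; the stack below is illustrative, not the paper's:

      layers = [                    # bottom-up: (name, thickness um, modulus GPa)
          ("plastic substrate", 100.0, 5.0),
          ("OLED thin-film stack", 2.0, 80.0),
          ("cover plate", 50.0, 70.0),
      ]
      z = num = den = 0.0
      for name, t, E in layers:
          num += E * t * (z + t / 2.0)   # modulus-weighted first moment
          den += E * t                   # modulus-weighted area (unit width)
          z += t
      z_na = num / den
      print(f"neutral axis at {z_na:.1f} um from the bottom surface")
      # Bending strain at height h for bend radius R is (h - z_na) / R, so the
      # fragile OLED films should sit as close to z_na as possible.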

  16. Fatigue reliability of deck structures subjected to correlated crack growth

    NASA Astrophysics Data System (ADS)

    Feng, G. Q.; Garbatov, Y.; Guedes Soares, C.

    2013-12-01

    The objective of this work is to analyse the fatigue reliability of deck structures subjected to correlated crack growth. The stress intensity factors of the correlated cracks are obtained by finite element analysis, and based on these the geometry correction functions are derived. Monte Carlo simulations are applied to predict the statistical descriptors of correlated cracks based on the Paris-Erdogan equation. A probabilistic model of crack growth as a function of time is used to analyse the fatigue reliability of deck structures accounting for the correlation of crack propagation. A deck structure is modelled as a series system of stiffened panels, where a stiffened panel is regarded as a parallel system composed of plates and longitudinals. It has been proven that the method developed here can be conveniently applied to perform the fatigue reliability assessment of structures subjected to correlated crack growth.
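
    The probabilistic crack-growth core of such an analysis can be sketched in a few lines: sample the Paris-Erdogan coefficient, integrate crack size over load cycles, and accumulate an empirical failure probability. All values below are illustrative and the crack is single and uncorrelated for brevity; the paper's contribution is precisely the correlation structure and FE-derived geometry functions omitted here.

      import numpy as np

      rng = np.random.default_rng(42)
      n_sim, n_years, cycles_per_year = 20000, 25, 5e6
      d_sigma, a0, a_crit, m = 40.0, 0.5e-3, 20e-3, 3.0   # MPa, m, m, exponent
      C = rng.lognormal(np.log(2e-12), 0.3, n_sim)        # scatter in Paris C

      a = np.full(n_sim, a0)
      failed_year = np.full(n_sim, np.inf)
      for year in range(1, n_years + 1):                  # coarse 1-year steps
          dK = d_sigma * np.sqrt(np.pi * a)               # Y(a) = 1 for brevity
          a += C * dK**m * cycles_per_year                # Paris-Erdogan law
          newly = (a >= a_crit) & np.isinf(failed_year)
          failed_year[newly] = year
      print([(y, (failed_year <= y).mean()) for y in (10, 20, 25)])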

  17. Dynamic one-dimensional modeling of secondary settling tanks and system robustness evaluation.

    PubMed

    Li, Ben; Stenstrom, M K

    2014-01-01

    One-dimensional secondary settling tank models are widely used in current engineering practice for design and optimization, and usually can be expressed as a nonlinear hyperbolic or nonlinear strongly degenerate parabolic partial differential equation (PDE). Reliable numerical methods are needed to produce approximate solutions that converge to the exact analytical solutions. In this study, we introduced a reliable numerical technique, the Yee-Roe-Davis (YRD) method, as the governing PDE solver, and compared its reliability with the prevalent Stenstrom-Vitasovic-Takács (SVT) method by assessing their simulation results at various operating conditions. The YRD method also produced solutions similar to those of the previously developed Method G and the Engquist-Osher method. The YRD and SVT methods were also used for a time-to-failure evaluation, and the results show that the choice of numerical method can greatly impact the solution. Reliable numerical methods, such as the YRD method, are strongly recommended.
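
    The reliability issue at stake is the numerical flux. As a minimal illustration (not the YRD scheme itself), the Engquist-Osher flux mentioned above takes a particularly simple form for the unimodal batch-settling flux, and a first-order scheme built on it already resolves the settling interfaces monotonically; all parameters are illustrative:

      import numpy as np

      v0, r = 6.0, 0.4                         # Vesilind parameters (m/h, m3/kg)
      f = lambda c: v0 * c * np.exp(-r * c)    # batch settling flux
      c_star = 1.0 / r                         # concentration at the flux maximum

      def eo_flux(a, b):                       # Engquist-Osher, unimodal f, f(0)=0
          return f(np.minimum(a, c_star)) + f(np.maximum(b, c_star)) - f(c_star)

      nz, depth = 100, 4.0                     # cells, column depth (m)
      dz = depth / nz
      dt = 0.5 * dz / v0                       # CFL-limited step (max |f'| = v0)
      c = np.full(nz, 3.5)                     # initial concentration (kg/m3)
      for _ in range(2000):
          flux = np.zeros(nz + 1)              # zero flux at top/bottom walls
          flux[1:-1] = eo_flux(c[:-1], c[1:])
          c -= dt / dz * (flux[1:] - flux[:-1])
      print(c.round(2))                        # clear water above, blanket below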

  18. Simulation of Nitrogen and Phosphorus Removal in Ecological Ditch Based on EFDC Model

    NASA Astrophysics Data System (ADS)

    Li, S. M.; Wang, X. L.; Zhou, Q. Y.; Han, N. N.

    2018-03-01

    Agricultural non-point source pollution has recently threatened water quality and ecological systems. To control it, the first and most important task is to control the migration and transformation of nitrogen and phosphorus in agricultural ditches. An ecological ditch was designed and, according to the design, a pilot device was built. The mechanism of N and P removal in ditches under the combined action of aquatic organisms and hydraulic forcing was studied through dynamic and static experiments, in order to determine the specific influences of different environmental factors such as influent concentration, influent flow and water level. The transport and diffusion of N and P in the ditch were simulated by the three-dimensional water quality model EFDC, and the simulation results were compared with the experimental data. The average relative errors of the EFDC simulation results were all less than 15%, which verified the reliability of the model.

  19. Probabilistic simulation of the human factor in structural reliability

    NASA Astrophysics Data System (ADS)

    Chamis, Christos C.; Singhal, Surendra N.

    1994-09-01

    The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).
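
    The MFIE referred to here is a product of normalized factor terms; a minimal sketch of its use in a Monte Carlo setting follows, with hypothetical factors, bounds, and exponents (the paper selects these by parametric study and judgment):

      import numpy as np

      def mfie(factors):
          # factors: iterable of (current, reference, limit, exponent) tuples;
          # each term is ((limit - current) / (limit - reference)) ** exponent
          out = 1.0
          for a, a0, af, n in factors:
              out *= ((af - a) / (af - a0)) ** n
          return out

      rng = np.random.default_rng(7)
      samples = np.array([
          mfie([(rng.uniform(0.4, 0.9), 0.5, 1.0, 0.5),    # workload factor
                (rng.uniform(0.0, 0.3), 0.0, 1.0, 0.25)])  # health-degradation factor
          for _ in range(10000)])
      print(samples.mean(), np.percentile(samples, [5, 95]))   # probable range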

  20. Numerical aerodynamic simulation facility. Preliminary study extension

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The production of an optimized design of key elements of the candidate facility was the primary objective of this report. This was accomplished by effort in the following tasks: (1) to further develop, optimize and describe the functional description of the custom hardware; (2) to delineate trade-off areas between performance, reliability, availability, serviceability, and programmability; (3) to develop metrics and models for validation of the candidate system's performance; (4) to conduct a functional simulation of the system design; (5) to perform a reliability analysis of the system design; and (6) to develop the software specifications, to include a user-level high-level programming language and a correspondence between the programming language and the instruction set, and to outline the operating system requirements.

  1. Probabilistic Simulation of the Human Factor in Structural Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Singhal, Surendra N.

    1994-01-01

    The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).

  2. Electronic Quality of Life Assessment Using Computer-Adaptive Testing

    PubMed Central

    2016-01-01

    Background Quality of life (QoL) questionnaires are desirable for clinical practice but can be time-consuming to administer and interpret, making their widespread adoption difficult. Objective Our aim was to assess the performance of the World Health Organization Quality of Life (WHOQOL)-100 questionnaire as four item banks to facilitate adaptive testing using simulated computer adaptive tests (CATs) for physical, psychological, social, and environmental QoL. Methods We used data from the UK WHOQOL-100 questionnaire (N=320) to calibrate item banks using item response theory, which included psychometric assessments of differential item functioning, local dependency, unidimensionality, and reliability. We simulated CATs to assess the number of items administered before prespecified levels of reliability was met. Results The item banks (40 items) all displayed good model fit (P>.01) and were unidimensional (fewer than 5% of t tests significant), reliable (Person Separation Index>.70), and free from differential item functioning (no significant analysis of variance interaction) or local dependency (residual correlations < +.20). When matched for reliability, the item banks were between 45% and 75% shorter than paper-based WHOQOL measures. Across the four domains, a high standard of reliability (alpha>.90) could be gained with a median of 9 items. Conclusions Using CAT, simulated assessments were as reliable as paper-based forms of the WHOQOL with a fraction of the number of items. These properties suggest that these item banks are suitable for computerized adaptive assessment. These item banks have the potential for international development using existing alternative language versions of the WHOQOL items. PMID:27694100
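
    The CAT logic being simulated reduces to a short loop: administer the unused item with the greatest Fisher information at the current ability estimate, update the estimate, and stop once the standard error implies the target reliability. A self-contained sketch under a two-parameter logistic model follows (illustrative item parameters, not the WHOQOL banks):

      import numpy as np

      rng = np.random.default_rng(3)
      n_items = 40
      a = rng.uniform(1.0, 2.5, n_items)           # discrimination
      b = rng.normal(0.0, 1.0, n_items)            # difficulty
      theta_true, items = 0.8, np.arange(n_items)  # simulated examinee
      grid = np.linspace(-4, 4, 161)               # ML estimate via grid search
      loglik = np.zeros_like(grid)
      used = np.zeros(n_items, dtype=bool)

      def prob(theta, i):                          # 2PL response function
          return 1.0 / (1.0 + np.exp(-a[i] * (theta - b[i])))

      theta_hat, se = 0.0, np.inf
      while se > 0.32 and not used.all():          # SE 0.32 ~ reliability 0.90
          info = a**2 * prob(theta_hat, items) * (1 - prob(theta_hat, items))
          i = int(np.argmax(np.where(used, -np.inf, info)))
          used[i] = True
          resp = rng.random() < prob(theta_true, i)    # simulated answer
          loglik += np.log(prob(grid, i)) if resp else np.log(1 - prob(grid, i))
          theta_hat = grid[np.argmax(loglik)]
          fisher = (a[used]**2 * prob(theta_hat, items[used])
                    * (1 - prob(theta_hat, items[used])))
          se = 1.0 / np.sqrt(fisher.sum())
      print(used.sum(), round(theta_hat, 2), round(se, 2))

    Typically only a fraction of the 40 items is needed before the stopping rule fires, which is the item-count saving the study reports.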

  3. Prediction of Aerodynamic Coefficient using Genetic Algorithm Optimized Neural Network for Sparse Data

    NASA Technical Reports Server (NTRS)

    Rajkumar, T.; Bardina, Jorge; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Wind tunnels use scale models to characterize aerodynamic coefficients. Wind tunnel testing can be slow and costly due to high personnel overhead and intensive power utilization. Although manual curve fitting can be done, it is highly efficient to use a neural network to define the complex relationship between variables. Numerical simulation of complex vehicles on the wide range of conditions required for flight simulation requires static and dynamic data. Static data at low Mach numbers and angles of attack may be obtained with simpler Euler codes. Static data for stalled vehicles, where zones of flow separation are usually present at higher angles of attack, require Navier-Stokes simulations, which are costly due to the large processing time required to attain convergence. Preliminary dynamic data may be obtained with simpler methods based on correlations and vortex methods; however, accurate prediction of the dynamic coefficients requires complex and costly numerical simulations. A reliable and fast method of predicting complex aerodynamic coefficients for flight simulation is presented using a neural network. The training data for the neural network are derived from numerical simulations and wind-tunnel experiments. The aerodynamic coefficients are modeled as functions of the flow characteristics and the control surfaces of the vehicle. The basic coefficients of lift, drag and pitching moment are expressed as functions of angle of attack and Mach number. The modeled and training aerodynamic coefficients show good agreement. This method shows excellent potential for rapid development of aerodynamic models for flight simulation. Genetic Algorithms (GA) are used to optimize a previously built Artificial Neural Network (ANN) that reliably predicts aerodynamic coefficients. Results indicate that the GA provided an efficient method of optimizing the ANN model to predict aerodynamic coefficients to an accuracy of within 10%. In our problem, we would like to obtain an optimized neural network architecture and a minimum data set. This was accomplished within 500 training cycles of the neural network. After removing training pairs (outliers), the GA produced much better results. The neural network constructed is a feed-forward neural network with a back-propagation learning mechanism. The main goal has been to free the network design process from the constraints of human biases, and to discover better forms of neural network architectures. The automation of the network architecture search by genetic algorithms seems to have been the best way to achieve this goal.

  4. High Fidelity, “Faster than Real-Time” Simulator for Predicting Power System Dynamic Behavior - Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flueck, Alex

    The “High Fidelity, Faster than Real-Time Simulator for Predicting Power System Dynamic Behavior” was designed and developed by Illinois Institute of Technology with critical contributions from Electrocon International, Argonne National Laboratory, Alstom Grid and McCoy Energy. Also essential to the project were our two utility partners: Commonwealth Edison and AltaLink. The project was a success due to several major breakthroughs in the area of large-scale power system dynamics simulation, including (1) a validated faster than real-time simulation of both stable and unstable transient dynamics in a large-scale positive sequence transmission grid model, (2) a three-phase unbalanced simulation platform for modeling new grid devices, such as independently controlled single-phase static var compensators (SVCs), (3) the world's first high fidelity three-phase unbalanced dynamics and protection simulator based on Electrocon's CAPE program, and (4) a first-of-its-kind implementation of a single-phase induction motor model with stall capability. The simulator results will aid power grid operators in their true time of need, when there is a significant risk of cascading outages. The simulator will accelerate performance and enhance accuracy of dynamics simulations, enabling operators to maintain reliability and steer clear of blackouts. In the long term, the simulator will form the backbone of the newly conceived hybrid real-time protection and control architecture that will coordinate local controls, wide-area measurements, wide-area controls and advanced real-time prediction capabilities. The nation's citizens will benefit in several ways, including (1) less down time from power outages due to the faster-than-real-time simulator's predictive capability, (2) higher levels of reliability due to the detailed dynamics plus protection simulation capability, and (3) more resiliency due to the three-phase unbalanced simulator's ability to model three-phase and single-phase networks and devices.

  5. What makes an accurate and reliable subject-specific finite element model? A case study of an elephant femur

    PubMed Central

    Panagiotopoulou, O.; Wilshin, S. D.; Rayfield, E. J.; Shefelbine, S. J.; Hutchinson, J. R.

    2012-01-01

    Finite element modelling is well entrenched in comparative vertebrate biomechanics as a tool to assess the mechanical design of skeletal structures and to better comprehend the complex interaction of their form–function relationships. But what makes a reliable subject-specific finite element model? To approach this question, we here present a set of convergence and sensitivity analyses and a validation study as an example, for finite element analysis (FEA) in general, of ways to ensure a reliable model. We detail how choices of element size, type and material properties in FEA influence the results of simulations. We also present an empirical model for estimating heterogeneous material properties throughout an elephant femur (but of broad applicability to FEA). We then use an ex vivo experimental validation test of a cadaveric femur to check our FEA results and find that the heterogeneous model matches the experimental results extremely well, and far better than the homogeneous model. We emphasize how considering heterogeneous material properties in FEA may be critical, so this should become standard practice in comparative FEA studies along with convergence analyses, consideration of element size, type and experimental validation. These steps may be required to obtain accurate models and derive reliable conclusions from them. PMID:21752810

  6. A smart grid simulation testbed using Matlab/Simulink

    NASA Astrophysics Data System (ADS)

    Mallapuram, Sriharsha; Moulema, Paul; Yu, Wei

    2014-06-01

    The smart grid is the integration of computing and communication technologies into a power grid with a goal of enabling real time control, and a reliable, secure, and efficient energy system [1]. With the increased interest of the research community and stakeholders towards the smart grid, a number of solutions and algorithms have been developed and proposed to address issues related to smart grid operations and functions. Those technologies and solutions need to be tested and validated before implementation using software simulators. In this paper, we developed a general smart grid simulation model in the MATLAB/Simulink environment, which integrates renewable energy resources, energy storage technology, load monitoring and control capability. To demonstrate and validate the effectiveness of our simulation model, we created simulation scenarios and performed simulations using a real-world data set provided by the Pecan Street Research Institute.

  7. Recoding low-level simulator data into a record of meaningful task performance: the integrated task modeling environment (ITME).

    PubMed

    King, Robert; Parker, Simon; Mouzakis, Kon; Fletcher, Winston; Fitzgerald, Patrick

    2007-11-01

    The Integrated Task Modeling Environment (ITME) is a user-friendly software tool that has been developed to automatically recode low-level data into an empirical record of meaningful task performance. The present research investigated and validated the performance of the ITME software package by conducting complex simulation missions and comparing the task analyses produced by ITME with task analyses produced by experienced video analysts. A very high interrater reliability (≥ .94) existed between the experienced video analysts and the ITME for the task analyses produced for each mission. The mean session time:analysis time ratio was 1:24 using video analysis techniques and 1:5 using the ITME. It was concluded that the ITME produced task analyses that were as reliable as those produced by experienced video analysts, and significantly reduced the time cost associated with these analyses.

  8. Modeling, simulation, and high-autonomy control of a Martian oxygen production plant

    NASA Technical Reports Server (NTRS)

    Schooley, L. C.; Cellier, F. E.; Wang, F.-Y.; Zeigler, B. P.

    1992-01-01

    Progress on a project for the development of a high-autonomy intelligent command and control architecture for process plants used to produce oxygen from local planetary resources is reported. A distributed command and control architecture is being developed and implemented so that an oxygen production plant, or other equipment, can be reliably commanded and controlled over an extended time period in a high-autonomy mode with high-level task-oriented teleoperation from one or several remote locations. During the reporting period, progress was made at all levels of the architecture. At the remote site, several remote observers can now participate in monitoring the plant. At the local site, a command and control center was introduced for increased flexibility, reliability, and robustness. The local control architecture was enhanced to control multiple tubes in parallel, and was refined for increased robustness. The simulation model was enhanced to full dynamics descriptions.

  9. SiC-VJFETs power switching devices: an improved model and parameter optimization technique

    NASA Astrophysics Data System (ADS)

    Ben Salah, T.; Lahbib, Y.; Morel, H.

    2009-12-01

    Silicon carbide junction field effect transistors (SiC-JFETs) are mature power switches newly applied in several industrial applications. SiC-JFETs are often simulated with a SPICE model in order to predict their electrical behaviour. Although such a model provides sufficient accuracy for some applications, this paper shows that it presents serious shortcomings, notably its neglect of the body diode, among others in the circuit model topology. A simulation correction is then mandatory, and a new model should be proposed. Accordingly, this paper presents an enhanced model based on experimental dc and ac data. New devices are added to the conventional circuit model, giving accurate static and dynamic behaviour, effects not accounted for in the SPICE model. The improved model is implemented in the VHDL-AMS language, and steady-state, dynamic and transient responses are simulated for many SiC-VJFET samples. A simple and reliable optimization algorithm based on the minimization of a cost function is proposed to extract the JFET model parameters. The obtained parameters are verified by comparing errors between simulation results and experimental data.

  10. Effects of DEM source and resolution on WEPP hydrologic and erosion simulation: A case study of two forest watersheds in northern Idaho

    Treesearch

    J. X. Zhang; J. Q. Wu; K. Chang; W. J. Elliot; S. Dun

    2009-01-01

    The recent modification of the Water Erosion Prediction Project (WEPP) model has improved its applicability to hydrology and erosion modeling in forest watersheds. To generate reliable topographic and hydrologic inputs for the WEPP model, carefully selecting digital elevation models (DEMs) with appropriate resolution and accuracy is essential because topography is a...

  11. Probabilistic Analysis of a Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.
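
    A minimal sketch of the Monte Carlo step with a hypothetical limit state (failure when the load-to-strength failure index exceeds 1); note that plain Monte Carlo cannot resolve probabilities near 10^-11, which is why FORM and conditional sampling are applied alongside it:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 1_000_000

    # Hypothetical uncertainty distributions (coefficients of variation assumed)
    load_scale = rng.normal(1.0, 0.10, N)             # load scale factor
    strength   = rng.lognormal(np.log(2.0), 0.08, N)  # strength allowable

    failure_index = load_scale / strength  # FI > 1 means first-ply failure
    p_f = np.mean(failure_index > 1.0)
    # With these (assumed) numbers no failures occur in 1e6 trials, which
    # illustrates why variance reduction or FORM is needed at P_f ~ 1e-11.
    print(f"P_f estimate = {p_f:.2e}")
    ```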

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarovar, Mohan; Zhang, Jun; Zeng, Lishan

    Analog quantum simulators (AQS) will likely be the first nontrivial application of quantum technology for predictive simulation. However, there remain questions regarding the degree of confidence that can be placed in the results of AQS since they do not naturally incorporate error correction. Specifically, how do we know whether an analog simulation of a quantum model will produce predictions that agree with the ideal model in the presence of inevitable imperfections? At the same time there is a widely held expectation that certain quantum simulation questions will be robust to errors and perturbations in the underlying hardware. Resolving these two points of view is a critical step in making the most of this promising technology. In this paper we formalize the notion of AQS reliability by determining sensitivity of AQS outputs to underlying parameters, and formulate conditions for robust simulation. Our approach naturally reveals the importance of model symmetries in dictating the robust properties. Finally, to demonstrate the approach, we characterize the robust features of a variety of quantum many-body models.

  13. Real-time simulation of a Doubly-Fed Induction Generator based wind power system on eMEGASim™ Real-Time Digital Simulator

    NASA Astrophysics Data System (ADS)

    Boakye-Boateng, Nasir Abdulai

    The growing demand for wind power integration into the generation mix prompts the need to subject these systems to stringent performance requirements. This study sought to identify the tools and procedures needed to perform real-time simulation studies of Doubly-Fed Induction Generator (DFIG) based wind generation systems, as a basis for more practical tests of reliability and performance for both grid-connected and islanded wind generation systems. The author focused on developing a platform for wind generation studies and, in addition, tested the performance of two DFIG models on the platform's real-time simulation model: an average-value SimPowerSystems™ DFIG wind turbine, and a detailed DFIG-based wind turbine using ARTEMiS™ components. The platform model implemented here consists of a high-voltage transmission system with four integrated wind farm models comprising in total 65 DFIG-based wind turbines, and it was developed and tested on OPAL-RT's eMEGASim™ Real-Time Digital Simulator.

  14. An overview of the mathematical and statistical analysis component of RICIS

    NASA Technical Reports Server (NTRS)

    Hallum, Cecil R.

    1987-01-01

    Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.

  15. An approach for modelling snowcover ablation and snowmelt runoff in cold region environments

    NASA Astrophysics Data System (ADS)

    Dornes, Pablo Fernando

    Reliable hydrological model simulations are the result of numerous complex interactions among hydrological inputs, landscape properties, and initial conditions. Determination of the effects of these factors is one of the main challenges in hydrological modelling. This situation becomes even more difficult in cold regions due to the ungauged nature of subarctic and arctic environments. This research work is an attempt to apply a new approach for modelling snowcover ablation and snowmelt runoff in complex subarctic environments with limited data while retaining integrity in the process representations. The modelling strategy is based on the incorporation of both detailed process understanding and inputs along with information gained from observations of basin-wide streamflow phenomena; essentially a combination of deductive and inductive approaches. The study was conducted in the Wolf Creek Research Basin, Yukon Territory, using three models: a small-scale physically based hydrological model, a land surface scheme, and a land surface hydrological model. The spatial representation was based on previous research studies and observations, and was accomplished by incorporating landscape units, defined according to topography and vegetation, as the spatial model elements. Comparisons between distributed and aggregated modelling approaches showed that simulations incorporating distributed initial snowcover and corrected solar radiation were able to properly simulate snowcover ablation and snowmelt runoff, whereas the aggregated modelling approaches were unable to represent the differential snowmelt rates and complex snowmelt runoff dynamics. Similarly, the inclusion of spatially distributed information in a land surface scheme clearly improved simulations of snowcover ablation. Application of the same modelling approach at a larger scale using the same landscape-based parameterisation showed satisfactory results in simulating snowcover ablation and snowmelt runoff with minimal calibration. Verification of this approach in an arctic basin illustrated that landscape-based parameters are a feasible regionalisation framework for distributed and physically based models. In summary, the proposed modelling philosophy, based on the combination of inductive and deductive reasoning, is a suitable strategy for reliable predictions of snowcover ablation and snowmelt runoff in cold regions and complex environments.

  16. ASME V&V challenge problem: Surrogate-based V&V

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beghini, Lauren L.; Hough, Patricia D.

    2015-12-18

    The process of verification and validation can be resource intensive. From the computational model perspective, the resource demand typically arises from long simulation run times on multiple cores coupled with the need to characterize and propagate uncertainties. In addition, predictive computations performed for safety and reliability analyses have similar resource requirements. For this reason, there is a tradeoff between the time required to complete the requisite studies and the fidelity or accuracy of the results that can be obtained. At a high level, our approach is cast within a validation hierarchy that provides a framework in which we perform sensitivity analysis, model calibration, model validation, and prediction. The evidence gathered as part of these activities is mapped into the Predictive Capability Maturity Model to assess credibility of the model used for the reliability predictions. With regard to specific technical aspects of our analysis, we employ surrogate-based methods, primarily based on polynomial chaos expansions and Gaussian processes, for model calibration, sensitivity analysis, and uncertainty quantification in order to reduce the number of simulations that must be done. The goal is to tip the tradeoff balance to improving accuracy without increasing the computational demands.
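
    A minimal sketch of the surrogate idea with a Gaussian process over a single input; the study's surrogates (polynomial chaos expansions and GPs over many calibration inputs) follow the same train-cheaply-then-sample pattern. All functions and values here are illustrative:

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_simulation(x):
        """Stand-in for a long-running physics code."""
        return np.sin(3 * x) + 0.5 * x

    x_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)  # only 8 expensive runs
    y_train = expensive_simulation(x_train).ravel()

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(x_train, y_train)

    # Propagate input uncertainty through the cheap surrogate instead
    rng = np.random.default_rng(0)
    x_mc = rng.normal(1.0, 0.2, 50_000).clip(0.0, 2.0).reshape(-1, 1)
    y_mc = gp.predict(x_mc)
    print(f"output mean = {y_mc.mean():.3f}, std = {y_mc.std():.3f}")
    ```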

  17. Effectiveness evaluation of STOL transport operations (phase 2). [computer simulation program of commercial short haul aircraft operations]

    NASA Technical Reports Server (NTRS)

    Welp, D. W.; Brown, R. A.; Ullman, D. G.; Kuhner, M. B.

    1974-01-01

    A computer simulation program which models a commercial short-haul aircraft operating in the civil air system was developed. The purpose of the program is to evaluate the effect of a given aircraft avionics capability on the ability of the aircraft to perform on-time carrier operations. The program outputs consist primarily of those quantities which can be used to determine direct operating costs. These include: (1) schedule reliability or delays, (2) repairs/replacements, (3) fuel consumption, and (4) cancellations. More comprehensive models of the terminal area environment were added and a simulation of an existing airline operation was conducted to obtain a form of model verification. The capability of the program to provide comparative results (sensitivity analysis) was then demonstrated by modifying the aircraft avionics capability for additional computer simulations.

  18. Simulating fail-stop in asynchronous distributed systems

    NASA Technical Reports Server (NTRS)

    Sabel, Laura; Marzullo, Keith

    1994-01-01

    The fail-stop failure model appears frequently in the distributed systems literature. However, in an asynchronous distributed system, the fail-stop model cannot be implemented. In particular, it is impossible to reliably detect crash failures in an asynchronous system. In this paper, we show that it is possible to specify and implement a failure model that is indistinguishable from the fail-stop model from the point of view of any process within an asynchronous system. We give necessary conditions for a failure model to be indistinguishable from the fail-stop model, and derive lower bounds on the amount of process replication needed to implement such a failure model. We present a simple one-round protocol for implementing one such failure model, which we call simulated fail-stop.

  19. Partnering to Establish and Study Simulation in International Nursing Education.

    PubMed

    Garner, Shelby L; Killingsworth, Erin; Raj, Leena

    The purpose of this article was to describe an international partnership to establish and study simulation in India. A pilot study was performed to determine interrater reliability among faculty new to simulation when evaluating nursing student competency performance. Interrater reliability was below the ideal agreement level. Findings in this study underscore the need to obtain baseline interrater reliability data before integrating competency evaluation into a simulation program.

  20. Simplified human model and pedestrian simulation in the millimeter-wave region

    NASA Astrophysics Data System (ADS)

    Han, Junghwan; Kim, Seok; Lee, Tae-Yun; Ka, Min-Ho

    2016-02-01

    The 24 GHz and 77 GHz radar sensors have been studied as strong candidates for advanced driver assistance systems (ADAS) because of their all-weather capability and accurate range and radial-velocity measurement. However, developing a reliable pedestrian recognition system faces many obstacles due to the inaccurate and non-trivial radar responses at these high frequencies and the many combinations of clothes and accessories. To overcome these obstacles, many researchers have used electromagnetic (EM) simulation to characterize the radar scattering response of a human. However, human simulation takes a very long time because of the electrically huge size of a human in the millimeter-wave region. To reduce simulation time, some researchers have assumed the skin of a human to be a perfect electric conductor (PEC) and have simulated the PEC human model using a physical optics (PO) algorithm, without specifically explaining how the human body could be modeled with PEC. In this study, the validity of the assumption that the surface of the human body can be treated as PEC in EM simulation is verified, and the simulation result of a dry-skin human model is compared with that of the PEC human model.

  1. Reliability of Memories Protected by Multibit Error Correction Codes Against MBUs

    NASA Astrophysics Data System (ADS)

    Ming, Zhu; Yi, Xiao Li; Chang, Liu; Wei, Zhang Jian

    2011-02-01

    As technology scales, more and more memory cells can be placed in a die. Therefore, the probability that a single event induces multiple bit upsets (MBUs) in adjacent memory cells increases. Generally, multibit error correction codes (MECCs) are effective approaches to mitigate MBUs in memories. In order to evaluate the robustness of protected memories, reliability models have been widely studied. Instead of irradiation experiments, the models can be used to quickly evaluate the reliability of memories early in the design. To build an accurate model, several situations should be considered. First, when MBUs are present in memories, the errors induced by several events may overlap each other, which is more frequent than in the single event upset (SEU) case. Furthermore, radiation experiments show that the probability of MBUs strongly depends on the angle of the radiation event. However, reliability models that consider both the overlap of multiple bit errors and the angle of the radiation event have not been proposed in the present literature. In this paper, a more accurate model of memories with MECCs is presented. Both the overlap of multiple bit errors and the angle of the event are considered in the model, which produces a more precise analysis in the calculation of mean time to failure (MTTF) for memory systems under MBUs. In addition, memories with and without scrubbing are analyzed in the proposed model. Finally, we evaluate the reliability of memories under MBUs in Matlab. The simulation results verify the validity of the proposed model.
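
    A minimal sketch of the kind of Monte Carlo MTTF estimate such a model is benchmarked against, under assumed layout, event rate, and MBU probability; the paper's analytical model also treats angle dependence, which is omitted here:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    WORDS, T_CORRECT = 4096, 2  # hypothetical layout; code corrects <= 2 errors/word
    EVENT_RATE = 1e-3           # upset events per word-hour (assumed)
    P_MBU = 0.3                 # chance an event upsets two adjacent cells
    SCRUB_PERIOD = 24.0         # hours; set to None to model no scrubbing

    def time_to_failure():
        errors = np.zeros(WORDS, dtype=int)
        t, next_scrub = 0.0, SCRUB_PERIOD
        while True:
            t += rng.exponential(1.0 / (EVENT_RATE * WORDS))
            while SCRUB_PERIOD and next_scrub <= t:
                errors[:] = 0                # scrub rewrites all correctable words
                next_scrub += SCRUB_PERIOD
            w = rng.integers(WORDS)          # MBU assumed to land in one word here
            errors[w] += 2 if rng.random() < P_MBU else 1  # overlapping errors add up
            if errors[w] > T_CORRECT:        # beyond the code's correction capability
                return t

    mttf = np.mean([time_to_failure() for _ in range(200)])
    print(f"estimated MTTF ~ {mttf:.0f} hours")
    ```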

  2. Reliability-based trajectory optimization using nonintrusive polynomial chaos for Mars entry mission

    NASA Astrophysics Data System (ADS)

    Huang, Yuechen; Li, Haiyang

    2018-06-01

    This paper presents the reliability-based sequential optimization (RBSO) method to solve the trajectory optimization problem with parametric uncertainties in entry dynamics for a Mars entry mission. First, the deterministic entry trajectory optimization model is reviewed, and then the reliability-based optimization model is formulated. In addition, a modified sequential optimization method, in which the nonintrusive polynomial chaos expansion (PCE) method and the most probable point (MPP) searching method are employed, is proposed to solve the reliability-based optimization problem efficiently. The nonintrusive PCE method enables the transformation between the stochastic optimization (SO) and the deterministic optimization (DO) and efficiently approximates the trajectory solution. The MPP method, which assesses the reliability of constraint satisfaction only up to the necessary level, is employed to further improve computational efficiency. The cycle comprising SO, reliability assessment, and constraint updates is repeated in the RBSO until the reliability requirements on constraint satisfaction are met. Finally, the RBSO is compared with the traditional DO and with traditional sequential optimization based on Monte Carlo (MC) simulation in a specific Mars entry mission to demonstrate the effectiveness and efficiency of the proposed method.

  3. Computational models of aortic coarctation in hypoplastic left heart syndrome: considerations on validation of a detailed 3D model.

    PubMed

    Biglino, Giovanni; Corsini, Chiara; Schievano, Silvia; Dubini, Gabriele; Giardini, Alessandro; Hsia, Tain-Yen; Pennati, Giancarlo; Taylor, Andrew M

    2014-05-01

    Reliability of computational models for cardiovascular investigations strongly depends on their validation against physical data. This study aims to experimentally validate a computational model of complex congenital heart disease (i.e., surgically palliated hypoplastic left heart syndrome with aortic coarctation) thus demonstrating that hemodynamic information can be reliably extrapolated from the model for clinically meaningful investigations. A patient-specific aortic arch model was tested in a mock circulatory system and the same flow conditions were re-created in silico, by setting an appropriate lumped parameter network (LPN) attached to the same three-dimensional (3D) aortic model (i.e., multi-scale approach). The model included a modified Blalock-Taussig shunt and coarctation of the aorta. Different flow regimes were tested as well as the impact of uncertainty in viscosity. Computational flow and pressure results were in good agreement with the experimental signals, both qualitatively, in terms of the shape of the waveforms, and quantitatively (mean aortic pressure 62.3 vs. 65.1 mmHg, 4.8% difference; mean aortic flow 28.0 vs. 28.4% inlet flow, 1.4% difference; coarctation pressure drop 30.0 vs. 33.5 mmHg, 10.4% difference), proving the reliability of the numerical approach. It was observed that substantial changes in fluid viscosity or using a turbulent model in the numerical simulations did not significantly affect flows and pressures of the investigated physiology. Results highlighted how the non-linear fluid dynamic phenomena occurring in vitro must be properly described to ensure satisfactory agreement. This study presents methodological considerations for using experimental data to preliminarily set up a computational model, and then simulate a complex congenital physiology using a multi-scale approach.
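
    For readers unfamiliar with lumped parameter networks, the sketch below integrates a two-element Windkessel, the simplest LPN building block; the study's actual LPN, tuned to the mock circulatory loop, is more elaborate, and all values here are illustrative:

    ```python
    import numpy as np

    # Two-element Windkessel: C dP/dt = Q(t) - P/R
    R = 4.0        # mmHg s / mL (illustrative peripheral resistance)
    C = 1.5        # mL / mmHg (illustrative compliance)
    dt, T = 1e-3, 10.0
    t = np.arange(0.0, T, dt)

    # Pulsatile inflow (mL/s): a half-sine ejection over 0.3 s of each 1 s beat
    Q = np.where((t % 1.0) < 0.3, 80.0 * np.sin(np.pi * (t % 1.0) / 0.3), 0.0)

    P = np.empty_like(t)
    P[0] = 60.0
    for i in range(1, len(t)):            # forward-Euler integration
        dPdt = (Q[i - 1] - P[i - 1] / R) / C
        P[i] = P[i - 1] + dt * dPdt

    print(f"mean aortic pressure ~ {P[int(5 / dt):].mean():.1f} mmHg")
    ```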

  4. Modeling of turbulence and transition

    NASA Technical Reports Server (NTRS)

    Shih, Tsan-Hsing

    1992-01-01

    The first objective is to evaluate current two-equation and second order closure turbulence models using available direct numerical simulations and experiments, and to identify the models which represent the state of the art in turbulence modeling. The second objective is to study the near-wall behavior of turbulence, and to develop reliable models for an engineering calculation of turbulence and transition. The third objective is to develop a two-scale model for compressible turbulence.

  5. Reliable low precision simulations in land surface models

    NASA Astrophysics Data System (ADS)

    Dawson, Andrew; Düben, Peter D.; MacLeod, David A.; Palmer, Tim N.

    2017-12-01

    Weather and climate models must continue to increase in both resolution and complexity in order that forecasts become more accurate and reliable. Moving to lower numerical precision may be an essential tool for coping with the demand for ever increasing model complexity in addition to increasing computing resources. However, there have been some concerns in the weather and climate modelling community over the suitability of lower precision for climate models, particularly for representing processes that change very slowly over long time-scales. These processes are difficult to represent using low precision due to time increments being systematically rounded to zero. Idealised simulations are used to demonstrate that a model of deep soil heat diffusion that fails when run in single precision can be modified to work correctly using low precision, by splitting up the model into a small higher precision part and a low precision part. This strategy retains the computational benefits of reduced precision whilst preserving accuracy. This same technique is also applied to a full complexity land surface model, resulting in rounding errors that are significantly smaller than initial condition and parameter uncertainties. Although lower precision will present some problems for the weather and climate modelling community, many of the problems can likely be overcome using a straightforward and physically motivated application of reduced precision.
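
    A minimal sketch of the failure mode and the splitting remedy, using float16 in place of the paper's reduced-precision emulation; all values are illustrative:

    ```python
    import numpy as np

    # A deep-soil temperature state receives a tiny heat-flux increment each step.
    # In float16 the increment is rounded away entirely once the state is large;
    # keeping a small float64 part alongside a float16 anomaly preserves it.
    dT, steps = 1e-4, 10_000          # increment (K) and number of time steps

    t16 = np.float16(280.0)
    for _ in range(steps):
        t16 = np.float16(t16 + np.float16(dT))  # 280 + 1e-4 rounds back to 280

    base = 280.0                       # high-precision (float64) part
    anom = np.float16(0.0)             # low-precision anomaly does the arithmetic
    for _ in range(steps):
        anom = np.float16(anom + np.float16(dT))
        if abs(anom) >= 0.05:          # periodically fold anomaly into the base
            base += float(anom)
            anom = np.float16(0.0)

    print(float(t16), base + float(anom))  # ~280.0 (wrong) vs ~281.0 (right)
    ```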

  6. APPLICATION OF TRAVEL TIME RELIABILITY FOR PERFORMANCE ORIENTED OPERATIONAL PLANNING OF EXPRESSWAYS

    NASA Astrophysics Data System (ADS)

    Mehran, Babak; Nakamura, Hideki

    Evaluation of the impacts of congestion improvement schemes on travel time reliability is very significant for road authorities, since travel time reliability represents the operational performance of expressway segments. In this paper, a methodology is presented to estimate travel time reliability prior to implementation of congestion relief schemes, based on travel time variation modeling as a function of demand, capacity, weather conditions and road accidents. For subject expressway segments, traffic conditions are modeled over a whole year considering demand and capacity as random variables. Patterns of demand and capacity are generated for each five-minute interval by applying the Monte-Carlo simulation technique, and accidents are randomly generated based on a model that links accident rate to traffic conditions. A whole-year analysis is performed by comparing demand and available capacity for each scenario, and queue length is estimated through shockwave analysis for each time interval. Travel times are estimated from refined speed-flow relationships developed for intercity expressways, and the buffer time index is estimated consequently as a measure of travel time reliability. For validation, estimated reliability indices are compared with measured values from empirical data, and it is shown that the proposed method is suitable for operational evaluation and planning purposes.
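
    A minimal sketch of the Monte Carlo reliability estimate, with assumed demand and capacity distributions and a toy speed-flow curve in place of the paper's refined relationships:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 365 * 288                     # five-minute intervals in one year

    demand   = rng.normal(1500, 300, N).clip(min=0.0)   # veh/h, assumed pattern
    capacity = rng.normal(1800, 150, N).clip(min=1.0)   # veh/h; weather and accidents would lower this

    v_free, length_km = 100.0, 10.0
    x = np.minimum(demand / capacity, 1.5)
    speed = v_free / (1.0 + x ** 4)                     # toy speed-flow curve
    tt = length_km / speed * 60.0                       # travel time in minutes

    bti = (np.percentile(tt, 95) - tt.mean()) / tt.mean()  # buffer time index
    print(f"buffer time index = {bti:.2f}")
    ```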

  7. Interaction and Impact Studies for Distributed Energy Resource, Transactive Energy, and Electric Grid, using High Performance Computing-based Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, B. M.

    The electric utility industry is undergoing significant transformations in its operation model, including a greater emphasis on automation, monitoring technologies, and distributed energy resource management systems (DERMS). While these changes and new technologies drive greater efficiency and reliability, the new models may also introduce new vectors of cyber attack. The appropriate cybersecurity controls to address and mitigate these newly introduced attack vectors and potential vulnerabilities are still widely unknown, and the performance of the controls is difficult to vet. This proposal argues that modeling and simulation (M&S) is a necessary tool to address and better understand the problems introduced by emerging technologies for the grid. M&S will provide electric utilities a platform to model their transmission and distribution systems and run various simulations against the model to better understand the operational impact and performance of cybersecurity controls.

  8. The Prime Diabetes Model: Novel Methods for Estimating Long-Term Clinical and Cost Outcomes in Type 1 Diabetes Mellitus.

    PubMed

    Valentine, William J; Pollock, Richard F; Saunders, Rhodri; Bae, Jay; Norrbacka, Kirsi; Boye, Kristina

    Recent publications describing long-term follow-up from landmark trials and diabetes registries represent an opportunity to revisit modeling options in type 1 diabetes mellitus (T1DM). To develop a new product-independent model capable of predicting long-term clinical and cost outcomes. After a systematic literature review to identify clinical trial and registry data, a model was developed (the PRIME Diabetes Model) to simulate T1DM progression and complication onset. The model runs as a patient-level simulation, making use of covariance matrices for cohort generation and risk factor progression, and simulating myocardial infarction, stroke, angina, heart failure, nephropathy, retinopathy, macular edema, neuropathy, amputation, hypoglycemia, ketoacidosis, mortality, and risk factor evolution. Several approaches novel to T1DM modeling were used, including patient characteristics and risk factor covariance, a glycated hemoglobin progression model derived from patient-level data, and model averaging approaches to evaluate complication risk. Validation analyses comparing modeled outcomes with published studies demonstrated that the PRIME Diabetes Model projects long-term patient outcomes consistent with those reported for a number of long-term studies. Macrovascular end points were reliably reproduced across five different populations and microvascular complication risk was accurately predicted on the basis of comparisons with landmark studies and published registry data. The PRIME Diabetes Model is product-independent, available online, and has been developed in line with good practice guidelines. Validation has indicated that outcomes from long-term studies can be reliably reproduced. The model offers new approaches to long-standing challenges in diabetes modeling and may become a valuable tool for informing health care policy.
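
    A minimal sketch of covariance-based cohort generation of the kind the model describes, with illustrative (not PRIME-derived) means, standard deviations, and correlations:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical baseline characteristics: age (years), HbA1c (%), SBP (mmHg)
    mean = np.array([28.0, 8.1, 122.0])
    sd   = np.array([7.0, 1.3, 11.0])
    corr = np.array([[ 1.0, -0.2, 0.3],
                     [-0.2,  1.0, 0.1],
                     [ 0.3,  0.1, 1.0]])
    cov = corr * np.outer(sd, sd)          # covariance from correlations and SDs

    cohort = rng.multivariate_normal(mean, cov, size=10_000)
    print(np.corrcoef(cohort.T).round(2))  # recovered correlations
    ```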

  9. The fourth radiation transfer model intercomparison (RAMI-IV): Proficiency testing of canopy reflectance models with ISO-13528

    NASA Astrophysics Data System (ADS)

    Widlowski, J.-L.; Pinty, B.; Lopatka, M.; Atzberger, C.; Buzica, D.; Chelle, M.; Disney, M.; Gastellu-Etchegorry, J.-P.; Gerboles, M.; Gobron, N.; Grau, E.; Huang, H.; Kallel, A.; Kobayashi, H.; Lewis, P. E.; Qin, W.; Schlerf, M.; Stuckens, J.; Xie, D.

    2013-07-01

    The radiation transfer model intercomparison (RAMI) activity aims at assessing the reliability of physics-based radiative transfer (RT) models under controlled experimental conditions. RAMI focuses on computer simulation models that mimic the interactions of radiation with plant canopies. These models are increasingly used in the development of satellite retrieval algorithms for terrestrial essential climate variables (ECVs). Rather than applying ad hoc performance metrics, RAMI-IV makes use of existing ISO standards to enhance the rigor of its protocols evaluating the quality of RT models. ISO-13528 was developed "to determine the performance of individual laboratories for specific tests or measurements." More specifically, it aims to guarantee that measurement results fall within specified tolerance criteria from a known reference. Of particular interest to RAMI is that ISO-13528 provides guidelines for comparisons where the true value of the target quantity is unknown. In those cases, "truth" must be replaced by a reliable "conventional reference value" to enable absolute performance tests. This contribution will show, for the first time, how the ISO-13528 standard developed by the chemical and physical measurement communities can be applied to proficiency testing of computer simulation models. Step by step, the pre-screening of data, the identification of reference solutions, and the choice of proficiency statistics will be discussed and illustrated with simulation results from the RAMI-IV "abstract canopy" scenarios. Detailed performance statistics of the participating RT models will be provided and the role of the accuracy of the reference solutions as well as the choice of the tolerance criteria will be highlighted.
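
    A minimal sketch of an ISO-13528-style proficiency statistic, assuming hypothetical model outputs (e.g., bidirectional reflectance factors), a robust reference value, and an assumed tolerance:

    ```python
    import numpy as np

    # ISO-13528-style z-score: z = (x - X) / sigma_hat, where X is the
    # conventional reference value (here a robust summary of credible models)
    # and sigma_hat the standard deviation for proficiency assessment.
    model_brf = np.array([0.231, 0.228, 0.240, 0.233, 0.229])  # hypothetical outputs
    X = np.median(model_brf)      # robust conventional reference value
    sigma_hat = 0.003             # assumed tolerance criterion
    z = (model_brf - X) / sigma_hat
    print(np.round(z, 2))         # |z| <= 2 satisfactory, |z| >= 3 action signal
    ```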

  10. Simulation Modeling of the C-5 Galaxy High Velocity Regionalized Isochronal (HVRISO) Inspection Concept

    DTIC Science & Technology

    2009-03-01

    [Regression-diagnostic output residue (Breusch-Pagan test on a log-queue response) omitted.] ... aircraft cannot be immediately inducted into the servicing inspection dock. This study uses discrete-event simulation techniques to test the ... for a 10 percent boost in reliability (Hebert, 2007). With 2 C-5Bs and 1 C-5A retrofitted with RERP for test and evaluation purposes, Air Force

  11. Introducing ab initio based neural networks for transition-rate prediction in kinetic Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Messina, Luca; Castin, Nicolas; Domain, Christophe; Olsson, Pär

    2017-02-01

    The quality of kinetic Monte Carlo (KMC) simulations of microstructure evolution in alloys relies on the parametrization of point-defect migration rates, which are complex functions of the local chemical composition and can be calculated accurately with ab initio methods. However, constructing reliable models that ensure the best possible transfer of physical information from ab initio to KMC is a challenging task. This work presents an innovative approach, where the transition rates are predicted by artificial neural networks trained on a database of 2000 migration barriers, obtained with density functional theory (DFT) in place of interatomic potentials. The method is tested on copper precipitation in thermally aged iron alloys, by means of a hybrid atomistic-object KMC model. For the object part of the model, the stability and mobility properties of copper-vacancy clusters are analyzed by means of independent atomistic KMC simulations, driven by the same neural networks. The cluster diffusion coefficients and mean free paths are found to increase with size, confirming the dominant role of coarsening of medium- and large-sized clusters in the precipitation kinetics. The evolution under thermal aging is in better agreement with experiments with respect to a previous interatomic-potential model, especially concerning the experiment time scales. However, the model underestimates the solubility of copper in iron due to the excessively high solution energy predicted by the chosen DFT method. Nevertheless, this work proves the capability of neural networks to transfer complex ab initio physical properties to higher-scale models, and facilitates the extension to systems with increasing chemical complexity, setting the ground for reliable microstructure evolution simulations in a wide range of alloys and applications.
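
    A minimal sketch of the barrier-to-rate pipeline, with a toy synthetic training set in place of the 2000 DFT migration barriers and an assumed attempt frequency:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(3)
    kB, T = 8.617e-5, 563.0  # Boltzmann constant (eV/K), aging temperature (assumed)

    # Hypothetical training set: local-environment features -> migration barrier (eV)
    X = rng.random((2000, 10))  # e.g., encoded occupations of neighbour shells
    E = 0.6 + 0.3 * X[:, 0] - 0.2 * X[:, 1] + 0.05 * rng.standard_normal(2000)

    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000).fit(X, E)

    def kmc_rate(features, nu0=6e12):
        """Arrhenius rate from the NN-predicted barrier (assumed TST prefactor nu0)."""
        barrier = net.predict(features.reshape(1, -1))[0]
        return nu0 * np.exp(-barrier / (kB * T))

    print(f"{kmc_rate(rng.random(10)):.3e} s^-1")
    ```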

  12. Integrating O/S models during conceptual design, part 3

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1994-01-01

    Space vehicles, such as the Space Shuttle, require intensive ground support prior to, during, and after each mission. Maintenance is a significant part of that ground support. All space vehicles require scheduled maintenance to ensure operability and performance. In addition, components of any vehicle are not one-hundred percent reliable so they exhibit random failures. Once detected, a failure initiates unscheduled maintenance on the vehicle. Maintenance decreases the number of missions which can be completed by keeping vehicles out of service so that the time between the completion of one mission and the start of the next is increased. Maintenance also requires resources such as people, facilities, tooling, and spare parts. Assessing the mission capability and resource requirements of any new space vehicle, in addition to performance specification, is necessary to predict the life cycle cost and success of the vehicle. Maintenance and logistics support has been modeled by computer simulation to estimate mission capability and resource requirements for evaluation of proposed space vehicles. The simulation was written with Simulation Language for Alternative Modeling II (SLAM II) for execution on a personal computer. For either one or a fleet of space vehicles, the model simulates the preflight maintenance checks, the mission and return to earth, and the post flight maintenance in preparation to be sent back into space. The model enables prediction of the number of missions possible and vehicle turn-time (the time between completion of one mission and the start of the next) given estimated values for component reliability and maintainability. The model also facilitates study of the manpower and vehicle requirements for the proposed vehicle to meet its desired mission rate. This is the third part of a three-part technical report.

  13. Short-term prediction of solar energy in Saudi Arabia using automated-design fuzzy logic systems

    PubMed Central

    2017-01-01

    Solar energy is considered as one of the main sources for renewable energy in the near future. However, solar energy and other renewable energy sources have a drawback related to the difficulty in predicting their availability in the near future. This problem affects optimal exploitation of solar energy, especially in connection with other resources. Therefore, reliable solar energy prediction models are essential to solar energy management and economics. This paper presents work aimed at designing reliable models to predict the global horizontal irradiance (GHI) for the next day at 8 stations in Saudi Arabia. The designed models are based on computational intelligence methods of automated-design fuzzy logic systems. The fuzzy logic systems are designed and optimized with two models using fuzzy c-means clustering (FCM) and simulated annealing (SA) algorithms. The first model uses FCM based on the subtractive clustering algorithm to automatically design the predictor fuzzy rules from data. The second model uses FCM followed by the simulated annealing algorithm to enhance the prediction accuracy of the fuzzy logic system. The objective of the predictor is to accurately predict next-day global horizontal irradiance (GHI) using previous-day meteorological and solar radiation observations. The proposed models use observations of 10 variables of measured meteorological and solar radiation data to build the model. The experimentation and results of the prediction are detailed; the root mean square error of the prediction was approximately 88% for the second model tuned by simulated annealing compared to 79.75% accuracy using the first model. These results demonstrate good modeling accuracy for the second model, even though the training and testing of the proposed models were carried out using spatially and temporally independent data. PMID:28806754

  14. Short-term prediction of solar energy in Saudi Arabia using automated-design fuzzy logic systems.

    PubMed

    Almaraashi, Majid

    2017-01-01

    Solar energy is considered as one of the main sources for renewable energy in the near future. However, solar energy and other renewable energy sources have a drawback related to the difficulty in predicting their availability in the near future. This problem affects optimal exploitation of solar energy, especially in connection with other resources. Therefore, reliable solar energy prediction models are essential to solar energy management and economics. This paper presents work aimed at designing reliable models to predict the global horizontal irradiance (GHI) for the next day at 8 stations in Saudi Arabia. The designed models are based on computational intelligence methods of automated-design fuzzy logic systems. The fuzzy logic systems are designed and optimized with two models using fuzzy c-means clustering (FCM) and simulated annealing (SA) algorithms. The first model uses FCM based on the subtractive clustering algorithm to automatically design the predictor fuzzy rules from data. The second model uses FCM followed by the simulated annealing algorithm to enhance the prediction accuracy of the fuzzy logic system. The objective of the predictor is to accurately predict next-day global horizontal irradiance (GHI) using previous-day meteorological and solar radiation observations. The proposed models use observations of 10 variables of measured meteorological and solar radiation data to build the model. The experimentation and results of the prediction are detailed; the root mean square error of the prediction was approximately 88% for the second model tuned by simulated annealing compared to 79.75% accuracy using the first model. These results demonstrate good modeling accuracy for the second model, even though the training and testing of the proposed models were carried out using spatially and temporally independent data.
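
    The annealing step can be illustrated generically. The sketch below tunes the parameters of a toy linear predictor standing in for the fuzzy system; the cost function, cooling schedule, and step size are all assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def cost(params, X, y):
        """RMSE of a toy linear predictor standing in for the fuzzy system."""
        return np.sqrt(np.mean((X @ params - y) ** 2))

    def simulated_annealing(X, y, n_params, T0=1.0, alpha=0.995, steps=5000):
        p = rng.standard_normal(n_params)
        best_p, best_c = p.copy(), cost(p, X, y)
        T = T0
        for _ in range(steps):
            cand = p + rng.normal(0.0, 0.1, n_params)     # local perturbation
            dc = cost(cand, X, y) - cost(p, X, y)
            if dc < 0 or rng.random() < np.exp(-dc / T):  # Metropolis acceptance
                p = cand
                c = cost(p, X, y)
                if c < best_c:
                    best_p, best_c = p.copy(), c
            T *= alpha                                    # geometric cooling
        return best_p, best_c

    X = rng.random((200, 4))
    y = X @ np.array([1.0, -2.0, 0.5, 3.0])
    params, rmse = simulated_annealing(X, y, n_params=4)
    print(f"tuned RMSE = {rmse:.4f}")
    ```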

  15. Reliability of numerical wind tunnels for VAWT simulation

    NASA Astrophysics Data System (ADS)

    Raciti Castelli, M.; Masi, M.; Battisti, L.; Benini, E.; Brighenti, A.; Dossena, V.; Persico, G.

    2016-09-01

    Computational Fluid Dynamics (CFD) simulations based on the Unsteady Reynolds Averaged Navier Stokes (URANS) equations have long been widely used to study vertical axis wind turbines (VAWTs). Following a comprehensive experimental survey of the wakes downwind of a troposkien-shaped rotor, a campaign of two-dimensional simulations is presented here, with the aim of assessing their reliability in reproducing the main features of the flow, and of identifying areas needing additional research. Starting from a well-consolidated turbulence model (k-ω SST) and an unstructured grid typology, the main simulation settings are manipulated in a convenient form to tackle rotating grids reproducing a VAWT operating in an open jet wind tunnel. The dependence of the numerical predictions on the selected grid spacing is investigated, thus establishing the least refined grid size that is still capable of capturing relevant flow features, both integral quantities (rotor torque) and local ones (wake velocities).

  16. Automated chemical kinetic modeling via hybrid reactive molecular dynamics and quantum chemistry simulations.

    PubMed

    Döntgen, Malte; Schmalz, Felix; Kopp, Wassja A; Kröger, Leif C; Leonhard, Kai

    2018-06-13

    An automated scheme for obtaining chemical kinetic models from scratch using reactive molecular dynamics and quantum chemistry simulations is presented. This methodology combines the phase space sampling of reactive molecular dynamics with the thermochemistry and kinetics prediction capabilities of quantum mechanics. This scheme provides the NASA polynomial and modified Arrhenius equation parameters for all species and reactions that are observed during the simulation and supplies them in the ChemKin format. The ab initio level of theory for predictions is easily exchangeable and the presently used G3MP2 level of theory is found to reliably reproduce hydrogen and methane oxidation thermochemistry and kinetics data. Chemical kinetic models obtained with this approach are ready-to-use for, e.g., ignition delay time simulations, as shown for hydrogen combustion. The presented extension of the ChemTraYzer approach can be used as a basis for methodologically advancing chemical kinetic modeling schemes and as a black-box approach to generate chemical kinetic models.
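
    The ChemKin-format rate parameters mentioned above enter simulations through the modified Arrhenius form; a minimal sketch with illustrative (not paper-derived) parameters:

    ```python
    import numpy as np

    R = 8.314  # J/(mol K)

    def modified_arrhenius(T, A, n, Ea):
        """k(T) = A * T^n * exp(-Ea / (R T)), the ChemKin rate form."""
        return A * T**n * np.exp(-Ea / (R * T))

    # Hypothetical parameters for an H-abstraction step (illustrative only)
    A, n, Ea = 1.2e8, 1.9, 26_000.0  # units must be consistent with the mechanism
    for T in (800, 1200, 1600):
        print(T, f"{modified_arrhenius(T, A, n, Ea):.3e}")
    ```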

  17. Computer-assisted midface reconstruction in Treacher Collins syndrome part 1: skeletal reconstruction.

    PubMed

    Herlin, Christian; Doucet, Jean Charles; Bigorre, Michèle; Khelifa, Hatem Cheikh; Captier, Guillaume

    2013-10-01

    Treacher Collins syndrome (TCS) is a severe and complex craniofacial malformation affecting the facial skeleton and soft tissues. The palate as well as the external and middle ear are also affected, but the prognosis is mainly related to neonatal airway management. Methods of zygomatico-orbital reconstruction are numerous and currently use primarily autologous bone, lyophilized cartilage, alloplastic implants, or even free flaps. This work developed a reliable "customized" method of zygomatico-orbital bony reconstruction using a generic reference model tailored to each patient. From a standard computed tomography (CT) acquisition, we studied qualitatively and quantitatively the skeletons of four individuals with TCS whose ages were between 6 and 20 years. In parallel, we studied 40 controls at the same ages to obtain a morphometric reference database. Surgical simulation was carried out using validated software used in craniofacial surgery. The zygomatic hypoplasia was very important quantitatively and morphologically in all TCS individuals. Orbital involvement was mainly morphological, with volumes comparable to those of controls of the same age. The control database was used to create three-dimensional computer models to be used in the manufacture of cutting guides for autologous cranial bone grafts or alloplastic implants perfectly adapted to each patient's morphology. Presurgical simulation was also used to fabricate custom positioning guides permitting a simple and reliable surgical procedure. The use of a virtual database allowed us to design a reliable and reproducible skeletal reconstruction method for this rare and complex syndrome. The use of presurgical simulation tools seems essential in this type of craniofacial malformation to increase the reliability of these uncommon and complex surgical procedures, and to ensure stable results over time.

  18. Improved Rubin-Bodner Model for the Prediction of Soft Tissue Deformations

    PubMed Central

    Zhang, Guangming; Xia, James J.; Liebschner, Michael; Zhang, Xiaoyan; Kim, Daeseung; Zhou, Xiaobo

    2016-01-01

    In craniomaxillofacial (CMF) surgery, a reliable way of simulating the soft tissue deformation resulting from skeletal reconstruction is vitally important for preventing the risk of facial distortion postoperatively. However, it is difficult to simulate the soft tissue behaviors affected by different types of CMF surgery. This study presents an integrated biomechanical and statistical learning model to improve the accuracy and reliability of predictions of soft facial tissue behavior. The Rubin-Bodner (RB) model is initially used to describe the biomechanical behavior of the soft facial tissue. Subsequently, a finite element model (FEM) computes the stress at each node of the soft facial tissue mesh resulting from bone displacement. Next, the Generalized Regression Neural Network (GRNN) method is implemented to obtain the relationship between the facial soft tissue deformation and the stress distribution corresponding to different CMF surgical types, and to improve the evaluation of the elastic parameters included in the RB model. The soft facial tissue deformation can therefore be predicted from biomechanical properties and the statistical model. Leave-one-out cross-validation is used on eleven patients. As a result, the average prediction error of our model (0.7035 mm) is lower than those resulting from other approaches. It also demonstrates that the more accurate biomechanical information the model has, the better prediction performance it can achieve. PMID:27717593

  19. Engineering Risk Assessment of Space Thruster Challenge Problem

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Mattenberger, Christopher J.; Go, Susie

    2014-01-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center utilizes dynamic models with linked physics-of-failure analyses to produce quantitative risk assessments of space exploration missions. This paper applies the ERA approach to the baseline and extended versions of the PSAM Space Thruster Challenge Problem, which investigates mission risk for a deep space ion propulsion system with time-varying thruster requirements and operations schedules. The dynamic mission is modeled using a combination of discrete and continuous-time reliability elements within the commercially available GoldSim software. Loss-of-mission (LOM) probability results are generated via Monte Carlo sampling performed by the integrated model. Model convergence studies are presented to illustrate the sensitivity of integrated LOM results to the number of Monte Carlo trials. A deterministic risk model was also built for the three baseline and extended missions using the Ames Reliability Tool (ART), and results are compared to the simulation results to evaluate the relative importance of mission dynamics. The ART model did a reasonable job of matching the simulation models for the baseline case, while a hybrid approach using offline dynamic models was required for the extended missions. This study highlighted that state-of-the-art techniques can adequately adapt to a range of dynamic problems.

  20. Reliability of emerging bonded interface materials for large-area attachments

    DOE PAGES

    Paret, Paul P.; DeVoto, Douglas J.; Narumanchi, Sreekant

    2015-12-30

    In this study, conventional thermal interface materials (TIMs), such as greases, gels, and phase change materials, pose bottlenecks to heat removal and have long caused reliability issues in automotive power electronics packages. Bonded interface materials (BIMs) with superior thermal performance have the potential to replace conventional TIMs. However, due to coefficient of thermal expansion mismatches between different components in a package and the resultant thermomechanical stresses, fractures or delamination could occur, causing serious reliability concerns. These defects manifest themselves in increased thermal resistance in the package. In this paper, the results of a reliability evaluation of emerging BIMs for large-area attachments in power electronics packaging are reported. Thermoplastic (polyamide) adhesive with embedded near-vertical-aligned carbon fibers, sintered silver, and conventional lead solder (Sn63Pb37) materials were bonded between 50.8 mm x 50.8 mm cross-sectional footprint silicon nitride substrates and copper base plate samples, and were subjected to accelerated thermal cycling until failure or 2500 cycles. Damage in the BIMs was monitored every 100 cycles by scanning acoustic microscopy. Thermoplastic with embedded carbon fibers performed the best with no defects, whereas sintered silver and lead solder failed at 2300 and 1400 thermal cycles, respectively. Besides thermal cycling, additional lead solder samples were subjected to thermal shock and thermal cycling with extended dwell periods. A finite element method (FEM)-based model was developed to simulate the behavior of lead solder under thermomechanical loading. Strain energy density per cycle results were calculated from the FEM simulations. A predictive lifetime model was formulated for lead solder by correlating strain energy density results extracted from modeling with cycles-to-failure obtained from experimental accelerated tests. A power-law-based approach was used to formulate the predictive lifetime model.
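
    A minimal sketch of fitting such a power-law lifetime model, with hypothetical strain-energy-density and cycles-to-failure values in place of the paper's FEM and test data:

    ```python
    import numpy as np

    # Coffin-Manson-style power law: N_f = C * (dW)^(-n), correlating cycles to
    # failure with strain energy density per cycle dW from FEM; C and n are fit.
    dW = np.array([0.12, 0.21, 0.35, 0.50])  # MPa per cycle, hypothetical FEM results
    Nf = np.array([4200, 2100, 1050, 640])   # cycles to failure, hypothetical tests

    slope, logC = np.polyfit(np.log(dW), np.log(Nf), 1)  # log-log linear fit
    C = np.exp(logC)
    print(f"N_f ~ {C:.0f} * dW^{slope:.2f}")  # exponent should come out negative
    ```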

  1. PyNN: A Common Interface for Neuronal Network Simulators.

    PubMed

    Davison, Andrew P; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN.
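
    As a concrete illustration of the interface, a minimal sketch in the style of the PyNN 0.9-era API; the backend, cell model, and parameter values here are illustrative, and the exact signatures should be checked against the documentation of the installed version:

    ```python
    import pyNN.nest as sim   # swap for pyNN.neuron, etc., without changing the script

    sim.setup(timestep=0.1)   # ms

    pop = sim.Population(100, sim.IF_cond_exp(tau_m=20.0), label="excitatory")
    noise = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0))

    sim.Projection(noise, pop, sim.OneToOneConnector(),
                   sim.StaticSynapse(weight=0.01, delay=1.0))

    pop.record("spikes")
    sim.run(1000.0)           # ms

    data = pop.get_data().segments[0]
    print(len(data.spiketrains), "spike trains recorded")
    sim.end()
    ```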

  2. PyNN: A Common Interface for Neuronal Network Simulators

    PubMed Central

    Davison, Andrew P.; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN. PMID:19194529

  3. A Dynamic Multi-Projection-Contour Approximating Framework for the 3D Reconstruction of Buildings by Super-Generalized Optical Stereo-Pairs.

    PubMed

    Yan, Yiming; Su, Nan; Zhao, Chunhui; Wang, Liguo

    2017-09-19

    In this paper, a novel framework for the 3D reconstruction of buildings is proposed, focusing on remote sensing super-generalized stereo-pairs (SGSPs). 3D reconstruction cannot be performed well using nonstandard stereo pairs, since reliable stereo matching cannot be achieved when the image pairs are collected at greatly differing views; dense 3D points then cannot be obtained for building regions, and further 3D shape reconstruction fails. We define SGSPs as two or more optical images collected in less constrained views but covering the same buildings. It is even more difficult to reconstruct the 3D shape of a building from SGSPs using traditional frameworks. As a result, a dynamic multi-projection-contour approximating (DMPCA) framework was introduced for SGSP-based 3D reconstruction. The key idea is an optimization that finds a group of parameters of a simulated 3D model, using a binary feature-image, that minimizes the total differences between the projection-contours of the building in the SGSPs and those in the simulated 3D model. The simulated 3D model, defined by this group of parameters, can then approximate the actual 3D shape of the building. Certain parameterized 3D basic-unit-models of typical buildings were designed, and a simulated projection system was established to obtain simulated projection-contours in different views. Moreover, the artificial bee colony algorithm was employed to solve the optimization. With SGSPs collected by satellite and by our unmanned aerial vehicle, the DMPCA framework was verified by a group of experiments, which demonstrated the reliability and advantages of this work.

  4. GPU based 3D feature profile simulation of high-aspect ratio contact hole etch process under fluorocarbon plasmas

    NASA Astrophysics Data System (ADS)

    Chun, Poo-Reum; Lee, Se-Ah; Yook, Yeong-Geun; Choi, Kwang-Sung; Cho, Deog-Geun; Yu, Dong-Hun; Chang, Won-Seok; Kwon, Deuk-Chul; Im, Yeon-Ho

    2013-09-01

    Although plasma etch profile simulation has attracted much interest for developing reliable plasma etching, large gaps remain between the current state of research and predictive modeling due to the inherent complexity of plasma processes. As an effort to address this issue, we present a 3D feature profile simulation coupled with a well-defined plasma-surface kinetic model for the silicon dioxide etching process under fluorocarbon plasmas. To capture realistic plasma surface reaction behaviors, a polymer-layer-based surface kinetic model was proposed to consider simultaneous polymer deposition and oxide etching. The realistic plasma surface model was then used to calculate the speed function for the 3D topology simulation, which consists of a multiple-level-set-based moving algorithm and a ballistic transport module. In addition, the time-consuming computations in the ballistic transport calculation were accelerated drastically by GPU-based numerical computation, enabling real-time computation. Finally, we demonstrated that the surface kinetic model could be coupled successfully for 3D etch profile simulations of high-aspect-ratio contact hole plasma etching.

  5. Simulation of Range Safety for the NASA Space Shuttle

    NASA Technical Reports Server (NTRS)

    Rabelo, Luis; Sepulveda, Jose; Compton, Jeppie; Turner, Robert

    2005-01-01

    This paper describes a simulation environment that seamlessly combines a number of safety and environmental models for the launch phase of a NASA Space Shuttle mission. The components of this simulation environment represent the different systems that must interact in order to determine the expectation of casualties (E_c) resulting from the toxic effects of the gas dispersion that occurs after a disaster affecting a Space Shuttle within 120 seconds of lift-off. The Space Shuttle reliability models, trajectory models, weather dissemination systems, population models, amount and type of toxicants, gas dispersion models, human response functions to toxicants, and a geographical information system are all integrated to create this environment. This simulation environment can help safety managers estimate the population at risk in order to plan evacuation, make sheltering decisions, determine the resources required to provide aid and comfort, and mitigate damages in case of a disaster. This simulation environment may also be modified and used for the landing phase of a space vehicle, but that is not discussed in this paper.

  6. Development and analysis of a finite element model to simulate pulmonary emphysema in CT imaging.

    PubMed

    Diciotti, Stefano; Nobis, Alessandro; Ciulli, Stefano; Landini, Nicholas; Mascalchi, Mario; Sverzellati, Nicola; Innocenti, Bernardo

    2015-01-01

    In CT imaging, pulmonary emphysema appears as lung regions with low-attenuation areas (LAA). In this study we propose a finite element (FE) model of lung parenchyma, based on a 2-D grid of beam elements, which simulates smoking-related pulmonary emphysema in CT imaging. Simulated LAA images were generated through spatial sampling of the model output. We employed two measurements of emphysema extent: the relative area (RA) and the exponent D of the cumulative distribution function of LAA cluster sizes. The model was used to compare RA and D computed on the simulated LAA images with those computed on the model's output. Different mesh element sizes and various model parameters, simulating different physiological and pathological conditions, were considered and analyzed. A proper mesh element size was determined as the best trade-off between reliable results and reasonable computational cost. Both RA and D computed on simulated LAA images were underestimated with respect to those calculated on the model's output. The underestimation was larger for RA (≈ −44% to −26%) than for D (≈ −16% to −2%). Our FE model could be useful for generating standard test images and for designing realistic physical phantoms of LAA images to assess the accuracy of descriptors for quantifying emphysema in CT imaging.
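
    A minimal sketch of the two emphysema measurements on a 2-D image, assuming the common −950 HU threshold (the study works on simulated LAA images rather than HU maps, and this sketch also assumes at least two distinct cluster sizes for the fit):

    ```python
    import numpy as np
    from scipy import ndimage

    def emphysema_indices(hu_image, threshold=-950.0):
        """RA%: share of pixels below threshold; D: exponent of the cumulative
        LAA cluster-size distribution Y(x) ~ x^(-D), fitted on a log-log scale."""
        laa = hu_image < threshold
        ra = 100.0 * laa.mean()

        labels, n = ndimage.label(laa)
        sizes = ndimage.sum(laa, labels, np.arange(1, n + 1))
        x = np.unique(sizes)
        y = np.array([(sizes >= s).sum() for s in x])  # clusters at least this big
        d = -np.polyfit(np.log(x), np.log(y), 1)[0]
        return ra, d

    # Toy image: random noise thresholded into sparse low-attenuation clusters
    img = np.random.default_rng(2).normal(-850.0, 60.0, (256, 256))
    print(emphysema_indices(img))
    ```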

  7. Modeling of the thermal physical process and study on the reliability of linear energy density for selective laser melting

    NASA Astrophysics Data System (ADS)

    Xiang, Zhaowei; Yin, Ming; Dong, Guanhua; Mei, Xiaoqin; Yin, Guofu

    2018-06-01

    A finite element model considering volume shrinkage and the powder-to-dense transition of the powder layer in selective laser melting (SLM) is established. A comparison between models that do and do not consider volume shrinkage or the powder-to-dense transition is carried out. Further, a parametric analysis of laser power and scan speed is conducted and the reliability of linear energy density as a design parameter is investigated. The results show that the established model is effective and more accurately predicts the temperature distribution and the length and depth of the molten pool. The maximum temperature is more sensitive to laser power than to scan speed. The maximum heating and cooling rates increase with increasing scan speed at constant laser power, and with increasing laser power at constant scan speed. The simulation and experimental results reveal that linear energy density is not always reliable as a design parameter in SLM.
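
    Linear energy density, the design parameter whose reliability is questioned here, is simply laser power divided by scan speed. The toy comparison below (illustrative values) shows two parameter sets that share the same LED yet correspond to very different power/speed combinations, which is exactly why LED alone can mislead.

    ```python
    # Linear energy density in SLM: LED = P / v  (J/mm)
    cases = [(100.0, 500.0), (200.0, 1000.0)]   # (laser power W, scan speed mm/s)
    for P, v in cases:
        print(f"P = {P:5.0f} W, v = {v:6.0f} mm/s  ->  LED = {P / v:.2f} J/mm")
    ```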

  8. Numerical modeling of local scour around hydraulic structure in sandy beds by dynamic mesh method

    NASA Astrophysics Data System (ADS)

    Fan, Fei; Liang, Bingchen; Bai, Yuchuan; Zhu, Zhixia; Zhu, Yanjun

    2017-10-01

    Local scour, a non-negligible factor in hydraulic engineering, endangers the safety of hydraulic structures. In this work, a numerical model for simulating local scour was constructed, based on the open-source computational fluid dynamics code OpenFOAM. We consider both bedload and suspended-load sediment transport in the scour model and adopt the dynamic mesh method to simulate the evolution of the bed elevation. We use the finite area method to project data between the three-dimensional flow model and the two-dimensional (2D) scour model. We also improved the 2D sand slide method and added it to the scour model to correct the bed bathymetry when the bed slope angle exceeds the angle of repose. Moreover, to validate the scour model, we conducted three experiments and compared their results with those of the developed model. The validation results show that our developed model can reliably simulate local scour.
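
    The sand-slide step is not detailed in the abstract; a minimal 1D analogue, assuming a scheme that relaxes any slope steeper than the angle of repose while conserving sediment volume, could read:

    ```python
    import numpy as np

    def sand_slide(z, dx, phi_repose=np.deg2rad(32.0), relax=0.5, n_iter=50):
        """Where the local bed slope exceeds the angle of repose, move
        sediment downslope until the slope is admissible (mass-conserving)."""
        z = z.copy()
        max_dz = dx * np.tan(phi_repose)
        for _ in range(n_iter):
            dz = np.diff(z)
            excess = np.sign(dz) * np.maximum(np.abs(dz) - max_dz, 0.0)
            if not excess.any():
                break
            # shift half of the excess from the high node to the low node
            z[:-1] += relax * excess / 2.0
            z[1:] -= relax * excess / 2.0
        return z
    ```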

  9. Snow water equivalent in the Alps as seen by gridded data sets, CMIP5 and CORDEX climate models

    NASA Astrophysics Data System (ADS)

    Terzago, Silvia; von Hardenberg, Jost; Palazzi, Elisa; Provenzale, Antonello

    2017-07-01

    The estimate of the current and future conditions of snow resources in mountain areas would require reliable, kilometre-resolution, regional-observation-based gridded data sets and climate models capable of properly representing snow processes and snow-climate interactions. At the moment, the development of such tools is hampered by the sparseness of station-based reference observations. In past decades passive microwave remote sensing and reanalysis products have mainly been used to infer information on the snow water equivalent distribution. However, the investigation has usually been limited to flat terrains as the reliability of these products in mountain areas is poorly characterized. This work considers the available snow water equivalent data sets from remote sensing and from reanalyses for the greater Alpine region (GAR), and explores their ability to provide a coherent view of the snow water equivalent distribution and climatology in this area. Further, we analyse the simulations from the latest-generation regional and global climate models (RCMs, GCMs), participating in the Coordinated Regional Climate Downscaling Experiment over the European domain (EURO-CORDEX) and in the Fifth Coupled Model Intercomparison Project (CMIP5), respectively. We evaluate their reliability in reproducing the main drivers of snow processes - near-surface air temperature and precipitation - against the observational data set EOBS, and compare the snow water equivalent climatology with the remote sensing and reanalysis data sets previously considered. We critically discuss the model limitations in the historical period and we explore their potential in providing reliable future projections. The results of the analysis show that the time-averaged spatial distribution of snow water equivalent and the amplitude of its annual cycle are reproduced quite differently by the different remote sensing and reanalysis data sets, which in fact exhibit a large spread around the ensemble mean. We find that GCMs at spatial resolutions equal to or finer than 1.25° longitude are in closer agreement with the ensemble mean of satellite and reanalysis products in terms of root mean square error and standard deviation than lower-resolution GCMs. The set of regional climate models from the EURO-CORDEX ensemble provides estimates of snow water equivalent at 0.11° resolution that are locally much larger than those indicated by the gridded data sets, and only in a few cases are these differences smoothed out when snow water equivalent is spatially averaged over the entire Alpine domain. ERA-Interim-driven RCM simulations show an annual snow cycle that is comparable in amplitude to those provided by the reference data sets, while GCM-driven RCMs present a large positive bias. RCMs and higher-resolution GCM simulations are used to provide an estimate of the snow reduction expected by the mid-21st century (RCP 8.5 scenario) compared to the historical climatology, with the main purpose of highlighting the limits of our current knowledge and the need for developing more reliable snow simulations.
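
    The RMSE/standard-deviation ranking mentioned above is generic model-evaluation arithmetic; a small helper of the kind involved (not the authors' code) is:

    ```python
    import numpy as np

    def taylor_stats(model, reference):
        """RMSE, standard-deviation ratio, and correlation of a model series
        against a reference (e.g., the ensemble mean of SWE products)."""
        rmse = np.sqrt(np.mean((model - reference) ** 2))
        std_ratio = model.std() / reference.std()
        corr = np.corrcoef(model, reference)[0, 1]
        return rmse, std_ratio, corr
    ```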

  10. Reliable before-fabrication forecasting of normal and touch mode MEMS capacitive pressure sensor: modeling and simulation

    NASA Astrophysics Data System (ADS)

    Jindal, Sumit Kumar; Mahajan, Ankush; Raghuwanshi, Sanjeev Kumar

    2017-10-01

    An analytical model and numerical simulation of the performance of MEMS capacitive pressure sensors in both normal and touch modes are required to predict sensor behavior prior to fabrication. Obtaining such information should be based on a complete analysis of performance parameters such as diaphragm deflection, the change of capacitance as the diaphragm deflects, and sensor sensitivity. In the literature, limited work has been carried out on the above-stated issue; moreover, owing to polynomial approximation factors, a tolerance error cannot be avoided. Reliable before-fabrication forecasting requires exact mathematical calculation of the parameters involved. A second-order polynomial equation is calculated mathematically for key performance parameters of both modes. This eliminates the approximation factor, and an exact result can be studied, maintaining high accuracy. The elimination of approximation factors and the exact-result approach are based on a new design parameter (δ) that we propose. The design parameter gives an initial hint to designers on how the sensor will behave once it is fabricated. The complete work is aided by extensive mathematical detailing of all the parameters involved. Next, we verified our claims using MATLAB® simulation. Since MATLAB® effectively provides the simulation theory for the design approach, the more complicated finite element method is not used.
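
    The paper's second-order polynomial and the design parameter δ cannot be reconstructed from the abstract alone. As a generic normal-mode reference point, the sketch below evaluates the capacitance of a clamped circular diaphragm from standard small-deflection plate theory; all dimensions and material constants are placeholders.

    ```python
    import numpy as np
    from scipy.integrate import quad

    eps0 = 8.854e-12                        # F/m
    a, h, g = 500e-6, 10e-6, 2e-6           # radius, thickness, gap (m)
    E, nu = 160e9, 0.22                     # illustrative Si properties
    D = E * h**3 / (12 * (1 - nu**2))       # flexural rigidity

    def w(r, P):
        """Clamped circular plate deflection under uniform pressure P."""
        return P * (a**2 - r**2) ** 2 / (64 * D)

    def capacitance(P):
        integrand = lambda r: 2 * np.pi * r * eps0 / (g - w(r, P))
        return quad(integrand, 0.0, a)[0]

    print(f"C(0 Pa)   = {capacitance(0):.3e} F")
    print(f"C(10 kPa) = {capacitance(1e4):.3e} F")
    ```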

  11. Validation of NE-TWIGS for tolerant hardwood stands in Ontario

    Treesearch

    Jacek Bankowski; Daniel C. Dey; Eric Boysen; Murray Woods; Jim Rice

    1996-01-01

    The individual-tree, distance-independent stand growth simulator NE-TWIGS has been tested for Ontario's tolerant hardwood stands using data from long-term permanent sample plots. NE-TWIGS provides reliable short-term (5-year) predictions of stand basal area (modelling efficiency from 77% to 99%), but in longer projections the efficiency of the model drops...

  12. Assimilating a synthetic Kalman filter leaf area index series into the WOFOST model to improve regional winter wheat yield estimation

    USDA-ARS?s Scientific Manuscript database

    The scale mismatch between remotely sensed observations and the state variables simulated by crop growth models decreases the reliability of crop yield estimates. To overcome this problem, we used a two-step data assimilation approach: first we generated a complete leaf area index (LAI) time series by combin...
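
    The abstract is truncated, but the first phase it names is a Kalman-filtered LAI series. A minimal scalar Kalman update of the kind involved is sketched below with illustrative numbers; the actual study couples such updates with the WOFOST crop model.

    ```python
    def kalman_update(lai_forecast, p_forecast, lai_obs, r_obs):
        """Scalar Kalman update: blend a model LAI forecast with an observed
        LAI according to their error variances."""
        k = p_forecast / (p_forecast + r_obs)          # Kalman gain
        lai_analysis = lai_forecast + k * (lai_obs - lai_forecast)
        p_analysis = (1 - k) * p_forecast
        return lai_analysis, p_analysis

    lai, p = 1.8, 0.25          # model forecast and its error variance
    lai, p = kalman_update(lai, p, lai_obs=2.3, r_obs=0.16)
    print(f"analysis LAI = {lai:.2f}, variance = {p:.3f}")
    ```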

  13. System Design under Uncertainty: Evolutionary Optimization of the Gravity Probe-B Spacecraft

    NASA Technical Reports Server (NTRS)

    Pullen, Samuel P.; Parkinson, Bradford W.

    1994-01-01

    This paper discusses the application of evolutionary random-search algorithms (Simulated Annealing and Genetic Algorithms) to the problem of spacecraft design under performance uncertainty. Traditionally, spacecraft performance uncertainty has been measured by reliability. Published algorithms for reliability optimization are seldom used in practice because they oversimplify reality. The algorithm developed here uses random-search optimization to allow us to model the problem more realistically. Monte Carlo simulations are used to evaluate the objective function for each trial design solution. These methods have been applied to the Gravity Probe-B (GP-B) spacecraft being developed at Stanford University for launch in 1999. Results of the algorithm developed here for GP-B are shown, and their implications for design optimization by evolutionary algorithms are discussed.
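
    A compact sketch of the general recipe described here, random-search optimization over a noisy Monte Carlo objective, using simulated annealing on a made-up reliability/cost trade-off; the objective, rates, and cooling schedule are illustrative and unrelated to the actual GP-B design problem.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def monte_carlo_objective(design, n_sim=200):
        """Noisy objective: simulated mission utility minus a toy quality cost.
        'design' is a vector of hypothetical component-quality choices in [0, 1]."""
        cost = np.sum(design ** 2)
        p_survive = np.prod(0.9 + 0.1 * design)      # toy component reliabilities
        survived = rng.random(n_sim) < p_survive
        return survived.mean() * 10.0 - cost

    def simulated_annealing(x, temp=1.0, cooling=0.95, steps=500):
        best, best_f = x.copy(), monte_carlo_objective(x)
        f = best_f
        for _ in range(steps):
            cand = np.clip(x + rng.normal(0, 0.1, size=x.size), 0, 1)
            f_cand = monte_carlo_objective(cand)
            # accept improvements always, degradations with Boltzmann probability
            if f_cand > f or rng.random() < np.exp((f_cand - f) / temp):
                x, f = cand, f_cand
                if f > best_f:
                    best, best_f = x.copy(), f
            temp *= cooling
        return best, best_f

    print(simulated_annealing(rng.random(4)))
    ```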

  14. Modeling the data management system of Space Station Freedom with DEPEND

    NASA Technical Reports Server (NTRS)

    Olson, Daniel P.; Iyer, Ravishankar K.; Boyd, Mark A.

    1993-01-01

    Some of the features and capabilities of the DEPEND simulation-based modeling tool are described. A study of a 1553B local bus subsystem of the Space Station Freedom Data Management System (SSF DMS) is used to illustrate some types of system behavior that can be important to reliability and performance evaluations of this type of spacecraft. A DEPEND model of the subsystem is used to illustrate how these types of system behavior can be modeled, and shows what kinds of engineering and design questions can be answered through the use of these modeling techniques. DEPEND's process-based simulation environment is shown to provide a flexible method for modeling complex interactions between hardware and software elements of a fault-tolerant computing system.

  15. Atmospheric Test Models and Numerical Experiments for the Simulation of the Global Distributions of Weather Data Transponders III. Horizontal Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molenkamp, C.R.; Grossman, A.

    1999-12-20

    A network of small balloon-borne transponders which gather very high resolution wind and temperature data for use by modern numerical weather prediction models has been proposed to improve the reliability of long-range weather forecasts. The global distribution of an array of such transponders is simulated using LLNL's atmospheric parcel transport model (GRANTOUR) with winds supplied by two different general circulation models. An initial study used winds from CCM3 with a horizontal resolution of about 3 degrees in latitude and longitude, and a second study used winds from NOGAPS with a 0.75 degree horizontal resolution. Results from both simulations show that reasonable global coverage can be attained by releasing balloons from an appropriate set of launch sites.

  16. Assessing communication quality of consultations in primary care: initial reliability of the Global Consultation Rating Scale, based on the Calgary-Cambridge Guide to the Medical Interview

    PubMed Central

    Burt, Jenni; Abel, Gary; Elmore, Natasha; Campbell, John; Roland, Martin; Benson, John; Silverman, Jonathan

    2014-01-01

    Objectives To investigate initial reliability of the Global Consultation Rating Scale (GCRS: an instrument to assess the effectiveness of communication across an entire doctor–patient consultation, based on the Calgary-Cambridge guide to the medical interview), in simulated patient consultations. Design Multiple ratings of simulated general practitioner (GP)–patient consultations by trained GP evaluators. Setting UK primary care. Participants 21 GPs and six trained GP evaluators. Outcome measures GCRS score. Methods 6 GP raters used GCRS to rate randomly assigned video recordings of GP consultations with simulated patients. Each of the 42 consultations was rated separately by four raters. We considered whether a fixed difference between scores had the same meaning at all levels of performance. We then examined the reliability of GCRS using mixed linear regression models. We augmented our regression model to also examine whether there were systematic biases between the scores given by different raters and to look for possible order effects. Results Assessing the communication quality of individual consultations, GCRS achieved a reliability of 0.73 (95% CI 0.44 to 0.79) for two raters, 0.80 (0.54 to 0.85) for three and 0.85 (0.61 to 0.88) for four. We found an average difference of 1.65 (on a 0–10 scale) in the scores given by the least and most generous raters: adjusting for this evaluator bias increased reliability to 0.78 (0.53 to 0.83) for two raters; 0.85 (0.63 to 0.88) for three and 0.88 (0.69 to 0.91) for four. There were considerable order effects, with later consultations (after 15–20 ratings) receiving, on average, scores more than one point higher on a 0–10 scale. Conclusions GCRS shows good reliability with three raters assessing each consultation. We are currently developing the scale further by assessing a large sample of real-world consultations. PMID:24604483
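
    The reported gains from adding raters (0.73, 0.80, 0.85) are consistent with the classical Spearman-Brown relationship R_k = k*rho / (1 + (k-1)*rho); the reconstruction below is illustrative and is not the paper's mixed-model computation.

    ```python
    # Back-solve the implied single-rater reliability from R_2 = 0.73,
    # then check the predicted reliability for more raters.
    rho = 0.575
    for k in (2, 3, 4):
        r_k = k * rho / (1 + (k - 1) * rho)
        print(f"{k} raters: reliability ~ {r_k:.2f}")   # ~0.73, 0.80, 0.84
    ```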

  17. How well the Reliable Ensemble Averaging Method (REA) for 15 CMIP5 GCMs simulations works for Mexico?

    NASA Astrophysics Data System (ADS)

    Colorado, G.; Salinas, J. A.; Cavazos, T.; de Grau, P.

    2013-05-01

    Precipitation simulations from 15 CMIP5 GCMs were combined in a weighted ensemble using the Reliable Ensemble Averaging (REA) method, obtaining the weight of each model. This was done for a historical period (1961-2000) and for future emissions based on low (RCP4.5) and high (RCP8.5) radiative forcing for the period 2075-2099. The annual cycles of the simple ensemble of historical GCM simulations, the historical REA average, and the Climatic Research Unit (CRU TS3.1) database were compared in four zones of Mexico. For precipitation we can see the improvements from using the REA method, especially in the two northern zones of Mexico, where the REA average is closer to the observations (CRU) than the simple average. In the southern zones the improvement is smaller; in particular, in the southeast the REA average, rather than reproducing the annual cycle with its mid-summer drought qualitatively well, greatly underestimated it. The main reason is that precipitation is underestimated by all the models, and the mid-summer drought does not even exist in some of them. In the REA average of the future scenarios, as expected, the most drastic decrease in precipitation was simulated using RCP8.5, especially in the monsoon area and in the south of Mexico in summer and in winter. In central and southern Mexico, however, the same scenario simulates an increase in autumn precipitation.
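
    The REA weighting itself (after Giorgi and Mearns, 2002) combines a model-performance factor with a model-convergence factor iterated to a fixed point. A stripped-down, single-variable sketch is shown below; the natural-variability normalization and exponents of the full method are omitted.

    ```python
    import numpy as np

    def rea_average(models, obs_bias, eps=1e-6):
        """Weighted consensus with REA-style reliability factors: each model's
        weight combines its bias vs. observations (performance) and its
        distance from the consensus (convergence)."""
        models = np.asarray(models, dtype=float)
        r_bias = 1.0 / np.maximum(np.abs(obs_bias), eps)
        weights = r_bias.copy()
        for _ in range(100):                        # fixed-point iteration
            consensus = np.average(models, weights=weights)
            r_conv = 1.0 / np.maximum(np.abs(models - consensus), eps)
            new_w = r_bias * r_conv
            if np.allclose(new_w, weights):
                break
            weights = new_w
        return consensus, weights / weights.sum()

    # e.g. five models' seasonal precipitation (mm/day) and historical biases
    print(rea_average([2.1, 3.4, 2.8, 1.9, 4.0], obs_bias=[0.2, 1.1, 0.4, 0.9, 1.8]))
    ```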

  18. Systematic coarse-grained modeling of complexation between small interfering RNA and polycations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Zonghui; Luijten, Erik, E-mail: luijten@northwestern.edu; Department of Materials Science and Engineering, Northwestern University, Evanston, Illinois 60208

    All-atom molecular dynamics simulations can provide insight into the properties of polymeric gene-delivery carriers by elucidating their interactions and detailed binding patterns with nucleic acids. However, to explore nanoparticle formation through complexation of these polymers and nucleic acids and study their behavior at experimentally relevant time and length scales, a reliable coarse-grained model is needed. Here, we systematically develop such a model for the complexation of small interfering RNA (siRNA) and grafted polyethyleneimine copolymers, a promising candidate for siRNA delivery. We compare the predictions of this model with all-atom simulations and demonstrate that it is capable of reproducing detailed binding patterns, charge characteristics, and water release kinetics. Since the coarse-grained model accelerates the simulations by one to two orders of magnitude, it will make it possible to quantitatively investigate nanoparticle formation involving multiple siRNA molecules and cationic copolymers.

  19. A simulation model to estimate the cost and effectiveness of alternative dialysis initiation strategies.

    PubMed

    Lee, Chris P; Chertow, Glenn M; Zenios, Stefanos A

    2006-01-01

    Patients with end-stage renal disease (ESRD) require dialysis to maintain survival. The optimal timing of dialysis initiation in terms of cost-effectiveness has not been established. We developed a simulation model of individuals progressing towards ESRD and requiring dialysis. It can be used to analyze dialysis strategies and scenarios. It was embedded in an optimization framework to derive improved strategies. Actual (historical) and simulated survival curves and hospitalization rates were virtually indistinguishable. The model overestimated transplantation costs (10%), but this was related to confounding by Medicare coverage. To assess the model's robustness, we examined several dialysis strategies while input parameters were perturbed. Under all 38 scenarios, relative rankings remained unchanged. An improved policy for a hypothetical patient was derived using an optimization algorithm. The model produces reliable results and is robust. It enables the cost-effectiveness analysis of dialysis strategies.

  20. Mechatronic modeling of a 750kW fixed-speed wind energy conversion system using the Bond Graph Approach.

    PubMed

    Khaouch, Zakaria; Zekraoui, Mustapha; Bengourram, Jamaa; Kouider, Nourreeddine; Mabrouki, Mustapha

    2016-11-01

    In this paper, we focus on modeling the main parts of a wind turbine (blades, gearbox, tower, generator and pitching system) from a mechatronics viewpoint using the Bond-Graph Approach (BGA). These parts are then combined in order to simulate the complete system. Moreover, the real dynamic behavior of the wind turbine is taken into account, and with the new model the final load simulation is more realistic, offering reliable system performance. This model can be used to develop control algorithms to reduce fatigue loads and enhance power production. Different simulations are carried out in order to validate the proposed wind turbine model, using real data provided in the open literature (blade profile and gearbox parameters for a 750 kW wind turbine). Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Water planning in a mixed land use Mediterranean area: point-source abstraction and pollution scenarios by a numerical model of varying stream-aquifer regime.

    PubMed

    Du, Mingxuan; Fouché, Olivier; Zavattero, Elodie; Ma, Qiang; Delestre, Olivier; Gourbesville, Philippe

    2018-02-22

    Integrated hydrodynamic modelling is an efficient approach for making semi-quantitative scenarios reliable enough for groundwater management, provided that the numerical simulations are from a validated model. The model set-up, however, involves many inputs due to the complexity of both the hydrological system and the land use. The case study of a Mediterranean alluvial unconfined aquifer in the lower Var valley (Southern France) is used to test a method for estimating missing data on water abstraction by small farms in an urban context. With this estimation of the undocumented pumping volumes, and after calibration of the exchange parameters of the stream-aquifer system with the help of a river model, the groundwater flow model shows a high goodness of fit with the measured potentiometric levels. The consistency between simulated results and the real behaviour of the system, with regard to the observed effects of lowering weirs and previously published hydrochemistry data, confirms the reliability of the groundwater flow model. On the other hand, the accuracy of the transport model output may be influenced by many parameters, many of which are not derived from field measurements. In this case study, for which river-aquifer feeding is the main control, the partition coefficient between direct recharge and runoff does not show a significant effect on the transport model output, and therefore uncertainty in hydrological terms such as evapotranspiration and runoff is not a first-rank issue for the pollution propagation. The simulation of pollution scenarios with the model returns the expected pessimistic outputs with regard to hazard management. The model is now ready to be used in a decision support system by the local water supply managers.

  2. Linking 1D coastal ocean modelling to environmental management: an ensemble approach

    NASA Astrophysics Data System (ADS)

    Mussap, Giulia; Zavatarelli, Marco; Pinardi, Nadia

    2017-12-01

    The use of a one-dimensional interdisciplinary numerical model of the coastal ocean as a tool contributing to the formulation of ecosystem-based management (EBM) is explored. The focus is on the definition of an experimental design based on ensemble simulations, integrating variability linked to scenarios (characterised by changes in the system forcing) and to the concurrent variation of selected, and poorly constrained, model parameters. The modelling system used was specifically designed for use in "data-rich" areas, so that horizontal dynamics can be resolved by a diagnostic approach and external inputs can be parameterised by properly calibrated nudging schemes. Ensembles determined by changes in the simulated environmental (physical and biogeochemical) dynamics, under joint forcing and parameterisation variations, highlight the uncertainties associated with the application of specific scenarios that are relevant to EBM, providing an assessment of the reliability of the predicted changes. The work has been carried out by implementing the coupled modelling system BFM-POM1D in an area of the Gulf of Trieste (northern Adriatic Sea), considered homogeneous from the point of view of hydrological properties, and forcing it with changing climatic (warming) and anthropogenic (reduction of the land-based nutrient input) pressures. Model parameters affected by considerable uncertainties (due to the lack of relevant observations) were varied jointly with the scenarios of change. The resulting large set of ensemble simulations provided a general estimation of the model uncertainties related to the joint variation of pressures and model parameters. The variability of the model results is intended to convey, efficiently and comprehensibly, information on the uncertainties and reliability of the model results to non-technical EBM planners and stakeholders, so that model-based information can contribute effectively to EBM.

  3. GASP-PL/I Simulation of Integrated Avionic System Processor Architectures. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Brent, G. A.

    1978-01-01

    A development study sponsored by NASA was completed in July 1977 which proposed a complete integration of all aircraft instrumentation into a single modular system. Instead of using the current single-function aircraft instruments, computers compiled and displayed inflight information for the pilot. A processor architecture called the Team Architecture was proposed. This is a hardware/software approach to high-reliability computer systems. A follow-up study of the proposed Team Architecture is reported. GASP-PL/I simulation models are used to evaluate the operating characteristics of the Team Architecture. The problem, model development, simulation programs, and results are presented at length. Also included are program input formats, outputs and listings.

  4. Prediction of normalized biodiesel properties by simulation of multiple feedstock blends.

    PubMed

    García, Manuel; Gonzalo, Alberto; Sánchez, José Luis; Arauzo, Jesús; Peña, José Angel

    2010-06-01

    A continuous process for biodiesel production has been simulated using Aspen HYSYS V7.0 software. As fresh feed, feedstocks with a mild acid content have been used. The process flowsheet follows a traditional alkaline transesterification scheme constituted by esterification, transesterification and purification stages. Kinetic models taking into account the concentration of the different species have been employed in order to simulate the behavior of the CSTR reactors and the product distribution within the process. The comparison between experimental data found in the literature and the predicted normalized properties is discussed. Additionally, a comparison between different thermodynamic packages has been performed; the NRTL activity model was selected as the most reliable of them. The combination of these models allows the prediction of 13 out of 25 parameters included in standard EN-14214:2003, and gives simulators great value as both a predictive and an optimization tool. (c) 2010 Elsevier Ltd. All rights reserved.

  5. Multi-time scale Climate Informed Stochastic Hybrid Simulation-Optimization Model (McISH model) for Multi-Purpose Reservoir System

    NASA Astrophysics Data System (ADS)

    Lu, M.; Lall, U.

    2013-12-01

    In order to mitigate the impacts of climate change, proactive management strategies to operate reservoirs and dams are needed. A multi-time scale climate informed stochastic model is developed to optimize the operations for a multi-purpose single reservoir by simulating decadal, interannual, seasonal and sub-seasonal variability. We apply the model to a setting motivated by the largest multi-purpose dam in N. India, the Bhakhra reservoir on the Sutlej River, a tributary of the Indus. This leads to a focus on the timing and amplitude of the flows for the monsoon and snowmelt periods. The flow simulations are constrained by multiple sources of historical data and GCM future projections that are being developed through an NSF-funded project titled 'Decadal Prediction and Stochastic Simulation of Hydroclimate Over Monsoon Asia'. The model presented is a multilevel, nonlinear programming model that aims to optimize the reservoir operating policy on a decadal horizon and the operation strategy on an updated annual basis. The model is hierarchical: two optimization models designated for different time scales are nested like matryoshka dolls. The two optimization models have similar mathematical formulations with some modifications to meet the constraints within each time frame. The first level of the model provides an optimization solution for policy makers to determine contracted annual releases to different uses with a prescribed reliability; the second level is a within-the-period (e.g., year) operation optimization scheme that allocates the contracted annual releases on a subperiod (e.g., monthly) basis, with additional benefit for extra release and penalty for failure. The model maximizes the net benefit of irrigation, hydropower generation and flood control in each of the periods. The model design thus facilitates the consistent application of weather and climate forecasts to improve operations of reservoir systems. The decadal flow simulations are re-initialized every year with updated climate projections to improve the reliability of the operation rules for the next year, within which the seasonal operation strategies are nested. The multi-level structure can be repeated for monthly operation with weekly subperiods to take advantage of evolving weather forecasts and seasonal climate forecasts. As a result of the hierarchical structure, updates and adjustments at sub-seasonal and even weather time scales can be achieved. Given an ensemble of these scenarios, the McISH reservoir simulation-optimization model is able to derive the desired reservoir storage levels, including minimum and maximum, as a function of calendar date, and the associated release patterns. The multi-time scale approach allows adaptive management of water supplies acknowledging the changing risks, meeting the objectives over the decade in expected value and controlling the near-term and planning-period risk through probabilistic reliability constraints. For the applications presented, the target season is the monsoon season from June to September. The model also includes a monthly flood volume forecast model, based on a Copula density fit to the monthly flow and the flood volume flow. This is used to guide dynamic allocation of the flood control volume given the forecasts.

  6. Development of Equivalent Material Properties of Microbump for Simulating Chip Stacking Packaging

    PubMed Central

    Lee, Chang-Chun; Tzeng, Tzai-Liang; Huang, Pei-Chen

    2015-01-01

    A three-dimensional integrated circuit (3D-IC) structure with a significant scale mismatch causes difficulty in analytic model construction. This paper proposes a simulation technique to introduce an equivalent material composed of microbumps and their surrounding wafer level underfill (WLUF). The mechanical properties of this equivalent material, including Young’s modulus (E), Poisson’s ratio, shear modulus, and coefficient of thermal expansion (CTE), are obtained directly by applying either a tensile load or a constant displacement, and by increasing the temperature during simulations, respectively. Analytic results indicate that at least eight microbumps at the outermost region of the chip stacking structure need to be modeled explicitly to obtain an accurate stress/strain contour in the region of concern. In addition, a factorial experimental design with analysis of variance is proposed to optimize chip stacking structure reliability with four factors: chip thickness, substrate thickness, CTE, and E-value. Analytic results show that the most significant factor is the CTE of the WLUF. This factor affects microbump reliability and structural warpage under a temperature cycling load and the high-temperature bonding process. WLUF with low CTE and high E-value is recommended to enhance the assembly reliability of the 3D-IC architecture. PMID:28793495
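
    Extracting an equivalent property from such a unit-cell simulation reduces to dividing the virtual test's stress by its strain; all numbers below are placeholders, not values from the paper.

    ```python
    # Equivalent Young's modulus from a virtual tensile test on the
    # microbump + underfill unit cell (illustrative values).
    force = 0.025            # reaction force from the FE run, N
    area = 50e-6 * 50e-6     # loaded cross-section, m^2
    dl, l0 = 25e-9, 25e-6    # applied elongation and cell height, m

    stress = force / area
    strain = dl / l0
    print(f"equivalent E = {stress / strain / 1e9:.1f} GPa")
    ```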

  7. Reliability analysis based on the losses from failures.

    PubMed

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the early-life failures region and the expected losses given failure characterizing the corresponding time intervals. For complex systems whose components are not logically arranged in series, discrete simulation algorithms and software have been created for determining the losses from failures in terms of expected lost production time, cost of intervention, and cost of replacement. Different system topologies are assessed to determine the effect of modifications of the system topology on the expected losses from failures. It is argued that the reliability allocation in a production system should be done to maximize the profit/value associated with the system. Consequently, a method for setting reliability requirements and reliability allocation maximizing the profit by minimizing the total cost has been developed. Reliability allocation that maximizes the profit in the case of a system consisting of blocks arranged in series is achieved by determining for each block individually the reliabilities of the components in the block that minimize the sum of the capital costs, operation costs, and the expected losses from failures. A Monte Carlo simulation based net present value (NPV) cash-flow model has also been proposed, which has significant advantages over cash-flow models based on the expected value of the losses from failures per time interval. Unlike those models, the proposed model has the capability to reveal the variation of the NPV due to different numbers of failures occurring during a specified time interval (e.g., during one year). The model also permits tracking the impact of the distribution pattern of failure occurrences and the time dependence of the losses from failures.
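
    The linear-combination statement above is easy to make concrete; with illustrative numbers for three mutually exclusive failure modes:

    ```python
    import numpy as np

    # E[L | failure] = sum_k p_k * E[L_k] over mutually exclusive failure modes
    p_mode = np.array([0.6, 0.3, 0.1])          # P(mode k initiates the failure)
    loss_mode = np.array([2.0, 15.0, 120.0])    # expected loss per mode, k$
    print(f"E[loss | failure] = {p_mode @ loss_mode:.1f} k$")   # -> 17.7 k$
    ```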

  8. Dielectric properties of organic solvents from non-polarizable molecular dynamics simulation with electronic continuum model and density functional theory.

    PubMed

    Lee, Sanghun; Park, Sung Soo

    2011-11-03

    Dielectric constants of electrolytic organic solvents are calculated employing the nonpolarizable Molecular Dynamics with Electronic Continuum (MDEC) model and Density Functional Theory. The molecular polarizabilities are obtained at the B3LYP/6-311++G(d,p) level of theory to estimate high-frequency refractive indices, while the densities and dipole moment fluctuations are computed using nonpolarizable MD simulations. The dielectric constants reproduced by this procedure provide a reliable approach for estimating the experimental data. In addition, two representative solvents which have similar molecular weights but different dielectric properties, i.e., ethyl methyl carbonate and propylene carbonate, are compared using MD simulations, and distinctly different dielectric behaviors are observed at short times as well as at long times.
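
    The dipole-fluctuation route mentioned above is commonly written as eps = eps_inf + (<M^2> - <M>^2) / (3 eps0 V kB T) under conducting (tin-foil) boundary conditions; the sketch below shows only this bare estimator, with eps_inf supplied externally (in the MDEC picture, from the electronic continuum / squared refractive index). The MDEC literature includes additional charge scaling that is omitted here.

    ```python
    import numpy as np

    kB, eps0 = 1.380649e-23, 8.854e-12   # J/K, F/m

    def dielectric_constant(M, volume, temperature, eps_inf):
        """M: (n_frames, 3) total box dipole moment in C*m from an MD trajectory;
        volume in m^3, temperature in K."""
        fluct = (M ** 2).sum(axis=1).mean() - np.sum(M.mean(axis=0) ** 2)
        return eps_inf + fluct / (3.0 * eps0 * volume * kB * temperature)
    ```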

  9. Science-based HRA: experimental comparison of operator performance to IDAC (Information-Decision-Action Crew) simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirley, Rachel; Smidts, Carol; Boring, Ronald

    Information-Decision-Action Crew (IDAC) operator model simulations of a Steam Generator Tube Rupture are compared to student operator performance in studies conducted in the Ohio State University’s Nuclear Power Plant Simulator Facility. This study is presented as a prototype for conducting simulator studies to validate key aspects of Human Reliability Analysis (HRA) methods. Seven student operator crews are compared to simulation results for crews designed to demonstrate three different decision-making strategies. The IDAC model used in the simulations is modified slightly to capture novice behavior rather than that of expert operators. Operator actions and scenario pacing are compared. A preliminary review of available performance shaping factors (PSFs) is presented. After the scenario in the NPP Simulator Facility, student operators review a video of the scenario and evaluate six PSFs at pre-determined points in the scenario. This provides a dynamic record of the PSFs experienced by the OSU student operators. In this preliminary analysis, Time Constraint Load (TCL) calculated in the IDAC simulations is compared to TCL reported by student operators. We identify potential modifications to the IDAC model to develop an “IDAC Student Operator Model.” This analysis provides insights into how similar experiments could be conducted using expert operators to improve the fidelity of IDAC simulations.

  10. Honing process optimization algorithms

    NASA Astrophysics Data System (ADS)

    Kadyrov, Ramil R.; Charikov, Pavel N.; Pryanichnikova, Valeria V.

    2018-03-01

    This article considers the relevance of honing processes for creating high-quality mechanical engineering products. The features of the honing process are revealed and such important concepts as the task for optimization of honing operations, the optimal structure of the honing working cycles, stepped and stepless honing cycles, simulation of processing and its purpose are emphasized. It is noted that the reliability of the mathematical model determines the quality parameters of the honing process control. An algorithm for continuous control of the honing process is proposed. The process model reliably describes the machining of a workpiece in a sufficiently wide area and can be used to operate the CNC machine CC743.

  11. Improving 1D Site Specific Velocity Profiles for the Kik-Net Network

    NASA Astrophysics Data System (ADS)

    Holt, James; Edwards, Benjamin; Pilz, Marco; Fäh, Donat; Rietbrock, Andreas

    2017-04-01

    Ground motion prediction equations (GMPEs) form the cornerstone of modern seismic hazard assessments. When produced to a high standard, they provide reliable estimates of ground motion/spectral acceleration for a given site and earthquake scenario. This information is crucial for engineers to optimise design and for regulators who enforce legal minimum safe design capacities. Classically, GMPEs were built upon the assumption that variability around the median model could be treated as aleatory. As understanding improved, it was noted that the propagation could be segregated into the response of the average path from the source and the response of the site. This is because the heterogeneity of the near-surface lithology is significantly different from that of the bulk path. It was then suggested that the semi-ergodic approach could be taken if the site response could be determined, moving uncertainty from aleatory to epistemic. The determination of reliable site-specific response models is therefore becoming increasingly critical for ground motion models used in engineering practice. Today it is common practice to include proxies for site response within the scope of a GMPE, such as Vs30 or site classification, in an effort to reduce the overall uncertainty of the prediction at a given site. However, these proxies are not always reliable enough to give confident ground motion estimates, due to the complexity of the near-surface. Other approaches to quantifying the response of the site include detailed numerical simulations (1/2/3D - linear, EQL, non-linear etc.). However, in order to be reliable, they require highly detailed and accurate velocity and, for non-linear analyses, material property models. It is possible to obtain this information through invasive methods, but this is expensive and not feasible for most projects. Here we propose an alternative method to derive reliable velocity profiles (and their uncertainty), calibrated using almost 20 years of recorded data from the Kik-Net network. First, using a reliable subset of sites, the empirical surface-to-borehole (S/B) ratio is calculated in the frequency domain using all events recorded at each site. In a subsequent step, we use numerical simulation to produce 1D SH transfer function curves using a suite of stochastic velocity models. Comparing the resulting amplification with the empirical S/B ratio, we find optimal 1D velocity models and their uncertainty. The method will be tested to determine the level of initial information required to obtain a reliable Vs profile (e.g., starting Vs model, only Vs30, site-class, H/V ratio etc.) and then applied and tested against data from other regions using site-to-reference or empirical spectral model amplification.
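
    The empirical S/B ratio is, at its core, a spectral ratio of surface to borehole recordings averaged over events; a bare-bones single-event version (windowing and smoothing choices are illustrative) is:

    ```python
    import numpy as np

    def sb_ratio(surface, borehole, dt, eps=1e-12):
        """Surface-to-borehole amplitude spectral ratio for one event.
        surface, borehole: equal-length time series; dt: sample interval (s)."""
        n = len(surface)
        freqs = np.fft.rfftfreq(n, dt)
        taper = np.hanning(n)
        s = np.abs(np.fft.rfft(surface * taper))
        b = np.abs(np.fft.rfft(borehole * taper))
        return freqs, s / np.maximum(b, eps)
    ```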

  12. Modeling and simulation of Cu diffusion and drift in porous CMOS backend dielectrics

    NASA Astrophysics Data System (ADS)

    Ali, R.; Fan, Y.; King, S.; Orlowski, M.

    2018-06-01

    With the advent of porous dielectrics, Cu drift-diffusion reliability issues in the CMOS backend have only been exacerbated. In this regard, a modeling and simulation study of Cu atom/ion drift-diffusion in porous dielectrics is presented to assess the backend reliability and to explore conditions for a reliable Resistive Random Access Memory (RRAM) operation. The numerical computation, using elementary jump frequencies for a random walk in 2D and 3D, is based on an extended adjacency tensor concept. It is shown that Cu diffusion and drift transport are affected as much by the level of porosity as by the pore morphology. Allowance is made for different rates of Cu dissolution into the dielectric and for Cu absorption and transport at and on the inner walls of the pores. Most of the complex phenomena of the drift-diffusion transport in porous media can be understood in terms of local lateral and vertical gradients and the degree of their perturbation caused by the presence of pores in the transport domain. The impact of pore morphology, related to the concept of tortuosity, is discussed in terms of "channeling" and "trapping" effects. The simulations are calibrated to experimental results for porous SiCOH layers of 25 nm thickness, sandwiched between Cu and Pt(W) electrodes, with experimental porosity levels of 0%, 8%, 12%, and 25%. We find that porous SiCOH is more immune to Cu+ drift at 300 K than non-porous SiCOH.
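
    A toy version of the lattice random walk is sketched below; it treats pores simply as inaccessible sites, whereas the paper additionally models dissolution rates and transport on pore walls, so this only illustrates the bare porosity effect on mobility.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def mean_square_displacement(porosity=0.12, size=200, steps=2000, n_ions=500):
        """Ions attempt nearest-neighbour hops on a 2D lattice; hops into
        blocked (pore) sites are rejected, slowing the effective transport."""
        blocked = rng.random((size, size)) < porosity
        blocked[size // 2, size // 2] = False        # keep the start site open
        pos = np.full((n_ions, 2), size // 2)
        moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
        for _ in range(steps):
            trial = pos + moves[rng.integers(0, 4, n_ions)]
            trial %= size                             # periodic box for brevity
            ok = ~blocked[trial[:, 0], trial[:, 1]]
            pos[ok] = trial[ok]
        return np.mean(np.sum((pos - size // 2) ** 2, axis=1))

    for p in (0.0, 0.12, 0.25):
        print(f"porosity {p:.2f}: MSD ~ {mean_square_displacement(p):.0f}")
    ```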

  13. Reliability ensemble averaging of 21st century projections of terrestrial net primary productivity reduces global and regional uncertainties

    NASA Astrophysics Data System (ADS)

    Exbrayat, Jean-François; Bloom, A. Anthony; Falloon, Pete; Ito, Akihiko; Smallman, T. Luke; Williams, Mathew

    2018-02-01

    Multi-model averaging techniques provide opportunities to extract additional information from large ensembles of simulations. In particular, present-day model skill can be used to evaluate their potential performance in future climate simulations. Multi-model averaging methods have been used extensively in climate and hydrological sciences, but they have not been used to constrain projected plant productivity responses to climate change, which is a major uncertainty in Earth system modelling. Here, we use three global observationally orientated estimates of current net primary productivity (NPP) to perform a reliability ensemble averaging (REA) method using 30 global simulations of the 21st century change in NPP based on the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) business as usual emissions scenario. We find that the three REA methods support an increase in global NPP by the end of the 21st century (2095-2099) compared to 2001-2005, which is 2-3 % stronger than the ensemble ISIMIP mean value of 24.2 Pg C y-1. Using REA also leads to a 45-68 % reduction in the global uncertainty of 21st century NPP projection, which strengthens confidence in the resilience of the CO2 fertilization effect to climate change. This reduction in uncertainty is especially clear for boreal ecosystems although it may be an artefact due to the lack of representation of nutrient limitations on NPP in most models. Conversely, the large uncertainty that remains on the sign of the response of NPP in semi-arid regions points to the need for better observations and model development in these regions.

  14. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    NASA Astrophysics Data System (ADS)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
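
    A one-parameter sketch of the collocation-based Hermite expansion follows; a stand-in exponential model replaces the hydrological model, and the order and sample counts are illustrative.

    ```python
    import numpy as np
    from math import factorial
    from numpy.polynomial import hermite_e as He

    rng = np.random.default_rng(3)

    # Collocation samples of the standard normal germ xi and the uncertain
    # parameter it encodes (a toy recession rate k).
    order = 4
    xi = rng.standard_normal(2000)
    k = 0.5 + 0.1 * xi
    y = np.exp(-k)                      # stand-in for a hydrological model run

    V = He.hermevander(xi, order)       # design matrix He_0(xi) ... He_4(xi)
    coef, *_ = np.linalg.lstsq(V, y, rcond=None)

    # Orthogonality of probabilists' Hermite polynomials gives the moments:
    # E[y] = c_0, Var[y] = sum_{n>=1} n! c_n^2
    var = sum(factorial(n) * coef[n] ** 2 for n in range(1, order + 1))
    print(f"PCE mean ~ {coef[0]:.4f}, PCE variance ~ {var:.6f}")
    ```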

  15. Calibrating cellular automaton models for pedestrians walking through corners

    NASA Astrophysics Data System (ADS)

    Dias, Charitha; Lovreglio, Ruggiero

    2018-05-01

    Cellular Automata (CA) based pedestrian simulation models have gained remarkable popularity as they are simpler and easier to implement compared to other microscopic modeling approaches. However, incorporating traditional floor field representations in CA models to simulate pedestrian corner navigation behavior could result in unrealistic behaviors. Even though several previous studies have attempted to enhance CA models to realistically simulate pedestrian maneuvers around bends, such modifications have not been calibrated or validated against empirical data. In this study, two static floor field (SFF) representations, namely 'discrete representation' and 'continuous representation', are calibrated for CA-models to represent pedestrians' walking behavior around 90° bends. Trajectory data collected through a controlled experiment are used to calibrate these model representations. Calibration results indicate that although both floor field representations can represent pedestrians' corner navigation behavior, the 'continuous' representation fits the data better. Output of this study could be beneficial for enhancing the reliability of existing CA-based models by representing pedestrians' corner navigation behaviors more realistically.
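
    For reference, a 'discrete' static floor field is typically just a shortest-path distance map to the exit computed around obstacles; a minimal breadth-first-search construction (generic, not the authors' calibrated model) is:

    ```python
    import numpy as np
    from collections import deque

    def static_floor_field(grid, exit_cell):
        """grid: 2D array, 1 = obstacle, 0 = walkable; returns the minimum
        lattice distance to exit_cell. Pedestrians then step to the
        neighbouring cell with the smallest field value."""
        field = np.full(grid.shape, np.inf)
        field[exit_cell] = 0.0
        q = deque([exit_cell])
        while q:
            i, j = q.popleft()
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if (0 <= ni < grid.shape[0] and 0 <= nj < grid.shape[1]
                        and grid[ni, nj] == 0 and field[ni, nj] == np.inf):
                    field[ni, nj] = field[i, j] + 1
                    q.append((ni, nj))
        return field
    ```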

  16. Large eddy simulation of spanwise rotating turbulent channel flow with dynamic variants of eddy viscosity model

    NASA Astrophysics Data System (ADS)

    Jiang, Zhou; Xia, Zhenhua; Shi, Yipeng; Chen, Shiyi

    2018-04-01

    A fully developed spanwise-rotating turbulent channel flow has been numerically investigated using large-eddy simulation. Our focus is to assess the performance of the dynamic variants of eddy viscosity models, including the dynamic Vreman's model (DVM), the dynamic wall-adapting local eddy viscosity (DWALE) model, the dynamic σ (Dσ) model, and the dynamic volumetric strain-stretching (DVSS) model, in this canonical flow. The results with the dynamic Smagorinsky model (DSM) and direct numerical simulations (DNS) are used as references. Our results show that the DVM has a wrong asymptotic behavior in the near-wall region, while the other three models predict it correctly. In the high-rotation case, DWALE can obtain a reliable mean velocity profile, but the turbulence intensities in the wall-normal and spanwise directions show clear deviations from the DNS data. DVSS exhibits poor predictions of both the mean velocity profile and the turbulence intensities. In all three cases, Dσ performs the best.
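
    All of the variants compared are eddy-viscosity closures of the form nu_t = (C*Delta)^2 D(u), differing in the velocity-gradient invariant D(u) and in how the dynamic procedure fits C. The classic Smagorinsky choice D = |S| is sketched below on a uniform grid; the dynamic coefficient fitting itself is omitted.

    ```python
    import numpy as np

    def smagorinsky_nu_t(u, v, w, dx, C=0.17):
        """nu_t = (C dx)^2 |S| with |S| = sqrt(2 S_ij S_ij) on a uniform grid."""
        grads = [np.gradient(f, dx) for f in (u, v, w)]  # grads[i][j] = du_i/dx_j
        s2 = np.zeros_like(u)
        for i in range(3):
            for j in range(3):
                sij = 0.5 * (grads[i][j] + grads[j][i])  # strain-rate tensor
                s2 += 2.0 * sij ** 2
        return (C * dx) ** 2 * np.sqrt(s2)
    ```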

  17. How can model comparison help improving species distribution models?

    PubMed

    Gritti, Emmanuel Stephan; Gaucherel, Cédric; Crespo-Perez, Maria-Veronica; Chuine, Isabelle

    2013-01-01

    Today, more than ever, robust projections of potential species range shifts are needed to anticipate and mitigate the impacts of climate change on biodiversity and ecosystem services. Such projections are so far provided almost exclusively by correlative species distribution models (correlative SDMs). However, concerns regarding the reliability of their predictive power are growing and several authors call for the development of process-based SDMs. Still, each of these methods presents strengths and weaknesses which have to be estimated if they are to be reliably used by decision makers. In this study we compare projections of three different SDMs (STASH, LPJ and PHENOFIT) that lie in the continuum between correlative models and process-based models for the current distribution of three major European tree species, Fagus sylvatica L., Quercus robur L. and Pinus sylvestris L. We compare the consistency of the model simulations using an innovative comparison map profile method, integrating local and multi-scale comparisons. The three models simulate relatively accurately the current distribution of the three species. The process-based model performs almost as well as the correlative model, although parameters of the former are not fitted to the observed species distributions. According to our simulations, species range limits are triggered, at the European scale, by establishment and survival through processes primarily related to phenology and resistance to abiotic stress rather than to growth efficiency. The accuracy of projections of the hybrid and process-based model could however be improved by integrating a more realistic representation of the species resistance to water stress for instance, advocating for pursuing efforts to understand and formulate explicitly the impact of climatic conditions and variations on these processes.

  18. How Can Model Comparison Help Improving Species Distribution Models?

    PubMed Central

    Gritti, Emmanuel Stephan; Gaucherel, Cédric; Crespo-Perez, Maria-Veronica; Chuine, Isabelle

    2013-01-01

    Today, more than ever, robust projections of potential species range shifts are needed to anticipate and mitigate the impacts of climate change on biodiversity and ecosystem services. Such projections are so far provided almost exclusively by correlative species distribution models (correlative SDMs). However, concerns regarding the reliability of their predictive power are growing and several authors call for the development of process-based SDMs. Still, each of these methods presents strengths and weaknesses which have to be estimated if they are to be reliably used by decision makers. In this study we compare projections of three different SDMs (STASH, LPJ and PHENOFIT) that lie in the continuum between correlative models and process-based models for the current distribution of three major European tree species, Fagus sylvatica L., Quercus robur L. and Pinus sylvestris L. We compare the consistency of the model simulations using an innovative comparison map profile method, integrating local and multi-scale comparisons. The three models simulate relatively accurately the current distribution of the three species. The process-based model performs almost as well as the correlative model, although parameters of the former are not fitted to the observed species distributions. According to our simulations, species range limits are triggered, at the European scale, by establishment and survival through processes primarily related to phenology and resistance to abiotic stress rather than to growth efficiency. The accuracy of projections of the hybrid and process-based model could however be improved by integrating a more realistic representation of the species resistance to water stress for instance, advocating for pursuing efforts to understand and formulate explicitly the impact of climatic conditions and variations on these processes. PMID:23874779

  19. Estimation of Local Bone Loads for the Volume of Interest.

    PubMed

    Kim, Jung Jin; Kim, Youkyung; Jang, In Gwun

    2016-07-01

    Computational bone remodeling simulations have recently received significant attention with the aid of state-of-the-art high-resolution imaging modalities. They have been performed using localized finite element (FE) models rather than full FE models due to the excessive computational costs of the latter. However, these localized bone remodeling simulations remain to be investigated in more depth. In particular, applying simplified loading conditions (e.g., uniform and unidirectional loads) to localized FE models has a severe limitation for reliable subject-specific assessment. In order to effectively determine the physiological local bone loads for the volume of interest (VOI), this paper proposes a novel method of estimating the local loads when the global musculoskeletal loads are given. The proposed method is verified for three VOIs in a proximal femur in terms of force equilibrium, displacement field, and strain energy density (SED) distribution. The effect of the global load deviation on the local load estimation is also investigated by perturbing a hip joint contact force (HCF) in the femoral head. Deviation in force magnitude exhibits the greatest absolute changes in the SED distribution due to its own greatest deviation, whereas angular deviation perpendicular to the HCF provides the greatest relative change. With further in vivo force measurements and high-resolution clinical imaging modalities, the proposed method will contribute to the development of reliable patient-specific localized FE models, which can provide enhanced computational efficiency for iterative computing processes such as bone remodeling simulations.

  20. Business Cases for Microgrids: Modeling Interactions of Technology Choice, Reliability, Cost, and Benefit

    NASA Astrophysics Data System (ADS)

    Hanna, Ryan

    Distributed energy resources (DERs), and increasingly microgrids, are becoming an integral part of modern distribution systems. Interest in microgrids--which are insular and autonomous power networks embedded within the bulk grid--stems largely from the vast array of flexibilities and benefits they can offer stakeholders. Managed well, they can improve grid reliability and resiliency, increase end-use energy efficiency by coupling electric and thermal loads, reduce transmission losses by generating power locally, and may reduce system-wide emissions, among other benefits. Whether these public benefits are realized, however, depends on whether private firms see a "business case", or private value, in investing. To this end, firms need models that evaluate costs, benefits, risks, and assumptions that underlie decisions to invest. The objectives of this dissertation are to assess the business case for microgrids that provide what industry analysts forecast as two primary drivers of market growth--providing energy services (similar to an electric utility) as well as reliability service to the customers within. Prototypical first adopters are modeled--using an existing model to analyze energy services and a new model that couples that analysis with one of reliability--to explore interactions between technology choice, reliability, costs, and benefits. The new model has a bi-level hierarchy; it uses heuristic optimization to select and size DERs and analytical optimization to schedule them. It further embeds Monte Carlo simulation to evaluate reliability as well as regression models for customer damage functions to monetize reliability. It provides least-cost microgrid configurations for utility customers who seek to reduce interruption and operating costs. Lastly, the model is used to explore the impact of such adoption on system-wide greenhouse gas emissions in California. Results indicate that there are, at present, co-benefits for emissions reductions when customers adopt and operate microgrids for private benefit, though future analysis is needed as the bulk grid continues to transition toward a less carbon-intensive system.

  1. Non-iterative distance constraints enforcement for cloth drapes simulation

    NASA Astrophysics Data System (ADS)

    Hidajat, R. L. L. G.; Wibowo, Arifin, Z.; Suyitno

    2016-03-01

    Cloth simulation, which represents the behavior of cloth objects such as flags, tablecloths, and garments, has applications in clothing animation for games and virtual shops. Elastically deformable models have been widely used to provide realistic and efficient simulation; however, the problem of overstretching is encountered. We introduce a new cloth simulation algorithm that replaces iterative distance constraint enforcement steps with non-iterative ones for preventing overstretching in a spring-mass system for cloth modeling. Our method is based on a simple position correction procedure applied at one end of a spring. In our experiments, we developed a rectangular cloth model which is initially at a horizontal position with one point fixed, and it is allowed to drape under its own weight. Our simulation is able to achieve a plausible cloth drape as in reality. This paper aims to demonstrate the reliability of our approach in overcoming overstretching while decreasing the computational cost of the constraint enforcement process, since the iterative procedure is eliminated.
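
    A minimal sketch of the kind of non-iterative correction described: after an unconstrained integration step, the free end of each overstretched spring is projected back onto the rest-length sphere around its anchor in a single pass. The exact update rule of the paper may differ.

        import numpy as np

        def enforce_distance(p_fixed, p_free, rest_length):
            """Project the free end of a spring back onto the rest-length sphere
            around the other end -- a single, non-iterative position correction.
            """
            d = p_free - p_fixed
            dist = np.linalg.norm(d)
            if dist < 1e-12:
                return p_free                      # degenerate; leave unchanged
            return p_fixed + d * (rest_length / dist)

        # One mass hanging from a fixed point: after an unconstrained step the
        # spring is overstretched; the correction restores the rest length.
        fixed = np.array([0.0, 0.0, 0.0])
        free = np.array([0.0, -1.4, 0.0])          # stretched to 1.4 x rest length
        print(enforce_distance(fixed, free, 1.0))  # -> [0., -1., 0.]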

  2. Effects of imperfect automation on decision making in a simulated command and control task.

    PubMed

    Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja

    2007-02-01

    Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Unreliable automation led to a greater cost in decision-making accuracy under the higher automation reliability condition for three different forms of decision automation relative to information automation. At low automation reliability, however, there was a cost in performance for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.

  3. Reliability analysis and fault-tolerant system development for a redundant strapdown inertial measurement unit. [inertial platforms

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27-state model is defined based upon a candidate redundancy management system which can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate; the gyro and accelerometer failure rates together; false alarms; the probability of failure detection; the probability of failure isolation; the probability of damage effects; and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
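
    A small worked analogue of such a Markov evaluation model, assuming a toy three-state chain rather than the paper's 27-state RSDIMU model: reliability at time t is the probability mass remaining in the operational states, computed from the generator matrix via a matrix exponential. All rates are illustrative.

        import numpy as np
        from scipy.linalg import expm

        # Toy 3-state analogue: fully operational, fail-operational (one failure
        # detected and isolated), and failed. Rates are illustrative only.
        lam, mu = 1e-4, 5e-5                 # failure rates per hour
        Q = np.array([[-lam,  lam,  0.0],
                      [ 0.0, -mu,   mu ],
                      [ 0.0,  0.0,  0.0]])   # failed state is absorbing

        p0 = np.array([1.0, 0.0, 0.0])       # start fully operational

        def reliability(t_hours):
            """Probability of being in an operational state at time t."""
            p_t = p0 @ expm(Q * t_hours)
            return p_t[0] + p_t[1]

        print(reliability(1000.0))           # e.g. mission reliability at 1000 h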

  4. Urban Flow and Pollutant Dispersion Simulation with Multi-scale coupling of Meteorological Model with Computational Fluid Dynamic Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yushi; Poh, Hee Joo

    2014-11-01

    Computational Fluid Dynamics (CFD) analysis has become increasingly important in modern urban planning in order to create highly livable cities. This paper presents a multi-scale modeling methodology which couples the Weather Research and Forecasting (WRF) model with the open source CFD simulation tool OpenFOAM. This coupling enables the simulation of wind flow and pollutant dispersion in urban built-up areas with a high-resolution mesh. In this methodology the meso-scale model WRF provides the boundary conditions for the micro-scale CFD model OpenFOAM. The advantage is that realistic weather conditions are taken into account in the CFD simulation, and the complexity of the building layout can be handled with ease by the meshing utility of OpenFOAM. The result is validated against the Joint Urban 2003 Tracer Field Tests in Oklahoma City, and there is reasonably good agreement between the CFD simulation and the field observations. The coupling of WRF and OpenFOAM provides urban planners with a reliable environmental modeling tool for actual urban built-up areas, and it can be further extended to consider future weather conditions in scenario studies on climate change impact.

  5. Simulated microgravity: critical review on the use of random positioning machines for mammalian cell culture.

    PubMed

    Wuest, Simon L; Richard, Stéphane; Kopp, Sascha; Grimm, Daniela; Egli, Marcel

    2015-01-01

    Random Positioning Machines (RPMs) have been used for many years as a ground-based model to simulate microgravity. In this review we discuss several aspects of the RPM. Recent technological development has expanded the operative range of the RPM substantially. New possibilities of live cell imaging and partial gravity simulations, for example, are of particular interest. For obtaining valuable and reliable results from RPM experiments, the appropriate use of the RPM is of utmost importance. The simulation of microgravity requires that the RPM's rotation is faster than the biological process under study, but not so fast that undesired side effects appear. It remains a legitimate question, however, whether the RPM can accurately and reliably simulate microgravity conditions comparable to real microgravity in space. We attempt to answer this question by mathematically analyzing the forces working on the samples while they are mounted on the operating RPM and by comparing data obtained under real microgravity in space and simulated microgravity on the RPM. In conclusion, after taking the mentioned constraints into consideration, we are convinced that simulated microgravity experiments on the RPM are a valid alternative for conducting examinations of the influence of the force of gravity in a fast and straightforward approach.

  6. Simulated Microgravity: Critical Review on the Use of Random Positioning Machines for Mammalian Cell Culture

    PubMed Central

    Wuest, Simon L.; Richard, Stéphane; Kopp, Sascha

    2015-01-01

    Random Positioning Machines (RPMs) have been used for many years as a ground-based model to simulate microgravity. In this review we discuss several aspects of the RPM. Recent technological development has expanded the operative range of the RPM substantially. New possibilities of live cell imaging and partial gravity simulations, for example, are of particular interest. For obtaining valuable and reliable results from RPM experiments, the appropriate use of the RPM is of utmost importance. The simulation of microgravity requires that the RPM's rotation is faster than the biological process under study, but not so fast that undesired side effects appear. It remains a legitimate question, however, whether the RPM can accurately and reliably simulate microgravity conditions comparable to real microgravity in space. We attempt to answer this question by mathematically analyzing the forces working on the samples while they are mounted on the operating RPM and by comparing data obtained under real microgravity in space and simulated microgravity on the RPM. In conclusion, after taking the mentioned constraints into consideration, we are convinced that simulated microgravity experiments on the RPM are a valid alternative for conducting examinations of the influence of the force of gravity in a fast and straightforward approach. PMID:25649075

  7. An energy-limited model of algal biofuel production: Toward the next generation of advanced biofuels

    DOE PAGES

    Dunlop, Eric H.; Coaldrake, A. Kimi; Silva, Cory S.; ...

    2013-10-22

    Algal biofuels are increasingly important as a source of renewable energy. The absence of reliable thermodynamic and other property data, and the large amount of kinetic data that would normally be required, have created a major barrier to simulation. Additionally, the absence of a generally accepted flowsheet for biofuel production means that detailed simulation of the wrong approach is a real possibility. This model of algal biofuel production estimates the necessary data and places it into a heuristic model using a commercial simulator that back-calculates the process structure required. Furthermore, complex kinetics can be obviated for now by putting the simulator into energy limitation and forcing it to solve for the missing design variables, such as bioreactor surface area, productivity, and oil content. The model does not attempt to prescribe a particular approach, but provides a guide towards a sound engineering approach to this challenging and important problem.

  8. Planning Irreversible Electroporation in the Porcine Kidney: Are Numerical Simulations Reliable for Predicting Empiric Ablation Outcomes?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wimmer, Thomas, E-mail: thomas.wimmer@medunigraz.at; Srimathveeravalli, Govindarajan; Gutta, Narendra

    Purpose: Numerical simulations are used for treatment planning in clinical applications of irreversible electroporation (IRE) to determine ablation size and shape. To assess the reliability of simulations for treatment planning, we compared simulation results with empiric outcomes of renal IRE using computed tomography (CT) and histology in an animal model. Methods: The ablation size and shape for six different IRE parameter sets (70–90 pulses, 2,000–2,700 V, 70–100 µs) for monopolar and bipolar electrodes was simulated using a numerical model. Employing these treatment parameters, 35 CT-guided IRE ablations were created in both kidneys of six pigs and followed up with CT immediately and after 24 h. Histopathology was analyzed from postablation day 1. Results: Ablation zones on CT measured 81 ± 18 % (day 0, p ≤ 0.05) and 115 ± 18 % (day 1, p ≤ 0.09) of the simulated size for monopolar electrodes, and 190 ± 33 % (day 0, p ≤ 0.001) and 234 ± 12 % (day 1, p ≤ 0.0001) for bipolar electrodes. Histopathology indicated smaller ablation zones than simulated (71 ± 41 %, p ≤ 0.047) and measured on CT (47 ± 16 %, p ≤ 0.005), with complete ablation of kidney parenchyma within the central zone and incomplete ablation in the periphery. Conclusion: Both numerical simulations for planning renal IRE and CT measurements may overestimate the size of ablation compared to histology, and ablation effects may be incomplete in the periphery.

  9. Evaluation of high fidelity patient simulator in assessment of performance of anaesthetists.

    PubMed

    Weller, J M; Bloch, M; Young, S; Maze, M; Oyesola, S; Wyner, J; Dob, D; Haire, K; Durbridge, J; Walker, T; Newble, D

    2003-01-01

    There is increasing emphasis on performance-based assessment of clinical competence. The High Fidelity Patient Simulator (HPS) may be useful for assessment of clinical practice in anaesthesia, but needs formal evaluation of validity, reliability, feasibility and effect on learning. We set out to assess the reliability of a global rating scale for scoring simulator performance in crisis management. Using a global rating scale, three judges independently rated videotapes of anaesthetists in simulated crises in the operating theatre. Five anaesthetists then independently rated subsets of these videotapes. There was good agreement between raters for medical management, behavioural attributes and overall performance. Agreement was high for both the initial judges and the five additional raters. Using a global scale to assess simulator performance, we found good inter-rater reliability for scoring performance in a crisis. We estimate that two judges should provide a reliable assessment. High fidelity simulation should be studied further for assessing clinical performance.

  10. Model-centric distribution automation: Capacity, reliability, and efficiency

    DOE PAGES

    Onen, Ahmet; Jung, Jaesung; Dilek, Murat; ...

    2016-02-26

    A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.

  11. Model-centric distribution automation: Capacity, reliability, and efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onen, Ahmet; Jung, Jaesung; Dilek, Murat

    A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.

  12. Improving reliability of aggregation, numerical simulation and analysis of complex systems by empirical data

    NASA Astrophysics Data System (ADS)

    Dobronets, Boris S.; Popova, Olga A.

    2018-05-01

    The paper considers a new approach to regression modeling that uses aggregated data presented in the form of density functions. Approaches to improving the reliability of aggregation of empirical data are considered: improving accuracy and estimating errors. We discuss the procedures of data aggregation as a preprocessing stage for subsequent regression modeling. An important feature of the study is the demonstration of how to represent the aggregated data. It is proposed to use piecewise polynomial models, including spline aggregate functions. We show that the proposed approach to data aggregation can be interpreted as a frequency distribution. To study its properties, the density function concept is used. Various types of mathematical models of data aggregation are discussed. For the construction of regression models, it is proposed to use data representation procedures based on piecewise polynomial models. New approaches to modeling functional dependencies based on spline aggregations are proposed.
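
    A minimal sketch of the aggregation idea, under the assumption that it resembles fitting a spline to a density-normalized histogram; the authors' actual piecewise-polynomial construction may differ.

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        rng = np.random.default_rng(0)
        samples = rng.normal(10.0, 2.0, size=5000)   # raw empirical data

        # Aggregate to a frequency distribution (density-normalized histogram).
        counts, edges = np.histogram(samples, bins=40, density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])

        # Piecewise-polynomial (spline) aggregate function over the bin centers.
        density = UnivariateSpline(centers, counts, k=3, s=len(centers) * 1e-4)

        print(density(10.0))   # smoothed density estimate near the mode (~0.2)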

  13. Accurate and general treatment of electrostatic interaction in Hamiltonian adaptive resolution simulations

    NASA Astrophysics Data System (ADS)

    Heidari, M.; Cortes-Huerto, R.; Donadio, D.; Potestio, R.

    2016-10-01

    In adaptive resolution simulations the same system is concurrently modeled with different resolution in different subdomains of the simulation box, thereby enabling an accurate description in a small but relevant region, while the rest is treated with a computationally parsimonious model. In this framework, electrostatic interaction, whose accurate treatment is a crucial aspect in the realistic modeling of soft matter and biological systems, represents a particularly acute problem due to the intrinsic long-range nature of Coulomb potential. In the present work we propose and validate the usage of a short-range modification of Coulomb potential, the Damped shifted force (DSF) model, in the context of the Hamiltonian adaptive resolution simulation (H-AdResS) scheme. This approach, which is here validated on bulk water, ensures a reliable reproduction of the structural and dynamical properties of the liquid, and enables a seamless embedding in the H-AdResS framework. The resulting dual-resolution setup is implemented in the LAMMPS simulation package, and its customized version employed in the present work is made publicly available.
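
    The damped shifted force modification referred to above truncates Coulomb interactions so that both the potential and the force vanish smoothly at a cutoff. A minimal sketch of the standard DSF pair energy (in the Fennell-Gezelter form) follows; the damping parameter and cutoff below are illustrative, not the values used in the paper.

        from math import erfc, exp, pi, sqrt

        def dsf_potential(qi, qj, r, alpha=0.2, r_c=12.0):
            """Damped shifted force Coulomb pair energy (multiply by Coulomb's
            constant for physical units).

            V(r) = qi*qj * [ erfc(a r)/r - erfc(a Rc)/Rc
                   + (erfc(a Rc)/Rc^2 + 2a/sqrt(pi) * exp(-a^2 Rc^2)/Rc) * (r - Rc) ]
            for r <= Rc, and 0 beyond the cutoff; energy and force both go
            smoothly to zero at Rc.
            """
            if r > r_c:
                return 0.0
            shift = erfc(alpha * r_c) / r_c
            force_shift = (erfc(alpha * r_c) / r_c**2
                           + 2.0 * alpha / sqrt(pi) * exp(-(alpha * r_c)**2) / r_c)
            return qi * qj * (erfc(alpha * r) / r - shift + force_shift * (r - r_c))

        print(dsf_potential(1.0, -1.0, 3.0))   # attractive pair inside the cutoff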

  14. Multi-Model Ensemble Wake Vortex Prediction

    NASA Technical Reports Server (NTRS)

    Koerner, Stephan; Holzaepfel, Frank; Ahmad, Nash'at N.

    2015-01-01

    Several multi-model ensemble methods are investigated for predicting wake vortex transport and decay. This study is a joint effort between National Aeronautics and Space Administration and Deutsches Zentrum fuer Luft- und Raumfahrt to develop a multi-model ensemble capability using their wake models. An overview of different multi-model ensemble methods and their feasibility for wake applications is presented. The methods include Reliability Ensemble Averaging, Bayesian Model Averaging, and Monte Carlo Simulations. The methodologies are evaluated using data from wake vortex field experiments.
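
    As a rough illustration of how a Bayesian-model-averaging style combination can work, the sketch below weights each member model's prediction by a Gaussian likelihood against an observation; it is a generic toy, not the NASA/DLR implementation, and all numbers are invented.

        import numpy as np

        def bma_weights(predictions, observation, sigma=1.0):
            """Posterior model weights from Gaussian likelihoods (equal priors)."""
            lik = np.exp(-0.5 * ((predictions - observation) / sigma) ** 2)
            return lik / lik.sum()

        # Three hypothetical wake-decay predictions scored on one observed value.
        preds = np.array([0.82, 0.74, 0.90])       # normalized circulation
        obs = 0.80
        w = bma_weights(preds, obs, sigma=0.05)
        print(w, "ensemble mean:", w @ preds)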

  15. How does higher frequency monitoring data affect the calibration of a process-based water quality model?

    NASA Astrophysics Data System (ADS)

    Jackson-Blake, Leah; Helliwell, Rachel

    2015-04-01

    Process-based catchment water quality models are increasingly used as tools to inform land management. However, for such models to be reliable they need to be well calibrated and shown to reproduce key catchment processes. Calibration can be challenging for process-based models, which tend to be complex and highly parameterised. Calibrating a large number of parameters generally requires a large amount of monitoring data, spanning all hydrochemical conditions. However, regulatory agencies and research organisations generally only sample at a fortnightly or monthly frequency, even in well-studied catchments, often missing peak flow events. The primary aim of this study was therefore to investigate how the quality and uncertainty of model simulations produced by a process-based, semi-distributed catchment model, INCA-P (the INtegrated CAtchment model of Phosphorus dynamics), were improved by calibration to higher frequency water chemistry data. Two model calibrations were carried out for a small rural Scottish catchment: one using 18 months of daily total dissolved phosphorus (TDP) concentration data, another using a fortnightly dataset derived from the daily data. To aid comparability, calibrations were carried out automatically using the Markov Chain Monte Carlo - DiffeRential Evolution Adaptive Metropolis (MCMC-DREAM) algorithm. Calibration to daily data resulted in improved simulation of peak TDP concentrations and improved model performance statistics. Parameter-related uncertainty in simulated TDP was large when fortnightly data was used for calibration, with a 95% credible interval of 26 μg/l. This uncertainty is comparable in size to the difference between Water Framework Directive (WFD) chemical status classes, and would therefore make it difficult to use this calibration to predict shifts in WFD status. The 95% credible interval reduced markedly with the higher frequency monitoring data, to 6 μg/l. The number of parameters that could be reliably auto-calibrated was lower for the fortnightly data, with a physically unrealistic TDP simulation being produced when too many parameters were allowed to vary during model calibration. Parameters should not therefore be varied spatially for models such as INCA-P unless there is solid evidence that this is appropriate, or there is a real need to do so for the model to fulfil its purpose. This study highlights the potential pitfalls of using low-frequency time series of observed water quality to calibrate complex process-based models. For reliable model calibrations to be produced, monitoring programmes need to be designed that capture system variability, in particular nutrient dynamics during high flow events. In addition, there is a need for simpler models, so that all model parameters can be included in auto-calibration and uncertainty analysis, and to reduce the data needs during calibration.

  16. Activated sludge pilot plant: comparison between experimental and predicted concentration profiles using three different modelling approaches.

    PubMed

    Le Moullec, Y; Potier, O; Gentric, C; Leclerc, J P

    2011-05-01

    This paper presents an experimental and numerical study of an activated sludge channel pilot plant. Concentration profiles of oxygen, COD, NO(3) and NH(4) have been measured for several operating conditions. These profiles have been compared to those simulated with three different modelling approaches, namely a systemic approach, CFD and compartmental modelling. For all three approaches, the kinetics model was the ASM-1 model (Henze et al., 2001). The three approaches allowed a reasonable simulation of all the concentration profiles except for ammonium, for which the simulation results were far from the experimental ones. The analysis of the results showed that the role of the kinetics model is of primary importance for the prediction of activated sludge reactor performance. The fact that existing kinetics parameters in the literature have been determined by parametric optimisation using a systemic model limits the reliability of the prediction of local concentrations and of the local design of activated sludge reactors.

  17. Simulation of an Asynchronous Machine by using a Pseudo Bond Graph

    NASA Astrophysics Data System (ADS)

    Romero, Gregorio; Felez, Jesus; Maroto, Joaquin; Martinez, M. Luisa

    2008-11-01

    For engineers, computer simulation is a basic tool, since it enables them to understand how systems work without actually needing to see them. They can learn how systems behave in different circumstances and optimize their design with considerably less cost in terms of time and money than if they had to carry out tests on a physical system. However, if computer simulation is to be reliable, it is essential for the simulation model to be validated. There is a wide range of commercial products on the market for electrical-domain simulation (SPICE, LabVIEW, PSCAD, Dymola, Simulink, Simplorer, ...). These are powerful tools, but they require the engineer to have a thorough knowledge of the electrical field. This paper shows an alternative methodology for simulating an asynchronous machine using the multidomain Bond Graph technique, which can be applied in any program that supports the simulation of models based on this technique; no extraordinary knowledge of the technique or of the electrical field is required to understand the process.

  18. A Comparison of Factor Score Estimation Methods in the Presence of Missing Data: Reliability and an Application to Nicotine Dependence

    ERIC Educational Resources Information Center

    Estabrook, Ryne; Neale, Michael

    2013-01-01

    Factor score estimation is a controversial topic in psychometrics, and the estimation of factor scores from exploratory factor models has historically received a great deal of attention. However, both confirmatory factor models and the existence of missing data have generally been ignored in this debate. This article presents a simulation study…

  19. Understanding Climate Uncertainty with an Ocean Focus

    NASA Astrophysics Data System (ADS)

    Tokmakian, R. T.

    2009-12-01

    Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and methods to quantify the reliability of a simulation. If quantitative uncertainty questions of the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007]. The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in ocean circulation due to parameter specification will be described, and early results using the ocean/ice components of the CCSM climate model in a designed experiment framework will be shown. Cox, P. and D. Stephenson, Climate Change: A Changing Climate for Prediction, 2007, Science 317 (5835), 207, DOI: 10.1126/science.1145956. Rougier, J. C., 2007: Probabilistic Inference for Future Climate Using an Ensemble of Climate Model Evaluations, Climatic Change, 81, 247-264. Smith L., 2002, What might we learn from climate forecasts? Proc. Nat’l Academy of Sciences, Vol. 99, suppl. 1, 2487-2492 doi:10.1073/pnas.012580599.

  20. Conformational Spread in the Flagellar Motor Switch: A Model Study

    PubMed Central

    Maini, Philip K.; Berry, Richard M.; Bai, Fan

    2012-01-01

    The reliable response to weak biological signals requires that they be amplified with fidelity. In E. coli, the flagellar motors that control swimming can switch direction in response to very small changes in the concentration of the signaling protein CheY-P, but how this works is not well understood. A recently proposed allosteric model based on cooperative conformational spread in a ring of identical protomers seems promising as it is able to qualitatively reproduce switching, locked state behavior and Hill coefficient values measured for the rotary motor. In this paper we undertook a comprehensive simulation study to analyze the behavior of this model in detail and made predictions on three experimentally observable quantities: switch time distribution, locked state interval distribution, Hill coefficient of the switch response. We parameterized the model using experimental measurements, finding excellent agreement with published data on motor behavior. Analysis of the simulated switching dynamics revealed a mechanism for chemotactic ultrasensitivity, in which cooperativity is indispensable for realizing both coherent switching and effective amplification. These results showed how cells can combine elements of analog and digital control to produce switches that are simultaneously sensitive and reliable. PMID:22654654
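
    A heavily simplified, Ising-like Metropolis sketch of conformational spread on a ring of protomers follows: neighbouring protomers pay an energy penalty for mismatched conformations, plus a small activity bias standing in for CheY-P binding. The ring size and parameter values are illustrative assumptions, not the paper's parameterization.

        import math, random

        N = 34          # protomers in the ring (illustrative size)
        J = 4.0         # coupling energy penalizing neighbour mismatch (kT units)
        field = 0.1     # activity bias, a stand-in for CheY-P binding

        state = [0] * N # 0 = CCW conformation, 1 = CW

        def delta_E(i):
            """Energy change for flipping protomer i (periodic neighbours)."""
            s, left, right = state[i], state[i - 1], state[(i + 1) % N]
            flipped = 1 - s
            mismatch_now = (s != left) + (s != right)
            mismatch_new = (flipped != left) + (flipped != right)
            return J * (mismatch_new - mismatch_now) - field * (flipped - s)

        for _ in range(200_000):            # Metropolis dynamics
            i = random.randrange(N)
            dE = delta_E(i)
            if dE <= 0 or random.random() < math.exp(-dE):
                state[i] = 1 - state[i]

        print("fraction CW:", sum(state) / N)  # near 0 or 1 once the ring locks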

  1. Reliable groundwater levels: failures and lessons learned from modeling and monitoring studies

    NASA Astrophysics Data System (ADS)

    Van Lanen, Henny A. J.

    2017-04-01

    Adequate management of groundwater resources requires an a priori assessment of the impacts of intended groundwater abstractions. Usually, groundwater flow modeling is used to simulate the influence of the planned abstraction on groundwater levels. Model performance is tested using observed groundwater levels. Where a multi-aquifer system occurs, groundwater levels in the different aquifers have to be monitored through observation wells with filters at different depths, i.e. above the impermeable clay layer (phreatic water level) and beneath it (artesian aquifer level). A reliable artesian level can only be measured if the space between the outer wall of the borehole (vertical narrow shaft) and the observation well is refilled with impermeable material at the correct depth (post-drilling phase) to prevent a vertical hydraulic connection between the artesian and phreatic aquifers. We encountered improper refilling, which made it impossible to monitor reliable artesian aquifer levels: at the location of the artesian observation well, a freely overflowing spring was seen, which implied that water leakage from the artesian aquifer affected the artesian groundwater level. Careful checking of the monitoring sites in a study area is a prerequisite for using observations in model performance assessment. After model testing, the groundwater model is forced with proposed groundwater abstractions (sites, extraction rates). The abstracted groundwater volume is compensated by a reduction of groundwater flow to the drainage network, and the model simulates the associated groundwater tables. The drawdown of the groundwater level is calculated by comparing the simulated groundwater level with and without groundwater abstraction. In lowland areas, such as vast areas of the Netherlands, the groundwater model has to consider a variable drainage network, which means that small streams only carry water during the wet winter season and run dry during the summer, whereas the main streams drain groundwater throughout the whole year. We simulated groundwater levels with a steady-state groundwater flow model with and without groundwater abstraction for the wet and dry seasons, i.e. considering a high (all streams included) and a low drainage density (only major streams), respectively. Groundwater drawdown maps for the wet and dry season were compiled. Stakeholders (farmers, ecologists) were very concerned about the large drawdowns. After discussions with the Water Supply Company and stakeholders, we realised that we had calculated unrealistically large drawdowns of the phreatic groundwater level for the dry season. We learnt that by applying a steady-state model we did not take into account the large volume of groundwater that is released from groundwater storage. The transient groundwater model that we then developed showed that the volume of groundwater released from storage per unit of time is significant, and that the drawdown of the phreatic groundwater level by the end of the dry period is substantially smaller than the one simulated by the steady-state model. The results of the transient groundwater flow model agreed rather well with the pumping test that lasted the whole dry season.

  2. Self-Tuning Method for Increased Obstacle Detection Reliability Based on Internet of Things LiDAR Sensor Models

    PubMed Central

    2018-01-01

    On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. The design of a self-tuning methodology and its implementation are presented in this paper, to maximize the reliability of a LiDAR sensor network for obstacle detection in ‘Internet of Things’ (IoT) mobility scenarios. The Webots Automobile 3D simulation tool for emulating sensor interaction in complex driving environments is selected in order to achieve that objective. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique, and an error-based prediction model library that is composed of a multilayer perceptron neural network, and k-nearest neighbors and linear regression models. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors that are required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving assistance user scenario connecting a network of five LiDAR sensors is designed and implemented to validate the accuracy of the computational intelligence-based framework. The results demonstrated that the self-tuning method is an appropriate strategy to increase the reliability of the sensor network while minimizing detection thresholds. PMID:29748521

  3. Self-Tuning Method for Increased Obstacle Detection Reliability Based on Internet of Things LiDAR Sensor Models.

    PubMed

    Castaño, Fernando; Beruvides, Gerardo; Villalonga, Alberto; Haber, Rodolfo E

    2018-05-10

    On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. The design of a self-tuning methodology and its implementation are presented in this paper, to maximize the reliability of a LiDAR sensor network for obstacle detection in 'Internet of Things' (IoT) mobility scenarios. The Webots Automobile 3D simulation tool for emulating sensor interaction in complex driving environments is selected in order to achieve that objective. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique, and an error-based prediction model library that is composed of a multilayer perceptron neural network, and k-nearest neighbors and linear regression models. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors that are required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving assistance user scenario connecting a network of five LiDAR sensors is designed and implemented to validate the accuracy of the computational intelligence-based framework. The results demonstrated that the self-tuning method is an appropriate strategy to increase the reliability of the sensor network while minimizing detection thresholds.
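
    A minimal tabular Q-learning sketch in the spirit of the method described above, where the action is the number of LiDAR sensors to enable; the state encoding and reward shaping here are invented for illustration.

        import random
        from collections import defaultdict

        ACTIONS = [1, 2, 3, 4, 5]          # number of LiDAR sensors to enable
        alpha, gamma, eps = 0.1, 0.9, 0.1  # learning rate, discount, exploration
        Q = defaultdict(float)             # tabular action values, Q[(state, action)]

        def choose(state):
            """Epsilon-greedy action selection."""
            if random.random() < eps:
                return random.choice(ACTIONS)
            return max(ACTIONS, key=lambda a: Q[(state, a)])

        def update(state, action, r, next_state):
            """One Q-learning step: Q(s,a) += alpha*(r + gamma*max_a' Q(s',a') - Q(s,a))."""
            best_next = max(Q[(next_state, a)] for a in ACTIONS)
            Q[(state, action)] += alpha * (r + gamma * best_next - Q[(state, action)])

        # Hypothetical reward: detection success minus a per-sensor cost term.
        def detection_reward(detected, n_sensors):
            return (1.0 if detected else -1.0) - 0.05 * n_sensors

        s = "dense_traffic"                # invented state label
        a = choose(s)
        update(s, a, detection_reward(True, a), s)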

  4. A stream temperature model for the Peace-Athabasca River basin

    NASA Astrophysics Data System (ADS)

    Morales-Marin, L. A.; Rokaya, P.; Wheater, H. S.; Lindenschmidt, K. E.

    2017-12-01

    Water temperature plays a fundamental role in water ecosystem functioning. Because it regulates flow energy and metabolic rates in organism productivity over a broad spectrum of space and time scales, water temperature constitutes an important indicator of aquatic ecosystem health. In cold region basins, stream water temperature modelling is also fundamental to predict ice freeze-up and break-up events in order to improve flood management. Multiple model approaches such as linear and multivariable regression methods, neural networks and thermal energy budget models have been developed and implemented to simulate stream water temperature. Most of these models have been applied to specific stream reaches and trained using observed data, but very little has been done to simulate water temperature in large catchment river networks. We present the coupling of the RBM model, a semi-Lagrangian water temperature model for advection-dominated river systems, and MESH, a semi-distributed hydrological model, to simulate stream water temperature in river catchments. The coupled models are implemented in the Peace-Athabasca River basin in order to analyze the variation in stream temperature regimes under changing hydrological and meteorological conditions. Uncertainty of the stream temperature simulations is also assessed in order to determine the degree of reliability of the estimates.

  5. Design and Analysis of a Low Latency Deterministic Network MAC for Wireless Sensor Networks

    PubMed Central

    Sahoo, Prasan Kumar; Pattanaik, Sudhir Ranjan; Wu, Shih-Lin

    2017-01-01

    The IEEE 802.15.4e standard has four different superframe structures for different applications. One of them is the low latency deterministic network (LLDN) superframe for wireless sensor networks, which operates in a star topology. In this paper, a new channel access mechanism for IEEE 802.15.4e-based LLDN shared slots is proposed, and analytical models are designed based on this channel access mechanism. A prediction model is designed to estimate the possible number of retransmission slots based on the number of failed transmissions. Performance analysis in terms of data transmission reliability, delay, throughput and energy consumption is provided based on our proposed designs. Our designs are validated against simulation and analytical results, and it is observed that the simulation results match well with the analytical ones. In addition, our designs are compared with the IEEE 802.15.4 MAC mechanism, and it is shown that ours outperforms it in terms of throughput, energy consumption, delay and reliability. PMID:28937632

  6. Design and Analysis of a Low Latency Deterministic Network MAC for Wireless Sensor Networks.

    PubMed

    Sahoo, Prasan Kumar; Pattanaik, Sudhir Ranjan; Wu, Shih-Lin

    2017-09-22

    The IEEE 802.15.4e standard has four different superframe structures for different applications. One of them is the low latency deterministic network (LLDN) superframe for wireless sensor networks, which operates in a star topology. In this paper, a new channel access mechanism for IEEE 802.15.4e-based LLDN shared slots is proposed, and analytical models are designed based on this channel access mechanism. A prediction model is designed to estimate the possible number of retransmission slots based on the number of failed transmissions. Performance analysis in terms of data transmission reliability, delay, throughput and energy consumption is provided based on our proposed designs. Our designs are validated against simulation and analytical results, and it is observed that the simulation results match well with the analytical ones. In addition, our designs are compared with the IEEE 802.15.4 MAC mechanism, and it is shown that ours outperforms it in terms of throughput, energy consumption, delay and reliability.
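
    A back-of-envelope version of the retransmission-prediction idea, under the simplifying assumption of an independent per-slot success probability p: each failed frame then needs a geometric number of retries with mean 1/p. The standard's actual prediction model is more detailed.

        def expected_retransmission_slots(n_failed, p_success):
            """Expected slots to clear n_failed frames when each retry succeeds
            independently with probability p_success (geometric mean 1/p each).
            """
            if not 0.0 < p_success <= 1.0:
                raise ValueError("p_success must be in (0, 1]")
            return n_failed / p_success

        print(expected_retransmission_slots(4, 0.8))   # -> 5.0 slots on average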

  7. Parallelization and automatic data distribution for nuclear reactor simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebrock, L.M.

    1997-07-01

    Detailed attempts at realistic nuclear reactor simulations currently take many times real time to execute on high performance workstations. Even the fastest sequential machine cannot run these simulations fast enough to ensure that the best corrective measure is used during a nuclear accident to prevent a minor malfunction from becoming a major catastrophe. Since sequential computers have nearly reached the speed of light barrier, these simulations will have to be run in parallel to make significant improvements in speed. In physical reactor plants, parallelism abounds. Fluids flow, controls change, and reactions occur in parallel with only adjacent components directly affecting each other. These do not occur in the sequentialized manner, with global instantaneous effects, that is often used in simulators. Development of parallel algorithms that more closely approximate the real-world operation of a reactor may, in addition to speeding up the simulations, actually improve the accuracy and reliability of the predictions generated. Three types of parallel architecture (shared memory machines, distributed memory multicomputers, and distributed networks) are briefly reviewed as targets for parallelization of nuclear reactor simulation. Various parallelization models (loop-based model, shared memory model, functional model, data parallel model, and a combined functional and data parallel model) are discussed along with their advantages and disadvantages for nuclear reactor simulation. A variety of tools are introduced for each of the models. Emphasis is placed on the data parallel model as the primary focus for two-phase flow simulation. Tools to support data parallel programming for multiple component applications and special parallelization considerations are also discussed.

  8. High effective inverse dynamics modelling for dual-arm robot

    NASA Astrophysics Data System (ADS)

    Shen, Haoyu; Liu, Yanli; Wu, Hongtao

    2018-05-01

    To deal with the problem of inverse dynamics modelling for a dual-arm robot, a recursive inverse dynamics modelling method based on the decoupled natural orthogonal complement is presented. In this model, the concepts and methods of decoupled natural orthogonal complement matrices are used to eliminate the constraint forces in the Newton-Euler kinematic equations, and screws are used to express the kinematic and dynamic variables. On this basis, the paper develops a special simulation program with the symbolic software Mathematica and conducts simulation research on a dual-arm robot. Simulation results show that the proposed method based on the decoupled natural orthogonal complement saves an enormous amount of CPU time compared with the recursive Newton-Euler kinematic equations, and that the results are correct and reasonable, which verifies the reliability and efficiency of the method.

  9. Modeling and numerical simulations of growth and morphologies of three dimensional aggregated silver films

    NASA Astrophysics Data System (ADS)

    Davis, L. J.; Boggess, M.; Kodpuak, E.; Deutsch, M.

    2012-11-01

    We report on a model for the deposition of three dimensional, aggregated nanocrystalline silver films, and an efficient numerical simulation method developed for visualizing such structures. We compare our results to a model system comprising chemically deposited silver films with morphologies ranging from dilute, uniform distributions of nanoparticles to highly porous aggregated networks. Disordered silver films grown in solution on silica substrates are characterized using digital image analysis of high resolution scanning electron micrographs. While the latter technique provides little volume information, plane-projected (two dimensional) island structure and surface coverage may be reliably determined. Three parameters governing film growth are evaluated using these data and used as inputs for the deposition model, greatly reducing computing requirements while still providing direct access to the complete (bulk) structure of the films throughout the growth process. We also show how valuable three dimensional characteristics of the deposited materials can be extracted using the simulated structures.

  10. Simulator of human visual perception

    NASA Astrophysics Data System (ADS)

    Bezzubik, Vitalii V.; Belashenkov, Nickolai R.

    2016-04-01

    The Difference of Circs (DoC) model, which simulates the response of ganglion-cell neurons to stimuli, is presented and studied in relation to the representation of the receptive fields of the human retina. According to this model, the response of neurons reduces to the execution of simple arithmetic operations, and the results of these calculations correlate well with experimental data over a wide range of stimulus parameters. The simplicity of the model and the reliability with which it reproduces responses allow us to propose the concept of a device that can simulate the signals generated by ganglion cells as a reaction to presented stimuli. The signals produced according to the DoC model are considered a result of the primary processing of information received from receptors, independently of their type, and may be sent to higher levels of the nervous system of living creatures for subsequent processing. Such a device may be used as a prosthesis for a disabled organ.
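
    Interpreting the DoC model as a center-surround difference of two concentric circular averages (an assumption; the authors' exact formulation may differ), the neuron response indeed reduces to simple arithmetic:

        import numpy as np

        def doc_response(image, cx, cy, r_center=2, r_surround=6, w=0.8):
            """Center-surround response: mean over a small disc minus a weighted
            mean over a larger concentric disc -- simple arithmetic only.
            """
            yy, xx = np.ogrid[:image.shape[0], :image.shape[1]]
            d2 = (xx - cx) ** 2 + (yy - cy) ** 2
            center = image[d2 <= r_center ** 2].mean()
            surround = image[d2 <= r_surround ** 2].mean()
            return center - w * surround

        # Bright spot on a dark background excites an on-center unit at the spot.
        img = np.zeros((32, 32)); img[14:18, 14:18] = 1.0
        print(doc_response(img, 16, 16))   # positive response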

  11. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Rasmussen, Martin

    2016-06-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model the dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals: integration with a high fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients; consideration of a PRA context; incorporation of a solid psychological basis for operator performance; and demonstration of a functional dynamic model of a plant upset condition and appropriate operator response. This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  12. Experiments with the Mesoscale Atmospheric Simulation System (MASS) using the synthetic relative humidity

    NASA Technical Reports Server (NTRS)

    Chang, Chia-Bo

    1994-01-01

    This study is intended to examine the impact of synthetic relative humidity on the model simulation of a mesoscale convective storm environment. The synthetic relative humidity is derived from National Weather Service surface observations and non-conventional sources including aircraft, radar, and satellite observations. The latter sources provide mesoscale data of very high spatial and temporal resolution. The synthetic humidity data are used to complement the National Weather Service rawinsonde observations. It is believed that a realistic representation of the initial moisture field in a mesoscale model is critical for the simulation of thunderstorm development and of the formation of non-convective clouds, as well as their effects on the surface energy budget. The impact will be investigated based on a real-data case study using the Mesoscale Atmospheric Simulation System (MASS) developed by Mesoscale Environmental Simulations Operations, Inc. MASS consists of objective analysis and initialization codes, and coarse-mesh and fine-mesh dynamic prediction models. Both models are three-dimensional, primitive equation models containing the essential moist physics for simulating and forecasting mesoscale convective processes in the atmosphere. The modeling system is currently implemented at the Applied Meteorology Unit, Kennedy Space Center. Two procedures involving the synthetic relative humidity to define the model initial moisture fields are considered. It is proposed to perform several short-range (approximately 6 hours) comparative coarse-mesh simulation experiments with and without the synthetic data. These experiments are aimed at revealing model sensitivities, which should allow us both to refine the specification of the observational requirements and to develop more accurate and efficient objective analysis schemes. The goal is to advance the MASS modeling expertise so that the model output can provide reliable guidance for thunderstorm forecasting.

  13. A Time-Variant Reliability Model for Copper Bending Pipe under Seawater-Active Corrosion Based on the Stochastic Degradation Process

    PubMed Central

    Li, Mengmeng; Feng, Qiang; Yang, Dezhen

    2018-01-01

    In the degradation process, the randomness and multiplicity of variables are difficult to describe with mathematical models. However, they are common in engineering and cannot be neglected, so it is necessary to study this issue in depth. In this paper, the copper bending pipe in seawater piping systems is taken as the analysis object, and the time-variant reliability is calculated by solving the interference of limit strength and maximum stress. We performed degradation and tensile experiments on the copper material and obtained the limit strength at each time; in addition, degradation experiments on the copper bending pipe were carried out, the thickness at each time was obtained, and the response of maximum stress was then calculated by simulation. Further, with the help of a Monte Carlo method we propose, the time-variant reliability of the copper bending pipe was calculated based on the stochastic degradation process and interference theory. Compared with traditional methods and verified by maintenance records, the results show that the time-variant reliability model based on the stochastic degradation process proposed in this paper has better applicability in reliability analysis, and it can more conveniently and accurately predict the replacement cycle of copper bending pipe under seawater-active corrosion. PMID:29584695
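
    The interference calculation described above can be sketched with a few lines of Monte Carlo: sample degrading strength and operating stress at each time and count how often strength exceeds stress. The distributions and degradation rate below are illustrative, not the experimental values.

        import numpy as np

        rng = np.random.default_rng(42)

        def reliability_at(t_years, n=200_000):
            """P(limit strength > maximum stress) at time t, with strength
            degrading linearly in the mean (illustrative stochastic degradation).
            """
            strength = rng.normal(300.0 - 6.0 * t_years, 15.0, n)   # MPa
            stress = rng.normal(180.0, 20.0, n)                     # MPa
            return np.mean(strength > stress)

        for t in (0, 5, 10, 15):
            print(f"t = {t:2d} yr  R = {reliability_at(t):.4f}")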

  14. Improving the XAJ Model on the Basis of Mass-Energy Balance

    NASA Astrophysics Data System (ADS)

    Fang, Yuanhao; Corbari, Chiara; Zhang, Xingnan; Mancini, Marco

    2014-11-01

    Introduction: The Xin'anjiang (XAJ) model is a conceptual model developed by the group led by Prof. Ren-Jun Zhao, which takes pan evaporation as one of its inputs and then computes the effective evapotranspiration (ET) of the catchment by mass balance. Such a scheme can ensure a good performance of discharge simulation but has obvious defects, one of which is that the effective ET is spatially constant over the computation unit, neglecting the spatial variation of the variables that influence the effective ET; therefore the simulation of ET and soil moisture (SM) by the XAJ model is, compared with discharge, less reliable. In this study, the XAJ model was improved to employ both energy and mass balance to compute the ET, following the energy-mass balance scheme of the FEST-EWB model.

  15. Improving the XAJ Model on the Basis of Mass-Energy Balance

    NASA Astrophysics Data System (ADS)

    Fang, Yuanghao; Corbari, Chiara; Zhang, Xingnan; Mancini, Marco

    2014-11-01

    The Xin'anjiang (XAJ) model is a conceptual model developed by the group led by Prof. Ren-Jun Zhao, which takes pan evaporation as one of its inputs and then computes the effective evapotranspiration (ET) of the catchment by mass balance. Such a scheme can ensure a good performance of discharge simulation but has obvious defects, one of which is that the effective ET is spatially constant over the computation unit, neglecting the spatial variation of the variables that influence the effective ET; therefore the simulation of ET and soil moisture (SM) by the XAJ model is, compared with discharge, less reliable. In this study, the XAJ model was improved to employ both energy and mass balance to compute the ET, following the energy-mass balance scheme of the FEST-EWB model.

  16. Model structure identification for wastewater treatment simulation based on computational fluid dynamics.

    PubMed

    Alex, J; Kolisch, G; Krause, K

    2002-01-01

    The objective of the presented project is to use the results of a CFD simulation to automatically, systematically and reliably generate an appropriate model structure for the simulation of the biological processes using CSTR activated sludge compartments. Models and dynamic simulation have become important tools for research, but also increasingly for the design and optimisation of wastewater treatment plants. Besides the biological models, several cases have been reported of the application of computational fluid dynamics (CFD) to wastewater treatment plants. One aim of the presented method of deriving model structures from CFD results is to exclude the influence of empirical structure selection on the results of dynamic simulation studies of WWTPs. The second application of the approach developed is the analysis of badly performing treatment plants where the suspicion arises that poor flow behaviour, such as short-circuit flows, is part of the problem. The method suggested requires as a first step the calculation of the fluid dynamics of the biological treatment step at different loading situations by means of 3-dimensional CFD simulation. This information is then used to automatically generate a suitable model structure for conventional dynamic simulation of the treatment plant, using a number of CSTR modules with a pattern of exchange flows between the tanks. The method is explained in detail and the application to the WWTP Wuppertal-Buchenhofen is presented.
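
    One plausible way to automate the grouping step (an assumption for illustration, not the authors' algorithm) is to cluster CFD cells on position and velocity so that each cluster becomes a CSTR:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        # Stand-in for an exported CFD field: one row per cell -> x, y, z, u, v, w.
        cells = rng.random((5000, 6))
        features = np.hstack([cells[:, :3],          # position
                              5.0 * cells[:, 3:6]])  # velocity, up-weighted

        # Each cluster becomes one CSTR compartment; exchange flows between
        # compartments would then be computed from fluxes across cluster borders.
        labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(features)
        print(np.bincount(labels))                   # cells per compartment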

  17. The Prodiguer Messaging Platform

    NASA Astrophysics Data System (ADS)

    Denvil, S.; Greenslade, M. A.; Carenton, N.; Levavasseur, G.; Raciazek, J.

    2015-12-01

    CONVERGENCE is a French multi-partner national project designed to gather HPC and informatics expertise to innovate in the context of running French global climate models with differing grids and at differing resolutions. Efficient and reliable execution of these models and the management and dissemination of model output are some of the complexities that CONVERGENCE aims to resolve. At any one moment in time, researchers affiliated with the Institut Pierre Simon Laplace (IPSL) climate modeling group are running hundreds of global climate simulations. These simulations execute upon a heterogeneous set of French High Performance Computing (HPC) environments. The IPSL's simulation execution runtime libIGCM (library for IPSL Global Climate Modeling group) has recently been enhanced so as to support hitherto impossible realtime use cases such as simulation monitoring, data publication, metrics collection, simulation control, and visualizations. At the core of this enhancement is Prodiguer: an AMQP (Advanced Message Queuing Protocol) based event-driven asynchronous distributed messaging platform. libIGCM now dispatches copious amounts of information, in the form of messages, to the platform for remote processing by Prodiguer software agents at IPSL servers in Paris. Such processing takes several forms: persisting message content to database(s); launching rollback jobs upon simulation failure; notifying downstream applications; and automating visualization pipelines. We will describe and/or demonstrate the platform's technical implementation, its inherent ease of scalability, its inherent adaptiveness in respect of supervising simulations, and a web portal receiving simulation notifications in realtime.
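
    As a flavour of what AMQP-based dispatch looks like, here is a minimal publisher using the pika Python client; the broker address, exchange name, routing key, and message schema are invented for illustration and are not Prodiguer's actual topology.

        import json
        import pika  # standard Python AMQP client

        # Connect to a local RabbitMQ broker (Prodiguer's actual brokers,
        # exchanges and message schema are not reproduced here).
        connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
        channel = connection.channel()
        channel.exchange_declare(exchange="simulation.monitoring",
                                 exchange_type="topic", durable=True)

        message = {
            "simulation_id": "ipsl-demo-0001",     # hypothetical identifiers
            "event": "timestep_complete",
            "timestep": 42,
        }
        channel.basic_publish(
            exchange="simulation.monitoring",
            routing_key="simulation.monitoring.progress",
            body=json.dumps(message),
            properties=pika.BasicProperties(delivery_mode=2),  # persist message
        )
        connection.close()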

  18. Comparison of Actual Surgical Outcomes and 3D Surgical Simulations

    PubMed Central

    Tucker, Scott; Cevidanes, Lucia; Styner, Martin; Kim, Hyungmin; Reyes, Mauricio; Proffit, William; Turvey, Timothy

    2009-01-01

    Purpose: The advent of imaging software programs has proved useful for diagnosis, treatment planning, and outcome measurement, but the precision of 3D surgical simulation still needs to be tested. This study was conducted to determine whether virtual surgery performed on 3D models constructed from cone-beam CT (CBCT) can correctly simulate the actual surgical outcome, and to validate the ability of this emerging technology to recreate the orthognathic surgery hard tissue movements in 3 translational and 3 rotational planes of space. Methods: Pre- and post-surgery 3D models were constructed from CBCTs of 14 patients who had combined maxillary advancement and mandibular setback surgery and 6 patients who had one-piece maxillary advancement surgery. The post-surgery and virtually simulated surgery 3D models were registered at the cranial base to quantify differences between simulated and actual surgery models. Hotelling T-tests were used to assess the differences between simulated and actual surgical outcomes. Results: For all anatomic regions of interest, there was no statistically significant difference between the simulated and the actual surgical models. The right lateral ramus was the only region that showed a statistically significant, but small, difference when comparing two- and one-jaw surgeries. Conclusions: Virtual surgical methods were reliably reproduced; oral surgery residents could benefit from virtual surgical training; and computer simulation has the potential to increase predictability in the operating room. PMID:20591553

  19. Numerical Simulation of Hydraulic Fracturing in Low-/High-Permeability, Quasi-Brittle and Heterogeneous Rocks

    NASA Astrophysics Data System (ADS)

    Pakzad, R.; Wang, S. Y.; Sloan, S. W.

    2018-04-01

    In this study, an elastic-brittle-damage constitutive model was incorporated into the coupled fluid/solid analysis of ABAQUS to iteratively calculate the equilibrium effective stress of Biot's theory of consolidation. The Young's modulus, strength and permeability parameter of the material were randomly assigned to the representative volume elements of finite element models following the Weibull distribution function. The hydraulic conductivity of elements was associated with their hydrostatic effective stress and damage level. The steady-state permeability test results for sandstone specimens under different triaxial loading conditions were reproduced by employing the same set of material parameters in coupled transient flow/stress analyses of plane-strain models, thereby indicating the reliability of the numerical model. The influence of heterogeneity on the failure response and the absolute permeability was investigated, and the post-peak permeability was found to decrease with the heterogeneity level in the coupled analysis with transient flow. The proposed model was applied to the plane-strain simulation of the fluid pressurization of a cavity within a large-scale block under different conditions. Regardless of the heterogeneity level, the hydraulically driven fractures propagated perpendicular to the minimum principal far-field stress direction for high-permeability models under anisotropic far-field stress conditions. Scattered damage elements appeared in the models with higher degrees of heterogeneity. The partially saturated areas around propagating fractures were simulated by relating the saturation degree to the negative pore pressure in low-permeability blocks under high pressure. By replicating previously reported trends in the fracture initiation and breakdown pressure for different pressurization rates and hydraulic conductivities, the results showed that the proposed model for hydraulic fracture problems is reliable for a wide range of pressurization rates and permeability conditions.
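
    The Weibull assignment of heterogeneous element properties described above can be sketched in a few lines (parameter values assumed for illustration; a higher shape parameter m gives a more homogeneous material):

```python
# A minimal sketch (assumed parameters) of assigning Weibull-distributed
# Young's moduli to the elements of a finite element mesh.
import numpy as np

rng = np.random.default_rng(0)
n_elem = 10_000
m = 3.0          # Weibull shape (heterogeneity) parameter, assumed
E0 = 30e9        # scale value of Young's modulus (Pa), assumed

# numpy samples the standard Weibull (scale 1); multiply by the scale value.
E = E0 * rng.weibull(m, size=n_elem)
print(E.mean(), E.std())  # spread shrinks as m grows
```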

  20. Reliability Analysis and Reliability-Based Design Optimization of Circular Composite Cylinders Under Axial Compression

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2001-01-01

    This report describes the preliminary results of an investigation into component reliability analysis and reliability-based design optimization of thin-walled circular composite cylinders with an average diameter and average length of 15 inches. Structural reliability is based on the axial buckling strength of the cylinder. Both Monte Carlo simulation and the First Order Reliability Method are considered for reliability analysis, with the latter incorporated into the reliability-based structural optimization problem. To improve the efficiency of reliability sensitivity analysis and the design optimization solution, the buckling strength of the cylinder is estimated using a second-order response surface model. The sensitivity of the reliability index with respect to the mean and standard deviation of each random variable is calculated and compared. The reliability index is found to be extremely sensitive to the applied load and the elastic modulus of the material in the fiber direction; the cylinder diameter has the third highest impact on the reliability index. The uncertainty in the applied load, captured by examining different values of its coefficient of variation, is also found to have a large influence on cylinder reliability. The optimization problem for minimum weight is solved subject to a design constraint on the element reliability index. The methodology, solution procedure and optimization results are included in this report.
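
    A minimal sketch of the reported approach, with hypothetical response-surface coefficients and input distributions: fit a quadratic surrogate for buckling strength, then estimate the failure probability by Monte Carlo as P(strength < load):

```python
# A minimal sketch (hypothetical coefficients and distributions, not the
# report's values): a second-order response surface stands in for the
# expensive buckling analysis inside a Monte Carlo reliability estimate.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 1_000_000

P  = rng.normal(1000.0, 100.0, n)   # applied axial load, assumed normal
E1 = rng.normal(140e3, 7e3, n)      # fiber-direction modulus (MPa), assumed normal

def strength(E1):
    # Hypothetical fitted second-order response surface for buckling strength.
    return 200.0 + 9.5e-3 * E1 - 8.0e-9 * E1**2

pf = np.mean(strength(E1) < P)
print(f"P_f ~ {pf:.1e}, beta ~ {-norm.ppf(pf):.2f}")  # failure probability and reliability index
```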

  1. Simulation and experimental research of 1MWe solar tower power plant in China

    NASA Astrophysics Data System (ADS)

    Yu, Qiang; Wang, Zhifeng; Xu, Ershu

    2016-05-01

    The establishment of a reliable simulation system for a solar tower power plant can greatly increase the economic and safety performance of the whole system. In this paper, a dynamic model of the 1MWe Solar Tower Power Plant at Badaling in Beijing is developed based on the "STAR-90" simulation platform, including the heliostat field, the central receiver system (water/steam), etc. The dynamic behavior of the whole CSP plant can be simulated. In order to verify the validity of the simulation system, a complete experimental process was synchronously simulated by repeating the same operating steps on the simulation platform, including the locations and number of heliostats, the mass flow of the feed water, etc. From the simulation and experimental results, key parameters were selected for detailed comparison. The results show good agreement between the simulations and the experiments, with an error range that is acceptable given the uncertainty of the models. Finally, a comprehensive analysis of the error sources is carried out based on the comparative results.

  2. Simulated Annealing Based Hybrid Forecast for Improving Daily Municipal Solid Waste Generation Prediction

    PubMed Central

    Song, Jingwei; He, Jiaying; Zhu, Menghua; Tan, Debao; Zhang, Yu; Ye, Song; Shen, Dingtao; Zou, Pengfei

    2014-01-01

    A simulated annealing (SA) based variable-weighted forecast model is proposed to combine and weight a local chaotic model, an artificial neural network (ANN), and a partial least squares support vector machine (PLS-SVM) to build a more accurate forecast model. The hybrid model was built, and its multistep-ahead prediction ability tested, on daily MSW generation data from Seattle, Washington, United States. The hybrid forecast model was shown to produce more accurate and reliable results, and to degrade less over longer prediction horizons, than the three individual models. The average one-week-ahead prediction error was reduced from 11.21% (chaotic model), 12.93% (ANN), and 12.94% (PLS-SVM) to 9.38%, and the five-week average from 13.02% (chaotic model), 15.69% (ANN), and 15.92% (PLS-SVM) to 11.27%. PMID:25301508
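
    A minimal sketch of the weighting idea follows (synthetic data, an assumed cooling schedule, and a plain MAPE objective rather than the paper's exact criterion): simulated annealing searches the weight simplex for the convex combination of the three forecasts with the lowest blended error.

```python
# A minimal sketch (assumed data and schedule) of SA-based weight selection
# for combining three imperfect forecasts of the same series.
import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(100.0, 10.0, 200)                              # synthetic "observed" series
preds = y + rng.normal(0.0, [[5.0], [6.0], [7.0]], (3, 200))  # three imperfect forecasts

def mape(w):
    return np.mean(np.abs((w @ preds - y) / y)) * 100         # blended forecast error (%)

w = np.ones(3) / 3.0
cost, T = mape(w), 1.0
while T > 1e-3:
    cand = np.abs(w + rng.normal(0.0, 0.05, 3))
    cand /= cand.sum()                                        # stay on the weight simplex
    new = mape(cand)
    if new < cost or rng.random() < np.exp((cost - new) / T):
        w, cost = cand, new                                   # accept better, or sometimes worse, weights
    T *= 0.995                                                # geometric cooling
print(w, cost)
```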

  3. Range Systems Simulation for the NASA Shuttle: Emphasis on Disaster and Prevention Management During Lift-Off

    NASA Technical Reports Server (NTRS)

    Rabelo, Lisa; Sepulveda, Jose; Moraga, Reinaldo; Compton, Jeppie; Turner, Robert

    2005-01-01

    This article describes a decision-making system composed of a number of safety and environmental models for the launch phase of a NASA Space Shuttle mission. The components of this distributed simulation environment represent the different systems that must collaborate to establish the Expectation of Casualties (E_c) caused by a failed Space Shuttle launch and subsequent explosion (accidental or instructed) of the spacecraft shortly after liftoff. This decision-making tool employs Space Shuttle reliability models, trajectory models, a blast model, weather dissemination systems, population models, amounts and types of toxicants, gas dispersion models, human response functions to toxicants, and a geographical information system. Since one of the important features of this proposed simulation environment is to measure blast, toxic, and debris effects, the clear benefit is that it can help safety managers not only estimate the population at risk, but also plan evacuations, make sheltering decisions, establish the resources required to provide aid and comfort, and mitigate damages in case of a disaster.
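
    The expectation-of-casualties calculation at the core of such a system can be sketched as a sum over population cells (all numbers below are synthetic, for illustration only):

```python
# A minimal sketch: E_c as the sum over population cells of population times
# the probability the cell lies in the hazard footprint times a casualty
# response fraction. All values are assumed, not from the article.
import numpy as np

pop = np.array([1200, 800, 50, 300])        # people per cell, assumed
p_hit = np.array([1e-5, 4e-5, 2e-4, 8e-6])  # P(cell inside blast/toxic/debris footprint), assumed
casualty_frac = 0.05                         # mean casualty fraction given exposure, assumed

Ec = np.sum(pop * p_hit * casualty_frac)
print(f"E_c = {Ec:.4f} expected casualties")
```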

  4. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    NASA Astrophysics Data System (ADS)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitatively based modelling approach, as an analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages and limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology. Measurement theory is proposed as a ready solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  5. Electron backscattering simulation in Geant4

    NASA Astrophysics Data System (ADS)

    Dondero, Paolo; Mantero, Alfonso; Ivanchencko, Vladimir; Lotti, Simone; Mineo, Teresa; Fioretti, Valentina

    2018-06-01

    The backscattering of electrons is a key phenomenon in several physics applications, ranging from medical therapy to space, including AREMBES, the new ESA simulation framework for radiation background effects. The importance of properly reproducing this complex interaction has grown considerably in recent years, and the Geant4 Monte Carlo simulation toolkit, recently upgraded to version 10.3, is able to comply with the AREMBES requirements over a wide energy range. In this study, a validation of the Geant4 electron backscattering models is performed against several sets of experimental data. In addition, a selection of the most recent validation results on electron scattering processes is also presented. Our analysis shows good agreement between simulations and data from several experiments, confirming the Geant4 electron backscattering models to be robust and reliable down to a few tens of electronvolts.

  6. Local control on precipitation in a fully coupled climate-hydrology model.

    PubMed

    Larsen, Morten A D; Christensen, Jens H; Drews, Martin; Butts, Michael B; Refsgaard, Jens C

    2016-03-10

    The ability of climate models to simulate regional precipitation realistically is essential for understanding and adapting to climate change. Due to the complexity of the associated processes, particularly at unresolved temporal and spatial scales, this continues to be a major challenge. As a result, climate simulations of precipitation often exhibit substantial biases that affect the reliability of future projections. Here we demonstrate how a regional climate model (RCM), coupled to a distributed hydrological catchment model that fully integrates water and energy fluxes between the subsurface, land surface, plant cover and the atmosphere, enables a realistic representation of local precipitation. Substantial improvements in simulated precipitation dynamics on seasonal and longer time scales are seen for a simulation period of six years and can be attributed to a more complete treatment of hydrological sub-surface processes, including groundwater and moisture feedback. A high degree of local influence on the atmosphere suggests that coupled climate-hydrology models have the potential to improve climate projections, and the results further indicate a diminished need for bias correction in climate-hydrology impact studies.

  7. Local control on precipitation in a fully coupled climate-hydrology model

    PubMed Central

    Larsen, Morten A. D.; Christensen, Jens H.; Drews, Martin; Butts, Michael B.; Refsgaard, Jens C.

    2016-01-01

    The ability of climate models to simulate regional precipitation realistically is essential for understanding and adapting to climate change. Due to the complexity of the associated processes, particularly at unresolved temporal and spatial scales, this continues to be a major challenge. As a result, climate simulations of precipitation often exhibit substantial biases that affect the reliability of future projections. Here we demonstrate how a regional climate model (RCM), coupled to a distributed hydrological catchment model that fully integrates water and energy fluxes between the subsurface, land surface, plant cover and the atmosphere, enables a realistic representation of local precipitation. Substantial improvements in simulated precipitation dynamics on seasonal and longer time scales are seen for a simulation period of six years and can be attributed to a more complete treatment of hydrological sub-surface processes, including groundwater and moisture feedback. A high degree of local influence on the atmosphere suggests that coupled climate-hydrology models have the potential to improve climate projections, and the results further indicate a diminished need for bias correction in climate-hydrology impact studies. PMID:26960564

  8. Fault detection and diagnosis of photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Wu, Xing

    The rapid growth of the solar industry over the past several years has expanded the significance of photovoltaic (PV) systems. One of the primary aims of research in building-integrated PV systems is to improve the system's efficiency, availability, and reliability. Although much work has been done on technological design to increase a photovoltaic module's efficiency, there is little research so far on fault diagnosis for PV systems. Faults in a PV system, if not detected, may not only reduce power generation, but also threaten the availability and reliability, effectively the "security", of the whole system. In this paper, first, a circuit-based baseline simulation model of a PV system with maximum power point tracking (MPPT) is developed using MATLAB software. MATLAB is one of the most popular tools for integrating computation, visualization and programming in an easy-to-use modeling environment. Second, data on a PV system at variable surface temperatures and insolation levels under normal operation are collected. The simulation model of the PV system is then calibrated and improved by comparing modeled I-V and P-V characteristics with measured ones, to ensure that the simulated curves are close to the values measured in the experiments. Finally, based on the circuit-based simulation model, PV models of various types of faults are developed by changing conditions or inputs in the MATLAB model, and the I-V and P-V characteristic curves and the time-dependent voltage and current characteristics are characterized for each type of fault. These are developed as benchmark I-V and P-V curves, or prototype transient curves. If a fault occurs in a PV system, polling and comparing actual measured I-V and P-V characteristic curves against both the normal operational curves and these baseline fault curves will aid in fault diagnosis.
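
    Although the work above uses MATLAB, the circuit relation at the heart of such a model can be sketched in a few lines of Python (typical assumed parameters; series and shunt resistances neglected for brevity): the single-diode equation traces the I-V and P-V curves whose maximum the MPPT controller tracks.

```python
# A minimal sketch of a single-diode PV module model with assumed parameters.
import numpy as np

q, kB = 1.602e-19, 1.381e-23            # electron charge (C), Boltzmann constant (J/K)
T = 298.15                               # cell temperature (K), assumed
Iph, I0, n, Ns = 8.0, 1e-9, 1.0, 60      # photocurrent (A), saturation current (A), ideality, series cells
Vt = n * Ns * kB * T / q                 # thermal voltage of the series string

V = np.linspace(0.0, 36.0, 400)
I = np.clip(Iph - I0 * np.expm1(V / Vt), 0.0, None)  # single-diode law; Rs and Rsh neglected
P = V * I
i = int(np.argmax(P))
print(f"MPP ~ {P[i]:.0f} W at {V[i]:.1f} V")         # the operating point an MPPT controller seeks
```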

  9. Evaluating the Relevance, Reliability, and Applicability of CMIP5 Climate Projections for Water Resources and Environmental Planning

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Scott, J.; Ferguson, I. M.; Arnold, J.; Raff, D. A.; Webb, R. S.

    2012-12-01

    Water managers need to understand the applicability of climate projection information available for decision-support at the scale of their applications. Applicability depends on information reliability and relevance. This need to understand applicability stems from expectations that entities rationalize adaptation investments or decisions to delay investment. It is also occurring at a time when new global climate projections are being released through the World Climate Research Programme Coupled Model Intercomparison Project phase 5 (CMIP5), which introduces new information opportunities and interpretation challenges. This project involves an interagency collaboration to evaluate the applicability of CMIP5 projections for use in water and environmental resources planning. The overarching goal is to develop and demonstrate a framework that involves dual evaluations of relevance and reliability informing an ultimate discussion and judgment of applicability, which is expected to vary with decision-making context. The framework is being developed and demonstrated within the context of reservoir systems management in California's Sacramento and San Joaquin River basins. The relevance evaluation focuses on identifying the climate variables and statistical measures relevant to long-term management questions, which may depend on satisfying multiple objectives. Past studies' results are being considered in this evaluation, along with new results from system sensitivity analyses conducted through this effort. The reliability evaluation focuses on the CMIP5 climate models' ability to simulate past conditions relative to observed references. The evaluation is being conducted across the global domain using a large menu of climate variables and statistical measures, leveraging lessons learned from similar evaluations of CMIP3 climate models. The global focus addresses a broader project goal of producing a web resource that can serve reliability information to applicability discussions around the world, with evaluation results being served through a web-portal similar to that developed by NOAA/CIRES to serve CMIP3 information on future climate extremes (http://www.esrl.noaa.gov/psd/ipcc/extremes/). The framework concludes with an applicability discussion informed by relevance and reliability results. The goal is to observe the discussion process and identify features, choice points, and challenges that might be summarized and shared with other resource management groups facing applicability questions. This presentation will discuss the project framework and preliminary results. In addition to considering CMIP5 21st century projection information, the framework is being developed to support evaluation of CMIP5 decadal predictability experiment simulations and reconcile those simulations with 21st century projections. The presentation will also discuss implications of considering the applicability of bias-corrected and downscaled information within this framework.

  10. On Fitting Generalized Linear Mixed-effects Models for Binary Responses using Different Statistical Packages

    PubMed Central

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W.; Xia, Yinglin; Tu, Xin M.

    2011-01-01

    The generalized linear mixed-effects model (GLMM) is a popular paradigm for extending models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages, and even different procedures within a package, may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fitting correlated binary responses. We then illustrate these considerations by applying the procedures, as implemented in some popular software packages, to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. PMID:21671252
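
    For a flavour of fitting correlated binary responses in one package, here is a hedged sketch in Python using statsmodels. Note this is a marginal GEE fit, a population-averaged alternative, not one of the GLMM procedures the report compares; the data are simulated with a shared random intercept per subject.

```python
# A minimal sketch: simulate correlated binary responses, then fit a marginal
# logistic model with GEE and an exchangeable working correlation.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_subj, n_rep = 200, 4
b = rng.normal(0.0, 1.0, n_subj)                # subject-level random intercepts
x = rng.normal(0.0, 1.0, (n_subj, n_rep))
eta = -0.5 + 1.0 * x + b[:, None]               # conditional (subject-specific) linear predictor
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

df = pd.DataFrame({"y": y.ravel(), "x": x.ravel(),
                   "id": np.repeat(np.arange(n_subj), n_rep)})
model = sm.GEE.from_formula("y ~ x", groups="id", data=df,
                            family=sm.families.Binomial(),
                            cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```

    The fitted marginal slope will be attenuated relative to the subject-specific coefficient of 1.0 used to simulate the data, which is one reason conditional (GLMM) and marginal procedures can legitimately disagree.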

  11. PERFORMANCE, RELIABILITY, AND IMPROVEMENT OF A TISSUE-SPECIFIC METABOLIC SIMULATOR

    EPA Science Inventory

    A methodology is described that has been used to build and enhance a simulator for rat liver metabolism providing reliable predictions within a large chemical domain. The tissue metabolism simulator (TIMES) utilizes a heuristic algorithm to generate plausible metabolic maps using...

  12. Structural Reliability and Monte Carlo Simulation.

    ERIC Educational Resources Information Center

    Laumakis, P. J.; Harlow, G.

    2002-01-01

    Analyzes a simple boom structure and assesses its reliability using elementary engineering mechanics. Demonstrates the power and utility of Monte-Carlo simulation by showing that such a simulation can be implemented more readily with results that compare favorably to the theoretical calculations. (Author/MM)
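
    A minimal version of the exercise, with assumed normal distributions for strength and load, compares the analytic reliability result with a Monte Carlo estimate:

```python
# A minimal sketch (assumed distributions, not the article's numbers):
# analytic P(R < S) for normal strength R and load S versus Monte Carlo.
import numpy as np
from scipy.stats import norm

muR, sdR = 60.0, 6.0     # member strength (kN), assumed
muS, sdS = 40.0, 8.0     # applied load effect (kN), assumed

beta = (muR - muS) / np.hypot(sdR, sdS)   # reliability index for the margin R - S
pf_exact = norm.cdf(-beta)

rng = np.random.default_rng(4)
n = 1_000_000
pf_mc = np.mean(rng.normal(muR, sdR, n) < rng.normal(muS, sdS, n))
print(pf_exact, pf_mc)   # the two estimates agree closely, as the article demonstrates
```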

  13. 78 FR 29049 - Streptomycin; Pesticide Tolerances for Emergency Exemptions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-17

    ... exposures for which there is reliable information.'' This includes exposure through drinking water and in... commodities. 2. Dietary exposure from drinking water. The Agency used screening level water exposure models in the dietary exposure analysis and risk assessment for streptomycin in drinking water. These simulation...

  14. Autonomous Energy Grids | Grid Modernization | NREL

    Science.gov Websites

    … control themselves using advanced machine learning and simulation to create resilient, reliable, and affordable optimized energy systems. Current frameworks to monitor, control, and optimize large-scale energy … of optimization theory, control theory, big data analytics, and complex system theory and modeling …

  15. Integrating Solar PV in Utility System Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, A.; Botterud, A.; Wu, J.

    2013-10-31

    This study develops a systematic framework for estimating the increase in operating costs due to uncertainty and variability in renewable resources, uses the framework to quantify the integration costs associated with sub-hourly solar power variability and uncertainty, and shows how changes in system operations may affect these costs. Toward this end, we present a statistical method for estimating the required balancing reserves to maintain system reliability, along with a model for commitment and dispatch of the portfolio of thermal and renewable resources at different stages of system operations. We estimate the costs of sub-hourly solar variability, short-term forecast errors, and day-ahead (DA) forecast errors as the difference in production costs between a case with "realistic" PV (i.e., sub-hourly solar variability and uncertainty are fully included in the modeling) and a case with "well behaved" PV (i.e., PV is assumed to have no sub-hourly variability and can be perfectly forecasted). In addition, we highlight current practices that allow utilities to compensate for the issues encountered at the sub-hourly time frame with increased levels of PV penetration. In this analysis we use the analytical framework to simulate utility operations with increasing deployment of PV in a case study of Arizona Public Service Company (APS), a utility in the southwestern United States. In our analysis, we focus on three processes that are important in understanding the management of PV variability and uncertainty in power system operations. First, we represent the decisions made the day before the operating day through a DA commitment model that relies on imperfect DA forecasts of load and wind as well as PV generation. Second, we represent the decisions made by schedulers in the operating day through hour-ahead (HA) scheduling. Peaking units can be committed or decommitted in the HA schedules and online units can be redispatched using forecasts that are improved relative to DA forecasts, but still imperfect. Finally, we represent decisions within the operating hour by schedulers and transmission system operators as real-time (RT) balancing. We simulate the DA and HA scheduling processes with a detailed unit-commitment (UC) and economic dispatch (ED) optimization model. This model creates a least-cost dispatch and commitment plan for the conventional generating units using forecasts and reserve requirements as inputs. We consider only the generation units and load of the utility in this analysis; we do not consider opportunities to trade power with neighboring utilities. We also do not consider provision of reserves from renewables or from demand-side options. We estimate dynamic reserve requirements in order to meet reliability requirements in the RT operations, considering the uncertainty and variability in load, solar PV, and wind resources. Balancing reserve requirements are based on the 2.5th and 97.5th percentiles of 1-min deviations from the HA schedule in a previous year. We then simulate RT deployment of balancing reserves using a separate minute-by-minute simulation of deviations from the HA schedules in the operating year. In the simulations we assume that balancing reserves can be fully deployed in 10 min. The minute-by-minute deviations account for HA forecasting errors and the actual variability of the load, wind, and solar generation.
Using these minute-by-minute deviations and the deployment of balancing reserves, we evaluate the impact of PV on system reliability through the calculation of the standard reliability metric called Control Performance Standard 2 (CPS2). Broadly speaking, the CPS2 score measures the percentage of 10-min periods in which a balancing area is able to balance supply and demand within a specific threshold. Compliance with the North American Electric Reliability Corporation (NERC) reliability standards requires that the CPS2 score exceed 90% (i.e., the balancing area must maintain adequate balance for 90% of the 10-min periods). The combination of representing DA forecast errors in the DA commitments, using 1-min PV data to simulate RT balancing, and estimating reliability performance through the CPS2 metric, all factors that are important to operating systems with increasing amounts of PV, makes this study unique in its scope.
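
    Two of the calculations described above can be sketched directly (synthetic deviations and an assumed compliance bound): reserve requirements as the 2.5th/97.5th percentiles of 1-minute deviations, and a CPS2-style score as the share of 10-minute periods whose mean imbalance stays inside a bound.

```python
# A minimal sketch (synthetic data) of percentile-based reserve requirements
# and a CPS2-style compliance score. L10 is an assumed bound, not APS's value.
import numpy as np

rng = np.random.default_rng(5)
dev = rng.normal(0.0, 20.0, 60 * 24 * 365)       # synthetic 1-min deviations (MW) from the HA schedule

down_req, up_req = np.percentile(dev, [2.5, 97.5])
print(f"reserve requirement: {down_req:.0f} / +{up_req:.0f} MW")

L10 = 50.0                                        # assumed 10-min compliance bound (MW)
ten_min_avg = dev[: len(dev) // 10 * 10].reshape(-1, 10).mean(axis=1)
cps2 = np.mean(np.abs(ten_min_avg) <= L10) * 100
print(f"CPS2 ~ {cps2:.1f}% (NERC compliance requires > 90%)")
```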

  16. Modelling low velocity impact induced damage in composite laminates

    NASA Astrophysics Data System (ADS)

    Shi, Yu; Soutis, Constantinos

    2017-12-01

    The paper presents recent progress on modelling low velocity impact induced damage in fibre-reinforced composite laminates. It is important to understand the mechanisms of barely visible impact damage (BVID) and how it affects structural performance. To reduce labour-intensive testing, the development of finite element (FE) techniques for simulating impact damage becomes essential, and recent effort by the composites research community is reviewed in this work. The FE-predicted damage initiation and propagation can be validated by Non-Destructive Techniques (NDT), which gives confidence in the developed numerical damage models. A reliable damage simulation can assist the design process to optimise laminate configurations, reduce weight and improve the performance of components and structures used in aircraft construction.

  17. Future Warming Patterns Linked to Today's Climate Variability.

    PubMed

    Dai, Aiguo

    2016-01-11

    The reliability of model projections of greenhouse gas (GHG)-induced future climate change is often assessed based on models' ability to simulate the current climate, but there has been little evidence connecting the two. In fact, this practice has been questioned because GHG-induced future climate change may involve additional physical processes that are not important for the current climate. Here I show that the spatial patterns of GHG-induced future warming in the 21st century are highly correlated with the patterns of the year-to-year variations of surface air temperature for today's climate, with areas of larger variations during 1950-1979 having more GHG-induced warming in the 21st century in all climate models. Such a relationship also exists in other climate fields, such as atmospheric water vapor, and it is evident in observed temperatures from 1950-2010. The results suggest that many physical processes may work similarly in producing the year-to-year climate variations in the current climate and the GHG-induced long-term changes in the 21st century, in models and in the real world. They support the notion that models that simulate present-day climate variability better are likely to make more reliable predictions of future climate change.

  18. Reexamining Computational Support for Intelligence Analysis: A Functional Design for a Future Capability

    DTIC Science & Technology

    2016-07-14

    … applicability of the sensor model in the context under consideration. A similar information flow can be considered for obtaining direct reliability of an … modeling, Bex concepts, human intelligence simulation; use cases: Army: operations in megacities, Syrian civil war; Navy: piracy (NATO, book), autonomous ISR … [25] Bex, F. and Verheij, B., Story Schemes for Argumentation about the Facts of a Crime, Computational Models of Narrative: Papers from the …

  19. Urban air quality estimation study, phase 1

    NASA Technical Reports Server (NTRS)

    Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.

    1976-01-01

    Possibilities are explored for applying estimation theory to the analysis, interpretation, and use of air quality measurements in conjunction with simulation models to provide a cost effective method of obtaining reliable air quality estimates for wide urban areas. The physical phenomenology of real atmospheric plumes from elevated localized sources is discussed. A fluctuating plume dispersion model is derived. Individual plume parameter formulations are developed along with associated a priori information. Individual measurement models are developed.
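
    A standard Gaussian plume mean-concentration model of the kind underlying such work can be sketched as follows (dispersion coefficients are assumed Briggs-style neutral-stability forms; the study's fluctuating-plume formulation adds statistics on top of this mean field):

```python
# A minimal sketch of a Gaussian plume from an elevated point source with an
# image term for ground reflection. Coefficients are assumed, not the paper's.
import numpy as np

def concentration(x, y, z, Q=100.0, U=5.0, H=50.0):
    """Mean concentration (g/m^3) at downwind x, crosswind y, height z (m).
    Q: source rate (g/s), U: wind speed (m/s), H: effective stack height (m)."""
    sy = 0.08 * x / np.sqrt(1.0 + 0.0001 * x)   # assumed lateral dispersion (m)
    sz = 0.06 * x / np.sqrt(1.0 + 0.0015 * x)   # assumed vertical dispersion (m)
    return (Q / (2.0 * np.pi * U * sy * sz)
            * np.exp(-0.5 * (y / sy) ** 2)
            * (np.exp(-0.5 * ((z - H) / sz) ** 2)
               + np.exp(-0.5 * ((z + H) / sz) ** 2)))  # image source for the ground

print(concentration(x=1000.0, y=0.0, z=0.0))  # centreline ground-level concentration
```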

  20. A rater training protocol to assess team performance.

    PubMed

    Eppich, Walter; Nannicelli, Anna P; Seivert, Nicholas P; Sohn, Min-Woong; Rozenfeld, Ranna; Woods, Donna M; Holl, Jane L

    2015-01-01

    Simulation-based methodologies are increasingly used to assess teamwork and communication skills and to provide team training, with formative feedback on team performance as an essential component. While effective use of simulation for assessment or training requires accurate rating of team performance, examples of rater-training programs in health care are scarce. We describe our rater training program and report interrater reliability during the phases of training and independent rating. We selected an assessment tool shown to yield valid and reliable results and developed a rater training protocol with an accompanying rater training handbook. The rater training program was modeled after previously described high-stakes assessments in the setting of 3 facilitated training sessions. Adjacent agreement was used to measure interrater reliability. Nine raters with a background in health care and/or patient safety evaluated the team performance of 42 in-situ simulations using post-hoc video review. Adjacent agreement increased from the second training session (83.6%) to the third training session (85.6%) when evaluating the same video segments. Adjacent agreement for the rating of overall team performance, which was added for the third training session, was 78.3%. Adjacent agreement was 97% four weeks post-training and 90.6% at the end of independent rating of all simulation videos. Rater training is an important element in team performance assessment, and providing examples of rater training programs is essential. Articulating key rating anchors promotes adequate interrater reliability. In addition, using adjacent agreement as a measure allows differentiation between high- and low-performing teams on video review.

  1. Reliability analysis of a wastewater treatment plant using fault tree analysis and Monte Carlo simulation.

    PubMed

    Taheriyoun, Masoud; Moradinejad, Saber

    2015-01-01

    The reliability of a wastewater treatment plant is a critical issue when the effluent is reused or discharged to water resources. The main factors affecting the performance of a wastewater treatment plant are variation of the influent, inherent variability in the treatment processes, deficiencies in design, mechanical equipment failures, and operational failures. Thus, meeting the established reuse/discharge criteria requires assessment of plant reliability. Among the many techniques developed for system reliability analysis, fault tree analysis (FTA) is one of the most popular and efficient. FTA is a top-down, deductive failure analysis in which an undesired state of a system is analyzed. In this study, reliability was studied for the Tehran West Town wastewater treatment plant. This plant is a conventional activated sludge process, and the effluent is reused in landscape irrigation. The fault tree diagram was established with the violation of the allowable effluent BOD as the top event, and the deficiencies of the system were identified based on the developed model. Some basic events are operator mistakes, physical damage, and design problems. The analytical methods are minimal cut sets (based on numerical probability) and Monte Carlo simulation. Basic event probabilities were calculated according to available data and experts' opinions. The results showed that human factors, especially human error, had a great effect on top event occurrence; mechanical, climate, and sewer system factors were in the subsequent tier. The literature shows that FTA has seldom been applied in past wastewater treatment plant (WWTP) risk analysis studies. Thus, the FTA model developed in this study considerably improves the insight into causal failure analysis of a WWTP. It provides an efficient tool for WWTP operators and decision makers to achieve the standard limits in wastewater reuse and discharge to the environment.
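
    For a small illustrative tree with hypothetical basic-event probabilities, say TOP = operator error OR (mechanical failure AND design problem), the minimal cut sets are {operator error} and {mechanical failure, design problem}, and a Monte Carlo estimate of the top-event probability takes a few lines:

```python
# A minimal sketch (hypothetical probabilities, not the study's values) of
# Monte Carlo estimation of a fault tree top event.
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000
p = {"operator_error": 0.02, "mechanical": 0.05, "design": 0.10}

op   = rng.random(n) < p["operator_error"]
mech = rng.random(n) < p["mechanical"]
des  = rng.random(n) < p["design"]

top = op | (mech & des)   # OR gate over the two minimal cut sets
print(top.mean())         # ~ 1 - (1 - 0.02) * (1 - 0.05 * 0.10) = 0.0249
```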

  2. Small Business Innovations

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The purpose of QASE RT is to enable system analysts and software engineers to evaluate the performance and reliability implications of design alternatives. The program resulted from two Small Business Innovation Research (SBIR) projects. After receiving a description of the system architecture and workload from the user, QASE RT translates the system description into simulation models and executes them. Simulation provides detailed performance evaluation. The results of the evaluations are service and response times, offered load, device utilizations, and functional availability.

  3. Reliability of simulated robustness testing in fast liquid chromatography, using state-of-the-art column technology, instrumentation and modelling software.

    PubMed

    Kormány, Róbert; Fekete, Jenő; Guillarme, Davy; Fekete, Szabolcs

    2014-02-01

    The goal of this study was to evaluate the accuracy of simulated robustness testing using commercial modelling software (DryLab) and state-of-the-art stationary phases. For this purpose, a mixture of amlodipine and its seven related impurities was analyzed on short narrow-bore columns (50×2.1mm, packed with sub-2μm particles) providing short analysis times. The performance of the commercial modelling software for robustness testing was systematically compared to experimental measurements and DoE-based predictions. We demonstrated that the reliability of the predictions was good, since the predicted retention times and resolutions were in good agreement with the experimental ones at the edges of the design space. On average, the retention time relative errors were <1.0%, while the predicted critical resolution errors ranged between 6.9 and 17.2%. Because simulated robustness testing requires significantly less experimental work than DoE-based predictions, we think that robustness could now be investigated in the early stages of method development. Moreover, column interchangeability, which is also an important part of robustness testing, was investigated considering five different C8 and C18 columns packed with sub-2μm particles. Again, thanks to the modelling software, we proved that the separation was feasible on all columns within the same analysis time (less than 4min), by proper adjustment of variables.

  4. Validity evidence and reliability of a simulated patient feedback instrument.

    PubMed

    Schlegel, Claudia; Woermann, Ulrich; Rethans, Jan-Joost; van der Vleuten, Cees

    2012-01-27

    In the training of healthcare professionals, one of the advantages of communication training with simulated patients (SPs) is the SP's ability to provide direct feedback to students after a simulated clinical encounter. The quality of SP feedback must be monitored, especially because it is well known that feedback can have a profound effect on student performance. Due to the current lack of valid and reliable instruments to assess the quality of SP feedback, our study examined the validity and reliability of one potential instrument, the 'modified Quality of Simulated Patient Feedback Form' (mQSF). Content validity of the mQSF was assessed by inviting experts in the area of simulated clinical encounters to rate the importance of the mQSF items. Moreover, generalizability theory was used to examine the reliability of the mQSF. Our data came from videotapes of clinical encounters between six simulated patients and six students and the ensuing feedback from the SPs to the students. Ten faculty members judged the SP feedback according to the items on the mQSF. Three weeks later, this procedure was repeated with the same faculty members and recordings. All but two items of the mQSF received importance ratings of > 2.5 on a four-point rating scale. A generalizability coefficient of 0.77 was established with two judges observing one encounter. The findings for content validity and reliability with two judges suggest that the mQSF is a valid and reliable instrument to assess the quality of feedback provided by simulated patients.

  5. How to assess the impact of a physical parameterization in simulations of moist convection?

    NASA Astrophysics Data System (ADS)

    Grabowski, Wojciech

    2017-04-01

    A numerical model capable of simulating moist convection (e.g., a cloud-resolving model or large-eddy simulation model) consists of a fluid flow solver combined with the required representations (i.e., parameterizations) of physical processes. The latter typically include cloud microphysics, radiative transfer, and unresolved turbulent transport. Traditional approaches to investigating the impacts of such parameterizations on convective dynamics involve parallel simulations with different parameterization schemes or with different scheme parameters. Such methodologies are not reliable because of the natural variability of a cloud field, which is affected by the feedback between the physics and dynamics. For instance, changing the cloud microphysics typically leads to a different realization of the cloud-scale flow, and separating dynamical and microphysical impacts is difficult. This presentation will introduce a novel modeling methodology, piggybacking, that allows the impact of a physical parameterization on cloud dynamics to be studied with confidence. The focus will be on the impact of the cloud microphysics parameterization. Specific examples of the piggybacking approach will include simulations concerning the hypothesized invigoration of deep convection in polluted environments, the validity of the saturation adjustment in modeling condensation in moist convection, and the separation of physical impacts from statistical uncertainty in simulations applying particle-based Lagrangian microphysics, the super-droplet method.

  6. Symmetry-plane model of 3D Euler flows: Mapping to regular systems and numerical solutions of blowup

    NASA Astrophysics Data System (ADS)

    Mulungye, Rachel M.; Lucas, Dan; Bustamante, Miguel D.

    2014-11-01

    We introduce a family of 2D models describing the dynamics on the so-called symmetry plane of the full 3D Euler fluid equations. These models depend on a free real parameter and can be solved analytically. For selected representative values of the free parameter, we apply the method introduced in [M.D. Bustamante, Physica D: Nonlinear Phenom. 240, 1092 (2011)] to map the fluid equations bijectively to globally regular systems. By comparing the analytical solutions with the results of numerical simulations, we establish that the numerical simulations of the mapped regular systems are far more accurate than the numerical simulations of the original systems, at the same spatial resolution and CPU time. In particular, the numerical integrations of the mapped regular systems produce robust estimates for the growth exponent and singularity time of the main blowup quantity (vorticity stretching rate), converging well to the analytically-predicted values even beyond the time at which the flow becomes under-resolved (i.e. the reliability time). In contrast, direct numerical integrations of the original systems develop unstable oscillations near the reliability time. We discuss the reasons for this improvement in accuracy, and explain how to extend the analysis to the full 3D case. Supported under the programme for Research in Third Level Institutions (PRTLI) Cycle 5 and co-funded by the European Regional Development Fund.

  7. Modeling riverine nitrate export from an East-Central Illinois watershed using SWAT.

    PubMed

    Hu, X; McIsaac, G F; David, M B; Louwers, C A L

    2007-01-01

    Reliable water quality models are needed to forecast the water quality consequences of different agricultural nutrient management scenarios. In this study, the Soil and Water Assessment Tool (SWAT), version 2000, was applied to simulate streamflow, riverine nitrate (NO3) export, crop yield, and watershed nitrogen (N) budgets in the upper Embarras River (UER) watershed in east-central Illinois, which has extensive maize-soybean cultivation, large N fertilizer input, and extensive tile drainage. During the calibration (1994-2002) and validation (1985-1993) periods, SWAT simulated monthly and annual stream flows with Nash-Sutcliffe coefficients (E) ranging from 0.67 to 0.94 and R² from 0.75 to 0.95. For monthly and annual NO3 loads, E ranged from -0.16 to 0.45 and R² from 0.36 to 0.74. Annual maize and soybean yields were simulated with relative errors ranging from -10 to 6%. The model was then used to predict the changes in NO3 output with N fertilizer application rates 10 to 50% lower than original application rates in the UER. The calibrated SWAT predicted a 10 to 43% decrease in NO3 export from the UER and a 6 to 38% reduction in maize yield in response to the reduction in N fertilizer. The SWAT model markedly overestimated NO3 export during major wet periods. Moreover, SWAT estimated soybean N fixation rates considerably greater than literature values, and some simulated changes in the N cycle in response to fertilizer reduction appeared unrealistic. Improving these aspects of SWAT could lead to more reliable predictions of the water quality outcomes of nutrient management practices in tile-drained watersheds.
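
    The two fit statistics quoted above, Nash-Sutcliffe efficiency E and R², are easy to state in code (arrays below are synthetic, for illustration only):

```python
# A minimal sketch of the model-fit statistics used in the SWAT evaluation.
import numpy as np

def nash_sutcliffe(obs, sim):
    """E = 1 - SSE / variance of obs; 1 is perfect, < 0 is worse than the mean."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([12.0, 30.0, 45.0, 22.0, 8.0])   # synthetic monthly loads
sim = np.array([10.0, 33.0, 40.0, 25.0, 9.0])
print(nash_sutcliffe(obs, sim))                  # E
print(np.corrcoef(obs, sim)[0, 1] ** 2)          # R^2
```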

  8. Efficient Permeability Measurement and Numerical Simulation of the Resin Flow in Low Permeability Preform Fabricated by Automated Dry Fiber Placement

    NASA Astrophysics Data System (ADS)

    Agogue, Romain; Chebil, Naziha; Deleglise-Lagardere, Mylène; Beauchene, Pierre; Park, Chung Hae

    2017-10-01

    We propose a new experimental method using a Hassler cell and air injection to measure the permeability of fiber preforms while avoiding race-tracking effects. This method proved particularly efficient for measuring the very low through-thickness permeability of preforms fabricated by automated dry fiber placement. To validate the reliability of the permeability measurement, experiments of viscous liquid infusion into the preform, with or without a distribution medium, were performed. The experimental data on flow-front advancement were compared with numerical simulation results using the permeability values obtained from the Hassler cell measurement set-up as well as from the liquid infusion experiments. To address the computational cost issue, a model for the equivalent permeability of the distribution medium was employed in the numerical simulation of the liquid flow. The new concept using air injection and a Hassler cell for fiber preform permeability measurement was shown to be reliable and efficient.
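
    For the constant-pressure one-dimensional infusion used in such validations, the standard Darcy result for the flow-front position (a textbook relation, not taken from the paper) is:

```latex
% Flow-front position x_f at time t under a constant pressure difference
% \Delta P, with permeability K, porosity \phi and fluid viscosity \mu.
x_f(t) = \sqrt{\frac{2 K \, \Delta P \, t}{\phi \, \mu}}
```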

  9. Model of dissolution in the framework of tissue engineering and drug delivery.

    PubMed

    Sanz-Herrera, J A; Soria, L; Reina-Romo, E; Torres, Y; Boccaccini, A R

    2018-05-22

    Dissolution phenomena are ubiquitous in biomaterials across many different fields. Despite the advantages of simulation-based design of biomaterials in medical applications, additional efforts are needed to derive reliable models that describe the process of dissolution. A phenomenologically based model for the simulation of dissolution in biomaterials is introduced in this paper. The model reduces to a set of reaction-diffusion equations implemented in a finite element numerical framework. First, a parametric analysis is conducted in order to explore the role of the model parameters in the overall dissolution process. Then, the model is calibrated and validated against a straightforward but rigorous experimental setup. Results show that the mathematical model macroscopically reproduces the main physicochemical phenomena that take place in the tests, corroborating its usefulness for the design of biomaterials in the tissue engineering and drug delivery research areas.
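
    A generic form of such a system (notation assumed here, not the paper's exact equations) is: the concentration of each dissolved species diffuses and is produced by a reaction term that, for dissolution, typically scales with the local undersaturation.

```latex
% Generic reaction-diffusion system: concentration c_i of species i with
% diffusivity D_i and reaction rate R_i; for dissolution, R_i is often taken
% proportional to the undersaturation (c_{s,i} - c_i) with rate constant k_i.
\frac{\partial c_i}{\partial t} = \nabla \cdot \left( D_i \nabla c_i \right)
  + R_i(c_1,\dots,c_n),
\qquad R_i \propto k_i \left( c_{s,i} - c_i \right)
```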

  10. A Fresh Start for Flood Estimation in Ungauged Basins

    NASA Astrophysics Data System (ADS)

    Woods, R. A.

    2017-12-01

    The two standard methods for flood estimation in ungauged basins, regression-based statistical models and rainfall-runoff models using a design rainfall event, have survived relatively unchanged as the methods of choice for more than 40 years. Their technical implementation has developed greatly, but the models' representation of hydrological processes has not, despite a large volume of hydrological research. I suggest it is time to introduce more hydrology into flood estimation. The reliability of the current methods can be unsatisfactory. For example, despite the UK's relatively straightforward hydrology, regression estimates of the index flood are uncertain by +/- a factor of two (for a 95% confidence interval), an impractically large uncertainty for design. The standard error of rainfall-runoff model estimates is not usually known, but available assessments indicate poorer reliability than statistical methods. There is a practical need for improved reliability in flood estimation. Two promising candidates to supersede the existing methods are (i) continuous simulation by rainfall-runoff modelling and (ii) event-based derived distribution methods. The main challenge with continuous simulation methods in ungauged basins is to specify the model structure and parameter values when calibration data are not available. This has been an active area of research for more than a decade, and this activity is likely to continue. The major challenges for the derived distribution method in ungauged catchments include not only the correct specification of model structure and parameter values, but also antecedent conditions (e.g. the seasonal soil water balance). However, a much smaller community of researchers is active in developing or applying the derived distribution approach, and as a result slower progress is being made. A change is needed: surely we have learned enough about hydrology in the last 40 years that we can make a practical hydrological advance on our methods for flood estimation! A shift to new methods for flood estimation will not be taken lightly by practitioners. However, the standard for change is clear: can we develop new methods which give significant improvements in reliability over those existing methods which are demonstrably unsatisfactory?

  11. Distributed collaborative response surface method for mechanical dynamic assembly reliability design

    NASA Astrophysics Data System (ADS)

    Bai, Guangchen; Fei, Chengwei

    2013-11-01

    Because of the randomness of the many factors influencing the dynamic assembly relationships of complex machinery, the reliability analysis of dynamic assembly relationships needs to account for this randomness from a probabilistic perspective. To improve the accuracy and efficiency of dynamic assembly relationship reliability analysis, a mechanical dynamic assembly reliability (MDAR) theory and a distributed collaborative response surface method (DCRSM) are proposed. The mathematical model of the DCRSM is established based on the quadratic response surface function and verified through the assembly relationship reliability analysis of aeroengine high pressure turbine (HPT) blade-tip radial running clearance (BTRRC). Comparing the DCRSM with the traditional response surface method (RSM) and the Monte Carlo method (MCM), the results show that the DCRSM is not only able to accomplish computational tasks that are infeasible for the other methods when the number of simulations exceeds 100,000, but its computational precision is also basically consistent with the MCM, improving on the RSM by 0.40-4.63%; furthermore, the computational efficiency of the DCRSM is up to about 188 times that of the MCM and 55 times that of the RSM under 10,000 simulations. The DCRSM is demonstrated to be a feasible and effective approach for markedly improving the computational efficiency and accuracy of MDAR analysis. Thus, the proposed research provides a promising theory and method for MDAR design and optimization, and opens a novel research direction of probabilistic analysis for developing high-performance, high-reliability aeroengines.

  12. An OSSE on Mesoscale Model Assimilation of Simulated HIRAD-Observed Hurricane Surface Winds

    NASA Technical Reports Server (NTRS)

    Albers, Cerese; Miller, Timothy; Uhlhorn, Eric; Krishnamurti, T. N.

    2012-01-01

    The hazards of landfalling hurricanes are well known, but progress in improving the intensity forecasts of these deadly storms at landfall has been slow. Many cite a lack of high-resolution data sets taken inside the core of a hurricane, and the lack of reliable measurements in extreme conditions near the surface of hurricanes, as possible reasons why even the most state-of-the-art forecasting models cannot seem to forecast intensity changes better. The Hurricane Imaging Radiometer (HIRAD) is a new airborne microwave remote sensor for observing hurricanes, operated and researched by NASA Marshall Space Flight Center in partnership with the NOAA Atlantic Oceanographic and Meteorological Laboratory/Hurricane Research Division, the University of Central Florida, the University of Michigan, and the University of Alabama in Huntsville. This instrument's purpose is to study the wind field of a hurricane, specifically observing surface wind speeds and rain rates, in what have traditionally been the most difficult areas for other instruments to study: the high-wind and heavy-rain regions. Dr. T. N. Krishnamurti has studied various data assimilation techniques for hurricane and monsoon rain rates, and this study builds on results obtained from his style of physical initialization of rainfall observations; however, obtaining reliable observations in heavy-rain regions has always hampered research on high-resolution rainfall forecasting. Reliable data from these regions, at the high resolution and wide swath that HIRAD provides, are potentially very valuable for mesoscale forecasting of hurricane intensity. This study shows how the data assimilation technique of Ensemble Kalman Filtering (EnKF) in the Weather Research and Forecasting (WRF) model can be used to incorporate wind, and later rain rate, data into a mesoscale model forecast of hurricane intensity. The study makes use of an Observing System Simulation Experiment (OSSE) with a simulated HIRAD dataset sampled during a hurricane and uses EnKF to forecast the track and intensity of the hurricane. Comparisons to truth and error metrics are used to assess the model's forecast performance.
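
    A toy-dimension sketch of the EnKF analysis step at the heart of such an assimilation follows (the real system operates inside WRF with far larger state vectors and a nontrivial observation operator; all dimensions and statistics here are assumed):

```python
# A minimal perturbed-observation EnKF analysis step with a toy linear
# observation operator H observing the first 10 state variables.
import numpy as np

rng = np.random.default_rng(7)
n_state, n_obs, n_ens = 100, 10, 40
X = rng.normal(0.0, 1.0, (n_state, n_ens))   # forecast ensemble (columns = members)
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(n_obs)] = 1.0  # observe the first 10 variables
R = 0.5 * np.eye(n_obs)                      # observation-error covariance, assumed
y = rng.normal(1.0, np.sqrt(0.5), n_obs)     # synthetic observations

A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
PfHt = A @ (H @ A).T / (n_ens - 1)           # P_f H^T estimated from the ensemble
S = H @ PfHt + R                             # innovation covariance
K = PfHt @ np.linalg.solve(S, np.eye(n_obs)) # Kalman gain

for j in range(n_ens):                       # each member assimilates perturbed obs
    yj = y + rng.multivariate_normal(np.zeros(n_obs), R)
    X[:, j] += K @ (yj - H @ X[:, j])
print(X.mean(axis=1)[:5])                    # analysis mean pulled toward the observations
```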

  13. Evaluation of a Multi-Decadal Simulation of Stratospheric Ozone by Comparison with Total Ozone Mapping Spectrometer (TOMS) Observations

    NASA Technical Reports Server (NTRS)

    Douglass, Anne R.; Stolarski, Richard S.; Steenrod, Steven; Pawson, Steven

    2003-01-01

    One key application of atmospheric chemistry and transport models is prediction of the response of ozone and other constituents to various natural and anthropogenic perturbations. These include changes in composition, such as the previous rise and recent decline in emissions of man-made chlorofluorocarbons, changes in aerosol loading due to volcanic eruptions, and changes in solar forcing. Comparisons of hindcast model results for the past few decades with observations are a key element of model evaluation and provide a sense of the reliability of model predictions. The 25-year data set from the Total Ozone Mapping Spectrometers is a cornerstone of such model evaluation. Here we report the evaluation of a three-dimensional multi-decadal simulation of stratospheric composition. Meteorological fields for this off-line calculation are taken from a 50-year simulation of a general circulation model. Model fields are compared with observations from TOMS and also with observations from the Stratospheric Aerosol and Gas Experiment (SAGE), the Microwave Limb Sounder (MLS), the Cryogenic Limb Array Etalon Spectrometer (CLAES), and the Halogen Occultation Experiment (HALOE). This overall evaluation emphasizes the spatial, seasonal, and interannual variability of the simulation compared with observed atmospheric variability.

  14. Genomic data assimilation for estimating hybrid functional Petri net from time-course gene expression data.

    PubMed

    Nagasaki, Masao; Yamaguchi, Rui; Yoshida, Ryo; Imoto, Seiya; Doi, Atsushi; Tamada, Yoshinori; Matsuno, Hiroshi; Miyano, Satoru; Higuchi, Tomoyuki

    2006-01-01

    We propose an automatic construction method for the hybrid functional Petri net as a simulation model of biological pathways. The problems we consider are how to choose the values of parameters and how to set the network structure. Usually, these unknown factors are tuned empirically so that the simulation results are consistent with biological knowledge. Obviously, this approach is limited in the size of network it can handle. To extend the capability of the simulation model, we propose the use of a data assimilation approach that was originally established in the field of geophysical simulation science. We provide a genomic data assimilation framework that establishes a link between our simulation model and observed data, such as microarray gene expression data, by using a nonlinear state space model. A key idea of our genomic data assimilation is that the unknown parameters of the simulation model are recast as parameters of the state space model, and their estimates are obtained as maximum a posteriori estimators. In the parameter estimation process, the simulation model is used to generate the system model of the state space model. Such a formulation enables us to handle both model construction and parameter tuning within a framework of Bayesian statistical inference. In particular, the Bayesian approach provides us a way of controlling overfitting during parameter estimation, which is essential for constructing a reliable biological pathway. We demonstrate the effectiveness of our approach using synthetic data. As a result, parameter estimation using genomic data assimilation works very well, and the network structure is suitably selected.
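
    A generic nonlinear state-space form of the link described (notation assumed here, not the paper's exact formulation) is:

```latex
% Hidden state x_t evolves through the simulation-supplied system model f with
% unknown parameters \theta and system noise v_t; observations y_t (e.g.
% microarray expression data) enter through the observation model h with
% noise w_t. The parameters are estimated as MAP estimators.
x_t = f(x_{t-1}, \theta) + v_t, \qquad
y_t = h(x_t) + w_t, \qquad
\hat{\theta}_{\mathrm{MAP}} = \arg\max_{\theta}\; p(\theta \mid y_{1:T})
```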

  15. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
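
    A minimal sketch of the nested (double-loop) formulation that the unilevel and decoupled methods aim to accelerate: an outer loop searches the design variable while an inner Monte Carlo loop estimates the failure probability. The distributions, cost proxy, and reliability target below are hypothetical placeholders, not the thesis' test problems.

```python
import numpy as np

rng = np.random.default_rng(1)

def failure_probability(d, n=200_000):
    """Inner reliability loop: crude Monte Carlo estimate of
    P(load > resistance) for a design with mean resistance d.
    Both distributions are hypothetical placeholders."""
    resistance = rng.normal(d, 0.1 * d, n)
    load = rng.normal(1.0, 0.2, n)
    return np.mean(load > resistance)

# Outer design loop: find the cheapest design (smallest d, taken here
# as a proxy for cost/weight) whose failure probability meets the target.
target_pf = 1e-2
for d in np.arange(1.0, 3.0, 0.05):
    pf = failure_probability(d)
    if pf <= target_pf:
        print(f"feasible design d = {d:.2f}, estimated Pf = {pf:.4f}")
        break
```

    The computational burden of the inner loop at every outer iteration is exactly what motivates unilevel and decoupled reformulations.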

  16. Linear and evolutionary polynomial regression models to forecast coastal dynamics: Comparison and reliability assessment

    NASA Astrophysics Data System (ADS)

    Bruno, Delia Evelina; Barca, Emanuele; Goncalves, Rodrigo Mikosz; de Araujo Queiroz, Heithor Alexandre; Berardi, Luigi; Passarella, Giuseppe

    2018-01-01

    In this paper, the Evolutionary Polynomial Regression data modelling strategy has been applied to study small scale, short-term coastal morphodynamics, given its capability for treating a wide database of known information non-linearly. Simple linear and multilinear regression models were also applied, to weigh the computational load of the three models against the reliability of their estimates. In fact, even though it is easy to imagine that the more complex the model, the better the prediction, sometimes a "slight" worsening of estimates can be accepted in exchange for the time saved in data organization and computational load. The models' outcomes were validated through a detailed statistical error analysis, which revealed a slightly better estimation by the polynomial model with respect to the multilinear model, as expected. On the other hand, even though the data organization was identical for the two models, the multilinear one required a simpler simulation setting and a faster run time. Finally, the most reliable evolutionary polynomial regression model was used to examine how uncertainty increases with the extrapolation time of the estimation. The overlapping rate between the confidence band of the mean of the known coast position and the prediction band of the estimated position is a useful index of the weakening ability to produce reliable estimates as the extrapolation time grows too long. The proposed models and tests have been applied to a coastal sector located near Torre Colimena in the Apulia region, south Italy.

  17. Blinded evaluation of interrater reliability of an operative competency assessment tool for direct laryngoscopy and rigid bronchoscopy.

    PubMed

    Ishman, Stacey L; Benke, James R; Johnson, Kaalan Erik; Zur, Karen B; Jacobs, Ian N; Thorne, Marc C; Brown, David J; Lin, Sandra Y; Bhatti, Nasir; Deutsch, Ellen S

    2012-10-01

    OBJECTIVES To confirm interrater reliability using blinded evaluation of a skills-assessment instrument to assess the surgical performance of resident and fellow trainees performing pediatric direct laryngoscopy and rigid bronchoscopy in simulated models. DESIGN Prospective, paired, blinded observational validation study. SUBJECTS Paired observers from multiple institutions simultaneously evaluated residents and fellows who were performing surgery in an animal laboratory or using high-fidelity manikins. The evaluators had no previous affiliation with the residents and fellows and did not know their year of training. INTERVENTIONS One- and 2-page versions of an objective structured assessment of technical skills (OSATS) instrument, composed of global and task-specific surgical items, were used to evaluate surgical performance. RESULTS Fifty-two evaluations were completed by 17 attending evaluators. Agreement for the 2-page assessment was 71.4% when measured as a binary variable (ie, competent vs not competent) (κ = 0.38; P = .08). Evaluation as a continuous variable revealed 42.9% agreement (κ = 0.18; P = .14). The intraclass correlation was 0.53, considered substantial/good interrater reliability (69% reliable). For the 1-page instrument, agreement was 77.4% when measured as a binary variable (κ = 0.53, P = .0015). Agreement when evaluated as a continuous measure was 71.0% (κ = 0.54, P < .001). The intraclass correlation was 0.73, considered high interrater reliability (85% reliable). CONCLUSIONS The OSATS instrument is an effective tool for evaluating surgical performance among trainees, with acceptable interrater reliability in a simulator setting. Reliability was good for both the 1- and 2-page OSATS checklists, and both serve as excellent tools to provide immediate formative feedback on operative competency.
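
    For reference, percent agreement and Cohen's kappa of the kind reported above can be computed from paired ratings as follows; the ratings in this sketch are hypothetical, not the study data.

```python
import numpy as np

def percent_agreement_and_kappa(r1, r2):
    """Percent agreement and Cohen's kappa for two raters'
    categorical scores (e.g., competent = 1, not competent = 0)."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_o = np.mean(r1 == r2)                      # observed agreement
    cats = np.union1d(r1, r2)
    p_e = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)  # chance
    return p_o, (p_o - p_e) / (1.0 - p_e)

# Hypothetical paired competency ratings from two blinded evaluators.
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
p_o, kappa = percent_agreement_and_kappa(rater1, rater2)
print(f"agreement = {p_o:.1%}, kappa = {kappa:.2f}")
```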

  18. Extreme storms, sea level rise, and coastal change: implications for infrastructure reliability in the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Anarde, K.; Kameshwar, S.; Irza, N.; Lorenzo-Trueba, J.; Nittrouer, J. A.; Padgett, J.; Bedient, P. B.

    2016-12-01

    Predicting coastal infrastructure reliability during hurricane events is important for risk-based design and disaster planning, such as delineating viable emergency response routes. Previous research has focused on either infrastructure vulnerability to coastal flooding or the impact of changing sea level and landforms on surge dynamics. Here we investigate the combined impact of sea level, morphology, and coastal flooding on the reliability of highway bridges - the only access points between barrier islands and mainland communities - during future extreme storms. We forward model coastal flooding for static projections of geomorphic change using ADCIRC+SWAN. First-order parameters that are adjusted include sea level and elevation; these are varied for each storm simulation to evaluate their relative impact on the reliability of bridges surrounding Freeport, TX. Simulated storms include both synthetic and historical events, classified by intensity using the storm's integrated kinetic energy, a metric for surge generation potential. Reliability is estimated through probability of failure - given wave and surge loads - and time inundated. The findings include: 1) bridge reliability scales inversely with surge height, and 2) sea level rise reduces bridge reliability through a monotonic increase in surge height. The impact of a shifting landscape on bridge reliability is more complex: barrier island rollback can increase or decrease inundation times for storms of different intensity due to changes in wind-setup and back-barrier bay interactions. Initial storm surge readily inundates the coastal landscape during high-intensity storms; however, the draining of inland bays following storm passage is significantly impeded by the barrier. From a coastal engineering standpoint, we determine that to protect critical infrastructure, current efforts to nourish low-lying barriers may be enhanced by also armoring back-bay coastlines and elevating bridge approach ramps.

  19. Simulations of forest mortality in Colorado River basin

    NASA Astrophysics Data System (ADS)

    Wei, L.; Xu, C.; Johnson, D. J.; Zhou, H.; McDowell, N.

    2017-12-01

    The Colorado River Basin (CRB) has experienced multiple severe forest mortality events under recent climate change. Such forest mortality events may have great impacts on ecosystem services and the water budget of the watershed. It is hence important to estimate and predict forest mortality in the CRB under climate change. We simulated forest mortality in the CRB with a model of plant hydraulics within FATES (the Functionally Assembled Terrestrial Ecosystem Simulator) coupled to the DOE Earth System model (ACME: Accelerated Climate Modeling for Energy) at a 0.5 x 0.5 degree resolution. Moreover, we incorporated a stable carbon isotope (δ13C) module into ACME(FATES) and used it as a new predictor of forest mortality. The δ13C values of plants with the C3 photosynthetic pathway (almost all trees are C3 plants) can indicate the water stress plants experience (the more intense the stress, the less negative the δ13C value). We set a δ13C threshold in the model simulation, above which forest mortality initiates. We validated the mortality simulations against field data from the Forest Inventory and Analysis (FIA) program, aggregated to the same spatial resolution as the model simulations. Different mortality schemes in the model (carbon starvation, hydraulic failure, and δ13C) were tested and compared. Each scheme demonstrated its strengths, and the plant hydraulics module provided more reliable simulations of forest mortality than the earlier ACME(FATE) version. Further testing is required for better forest mortality modelling.

  20. Cue reliability and a landmark stability heuristic determine relative weighting between egocentric and allocentric visual information in memory-guided reach.

    PubMed

    Byrne, Patrick A; Crawford, J Douglas

    2010-06-01

    It is not known how egocentric visual information (location of a target relative to the self) and allocentric visual information (location of a target relative to external landmarks) are integrated to form reach plans. Based on behavioral data from rodents and humans we hypothesized that the degree of stability in visual landmarks would influence the relative weighting. Furthermore, based on numerous cue-combination studies we hypothesized that the reach system would act like a maximum-likelihood estimator (MLE), where the reliability of both cues determines their relative weighting. To predict how these factors might interact we developed an MLE model that weighs egocentric and allocentric information based on their respective reliabilities, and also on an additional stability heuristic. We tested the predictions of this model in 10 human subjects by manipulating landmark stability and reliability (via variable amplitude vibration of the landmarks and variable amplitude gaze shifts) in three reach-to-touch tasks: an egocentric control (reaching without landmarks), an allocentric control (reaching relative to landmarks), and a cue-conflict task (involving a subtle landmark "shift" during the memory interval). Variability from all three experiments was used to derive parameters for the MLE model, which was then used to simulate egocentric-allocentric weighting in the cue-conflict experiment. As predicted by the model, landmark vibration--despite its lack of influence on pointing variability (and thus allocentric reliability) in the control experiment--had a strong influence on egocentric-allocentric weighting. A reduced model without the stability heuristic was unable to reproduce this effect. These results suggest heuristics for extrinsic cue stability are at least as important as reliability for determining cue weighting in memory-guided reaching.
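
    The MLE weighting described above is the standard inverse-variance rule; a minimal sketch with hypothetical variances follows. The paper's stability heuristic could be folded in, for example, by inflating the allocentric cue's effective variance when landmarks appear unstable, but that extension is an assumption, not the authors' fitted model.

```python
import numpy as np

# Standard maximum-likelihood cue combination: each cue's weight is its
# inverse variance (reliability) normalized by the summed reliabilities.
# The values below are hypothetical, not the paper's parameters.
sigma_ego, sigma_allo = 1.2, 0.8          # cm, per-cue pointing s.d.
w_ego = (1 / sigma_ego**2) / (1 / sigma_ego**2 + 1 / sigma_allo**2)
w_allo = 1.0 - w_ego

x_ego, x_allo = 10.0, 12.0                # cm, conflicting target estimates
x_combined = w_ego * x_ego + w_allo * x_allo
var_combined = 1 / (1 / sigma_ego**2 + 1 / sigma_allo**2)
print(f"weights: ego={w_ego:.2f}, allo={w_allo:.2f}; "
      f"estimate={x_combined:.2f} cm, s.d.={np.sqrt(var_combined):.2f} cm")
```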

  1. The Evaluation of Motivation and Sport Education Relationship

    ERIC Educational Resources Information Center

    Bas, Mustafa

    2016-01-01

    Motivating and enjoyable experiences are factors that physical education teachers must provide. One educational model of sport is sport education, which offers a reliable simulation of sport and thereby fosters positive motivational sport experiences. Participants were 120 (male = 100, female = 20). The number of classes…

  2. Evaluation of existing and modified wetland equations in the SWAT model

    USDA-ARS?s Scientific Manuscript database

    Drainage significantly alters flow and nutrient pathways in small watersheds, and reliable simulation at this scale is needed for effective planning of nutrient reduction strategies. The Soil and Water Assessment Tool (SWAT) has been widely utilized for prediction of flow and nutrient loads, but...

  3. The SYSGEN user package

    NASA Technical Reports Server (NTRS)

    Carlson, C. R.

    1981-01-01

    This document describes the use of the SYSGEN model and its links with other simulations. SYSGEN is a production costing and reliability model of electric utility systems. Hydroelectric, storage, and time-dependent generating units are modeled in addition to conventional generating plants. Input variables, modeling options, output variables, and report formats are explained. SYSGEN can also be run interactively by using a program called FEPS (Front End Program for SYSGEN). A format for SYSGEN input variables designed for use with FEPS is presented.

  4. On the Simulation-Based Reliability of Complex Emergency Logistics Networks in Post-Accident Rescues.

    PubMed

    Wang, Wei; Huang, Li; Liang, Xuedong

    2018-01-06

    This paper investigates the reliability of complex emergency logistics networks, as reliability is crucial to reducing environmental and public health losses in post-accident emergency rescues. Such networks' statistical characteristics are analyzed first. After connected reliability and evaluation indices for complex emergency logistics networks are defined, simulation analyses of network reliability are conducted under two different attack modes, using a particular emergency logistics network as an example. The simulations trace the varying trends in emergency supply times and in the ratio of effective nodes, and validate the effects of network characteristics and attack type on network reliability. The results demonstrate that this emergency logistics network is both a small-world and a scale-free network. Under random attacks the network degrades only gradually, whereas it is very fragile under selective attacks. Therefore, special attention should be paid to the protection of supply nodes and nodes with high connectivity. The simulation method provides a new tool for studying emergency logistics networks and a reference for similar studies.
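
    A minimal sketch of this kind of attack simulation, using a scale-free stand-in network rather than the paper's emergency logistics data: nodes are removed either at random or by highest degree, and the fraction of nodes remaining in the giant component serves as a proxy for the "ratio of effective nodes".

```python
import networkx as nx
import numpy as np

def giant_fraction(g, n0):
    """Largest-connected-component size over the original node count,
    a stand-in for the paper's 'ratio of effective nodes'."""
    if g.number_of_nodes() == 0:
        return 0.0
    return len(max(nx.connected_components(g), key=len)) / n0

def attack(g, selective, n_remove=100, seed=0):
    """Remove n_remove nodes at random or by highest current degree."""
    g = g.copy()
    n0 = g.number_of_nodes()
    rng = np.random.default_rng(seed)
    for _ in range(n_remove):
        if selective:  # remove the current highest-degree node
            node = max(g.degree, key=lambda kv: kv[1])[0]
        else:          # remove a uniformly random node
            node = rng.choice(list(g.nodes))
        g.remove_node(node)
    return giant_fraction(g, n0)

# Hypothetical scale-free stand-in for an emergency logistics network.
g0 = nx.barabasi_albert_graph(500, 2, seed=42)
print("random attack:   ", attack(g0, selective=False))
print("selective attack:", attack(g0, selective=True))
```

    The selective attack typically fragments a scale-free network far faster, which is the behavior the paper reports.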

  5. On the Simulation-Based Reliability of Complex Emergency Logistics Networks in Post-Accident Rescues

    PubMed Central

    Wang, Wei; Huang, Li; Liang, Xuedong

    2018-01-01

    This paper investigates the reliability of complex emergency logistics networks, as reliability is crucial to reducing environmental and public health losses in post-accident emergency rescues. Such networks' statistical characteristics are analyzed first. After connected reliability and evaluation indices for complex emergency logistics networks are defined, simulation analyses of network reliability are conducted under two different attack modes, using a particular emergency logistics network as an example. The simulations trace the varying trends in emergency supply times and in the ratio of effective nodes, and validate the effects of network characteristics and attack type on network reliability. The results demonstrate that this emergency logistics network is both a small-world and a scale-free network. Under random attacks the network degrades only gradually, whereas it is very fragile under selective attacks. Therefore, special attention should be paid to the protection of supply nodes and nodes with high connectivity. The simulation method provides a new tool for studying emergency logistics networks and a reference for similar studies. PMID:29316614

  6. As accessible as a book on a library shelf: the imperative of routine simulation in modern health care.

    PubMed

    Gordon, James A

    2012-01-01

    Technology-enhanced patient simulation has emerged as an important new modality for teaching and learning in medicine. In particular, immersive simulation platforms that replicate the clinical environment promise to revolutionize medical education by enabling an enhanced level of safety, standardization, and efficiency across health-care training. Such an experiential approach seems unique in reliably catalyzing a level of emotional engagement that fosters immediate and indelible learning and allows for increasingly reliable levels of performance evaluation, all in a completely risk-free environment. As such, medical simulation is poised to emerge as a critical component of training and certification throughout health care, promising to fundamentally enhance quality and safety across disciplines. To encourage routine simulation-based practice as part of its core quality and safety mission, Massachusetts General Hospital now incorporates simulation resources within its historic medical library (est. 1847), located at the center of the campus. In this new model, learners go to the library not only to read about a patient's illness, but also to take care of their "patient." Such an approach redefines and advances the central role of the library on the campus and ensures that simulation-based practice is centrally available as part of everyday hospital operations. This article describes the reasons for identifying simulation as an institutional priority leading up to the Massachusetts General Hospital Bicentennial Celebration (1811-2011) and for creating a simulation-based learning laboratory within a hospital library.

  7. Methods of the aerodynamical experiments with simulation of massflow-traction ratio of the power unit

    NASA Astrophysics Data System (ADS)

    Lokotko, A. V.

    2016-10-01

    Modeling the massflow-traction characteristics of a power unit (PU) is of interest both for studying the aerodynamic characteristics (ADC) of aircraft models with full dynamic similarity and for studying PU interference effects. Such studies require a number of methods. These include: 1) a method for delivering high-pressure jets to the model engines across the sensitive part of the aerodynamic balance; 2) a method for estimating the accuracy and reliability of measurements of the thrust generated by the jet device; 3) a method for implementing the PU simulator while modeling the external contours of the nacelle and the conditions at the inlet and outlet; 4) a method for determining the thrust of the PU simulator; 5) a method for determining the interference effect of the operating power unit on the ADC of the model; and 6) a method for producing hot jets simulating jet-engine exhaust. The paper examines the methodology implemented at ITAM as applied to testing in the T-313 supersonic wind tunnel.

  8. Numerical investigation of the vortex-induced vibration of an elastically mounted circular cylinder at high Reynolds number (Re = 10^4) and low mass ratio using the RANS code.

    PubMed

    Khan, Niaz Bahadur; Ibrahim, Zainah; Nguyen, Linh Tuan The; Javed, Muhammad Faisal; Jameel, Mohammed

    2017-01-01

    This study numerically investigates the vortex-induced vibration (VIV) of an elastically mounted rigid cylinder by using Reynolds-averaged Navier-Stokes (RANS) equations with computational fluid dynamics (CFD) tools. CFD analysis is performed for a fixed-cylinder case at Reynolds number (Re) = 10^4 and for a cylinder that is free to oscillate in the transverse direction, possesses a low mass-damping ratio, and has Re = 10^4. Previously, similar studies have been performed with 3-dimensional and comparatively expensive turbulence models. In the current study, the capability and accuracy of the RANS model are validated, and the results of this model are compared with those of detached eddy simulation, direct numerical simulation, and large eddy simulation models. All three response branches and the maximum amplitude are well captured. The 2-dimensional case with the RANS shear-stress transport k-ω model, which involves minimal computational cost, is reliable and appropriate for analyzing the characteristics of VIV.

  9. Integrating O/S models during conceptual design, part 1

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1994-01-01

    The University of Dayton is pleased to submit this report to the National Aeronautics and Space Administration (NASA), Langley Research Center, which integrates a set of models for determining operational capabilities and support requirements during the conceptual design of proposed space systems. This research provides for the integration of the reliability and maintainability (R&M) model, both new and existing simulation models, and existing operations and support (O&S) costing equations into a complete analysis methodology. Details concerning the R&M model and the O&S costing model may be found in previous reports produced under this grant (NASA Research Grant NAG1-1327). In the process of developing this comprehensive analysis approach, significant enhancements were made to the R&M model, updates to the O&S costing model were accomplished, and a new simulation model was developed. This is the first part of a three-part technical report.

  10. Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2011-01-01

    A methodology to probabilistically compute the combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability of a (0/+/-45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  11. Probabilistic Simulation for Combined Cycle Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A methodology to compute the probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress dependent multifactor interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability of a (0/+/- 45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  12. Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A methodology to probabilistically compute the combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability of a (0/+/-45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  13. Increasing the relevance of GCM simulations for Climate Services

    NASA Astrophysics Data System (ADS)

    Smith, L. A.; Suckling, E.

    2012-12-01

    The design and interpretation of model simulations for climate services differ significantly from experimental design for advancing the fundamental research on predictability that underpins them. Climate services consider the sources of the best information available today; this calls for a frank evaluation of model skill against statistical benchmarks defined by empirical models. The fact that physical simulation models are thought to provide the only reliable method for extrapolating into conditions not previously observed has no bearing on whether or not today's simulation models outperform empirical models. Evidence is presented on the length scales at which today's simulation models fail to outperform empirical benchmarks; this occurs even on global scales in decadal prediction. At all timescales considered thus far (as of July 2012), predictions based on simulation models are improved by blending with the output of statistical models. Blending is shown to be more interesting in the climate context than in the weather context, where blending with a history-based climatology is straightforward. As GCMs improve and as the Earth's climate moves further from that of the last century, the skill of simulation models and their relevance to climate services are expected to increase. Examples from both seasonal and decadal forecasting are used to discuss a third approach that may increase the role of current GCMs more quickly. Specifically, aspects of the experimental design in previous hindcast experiments are shown to hinder the use of GCM simulations for climate services. Alternative designs are proposed. The value of revisiting Thompson's classic approach to improving weather forecasting in the fifties, in the context of climate services, is discussed.
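
    A minimal sketch of the blending idea, with hypothetical hindcast series: the weight on the simulation model is chosen to minimize hindcast error against observations (in-sample here for brevity; a real skill assessment would use out-of-sample verification).

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical hindcasts: a simulation model and an empirical benchmark
# predicting the same observed series (e.g., decadal-mean anomalies).
obs = rng.normal(0.0, 1.0, 200)
sim = obs + rng.normal(0.3, 0.8, 200)     # biased, noisy simulation
emp = obs + rng.normal(0.0, 0.6, 200)     # empirical benchmark

# Pick the blend weight alpha minimizing hindcast mean squared error
# of alpha * sim + (1 - alpha) * emp over a grid.
alphas = np.linspace(0.0, 1.0, 101)
mse = [np.mean((a * sim + (1 - a) * emp - obs) ** 2) for a in alphas]
best = alphas[int(np.argmin(mse))]
print(f"best blend weight on simulation: {best:.2f}, "
      f"blend MSE = {min(mse):.3f}, sim-only MSE = {np.mean((sim - obs) ** 2):.3f}")
```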

  14. Evaluating Modeled Impact Metrics for Human Health, Agriculture Growth, and Near-Term Climate

    NASA Astrophysics Data System (ADS)

    Seltzer, K. M.; Shindell, D. T.; Faluvegi, G.; Murray, L. T.

    2017-12-01

    Simulated metrics that assess impacts on human health, agriculture growth, and near-term climate were evaluated using ground-based and satellite observations. The NASA GISS ModelE2 and GEOS-Chem models were used to simulate the near-present chemistry of the atmosphere. A suite of simulations that varied by model, meteorology, horizontal resolution, emissions inventory, and emissions year was performed, enabling an analysis of metric sensitivities to various model components. All simulations utilized consistent anthropogenic global emissions inventories (ECLIPSE V5a or CEDS), and an evaluation of simulated results was carried out for 2004-2006 and 2009-2011 over the United States and 2014-2015 over China. Results for O3- and PM2.5-based metrics featured minor differences due to the model resolutions considered here (2.0° × 2.5° and 0.5° × 0.666°), while model, meteorology, and emissions inventory each played a larger role in the variance. Surface metrics related to O3 were consistently high-biased, though to varying degrees, demonstrating the need to evaluate particular modeling frameworks before O3 impacts are quantified. Surface metrics related to PM2.5 were diverse, indicating that a multimodel mean is a valuable tool for predicting PM2.5-related impacts. Often, the configuration that best captured the change of a metric over time differed from the configuration that best captured the magnitude of the same metric, demonstrating the challenge in skillfully simulating impacts. These results highlight the strengths and weaknesses of these models in simulating impact metrics related to air quality and near-term climate. With such information, the reliability of historical and future simulations can be better understood.

  15. Multiscale three-dimensional simulations of charge gain and transport in diamond

    NASA Astrophysics Data System (ADS)

    Dimitrov, D. A.; Busby, R.; Cary, J. R.; Ben-Zvi, I.; Rao, T.; Smedley, J.; Chang, X.; Keister, J. W.; Wu, Q.; Muller, E.

    2010-10-01

    A promising new concept of a diamond-amplified photocathode for generation of high-current, high-brightness, and low thermal emittance electron beams was recently proposed and is currently under active development. Detailed understanding of physical processes with multiple energy and time scales is required to design reliable and efficient diamond-amplifier cathodes. We have implemented models, within the VORPAL computational framework, to simulate secondary electron generation and charge transport in diamond in order to facilitate the investigation of the relevant effects involved. The models include inelastic scattering of electrons and holes for generation of electron-hole pairs, elastic, phonon, and charge impurity scattering. We describe the integrated modeling capabilities we developed and present results on charge gain and collection efficiency as a function of primary electron energy and applied electric field. We compare simulation results with available experimental data. The simulations show an overall qualitative agreement with the observed charge gain from transmission mode experiments and have enabled better understanding of the collection efficiency measurements.

  16. A sophisticated simulation for the fracture behavior of concrete material using XFEM

    NASA Astrophysics Data System (ADS)

    Zhai, Changhai; Wang, Xiaomin; Kong, Jingchang; Li, Shuang; Xie, Lili

    2017-10-01

    The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.

  17. Influence of grid resolution, parcel size and drag models on bubbling fluidized bed simulation

    DOE PAGES

    Lu, Liqiang; Konan, Arthur; Benyahia, Sofiane

    2017-06-02

    In this paper, a bubbling fluidized bed is simulated with different numerical parameters, such as grid resolution and parcel size. We also examined the effect of using two homogeneous drag correlations and a heterogeneous drag model based on the energy minimization method. A fast and reliable bubble detection algorithm was developed based on connected component labeling. The radial and axial solids volume fraction profiles are compared with experimental data and previous simulation results. These results show a significant influence of drag models on bubble size and voidage distributions and much less dependence on the numerical parameters. With a heterogeneous drag model that accounts for sub-scale structures, the void fraction in the bubbling fluidized bed can be well captured with a coarse grid and large computational parcels. Refining the CFD grid and reducing the parcel size can improve the simulation results, but at a large increase in computational cost.

  18. Reservoir Performance Under Future Climate For Basins With Different Hydrologic Sensitivities

    NASA Astrophysics Data System (ADS)

    Mateus, M. C.; Tullos, D. D.

    2013-12-01

    In addition to long-standing uncertainties related to variable inflows and the market price of power, reservoir operators face a number of new uncertainties related to hydrologic nonstationarity, changing environmental regulations, and rapidly growing water and energy demands. This study investigates the impact, sensitivity, and uncertainty of changing hydrology on hydrosystem performance across different hydrogeologic settings. We evaluate the performance of reservoirs in the Santiam River basin, including a case study in the North Santiam Basin, with high permeability and extensive groundwater storage, and the South Santiam Basin, with low permeability, little groundwater storage, and rapid runoff response. The modeling objective is to address the following study questions: (1) for the two hydrologic regimes, how does the flood management, water supply, and environmental performance of current reservoir operations change under future 2.5, 50, and 97.5 percentile streamflow projections; and (2) how much change in inflow is required to initiate a failure to meet downstream minimum or maximum flows in the future. We couple global climate model results with a rainfall-runoff model and a formal Bayesian uncertainty analysis to simulate future inflow hydrographs as inputs to a reservoir operations model. To evaluate reservoir performance under a changing climate, we calculate reservoir refill reliability, changes in flood frequency, and the time and volumetric reliability of meeting minimum spring and summer flow targets. Reservoir performance under future hydrology appears to vary with hydrogeology. We find higher sensitivity to floods for the North Santiam Basin and higher sensitivity to minimum flow targets for the South Santiam Basin. Greater uncertainty is associated with basins with more complex hydrogeology. Results from the model simulations contribute to understanding the reliability and vulnerability of reservoirs to a changing climate.
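
    Time and volumetric reliability of meeting a flow target, as used above, are commonly defined as follows; this sketch uses those common definitions with hypothetical monthly data, not the study's Santiam simulations.

```python
import numpy as np

def time_reliability(release, target):
    """Fraction of periods in which the flow target is met."""
    release, target = np.asarray(release), np.asarray(target)
    return np.mean(release >= target)

def volumetric_reliability(release, target):
    """Volume actually delivered toward the target over volume required."""
    release, target = np.asarray(release), np.asarray(target)
    return np.minimum(release, target).sum() / target.sum()

# Hypothetical monthly releases vs. a minimum environmental flow target.
release = np.array([55, 60, 42, 38, 70, 90, 65, 40, 35, 50, 62, 58.0])
target = np.full(12, 45.0)
print(f"time reliability:       {time_reliability(release, target):.2f}")
print(f"volumetric reliability: {volumetric_reliability(release, target):.2f}")
```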

  19. LISA: a java API for performing simulations of trajectories for all types of balloons

    NASA Astrophysics Data System (ADS)

    Conessa, Huguette

    2016-07-01

    LISA (Librairie de Simulation pour les Aerostats) is a Java API for performing simulations of trajectories for all types of balloons (zero pressure balloons, pressurized balloons, infrared Montgolfier) and for all phases of flight (ascent, ceiling, descent). The goals of this library are to establish a reliable repository of balloon flight physics models, to capitalize on past developments, and to control the models used in different tools. It is already used by flight physics study software at CNES to understand and reproduce the behavior of balloons observed during real flights. It will be used operationally in the ground segment of the STRATEOLE2 mission. It was developed under the quality rules of "critical software." It is based on fundamental generic concepts, linking the simulation state variables to interchangeable calculation models. Each LISA model defines how to calculate a consistent set of state variables, combining validity checks. To perform a simulation for a given type of balloon and phase of flight, it is necessary to select or create a macro-model, that is, a consistent set of models chosen from among those offered by LISA, defining the behavior of the environment and the balloon. The purpose of this presentation is to introduce the main concepts of LISA and the new perspectives offered by this library.

  20. Comparison of Malaria Simulations Driven by Meteorological Observations and Reanalysis Products in Senegal.

    PubMed

    Diouf, Ibrahima; Rodriguez-Fonseca, Belen; Deme, Abdoulaye; Caminade, Cyril; Morse, Andrew P; Cisse, Moustapha; Sy, Ibrahima; Dia, Ibrahima; Ermert, Volker; Ndione, Jacques-André; Gaye, Amadou Thierno

    2017-09-25

    The analysis of the spatial and temporal variability of climate parameters is crucial for studying the impact of climate-sensitive vector-borne diseases such as malaria. The use of malaria models is an alternative way of producing potential historical malaria data for Senegal, given the lack of reliable observations of malaria outbreaks over a long time period. Consequently, here we use the Liverpool Malaria Model (LMM), driven by different climatic datasets, to study and validate simulated malaria parameters over Senegal. The findings confirm that the risk of malaria transmission is mainly linked to climate variables such as rainfall and temperature, as well as to specific landscape characteristics. For the whole of Senegal, a lag of two months is generally observed between the peak of rainfall in August and the maximum number of reported malaria cases in October. The malaria transmission season usually takes place from September to November, corresponding to the second peak of temperature in October. Observed malaria data from the Programme National de Lutte contre le Paludisme (PNLP, the National Malaria Control Programme in Senegal) were compared with outputs driven by the meteorological data used in this study. The malaria model outputs show some consistency with observed malaria dynamics over Senegal and further allow the exploration of simulations performed with reanalysis data sets over a longer time period. The simulated malaria risk decreased significantly during the 1970s and 1980s over Senegal. This result is consistent with the observed decrease in malaria vectors and malaria cases reported by field entomologists and clinicians in the literature. The main differences between model outputs and observations concern amplitude, and can be related not only to reanalysis deficiencies but also to other environmental and socio-economic factors that are not included in this mechanistic malaria model framework. The present study can be considered a validation of the reliability of reanalyses as inputs for the calculation of malaria parameters in the Sahel using dynamical malaria models.

  1. Improved reliability of wind turbine towers with active tuned mass dampers (ATMDs)

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Breiffni; Sarkar, Saptarshi; Staino, Andrea

    2018-04-01

    Modern multi-megawatt wind turbines are composed of slender, flexible, and lightly damped blades and towers. These components exhibit high susceptibility to wind-induced vibrations. As the size, flexibility, and cost of the towers have increased in recent years, the need to protect these structures against damage induced by turbulent aerodynamic loading has become apparent. This paper combines structural dynamic models and probabilistic assessment tools to demonstrate improvements in structural reliability when modern wind turbine towers are equipped with active tuned mass dampers (ATMDs). The study proposes a multi-modal wind turbine model for control design and analysis and incorporates an ATMD into the tower of this model. The model is subjected to stochastically generated wind loads of varying speeds to develop wind-induced probabilistic demand models for towers of modern multi-megawatt wind turbines under structural uncertainty. Numerical simulations have been carried out to ascertain the effectiveness of the active control system in improving the structural performance of the wind turbine and its reliability. The study constructs fragility curves, which illustrate reductions in the vulnerability of towers to wind loading owing to the inclusion of the damper. Results show that the active controller is successful in increasing the reliability of the tower responses. According to the analysis carried out in this paper, a strong reduction of the probability of exceeding a given displacement at the rated wind speed is observed.
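
    A minimal two-degree-of-freedom sketch of the underlying idea, reduced to a passive tuned mass damper (an active TMD would add a controlled actuator force between tower and damper mass). All parameter values are hypothetical, not the paper's turbine model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Tower reduced to a single mode (m1, k1, c1) plus an absorber (m2, k2, c2).
m1, k1, c1 = 3.0e5, 1.2e6, 6.0e3          # hypothetical modal properties
mu = 0.02                                  # damper-to-tower mass ratio
m2 = mu * m1
k2 = m2 * (k1 / m1)                        # tuned near the tower frequency
c2 = 2 * 0.1 * np.sqrt(k2 * m2)           # 10% damper damping ratio

def rhs(t, y, with_tmd):
    x1, v1, x2, v2 = y
    f = 5.0e4 * np.sin(2.0 * t)            # hypothetical near-resonant forcing
    f12 = (k2 * (x2 - x1) + c2 * (v2 - v1)) if with_tmd else 0.0
    a1 = (f - k1 * x1 - c1 * v1 + f12) / m1
    a2 = -f12 / m2 if with_tmd else 0.0
    return [v1, a1, v2, a2]

for with_tmd in (False, True):
    sol = solve_ivp(rhs, (0, 300), [0, 0, 0, 0], args=(with_tmd,),
                    max_step=0.05)
    peak = np.max(np.abs(sol.y[0]))
    print(f"with_tmd={with_tmd}: peak tower displacement = {peak:.3f} m")
```

    Even this simple passive configuration cuts the resonant peak sharply; the fragility curves in the paper quantify the analogous effect probabilistically for the active device.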

  2. Performance assessment of Large Eddy Simulation (LES) for modeling dispersion in an urban street canyon with tree planting

    NASA Astrophysics Data System (ADS)

    Moonen, P.; Gromke, C.; Dorer, V.

    2013-08-01

    The potential of a Large Eddy Simulation (LES) model to reliably predict near-field pollutant dispersion is assessed. To that extent, detailed time-resolved numerical simulations of coupled flow and dispersion are conducted for a street canyon with tree planting. Different crown porosities are considered. The model performance is assessed in several steps, ranging from a qualitative comparison to measured concentrations, over statistical data analysis by means of scatter plots and box plots, up to the calculation of objective validation metrics. The extensive validation effort highlights and quantifies notable features and shortcomings of the model, which would otherwise remain unnoticed. The model performance is found to be spatially non-uniform. Closer agreement with measurement data is achieved near the canyon ends than for the central part of the canyon, and typical model acceptance criteria are satisfied more easily for the leeward than for the windward canyon wall. This demonstrates the need for rigorous model evaluation. Only quality-assured models can be used with confidence to support assessment, planning and implementation of pollutant mitigation strategies.

  3. A comparative study of shallow groundwater level simulation with three time series models in a coastal aquifer of South China

    NASA Astrophysics Data System (ADS)

    Yang, Q.; Wang, Y.; Zhang, J.; Delgado, J.

    2017-05-01

    Accurate and reliable groundwater level forecasting models can help ensure the sustainable use of a watershed's aquifers for urban and rural water supply. In this paper, three time series analysis methods, Holt-Winters (HW), integrated time series (ITS), and seasonal autoregressive integrated moving average (SARIMA), are explored to simulate the groundwater level in a coastal aquifer in South China. Monthly groundwater table depth data collected over a long time series, from 2000 to 2011, are simulated and compared across the three time series models. Model errors are assessed using the coefficient of determination (R²), the Nash-Sutcliffe model efficiency coefficient (E), and the root-mean-square error. The results indicate that all three models are accurate in reproducing the historical time series of groundwater levels. The comparisons show that the HW model is more accurate in predicting groundwater levels than the SARIMA and ITS models. It is recommended that additional studies explore this proposed method, which can be used in turn to facilitate the development and implementation of more effective and sustainable groundwater management strategies.
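
    A sketch of this kind of comparison for two of the three models (Holt-Winters and SARIMA, via statsmodels), scored with the same error criteria (E and RMSE) on a synthetic monthly series; the series and the SARIMA orders are hypothetical, not the paper's fitted models.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.statespace.sarimax import SARIMAX

def nse(obs, sim):
    """Nash-Sutcliffe model efficiency coefficient E."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical monthly groundwater table depth with trend + seasonality,
# standing in for the 2000-2011 coastal-aquifer record.
rng = np.random.default_rng(3)
t = np.arange(144)
depth = 5 + 0.01 * t + 0.8 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.1, 144)
train, test = depth[:132], depth[132:]

hw = ExponentialSmoothing(train, trend="add", seasonal="add",
                          seasonal_periods=12).fit()
sarima = SARIMAX(train, order=(1, 1, 1),
                 seasonal_order=(1, 1, 1, 12)).fit(disp=False)

for name, fc in [("Holt-Winters", hw.forecast(12)),
                 ("SARIMA", sarima.forecast(12))]:
    print(f"{name}: E = {nse(test, fc):.3f}, "
          f"RMSE = {np.sqrt(np.mean((test - fc) ** 2)):.3f}")
```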

  4. Monte Carlo simulation methodology for the reliability of aircraft structures under damage tolerance considerations

    NASA Astrophysics Data System (ADS)

    Rambalakos, Andreas

    Current federal aviation regulations in the United States and around the world mandate that aircraft structures meet damage tolerance requirements throughout the service life. These requirements imply that the damaged aircraft structure must maintain adequate residual strength to sustain its integrity, which is accomplished by a continuous inspection program. The multifold objective of this research is to develop a methodology based on a direct Monte Carlo simulation process and to assess the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy, composed of elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed-form expressions for the system capacity cumulative distribution function (CDF) are developed by extending the existing expression for the capacity CDF of a three-element parallel system to parallel systems of up to six elements. These newly developed expressions are used to check the accuracy of a Monte Carlo simulation algorithm that determines the probability of failure of a parallel system composed of an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme, by utilizing the residual strength of the fasteners subjected to various initial load distributions and then to a new unequal load distribution resulting from subsequent sequential fastener failures. The final and main objective of this thesis is to present a methodology for computing the gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach. The uncertainties associated with the time to crack initiation, the probability of crack detection, the exponent in the crack propagation rate (Paris equation), and the yield strength of the elements are considered in the analytical model. The structural component is assumed to consist of a prescribed number of elements. This Monte Carlo simulation methodology is used to determine the required non-periodic inspections so that the reliability of the structural component does not fall below a prescribed minimum level. A sensitivity analysis is conducted to determine the effect of three key parameters on the specification of the non-periodic inspection intervals: namely, a parameter associated with the time to crack initiation, the applied nominal stress fluctuation, and the minimum acceptable reliability level.
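
    A minimal sketch of the first stage of such a simulation: Monte Carlo estimation of the failure probability of an equal-load-sharing parallel system with statistically independent element strengths, where the load redistributes over the survivors after each failure. The strength distribution and load level are hypothetical, not the thesis' data.

```python
import numpy as np

rng = np.random.default_rng(7)

def failure_probability(n_elem, load, n_sims=100_000):
    """Monte Carlo failure probability of an equal-load-sharing parallel
    system: after each element failure the load redistributes over the
    survivors, so with strengths sorted ascending the system capacity is
    max_k (n - k + 1) * s_(k); the system fails if capacity < load."""
    strengths = rng.normal(1.0, 0.15, (n_sims, n_elem))  # hypothetical i.i.d.
    caps = np.sort(strengths, axis=1) * np.arange(n_elem, 0, -1)
    return np.mean(caps.max(axis=1) < load)

# Failure probability vs. system size for a total load of 0.7 per element.
for n in (3, 4, 5, 6):
    print(f"n = {n}: estimated Pf = {failure_probability(n, load=0.7 * n):.4f}")
```

    The thesis' closed-form CDF expressions for three to six elements serve exactly this role: an exact benchmark against which the Monte Carlo estimates can be checked.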

  5. Multisite stochastic simulation of daily precipitation from copula modeling with a gamma marginal distribution

    NASA Astrophysics Data System (ADS)

    Lee, Taesam

    2018-05-01

    Multisite stochastic simulations of daily precipitation have been widely employed in hydrologic analyses for climate change assessment and as agricultural model inputs. Recently, a copula model with a gamma marginal distribution has become one of the common approaches for simulating precipitation at multiple sites. Here, we tested the correlation structure of the copula modeling. The results indicate a significant underestimation of the correlation in the simulated data compared to the observed data. Therefore, we proposed an indirect method for estimating the cross-correlations when simulating precipitation at multiple stations, using the full relationship between the correlation of the observed data and that of the normally transformed data. Although this indirect method offers certain improvements in preserving the cross-correlations between sites in the original domain, it was not reliable in application. Therefore, we further improved a simulation-based method (SBM) that was developed to model multisite precipitation occurrence. The SBM preserved the cross-correlations of the original domain well, reproducing cross-correlations around 0.2 higher than the direct method and around 0.1 higher than the indirect method. The three models were applied to stations in the Nakdong River basin, and the SBM was the best alternative for reproducing the historical cross-correlation. The direct method significantly underestimates the correlations among the observed data, and the indirect method proved unreliable.
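
    A minimal sketch of the "direct" Gaussian-copula-with-gamma-marginals approach for two sites, illustrating the correlation attenuation the paper addresses; all parameter values are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
rho = 0.6                                   # target correlation in normal space
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)

u = stats.norm.cdf(z)                       # uniform scores (the copula)
p1 = stats.gamma.ppf(u[:, 0], a=0.7, scale=8.0)   # site-1 amounts (mm)
p2 = stats.gamma.ppf(u[:, 1], a=0.9, scale=6.0)   # site-2 amounts (mm)

# The back-transform shrinks the correlation in the original domain --
# the underestimation the paper corrects with its simulation-based method.
print(f"normal-space rho: {rho:.2f}")
print(f"gamma-space rho:  {np.corrcoef(p1, p2)[0, 1]:.2f}")
```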

  6. Reliability measures in item response theory: manifest versus latent correlation functions.

    PubMed

    Milanzi, Elasma; Molenberghs, Geert; Alonso, Ariel; Verbeke, Geert; De Boeck, Paul

    2015-02-01

    For item response theory (IRT) models, which belong to the class of generalized linear or non-linear mixed models, reliability at the scale of observed scores (i.e., manifest correlation) is more difficult to calculate than latent correlation based reliability, but usually of greater scientific interest. This is not least because it cannot be calculated explicitly when the logit link is used in conjunction with normal random effects. As such, approximations such as Fisher's information coefficient, Cronbach's α, or the latent correlation are calculated, allegedly because it is easy to do so. Cronbach's α has well-known and serious drawbacks, Fisher's information is not meaningful under certain circumstances, and there is an important but often overlooked difference between latent and manifest correlations. Here, manifest correlation refers to correlation between observed scores, while latent correlation refers to correlation between scores at the latent (e.g., logit or probit) scale. Thus, using one in place of the other can lead to erroneous conclusions. Taylor series based reliability measures, which are based on manifest correlation functions, are derived and a careful comparison of reliability measures based on latent correlations, Fisher's information, and exact reliability is carried out. The latent correlations are virtually always considerably higher than their manifest counterparts, Fisher's information measure shows no coherent behaviour (it is even negative in some cases), while the newly introduced Taylor series based approximations reflect the exact reliability very closely. Comparisons among the various types of correlations, for various IRT models, are made using algebraic expressions, Monte Carlo simulations, and data analysis. Given the light computational burden and the performance of Taylor series based reliability measures, their use is recommended. © 2014 The British Psychological Society.

  7. Experimental validation of ultrasonic NDE simulation software

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Larche, Michael; Diaz, Aaron A.; Crawford, Susan L.; Prowant, Matthew S.; Anderson, Michael T.

    2016-02-01

    Computer modeling and simulation are becoming essential tools for transducer design and for insight into ultrasonic nondestructive evaluation (UT-NDE). As the popularity of simulation tools for UT-NDE increases, it becomes important to assess their reliability in modeling acoustic responses from defects in operating components and in providing information consistent with in-field inspection data. This includes information about the detectability of different defect types for a given UT probe. Recently, a cooperative program between the Electric Power Research Institute and the U.S. Nuclear Regulatory Commission was established to validate numerical modeling software commonly used for simulating UT-NDE of nuclear power plant components. In the first phase of this program, extensive experimental UT measurements were conducted on machined notches of varying depth, length, and orientation in stainless steel plates. The notches were then modeled in CIVA, a semi-analytical NDE simulation platform developed by the French Commissariat a l'Energie Atomique, and their responses compared with the experimental measurements. Discrepancies between experimental and simulation results are due either to improper inputs to the simulation model or to incorrect approximations and assumptions in the numerical models. To address the former, a variation study was conducted on the different parameters required as inputs for the model, specifically the specimen and transducer properties. The ability of the simulations to give accurate predictions regarding the detectability of the different defects was then demonstrated, including results on the variations in defect amplitude indications and on the ratios between tip-diffracted and specular signal amplitudes.

  8. [Parameter sensitivity of simulating net primary productivity of Larix olgensis forest based on BIOME-BGC model].

    PubMed

    He, Li-hong; Wang, Hai-yan; Lei, Xiang-dong

    2016-02-01

    Models based on vegetation ecophysiological processes contain many parameters, and reasonable parameter values greatly improve simulation ability. Sensitivity analysis, as an important method for screening out the sensitive parameters, can comprehensively analyze how model parameters affect the simulation results. In this paper, we conducted a parameter sensitivity analysis of the BIOME-BGC model with a case study of simulating the net primary productivity (NPP) of Larix olgensis forest in Wangqing, Jilin Province. First, by comparing field measurements with simulation results, we tested the BIOME-BGC model's capability to simulate the NPP of L. olgensis forest. Then, the Morris and EFAST sensitivity methods were used to screen the sensitive parameters that strongly influence NPP. On this basis, we also quantitatively estimated the sensitivity of the screened parameters and calculated the global, first-order, and second-order sensitivity indices. The results showed that the BIOME-BGC model could simulate the NPP of L. olgensis forest in the sample plot well. The Morris method provided a reliable parameter sensitivity analysis result with a relatively small sample size. The EFAST method could quantitatively measure the impact of a single parameter on the simulation result as well as the interactions between parameters in the BIOME-BGC model. The sensitive parameters influencing L. olgensis forest NPP were the new-stem-carbon to new-leaf-carbon allocation ratio and the leaf carbon-to-nitrogen ratio; the effect of their interaction was significantly greater than the interaction effects of the other parameters.
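
    A minimal sketch of Morris-style elementary-effects screening on a stand-in response function (BIOME-BGC itself is not reproduced here): mu* ranks overall parameter influence, while sigma flags nonlinearity or interactions. Everything in this sketch is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

def model(x):
    """Hypothetical stand-in for a BIOME-BGC NPP run: x holds normalized
    parameters (e.g., allocation fractions, C:N ratio) in [0, 1]."""
    return 3.0 * x[0] + 2.0 * x[1] ** 2 + 0.1 * x[2] + x[0] * x[1]

def morris_screening(model, n_params, n_trajectories=50, delta=0.2):
    """One-at-a-time elementary effects along random trajectories."""
    effects = np.zeros((n_trajectories, n_params))
    for t in range(n_trajectories):
        x = rng.uniform(0, 1 - delta, n_params)
        y0 = model(x)
        for i in rng.permutation(n_params):
            x_new = x.copy()
            x_new[i] += delta
            y1 = model(x_new)
            effects[t, i] = (y1 - y0) / delta
            x, y0 = x_new, y1
    return np.abs(effects).mean(axis=0), effects.std(axis=0)

mu_star, sigma = morris_screening(model, n_params=3)
for i, (m, s) in enumerate(zip(mu_star, sigma)):
    print(f"param {i}: mu* = {m:.2f}, sigma = {s:.2f}")
```

    EFAST would go further, decomposing the output variance into first-order and interaction contributions, at the cost of many more model runs.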

  9. Determination of Quantum Chemistry Based Force Fields for Molecular Dynamics Simulations of Aromatic Polymers

    NASA Technical Reports Server (NTRS)

    Jaffe, Richard; Langhoff, Stephen R. (Technical Monitor)

    1995-01-01

    Ab initio quantum chemistry calculations for model molecules can be used to parameterize force fields for molecular dynamics simulations of polymers. Emphasis in our research group is on using quantum chemistry-based force fields for molecular dynamics simulations of organic polymers in the melt and glassy states, but the methodology is applicable to simulations of small molecules, multicomponent systems and solutions. Special attention is paid to deriving reliable descriptions of the non-bonded and electrostatic interactions. Several procedures have been developed for deriving and calibrating these parameters. Our force fields for aromatic polyimide simulations will be described. In this application, the intermolecular interactions are the critical factor in determining many properties of the polymer (including its color).

  10. An application of sedimentation simulation in Tahe oilfield

    NASA Astrophysics Data System (ADS)

    Tingting, He; Lei, Zhao; Xin, Tan; Dongxu, He

    2017-12-01

    A braided river delta developed in the Triassic lower oil formation in block 9 of the Tahe oilfield, but its sedimentation evolution process is unclear. Using sedimentation simulation technology, the sedimentation process and distribution of the braided river delta are studied based on geological parameters including sequence stratigraphic division, initial sedimentation environment, relative lake level change and accommodation change, source supply, and sedimentary transport pattern. The simulation results show that the error between simulated and actual strata thickness is small, and the single-well analysis from the simulation is highly consistent with the actual analysis, which proves that the model is reliable. The study area underwent a braided river delta retrogradation evolution process, which provides a favorable basis for fine reservoir description and prediction.

  11. Large Eddy Simulation of Gravitational Effects on Transitional and Turbulent Gas-Jet Diffusion Flames

    NASA Technical Reports Server (NTRS)

    Givi, Peyman; Jaberi, Farhad A.

    2001-01-01

    The basic objective of this work is to assess the influence of gravity on the compositional and spatial structures of transitional and turbulent diffusion flames via large eddy simulation (LES) and direct numerical simulation (DNS). The DNS is conducted for appraisal of the various closures employed in LES, and to study the effect of buoyancy on small-scale flow features. The LES is based on our "filtered mass density function" (FMDF) model. The novelty of the methodology is that it allows for reliable simulations with the inclusion of realistic physics. It also allows for detailed analysis of the unsteady large-scale flow evolution and compositional flame structure, which is not usually possible via Reynolds-averaged simulations.

  12. Research on Harmonic Characteristic of Electronic Current Transformer Based on the Rogowski Coil

    NASA Astrophysics Data System (ADS)

    Shen, Diqiu; Hu, Bei; Wang, Xufeng; Zhu, Mingdong; Wang, Liang; Lu, Wenxing

    2017-05-01

    Nonlinear loads present in the power system cause distortion of the AC sine wave and generate harmonics, which have a severe impact on the accuracy of energy metering and the reliability of relay protection. To satisfy the requirements of energy metering and relay protection for the new generation of intelligent substations, the mathematical model and transfer characteristics of Rogowski coil sensors were studied in this paper based on the working principle of the Rogowski coil current transformer, and the frequency response characteristics of the Rogowski coil current transformer system were analysed. Finally, the frequency response of the Rogowski coil current transformer at the 2nd to 13th harmonics was simulated and tested experimentally. Simulation and experiments show that the Rogowski coil current transformer can meet the 0.2 accuracy class requirements for harmonic power measurement in the power system and can measure the harmonic components of the grid reliably.
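
    The mathematical model of an ideal Rogowski coil reduces to the standard mutual-inductance relation (a textbook form, not the paper's full transfer function):

    $$
    v(t) = -M\,\frac{di(t)}{dt}, \qquad M = \mu_0 n A,
    $$

    where $n$ is the turn density, $A$ the cross-sectional area of a single turn, and $i(t)$ the measured conductor current. Because the raw output is proportional to $di/dt$, an integrating stage is needed to recover a signal proportional to $i(t)$, and the flatness of the combined response across the 2nd to 13th harmonics is what the simulations and experiments above verify.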

  13. Frequency Distribution in Domestic Microwave Ovens and Its Influence on Heating Pattern.

    PubMed

    Luan, Donglei; Wang, Yifen; Tang, Juming; Jain, Deepali

    2017-02-01

    In this study, snapshots of operating frequency profiles of domestic microwave ovens were collected to reveal the extent of microwave frequency variations under different operation conditions. A computer simulation model was developed based on the finite difference time domain method to analyze the influence of the shifting frequency on heating patterns of foods in a microwave oven. The results showed that the operating frequencies of empty and loaded domestic microwave ovens varied widely even among ovens of the same model purchased on the same date. Each microwave oven had its unique characteristic operating frequencies, which were also affected by the location and shape of the load. The simulated heating patterns of a gellan gel model food when heated on a rotary plate agreed well with the experimental results, which supported the reliability of the developed simulation model. Simulation indicated that the heating patterns of a stationary model food load changed with the varying operating frequency. However, the heating pattern of a rotary model food load was not sensitive to microwave frequencies due to the severe edge heating overshadowing the effects of the frequency variations. © 2016 Institute of Food Technologists®.
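
    The oven model above is a full 3-D coupled electromagnetic-thermal simulation; the finite difference time domain core it rests on can be illustrated in one dimension. A minimal 1-D FDTD sketch in normalized units (grid size, step count, and source frequency are all illustrative, not the paper's settings):

```python
import numpy as np

# Minimal 1-D FDTD (Yee) leapfrog update in normalized units, illustrating
# the E/H staggered scheme that 3-D microwave-oven simulations build on.
nz, nt = 400, 1000
S = 0.5                      # Courant number (c * dt / dz), < 1 for stability
ez = np.zeros(nz)            # electric field at integer grid points
hy = np.zeros(nz - 1)        # magnetic field at half-integer grid points

for n in range(nt):
    # Update H from the spatial difference (curl) of E, half a step later.
    hy += S * (ez[1:] - ez[:-1])
    # Update E from the spatial difference (curl) of H, interior nodes only.
    ez[1:-1] += S * (hy[1:] - hy[:-1])
    # Soft sinusoidal source at the center (a scaled stand-in for the feed).
    ez[nz // 2] += np.sin(2 * np.pi * 0.05 * n)

print(ez.max())  # field amplitude after nt steps
```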

  14. 3D liver volume reconstructed for palpation training.

    PubMed

    Tibamoso, Gerardo; Perez-Gutierrez, Byron; Uribe-Quevedo, Alvaro

    2013-01-01

    Virtual reality systems for medical procedures such as the palpation of different organs require fast, robust, accurate, and reliable computational methods to provide realism during interaction with 3D biological models. This paper presents the segmentation, reconstruction, and palpation simulation of a healthy liver volume as a tool for training. The chosen method considers the mechanical characteristics and properties of the liver to correctly simulate palpation interactions, which makes it appropriate as a complementary tool for training medical students to familiarize themselves with liver anatomy.

  15. A Holistic Approach to Systems Development

    NASA Technical Reports Server (NTRS)

    Wong, Douglas T.

    2008-01-01

    Introduces a holistic and iterative design process: a continuous process that can be loosely divided into four stages, with more effort spent early in the design. The approach is human-centered and multidisciplinary, with emphasis on life-cycle cost and extensive use of modeling, simulation, mockups, human subjects, and proven technologies. Human-centered design doesn't mean the human factors discipline is the most important; many disciplines should be involved in the design: subsystem vendors, configuration management, operations research, manufacturing engineering, simulation/modeling, cost engineering, hardware engineering, software engineering, test and evaluation, human factors, electromagnetic compatibility, integrated logistics support, reliability/maintainability/availability, safety engineering, test equipment, training systems, design-to-cost, life-cycle cost, application engineering, etc.

  16. Summative Objective Structured Clinical Examination Assessment at the End of Anesthesia Residency for Perioperative Ultrasound.

    PubMed

    Mitchell, John D; Amir, Rabia; Montealegre-Gallegos, Mario; Mahmood, Feroze; Shnider, Marc; Mashari, Azad; Yeh, Lu; Bose, Ruma; Wong, Vanessa; Hess, Philip; Amador, Yannis; Jeganathan, Jelliffe; Jones, Stephanie B; Matyal, Robina

    2018-06-01

    While standardized examinations and data from simulators and phantom models can assess knowledge and manual skills for ultrasound, an Objective Structured Clinical Examination (OSCE) could assess workflow understanding. We recruited 8 experts to develop an OSCE to assess workflow understanding in perioperative ultrasound. The experts used a binary grading system to score 19 graduating anesthesia residents at 6 stations. Overall average performance was 86.2%, and 3 stations had an acceptable internal reliability (Kuder-Richardson formula 20 coefficient >0.5). After refinement, this OSCE can be combined with standardized examinations and data from simulators and phantom models to assess proficiency in ultrasound.
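
    The internal-reliability criterion quoted here is the Kuder-Richardson formula 20, KR-20 = (k/(k-1)) * (1 - Σ p_i q_i / σ_X²), for k binary-scored items with pass proportions p_i (q_i = 1 - p_i) and total-score variance σ_X². A small sketch with an illustrative score matrix (the data below are synthetic, not the study's):

```python
import numpy as np

def kr20(scores):
    """Kuder-Richardson formula 20 for a residents-by-items 0/1 score matrix."""
    k = scores.shape[1]                          # number of binary items
    p = scores.mean(axis=0)                      # proportion passing each item
    item_var = (p * (1 - p)).sum()               # sum of item variances p*q
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Illustrative data: 19 residents, 6 binary checklist items at one station.
rng = np.random.default_rng(0)
scores = (rng.random((19, 6)) < 0.86).astype(int)
print(round(kr20(scores), 2))  # station kept if coefficient > 0.5
```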

  17. An analytic model for footprint dispersions and its application to mission design

    NASA Technical Reports Server (NTRS)

    Rao, J. R. Jagannatha; Chen, Yi-Chao

    1992-01-01

    This is the final report on our recent research activities that are complementary to those conducted by our colleagues, Professor Farrokh Mistree and students, in the context of the Taguchi method. We have studied the mathematical model that forms the basis of the Simulation and Optimization of Rocket Trajectories (SORT) program and developed an analytic method for determining mission reliability with a reduced number of flight simulations. This method can be incorporated in a design algorithm to mathematically optimize different performance measures of a mission, thus leading to a robust and easy-to-use methodology for mission planning and design.

  18. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  19. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calderer, Antoni; Yang, Xiaolei; Angelidis, Dionysios

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  20. Modified chloride diffusion model for concrete under the coupling effect of mechanical load and chloride salt environment

    NASA Astrophysics Data System (ADS)

    Lei, Mingfeng; Lin, Dayong; Liu, Jianwen; Shi, Chenghua; Ma, Jianjun; Yang, Weichao; Yu, Xiaoniu

    2018-03-01

    For the purpose of investigating lining concrete durability, this study derives a modified chloride diffusion model for concrete based on the odd continuation of boundary conditions and the Fourier transform. The linear stress distribution on a sectional structure is considered, and detailed procedures and methods are presented for model verification and parametric analysis. Simulation results show that the chloride diffusion model can reflect the effects of the linear stress distribution of the sectional structure on chloride diffusivity with reliable accuracy. Drawing on the natural environmental characteristics of practical engineering structures, reference value ranges for the model parameters are provided. Furthermore, the chloride diffusion model is extended to consider the multi-factor coupling of linear stress distribution, chloride concentration, and diffusion time. Comparison between model simulations and typical current research results shows that the presented model produces better agreement and greater generality.
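
    The unmodified baseline for such models is Fick's second law with a constant surface chloride concentration, whose classical solution is (a standard result; the paper's contribution lies in making the diffusivity depend on the linear stress distribution):

    $$
    \frac{\partial C}{\partial t} = D\,\frac{\partial^2 C}{\partial x^2},
    \qquad
    C(x,t) = C_s\left[1 - \operatorname{erf}\!\left(\frac{x}{2\sqrt{D t}}\right)\right],
    $$

    where $C_s$ is the surface chloride concentration, $x$ the depth from the exposed face, and $D$ the chloride diffusion coefficient, which the modified model replaces with a stress-dependent value.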

  1. Validity evidence and reliability of a simulated patient feedback instrument

    PubMed Central

    2012-01-01

    Background In the training of healthcare professionals, one of the advantages of communication training with simulated patients (SPs) is the SP's ability to provide direct feedback to students after a simulated clinical encounter. The quality of SP feedback must be monitored, especially because it is well known that feedback can have a profound effect on student performance. Due to the current lack of valid and reliable instruments to assess the quality of SP feedback, our study examined the validity and reliability of one potential instrument, the 'modified Quality of Simulated Patient Feedback Form' (mQSF). Methods Content validity of the mQSF was assessed by inviting experts in the area of simulated clinical encounters to rate the importance of the mQSF items. Moreover, generalizability theory was used to examine the reliability of the mQSF. Our data came from videotapes of clinical encounters between six simulated patients and six students and the ensuing feedback from the SPs to the students. Ten faculty members judged the SP feedback according to the items on the mQSF. Three weeks later, this procedure was repeated with the same faculty members and recordings. Results All but two items of the mQSF received importance ratings of > 2.5 on a four-point rating scale. A generalizability coefficient of 0.77 was established with two judges observing one encounter. Conclusions The findings for content validity and reliability with two judges suggest that the mQSF is a valid and reliable instrument to assess the quality of feedback provided by simulated patients. PMID:22284898
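
    For context, a generalizability coefficient of the kind reported follows the standard G-theory form (a textbook expression; the authors' exact variance-component design may differ):

    $$
    E\rho^2 = \frac{\sigma_p^2}{\sigma_p^2 + \sigma_{\delta}^2 / n_j},
    $$

    where $\sigma_p^2$ is the variance among the objects of measurement (here, SP feedback encounters), $\sigma_{\delta}^2$ the relative error variance attributable to judges, and $n_j$ the number of judges averaged over; the reported coefficient of 0.77 corresponds to $n_j = 2$.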

  2. MUSICA MetOp/IASI {H2O,δD} pair retrieval simulations for validating tropospheric moisture pathways in atmospheric models

    NASA Astrophysics Data System (ADS)

    Schneider, Matthias; Borger, Christian; Wiegele, Andreas; Hase, Frank; García, Omaira E.; Sepúlveda, Eliezer; Werner, Martin

    2017-02-01

    The project MUSICA (MUlti-platform remote Sensing of Isotopologues for investigating the Cycle of Atmospheric water) has shown that the sensor IASI aboard the satellite MetOp can measure the free tropospheric {H2O,δD} pair distribution twice per day on a quasi-global scale. Such data are very promising for investigating tropospheric moisture pathways; however, the complex data characteristics compromise their usage in model evaluation studies. Here we present a tool for simulating MUSICA MetOp/IASI {H2O,δD} pair remote sensing data for a given model atmosphere, thereby creating model data that incorporate the characteristics of the remote sensing data. These model data can then be compared to the MUSICA data. The retrieval simulation method is based on the physical principles of radiative transfer, and we show that the uncertainty of the simulations is within the uncertainty of the MUSICA MetOp/IASI products, i.e. the retrieval simulations are sufficiently reliable. We demonstrate the working principle of the simulator by applying it to ECHAM5-wiso model data. A few case studies clearly reveal the large potential of the MUSICA MetOp/IASI {H2O,δD} data pairs for evaluating modelled moisture pathways. The tool is made freely available in the form of MATLAB and Python routines and can easily be connected to any atmospheric water vapour isotopologue model.

  3. Fluid dynamics simulation for design on sludge drying equipment

    NASA Astrophysics Data System (ADS)

    Li, Shuiping; Liang, Wang; Kai, Zhang

    2017-10-01

    Sludge drying equipment is a key component in sludge drying disposal; the structure of the drying equipment directly affects the drying of the sludge, so it is necessary to analyse the performance of drying equipment with different structures. The Fluent software conveniently yields the distribution of the flow field and temperature field inside the drying equipment, which reflects the performance of the structure. In this paper, the outlet position of the sludge and the shape of the sludge inlet are designed. The geometrical model of the drying equipment is established using the pre-processing software Gambit, and the meshing of the model is carried out. The Eulerian model is used to simulate the flow of each phase and the interactions between them, and the realizable k-ε turbulence model is used to simulate the turbulence of each phase. Finally, the simulation results of the schemes are compared, the optimal structural scheme is obtained, and the operational requirements are proposed. CFD theory provides a reliable basis for drying equipment research and reduces the time and cost of the research.

  4. Investigation and Development of Control Laws for the NASA Langley Research Center Cockpit Motion Facility

    NASA Technical Reports Server (NTRS)

    Coon, Craig R.; Cardullo, Frank M.; Zaychik, Kirill B.

    2014-01-01

    The ability to develop highly advanced simulators is a critical need with the potential to significantly impact the aerospace industry. The aerospace industry is advancing at an ever-increasing pace, and flight simulators must match this development. To address both current problems and potential advancements in flight simulation techniques, several aspects of the current control law technology of the National Aeronautics and Space Administration (NASA) Langley Research Center's Cockpit Motion Facility (CMF) motion base simulator were examined. Linear models based upon hardware data were first investigated to ensure that the most accurate models were used. This research identified both system improvements in the bandwidth and more reliable linear models. Advancements in the compensator design were developed and verified through multiple techniques. Position error rate feedback, acceleration feedback, and force feedback were all analyzed in the heave direction using the nonlinear model of the hardware. Improvements were made using the position error rate feedback technique. The acceleration feedback compensator also provided noteworthy improvement, while attempts at implementing a force feedback compensator proved unsuccessful.

  5. Adaptive resolution simulation of oligonucleotides

    NASA Astrophysics Data System (ADS)

    Netz, Paulo A.; Potestio, Raffaello; Kremer, Kurt

    2016-12-01

    Nucleic acids are characterized by a complex hierarchical structure and a variety of interaction mechanisms with other molecules. These features suggest the need for multiscale simulation methods in order to grasp the relevant physical properties of deoxyribonucleic acid (DNA) and RNA using in silico experiments. Here we report a dual-resolution model of a DNA oligonucleotide in physiological conditions; in the presented setup, only the oligonucleotide and the solvent and ions in its proximity are described at the atomistic level, whereas the water molecules and ions far from the DNA are represented as computationally less expensive coarse-grained particles. Through the analysis of several structural and dynamical parameters, we show that this setup reliably reproduces the physical properties of the DNA molecule as observed in reference atomistic simulations. These results represent a first step towards a realistic multiscale modeling of nucleic acids and provide a quantitatively solid ground for their simulation using dual-resolution methods.
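
    In adaptive-resolution schemes of this family, the atomistic and coarse-grained regions are coupled by a smooth force interpolation (the expression below is the standard AdResS formulation from the general literature; the notation may differ from this paper's):

    $$
    \mathbf{F}_{\alpha\beta} = w(X_\alpha)\,w(X_\beta)\,\mathbf{F}^{\mathrm{AT}}_{\alpha\beta}
    + \bigl[1 - w(X_\alpha)\,w(X_\beta)\bigr]\,\mathbf{F}^{\mathrm{CG}}_{\alpha\beta},
    $$

    where $w \in [0,1]$ equals 1 in the atomistic region around the DNA and 0 in the coarse-grained bulk, so that water molecules and ions change resolution on the fly as they diffuse across the hybrid region.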

  6. Scale-dependent performances of CMIP5 earth system models in simulating terrestrial vegetation carbon

    NASA Astrophysics Data System (ADS)

    Jiang, L.; Luo, Y.; Yan, Y.; Hararuk, O.

    2013-12-01

    Mitigation of global change will depend on reliable projections of the future climate. As the major tools for predicting future climate, the Earth System Models (ESMs) used in the Coupled Model Intercomparison Project Phase 5 (CMIP5) for the IPCC Fifth Assessment Report have incorporated carbon cycle components, which account for the important fluxes of carbon between the ocean, atmosphere, and terrestrial biosphere carbon reservoirs, and are therefore expected to provide more detailed and more certain projections. However, ESMs are never perfect, and evaluating them can help us identify uncertainties in prediction and set priorities for model development. In this study, we benchmarked the carbon in live vegetation in terrestrial ecosystems simulated by 19 ESMs from CMIP5 against an observationally estimated data set of the global vegetation carbon pool, 'Olson's Major World Ecosystem Complexes Ranked by Carbon in Live Vegetation: An Updated Database Using the GLC2000 Land Cover Product' (Gibbs, 2006). Our aim is to evaluate the ability of ESMs to reproduce the global vegetation carbon pool at different scales and to identify possible causes of bias. We found that the performance of the CMIP5 ESMs is very scale-dependent. CESM1-BGC, CESM1-CAM5, CESM1-FASTCHEM, CESM1-WACCM, NorESM1-M, and NorESM1-ME (which share the same model structure) have global sums very similar to the observational data but usually perform poorly at the grid cell and biome scales. In contrast, MIROC-ESM and MIROC-ESM-CHEM simulate best at the grid cell and biome scales but show larger differences in global sums than the others. Our results will help improve CMIP5 ESMs toward more reliable predictions.

  7. Image-Based Computational Fluid Dynamics in Blood Vessel Models: Toward Developing a Prognostic Tool to Assess Cardiovascular Function Changes in Prolonged Space Flights

    NASA Technical Reports Server (NTRS)

    Chatzimavroudis, George P.; Spirka, Thomas A.; Setser, Randolph M.; Myers, Jerry G.

    2004-01-01

    One of NASA's objectives is to be able to perform a complete, pre-flight evaluation of cardiovascular changes in astronauts scheduled for prolonged space missions. Computational fluid dynamics (CFD) has shown promise as a method for estimating cardiovascular function during reduced-gravity conditions. For this purpose, MRI can provide geometrical information to reconstruct vessel geometries and can measure all spatial velocity components, providing location-specific boundary conditions. The objective of this study was to investigate the reliability of MRI-based model reconstruction and measured boundary conditions for CFD simulations. An aortic arch model and a carotid bifurcation model were scanned in a 1.5T Siemens MRI scanner. Axial MRI acquisitions provided images for geometry reconstruction (slice thickness 3 and 5 mm; pixel size 1x1 and 0.5x0.5 mm). Velocity acquisitions provided measured inlet boundary conditions and localized three-directional steady-flow velocity data (0.7-3.0 L/min). The vessel walls were isolated using NIH-provided software (ImageJ) and lofted to form the geometric surface. Constructed and idealized geometries were imported into a commercial CFD code for meshing and simulation. Contour and vector plots of the velocity showed identical features between the MRI velocity data, the MRI-based CFD data, and the idealized-geometry CFD data, with less than 10% differences in the local velocity values. CFD results on models reconstructed from different MRI resolution settings showed insignificant differences (less than 5%). This study illustrated, quantitatively, that reliable CFD simulations can be performed with MRI-reconstructed models and gives evidence that a future, subject-specific, computational evaluation of cardiovascular system alterations during space travel is feasible.

  8. Business intelligence modeling in launch operations

    NASA Astrophysics Data System (ADS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-05-01

    The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. Emerging patterns of sets of processes rather than organizational units, leading to end-to-end automation, are becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems. This technology project is to advance an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long term sustainability. A planning and analysis test bed is needed for evaluation of enterprise level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root cause from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems.

  9. Business Intelligence Modeling in Launch Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-01-01

    This technology project is to advance an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long term sustainability. A planning and analysis test bed is needed for evaluation of enterprise level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root cause from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems. The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. Emerging patterns of sets of processes rather than organizational units, leading to end-to-end automation, are becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems.

  10. Automation reliability in unmanned aerial vehicle control: a reliance-compliance model of automation dependence in high workload.

    PubMed

    Dixon, Stephen R; Wickens, Christopher D

    2006-01-01

    Two experiments were conducted in which participants navigated a simulated unmanned aerial vehicle (UAV) through a series of mission legs while searching for targets and monitoring system parameters. The goal of the study was to highlight the qualitatively different effects of automation false alarms and misses as they relate to operator compliance and reliance, respectively. Background data suggest that automation false alarms cause reduced compliance, whereas misses cause reduced reliance. In two studies, 32 and 24 participants, including some licensed pilots, performed in-lab UAV simulations that presented the visual world and collected dependent measures. Results indicated that with the low-reliability aids, false alarms correlated with poorer performance in the system failure task, whereas misses correlated with poorer performance in the concurrent tasks. Compliance and reliance do appear to be affected by false alarms and misses, respectively, and are relatively independent of each other. Practical implications are that automated aids must be fairly reliable to provide global benefits and that false alarms and misses have qualitatively different effects on performance.

  11. Inference for Stochastic Chemical Kinetics Using Moment Equations and System Size Expansion.

    PubMed

    Fröhlich, Fabian; Thomas, Philipp; Kazeroonian, Atefeh; Theis, Fabian J; Grima, Ramon; Hasenauer, Jan

    2016-07-01

    Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization methods and uncertainty analysis methods for MA and SSE. Efficiency and reliability of the methods are assessed using simulation examples as well as by an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RRE. Furthermore, the simulation examples revealed that the resulting estimates are more reliable for an intermediate volume regime. In this regime the estimation error is reduced and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity.
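
    To make the contrast with RREs concrete, consider the simplest birth-death process (production at constant rate $k$, first-order degradation at rate $\gamma$); this textbook example, not the paper's JAK/STAT model, closes exactly at second order:

    $$
    \frac{d\mu}{dt} = k - \gamma\mu, \qquad
    \frac{d\sigma^2}{dt} = k + \gamma\mu - 2\gamma\sigma^2,
    $$

    so at steady state $\mu = \sigma^2 = k/\gamma$. The RRE retains only the first equation; fitting the mean and variance jointly, as MA and SSE allow, adds an independent constraint on $k$ and $\gamma$, which is why parameter identifiability can improve even when only population-average data are available.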

  12. Inference for Stochastic Chemical Kinetics Using Moment Equations and System Size Expansion

    PubMed Central

    Thomas, Philipp; Kazeroonian, Atefeh; Theis, Fabian J.; Grima, Ramon; Hasenauer, Jan

    2016-01-01

    Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization methods and uncertainty analysis methods for MA and SSE. Efficiency and reliability of the methods are assessed using simulation examples as well as by an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RRE. Furthermore, the simulation examples revealed that the resulting estimates are more reliable for an intermediate volume regime. In this regime the estimation error is reduced and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity. PMID:27447730

  13. Spread prediction model of continuous steel tube based on BP neural network

    NASA Astrophysics Data System (ADS)

    Zhai, Jian-wei; Yu, Hui; Zou, Hai-bei; Wang, San-zhong; Liu, Li-gang

    2017-07-01

    Based on the roll pass geometry and process parameters of a three-roller continuous mandrel rolling mill in a factory, a finite element model is established to simulate the continuous rolling process of seamless steel tube, and the reliability of the finite element model is verified by comparing simulated and actual results for rolling force, wall thickness, and outer diameter of the tube. The effects of roller reduction, roller rotation speed, and blooming temperature on the spread rule are studied. Based on BP (back propagation) neural network technology, a spread prediction model for the continuously rolled tube is established by training on the wall thickness coefficient and spread coefficient, enabling rapid and accurate prediction of the continuously rolled tube size.
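
    A minimal sketch of the prediction stage (scikit-learn's MLPRegressor stands in for the BP network; the feature set, ranges, network size, and synthetic training data below are illustrative, whereas the real model is trained on the finite element results):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Illustrative training set standing in for FE results: roller reduction (mm),
# roller speed (rad/s), blooming temperature (deg C) -> spread coefficient.
rng = np.random.default_rng(1)
X = rng.uniform([2.0, 5.0, 950.0], [8.0, 12.0, 1100.0], size=(200, 3))
y = 0.02 * X[:, 0] - 0.005 * X[:, 1] + 4e-4 * X[:, 2] + rng.normal(0, 0.01, 200)

# A small back-propagation network (hidden layer size is a guess).
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X, y)
print(model.predict([[5.0, 8.0, 1020.0]]))  # predicted spread coefficient
```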

  14. On fitting generalized linear mixed-effects models for binary responses using different statistical packages.

    PubMed

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W; Xia, Yinglin; Zhu, Liang; Tu, Xin M

    2011-09-10

    The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. Copyright © 2011 John Wiley & Sons, Ltd.
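
    One of the procedures such cross-package comparisons commonly cover is GEE with a working correlation structure; a minimal statsmodels sketch on simulated clustered binary data (illustrative only, not the authors' full comparison across packages):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulate correlated binary responses: 100 subjects x 5 visits, with a
# subject-level random intercept inducing within-cluster correlation.
rng = np.random.default_rng(2)
n_subj, n_obs = 100, 5
b = rng.normal(0.0, 1.0, n_subj)                 # random intercepts
x = rng.normal(size=(n_subj, n_obs))
eta = -0.5 + 1.0 * x + b[:, None]                # linear predictor
y = (rng.random((n_subj, n_obs)) < 1 / (1 + np.exp(-eta))).astype(int)

df = pd.DataFrame({
    "y": y.ravel(),
    "x": x.ravel(),
    "subject": np.repeat(np.arange(n_subj), n_obs),
})

# Population-averaged fit via GEE with an exchangeable working correlation.
model = smf.gee("y ~ x", groups="subject", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```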

  15. Bioresorbable polymer coated drug eluting stent: a model study.

    PubMed

    Rossi, Filippo; Casalini, Tommaso; Raffa, Edoardo; Masi, Maurizio; Perale, Giuseppe

    2012-07-02

    In drug eluting stent technologies, an increased demand for better control, higher reliability, and enhanced performance of drug delivery systems has emerged in recent years, offering the opportunity to introduce model-based approaches aimed at overcoming the remarkable limits of trial-and-error methods. In this context a mathematical model was studied, based on detailed conservation equations and taking into account the main physical-chemical mechanisms involved in polymeric coating degradation, drug release, and restenosis inhibition. It highlights the interdependence between the factors affecting each of these phenomena and, in particular, the influence of stent design parameters on drug antirestenotic efficacy. The proposed model simulates diffusional release for both in vitro and in vivo conditions: results were verified against various literature data, confirming the reliability of the parameter estimation procedure. The hierarchical structure of the model also allows easy modification of the set of equations describing restenosis evolution, enhancing model reliability and taking advantage of the deep understanding of the physiological mechanisms governing the different stages of smooth muscle cell growth and proliferation. In addition, thanks to its simplicity and its very low system requirements and central processing unit (CPU) time, the model provides immediate views of system behavior.

  16. Future warming patterns linked to today’s climate variability

    DOE PAGES

    Dai, Aiguo

    2016-01-11

    The reliability of model projections of greenhouse gas (GHG)-induced future climate change is often assessed based on models’ ability to simulate the current climate, but there has been little evidence that connects the two. In fact, this practice has been questioned because the GHG-induced future climate change may involve additional physical processes that are not important for the current climate. Here I show that the spatial pattern of the GHG-induced future warming in the 21st century is highly correlated with the pattern of the year-to-year variations of surface air temperature for today’s climate, with areas of larger variations during 1950–1979 having more GHG-induced warming in the 21st century in all climate models. Such a relationship also exists in other climate fields such as atmospheric water vapor, and it is evident in observed temperatures from 1950–2010. The results suggest that many physical processes may work similarly in producing the year-to-year climate variations in the current climate and the GHG-induced long-term changes in the 21st century in models and in the real world. Furthermore, they support the notion that models that simulate present-day climate variability better are likely to make more reliable predictions of future climate change.

  17. Future warming patterns linked to today’s climate variability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Aiguo

    The reliability of model projections of greenhouse gas (GHG)-induced future climate change is often assessed based on models’ ability to simulate the current climate, but there has been little evidence that connects the two. In fact, this practice has been questioned because the GHG-induced future climate change may involve additional physical processes that are not important for the current climate. Here I show that the spatial pattern of the GHG-induced future warming in the 21st century is highly correlated with the pattern of the year-to-year variations of surface air temperature for today’s climate, with areas of larger variations during 1950–1979 having more GHG-induced warming in the 21st century in all climate models. Such a relationship also exists in other climate fields such as atmospheric water vapor, and it is evident in observed temperatures from 1950–2010. The results suggest that many physical processes may work similarly in producing the year-to-year climate variations in the current climate and the GHG-induced long-term changes in the 21st century in models and in the real world. Furthermore, they support the notion that models that simulate present-day climate variability better are likely to make more reliable predictions of future climate change.

  18. Dynamical downscaling of regional climate over eastern China using RSM with multiple physics scheme ensembles

    NASA Astrophysics Data System (ADS)

    Peishu, Zong; Jianping, Tang; Shuyu, Wang; Lingyun, Xie; Jianwei, Yu; Yunqian, Zhu; Xiaorui, Niu; Chao, Li

    2017-08-01

    The parameterization of physical processes is one of the critical elements in properly simulating the regional climate over eastern China. It is essential to conduct detailed analyses of the effect of physical parameterization schemes on regional climate simulation in order to provide more reliable regional climate change information. In this paper, we evaluate the 25-year (1983-2007) summer monsoon climate characteristics of precipitation and surface air temperature using the regional spectral model (RSM) with different physical schemes. Ensemble results obtained with the reliability ensemble averaging (REA) method are also assessed. The results show that the RSM has the capacity to reproduce the spatial patterns, variations, and temporal tendencies of surface air temperature and precipitation over eastern China, and it tends to predict climatological characteristics better over the Yangtze River basin and South China. The impact of different physical schemes on the RSM simulations is also investigated. Generally, the CLD3 cloud water prediction scheme tends to produce larger precipitation because of its overestimation of low-level moisture. The systematic biases derived from the KF2 cumulus scheme are larger than those from the RAS scheme. The scale-selective bias correction (SSBC) method improves the simulation of the temporal and spatial characteristics of surface air temperature and precipitation and advances the circulation simulation capacity. The REA ensemble results show significant improvement in simulating temperature and precipitation distributions, with much higher correlation coefficients and lower root mean square errors. The REA result from selected experiments is better than that from nonselected experiments, indicating the necessity of choosing better ensemble samples for ensemble averaging.
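
    The REA weighting itself can be sketched compactly (following the widely used Giorgi-Mearns formulation with a bias criterion and a convergence criterion; the threshold eps and the member values below are illustrative, not the study's):

```python
import numpy as np

def rea_average(sims, obs, eps, n_iter=20):
    """Reliability ensemble average of one scalar diagnostic (e.g. regional
    summer precipitation) from per-member values `sims` and observation `obs`;
    `eps` is the natural-variability threshold in the same units."""
    bias = np.abs(sims - obs)
    r_b = np.minimum(1.0, eps / np.maximum(bias, 1e-12))  # bias criterion
    avg = sims.mean()                                     # first guess: plain mean
    for _ in range(n_iter):
        # Convergence criterion: distance of each member from the REA mean,
        # which depends on the weights, hence the fixed-point iteration.
        dist = np.abs(sims - avg)
        r_d = np.minimum(1.0, eps / np.maximum(dist, 1e-12))
        r = r_b * r_d                                     # combined reliability factor
        avg = float((r * sims).sum() / r.sum())
    return avg, r

# Illustrative: five physics-scheme ensemble members (mm/day) vs. observation.
members = np.array([4.1, 5.6, 4.8, 7.2, 4.5])
avg, weights = rea_average(members, obs=4.7, eps=0.8)
print(avg, weights)
```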

  19. Hydraulic Fracturing and Production Optimization in Eagle Ford Shale Using Coupled Geomechanics and Fluid Flow Model

    NASA Astrophysics Data System (ADS)

    Suppachoknirun, Theerapat; Tutuncu, Azra N.

    2017-12-01

    With increasing production from shale gas and tight oil reservoirs, horizontal drilling and multistage hydraulic fracturing have become routine procedures in unconventional field development. Natural fractures play a critical role in hydraulic fracture growth, subsequently affecting the stimulated reservoir volume and production efficiency. Moreover, existing fractures can also contribute to pressure-dependent fluid leak-off during operations. Hence, reliable identification of the discrete fracture network covering the zone of interest, prior to the hydraulic fracturing design, needs to be incorporated into the hydraulic fracturing and reservoir simulations for a realistic representation of in situ reservoir conditions. In this research study, an integrated 3-D fracture and fluid flow model has been developed using a new approach to simulate fluid flow and deliver reliable production forecasts in naturally fractured and hydraulically stimulated tight reservoirs. The model was created with three key modules. A complex 3-D discrete fracture network model introduces realistic natural fracture geometry with the associated fractured reservoir characteristics. A hydraulic fracturing model utilizes the discrete fracture network to simulate the hydraulic fracture and the flow in the complex discrete fracture network. Finally, a reservoir model with a production grid system allows the user to efficiently perform fluid flow simulations in tight formations with complex fracture networks. The complex discrete natural fracture model, the integrated discrete fracture model for hydraulic fracturing, the fluid flow model, and the input dataset have been validated against microseismic fracture mapping and commingled production data obtained from a well pad with three horizontal production wells located in the Eagle Ford oil window in south Texas. Two other fracturing geometries were also evaluated to optimize cumulative production for the three wells individually. A significant reduction in production rate at early times is anticipated in tight reservoirs regardless of the fracturing technique implemented. Simulations using the alternating fracturing technique led to more oil production over a 20-year period than zipper fracturing. Yet, due to the decline experienced, the differences in cumulative production become smaller, and alternating fracturing is not practical to implement in the field, whereas zipper fracturing is more practical and widely used.

  20. Validating the simulation of large-scale parallel applications using statistical characteristics

    DOE PAGES

    Zhang, Deli; Wilke, Jeremiah; Hendry, Gilbert; ...

    2016-03-01

    Simulation is a widely adopted method to analyze and predict the performance of large-scale parallel applications. Validating the hardware model is highly important for complex simulations with a large number of parameters. Common practice involves calculating the percent error between the projected and the real execution time of a benchmark program. However, in a high-dimensional parameter space, this coarse-grained approach often suffers from parameter insensitivity, which may not be known a priori. Moreover, the traditional approach cannot be applied to the validation of software models, such as application skeletons used in online simulations. In this work, we present a methodology and a toolset for validating both hardware and software models by quantitatively comparing fine-grained statistical characteristics obtained from execution traces. Although statistical information has been used in tasks like performance optimization, this is the first attempt to apply it to simulation validation. Lastly, our experimental results show that the proposed evaluation approach offers significant improvement in fidelity when compared to evaluation using total execution time, and the proposed metrics serve as reliable criteria that progress toward automating the simulation tuning process.
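
    A minimal version of the fine-grained comparison (a two-sample Kolmogorov-Smirnov test on per-event characteristics extracted from traces, via SciPy; the synthetic latency data below are illustrative, not from the paper):

```python
import numpy as np
from scipy.stats import ks_2samp

# Illustrative per-event characteristics (e.g. message latencies in us)
# extracted from a real execution trace and from the simulator's trace.
rng = np.random.default_rng(3)
real_lat = rng.lognormal(mean=2.00, sigma=0.40, size=5000)
sim_lat = rng.lognormal(mean=2.05, sigma=0.45, size=5000)

# Total-time validation compares only the sums; a distributional test is
# sensitive to mismatches that cancel out in the aggregate.
stat, pvalue = ks_2samp(real_lat, sim_lat)
total_err = abs(sim_lat.sum() - real_lat.sum()) / real_lat.sum()
print(f"KS statistic={stat:.3f}, p={pvalue:.3g}, total-time error={total_err:.2%}")
```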

  1. Application of regional physically-based landslide early warning model: tuning of the input parameters and validation of the results

    NASA Astrophysics Data System (ADS)

    D'Ambrosio, Michele; Tofani, Veronica; Rossi, Guglielmo; Salvatici, Teresa; Tacconi Stefanelli, Carlo; Rosi, Ascanio; Benedetta Masi, Elena; Pazzi, Veronica; Vannocci, Pietro; Catani, Filippo; Casagli, Nicola

    2017-04-01

    The Aosta Valley region is located in the north-western Alpine mountain chain. The geomorphology of the region is characterized by steep slopes and high climatic and altitudinal variability (ranging from 400 m a.s.l. on the Dora Baltea river floodplain to 4810 m a.s.l. at Mont Blanc). In the study area (zone B), located in the eastern part of Aosta Valley, heavy rainfall of about 800-900 mm per year is the main landslide trigger. These features lead to a high hydrogeological risk across the territory, as mass movements affect 70% of the municipal areas (mainly shallow rapid landslides and rock falls). An in-depth study of the geotechnical and hydrological properties of the hillslopes controlling shallow landslide formation was conducted, with the aim of improving the reliability of the deterministic model HIRESSS (HIgh REsolution Slope Stability Simulator). In particular, two campaigns of on-site measurements and laboratory experiments were performed, and the data obtained were studied to assess the relationships among the different parameters and the bedrock lithology. The soils analyzed at 12 survey points are mainly composed of sand and gravel, with highly variable silt content. The measured ranges of effective internal friction angle (25.6° to 34.3°) and effective cohesion (0 kPa to 9.3 kPa) and the median ks (10E-6 m/s) are consistent with the average grain sizes (gravelly sand). The data collected contribute to the input parameter maps for HIRESSS (static data); further static data are volume weight, residual water content, porosity, and grain size index. To improve the original formulation of the model, the contribution of root cohesion has also been taken into account, based on the vegetation map and literature values. HIRESSS is a physically based distributed slope stability simulator for analyzing shallow landslide triggering conditions in real time over large areas using parallel computational techniques. The software runs in real time by assimilating weather data and uses Monte Carlo simulation techniques to manage the geotechnical and hydrological input parameters. In this context, an assessment of the factors controlling the geotechnical and hydrological features is crucial to understanding the occurrence of slope instability mechanisms and providing reliable forecasts of hydrogeological hazard, especially in relation to weather events. The model and the soil characterization were applied in back analysis to assess the reliability of the model by validating the results against landslide events that occurred during the study period. The validation was performed on four past events of intense rainfall that affected the Valle d'Aosta region between 2008 and 2010, triggering fast shallow landslides. The simulations show a substantial improvement in the reliability of the results compared with the use of literature parameters. A statistical analysis of the HIRESSS outputs in terms of failure probability has been carried out in order to define reliable alert levels for regional landslide early warning systems.
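
    The Monte Carlo treatment of geotechnical uncertainty can be illustrated with the classical infinite-slope factor of safety (a simplified stand-in for the full HIRESSS formulation; the friction angle and cohesion ranges echo the measured values above, while the other numbers are illustrative):

```python
import numpy as np

# Monte Carlo failure probability for an infinite slope: FS = resisting /
# driving forces, sampling the measured geotechnical parameter ranges.
rng = np.random.default_rng(4)
n = 100_000
phi = np.radians(rng.uniform(25.6, 34.3, n))   # effective friction angle (measured range)
c = rng.uniform(0.0, 9.3, n) * 1e3             # effective cohesion, Pa (measured range)
gamma, gamma_w = 19e3, 9.81e3                  # soil / water unit weight, N/m3 (illustrative)
z, beta = 1.5, np.radians(35.0)                # soil depth (m), slope angle (illustrative)
m = 0.8                                        # saturated fraction of the soil column

num = c + (gamma * z - m * gamma_w * z) * np.cos(beta) ** 2 * np.tan(phi)
den = gamma * z * np.sin(beta) * np.cos(beta)
fs = num / den
print(f"failure probability P(FS < 1) = {np.mean(fs < 1):.3f}")
```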

  2. A Co-modeling Method Based on Component Features for Mechatronic Devices in Aero-engines

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Zhao, Haocen; Ye, Zhifeng

    2017-08-01

    Data-fused and user-friendly design of aero-engine accessories is required because of their structural complexity and stringent reliability requirements. This paper gives an overview of a typical aero-engine control system and the development process of the key mechatronic devices used. Several essential aspects of modeling and simulation in this process are investigated. Considering the limitations of a single theoretical model, a feature-based co-modeling methodology is suggested to satisfy the design requirements and compensate for the diversity of component sub-models for these devices. As an example, a stepper-motor-controlled Fuel Metering Unit (FMU) is modeled in view of the component physical features using two different software tools, and an interface is suggested to integrate the single-discipline models into a synthesized one. Performance simulation of this device using the co-model and parameter optimization for its key components are discussed. Comparison between delivery testing and the simulation shows that the co-model of the FMU has high accuracy and a clear superiority over a single model. Together with its compatible interface with the engine mathematical model, the feature-based co-modeling methodology proves to be an effective technical measure in the development process of the device.

  3. Mapping the ecological networks of microbial communities.

    PubMed

    Xiao, Yandong; Angulo, Marco Tulio; Friedman, Jonathan; Waldor, Matthew K; Weiss, Scott T; Liu, Yang-Yu

    2017-12-11

    Mapping the ecological networks of microbial communities is a necessary step toward understanding their assembly rules and predicting their temporal behavior. However, existing methods require assuming a particular population dynamics model, which is not known a priori. Moreover, those methods require fitting longitudinal abundance data, which are often not informative enough for reliable inference. To overcome these limitations, here we develop a new method based on steady-state abundance data. Our method can infer the network topology and inter-taxa interaction types without assuming any particular population dynamics model. Additionally, when the population dynamics is assumed to follow the classic Generalized Lotka-Volterra model, our method can infer the inter-taxa interaction strengths and intrinsic growth rates. We systematically validate our method using simulated data, and then apply it to four experimental data sets. Our method represents a key step towards reliable modeling of complex, real-world microbial communities, such as the human gut microbiota.
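
    Under the Generalized Lotka-Volterra assumption mentioned above, the link between steady states and interactions is linear (standard GLV notation from the general literature):

    $$
    \frac{dx_i}{dt} = x_i\Bigl(r_i + \sum_{j=1}^{N} a_{ij}\,x_j\Bigr),
    $$

    so any steady state with $x_i^* > 0$ satisfies $r_i + \sum_j a_{ij} x_j^* = 0$. Each measured steady-state sample therefore contributes one linear constraint per present taxon on the unknowns $(r_i, a_{i1}, \dots, a_{iN})$, and samples spanning different species collections determine the interaction strengths and intrinsic growth rates by linear regression.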

  4. a Numerical Model for Flue Gas Desulfurization System.

    NASA Astrophysics Data System (ADS)

    Kim, Sung Joon

    The purpose of this work is to develop a reliable numerical model for spray dryer desulfurization systems. The shape of the spray dryer requires that a body-fitted orthogonal coordinate system be used for the numerical model. The governing equations are developed in general orthogonal coordinates and discretized to yield a system of algebraic equations. A turbulence model is also included in the numerical program, together with a new second-order numerical scheme developed for this model. The trajectory approach is used to simulate the flow of the dispersed phase, and two-way coupling phenomena are modeled by this scheme. The absorption of sulfur dioxide into lime slurry droplets is simulated by a model based on gas-phase mass transfer. The program is applied to a typical spray dryer desulfurization system, and the results show its capability to predict the sensitivity of system performance to changes in operational parameters.

  5. Adaptive Crack Modeling with Interface Solid Elements for Plain and Fiber Reinforced Concrete Structures.

    PubMed

    Zhan, Yijian; Meschke, Günther

    2017-07-08

    The effective analysis of the nonlinear behavior of cement-based engineering structures not only demands physically-reliable models, but also computationally-efficient algorithms. Based on a continuum interface element formulation that is suitable to capture complex cracking phenomena in concrete materials and structures, an adaptive mesh processing technique is proposed for computational simulations of plain and fiber-reinforced concrete structures to progressively disintegrate the initial finite element mesh and to add degenerated solid elements into the interfacial gaps. In comparison with the implementation where the entire mesh is processed prior to the computation, the proposed adaptive cracking model allows simulating the failure behavior of plain and fiber-reinforced concrete structures with remarkably reduced computational expense.

  6. Adaptive Crack Modeling with Interface Solid Elements for Plain and Fiber Reinforced Concrete Structures

    PubMed Central

    Zhan, Yijian

    2017-01-01

    The effective analysis of the nonlinear behavior of cement-based engineering structures not only demands physically-reliable models, but also computationally-efficient algorithms. Based on a continuum interface element formulation that is suitable to capture complex cracking phenomena in concrete materials and structures, an adaptive mesh processing technique is proposed for computational simulations of plain and fiber-reinforced concrete structures to progressively disintegrate the initial finite element mesh and to add degenerated solid elements into the interfacial gaps. In comparison with the implementation where the entire mesh is processed prior to the computation, the proposed adaptive cracking model allows simulating the failure behavior of plain and fiber-reinforced concrete structures with remarkably reduced computational expense. PMID:28773130

  7. On the convergence of a fully discrete scheme of LES type to physically relevant solutions of the incompressible Navier-Stokes

    NASA Astrophysics Data System (ADS)

    Berselli, Luigi C.; Spirito, Stefano

    2018-06-01

    Obtaining reliable numerical simulations of turbulent fluids is a challenging problem in computational fluid mechanics. The large eddy simulation (LES) models are efficient tools to approximate turbulent fluids, and an important step in the validation of these models is the ability to reproduce relevant properties of the flow. In this paper, we consider a fully discrete approximation of the Navier-Stokes-Voigt model by an implicit Euler algorithm (with respect to the time variable) and a Fourier-Galerkin method (in the space variables). We prove the convergence to weak solutions of the incompressible Navier-Stokes equations satisfying the natural local entropy condition, hence selecting the so-called physically relevant solutions.
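
    For reference, the Navier-Stokes-Voigt regularization considered above is usually written with a length-scale parameter α > 0 modifying the time derivative:

```latex
\begin{aligned}
\partial_t\bigl(u - \alpha^2 \Delta u\bigr) + (u\cdot\nabla)u + \nabla p &= \nu \Delta u, \\
\nabla\cdot u &= 0.
\end{aligned}
```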

  8. On the validation of cloud parametrization schemes in numerical atmospheric models with satellite data from ISCCP

    NASA Astrophysics Data System (ADS)

    Meinke, I.

    2003-04-01

    A new method is presented to validate cloud parametrization schemes in numerical atmospheric models with satellite data from scanning radiometers. This method is applied to the regional atmospheric model HRM (High Resolution Regional Model) using satellite data from ISCCP (International Satellite Cloud Climatology Project). The limited reliability of former validations has created a need for a new validation method: up to now, differences between simulated and measured cloud properties have mostly been declared deficiencies of the cloud parametrization scheme without further investigation. Other uncertainties connected with the model or with the measurements have not been taken into account, so changes in the cloud parametrization scheme based on such validations might not be realistic. The new method estimates the uncertainties of both the model and the measurements. Criteria for comparisons of simulated and measured data are derived to localize deficiencies in the model. For a better specification of these deficiencies, simulated clouds are classified according to their parametrization, and the localized model deficiencies are thereby allocated to a particular parametrization scheme. Applying this method to the regional model HRM, the quality of forecast cloud properties is estimated in detail. The overestimation of simulated clouds at low emissivity heights, especially during the night, is localized as a model deficiency caused by subscale cloudiness. As the simulation of subscale clouds in HRM is described by a relative humidity parametrization, these deficiencies are connected with that parametrization.

  9. A novel model for simulating the racing effect in capillary-driven underfill process in flip chip

    NASA Astrophysics Data System (ADS)

    Zhu, Wenhui; Wang, Kanglun; Wang, Yan

    2018-04-01

    Underfill is typically applied in flip chips to increase the reliability of electronic packaging. In this paper, the evolution of the melt-front shape of the capillary-driven underfill flow is studied through 3D numerical analysis. Two different models, the prevailing surface force model and the capillary model based on the wetted-wall boundary condition, are introduced to test their applicability, with the level set method used to track the interface of the two-phase flow. The comparison between the simulation results and experimental data indicates that the surface force model produces better predictions of the melt-front shape, especially in the central area of the flip chip. Nevertheless, neither model can properly simulate the racing effect phenomenon that appears during underfill encapsulation. A novel ‘dynamic pressure boundary condition’ method is therefore proposed based on the validated surface force model. Using this approach, the racing effect phenomenon is simulated with high precision. In addition, a linear relationship is derived from this model between the flow-front location at the edge of the flip chip and the filling time. Using the proposed approach, the impact of the underfill-dispensing length on the melt-front shape is also studied.
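
    For reference, the level set method mentioned above represents the melt front as the zero contour of a scalar field φ advected passively by the flow velocity u, in the standard form:

```latex
\frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi = 0,
\qquad \text{melt front} = \{\mathbf{x} : \phi(\mathbf{x},t) = 0\}.
```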

  10. Efficient EM Simulation of GCPW Structures Applied to a 200-GHz mHEMT Power Amplifier MMIC

    NASA Astrophysics Data System (ADS)

    Campos-Roca, Yolanda; Amado-Rey, Belén; Wagner, Sandrine; Leuther, Arnulf; Bangert, Axel; Gómez-Alcalá, Rafael; Tessmann, Axel

    2017-05-01

    The behaviour of grounded coplanar waveguide (GCPW) structures in the upper millimeter-wave range is analyzed by using full-wave electromagnetic (EM) simulations. A methodological approach to develop reliable and time-efficient simulations is proposed by investigating the impact of different simplifications in the EM modelling and simulation conditions. After experimental validation with measurements on test structures, this approach has been used to model the most critical passive structures involved in the layout of a state-of-the-art 200-GHz power amplifier based on metamorphic high electron mobility transistors (mHEMTs). This millimeter-wave monolithic integrated circuit (MMIC) has demonstrated a measured output power of 8.7 dBm for an input power of 0 dBm at 200 GHz. The measured output power density and power-added efficiency (PAE) are 46.3 mW/mm and 4.5 %, respectively. The peak measured small-signal gain is 12.7 dB (obtained at 196 GHz). A good agreement has been obtained between measurements and simulation results.

  11. Difficulties in applying numerical simulations to an evaluation of occupational hazards caused by electromagnetic fields

    PubMed Central

    Zradziński, Patryk

    2015-01-01

    Due to the various physical mechanisms of interaction between a worker's body and the electromagnetic field at various frequencies, the principles of numerical simulations have been discussed for three areas of worker exposure: to low frequency magnetic field, to low and intermediate frequency electric field and to radiofrequency electromagnetic field. This paper presents the identified difficulties in applying numerical simulations to evaluate physical estimators of direct and indirect effects of exposure to electromagnetic fields at various frequencies. Exposure of workers operating a plastic sealer have been taken as an example scenario of electromagnetic field exposure at the workplace for discussion of those difficulties in applying numerical simulations. The following difficulties in reliable numerical simulations of workers’ exposure to the electromagnetic field have been considered: workers’ body models (posture, dimensions, shape and grounding conditions), working environment models (objects most influencing electromagnetic field distribution) and an analysis of parameters for which exposure limitations are specified in international guidelines and standards. PMID:26323781

  12. Genetic data simulators and their applications: an overview

    PubMed Central

    Peng, Bo; Chen, Huann-Sheng; Mechanic, Leah E.; Racine, Ben; Clarke, John; Gillanders, Elizabeth; Feuer, Eric J.

    2016-01-01

    Computer simulations have played an indispensable role in the development and application of statistical models and methods for genetic studies across multiple disciplines. The need to simulate complex evolutionary scenarios and pseudo-datasets for various studies has fueled the development of dozens of computer programs with varying reliability, performance, and application areas. To help researchers compare and choose the most appropriate simulators for their studies, we have created the Genetic Simulation Resources (GSR) website, which allows authors of simulation software to register their applications and describe them with more than 160 defined attributes. This article summarizes the properties of 93 simulators currently registered at GSR and provides an overview of the development and applications of genetic simulators. Unlike other review articles that address technical issues or compare simulators for particular application areas, we focus on software development, maintenance, and features of simulators, often from a historical perspective. Publications that cite these simulators are used to summarize both the applications of genetic simulations and the utilization of simulators. PMID:25504286

  13. Incorporation of RAM techniques into simulation modeling

    NASA Astrophysics Data System (ADS)

    Nelson, S. C., Jr.; Haire, M. J.; Schryver, J. C.

    1995-01-01

    This work concludes that reliability, availability, and maintainability (RAM) analytical techniques can be incorporated into computer network simulation modeling to yield an important new analytical tool. This paper describes the incorporation of failure and repair information into network simulation to build a stochastic computer model to represent the RAM performance of two vehicles being developed for the US Army: the Advanced Field Artillery System (AFAS) and the Future Armored Resupply Vehicle (FARV). The AFAS is the US Army's next-generation self-propelled cannon artillery system. The FARV is a resupply vehicle for the AFAS. Both vehicles utilize automation technologies to improve operational performance and reduce manpower. The network simulation model used in this work is task based. The model programmed in this application represents a typical battle mission and the failures and repairs that occur during that battle. Each task that the FARV performs--upload, travel to the AFAS, refuel, perform tactical/survivability moves, return to logistic resupply, etc.--is modeled. Such a model reproduces operational phenomena (e.g., failures and repairs) that are likely to occur in actual performance. Simulation tasks are modeled as discrete chronological steps; after the completion of each task, decisions are programmed that determine the next path to be followed. The result is a complex logic diagram or network. The network simulation model is developed within a hierarchy of vehicle systems, subsystems, and equipment and includes failure management subnetworks. RAM information and other performance measures that have an impact on design requirements are collected. Design changes are evaluated through 'what if' questions, sensitivity studies, and battle scenario changes.
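
    A minimal sketch of the stochastic task-network idea, assuming exponential failure and repair times; the task list, MTBF, and MTTR values below are invented placeholders, not data from the study.

```python
import random

# One replication walks a chronological task sequence; failures interrupt
# tasks at exponential intervals and each adds an exponential repair delay.
TASKS = [("upload", 30), ("travel to AFAS", 20), ("refuel", 15),
         ("tactical move", 10), ("return to resupply", 20)]   # minutes (assumed)
MTBF = 600.0   # mean operating time between failures, min (assumed)
MTTR = 45.0    # mean time to repair, min (assumed)

def mission_time(rng: random.Random) -> float:
    total = 0.0
    for _, duration in TASKS:
        remaining = duration
        while remaining > 0:
            to_failure = rng.expovariate(1.0 / MTBF)
            if to_failure >= remaining:      # task finishes without failure
                total += remaining
                remaining = 0.0
            else:                            # failure mid-task: repair, resume
                total += to_failure + rng.expovariate(1.0 / MTTR)
                remaining -= to_failure
    return total

rng = random.Random(1)
times = [mission_time(rng) for _ in range(10_000)]
print(f"mean mission time: {sum(times) / len(times):.1f} min")
```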

  14. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based on the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for the structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also about 0.0001.

  15. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based on the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for the structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also about 0.0001.
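
    The following is a minimal Monte Carlo sketch of the kind of probabilistic reliability evaluation described in these two records; the normal distributions and their parameters are invented so that the estimated failure probability lands near the reported order of 0.0001, and this is not the multifactor interaction model itself.

```python
import numpy as np

# Sample uncertain strength and load, count limit-state violations.
rng = np.random.default_rng(42)
n = 2_000_000
strength = rng.normal(1000.0, 60.0, n)   # MPa, assumed scatter
stress   = rng.normal(700.0, 55.0, n)    # MPa, assumed service-load scatter
risk = np.mean(stress > strength)        # P(limit state violated)
print(f"estimated failure probability: {risk:.2e}")
```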

  16. Development Status of Low-Shock Payload Separation Mechanism for H-IIA Launch Vehicle

    NASA Astrophysics Data System (ADS)

    Terashima, Keita; Kamita, Toru; Horie, Youichi; Kobayashi, Masakazu; Onikura, Hiroki

    2013-09-01

    This paper presents the design, analysis, and test results of the low-shock payload separation mechanism for the H-IIA launch vehicle. The mechanism is based on a simple and reliable four-bar linkage, which makes the release speed of the Marman clamp band tension lower than in the current system. The adequacy of the principle of the low-shock mechanism was evaluated by simulations and the results of fundamental tests. We then established the reliability design model of this mechanism, and the adequacy of this model was evaluated by elemental tests. Finally, we conducted system separation tests using the payload adapter to which the mechanism was assembled, to confirm that the actual separation shock level satisfied our target.

  17. Nesting large-eddy simulations within mesoscale simulations for wind energy applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundquist, J K; Mirocha, J D; Chow, F K

    2008-09-08

    With increasing demand for more accurate atmospheric simulations for wind turbine micrositing, for operational wind power forecasting, and for more reliable turbine design, simulations of atmospheric flow with resolution of tens of meters or higher are required. These time-dependent large-eddy simulations (LES), which resolve individual atmospheric eddies on length scales smaller than turbine blades and account for complex terrain, are possible with a range of commercial and open-source software, including the Weather Research and Forecasting (WRF) model. In addition to 'local' sources of turbulence within an LES domain, changing weather conditions outside the domain can also affect the flow, suggesting that a mesoscale model should provide boundary conditions to the large-eddy simulations. Nesting a large-eddy simulation within a mesoscale model requires nuanced representations of turbulence. Our group has improved WRF's LES capability by implementing the Nonlinear Backscatter and Anisotropy (NBA) subfilter stress model following Kosovic (1997) and an explicit filtering and reconstruction technique to compute the Resolvable Subfilter-Scale (RSFS) stresses (following Chow et al., 2005). We have also implemented an immersed boundary method (IBM) in WRF to accommodate complex terrain. These new models improve WRF's LES capabilities over complex terrain and in stable atmospheric conditions. We demonstrate approaches to nesting LES within a mesoscale simulation for farms of wind turbines in hilly regions. Results are sensitive to the nesting method, indicating that care must be taken to provide appropriate boundary conditions and to allow adequate spin-up of turbulence in the LES domain.

  18. Simulated CONUS Flash Flood Climatologies from Distributed Hydrologic Models

    NASA Astrophysics Data System (ADS)

    Flamig, Z.; Gourley, J. J.; Vergara, H. J.; Kirstetter, P. E.; Hong, Y.

    2016-12-01

    This study will describe a CONUS flash flood climatology created over the period from 2002 through 2011. The MRMS reanalysis precipitation dataset was used as forcing for the Ensemble Framework For Flash Flood Forecasting (EF5). This high-resolution 1-sq-km, 5-minute dataset is ideal for simulating flash floods with a distributed hydrologic model. EF5 features multiple water balance components, including SAC-SMA, CREST, and a hydrophobic model, all coupled with kinematic wave routing. The EF5/SAC-SMA and EF5/CREST water balance schemes were used to create dual flash flood climatologies based on the differing water balance principles. For the period from 2002 through 2011, the daily maximum streamflow, unit streamflow, and time of peak streamflow were stored along with the minimum soil moisture. These variables describe the state of the soils right before a flash flood event and the peak streamflow simulated during the event. The results will be shown, compared, and contrasted. The resulting model simulations will be verified on basins smaller than 1,000 sq km with USGS gauges to ensure the distributed hydrologic models are reliable. The results will also be compared spatially to Storm Data flash flood event observations to judge the degree of agreement between the simulated climatologies and observations.

  19. State-of-the-Art Review on Physiologically Based Pharmacokinetic Modeling in Pediatric Drug Development.

    PubMed

    Yellepeddi, Venkata; Rower, Joseph; Liu, Xiaoxi; Kumar, Shaun; Rashid, Jahidur; Sherwin, Catherine M T

    2018-05-18

    Physiologically based pharmacokinetic modeling and simulation is an important tool for predicting the pharmacokinetics, pharmacodynamics, and safety of drugs in pediatrics. Physiologically based pharmacokinetic modeling is applied in pediatric drug development for first-time-in-pediatric dose selection, simulation-based trial design, correlation with target organ toxicities, risk assessment by investigating possible drug-drug interactions, real-time assessment of pharmacokinetic-safety relationships, and assessment of non-systemic biodistribution targets. This review summarizes the details of a physiologically based pharmacokinetic modeling approach in pediatric drug research, emphasizing reports on pediatric physiologically based pharmacokinetic models of individual drugs. We also compare and contrast the strategies employed by various researchers in pediatric physiologically based pharmacokinetic modeling and provide a comprehensive overview of physiologically based pharmacokinetic modeling strategies and approaches in pediatrics. We discuss the impact of physiologically based pharmacokinetic models on regulatory reviews and product labels in the field of pediatric pharmacotherapy. Additionally, we examine in detail the current limitations and future directions of physiologically based pharmacokinetic modeling in pediatrics with regard to the ability to predict plasma concentrations and pharmacokinetic parameters. Despite the skepticism and concern in the pediatric community about the reliability of physiologically based pharmacokinetic models, there is substantial evidence that pediatric physiologically based pharmacokinetic models have been used successfully to predict differences in pharmacokinetics between adults and children for several drugs. It is obvious that the use of physiologically based pharmacokinetic modeling to support various stages of pediatric drug development is highly attractive and will rapidly increase, provided the robustness and reliability of these techniques are well established.

  20. FE Simulation Models for Hot Stamping an Automobile Component with Tailor-Welded High-Strength Steels

    NASA Astrophysics Data System (ADS)

    Tang, Bingtao; Wang, Qiaoling; Wei, Zhaohui; Meng, Xianju; Yuan, Zhengjun

    2016-05-01

    Ultra-high strength in sheet metal parts can be achieved with the hot stamping process. To improve crash performance and save vehicle weight, it is necessary to produce components with tailored properties. The use of tailor-welded high-strength steel is a relatively new hot stamping process for saving weight and obtaining the desired local stiffness and crash performance. The simulation of hot stamping boron steel, especially tailor-welded blank (TWB) stamping, is complex and challenging: detailed information is required about the thermal/mechanical properties of the tools and sheet materials, heat transfer, and friction between the deforming material and the tools. In this study, the boron-manganese steel B1500HS and the high-strength low-alloy steel B340LA are tailor welded and hot stamped. In order to simulate the hot stamping process precisely, the modeling and simulation of hot stamping tailor-welded high-strength steels, including phase transformation modeling, thermal modeling, and thermal-mechanical modeling, is investigated. Meanwhile, the model of the welding zone of the tailor-welded blanks must describe the thermal, mechanical, and metallurgical parameters with sufficient accuracy. An FE simulation model is established for TWBs with a thickness combination of 1.6 mm boron steel and 1.2 mm low-alloy steel. In order to evaluate the mechanical properties of the hot stamped automotive component (a mini b-pillar), the hardness and microstructure at each region are investigated. The comparisons between simulated results and experimental observations show the reliability of the thermo-mechanical and metallurgical modeling strategies for the TWB hot stamping process.

  1. Sensitivity of tumor motion simulation accuracy to lung biomechanical modeling approaches and parameters.

    PubMed

    Tehrani, Joubin Nasehi; Yang, Yin; Werner, Rene; Lu, Wei; Low, Daniel; Guo, Xiaohu; Wang, Jing

    2015-11-21

    Finite element analysis (FEA)-based biomechanical modeling can be used to predict lung respiratory motion. In this technique, elastic models and biomechanical parameters are two important factors that determine modeling accuracy. We systematically evaluated the effects of lung and lung tumor biomechanical modeling approaches and related parameters to improve the accuracy of motion simulation of lung tumor center of mass (TCM) displacements. Experiments were conducted with four-dimensional computed tomography (4D-CT). A Quasi-Newton FEA was performed to simulate lung and related tumor displacements between end-expiration (phase 50%) and other respiration phases (0%, 10%, 20%, 30%, and 40%). Both linear isotropic and non-linear hyperelastic materials, including the neo-Hookean compressible and uncoupled Mooney-Rivlin models, were used to create a finite element model (FEM) of lung and tumors. Lung surface displacement vector fields (SDVFs) were obtained by registering the 50% phase CT to other respiration phases, using the non-rigid demons registration algorithm. The obtained SDVFs were used as lung surface displacement boundary conditions in FEM. The sensitivity of TCM displacement to lung and tumor biomechanical parameters was assessed in eight patients for all three models. Patient-specific optimal parameters were estimated by minimizing the TCM motion simulation errors between phase 50% and phase 0%. The uncoupled Mooney-Rivlin material model showed the highest TCM motion simulation accuracy. The average TCM motion simulation absolute errors for the Mooney-Rivlin material model along left-right, anterior-posterior, and superior-inferior directions were 0.80 mm, 0.86 mm, and 1.51 mm, respectively. The proposed strategy provides a reliable method to estimate patient-specific biomechanical parameters in FEM for lung tumor motion simulation.
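
    As a reference for the hyperelastic models compared above, one common uncoupled Mooney-Rivlin strain-energy density is given below; the volumetric term varies between implementations, and a logarithmic form is assumed here. The constants c1 and c2 play the role of the patient-specific material parameters estimated by minimizing the TCM simulation error.

```latex
W = c_1\,(\bar{I}_1 - 3) + c_2\,(\bar{I}_2 - 3) + \frac{K}{2}\,(\ln J)^2,
\qquad \bar{I}_1 = J^{-2/3} I_1,\quad \bar{I}_2 = J^{-4/3} I_2,\quad J = \det F.
```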

  2. Sensitivity of Tumor Motion Simulation Accuracy to Lung Biomechanical Modeling Approaches and Parameters

    PubMed Central

    Tehrani, Joubin Nasehi; Yang, Yin; Werner, Rene; Lu, Wei; Low, Daniel; Guo, Xiaohu

    2015-01-01

    Finite element analysis (FEA)-based biomechanical modeling can be used to predict lung respiratory motion. In this technique, elastic models and biomechanical parameters are two important factors that determine modeling accuracy. We systematically evaluated the effects of lung and lung tumor biomechanical modeling approaches and related parameters to improve the accuracy of motion simulation of lung tumor center of mass (TCM) displacements. Experiments were conducted with four-dimensional computed tomography (4D-CT). A Quasi-Newton FEA was performed to simulate lung and related tumor displacements between end-expiration (phase 50%) and other respiration phases (0%, 10%, 20%, 30%, and 40%). Both linear isotropic and non-linear hyperelastic materials, including the Neo-Hookean compressible and uncoupled Mooney-Rivlin models, were used to create a finite element model (FEM) of lung and tumors. Lung surface displacement vector fields (SDVFs) were obtained by registering the 50% phase CT to other respiration phases, using the non-rigid demons registration algorithm. The obtained SDVFs were used as lung surface displacement boundary conditions in FEM. The sensitivity of TCM displacement to lung and tumor biomechanical parameters was assessed in eight patients for all three models. Patient-specific optimal parameters were estimated by minimizing the TCM motion simulation errors between phase 50% and phase 0%. The uncoupled Mooney-Rivlin material model showed the highest TCM motion simulation accuracy. The average TCM motion simulation absolute errors for the Mooney-Rivlin material model along left-right (LR), anterior-posterior (AP), and superior-inferior (SI) directions were 0.80 mm, 0.86 mm, and 1.51 mm, respectively. The proposed strategy provides a reliable method to estimate patient-specific biomechanical parameters in FEM for lung tumor motion simulation. PMID:26531324

  3. The Development of Dynamic Human Reliability Analysis Simulations for Inclusion in Risk Informed Safety Margin Characterization Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey C. Joe; Diego Mandelli; Ronald L. Boring

    2015-07-01

    The United States Department of Energy is sponsoring the Light Water Reactor Sustainability program, which has the overall objective of supporting the near-term and the extended operation of commercial nuclear power plants. One key research and development (R&D) area in this program is the Risk-Informed Safety Margin Characterization pathway, which combines probabilistic risk simulation with thermohydraulic simulation codes to define and manage safety margins. The R&D efforts to date, however, have not included robust simulations of human operators, and how the reliability of human performance or lack thereof (i.e., human errors) can affect risk-margins and plant performance. This paper describes current and planned research efforts to address the absence of robust human reliability simulations and thereby increase the fidelity of simulated accident scenarios.

  4. Simulation of Grouting Process in Rock Masses Under a Dam Foundation Characterized by a 3D Fracture Network

    NASA Astrophysics Data System (ADS)

    Deng, Shaohui; Wang, Xiaoling; Yu, Jia; Zhang, Yichi; Liu, Zhen; Zhu, Yushan

    2018-06-01

    Grouting plays a crucial role in dam safety. Due to the concealment of grouting activities, the complexity of fracture distribution in rock masses, and the rheological properties of cement grout, it is difficult to analyze the effects of grouting. In this paper, a computational fluid dynamics (CFD) simulation approach for dam foundation grouting based on a 3D fracture network model is proposed. In this approach, the 3D fracture network model, which is based on an improved bootstrap sampling method and established with the VisualGeo software, provides a reliable and accurate geometric model for CFD simulation of dam foundation grouting. Based on this model, a CFD simulation is performed in which the Papanastasiou regularized model is used to express the grout rheological properties, and the volume of fluid technique is utilized to capture the grout fronts. Two sets of tests are performed to verify the effectiveness of the Papanastasiou regularized model. The CFD simulation approach resolves three technical issues in dam foundation grouting: (1) the collapsing potential of the fracture samples, (2) inconsistencies between the geometric model and the actual fractures under complex geological conditions, and (3) inappropriate characterization of the rheological properties of cement grout. The applicability of the proposed approach is demonstrated by an illustrative case study of a hydropower station dam foundation in southwestern China.
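
    For reference, the Papanastasiou regularization replaces the non-smooth Bingham yield behavior of the grout with an apparent viscosity that stays well defined at vanishing shear rate (with plastic viscosity μ_p, yield stress τ_y, and regularization parameter m):

```latex
\eta(\dot{\gamma}) \;=\; \mu_p \;+\; \frac{\tau_y\left(1 - e^{-m\dot{\gamma}}\right)}{\dot{\gamma}}.
```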

  5. 76 FR 66220 - Automatic Underfrequency Load Shedding and Load Shedding Plans Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-26

    ..., EPRI Power Systems Dynamics Tutorial, Chapter 4 at page 4-78 (2009), available at http://www.epri.com.... Power systems consist of static components (e.g., transformers and transmission lines) and dynamic... decisions on simulations, both static and dynamic, using area power system models to meet requirements in...

  6. Empirically Derived Optimal Growth Equations For Hardwoods and Softwoods in Arkansas

    Treesearch

    Don C. Bragg

    2002-01-01

    Accurate growth projections are critical to reliable forest models, and ecologically based simulators can improve silvicultural predictions because of their sensitivity to change and their capacity to produce long-term forecasts. Potential relative increment (PRI) optimal diameter growth equations for loblolly pine, shortleaf pine, sweetgum, and white oak were fit to...

  7. On the Simulation of Sea States with High Significant Wave Height for the Validation of Parameter Retrieval Algorithms for Future Altimetry Missions

    NASA Astrophysics Data System (ADS)

    Kuschenerus, Mieke; Cullen, Robert

    2016-08-01

    To ensure the reliability and precision of wave height estimates for future satellite altimetry missions such as Sentinel 6, reliable parameter retrieval algorithms that can extract significant wave heights up to 20 m have to be established, and the retrieval methods need to be validated extensively over a wide range of possible significant wave heights. Although current missions require wave height retrievals up to 20 m, there is little evidence of systematic validation of parameter retrieval methods for sea states with wave heights above 10 m. This paper defines a set of simulated sea states with significant wave heights up to 20 m that allow simulation of radar altimeter response echoes for extreme sea states in SAR and low resolution mode. The simulated radar responses are used to derive significant wave height estimates, which can be compared with the initial models, allowing the precision of the applied parameter retrieval methods to be estimated. We thus establish a validation method for significant wave height retrieval for sea states with high significant wave heights, to allow improved understanding and planning of future satellite altimetry mission validation.
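
    A minimal sketch of how such a sea state can be synthesized, assuming a Pierson-Moskowitz-type spectral shape scaled to a target significant wave height (all parameters invented for illustration); the simulated surface then satisfies Hs = 4*sqrt(m0), with m0 the zeroth spectral moment.

```python
import numpy as np

# Random-phase synthesis of a sea-surface elevation time series from a
# wave spectrum scaled to a target significant wave height.
rng = np.random.default_rng(7)
Hs_target, Tp = 15.0, 16.0              # m, s (extreme sea state, assumed)
f = np.linspace(0.005, 0.5, 400)        # frequency grid, Hz
fp = 1.0 / Tp
S = f**-5 * np.exp(-1.25 * (fp / f)**4) # Pierson-Moskowitz-type shape (unscaled)
df = f[1] - f[0]
S *= (Hs_target / (4.0 * np.sqrt(np.sum(S) * df)))**2   # scale so Hs = 4*sqrt(m0)

t = np.linspace(0, 3600, 3600)          # 1 h record at 1 Hz
phases = rng.uniform(0, 2 * np.pi, f.size)
amps = np.sqrt(2 * S * df)              # component amplitudes
eta = (amps[:, None] * np.cos(2 * np.pi * f[:, None] * t[None, :]
                              + phases[:, None])).sum(axis=0)

m0 = np.sum(S) * df
print(f"Hs from spectrum: {4 * np.sqrt(m0):.2f} m, "
      f"Hs from series: {4 * eta.std():.2f} m")
```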

  8. Radio Evolution of Supernova Remnants Including Nonlinear Particle Acceleration: Insights from Hydrodynamic Simulations

    NASA Astrophysics Data System (ADS)

    Pavlović, Marko Z.; Urošević, Dejan; Arbutina, Bojan; Orlando, Salvatore; Maxted, Nigel; Filipović, Miroslav D.

    2018-01-01

    We present a model for the radio evolution of supernova remnants (SNRs) obtained by using three-dimensional hydrodynamic simulations coupled with nonlinear kinetic theory of cosmic-ray (CR) acceleration in SNRs. We model the radio evolution of SNRs on a global level by performing simulations for a wide range of the relevant physical parameters, such as the ambient density, supernova (SN) explosion energy, acceleration efficiency, and magnetic field amplification (MFA) efficiency. We attribute the observed spread of radio surface brightnesses for corresponding SNR diameters to the spread of these parameters. In addition to our simulations of Type Ia SNRs, we also considered SNR radio evolution in denser, nonuniform circumstellar environments modified by the progenitor star wind. These simulations start with an ejecta mass substantially higher than in the case of a Type Ia SN and a presumably lower shock speed. The magnetic field is understandably seen as very important for the radio evolution of SNRs. In terms of MFA, we include both resonant and nonresonant modes in our large-scale simulations by implementing models obtained from first-principles particle-in-cell simulations and nonlinear magnetohydrodynamical simulations. We test the quality and reliability of our models on a sample consisting of Galactic and extragalactic SNRs. Our simulations give Sigma-D slopes between -4 and -6 for the full Sedov regime. Recent empirical slopes obtained for Galactic samples are around -5, while those for extragalactic samples are around -4.
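
    For context, the Sigma-D relation expresses radio surface brightness at a given frequency as a power law in remnant diameter, with the slope range quoted above:

```latex
\Sigma_{\nu}(D) = A\,D^{\beta}, \qquad -6 \lesssim \beta \lesssim -4 \;\;\text{(full Sedov regime)}.
```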

  9. A computational workflow for designing silicon donor qubits

    DOE PAGES

    Humble, Travis S.; Ericson, M. Nance; Jakowski, Jacek; ...

    2016-09-19

    Developing devices that can reliably and accurately demonstrate the principles of superposition and entanglement is an ongoing challenge for the quantum computing community. Modeling and simulation offer attractive means of testing early device designs and establishing expectations for operational performance. However, the complex integrated material systems required by quantum device designs are not captured by any single existing computational modeling method. We examine the development and analysis of a multi-staged computational workflow that can be used to design and characterize silicon donor qubit systems with modeling and simulation. Our approach integrates quantum chemistry calculations with electrostatic field solvers to perform detailed simulations of a phosphorus dopant in silicon. We show how atomistic details can be synthesized into an operational model for the logical gates that define quantum computation in this particular technology. The resulting computational workflow realizes a design tool for silicon donor qubits that can help verify and validate current and near-term experimental devices.

  10. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic representation of simulation entities, its data analysis capability, and its robust configuration management process, JIMM can support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM), capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative but could also provide the framework for the next generation of MLMs.

  11. Development of a Numerical Model for Orthogonal Cutting. Discussion about the Sensitivity to Friction Problem

    NASA Astrophysics Data System (ADS)

    San Juan, M.; de la Iglesia, J. M.; Martín, O.; Santos, F. J.

    2009-11-01

    Despite the important progress achieved in the understanding of cutting processes, the study of certain aspects has suffered from the limitations of the experimental means: temperature gradients, friction, contact, etc. The development of numerical models is therefore a valid first approach to the study of those problems. In the present work, a calculation model using the Abaqus/Explicit code is developed to represent the orthogonal cutting of AISI 4140 steel. A two-dimensional plane-strain simulation, treated as adiabatic due to the high speed of the material flow, is chosen. Chip separation is defined by means of a fracture law that allows complex simulations of tool penetration into the workpiece. The strong influence of friction on cutting is demonstrated: even with very well defined material behaviour laws, an erroneous value of the friction coefficient can notably reduce the reliability of the results. Considering the difficulty of checking the friction models used in the simulation against the tests habitually carried out, the most effective way to characterize friction would be to combine simulation models with cutting tests.

  12. Data-Driven Risk Assessment from Small Scale Epidemics: Estimation and Model Choice for Spatio-Temporal Data with Application to a Classical Swine Fever Outbreak

    PubMed Central

    Gamado, Kokouvi; Marion, Glenn; Porphyre, Thibaud

    2017-01-01

    Livestock epidemics have the potential to give rise to significant economic, welfare, and social costs. Incursions of emerging and re-emerging pathogens may lead to small and repeated outbreaks. Analysis of the resulting data is statistically challenging but can inform disease preparedness reducing potential future losses. We present a framework for spatial risk assessment of disease incursions based on data from small localized historic outbreaks. We focus on between-farm spread of livestock pathogens and illustrate our methods by application to data on the small outbreak of Classical Swine Fever (CSF) that occurred in 2000 in East Anglia, UK. We apply models based on continuous time semi-Markov processes, using data-augmentation Markov Chain Monte Carlo techniques within a Bayesian framework to infer disease dynamics and detection from incompletely observed outbreaks. The spatial transmission kernel describing pathogen spread between farms, and the distribution of times between infection and detection, is estimated alongside unobserved exposure times. Our results demonstrate inference is reliable even for relatively small outbreaks when the data-generating model is known. However, associated risk assessments depend strongly on the form of the fitted transmission kernel. Therefore, for real applications, methods are needed to select the most appropriate model in light of the data. We assess standard Deviance Information Criteria (DIC) model selection tools and recently introduced latent residual methods of model assessment, in selecting the functional form of the spatial transmission kernel. These methods are applied to the CSF data, and tested in simulated scenarios which represent field data, but assume the data generation mechanism is known. Analysis of simulated scenarios shows that latent residual methods enable reliable selection of the transmission kernel even for small outbreaks whereas the DIC is less reliable. Moreover, compared with DIC, model choice based on latent residual assessment correlated better with predicted risk. PMID:28293559
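
    To illustrate why the form of the spatial transmission kernel matters for risk assessment, here is a sketch comparing the infection pressure implied by an exponential and a fat-tailed power-law kernel on synthetic farm locations; both kernel forms and all parameters are invented for illustration rather than taken from the CSF fit.

```python
import numpy as np

# The hazard on each susceptible farm is the kernel summed over
# currently infectious farms; kernel shape drives the risk map.
def kernel_exp(d, beta0=1.0, alpha=5.0):        # exponential decay with distance
    return beta0 * np.exp(-d / alpha)

def kernel_pow(d, beta0=1.0, alpha=5.0, p=2.0): # fat-tailed power law
    return beta0 / (1.0 + (d / alpha)**p)

rng = np.random.default_rng(3)
farms = rng.uniform(0, 50, size=(200, 2))       # km, synthetic farm locations
infectious = rng.choice(200, 5, replace=False)

d = np.linalg.norm(farms[:, None, :] - farms[infectious][None, :, :], axis=2)
for name, K in [("exponential", kernel_exp), ("power-law", kernel_pow)]:
    pressure = K(d).sum(axis=1)
    print(f"{name:12s} kernel: mean pressure {pressure.mean():.3f}, "
          f"max {pressure.max():.3f}")
```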

  13. Smoldyn on graphics processing units: massively parallel Brownian dynamics simulations.

    PubMed

    Dematté, Lorenzo

    2012-01-01

    Space is a very important aspect in the simulation of biochemical systems; recently, the need for simulation algorithms able to cope with space has become more and more compelling. Complex and detailed models of biochemical systems need to deal with the movement of single molecules and particles, taking into consideration localized fluctuations, transport phenomena, and diffusion. A common drawback of spatial models lies in their complexity: models can become very large, and their simulation can be time consuming, especially if we want to capture the system's behavior in a reliable way using stochastic methods in conjunction with a high spatial resolution. In order to deliver on the promise made by systems biology of understanding a system as a whole, we need to scale up the size of the models we are able to simulate, moving from sequential to parallel simulation algorithms. In this paper, we analyze Smoldyn, a widely used algorithm for stochastic simulation of chemical reactions with spatial resolution and single-molecule detail, and we propose an alternative, innovative implementation that exploits the parallelism of Graphics Processing Units (GPUs). The implementation executes the most computationally demanding steps (computation of diffusion, unimolecular and bimolecular reactions, as well as the most common cases of molecule-surface interaction) on the GPU, computing them in parallel for each molecule of the system. The implementation offers good speed-ups and real-time, high-quality graphics output.
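
    A minimal sketch of the per-molecule diffusion step that dominates such simulations, written with vectorized NumPy to mimic the data-parallel structure that maps onto a GPU; parameters are illustrative assumptions, and the check compares the mean squared displacement with the free-diffusion value 6Dt.

```python
import numpy as np

# Every molecule's diffusive step is independent of the others, which is
# what makes Brownian dynamics map naturally onto GPUs.
rng = np.random.default_rng(11)
N, D, dt, steps = 100_000, 1e-12, 1e-6, 100    # molecules, m^2/s, s, steps
pos0 = rng.uniform(0, 1e-6, size=(N, 3))       # initial positions, m
pos = pos0.copy()

sigma = np.sqrt(2 * D * dt)                    # per-coordinate step std dev
for _ in range(steps):
    pos += rng.normal(0.0, sigma, size=(N, 3)) # one independent move per molecule

msd = np.mean(np.sum((pos - pos0) ** 2, axis=1))
print(f"MSD: {msd:.3e} m^2, theory 6*D*t = {6 * D * dt * steps:.3e} m^2")
```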

  14. Product component genealogy modeling and field-failure prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Caleb; Hong, Yili; Meeker, William Q.

    Many industrial products consist of multiple components that are necessary for system operation. There is an abundance of literature on modeling the lifetime of such components through competing risks models. During the life-cycle of a product, it is common for there to be incremental design changes to improve reliability, to reduce costs, or due to changes in availability of certain part numbers. These changes can affect product reliability but are often ignored in system lifetime modeling. By incorporating this information about changes in part numbers over time (information that is readily available in most production databases), better accuracy can be achieved in predicting time to failure, thus yielding more accurate field-failure predictions. This paper presents methods for estimating parameters and predictions for this generational model and a comparison with existing methods through the use of simulation. Our results indicate that the generational model has important practical advantages and outperforms the existing methods in predicting field failures.

  15. Product component genealogy modeling and field-failure prediction

    DOE PAGES

    King, Caleb; Hong, Yili; Meeker, William Q.

    2016-04-13

    Many industrial products consist of multiple components that are necessary for system operation. There is an abundance of literature on modeling the lifetime of such components through competing risks models. During the life-cycle of a product, it is common for there to be incremental design changes to improve reliability, to reduce costs, or due to changes in availability of certain part numbers. These changes can affect product reliability but are often ignored in system lifetime modeling. By incorporating this information about changes in part numbers over time (information that is readily available in most production databases), better accuracy can be achieved in predicting time to failure, thus yielding more accurate field-failure predictions. This paper presents methods for estimating parameters and predictions for this generational model and a comparison with existing methods through the use of simulation. Our results indicate that the generational model has important practical advantages and outperforms the existing methods in predicting field failures.

  16. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    PubMed

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
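
    A stripped-down numerical sketch in the spirit of the above, reduced to batch settling with a Vesilind-type hindered-settling flux and a robust first-order Lax-Friedrichs numerical flux (all parameters assumed); the full consistent SST model additionally carries the feed source term, bulk flows, compression, and dispersion.

```python
import numpy as np

# 1D batch-settling column: C_t + f(C)_z = 0 with z measured downward,
# solved with an explicit finite-volume scheme and zero-flux boundaries.
v0, rV = 6.0, 0.4                      # m/h, m^3/kg (assumed Vesilind parameters)
f = lambda C: v0 * C * np.exp(-rV * C) # hindered-settling flux, kg/(m^2 h)

nz, H = 100, 4.0                       # cells, column depth in m
dz = H / nz
C = np.full(nz, 3.0)                   # initial uniform concentration, kg/m^3
lam = v0                               # bound on |f'(C)| for the LF flux
dt = 0.5 * dz / lam                    # CFL-limited time step, h

for _ in range(2000):
    # Lax-Friedrichs interface fluxes, zero flux at surface and bottom.
    Fl = 0.5 * (f(C[:-1]) + f(C[1:])) - 0.5 * lam * (C[1:] - C[:-1])
    F = np.concatenate(([0.0], Fl, [0.0]))
    C -= dt / dz * (F[1:] - F[:-1])

print("total mass (should be conserved):", np.round(C.sum() * dz, 6))
```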

  17. Protein modeling and molecular dynamics simulation of the two novel surfactant proteins SP-G and SP-H.

    PubMed

    Rausch, Felix; Schicht, Martin; Bräuer, Lars; Paulsen, Friedrich; Brandt, Wolfgang

    2014-11-01

    Surfactant proteins are well known from the human lung where they are responsible for the stability and flexibility of the pulmonary surfactant system. They are able to influence the surface tension of the gas-liquid interface specifically by directly interacting with single lipids. This work describes the generation of reliable protein structure models to support the experimental characterization of two novel putative surfactant proteins called SP-G and SP-H. The obtained protein models were complemented by predicted posttranslational modifications and placed in a lipid model system mimicking the pulmonary surface. Molecular dynamics simulations of these protein-lipid systems showed the stability of the protein models and the formation of interactions between protein surface and lipid head groups on an atomic scale. Thereby, interaction interface and strength seem to be dependent on orientation and posttranslational modification of the protein. The here presented modeling was fundamental for experimental localization studies and the simulations showed that SP-G and SP-H are theoretically able to interact with lipid systems and thus are members of the surfactant protein family.

  18. Lattice Boltzmann Method for 3-D Flows with Curved Boundary

    NASA Technical Reports Server (NTRS)

    Mei, Renwei; Shyy, Wei; Yu, Dazhi; Luo, Li-Shi

    2002-01-01

    In this work, we investigate two issues that are important to computational efficiency and reliability in fluid dynamics applications of the lattice Boltzmann equation (LBE): (1) the computational stability and accuracy of different lattice Boltzmann models and (2) the treatment of boundary conditions on curved solid boundaries and their 3-D implementations. Three athermal 3-D LBE models (D3Q15, D3Q19, and D3Q27) are studied and compared in terms of efficiency, accuracy, and robustness. The boundary treatment recently developed by Filippova and Hanel and by Mei et al. in 2-D is extended to and implemented in 3-D. The convergence, stability, and computational efficiency of the 3-D LBE models with the boundary treatment for curved boundaries were tested in simulations of four 3-D flows: (1) fully developed flow in a square duct, (2) flow in a 3-D lid-driven cavity, (3) fully developed flow in a circular pipe, and (4) uniform flow over a sphere. We found that while the fifteen-velocity 3-D (D3Q15) model is more prone to numerical instability and the D3Q27 model is more computationally intensive, the D3Q19 model provides a balance between computational reliability and efficiency. Through numerical simulations, we demonstrated that the boundary treatment for 3-D arbitrary curved geometry has second-order accuracy and possesses satisfactory stability characteristics.
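
    For reference, all three velocity sets share the same lattice BGK update, differing only in the discrete velocities c_i and their weights; with relaxation time τ it reads:

```latex
f_i(\mathbf{x} + \mathbf{c}_i\,\delta t,\; t + \delta t) \;=\;
f_i(\mathbf{x}, t) \;-\; \frac{1}{\tau}\Bigl[ f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\mathbf{x}, t) \Bigr].
```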

  19. Validation of 2D flood models with insurance claims

    NASA Astrophysics Data System (ADS)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika

    2018-02-01

    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation and flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model against insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data at high spatial resolution for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentations reporting flooded areas and a dataset of insurance claims. In three out of four test cases, the model fit relating to insurance claims is slightly lower than the fit computed on the basis of the observed inundation areas. This comparison between two independent validation datasets suggests that validation metrics based on insurance claims are comparable to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
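
    One conventional binary validation metric of the kind referred to above is the Critical Success Index over flooded/not-flooded cells; the sketch below uses random placeholder masks where the study would use simulated inundation and claims-derived observations.

```python
import numpy as np

# Critical Success Index (fit index): hits / (hits + false alarms + misses).
rng = np.random.default_rng(5)
simulated = rng.random((200, 200)) > 0.7   # simulated inundation mask (placeholder)
observed = rng.random((200, 200)) > 0.7    # observed inundation mask (placeholder)

hits = np.logical_and(simulated, observed).sum()
false_alarms = np.logical_and(simulated, ~observed).sum()
misses = np.logical_and(~simulated, observed).sum()
csi = hits / (hits + false_alarms + misses)
print(f"Critical Success Index: {csi:.3f}")
```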

  20. Principle of maximum entropy for reliability analysis in the design of machine components

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2018-03-01

    We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
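
    For reference, maximizing the Shannon entropy subject to the first four moment constraints (the setting described above) yields an exponential-polynomial PDF whose multipliers λ_i are fixed by the moments:

```latex
\max_{p}\; H[p] = -\int p(x)\ln p(x)\,dx
\quad\text{s.t.}\quad \int x^{i} p(x)\,dx = m_i,\; i = 0,\dots,4
\;\;\Longrightarrow\;\;
p(x) = \exp\!\Bigl(-\lambda_0 - \sum_{i=1}^{4} \lambda_i x^{i}\Bigr).
```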
